
JML Regression Models

Regression Overview:

Regression analysis is a set of statistical methods for modeling relationships in data. We start by choosing a base function to model the relationship. We then define an error measure that quantifies how well or poorly the model fits the data, and we choose the model's parameters to minimize that error.
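The "define an error, then minimize it" idea can be sketched in a couple of lines. This is a Python illustration of the math only, not the JML API (JML itself is a Java library):

```python
# Illustrative sketch (not the JML API): a sum-of-squared-errors measure
# that scores how well a candidate model function fits a dataset.
# Lower is better; a perfect fit scores 0.
def sse(model, xs, ys):
    return sum((model(x) - y) ** 2 for x, y in zip(xs, ys))
```

For example, `sse(lambda x: 2 * x, [1, 2, 3], [2, 4, 6])` is 0, since the model reproduces every target exactly. Fitting a model means searching for the parameters that make this score as small as possible.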


As an example, in simple linear regression we choose a line as our model and find the "best fitting" line for the dataset by computing the slope and intercept (often called model parameters) that minimize the error, which in this case is the sum of squared errors.
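For a line, minimizing the sum of squared errors has a closed-form solution. A minimal Python sketch of that math (an illustration, not the JML API):

```python
# Illustrative sketch (not the JML API): simple linear regression.
# Finds w0 (intercept) and w1 (slope) of y = w0 + w1*x that minimize
# the sum of squared errors.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed form: w1 = cov(x, y) / var(x), w0 = mean_y - w1 * mean_x
    w1 = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
         / sum((x - mean_x) ** 2 for x in xs)
    w0 = mean_y - w1 * mean_x
    return w0, w1
```

On a dataset that lies exactly on a line, such as `fit_line([0, 1, 2, 3], [1, 3, 5, 7])`, this recovers the intercept 1 and slope 2.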


JML Regression Models:

The JML library provides four basic regression models: 1D linear regression, nD linear regression, polynomial regression, and logistic regression. Note that logistic regression can also be used as a classification model (see classification models for more). In addition to these models, there are variants of 1D linear regression, nD linear regression, and polynomial regression that are solved using gradient descent rather than by explicitly forming and solving the least-squares normal equations.
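To illustrate the difference between the two solution strategies, here is a Python sketch of the gradient-descent route for the 1D case (an illustration of the technique only, not JML's API or its exact update rule):

```python
# Illustrative sketch (not the JML API): fit y = w0 + w1*x by gradient
# descent on the mean squared error, instead of solving the normal
# equations in closed form.
def fit_line_gd(xs, ys, lr=0.01, iters=5000):
    w0 = w1 = 0.0
    n = len(xs)
    for _ in range(iters):
        # Gradients of (1/n) * sum((w0 + w1*x - y)^2) w.r.t. w0 and w1
        g0 = sum(2 * (w0 + w1 * x - y) for x, y in zip(xs, ys)) / n
        g1 = sum(2 * (w0 + w1 * x - y) * x for x, y in zip(xs, ys)) / n
        w0 -= lr * g0
        w1 -= lr * g1
    return w0, w1
```

Both routes minimize the same error; gradient descent trades the cost of forming and solving a linear system for many cheap iterative updates, which can matter for large datasets.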

Linear Regression

1D linear regression using least squares. Fit a line of the form y = w0 + w1*x to a dataset.

Multiple Linear Regression

nD linear regression using ordinary least squares. Fit an n-dimensional hyperplane of the form y = w0 + w1*x1 + ... + wn*xn to a dataset.
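The multi-dimensional case is usually posed as the linear system (XᵀX)w = Xᵀy, where X has a leading column of ones for the intercept. A Python/NumPy sketch of that math (an illustration, not the JML API):

```python
# Illustrative sketch (not the JML API): ordinary least squares for
# y = w0 + w1*x1 + ... + wn*xn.
import numpy as np

def fit_plane(X, y):
    # Prepend a column of ones so w0 acts as the intercept.
    X = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    # lstsq solves the least-squares problem; it is numerically safer
    # than explicitly inverting X^T X.
    w, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return w  # [w0, w1, ..., wn]
```

For data generated by y = 1 + 2*x1 + 3*x2, the returned weights are (1, 2, 3).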

Polynomial Regression

Polynomial regression for degree-n polynomials in one variable. Fit a function of the form y = w0 + w1*x + ... + wn*x^n to a dataset.
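Polynomial regression is linear regression in disguise: the powers 1, x, x², ..., xⁿ are treated as features, so the same least-squares machinery applies. A Python/NumPy sketch (an illustration, not the JML API):

```python
# Illustrative sketch (not the JML API): degree-n polynomial regression
# as linear least squares on the Vandermonde matrix [1, x, x^2, ..., x^n].
import numpy as np

def fit_poly(xs, ys, degree):
    X = np.vander(np.asarray(xs, dtype=float), degree + 1, increasing=True)
    w, *_ = np.linalg.lstsq(X, np.asarray(ys, dtype=float), rcond=None)
    return w  # [w0, w1, ..., wn]
```

For data generated by y = 1 + 2x + 3x², `fit_poly(xs, ys, 2)` recovers the coefficients (1, 2, 3).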

Logistic Regression

Single-class logistic regression of one or multiple variables. Fit a function of the form y = 1 / (1 + e^(-w·x)) to a dataset, where w = (w0, w1, ..., wn) and x = (1, x1, ..., xn).
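Because the sigmoid has no closed-form least-squares solution, logistic regression is typically fit iteratively. A Python sketch using stochastic gradient descent on the cross-entropy loss, written with the conventional minus sign in the exponent (negating w gives the equivalent e^(w·x) form); this illustrates the technique only and is not JML's API or its exact solver:

```python
# Illustrative sketch (not the JML API): single-class logistic regression,
# y = 1 / (1 + e^(-w.x)) with x = (1, x1, ..., xn), fit by stochastic
# gradient descent on the cross-entropy loss.
import math

def fit_logistic(X, y, lr=0.1, iters=2000):
    X = [[1.0] + list(row) for row in X]  # prepend the bias term x0 = 1
    w = [0.0] * len(X[0])
    for _ in range(iters):
        for xi, yi in zip(X, y):
            # Predicted probability for this sample
            p = 1.0 / (1.0 + math.exp(-sum(wj * xj for wj, xj in zip(w, xi))))
            # Gradient of the cross-entropy for one sample is (p - y) * x
            w = [wj - lr * (p - yi) * xj for wj, xj in zip(w, xi)]
    return w
```

The fitted w defines a probability in (0, 1), which is why thresholding it at 0.5 also yields the classifier mentioned above.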

© 2023 by Jacob Watters.
