Regression analysis is one of the most popular techniques for predictive modelling and data mining tasks, and it plays a very important role in statistics. There are many variants, each with its own techniques and assumptions. In this article we will look at 15 types of regression analysis, each with a short illustrative code sketch.

Linear Regression: Linear regression is the simplest form of regression. The dependent variable is continuous in nature, and the relationship between the dependent and independent variables is assumed to be linear.
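
A minimal sketch with scikit-learn (an assumed dependency) on toy data:

```python
# Fit an ordinary least squares line to toy data (scikit-learn assumed).
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0], [2.0], [3.0], [4.0]])  # one independent variable
y = np.array([2.1, 4.0, 6.2, 7.9])          # continuous dependent variable

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)        # fitted slope and intercept
```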

Polynomial regression: Polynomial regression is a technique to fit a non-linear relationship by adding powers of the independent variable as extra terms; the model remains linear in its coefficients.
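
As a sketch (scikit-learn assumed, synthetic data), a quadratic fit looks like this; PolynomialFeatures adds the x and x² columns while the model stays linear in its coefficients:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 30).reshape(-1, 1)
y = 1.5 * X.ravel() ** 2 - 2.0 * X.ravel() + rng.normal(0, 0.5, 30)

# degree=2 expands each x into [x, x**2] before the linear fit
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)
print(model.predict([[1.0]]))
```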

Logistic regression: Logistic regression is used when the dependent variable is binary (or more generally categorical) in nature; it models the probability of each outcome.
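
A minimal binary-classification sketch (scikit-learn assumed, toy data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[0.5], [1.0], [1.5], [3.0], [3.5], [4.0]])
y = np.array([0, 0, 0, 1, 1, 1])            # binary dependent variable

model = LogisticRegression().fit(X, y)
print(model.predict_proba([[2.0]]))         # P(y=0) and P(y=1) at x=2
```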

Quantile regression: Quantile regression is an extension of linear regression used when outliers, high skewness, or heteroscedasticity exist in the data; it models conditional quantiles (such as the median) instead of the mean.
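
A sketch of median (0.5-quantile) regression with statsmodels (assumed installed) on synthetic heavy-tailed data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
data = pd.DataFrame({"x": rng.uniform(0, 10, 100)})
data["y"] = 2 * data["x"] + rng.standard_t(df=2, size=100)  # heavy-tailed noise

fit = smf.quantreg("y ~ x", data).fit(q=0.5)  # more robust to outliers than OLS
print(fit.params)
```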

Ridge regression: Ridge regression uses regularisation to address overfitting, i.e. when a model performs well on training data but poorly on validation data. It adds an L2 penalty that shrinks the coefficients towards zero.
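
A sketch with scikit-learn (assumed) on synthetic data; alpha is an arbitrary illustrative penalty strength:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 10))               # many predictors
y = X[:, 0] - 2 * X[:, 1] + rng.normal(size=50)

model = Ridge(alpha=1.0).fit(X, y)          # larger alpha = stronger shrinkage
print(model.coef_)
```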

Lasso regression: Least Absolute Shrinkage and Selection Operator (lasso) regression adds an L1 regularisation term to the objective function, which can shrink some coefficients exactly to zero and thereby select variables.
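
A sketch (scikit-learn assumed, synthetic data) showing the selection effect:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
X = rng.normal(size=(50, 10))
y = 3 * X[:, 0] + rng.normal(size=50)       # only the first predictor matters

model = Lasso(alpha=0.1).fit(X, y)
print(model.coef_)                          # most coefficients shrink to exactly 0
```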

Principal components regression (PCR): A regression technique that is widely used when there are many (often correlated) independent variables in the data. PCR is divided into two steps (chained together in the sketch after this list):

  • Getting the principal components
  • Running regression analysis on principal components
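
Both steps can be combined in one pipeline, as in this sketch (scikit-learn assumed; n_components=2 is an arbitrary illustrative choice):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 8))
y = X @ rng.normal(size=8) + rng.normal(size=100)

# step 1: PCA extracts the components; step 2: OLS regresses y on them
pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(X, y)
print(pcr.score(X, y))                      # R^2 on the training data
```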

Elastic Net regression: A compromise between ridge and lasso regression that combines their L2 and L1 penalties. It is useful when predictors are correlated and helps select a model that fits the data well.
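
A sketch with scikit-learn (assumed); l1_ratio controls the blend between the two penalties:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(5)
X = rng.normal(size=(50, 10))
y = X[:, 0] + X[:, 1] + rng.normal(size=50)

model = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)  # 0.5 = equal L1/L2 weight
print(model.coef_)
```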

Partial least squares (PLS): Partial least squares is an alternative to principal components regression when the independent variables are highly correlated; unlike PCR, it uses the dependent variable when constructing the components.
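
A sketch (scikit-learn assumed) with two deliberately collinear columns:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 6))
X[:, 1] = X[:, 0] + rng.normal(scale=0.01, size=100)  # highly correlated pair
y = X[:, 0] + rng.normal(size=100)

pls = PLSRegression(n_components=2).fit(X, y)
print(pls.predict(X[:3]))
```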

Support Vector Regression: Support Vector Regression (SVR) can fit both linear and non-linear models, using kernel functions to handle non-linear relationships. The idea is to find a hyperplane that fits the data while ignoring errors within a set tolerance and maximising the margin.
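
A sketch of an RBF-kernel SVR fitting a sine curve (scikit-learn assumed; C and epsilon are illustrative values):

```python
import numpy as np
from sklearn.svm import SVR

X = np.linspace(0, 5, 40).reshape(-1, 1)
y = np.sin(X).ravel()

# epsilon sets the error tolerance; C trades margin width against violations
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
print(model.predict([[2.5]]))
```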

Ordinal regression: Ordinal regression predicts ranked values and is suitable when the dependent variable is ordinal in nature (for example, survey ratings).
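
A sketch of an ordered logit using statsmodels' OrderedModel (assumed available in the installed version) on synthetic three-level data:

```python
import numpy as np
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(7)
x = rng.normal(size=200)
latent = x + rng.normal(scale=0.5, size=200)
y = np.digitize(latent, bins=[-0.5, 0.5])   # ordinal levels 0 < 1 < 2

fit = OrderedModel(y, x.reshape(-1, 1), distr="logit").fit(method="bfgs", disp=False)
print(fit.params)
```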

Poisson regression: Poisson regression is used when the dependent variable is count data. It can be applied, for example, when

  • predicting the number of calls for a particular product, or
  • estimating the number of emergency service calls during an event.
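
A sketch of a Poisson GLM on synthetic call counts (statsmodels assumed; the promo predictor is hypothetical):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
promo = rng.integers(0, 2, size=200)                # hypothetical 0/1 predictor
calls = rng.poisson(lam=np.exp(1.0 + 0.7 * promo))  # count outcome

X = sm.add_constant(promo.astype(float))
fit = sm.GLM(calls, X, family=sm.families.Poisson()).fit()
print(fit.params)                                   # coefficients on the log scale
```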

Negative Binomial regression: Negative binomial regression also deals with count data, but it does not assume that the variance of the counts equals their mean, which makes it suitable for overdispersed data.
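
A sketch with statsmodels (assumed); alpha=1.0 is an illustrative dispersion value, not an estimate:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
x = rng.normal(size=200)
counts = rng.negative_binomial(n=2, p=0.3, size=200)  # overdispersed toy counts

X = sm.add_constant(x)
fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(fit.params)
```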

Quasi Poisson regression: Quasi-Poisson regression can also be used for overdispersed count data. Its variance is a linear function of the mean, whereas the variance of the negative binomial model is a quadratic function of the mean.
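
statsmodels has no separate quasi-Poisson family, but as a sketch, fitting a Poisson GLM and estimating the scale from the Pearson chi-square gives the same adjustment:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
x = rng.normal(size=200)
counts = rng.negative_binomial(n=2, p=0.3, size=200)  # overdispersed toy counts

X = sm.add_constant(x)
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit(scale="X2")
print(fit.scale)                                      # > 1 signals overdispersion
```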

Cox regression: Cox regression is suitable for time-to-event (survival) data, for example

  • time from when an account is opened until attrition
  • time from treatment until death
  • time from a first heart attack until the next
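
A sketch of a Cox proportional hazards fit with the lifelines package (assumed installed); the durations, event flags, and age covariate are synthetic:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "duration": [5, 8, 12, 3, 9, 15],   # e.g. months from account opening
    "event":    [1, 0, 1, 1, 0, 1],     # 1 = event observed, 0 = censored
    "age":      [34, 45, 52, 29, 61, 40],
})

cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
cph.print_summary()
```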