
Regression

Introduction to Regression

Regression is a type of supervised machine learning used to predict a continuous value (such as a price, temperature, or score) from one or more input features. It helps us understand and model the relationship between the input features and the value being predicted.

Common uses:

  • Predicting house prices
  • Forecasting sales or demand
  • Estimating physical measurements
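As a minimal illustration of what "fitting a regression model" looks like in practice, here is a small scikit-learn sketch; the floor areas and prices are invented purely for the example:

```python
# Minimal sketch: predict house prices from floor area with linear regression.
# The sizes and prices below are made up purely to illustrate the fit/predict API.
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[50], [65], [80], [100], [120]])   # floor area in square metres
y = np.array([150, 190, 230, 280, 330])          # price in thousands

model = LinearRegression().fit(X, y)

print("slope:", model.coef_[0])                  # extra price per extra square metre
print("intercept:", model.intercept_)            # baseline price
print("predicted price for 90 m^2:", model.predict([[90]])[0])
```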

Types of Regression Models

There are several types of regression models, each with its own strengths and use cases. Here are some of the most common:

Model                  | When to Use                              | Key Feature
Linear Regression      | Relationship is roughly a straight line  | Simple, interpretable
Polynomial Regression  | Relationship is curved/non-linear        | Fits curves by adding powers
Ridge Regression       | Many features, risk of overfitting       | Penalizes large weights (L2)
Lasso Regression       | Many features, want feature selection    | Can set some weights to zero (L1)

Key Differences

  • Linear Regression: Fits a straight line. Good for simple, linear relationships.
  • Polynomial Regression: Fits a curve by adding powers of the features. Good for non-linear data.
  • Ridge Regression: Like linear regression, but adds a penalty on large weights (L2 regularization). Helps when you have many features or multicollinearity.
  • Lasso Regression: Like ridge, but can set some weights exactly to zero (L1 regularization). Useful for feature selection and simpler models.
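To make the ridge/lasso distinction concrete, here is a small sketch with synthetic data and arbitrary alpha values, where only two of five features actually influence the target; the L1 penalty typically drives the irrelevant weights exactly to zero, while the L2 penalty only shrinks them:

```python
# Sketch: L2 (ridge) vs L1 (lasso) penalties on the same synthetic data.
# Only the first two of five features influence y; alphas are arbitrary placeholders.
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.3, size=200)

ridge = Ridge(alpha=1.0).fit(X, y)   # shrinks all weights a little; none forced to zero
lasso = Lasso(alpha=0.1).fit(X, y)   # typically sets the three irrelevant weights to exactly 0

print("ridge coefficients:", np.round(ridge.coef_, 3))
print("lasso coefficients:", np.round(lasso.coef_, 3))
```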

Visual Comparison
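One way to visualise the differences is to fit all four models on the same curved dataset and plot the fitted curves. The sketch below uses synthetic data and arbitrary hyperparameters, so treat it as an illustration rather than a recipe:

```python
# Sketch: fit the four model types on synthetic curved data and plot the fits.
# Data, degree, and alpha values are arbitrary, chosen only to make the curves visible.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 60).reshape(-1, 1)
y = 0.5 * X.ravel() ** 2 + rng.normal(scale=0.5, size=60)   # curved relationship plus noise

models = {
    "Linear": LinearRegression(),
    "Polynomial (degree 2)": make_pipeline(PolynomialFeatures(degree=2), LinearRegression()),
    "Ridge": Ridge(alpha=1.0),
    "Lasso": Lasso(alpha=0.1),
}

plt.scatter(X, y, color="grey", s=15, label="data")
for name, model in models.items():
    model.fit(X, y)
    plt.plot(X, model.predict(X), label=name)
plt.legend()
plt.title("Four regression models fitted to the same curved data")
plt.show()
```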

Summary Table

Model                  | Handles Non-Linearity | Reduces Overfitting | Feature Selection
Linear Regression      | No                    | No                  | No
Polynomial Regression  | Yes                   | No                  | No
Ridge Regression       | No                    | Yes (L2)            | No
Lasso Regression       | No                    | Yes (L1)            | Yes

Choosing the Right Model

  • Start with linear regression for simple problems.
  • Use polynomial regression if your data looks curved.
  • Try ridge regression if you have many features or your model overfits.
  • Use lasso regression if you want to automatically ignore unimportant features.
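A common way to make this choice empirically (not something the guidelines above prescribe) is to compare the candidates with cross-validation. A sketch, again with synthetic data and placeholder hyperparameters:

```python
# Sketch: compare candidate models with 5-fold cross-validation.
# Synthetic data and placeholder hyperparameters; scores are negative MSE (higher is better).
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 2.0 * X[:, 0] - X[:, 1] + rng.normal(scale=0.3, size=100)

candidates = {
    "linear": LinearRegression(),
    "polynomial": make_pipeline(PolynomialFeatures(degree=2), LinearRegression()),
    "ridge": Ridge(alpha=1.0),
    "lasso": Lasso(alpha=0.1),
}

for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
    print(f"{name:12s} mean CV score: {scores.mean():.3f}")
```

All else being equal, the candidate that scores best on the held-out folds is usually the safer default choice.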

Refer to this page for a quick overview and comparison of regression models.