There have been entire textbooks dedicated to this one model. For now, we’ll just cover what you need to know to pass the interview.
Now, say I want to prevent overfitting:
- L1 Regularization (Lasso)
- L2 Regularization (Ridge)
- Grid search over the regularization strength (see the sketch after this list)
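A minimal sketch of grid search for the penalty strength, assuming scikit-learn is available; the dataset is synthetic and the alpha values are illustrative, not recommendations.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

# Synthetic data, purely for illustration: 200 samples, 10 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = X @ rng.normal(size=10) + rng.normal(scale=0.5, size=200)

# Try several values of alpha (the L2 penalty strength) with 5-fold cross-validation.
param_grid = {"alpha": [0.01, 0.1, 1.0, 10.0, 100.0]}
search = GridSearchCV(Ridge(), param_grid, cv=5)
search.fit(X, y)

print("best alpha:", search.best_params_["alpha"])
print("best CV score (R^2):", search.best_score_)
```

The same pattern works for Lasso: swap the estimator and grid the L1 penalty strength instead.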
Extra cool algorithms:
- Beam search
- Optimisation algorithms (heuristic)
For Linear Regression:
- L1 Regularization (Lasso): Adds a penalty proportional to the sum of the absolute values of the coefficients. This can drive some coefficients to exactly zero, effectively performing feature selection.
- L2 Regularization (Ridge): Adds a penalty proportional to the sum of the squared coefficients. This discourages large coefficients but does not set them to zero.
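A minimal sketch contrasting the two penalties, assuming scikit-learn; the true coefficients and alpha value are made up for illustration.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

# Synthetic data where only the first two features actually matter.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 10))
true_coef = np.array([3.0, -2.0, 0, 0, 0, 0, 0, 0, 0, 0])
y = X @ true_coef + rng.normal(scale=0.1, size=200)

lasso = Lasso(alpha=0.1).fit(X, y)  # L1 penalty: drives some weights to exactly zero
ridge = Ridge(alpha=0.1).fit(X, y)  # L2 penalty: shrinks weights but keeps them nonzero

print("Lasso coefficients:", np.round(lasso.coef_, 3))  # exact zeros -> feature selection
print("Ridge coefficients:", np.round(ridge.coef_, 3))  # small but nonzero everywhere
```

Printing the two coefficient vectors side by side makes the interview talking point concrete: Lasso zeroes out the uninformative features, Ridge merely shrinks them.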