Aug 17, 2023
5 Regression Algorithms you should know
🧡
1️⃣ Linear
Linear regression is the most fundamental and widely used regression algorithm.
It assumes a linear relationship between the input features and the target.
The goal is to find the best-fitting line that minimizes the errors between the predicted and actual values.
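A minimal sketch with scikit-learn (the data here is made up for illustration): on noise-free points from y = 2x + 1, the fitted line recovers the slope and intercept exactly.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data lying exactly on the line y = 2x + 1.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])

model = LinearRegression()
model.fit(X, y)

slope = model.coef_[0]        # recovered slope, ~2.0
intercept = model.intercept_  # recovered intercept, ~1.0
pred = model.predict([[5.0]])[0]  # extrapolate to x = 5
```

With real, noisy data the coefficients minimize the sum of squared errors rather than fitting exactly.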
2️⃣ Polynomial
Polynomial regression allows for nonlinear relationships between variables.
It adds polynomial terms (squares, cubes, and so on) of the input features to the model, so it can capture more complex patterns in the data.
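A quick sketch: a degree-2 pipeline fits y = x², which a straight line cannot. The data is synthetic, purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Toy data on the parabola y = x^2.
X = np.array([[-2.0], [-1.0], [0.0], [1.0], [2.0]])
y = X.ravel() ** 2

# Expand features to [x, x^2], then fit an ordinary linear model on them.
poly_model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
poly_model.fit(X, y)

poly_pred = poly_model.predict([[3.0]])[0]  # ~9.0, i.e. 3^2
```

Note the model is still linear in its coefficients; only the features are nonlinear, which is why the same least-squares machinery applies.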
3️⃣ Ridge
Ridge regression is useful when dealing with high-dimensional datasets.
Why?
It adds an L2 penalty term (proportional to the squared size of the coefficients) to the linear regression cost function, which shrinks the coefficients and reduces the impact of irrelevant variables.
It is more robust to collinearity and overfitting.
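A sketch of that robustness, using synthetic data with two nearly identical (collinear) predictors: plain least squares produces wildly inflated coefficients, while Ridge keeps them small and stable. The data and the `alpha=1.0` penalty strength are illustrative choices.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
x = rng.normal(size=30)
# Second column is the first plus tiny noise -> severe collinearity.
X = np.column_stack([x, x + rng.normal(scale=1e-3, size=30)])
y = x + rng.normal(scale=0.1, size=30)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

# OLS coefficients blow up under collinearity; Ridge splits the
# shared signal between the two correlated columns instead.
ols_max = np.abs(ols.coef_).max()
ridge_max = np.abs(ridge.coef_).max()
```

Larger `alpha` means stronger shrinkage; in practice it is tuned by cross-validation (e.g. `RidgeCV`).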
4️⃣ Lasso
Lasso regression is similar to Ridge, but its L1 penalty can shrink the coefficients of useless variables all the way to zero, effectively removing them from the model.
It is particularly valuable when the number of predictors is large relative to the number of observations.
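A sketch of that selection effect on synthetic data: only two of ten features actually drive the target, and Lasso zeroes out most of the rest. The `alpha=0.1` value is an illustrative choice, normally tuned by cross-validation.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 10))
# Only features 0 and 1 matter; the other eight are pure noise.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

lasso = Lasso(alpha=0.1).fit(X, y)

# Count how many coefficients survived the L1 penalty.
n_selected = int(np.sum(lasso.coef_ != 0))
```

Compare with Ridge, which would shrink the noise coefficients toward zero but almost never exactly to zero.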
5️⃣ SVR - Support Vector
SVR is based on the principles of support vector machines.
It fits a function that keeps as many points as possible inside an epsilon-wide tube around the predictions, ignoring small errors within the tube.
It is particularly effective when dealing with datasets with nonlinearity and outliers.
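A sketch of both strengths at once, on made-up data: an RBF kernel lets SVR fit a sine curve, and the epsilon-insensitive loss limits the pull of a few injected outliers. The `C` and `epsilon` values are illustrative.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(-3, 3, size=(80, 1)), axis=0)
y = np.sin(X).ravel()
y[::10] += 2.0  # inject a few large positive outliers

# RBF kernel handles the nonlinearity; the epsilon-insensitive loss
# penalizes outliers only linearly, so they cannot dominate the fit.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)

center_pred = svr.predict([[0.0]])[0]  # should stay near sin(0) = 0
```

A plain least-squares fit on the same data would be dragged upward by the outliers far more.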
That's it for today.
I hope you've found this thread helpful.
Like/Retweet the first tweet below for support and follow @levikul09 for more Data Science threads.
Thanks πŸ˜‰
You should also join our newsletter, DSBoost.
We share:
β€’ Interviews
β€’ Podcast notes
β€’ Learning resources
β€’ Interesting collections of content
dsboost.dev