Jacob Bumgarner

@Bumgarner_JR

6 Tweets 3 reads Aug 20, 2022
Understanding the logic behind logistic regression can provide strong insight into the basics of deep learning.
I'm excited to share my visually-driven deep dive into this algorithm.
🧵
Logistic regression is a form of supervised machine learning. Logit models produce probabilistic labels for sets of input data.
These labels are generated by transforming the input data with the logistic function, using learned weight and bias parameters.
🧵
The logistic function has two chained components:
1. The linear transformation
This transformation projects the input data down to a single dimension using learned parameters. The weights balance the importance of the input features, and the bias shifts the data relative to the decision boundary.
🧵
and
2. The sigmoid activation
The sigmoid function is then applied to the output of the linear transformation. It maps each value to a probability between 0 and 1.
Here the decision boundary is drawn at 0.5, labeling outputs >= 0.5 as positive.
🧵
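The two chained steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not the article's implementation; the data, weights, and bias here are made-up values:

```python
import numpy as np

def sigmoid(z):
    """Squash a real value into a probability between 0 and 1."""
    return 1 / (1 + np.exp(-z))

# Hypothetical example: 3 samples with 2 features each.
X = np.array([[0.5, 1.2],
              [2.0, -0.3],
              [-1.5, 0.8]])
w = np.array([0.7, -0.4])  # learned weights (illustrative values)
b = 0.1                    # learned bias (illustrative value)

# 1. Linear transformation: project each sample to a single value.
z = X @ w + b

# 2. Sigmoid activation: map each value to a probability.
probs = sigmoid(z)

# Decision boundary at 0.5: probabilities >= 0.5 get the positive label.
labels = (probs >= 0.5).astype(int)
```

Each row of `X` collapses to one scalar `z`, and the sigmoid turns that scalar into the model's predicted probability of the positive class.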
Lastly, the model must be iteratively trained on training data to learn its optimal parameters.
The model predicts labels for the training data, and the prediction errors are used to compute parameter gradient vectors. The parameters are stepped down their gradients to minimize the model's cost.
🧵
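That training loop can be sketched with gradient descent on the binary cross-entropy cost. A minimal sketch with a made-up toy dataset and an illustrative learning rate, not the article's code:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Hypothetical toy training set: 4 samples, 2 features, binary labels.
X = np.array([[0.0, 1.0],
              [1.0, 0.0],
              [1.0, 1.0],
              [0.0, 0.0]])
y = np.array([1, 0, 1, 0])

rng = np.random.default_rng(0)
w = rng.normal(size=2)  # randomly initialized weights
b = 0.0                 # initial bias
lr = 0.5                # learning rate (illustrative)

for _ in range(500):
    # Forward pass: predicted probabilities for the training data.
    probs = sigmoid(X @ w + b)
    # Gradients of the binary cross-entropy cost w.r.t. w and b.
    error = probs - y
    grad_w = X.T @ error / len(y)
    grad_b = error.mean()
    # Step the parameters down their gradients.
    w -= lr * grad_w
    b -= lr * grad_b
```

The `probs - y` term is the error signal; averaging it against the inputs gives the gradients, and stepping against them shrinks the cost each iteration.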
For more detailed insight, check out my article where I:
1. Build a logistic regression model from scratch with NumPy
2. Train the model to predict heart disease with the UCI Heart Disease Dataset
3. Build a TensorFlow logit model
tinyurl.com
Thanks for reading!
