Most of us have built plenty of regression models in our lives, but only a few are familiar with using regression models for classification. So my intention is to reveal the beauty of this hidden world.
As we all know, when we want to predict a continuous dependent variable from a number of independent variables, we use linear or polynomial regression. But when it comes to classification, we can't use that anymore.
Fundamentally, classification is about predicting a label and regression is about predicting a quantity.
Why can't linear regression be used for classification? The main reason is that its predicted values are continuous, not probabilistic, so there is no principled way to turn them into an exact class. You will understand this better by glancing at the predictions below.
Probabilities range between 0 and 1, but linear regression predicts an unbounded number, which can fall outside that range.
Yes, you can still normalize the value to the 0–1 range, but the results may get worse. This is because a linear regression fit is highly affected by the inclusion of an outlier. Even a single outlier can ruin your classification.
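Here is a minimal sketch of both problems on a tiny synthetic dataset (not the wine data): a least-squares line happily predicts values above 1, and adding one extreme outlier flattens the slope enough to flip a previously correct classification at the usual 0.5 threshold.

```python
import numpy as np

# Synthetic 1-D binary data: small x -> class 0, large x -> class 1
x = np.array([1, 2, 3, 4, 5, 6], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1], dtype=float)

# Ordinary least-squares fit: y_hat = a*x + b
a, b = np.polyfit(x, y, 1)
print(a * 7 + b)  # prediction for x=7 is greater than 1 -- not a probability

# Add one extreme "class 1" point far to the right
x2 = np.append(x, 50.0)
y2 = np.append(y, 1.0)
a2, b2 = np.polyfit(x2, y2, 1)

# The slope flattens: x=4 was correctly above 0.5, now it falls below
print(a * 4 + b, a2 * 4 + b2)
```

The outlier agrees with the class labels, yet it still drags the fitted line enough to misclassify an interior point — which is exactly why normalizing linear-regression outputs is not a reliable classifier.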
On the other hand, using linear regression for multi-class prediction makes no sense. Linear regression assumes an order between 0, 1, and 2, whereas in the classification regime these numbers are mere categorical placeholders.
To overcome the aforementioned problems, there are two great solutions.
- Logistic Regression — for binary classification
- Softmax Regression — for multi-class classification
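Before turning to the wine data, here is a quick sketch of both solutions on synthetic clusters using scikit-learn. Note that scikit-learn's `LogisticRegression` covers both cases: with two classes it fits ordinary logistic regression, and with three or more classes (using the default lbfgs solver) it fits a multinomial, i.e. softmax, model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Binary case: two well-separated Gaussian clusters
X_bin = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y_bin = np.array([0] * 50 + [1] * 50)
binary_clf = LogisticRegression(max_iter=1000).fit(X_bin, y_bin)
print(binary_clf.predict_proba(X_bin[:1]))  # two probabilities summing to 1

# Multi-class case: three clusters, fitted as softmax regression
X_multi = np.vstack([rng.normal(m, 1, (50, 2)) for m in (0, 4, 8)])
y_multi = np.repeat([0, 1, 2], 50)
softmax_clf = LogisticRegression(max_iter=1000).fit(X_multi, y_multi)
print(softmax_clf.predict_proba(X_multi[:1]))  # three probabilities summing to 1
```

Unlike the raw linear-regression outputs above, every prediction here is a proper probability distribution over the classes.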
I am using the Red Wine Quality dataset (on Kaggle) to demonstrate this to you.
The original dataset is publicly available in the UCI Machine Learning Repository.
Note: I will not perform any detailed preprocessing or dimensionality reduction techniques today, as my intention is mostly to walk you through the classification models.
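To make the target usable for binary classification, the integer quality score (0–10 in the UCI description) has to be binarized. The rows below are stand-ins with the dataset's schema; in practice you would load the downloaded Kaggle CSV instead (e.g. `pd.read_csv("winequality-red.csv")` — the file name and the threshold of 7 for a "good" wine are assumptions here, not from the original post).

```python
import pandas as pd

# Stand-in rows mimicking the wine dataset's columns
wine = pd.DataFrame({
    "fixed acidity": [7.4, 7.8, 11.2],
    "alcohol": [9.4, 9.8, 10.8],
    "quality": [5, 5, 7],
})

# Binarize the 0-10 "quality" score into a two-class target
wine["good"] = (wine["quality"] >= 7).astype(int)
X = wine.drop(columns=["quality", "good"])
y = wine["good"]
print(y.tolist())  # [0, 0, 1]
```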
Okay! First, we will see how binary classification is achieved using Logistic Regression.
Before applying the model, let’s understand some of the core concepts in logistic regression.
I'll show you how logistic regression works with an example. Consider the two outcome probabilities of tossing a coin.
Let the probability of showing a head be p and the probability of showing a tail be q. Then we define a new concept called the odds, p/q.
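The coin example can be sketched in a few lines. Since q = 1 − p here, the odds are p/(1 − p), and their logarithm (the logit) is the quantity that logistic regression models as a linear function of the inputs.

```python
import math

def odds(p):
    """Odds in favor of an event with probability p: p / (1 - p)."""
    return p / (1 - p)

def log_odds(p):
    """Logit of p -- the quantity logistic regression models linearly."""
    return math.log(odds(p))

# Fair coin: p(head) = p(tail) = 0.5, so the odds of a head are 1:1
print(odds(0.5))      # 1.0
print(log_odds(0.5))  # 0.0

# A biased coin with p(head) = 0.8 has odds of roughly 4:1
print(odds(0.8))
```

Note how a probability of 0.5 maps to log-odds of exactly 0; probabilities above 0.5 give positive log-odds and those below give negative ones, which is what lets a linear decision function separate the two classes.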
In this scenario, we have only two possibilities. That…