In the world of data and computer programs, the concept of Machine Learning might sound like a tough nut to crack, full of intricate math and sophisticated ideas.
That's why today I'd like to slow down and look at the fundamentals that make all of this work, with a brand-new issue of my MLBasics series.
Today's agenda: giving our good old Logistic Regression a swanky upgrade.
Why?
By default, Logistic Regression is limited to two-class classification problems. However, real-world problems often involve more than two classes.
So let's dive into the fascinating world of leveling up Logistic Regression so it can sort things into more than two baskets 👇🏻
Within the ML field, Logistic Regression is a go-to model for binary classification problems.
It's a trusted path to yes-or-no decision-making.
However, there's a big limitation with Logistic Regression: it works like a coin toss, heads or tails, A or B.
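To make the "coin toss" concrete, here is a minimal sketch of a binary Logistic Regression decision in NumPy. The weights, bias, and input are made-up illustrative values, not a trained model:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into the (0, 1) range,
    # so the output can be read as a probability
    return 1.0 / (1.0 + np.exp(-z))

# Toy parameters, invented for illustration only
w = np.array([1.5, -2.0])  # weights
b = 0.3                    # bias
x = np.array([0.8, 0.1])   # one input sample

p = sigmoid(w @ x + b)            # probability of class "A"
label = "A" if p >= 0.5 else "B"  # the coin toss: A or B, nothing in between
print(p, label)
```

Whatever the input, the model can only ever answer with one of two labels, which is exactly the limitation we want to lift.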
But what if you have more than two classes?
On its own, Logistic Regression cannot handle multi-class classification. Therefore, the model must be adapted, and there are two main options:
- The first and simplest approach is to use multiple Simple Logistic Regression models, one to identify each of the classes we want. This is known as the One-vs-Rest strategy.
- The second approach is to build a new model that natively accepts multiple classes, often called multinomial (or Softmax) regression.
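Before going deeper, here is a minimal sketch of both options using scikit-learn on a synthetic dataset (the dataset and all parameter values are made up for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Synthetic 3-class dataset, purely for demonstration
X, y = make_classification(n_samples=300, n_features=6, n_informative=3,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

# Option 1: One-vs-Rest, i.e. one binary Logistic Regression per class
ovr = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, y)

# Option 2: a single model that handles all classes at once
# (scikit-learn's LogisticRegression does this out of the box for multi-class y)
multi = LogisticRegression(max_iter=1000).fit(X, y)

print(ovr.predict(X[:5]))    # predictions from 3 binary models combined
print(multi.predict(X[:5]))  # predictions from one multi-class model
```

Both fitted models expose the same `predict` interface; the difference lies in how many underlying classifiers are trained and how their scores are combined.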
So let's break down both approaches: