Multiclass Logistic Regression
Logistic regression is a classification algorithm designed for binary (two-class) outcomes. When a problem has more than two classes (multiclass classification), the standard model must be extended. There are two main approaches:
1. One-vs-Rest (OvR) Method
In One-vs-Rest (also known as One-vs-All), we break the multiclass problem into multiple binary classification problems.
- For K classes, we train K separate binary classifiers.
- Each classifier predicts whether a sample belongs to its assigned class vs. all other classes.
- During prediction, each classifier outputs a probability score for its own class, and the final prediction is the class whose classifier reports the highest score (see the sketch after the example below).
Example:
If we have classes A, B, and C:
- Classifier 1: A vs (B, C)
- Classifier 2: B vs (A, C)
- Classifier 3: C vs (A, B)
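To make the A/B/C example concrete, here is a minimal sketch of the One-vs-Rest training-and-prediction loop. The synthetic dataset, the class names, and the predict_ovr helper are illustrative assumptions for this sketch, not part of the original text or of any particular library API; scikit-learn's LogisticRegression is used only as a convenient binary classifier.

```python
# Minimal One-vs-Rest sketch for classes A, B, C (illustrative toy data).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy 3-class dataset standing in for classes A, B, C.
X, y = make_classification(
    n_samples=300, n_features=4, n_informative=3, n_redundant=0,
    n_classes=3, n_clusters_per_class=1, random_state=0
)
classes = np.array(["A", "B", "C"])
y_named = classes[y]

# Train one binary classifier per class: "this class" vs. all the rest.
binary_models = {}
for c in classes:
    binary_target = (y_named == c).astype(int)   # 1 if sample is class c, else 0
    binary_models[c] = LogisticRegression().fit(X, binary_target)

# Prediction: every classifier scores the sample; return the class whose
# classifier gives the highest positive-class probability.
def predict_ovr(x_row):
    scores = {c: m.predict_proba(x_row.reshape(1, -1))[0, 1]
              for c, m in binary_models.items()}
    return max(scores, key=scores.get), scores

label, scores = predict_ovr(X[0])
print(label, scores)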
Pros:
- Simple and scalable.
- Works well when classes are well-separated.
Cons:
- Each binary problem is imbalanced (one class against all the rest), which can bias the individual classifiers.
- Predictions come from independently trained classifiers, so they can be inconsistent (e.g., several classifiers predicting "positive" for the same sample); in practice the conflict is resolved by taking the class with the highest score, as in the sketch below.
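In practice the per-class loop is rarely written by hand. A hedged usage sketch: scikit-learn's OneVsRestClassifier wraps any binary estimator, trains one classifier per class, and resolves cases where multiple classifiers fire by picking the highest score; the toy dataset below is again an illustrative assumption.

```python
# The same One-vs-Rest scheme via scikit-learn's built-in wrapper.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

X, y = make_classification(n_samples=300, n_features=4, n_informative=3,
                           n_redundant=0, n_classes=3,
                           n_clusters_per_class=1, random_state=0)

ovr = OneVsRestClassifier(LogisticRegression())
ovr.fit(X, y)              # trains one binary classifier per class
print(ovr.predict(X[:5]))  # final label = class with the highest score
```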
2. Multinomial (Softmax) Logistic Regression