Sunday, May 6, 2018

MACHINE LEARNING | MULTICLASS CLASSIFICATION

MACHINE LEARNING - DAY 9

MULTI-CLASS CLASSIFICATION: ONE-VS-ALL


For the basics and the terms used in this article, you can check the earlier articles in this series.

Continuing our machine learning journey, today we'll learn about multi-class classification in logistic regression, also known as one-vs-all.

Till now we have discussed binary classification, i.e., problems with only 2 possible outcomes: 1 or 0. Now, let's see what happens when there are more than two possibilities.

For example:

- Weather: sunny, rainy, pleasant, windy
  The outcome or the categorical value can be: 0, 1, 2, 3

- Health: ill, dizzy, well
  The outcome or the categorical value can be: 0, 1, 2

The numbering doesn't matter; it can be 1, 2, 3, 4 or 0, 1, 2, 3. These are just values that categorize the given data or output into different categories.

y ∈ {0, 1, 2, …, n}

hΘ⁽⁰⁾(x) = P(y = 0 | x; Θ)
hΘ⁽¹⁾(x) = P(y = 1 | x; Θ)
⋮
hΘ⁽ⁿ⁾(x) = P(y = n | x; Θ)

prediction: maxᵢ hΘ⁽ⁱ⁾(x)
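To make this prediction rule concrete, here is a minimal Python/NumPy sketch of the one-vs-all decision step (the parameter values below are hypothetical, not from this article): it computes hΘ⁽ⁱ⁾(x) for each class with the sigmoid hypothesis and picks the class with the highest probability.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_one_vs_all(all_theta, x):
    # all_theta: (n_classes, n_features) matrix, one trained parameter
    # vector Θ⁽ⁱ⁾ per class; x: feature vector with x[0] = 1 as the bias term.
    probs = sigmoid(all_theta @ x)   # hΘ⁽ⁱ⁾(x) for every class i
    return np.argmax(probs), probs   # class with the highest probability wins

# Hypothetical trained parameters: 3 classes, 3 features (bias + 2 inputs)
all_theta = np.array([[ 1.0, -2.0,  0.5],
                      [-0.5,  1.5, -1.0],
                      [ 0.2,  0.3,  0.8]])
x = np.array([1.0, 0.4, 0.7])
label, probs = predict_one_vs_all(all_theta, x)
print(label, probs)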

STEPS OF COMPUTATION:

1. Plot the data

2. Take the classes one by one; the rest of the classes (here, the other 2) will behave as a single class or category. A binary logistic regression classifier is then trained to estimate the probability of the chosen class against the rest.

   For example, the relabeling and training for this step are sketched below.
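A minimal sketch of this training step in Python/NumPy, assuming the labels are integers 0 … n and that X already includes a bias column of ones (both assumptions for illustration): for each class i, the labels are relabeled to 1 for that class and 0 for all the rest, and a binary logistic regression is fit by gradient descent.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_one_vs_all(X, y, n_classes, lr=0.1, iters=1000):
    # X: (m, n) feature matrix with a leading column of ones (bias term)
    # y: (m,) integer labels in {0, 1, ..., n_classes - 1}
    m, n = X.shape
    all_theta = np.zeros((n_classes, n))
    for i in range(n_classes):
        y_binary = (y == i).astype(float)  # class i -> 1, every other class -> 0
        theta = np.zeros(n)
        for _ in range(iters):
            h = sigmoid(X @ theta)           # hΘ(x) for all m examples
            grad = X.T @ (h - y_binary) / m  # gradient of the logistic cost
            theta -= lr * grad               # gradient descent step
        all_theta[i] = theta
    return all_theta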
CONCLUSION:

Train a logistic regression classifier hΘ⁽ⁱ⁾(x) for each class i to predict the probability that y = i.

To make a prediction on a new x, pick the class i that maximizes hΘ⁽ⁱ⁾(x), and that will be the output.
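Putting the two sketches above together on a tiny hypothetical dataset shows the full train-then-predict flow (train_one_vs_all and predict_one_vs_all are the functions sketched earlier):

import numpy as np

# Tiny synthetic 3-class dataset (hypothetical values); the first
# column of ones is the bias term expected by the sketches above.
X = np.array([[1.0, 0.1, 0.2],
              [1.0, 0.2, 0.1],
              [1.0, 2.0, 2.1],
              [1.0, 2.1, 1.9],
              [1.0, 4.0, 0.1],
              [1.0, 3.9, 0.2]])
y = np.array([0, 0, 1, 1, 2, 2])

all_theta = train_one_vs_all(X, y, n_classes=3)
label, probs = predict_one_vs_all(all_theta, X[2])
print(label, probs)   # the predicted class for X[2] should be 1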


That’s all for day 9. Today we learned about multi-class classification and how to compute it.

In day 10, we will learn about the issue known as Overfitting, which originates from over-training the model. The solution to this issue is Regularization, which we'll also cover in the next article.

If you think this article helped you learn something new or could help someone else, do share it with your peers.

Till then Happy Learning!!!
