Classification
A classifier tells us which class <math>c</math> an event/example <math>m</math> belongs to. We understand a classifier to be a column vector <math>Y^{\mathbb{R}(M,1)}</math> taking <math>C</math> different categorical values, either numeric or alphanumeric tags.
The drawback of a model predicting classes as categorical values, however, is that we do not know the confidence attached to each classification, which is useful to know when combining several classifiers. It is therefore sometimes convenient to treat the problem as equivalent to having <math>C</math> binary classifiers, one per class, <math>Y^{\mathbb{R}(M,C)}</math>. The model then takes a winner-takes-all vote over the estimations <math>\widehat{Y}^{\mathbb{R}(M,C)}</math> to decide which class <math>c</math> the example <math>m</math> belongs to.
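To make the equivalence concrete, here is a minimal NumPy sketch (the labels, scores, and class names are illustrative assumptions, not taken from the text): it converts a categorical label vector <math>Y^{\mathbb{R}(M,1)}</math> into the one-binary-classifier-per-class form <math>Y^{\mathbb{R}(M,C)}</math> and applies a winner-takes-all vote to the estimations <math>\widehat{Y}^{\mathbb{R}(M,C)}</math>.

```python
# Minimal sketch; NumPy assumed available, data is made up for illustration.
import numpy as np

M, C = 4, 3                                  # M examples, C classes

# Categorical form: one label per example, Y with shape (M, 1)
Y = np.array(["cat", "dog", "cat", "bird"]).reshape(M, 1)

# Equivalent one-binary-classifier-per-class form: Y with shape (M, C)
classes = np.array(["bird", "cat", "dog"])
Y_onehot = (Y == classes).astype(float)      # broadcast comparison -> (M, C)

# Suppose the model produces confidence scores Y_hat with shape (M, C)
Y_hat = np.array([[0.1, 0.8, 0.1],
                  [0.2, 0.3, 0.9],
                  [0.0, 0.6, 0.4],
                  [0.7, 0.2, 0.1]])

# Winner-takes-all vote: each example is assigned the class with the highest score
predicted = classes[np.argmax(Y_hat, axis=1)]
print(predicted)                             # ['cat' 'dog' 'cat' 'bird']
```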
A binary classifier can consist of a scalar output <math>\widehat{Y}_{mc}</math> expressing the confidence that the particular example <math>X_m</math> belongs to class <math>c</math>. Values above the boundary threshold <math>\gamma_c</math> are considered to belong to class <math>c</math>, and those below are considered not to. The typical choice of boundary threshold is <math>\gamma_c = 0</math>, which simplifies things because (see the sketch after this list):
- Class membership can be computed as <math>sign(\widehat{Y}_c)</math>.
- The confidence value can be read as <math>abs(\widehat{Y}_c)</math>.
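Under the <math>\gamma_c = 0</math> convention, the decision and confidence of a single binary classifier split cleanly into sign and magnitude, as in this sketch (the scores are made-up examples):

```python
# Sketch of the gamma_c = 0 convention; scores below are made-up examples.
import numpy as np

Y_hat_c = np.array([2.3, -0.7, 0.1, -1.5])   # scalar outputs for class c, one per example

belongs_to_c = np.sign(Y_hat_c) > 0          # sign(.) decides class membership
confidence   = np.abs(Y_hat_c)               # abs(.) gives the confidence of the decision

print(belongs_to_c)   # [ True False  True False]
print(confidence)     # [2.3 0.7 0.1 1.5]
```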
The following classifiers are commonly used in Deep Learning as the final step of an unsupervised learning algorithm, so that the network and the user 'agree' on what is what, much like agreeing to speak the same language. This last step is thus supervised.
Algorithms
- Machine Learning
- Deep Learning