Explain in Detail Different Types of Machine Learning Models?
For best results with binary classification, the training data should be balanced (that is, equal
numbers of positive and negative training data). Missing values should be handled before
training.
The input label column data must be Boolean. The input features column data must be a fixed-size vector of Single (single-precision floating-point values).
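The two preparation steps above, imputing missing values and balancing the positive and negative examples, can be sketched in plain Python. This is a minimal illustration on a made-up toy dataset (the feature values and labels are assumptions, not from the text):

```python
import random

# Toy dataset of (features, label) pairs; None marks a missing value (assumed for illustration).
data = [([1.0, 2.0], 1), ([2.0, None], 1), ([3.0, 1.0], 1),
        ([0.5, 0.2], 0), ([0.4, 0.1], 0)]

# Step 1: handle missing values before training, here by imputing the column mean.
cols = list(zip(*[features for features, _ in data]))
means = [sum(v for v in col if v is not None) / sum(v is not None for v in col)
         for col in cols]
data = [([m if v is None else v for v, m in zip(features, means)], label)
        for features, label in data]

# Step 2: balance the classes by undersampling the majority class.
pos = [d for d in data if d[1] == 1]
neg = [d for d in data if d[1] == 0]
n = min(len(pos), len(neg))
random.seed(0)
balanced = random.sample(pos, n) + random.sample(neg, n)
```

After these two steps the training set contains equal numbers of positive and negative examples and no missing feature values. In practice a library imputer and resampler would be used, but the logic is the same.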
This is called the regression line and it’s drawn (using a statistics program like SPSS or STATA
or even Excel) to show the line that best fits the data. In other words, explains Redman, “The red
line is the best explanation of the relationship between the independent variable and dependent
variable.”
In addition to drawing the line, your statistics program also outputs a formula that explains the
slope of the line and looks something like this:
y = mx + c + error
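The slope m and intercept c that the statistics program reports come from ordinary least squares, which can also be computed directly. Below is a minimal sketch using a small made-up dataset (the x and y values are assumptions for illustration):

```python
# Least-squares fit of y = m*x + c for an illustrative dataset.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.2, 5.9, 8.1, 9.8]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Slope: m = sum((x - x̄)(y - ȳ)) / sum((x - x̄)²); intercept: c = ȳ - m·x̄.
m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
c = mean_y - m * mean_x

# The "error" term is the residual of each point from the fitted line.
residuals = [y - (m * x + c) for x, y in zip(xs, ys)]
```

For this data the fit gives m ≈ 1.93 and c ≈ 0.23, i.e. the line y = 1.93x + 0.23 best explains the relationship between the independent and dependent variables.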
Illustrate how to handle more than two classes in beyond Binary Classification.
Handling more than two classes
Certain concepts are fundamentally binary. For instance, the notion of a coverage curve does not
easily generalise to more than two classes. We will now consider general issues related to having
more than two classes in classification, scoring and class probability estimation. The discussion
will address two issues: how to evaluate multi-class performance, and how to build multi-class
models out of binary models. The latter is necessary for some models, such as linear classifiers,
that are primarily designed to separate two classes. Other models, including decision trees,
handle any number of classes quite naturally.
Multi-class classification
Classification tasks with more than two classes are very common. For instance, once a patient
has been diagnosed as suffering from a rheumatic disease, the doctor will want to classify him or
her further into one of several variants. If we have k classes, performance of a classifier can be
assessed using a k-by-k contingency table. Assessing performance is easy if we are interested in
the classifier's accuracy, which is still the sum of the descending diagonal of the contingency
table, divided by the number of test instances.
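The accuracy calculation described above is straightforward to express in code. A minimal sketch with a hypothetical 3-by-3 contingency table (the counts are assumptions for illustration):

```python
# k-by-k contingency table: rows are true classes, columns are predicted classes.
# Example counts for three classes (assumed values).
table = [
    [15,  2,  3],   # true class A predicted as A, B, C
    [ 7, 15,  8],   # true class B
    [ 2,  3, 45],   # true class C
]

# Accuracy = sum of the descending diagonal / total number of test instances.
correct = sum(table[i][i] for i in range(len(table)))
total = sum(sum(row) for row in table)
accuracy = correct / total
```

Here 75 of the 100 test instances lie on the diagonal, so the accuracy is 0.75.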
Examples of multi-class classification are
• classification of news in different categories,
• classifying books according to the subject,
• classifying students according to their streams etc.
In each of these, the response variable can take one of several class values, which is why the task is called multi-class classification.
Illustrate the following
a. One-versus-one voting.
One vs. One (OvO)
In One-vs-One classification, for a dataset with N classes we generate N(N−1)/2 binary classifier models. Using this approach, we split the primary dataset into one binary-classification dataset for every pair of classes. Taking the above example, suppose we have a classification problem with three classes: Green, Blue, and a third, say Red; we would then train three binary classifiers (Green vs. Blue, Green vs. Red, and Blue vs. Red). Each binary classifier predicts one class label. When we input the test data, every pairwise classifier casts a vote for one of its two classes, and the class that receives the majority of votes is the final prediction.
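The voting scheme can be sketched as follows. The three pairwise decision rules below are stand-ins for trained binary classifiers on a one-dimensional feature; the class names and thresholds are assumptions for illustration:

```python
from collections import Counter

classes = ["Green", "Blue", "Red"]

# Hypothetical pairwise classifiers (stand-ins for trained binary models);
# each maps a feature value x to one of its two classes.
pair_clf = {
    ("Green", "Blue"): lambda x: "Green" if x < 2.0 else "Blue",
    ("Green", "Red"):  lambda x: "Green" if x < 3.0 else "Red",
    ("Blue", "Red"):   lambda x: "Blue" if x < 3.0 else "Red",
}
# N(N-1)/2 binary models for N classes: 3*2/2 = 3 here.
assert len(pair_clf) == len(classes) * (len(classes) - 1) // 2

def predict(x):
    # Every pairwise classifier votes; the class with the most votes wins.
    votes = Counter(clf(x) for clf in pair_clf.values())
    return votes.most_common(1)[0][0]
```

For example, with x = 1.0 the Green/Blue and Green/Red classifiers both vote Green, so Green wins 2 votes to 1; with x = 4.0 two of the three classifiers vote Red.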
where,
H --> Horn Clause
L1,L2,...,Ln --> Literals
and