This document provides an overview of various machine learning models and techniques, including:

- Supervised vs unsupervised learning and their key differences
- Linear regression and logistic regression for prediction and classification
- How naive Bayes classification works using Bayes' theorem
- Neural networks, which use layers of neurons to learn complex patterns
- Ensemble methods like random forests and boosted trees, which combine multiple decision trees
- K-nearest neighbors classification based on similarity to known examples

It covers the basic concepts, applications, assumptions and workings of these common machine learning algorithms.


Machine Learning Model - Basics/Intermediate Cheat Sheet

by spriiprad via cheatography.com/122548/cs/22783/

Supervised vs Unsupervised Learning

- Supervised: used in classification and prediction; the value of the outcome must be known; learns from training data and is applied to validation data.
- Unsupervised: used in dimension reduction and clustering; there is no outcome variable to predict or classify; no learning from labeled outcomes.

[Figure: How Supervised Learning Looks]
[Figure: How Unsupervised Learning Looks]
[Figure: Supervised vs Unsupervised TLDR]

1. Linear Regression

Type of Response: Continuous

- Simple Regression: one independent variable used; only one dependent variable.
- Multiple Regression: multiple independent variables used; only one dependent variable.
- Relationships that are significant when using simple linear regression may no longer be significant when using multiple linear regression, and vice versa.
- Insignificant relationships in simple linear regression may become significant in multiple linear regression.
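The sheet itself has no code; as a minimal sketch of the simple-vs-multiple distinction, assuming scikit-learn and synthetic data (both are illustration choices, not from the original):

```python
# Illustrative sketch: the same predictor can look different in simple vs
# multiple linear regression, since coefficients are estimated jointly.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                    # two candidate predictors
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=100)

simple = LinearRegression().fit(X[:, [0]], y)    # one independent variable
multiple = LinearRegression().fit(X, y)          # multiple independent variables

print(simple.coef_)    # coefficient from the simple fit
print(multiple.coef_)  # coefficients can shift once other predictors enter
```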
2. Logistic Regression

[Figure: How Logistic Regression Works]

Type of Response: Categorical

- It can be used for explanatory tasks (= profiling) or predictive tasks (= classification).
- The predictors are related to the response Y via a nonlinear function called the logit.
- Reducing predictors can be done via variable selection.

Types:
1. Binary Logistic Regression: two categories. Example: spam or not.
2. Multinomial Logistic Regression: three or more categories. Example: veg, non-veg, vegan.
3. Ordinal Logistic Regression: three or more ordered categories. Example: movie rating from 1 to 5.
3. Naive Bayes Classifier

[Figure: How Naive Bayes Works]

Type of Response: Categorical

- A probabilistic machine learning model that's used for classification tasks.
- The heart of the classifier is based on the Bayes theorem, which provides a way of relating the likelihood of some outcome given some informative prior information.
- We can find the probability of A happening given that B has occurred, where B is the evidence and A is the hypothesis. "Naive" refers to the assumption that the presence of one particular feature does not affect the others.

Bayes Theorem Probability Formula: P(A|B) = P(B|A) * P(A) / P(B)

- Naive Bayes works well when there is a large number of predictor variables. It also works when there are missing values.
- The probability estimates are not very accurate, but the classifications or predictions are generally accurate.

Assumptions:
1. Predictors/features work independently on the target variable.
2. All the predictors have an equal effect on the outcome.
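A minimal sketch of a naive Bayes classifier; GaussianNB and the synthetic data are assumptions made for illustration, not part of the sheet:

```python
# Illustrative sketch: Gaussian naive Bayes treats each feature as independent
# given the class, per the assumptions listed above.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))      # handles a large number of predictors well
y = (X[:, 0] > 0).astype(int)

nb = GaussianNB().fit(X, y)
print(nb.predict(X[:3]))           # classifications are generally accurate
print(nb.predict_proba(X[:3]))     # the raw probability estimates less so
```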


4. Neural Networks

[Figure: How Neural Net Works]

Type of Response: Both categorical and continuous (particularly useful).

- Learns complex patterns using layers of neurons which mathematically transform the data.
- The layers between the input and output are referred to as "hidden layers".
- Learns relationships between the features that other algorithms cannot easily discover.

Architecture of Neural Network:
- Input Layer: nodes (variables) with information from the external environment.
- Output Layer: nodes (variables) that send information to the external environment or to another element in the network.
- Hidden Layer: nodes that only communicate with other layers of the network and are not visible to the external environment.
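A minimal sketch of a network with one hidden layer between input and output; MLPClassifier and the toy nonlinear target are illustration choices:

```python
# Illustrative sketch: a neural net learning a pattern (a product of features)
# that linear models cannot easily discover.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = (X[:, 0] * X[:, 1] > 0).astype(int)      # nonlinear interaction target

# input layer -> one hidden layer of 16 neurons -> output layer
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X, y)
print(net.score(X, y))                       # training accuracy on the toy data
```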
5. Decision Trees

[Figure: How Decision Trees Work]

The decision tree is produced by successively cutting the data set into smaller and smaller chunks, which are increasingly "pure" in terms of the value of the target variable.

[Figure: Different Types of Trees]
[Figure: How Ensemble Model Works]

Random Forest (Ensemble Method):
- Consists of a large number of individual decision trees that operate as an ensemble.
- Each individual tree in the random forest spits out a class prediction, and the class with the most votes becomes our model's prediction.
- The predictions (and therefore the errors) made by the individual trees need to have low correlations with each other.
- Random forests train each tree independently, using a random sample of the data.

Boosted Trees (Ensemble Method):
- Boosting is a method of converting weak learners into strong learners.
- Boosted trees is the process of building a large, additive tree by fitting a sequence of smaller trees.
- In boosting, each new tree is fit on a modified version of the original data set.
- GBTs (gradient boosted trees) train one tree at a time, where each new tree helps to correct errors made by previously trained trees.
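A minimal sketch contrasting the two ensemble methods; the scikit-learn estimators and toy data are assumptions, not from the sheet:

```python
# Illustrative sketch: random forest (independent trees on random samples,
# majority vote) vs gradient boosted trees (sequential error-correcting trees).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(int)

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
gbt = GradientBoostingClassifier(n_estimators=100, random_state=0).fit(X, y)
print(rf.score(X, y), gbt.score(X, y))   # both combine many small trees
```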
6. K-Nearest Neighbors

[Figure: How KNN Works]

Type of Response: Both categorical and continuous.

- KNN is a method for classifying objects based on their similarity to data with known classifications.
- K-Nearest Neighbors (KNN) makes a prediction for a new observation by searching for the most similar training observations and pooling their values (usually done by taking the mean/average).
- The training set has to be very large for this to work effectively.
- Redundant and/or irrelevant variables can distort the classification results, and the method is sensitive to noise in the data.
- Nominal variables pose problems for measuring distance.
- It is a non-parametric model: it does not require distribution assumptions regarding the variables and does not make statistical inferences to a population.
- KNN is an example of a family of algorithms known as instance-based or memory-based learning, which classify new objects by their similarity to previously known objects.
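A minimal sketch of KNN; the choice of k=5 and the toy data are arbitrary illustration choices:

```python
# Illustrative sketch: KNN classifies a new point by a vote among the k most
# similar training observations (instance-based learning).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 1).astype(int)     # inside vs outside a circle

knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)   # "fit" just stores the data
print(knn.predict([[0.1, 0.2], [2.0, 2.0]]))          # votes among 5 nearest points
```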

By spriiprad, cheatography.com/spriiprad/. Last updated 15th May, 2020.
