Machine Learning Techniques: Types of Machine Learning, Applications, and Mathematical Foundations of Machine Learning

Machine Learning Foundations: Overview; Design of a Learning System; Types of Machine Learning; Applications. Mathematical Foundations of Machine Learning: Random Variables and Probabilities; Probability Theory; Probability Distributions; Decision Theory; Bayes Decision Theory; Information Theory.


MACHINE LEARNING TECHNIQUES

Presented By
S. Vijayalakshmi, B.E.,
Assistant Professor,
Department of Computer Science,
Sri Sarada Niketan College for Women, Karur.
What is Machine Learning?

Definition: Machine learning is a field of artificial intelligence
(AI) that enables computers to learn from data without being
explicitly programmed.
Key Characteristics:
• Data-driven decision making.
• Use of algorithms to discover patterns in data.
• The system improves its performance over time.
Machine Learning Foundations
Components of Machine Learning:
• Data: Input that the model uses to learn.
• Model: The mathematical representation of the problem
(e.g., regression, decision trees, neural networks).
• Algorithm: The method used to train the model and
update its parameters.
• Loss Function: A measure of how well the model’s
predictions match the true data.
• Optimization: The process of adjusting the model's
parameters to minimize the loss function (or maximize an
objective).
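The five components above can be put together in a minimal sketch: a one-parameter linear model fit by gradient descent. The data, learning rate, and iteration count are illustrative assumptions, not from the slides.

```python
# Minimal sketch of the five components on a one-parameter model.
# Data: points generated from y = 2x; Model: y_hat = w * x;
# Loss: mean squared error; Algorithm/Optimization: gradient descent.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, target) pairs

def loss(w):
    """Mean squared error of predictions w*x against targets y."""
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def grad(w):
    """Derivative of the loss with respect to the parameter w."""
    return sum(2 * (w * x - y) * x for x, y in data) / len(data)

w = 0.0                      # initial parameter guess
for _ in range(100):         # optimization loop
    w -= 0.1 * grad(w)       # gradient-descent update

print(round(w, 3))           # converges toward the true slope 2.0
```

Each iteration moves w against the gradient of the loss, which is exactly the "optimization minimizes the loss" idea in the bullets above.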
Design of a Learning System
• Problem Definition: Define the problem and the goal (e.g.,
classification, regression).
• Data Collection: Gather and preprocess data (cleaning,
normalization, etc.).
• Model Selection: Choose an appropriate algorithm (e.g., linear
regression, decision tree, neural network).
• Training: Use the data to train the model and learn patterns.
• Evaluation: Assess model performance using metrics (e.g.,
accuracy, precision, recall).
• Deployment: Deploy the trained model into production for real-
world use.
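The six design steps can be traced end to end on a deliberately tiny toy problem. The data, the one-threshold "model", and the pass/fail framing are all invented for illustration.

```python
# Hedged sketch of the six design steps on a toy problem:
# classify exam scores as pass (1) or fail (0).

# 1. Problem definition: binary classification of a single feature.
# 2. Data collection (here: hard-coded and already cleaned).
train = [(35, 0), (42, 0), (48, 0), (55, 1), (63, 1), (70, 1)]
test  = [(40, 0), (60, 1)]

# 3. Model selection: a one-threshold classifier.
def predict(threshold, x):
    return 1 if x >= threshold else 0

# 4. Training: choose the threshold with the fewest training errors.
def errors(threshold, data):
    return sum(predict(threshold, x) != y for x, y in data)

best = min((x for x, _ in train), key=lambda t: errors(t, train))

# 5. Evaluation: accuracy on held-out test data.
accuracy = sum(predict(best, x) == y for x, y in test) / len(test)

# 6. Deployment: the trained model is just predict(...) plus best.
print(best, accuracy)
```

A real system would swap the threshold model for one of the algorithms named above, but the pipeline shape stays the same.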
Types of Machine Learning

Supervised Learning:
• The model is trained on labeled data (input-output
pairs).
• Examples: Linear regression, logistic regression,
support vector machines (SVMs).
• Applications: Spam email detection, image
classification.
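Supervised learning on labeled pairs can be sketched with logistic regression trained by stochastic gradient descent. The 1-D "spam-like" data (x = count of suspicious words, y = 1 for spam) is an assumed toy stand-in.

```python
import math

# Hedged sketch: logistic regression on labeled 1-D (input, label) pairs.
data = [(0.0, 0), (1.0, 0), (3.0, 1), (4.0, 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b = 0.0, 0.0
for _ in range(2000):                # epochs of stochastic gradient descent
    for x, y in data:
        p = sigmoid(w * x + b)       # predicted probability of class 1
        w -= 0.1 * (p - y) * x       # gradient of the log loss w.r.t. w
        b -= 0.1 * (p - y)           # ... and w.r.t. b

pred = [1 if sigmoid(w * x + b) >= 0.5 else 0 for x, _ in data]
print(pred)                          # matches the labels on this separable toy set
```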
Unsupervised Learning:

• The model is trained on unlabeled data and must find
patterns on its own.
• Examples: Clustering (e.g., k-means), dimensionality
reduction (e.g., PCA).
• Applications: Customer segmentation, anomaly
detection.
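The k-means example mentioned above can be sketched in a few lines on 1-D data: alternate between assigning points to their nearest center and moving each center to its cluster mean. The data and initialization are illustrative assumptions.

```python
# Hedged sketch of k-means clustering (k = 2) on 1-D data, stdlib only.
points = [1.0, 1.5, 2.0, 10.0, 10.5, 11.0]
centers = [points[0], points[-1]]          # simple initialization

for _ in range(10):
    # Assignment step: each point goes to its nearest center.
    clusters = [[], []]
    for p in points:
        idx = min(range(2), key=lambda i: abs(p - centers[i]))
        clusters[idx].append(p)
    # Update step: move each center to the mean of its cluster.
    centers = [sum(c) / len(c) for c in clusters]

print(sorted(round(c, 2) for c in centers))   # → [1.5, 10.5]
```

No labels are used anywhere: the two groups emerge from the data alone, which is the defining property of unsupervised learning.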
Reinforcement Learning:

• The model learns by interacting with an environment
and receiving feedback through rewards or penalties.
• Examples: Q-learning, deep Q-networks (DQN).
• Applications: Robotics, game playing (e.g.,
AlphaGo).
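Tabular Q-learning, the first example above, can be sketched on an assumed 4-state chain environment: the agent starts at state 0 and is rewarded only for reaching state 3. The environment, learning rate, and exploration rate are all toy choices.

```python
import random

# Hedged sketch of tabular Q-learning on a 4-state chain:
# states 0..3, actions 0 = left, 1 = right; reward 1 only on
# reaching state 3, which ends the episode.
random.seed(0)
Q = [[0.0, 0.0] for _ in range(4)]       # Q[state][action]
alpha, gamma, eps = 0.5, 0.9, 0.2

for _ in range(200):                     # training episodes
    s = 0
    while s != 3:
        # Epsilon-greedy action selection: explore with prob eps.
        if random.random() < eps:
            a = random.randrange(2)
        else:
            a = max(range(2), key=lambda a: Q[s][a])
        s2 = max(s - 1, 0) if a == 0 else s + 1
        r = 1.0 if s2 == 3 else 0.0
        # Q-learning update: bootstrap from the best next action.
        target = r + (0.0 if s2 == 3 else gamma * max(Q[s2]))
        Q[s][a] += alpha * (target - Q[s][a])
        s = s2

policy = [max(range(2), key=lambda a: Q[s][a]) for s in range(3)]
print(policy)    # greedy policy per state; moving right (action 1) is optimal
```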
Applications of Machine Learning

Healthcare:
• Disease diagnosis, medical image analysis, drug
discovery.
Finance:
• Fraud detection, stock market prediction, risk
management.
Marketing:
• Customer segmentation, recommendation systems,
personalized ads.
Applications of Machine Learning

Manufacturing:
• Predictive maintenance, quality control, supply chain
optimization.
Autonomous Systems:
• Self-driving cars, robotics, drones.
Mathematical Foundations of Machine Learning
• Random Variables: A variable whose value is subject to
randomness.
• Probability Theory: A framework for quantifying
uncertainty.
• Decision Theory: Provides a formal framework for
decision-making under uncertainty.
• Bayes Decision Theory: A probabilistic approach to
making decisions based on Bayes' Theorem.
• Information Theory: The study of the quantification of
information.
Random Variables and Probabilities

Random Variables:
• Discrete Random Variable: Takes on a finite or countable
set of values (e.g., coin flip outcomes).
• Continuous Random Variable: Takes on any value in a
range (e.g., temperature).
Probability:
• The likelihood of an event occurring, expressed as a
number between 0 and 1.
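A discrete random variable can be simulated directly: flipping a fair coin many times gives an empirical frequency close to the true probability 0.5. The seed and sample size are arbitrary choices for reproducibility.

```python
import random

# Hedged sketch: a discrete random variable (a fair coin flip),
# simulated with the stdlib; the empirical frequency of heads
# approximates the true probability P(heads) = 0.5.
random.seed(42)
flips = [random.choice(["H", "T"]) for _ in range(10_000)]
p_heads = flips.count("H") / len(flips)
print(round(p_heads, 2))
```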
Probability Distributions
• Definition: A probability distribution describes the likelihood of different
outcomes in a random experiment.
• Common Distributions:
• Normal Distribution: Bell-shaped curve, commonly used in statistics and
machine learning.
• Bernoulli Distribution: Models binary outcomes (e.g., success/failure).
• Poisson Distribution: Models the number of events in fixed intervals of time or
space.
• Binomial Distribution: Models the number of successes in a fixed number of
independent trials.
• Example:
• Normal Distribution: Used for modeling errors in regression models.
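Three of the distributions above can be evaluated numerically with only the standard library; the specific parameter values are illustrative.

```python
import math

# Hedged sketch: evaluating three common distributions with stdlib math.

def binomial_pmf(k, n, p):
    """P(k successes in n independent trials, success prob p)."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    """P(k events in a fixed interval with rate lam)."""
    return lam**k * math.exp(-lam) / math.factorial(k)

def normal_pdf(x, mu, sigma):
    """Density of the bell-shaped normal distribution."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

print(round(binomial_pmf(2, 4, 0.5), 4))   # C(4,2) * 0.5^4 = 0.375
print(round(poisson_pmf(0, 1.0), 4))       # e^-1 ≈ 0.3679
print(round(normal_pdf(0, 0, 1), 4))       # standard-normal peak ≈ 0.3989
```

The Bernoulli distribution is the n = 1 special case of the binomial, so `binomial_pmf(k, 1, p)` covers it as well.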
Decision Theory
• Definition: A framework for making decisions under
uncertainty, considering both the outcomes and the
probabilities of various decisions.
• Key Concepts:
• Expected Utility: The probability-weighted average of the
utilities of a decision's possible outcomes.
• Risk Aversion: A preference for certain outcomes over
uncertain ones with equal expected utility.
• Loss Functions: Functions that measure the cost
associated with making a wrong decision.
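Expected utility can be made concrete with a toy choice between a certain payoff and a risky one; the probabilities and payoffs below are invented for illustration.

```python
# Hedged toy example of expected utility: each action maps to a list
# of (probability, utility) pairs for its possible outcomes.
outcomes = {
    "safe":  [(1.0, 50.0)],
    "risky": [(0.5, 120.0), (0.5, -10.0)],
}

def expected_utility(action):
    """Probability-weighted average utility of an action's outcomes."""
    return sum(p * u for p, u in outcomes[action])

best = max(outcomes, key=expected_utility)
print(best, expected_utility(best))   # risky: EU = 0.5*120 + 0.5*(-10) = 55 > 50
```

A risk-averse decision maker, as defined above, might still prefer "safe" despite its lower expected utility; that preference is usually modeled by applying a concave utility function to the payoffs first.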
Information Theory
• Definition: Information theory deals with quantifying,
storing, and communicating information.
• Key Concepts:
• Entropy: Measures the uncertainty in a random variable.
• Mutual Information: Measures the amount of information
one random variable contains about another.
• Kullback-Leibler Divergence: Measures the difference
between two probability distributions.
• Applications: Data compression (e.g., Huffman coding),
feature selection in machine learning.
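Entropy and KL divergence for discrete distributions follow directly from their definitions; using log base 2 gives results in bits. The two example distributions are arbitrary.

```python
import math

# Hedged sketch: entropy and KL divergence of discrete distributions,
# with log base 2 so the units are bits.

def entropy(p):
    """H(p) = -sum p_i log2 p_i, the uncertainty in p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """D(p || q) = sum p_i log2(p_i / q_i), difference between p and q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

fair   = [0.5, 0.5]
biased = [0.9, 0.1]

print(entropy(fair))                            # 1.0 bit: maximal uncertainty
print(round(entropy(biased), 3))                # less uncertain than a fair coin
print(round(kl_divergence(biased, fair), 3))    # > 0 since the two differ
```

As the slide notes, the fair coin has the highest entropy a binary variable can have (1 bit), while the biased coin is more predictable and hence carries less uncertainty.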
THANK YOU
