
Presented by:

1. Adarsh Khare
7. Tanvi Baraskar
14. Shrinidhi Deshmane
22. Tanay Jain
Content
1. Introduction
2. Flowchart of Procedure
3. Working
4. PSO in Machine Learning
5. Application
6. Advantages
7. Limitations
8. Conclusion
Introduction
History:
Particle Swarm Optimization (PSO) is a nature-inspired optimization algorithm. It was introduced by James Kennedy and Russell Eberhart in 1995.

Definition:
PSO is a computational method inspired by the social behavior of bird flocking or fish schooling. It is used for solving optimization problems.

Purpose:
To find optimal or near-optimal solutions by simulating a swarm of particles exploring the search space.

Concepts: Swarm intelligence, optimization, iterative improvement
Key Concepts
Particles: Each potential solution is called a "particle." These particles explore the search space to find the optimal solution.

Swarm: A group of particles that move collectively in the search space forms a "swarm."

Search Space: The search space in optimization refers to the range of possible values that the algorithm explores to find the best solution.

Fitness Function: Each particle has a fitness value, determined by the function being optimized (called the
objective function). This value indicates how good the particle's position is.

Position and Velocity: Each particle has a position and velocity. The position represents a candidate solution, while
the velocity determines how the position changes in the next iteration.
Flowchart
Working of PSO
•Initialization: Each particle starts with a position and velocity; its personal best position (pbest) and the swarm's global best position (gbest) are tracked.
•Evaluation: Calculate each particle's fitness value.
•Update: Update each particle's velocity and position using the PSO update equations (see the sketch below).
•Termination: If the maximum number of iterations is reached, stop; otherwise, repeat the evaluation and update loop.
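A minimal sketch of this loop, using the standard update equations v ← w·v + c1·r1·(pbest − x) + c2·r2·(gbest − x) and x ← x + v. The helper name pso_minimize, the parameter values (w = 0.7, c1 = c2 = 1.5), and the default bounds are illustrative assumptions, not taken from the slides; the later examples reuse this helper.

```python
import numpy as np

def pso_minimize(objective, dim, n_particles=30, n_iter=100,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal PSO loop: initialize, evaluate, then update velocity and position."""
    lo, hi = bounds
    x = np.random.uniform(lo, hi, (n_particles, dim))   # particle positions
    v = np.zeros((n_particles, dim))                     # particle velocities
    pbest = x.copy()                                     # personal best positions
    pbest_val = np.array([objective(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()           # global best position

    for _ in range(n_iter):
        r1 = np.random.rand(n_particles, dim)
        r2 = np.random.rand(n_particles, dim)
        # Velocity update: inertia + cognitive (toward pbest) + social (toward gbest)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                       # position update within bounds
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]                    # update personal bests
        pbest_val[improved] = vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()       # update global best
    return gbest, pbest_val.min()

# Example: minimize the sphere function f(p) = sum(p^2) in 3 dimensions
best_x, best_f = pso_minimize(lambda p: np.sum(p**2), dim=3)
print(best_x, best_f)
```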
PSO in Machine Learning
1. Neural Network Training: PSO can be used to optimize the weights and biases of neural networks, improving their performance.
2. Feature Selection: Apply PSO to select relevant features from datasets, improving model performance.
3. Hyperparameter Tuning: Use PSO to optimize the hyperparameters of machine learning models (e.g., learning rate, number of hidden layers), enhancing model parameters and architecture for better accuracy and efficiency.
4. Clustering and Classification: PSO can be used to optimize the parameters of clustering and classification algorithms.
1. Neural Network Training
Neural networks are powerful machine learning models that can learn complex patterns
in data. When combined with techniques like Particle Swarm Optimization (PSO) and
Artificial Intelligence and Machine Learning (AIML), neural networks become even more
versatile and effective.
Fundamentals of Neural Networks
1)Layers
Neural networks consist of interconnected layers of nodes, with an input layer, hidden
layers, and an output layer.
2)Activation Functions
Activation functions determine how the weighted inputs are transformed into outputs for
each node.
3)Training
Neural networks are trained by adjusting the connection weights to minimize the error
between predicted and actual outputs.
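As an illustration of the idea above, the sketch below flattens a tiny 2-4-1 network's weights and biases into a single particle vector and lets PSO minimize the mean squared error on a toy XOR dataset. The network size, data, and activation choices are assumptions for illustration, and it reuses the hypothetical pso_minimize helper sketched earlier.

```python
import numpy as np

# Assumes the pso_minimize helper sketched on the "Working of PSO" slide.
# Toy XOR dataset for a tiny 2-4-1 network (17 weights and biases in total).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def unpack(theta):
    """Split one flat particle vector into the network's weights and biases."""
    W1, b1 = theta[:8].reshape(2, 4), theta[8:12]
    W2, b2 = theta[12:16].reshape(4, 1), theta[16:17]
    return W1, b1, W2, b2

def mse_loss(theta):
    """Fitness function: mean squared error of the network encoded by theta."""
    W1, b1, W2, b2 = unpack(theta)
    hidden = np.tanh(X @ W1 + b1)                    # hidden layer, tanh activation
    out = 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))  # sigmoid output layer
    return np.mean((out - y) ** 2)

best_theta, best_loss = pso_minimize(mse_loss, dim=17, n_particles=40, n_iter=300)
print("final training MSE:", best_loss)
```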
2. Feature Selection
Feature selection is the process of selecting the most informative and influential variables from a
dataset to improve model performance.

Feature Selection Strategies


1)Filter Methods
Evaluate features independently based on statistical measures, such as correlation or mutual
information.
2)Wrapper Methods
Use a machine learning model as a black box to evaluate feature subsets and guide the selection
process.
3)Embedded Methods
Perform feature selection as part of the model training process, such as with regularization
techniques.
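A hedged, wrapper-style sketch of PSO-based feature selection: each particle holds one value per feature in [0, 1], values above 0.5 select that feature, and the fitness is the negated cross-validation accuracy of a classifier trained on the selected subset. The dataset, classifier, and 0.5 threshold are illustrative assumptions, and the pso_minimize helper from the earlier sketch is reused.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Assumes the pso_minimize helper sketched on the "Working of PSO" slide.
X, y = load_breast_cancer(return_X_y=True)

def feature_subset_cost(particle):
    """Wrapper-style fitness: negated CV accuracy of the selected feature subset."""
    mask = particle > 0.5                      # threshold the continuous particle
    if not mask.any():                         # penalize empty feature subsets
        return 1.0
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    return -cross_val_score(clf, X[:, mask], y, cv=3).mean()

best, cost = pso_minimize(feature_subset_cost, dim=X.shape[1],
                          n_particles=15, n_iter=20, bounds=(0.0, 1.0))
print("selected features:", np.where(best > 0.5)[0], "CV accuracy:", -cost)
```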
3. Hyperparameter Tuning
Particle Swarm Optimization (PSO), being a global optimization technique, can be effectively applied to hyperparameter tuning to find optimal combinations of hyperparameters.

Search Space Definition: Identify hyperparameters (e.g., C, gamma) and their value ranges for PSO optimization.
Fitness Function: Evaluate each particle's hyperparameter set
using a performance metric like accuracy or cross-validation
score.
PSO Execution: Initialize particles randomly, updating their velocity
and position based on personal and global best solutions.
Update Velocity and Position: Adjust each particle's velocity and
position using its best experience and the swarm's global best
within defined bounds.
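Following the steps above, the sketch below tunes the C and gamma mentioned in the slide for an SVM classifier, encoding both on a log10 scale so particles can search across orders of magnitude. The dataset, the [-3, 3] search range, and the reuse of the earlier pso_minimize helper are assumptions for illustration.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Assumes the pso_minimize helper sketched on the "Working of PSO" slide.
X, y = load_iris(return_X_y=True)

def svm_cost(particle):
    """Fitness: negated 5-fold cross-validation accuracy for one (C, gamma) pair."""
    C, gamma = 10.0 ** particle[0], 10.0 ** particle[1]   # decode from log10 scale
    return -cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()

# Particles search log10(C) and log10(gamma) in [-3, 3]
best, cost = pso_minimize(svm_cost, dim=2, n_particles=15, n_iter=30,
                          bounds=(-3.0, 3.0))
print("C =", 10.0 ** best[0], "gamma =", 10.0 ** best[1], "CV accuracy =", -cost)
```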
4. Clustering and Classification

Clustering:
Search Space: Particles represent centroids of clusters.
Fitness Function: Metrics like Silhouette score or Inertia.
Goal: Minimize intra-cluster variance, maximize inter-cluster separation.
Classification:
Search Space: Particles represent hyperparameter sets (e.g., learning rate,
regularization).
Fitness Function: Metrics like accuracy, F1 score, or AUC.
Goal: Maximize model performance (accuracy, cross-validation score).
PSO Workflow for Clustering &
Classification
Initialize Particles:
Randomly generate particle positions representing centroids (clustering) or hyperparameters (classification).
Evaluate Fitness:
Clustering: Evaluate clustering quality using metrics like Silhouette score.
Classification: Evaluate model performance using accuracy, F1 score, etc.
Update Particles:
Adjust particle velocity and position based on the personal best and global best solutions.
Iterate and Converge:
Particles move toward the optimal solution:
Clustering: Converge on the best cluster centroids.
Classification: Find the best hyperparameter combination for maximum model performance.

Visualization:
Clustering: Particles adjust centroids to optimize clustering quality.
Classification: Particles explore hyperparameter combinations to enhance accuracy.
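A minimal sketch of the clustering side of this workflow: each particle concatenates k candidate centroids, points are assigned to their nearest centroid, and the fitness is the negated silhouette score, so the swarm converges toward centroids that maximize clustering quality. The synthetic blob data, k = 3, and the reuse of the earlier pso_minimize helper are assumptions.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Assumes the pso_minimize helper sketched on the "Working of PSO" slide.
X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
k, d = 3, X.shape[1]

def clustering_cost(particle):
    """Fitness: negated silhouette score of the labels induced by the centroids."""
    centroids = particle.reshape(k, d)
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)              # assign each point to its nearest centroid
    if len(np.unique(labels)) < 2:             # degenerate solution: one big cluster
        return 1.0
    return -silhouette_score(X, labels)

best, cost = pso_minimize(clustering_cost, dim=k * d, n_particles=30, n_iter=60,
                          bounds=(float(X.min()), float(X.max())))
print("best centroids:\n", best.reshape(k, d), "\nsilhouette score:", -cost)
```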
Advantages
•Simple to implement and understand.
•Few parameters to adjust.
•Effective for continuous and discrete optimization.
•Can escape local optima and explore large search spaces.
Limitations
•Local Optima: Tends to converge to local optima in complex search spaces.
•Parameter Sensitivity: Performance is highly sensitive to the choice of parameters (inertia weight, cognitive and social components).
•Diversity Loss: The swarm can lose diversity, leading to premature convergence and limited exploration.
•Scaling Issues: Struggles with high-dimensional problems due to the exponential growth of the search space.
Conclusions

In conclusion, Particle Swarm Optimization (PSO) effectively addresses complex optimization problems through its swarm-based search strategy. Its simplicity and efficiency make it a valuable tool for hyperparameter tuning, feature selection, and various real-world applications. PSO's adaptability and effectiveness ensure its continued relevance and potential for future advancements.

References
•Research papers
Kennedy, J., & Eberhart, R. (1995). "Particle Swarm Optimization." Proceedings of the IEEE International
Conference on Neural Networks, 1942-1948.
Shi, Y., & Eberhart, R. C. (1998). "Parameter Selection in Particle Swarm Optimization." Evolutionary
Computation, 1(1), 101-108.
Clerc, M., & Kennedy, J. (2002). "The Particle Swarm – Explosion, Stability, and Convergence in a
Multidimensional Complex Space." IEEE Transactions on Evolutionary Computation, 6(1), 58-73.
Van den Bergh, F., & Engelbrecht, A. P. (2004). "A Cooperative Approach to Particle Swarm Optimization."
IEEE Transactions on Evolutionary Computation, 8(3), 225-239.
Poli, R., Langdon, W. B., & McPhee, N. F. (2008). A Field Guide to Genetic Programming. Lulu.com
(Includes a section on PSO).
•Flowchart
Particle Swarm Optimization (PSO) flowchart | Download Scientific Diagram (researchgate.net)
•YouTube
https://www.youtube.com/watch?v=hoDui7kazU8&list=PLf9D4jRiXaaPipeLZX40wRLUB4Rn0Bvmq
Thank you
