Neural Networks
and
Deep Learning
Ahmad Kalhor
Associate Professor
School of Electrical and Computer Engineering –
University of Tehran
Fall 2022
Contents
* A few (optional) mini-projects are designed for extra work (bonus points)
Introduction
• Neural Networks: a set of simple units (neurons) wired together in layers, in order to
produce desired outputs in response to stimulating inputs.
[Figure: a biological neuron, with inputs arriving at the dendrites and outputs sent along the axon]
(1) Dendrites
to receive the weighted electrochemical signals from adjacent neurons
(2) Cell body (soma)
to sum the received signals
(3) Nucleus
to generate an impulsive signal by comparing the absorbed signal with a threshold
(4) Axon
to send the generated signal to other adjacent neurons
(5) Synaptic gaps
to assign a weight to each signal sent to adjacent neurons
[Figure: a three-input threshold neuron; inputs 𝑥1, 𝑥2, 𝑥3 reach the soma through weights 𝑤1, 𝑤2, 𝑤3, and the nucleus compares the sum with threshold 𝜃]

𝑧 = 𝑤1𝑥1 + 𝑤2𝑥2 + 𝑤3𝑥3

𝑦 = 1 if 𝑧 > 𝜃, 0 otherwise
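The threshold neuron above can be sketched in a few lines of Python. The weights and the threshold value in the example call are illustrative choices, not values from the slides.

```python
# A minimal sketch of the three-input threshold neuron: fire (output 1)
# when the weighted sum of the inputs exceeds the threshold, as in
# z = w1*x1 + w2*x2 + w3*x3 and y = 1 if z > theta else 0.

def threshold_neuron(x, w, theta):
    """Return 1 if the weighted input sum exceeds theta, else 0."""
    z = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if z > theta else 0

# Example (illustrative values): with unit weights and theta = 1, the
# neuron fires only when at least two of three binary inputs are active.
print(threshold_neuron([1, 1, 0], [1, 1, 1], 1))  # fires: 1
print(threshold_neuron([1, 0, 0], [1, 1, 1], 1))  # stays silent: 0
```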
1. Sensory neurons
Get information about what's going on inside and outside of the body and bring that information into
the central nervous system (CNS) so it can be processed.
2. Interneurons
which are found only in the CNS, connect one neuron to another. Most interneurons are in the brain.
There are about 100 billion neurons in the brain.
There are about 10^15 connections among neurons (about 10,000 connections per neuron on average).
3. Motor neurons
get information from other neurons and convey commands to your muscles, organs and glands.
(1) Classification
Localization, detection, and classification of different objects, faces, voices, smells, and
approximation and prediction of different physical variables: distances, temperatures, smoothness,
brightness, and so on…
(2) Memory
Capability to create memories of different events with long- and short-term dependencies.
Capability to associate different sequenced patterns with one another.
(3) Complex and difficult tasks/actions
Car driving and parking, swimming, playing music, …
(4) Computational intelligence
Logic, mathematics, Inference
1. The learning process in the brain is mainly performed by tuning the synaptic gaps: information
gathered from the environment is encoded in the synaptic gaps.
2. The communication speed (electrochemical signal transmission) of each neuron is low, but
since neurons communicate in parallel, the overall processing speed of the brain is high.
3. The learning process in the brain is not disturbed if some parts of the brain are damaged,
so natural NNs are highly robust (fault tolerant).
4. Due to the high level of abstraction and inference in the brain, its generalization in learning is high.
[Figure: a block 𝑓(∙) mapping input 𝑥 to output 𝑓(𝑥)]
Learning methods
• Error back-propagation (steepest descent) with losses such as MSE, MAE, cross-entropy, …;
applied batch-wise, stochastically point-wise, or on stochastic mini-batches.
• Gradient-based (local search) descent optimization algorithms:
SGD / SGD+Momentum / Nesterov accelerated gradient / Adagrad / RMSprop / Adam / AdaMax / Nadam / AMSGrad
• Evolutionary (global search): Genetic Algorithm, Particle Swarm Optimization, Ant Colony
• Intelligence-based (global search): Simplex, Simulated Annealing
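As a sketch of the gradient-based updates listed above, here is SGD with momentum minimizing a toy one-dimensional quadratic loss. The loss function, learning rate, and momentum coefficient are illustrative choices, not values from the slides.

```python
# A minimal sketch of the SGD+Momentum update on L(w) = (w - 3)^2.
# The velocity term accumulates past gradients, smoothing the descent.

def grad(w):
    """Gradient of the toy loss L(w) = (w - 3)^2."""
    return 2.0 * (w - 3.0)

def sgd_momentum(w0, lr=0.1, beta=0.9, steps=200):
    w, v = w0, 0.0
    for _ in range(steps):
        v = beta * v - lr * grad(w)  # update velocity with the new gradient
        w = w + v                    # move the parameter along the velocity
    return w

print(sgd_momentum(0.0))  # converges near the minimizer w = 3
```

Plain SGD is the special case beta = 0; Nesterov, Adam, and the other variants listed above modify how the gradient and velocity are combined.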
2. Unsupervised Learning
• Unsupervised learning is the training of ANNs using information that is neither classified nor labeled,
allowing the algorithm to act on that information without guidance.
• Instead of explicit targets for the network, some statistical or geometric properties are specified
for a suitable network output.
• Some examples of unsupervised learning algorithms include K-Means Clustering, Principal
Component Analysis, and Hierarchical Clustering.
• Causal relationships in regression problems.
• Applications: pattern generation / pattern clustering / pattern sorting / optimization
problems / control tasks
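K-Means, one of the unsupervised methods named above, can be sketched on one-dimensional data in pure Python. The data points, initial centers, and k = 2 below are illustrative.

```python
# A minimal sketch of K-Means clustering on 1-D data: alternate an
# assignment step (each point joins its nearest center) with an update
# step (each center moves to the mean of its cluster).

def kmeans_1d(points, centers, iters=20):
    for _ in range(iters):
        # Assignment step: group each point with its nearest center.
        clusters = [[] for _ in centers]
        for p in points:
            idx = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[idx].append(p)
        # Update step: move each center to the mean of its cluster
        # (an empty cluster keeps its previous center).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

data = [1.0, 1.2, 0.8, 9.0, 9.5, 8.5]
print(kmeans_1d(data, centers=[0.0, 5.0]))  # centers settle near 1.0 and 9.0
```

No labels are used anywhere: the grouping emerges from the geometric structure of the data, which is exactly the "statistical or geometric properties" criterion described above.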
Threshold neurons can realize basic logic gates (𝜃: threshold):
• AND
• OR
• AND-NOT
• XOR (𝜽 = 𝟐): not realizable by a single threshold neuron, since XOR is not linearly
separable; it requires a two-layer combination of the gates above.
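The gates above can be built from a single threshold-neuron helper. The weights and thresholds below are standard textbook choices (using the common z ≥ 𝜃 firing convention), not values taken from the slides.

```python
# A minimal sketch of logic gates built from threshold neurons.
# AND and OR need one neuron each; XOR needs two layers because it is
# not linearly separable.

def neuron(x, w, theta):
    """Fire when the weighted input sum reaches the threshold."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0

def AND(a, b):     return neuron([a, b], [1, 1], 2)    # theta = 2
def OR(a, b):      return neuron([a, b], [1, 1], 1)    # theta = 1
def AND_NOT(a, b): return neuron([a, b], [1, -1], 1)   # a AND (NOT b)

def XOR(a, b):
    # Two-layer construction: XOR(a, b) = (a AND NOT b) OR (b AND NOT a)
    return OR(AND_NOT(a, b), AND_NOT(b, a))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", XOR(a, b))
```

The XOR function prints 1 only when exactly one input is active, which no single linear threshold unit can achieve.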
Thank you