Submitted By: Alka Gupta 2011emcs01
Supervised
Show the computer something and tell it what it is. Classification occurs partly before system training, to select the training data. Additional classification and verification occur after system training, performed by a human problem-domain expert.
Unsupervised
Show the computer something and let it sense what it is. Classification occurs after system training and is performed by a problem-domain expert, usually by identifying the data drawn to a cluster center.
The Self-Organizing Map (SOM) was introduced by Teuvo Kohonen in the 1980s.
It is a type of artificial neural network that is trained using unsupervised learning to produce a low-dimensional (typically 2-D), discretized representation of the input space of the training samples, called a map.
SOMs differ from other ANNs in that they use a neighbourhood function to preserve the topological properties of the input space.
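A commonly used choice is a Gaussian neighbourhood centred on the winning neuron. The sketch below, in Python/NumPy, shows one way to compute it over a 2-D grid; the function and parameter names are illustrative assumptions, not taken from the original.

import numpy as np

def gaussian_neighbourhood(grid_shape, winner, sigma):
    # Gaussian neighbourhood centred on the winning neuron.
    # grid_shape: (rows, cols) of the 2-D map
    # winner:     (row, col) index of the winning neuron
    # sigma:      current neighbourhood radius
    rows, cols = np.indices(grid_shape)
    # Squared grid distance of every neuron from the winner
    dist_sq = (rows - winner[0]) ** 2 + (cols - winner[1]) ** 2
    # Values in (0, 1]: 1 at the winner, decaying with grid distance
    return np.exp(-dist_sq / (2.0 * sigma ** 2))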
SOMs are useful for visualizing low-dimensional views of high-dimensional data.
The self-organizing process involves three stages:
Competition
Cooperation
Synaptic adaptation
Let the input vector be X = [X1, X2, ..., Xm] and let the synaptic weight vector of neuron j be Wj = [Wj1, Wj2, ..., Wjm], j = 1, 2, ..., l.
Competition aims at finding the best match of the input vector X with the synaptic weight vectors Wj, i.e. minimising the Euclidean distance between the vectors X and Wj. Let i(X) be the index of the neuron that best matches X; then
i(X) = arg min_j ||X - Wj||, j = 1, 2, ..., l
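A minimal sketch of this competition step, assuming the l weight vectors are stored as the rows of a NumPy array W (the names below are illustrative, not from the original):

import numpy as np

def best_matching_unit(x, W):
    # Return i(X): the index of the neuron whose weight vector is
    # closest to the input x in Euclidean distance.
    # x: input vector of shape (m,)
    # W: weight matrix of shape (l, m), one row per neuron
    distances = np.linalg.norm(W - x, axis=1)
    return int(np.argmin(distances))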
Make a two-dimensional array, or map, and randomize its weights. Present training data to the map and let the cells on the map compete to win in some way; Euclidean distance is usually used. Stimulate the winner and some of its neighbours. Repeat this many times. The result is a two-dimensional weight map.
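Putting these steps together, the following is a minimal training-loop sketch in Python/NumPy. All names, the example data, and the exponential decay of the learning rate and radius are illustrative assumptions, not the author's implementation.

import numpy as np

def train_som(data, rows=10, cols=10, n_iters=1000, lr0=0.5, sigma0=3.0):
    # Train a rows x cols SOM on data of shape (n_samples, m).
    # Returns the weight map of shape (rows, cols, m).
    rng = np.random.default_rng(0)
    m = data.shape[1]
    W = rng.random((rows, cols, m))          # randomize the map
    grid_r, grid_c = np.indices((rows, cols))

    for t in range(n_iters):
        # Shrink the learning rate and neighbourhood radius over time
        lr = lr0 * np.exp(-t / n_iters)
        sigma = sigma0 * np.exp(-t / n_iters)

        # Present a randomly chosen training sample
        x = data[rng.integers(len(data))]

        # Competition: winner = cell with the smallest Euclidean distance
        dist = np.linalg.norm(W - x, axis=2)
        win_r, win_c = np.unravel_index(np.argmin(dist), dist.shape)

        # Cooperation + adaptation: pull the winner and its
        # neighbourhood towards the sample
        grid_dist_sq = (grid_r - win_r) ** 2 + (grid_c - win_c) ** 2
        h = np.exp(-grid_dist_sq / (2.0 * sigma ** 2))
        W += lr * h[:, :, np.newaxis] * (x - W)

    return W

# Example: train on 500 random 3-D points (e.g. RGB colours)
weights = train_som(np.random.default_rng(1).random((500, 3)))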
[Figure: one-dimensional lattice of neurons i, completely interconnected with the input nodes]
For an n-dimensional input space and m output neurons:
(1) Choose a random weight vector wi for every neuron i, i = 1, ..., m.
(2) Choose a random input x.
(3) Determine the winner neuron k: ||wk - x|| = min_i ||wi - x|| (Euclidean distance).
(4) Update the weight vectors of all neurons i in the neighborhood of neuron k: wi := wi + η·φ(i, k)·(x - wi) (wi is shifted towards x).
(5) If the convergence criterion is met, stop. Otherwise, narrow the neighborhood function φ and the learning parameter η and go to (2).
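Step (5) calls for narrowing the neighborhood and the learning parameter over time. A common choice, assumed here rather than specified in the original, is exponential decay:

import numpy as np

def decay_schedules(t, n_iters, eta0=0.5, sigma0=3.0):
    # Exponentially narrow the learning parameter (eta) and the
    # neighbourhood radius (sigma) as training proceeds.
    eta = eta0 * np.exp(-t / n_iters)
    sigma = sigma0 * np.exp(-t / n_iters)
    return eta, sigma

# Example: schedules at the start, middle, and end of 1000 iterations
for t in (0, 500, 999):
    print(t, decay_schedules(t, 1000))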
There are two ways to interpret a SOM. First, because in the training phase the weights of the whole neighbourhood are moved in the same direction, similar items tend to excite adjacent neurons. Therefore, the SOM forms a semantic map where similar samples are mapped close together and dissimilar ones apart.
Second, more neurons point to regions with a high density of training samples, and fewer where the samples are sparse.
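As a hypothetical illustration of the first, semantic-map interpretation, each sample can be projected onto the grid coordinates of its best-matching neuron; on a trained map, similar samples land on the same or nearby cells. The dummy random weight map below only stands in to show the mechanics; none of these names come from the original.

import numpy as np

def project_to_map(samples, weights):
    # Map each sample to the (row, col) of its best-matching neuron.
    coords = []
    for x in samples:
        dist = np.linalg.norm(weights - x, axis=2)
        coords.append(np.unravel_index(np.argmin(dist), dist.shape))
    return coords

# Dummy 10x10 map of 3-D weight vectors (a trained map would be used
# in practice); two near-identical samples should land on the same or
# adjacent cells.
rng = np.random.default_rng(2)
weights = rng.random((10, 10, 3))
print(project_to_map([np.array([0.10, 0.20, 0.90]),
                      np.array([0.11, 0.21, 0.88])], weights))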