05 - CNN

1. Convolutional neural networks (CNNs) apply successive layers of convolution and pooling to extract increasingly complex features from images.
2. CNNs use local connectivity and shared weights to reduce the number of parameters compared to fully connected networks.
3. Convolution applies filters to detect patterns in small regions of the image; pooling reduces the spatial size of the representation to induce translation invariance.
Machine Learning

Outline
Convolutional Neural Networks: CNNs

• Why not use unstructured feed-forward models?
• Key parts: convolution, pooling
• Examples

Feed-Forward Neural Networks

Image Classification
• Our problem: image classification (1K categories)
• Map an image to a category: Mushroom, Dog, …

Feed-forward networks
[Figure: a 1000 × 1000 image is flattened into an input vector x of about 10^6 values and fed to a fully connected Layer 1; at this scale the number of weights explodes, as the rough count below suggests.]
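To make the scaling problem concrete, here is a back-of-the-envelope count in Python (a minimal sketch; the 1000-unit hidden layer width is an illustrative assumption, not a value from the slides):

```python
# Rough parameter count for a fully connected first layer on a large image.
image_height, image_width = 1000, 1000   # input image size from the slide
n_inputs = image_height * image_width    # flattened input: 10**6 values
n_hidden = 1000                          # illustrative hidden-layer width (assumption)

n_weights = n_inputs * n_hidden          # one weight per (input, hidden unit) pair
print(f"{n_weights:,} weights")          # 1,000,000,000 weights for a single layer
```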


Patch Classifier / Filter

• Take an 11 × 11 patch of the image and apply an 11 × 11 weight filter W to it: dim(W) = 121.
• f = ReLU(⟨W, patch⟩): multiply the 11 × 11 input patch elementwise by the 11 × 11 weights, sum, and apply ReLU.
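A minimal NumPy sketch of this patch classifier (the random patch and weights are placeholders for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
patch = rng.random((11, 11))          # an 11 x 11 image patch
W = rng.standard_normal((11, 11))     # the 11 x 11 filter weights, dim(W) = 121

# Filter response: elementwise multiply, sum, then ReLU.
f = max(0.0, float(np.sum(W * patch)))
print(f)
```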


Convolution

• Convolution applies the same patch classifier at every position: slide the 11 × 11 filter across the image and compute f = ReLU(⟨W, patch⟩) at each location.
[Figure: the filter applied at successive positions of the input image.]


Example

Parameters to learn: the filter values.

6 × 6 image:
1 0 0 0 0 1
0 1 0 0 1 0
0 0 1 1 0 0
1 0 0 0 1 0
0 1 0 0 1 0
0 0 1 0 1 0

Filter 1 (3 × 3):
 1 -1 -1
-1  1 -1
-1 -1  1

Filter 2 (3 × 3):
-1  1 -1
-1  1 -1
-1  1 -1

Each filter detects a small pattern (3 × 3).
Example: sliding the filter

• stride = 1: apply Filter 1 to the top-left 3 × 3 patch of the 6 × 6 image; the dot product is 3. Move the window one column to the right; the dot product is -1.
• stride = 2: the window moves two columns at a time, so after the first value 3 the next value is -3.
Example: stride = 1, full output

Convolving the 6 × 6 image with Filter 1 at stride 1 gives a 4 × 4 feature map:

 3 -1 -3 -1
-3  1  0 -3
-3 -3  0  1
 3 -2 -2 -1
Example: repeat this for each filter

Filter 2 (stride 1) gives a second 4 × 4 feature map:

-1 -1 -1 -1
-1 -1 -2  1
-1 -1 -2  1
-1  0 -4  3

The two 4 × 4 feature maps together form a 2 × 4 × 4 output.
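The whole example can be reproduced with a few lines of NumPy (a minimal sketch of the sliding-window computation; no ReLU is applied here, matching the slides):

```python
import numpy as np

image = np.array([[1,0,0,0,0,1],
                  [0,1,0,0,1,0],
                  [0,0,1,1,0,0],
                  [1,0,0,0,1,0],
                  [0,1,0,0,1,0],
                  [0,0,1,0,1,0]])

filter1 = np.array([[ 1,-1,-1],
                    [-1, 1,-1],
                    [-1,-1, 1]])
filter2 = np.array([[-1, 1,-1],
                    [-1, 1,-1],
                    [-1, 1,-1]])

def convolve2d(img, filt, stride=1):
    """Slide filt over img (no padding) and return the feature map."""
    k = filt.shape[0]
    out_size = (img.shape[0] - k) // stride + 1
    out = np.zeros((out_size, out_size), dtype=img.dtype)
    for i in range(out_size):
        for j in range(out_size):
            patch = img[i*stride:i*stride+k, j*stride:j*stride+k]
            out[i, j] = np.sum(patch * filt)   # dot product of patch and filter
    return out

print(convolve2d(image, filter1))             # 4 x 4 map: [[3,-1,-3,-1], ...]
print(convolve2d(image, filter2))             # second 4 x 4 map
print(convolve2d(image, filter1, stride=2))   # 2 x 2 map with stride 2: [[3,-3],[-3,0]]
```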
Color image: RGB, 3 channels

• A color image has 3 channels (R, G, B), so the input is 6 × 6 × 3.
• Each filter then also has 3 channels (3 × 3 × 3); the convolution multiplies patch and filter elementwise across all channels and sums, still producing one feature map per filter.
[Figure: Filter 1 and Filter 2 shown as stacks of three 3 × 3 slices, applied to the three channels of the color image.]
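A sketch of one output value in the 3-channel case (the random image and filter here are placeholders; the point is the sum over all three channels):

```python
import numpy as np

rng = np.random.default_rng(0)
color_image = rng.integers(0, 2, size=(6, 6, 3))   # height x width x 3 channels
filt = rng.integers(-1, 2, size=(3, 3, 3))         # one 3 x 3 x 3 filter

# One output value: elementwise product over the 3 x 3 spatial patch
# AND over all 3 channels, then summed.
patch = color_image[0:3, 0:3, :]
value = np.sum(patch * filt)
print(value)
```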
Convolution vs. Fully Connected

• Local connectivity
• Shared weights


Convolution vs. Fully Connected

Flatten the 6 × 6 image into a 36-dimensional vector and view the convolution as a (sparse) fully connected layer:
• The first output value (3) is connected only to inputs 1, 2, 3, 7, 8, 9, 13, 14, 15 (the first 3 × 3 patch), with weights given by Filter 1.
• The next output value (-1) is connected only to inputs 2, 3, 4, 8, 9, 10, 14, 15, 16, reusing the same 9 weights.

Fewer parameters … shared weights.
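A small NumPy check of this view (a sketch; the index lists follow the slide's 1-based numbering):

```python
import numpy as np

image = np.array([[1,0,0,0,0,1],
                  [0,1,0,0,1,0],
                  [0,0,1,1,0,0],
                  [1,0,0,0,1,0],
                  [0,1,0,0,1,0],
                  [0,0,1,0,1,0]])
filter1 = np.array([[ 1,-1,-1],
                    [-1, 1,-1],
                    [-1,-1, 1]])

x = image.flatten()          # 36-dimensional input vector
w = filter1.flatten()        # the 9 shared weights

# 1-based input indices each output value is connected to (local connectivity).
idx_first  = np.array([1, 2, 3, 7, 8, 9, 13, 14, 15]) - 1
idx_second = idx_first + 1   # the window shifted one column to the right

print(x[idx_first] @ w, x[idx_second] @ w)   # -> 3 -1, same weights reused
```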
Convolution, Feature Map



Pooling
• We wish to know whether a feature was present, but not exactly where it was.
• f = max{ … } over a small region: Feature Map → Pooled Map.

Pooling (MAX)
• The pooling region and stride may vary.
• Pooling induces translation invariance at the cost of spatial resolution.
• The stride reduces the size of the resulting feature map.
Example

The two 4 × 4 feature maps produced by Filter 1 and Filter 2:

Filter 1:           Filter 2:
 3 -1 -3 -1         -1 -1 -1 -1
-3  1  0 -3         -1 -1 -2  1
-3 -3  0  1         -1 -1 -2  1
 3 -2 -2 -1         -1  0 -4  3

Max pooling over small regions of these maps (e.g., 2 × 2 blocks) keeps only the largest value in each region; a sketch follows below.
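A minimal sketch of 2 × 2 max pooling applied to the first feature map (the 2 × 2 pool size and stride of 2 are illustrative assumptions; the slides only say the region and stride may vary):

```python
import numpy as np

feature_map = np.array([[ 3, -1, -3, -1],
                        [-3,  1,  0, -3],
                        [-3, -3,  0,  1],
                        [ 3, -2, -2, -1]])

def max_pool2d(fmap, size=2, stride=2):
    """Keep the maximum of each size x size block."""
    out_h = (fmap.shape[0] - size) // stride + 1
    out_w = (fmap.shape[1] - size) // stride + 1
    out = np.zeros((out_h, out_w), dtype=fmap.dtype)
    for i in range(out_h):
        for j in range(out_w):
            block = fmap[i*stride:i*stride+size, j*stride:j*stride+size]
            out[i, j] = block.max()
    return out

print(max_pool2d(feature_map))   # [[3, 0], [3, 1]]
```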
Convolutional Neural Network

The full pipeline stacks these building blocks:

Input image
→ Convolution
→ Max Pooling        (a new, smaller image)
→ Convolution
→ Max Pooling        (a new, smaller image)
→ Flattened
→ Fully connected layer
→ Cat, Dog, …
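A minimal sketch of this pipeline, assuming PyTorch; the layer sizes, channel counts, and the 32 × 32 input are illustrative assumptions, not values from the slides:

```python
import torch
import torch.nn as nn

# Convolution -> pooling -> convolution -> pooling -> flatten -> fully connected.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),   # 3 input channels (RGB), 8 filters (assumed sizes)
    nn.ReLU(),
    nn.MaxPool2d(2),                  # halve the spatial size
    nn.Conv2d(8, 16, kernel_size=3),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 6 * 6, 2),         # 2 classes: cat, dog
)

x = torch.randn(1, 3, 32, 32)         # one 32 x 32 RGB image
print(model(x).shape)                 # torch.Size([1, 2])
```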
Convolution vs. Cross-Correlation

[Figure: to convolve f(t) with g(t), g is first flipped to g'(t) and then slid across f; the overlap at each shift gives (f * g)(t).]

Mathematical definition of convolution

(f * g)(t) = ∫_{-∞}^{+∞} f(τ) g(t - τ) dτ
Cross-Correlation

[Figure: for cross-correlation, g(t) is slid across f(t) without flipping; the overlap at each shift gives (f ⋆ g)(t).]

Mathematical definition of cross-correlation

(f ⋆ g)(t) = ∫_{-∞}^{+∞} f(τ) g(τ + t) dτ
Discrete convolution / cross-correlation

For discrete signals the integral becomes a sum:

(f ⋆ g)[t] = ∑_τ f[τ] g[τ + t]

Example: slide g = (1, 2) across f = (1, 2, 3), both zero-padded:
f: 0 0 1 2 3 0 0
g:   0 0 1 2 0 0
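NumPy implements both operations directly; here is a small check on the example sequences (note that the "convolution" computed in CNN layers is, strictly speaking, cross-correlation, since the filter is not flipped):

```python
import numpy as np

f = np.array([1, 2, 3])
g = np.array([1, 2])

# Convolution flips g before sliding it across f.
print(np.convolve(f, g))                 # [1 4 7 6]

# Cross-correlation slides g across f without flipping.
print(np.correlate(f, g, mode="full"))   # [2 5 8 3]
```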
CNN Example

[Figures: a worked CNN example, shown step by step across the remaining slides.]
