6 Code MLP Export

Multi Layer Perceptron in PyTorch

Fill in the blanks to define the model:

mlp_model = nn.Sequential(
    nn._______( ___, ___, bias = ___ ),
    nn._______(),
    nn._______( ___, ___, bias = ___ ),
    nn._______(),
    nn._______( ___, ___, bias = ___ ),
    nn._______()
)

[Figure: worked numerical example. A 4-dimensional input passes through three linear layers; the first ReLU maps pre-activations [-1, 3, 5] to [0, 3, 5], the second ReLU maps [3, 2, 1] to [3, 2, 1], and the final sigmoid σ yields five outputs of approximately [.95, .50, .12, .99, .01].]

Hints:

Linear Layer: { Identity | Linear | Bilinear }
Activation Function: { ReLU | Tanh | Sigmoid }
in_features: { int }
out_features: { int }
bias: { T | F }
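As a hedged sketch of one possible completion (the layer sizes 4 → 3 → 3 → 5 are assumptions read off the worked numerical example, and may differ from the intended answer key), the skeleton above could be filled in as:

```python
import torch
import torch.nn as nn

# One possible completion of the fill-in-the-blank skeleton.
# Sizes (4 -> 3 -> 3 -> 5) are inferred from the worked example, not given.
mlp_model = nn.Sequential(
    nn.Linear(4, 3, bias=True),   # first linear layer: 4 inputs -> 3 hidden
    nn.ReLU(),                    # first activation
    nn.Linear(3, 3, bias=True),   # second linear layer: 3 -> 3
    nn.ReLU(),                    # second activation
    nn.Linear(3, 5, bias=True),   # output layer: 3 -> 5
    nn.Sigmoid(),                 # squash each output into (0, 1)
)

x = torch.randn(1, 4)             # a single 4-dimensional input
y = mlp_model(x)
print(y.shape)                    # torch.Size([1, 5])
```

Each `nn.Linear(in_features, out_features, bias=...)` must have `in_features` equal to the previous layer's `out_features`; the sigmoid at the end makes every output interpretable as a value in (0, 1), matching the ≈ .95 / .50 / .12 / .99 / .01 column in the worked example.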

1.8.24
© 2024 Tom Yeh 1
