Deep Learning With Actuarial Applications
Mario V. Wüthrich
RiskLab, ETH Zurich
1
Contents: Generalized Linear Models
• Parameter selection
2
• Starting with Data
3
Car Insurance Claims Frequency Data
4
Exposures and Claims
(Figure: histogram of the exposures (678,013 policies), boxplot of the exposures and histogram of the claim numbers.)
• Exposures bigger than 1 are considered to be data errors and are capped at 1.
• Most insurance policies do not suffer any claim (class imbalance problem).
5
Continuous Covariates: Age of Driver
(Figure: total volumes (exposure) per driver's age group and observed frequency per driver's age group, for ages 18–88.)
6
Categorical Covariates: French Region
(Figure: total volumes (exposure) per regional group R11–R93, comparing the MTPL portfolio with the French population, and observed frequencies per regional group.)
7
Covariates: Dependence
(Figure: log-density plot of the bonus–malus level (50–110) against the driver's age classes (18, 19, …, 25, 26–35, …, 76+) and against the area codes A–F, illustrating covariate dependence.)
8
Goal: Regression Modeling
$$x_i \mapsto \mu(x_i), \qquad \mathbb{E}[N_i] = \mu(x_i)\, v_i,$$
where $N_i$ denotes the number of claims and $v_i > 0$ is the time exposure of insurance policy $1 \le i \le n$ (pro-rata temporis).
9
• Exponential Dispersion Family (EDF)
10
Exponential Dispersion Family (EDF)
• The parametrization of this family is chosen such that it is particularly suitable for
maximum likelihood estimation (MLE).
• The EDF is the base statistical model for generalized linear modeling (GLM) and
for neural network regressions.
• Remark: This first chapter on GLMs gives us the basic understanding and tools
for neural network regression modeling.
11
Exponential Dispersion Family (EDF)
$$Y_i \sim f(y; \theta_i, v_i/\varphi) = \exp\left( \frac{y\,\theta_i - \kappa(\theta_i)}{\varphi/v_i} + a(y; v_i/\varphi) \right),$$
with cumulant function $\kappa$, canonical parameter $\theta_i$, dispersion parameter $\varphi > 0$ and exposure $v_i > 0$.
12
Cumulant Function
• Examples:
$$\kappa(\theta) = \begin{cases} \theta^2/2 & \text{Gauss,}\\ \exp(\theta) & \text{Poisson,}\\ -\log(-\theta) & \text{gamma,}\\ \log(1 + e^\theta) & \text{Bernoulli/binomial,}\\ -(-2\theta)^{1/2} & \text{inverse Gaussian,}\\ \big((1-p)\theta\big)^{\frac{2-p}{1-p}} / (2-p) & \text{Tweedie with } p > 1,\ p \neq 2. \end{cases}$$
13
Mean and Variance Function
$$\mathrm{Var}(Y_i) = \frac{\varphi}{v_i}\, \kappa''(\theta_i) = \frac{\varphi}{v_i}\, V(\mu_i) > 0,$$
• Examples:
$$V(\mu) = \begin{cases} 1 & \text{Gauss,}\\ \mu & \text{Poisson,}\\ \mu^2 & \text{gamma,}\\ \mu^3 & \text{inverse Gaussian,}\\ \mu^p & \text{Tweedie with } p \ge 1. \end{cases}$$
14
Maximum Likelihood Estimation (MLE)
$$\ell_Y(\theta) = \log \prod_{i=1}^n f(Y_i; \theta, v_i/\varphi) = \sum_{i=1}^n \frac{Y_i \theta - \kappa(\theta)}{\varphi/v_i} + a(Y_i; v_i/\varphi).$$
The score equation reads
$$\frac{\partial}{\partial \theta}\, \ell_Y(\theta) = \sum_{i=1}^n \frac{v_i}{\varphi}\, \big[ Y_i - \kappa'(\theta) \big] = 0,$$
and the MLE $\widehat{\theta}$ is given by
$$\widehat{\theta} = (\kappa')^{-1}\!\left( \frac{\sum_{i=1}^n v_i Y_i}{\sum_{i=1}^n v_i} \right).$$
17
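A minimal R sketch of this MLE for the Poisson case, where $\kappa'(\theta) = \exp(\theta)$ and hence $(\kappa')^{-1} = \log$; the vectors `claims` and `expo` below are hypothetical claim counts $N_i$ and exposures $v_i$, with $Y_i = N_i/v_i$:

claims <- c(0, 1, 0, 2, 0)        # hypothetical claim counts N_i
expo   <- c(1, 0.5, 1, 1, 0.8)    # hypothetical exposures v_i
Y <- claims / expo                # observed frequencies Y_i = N_i / v_i
# theta.hat = (kappa')^{-1}( sum(v_i Y_i) / sum(v_i) ), with (kappa')^{-1} = log
theta.hat <- log(sum(expo * Y) / sum(expo))
mu.hat <- exp(theta.hat)          # estimated frequency kappa'(theta.hat)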
Generalized Linear Models (GLMs)
$$x_i \mapsto g(\mu_i) = g(\mathbb{E}[Y_i]) = g(\kappa'(\theta_i)) = \beta_0 + \sum_{j=1}^q \beta_j x_{i,j}.$$
18
Design Matrix
$$x_i \mapsto g(\mu_i) = g(\mathbb{E}[Y_i]) = \langle \beta, x_i \rangle = \beta_0 + \sum_{j=1}^q \beta_j x_{i,j}.$$
19
Maximum Likelihood Estimation of GLMs
$$\beta \mapsto \ell_Y(\beta) = \sum_{i=1}^n \frac{Y_i h(\mu_i) - \kappa(h(\mu_i))}{\varphi/v_i} + a(Y_i; v_i/\varphi), \qquad \nabla_\beta\, \ell_Y(\beta) = 0.$$
• The score equations are solved numerically with Fisher's scoring method or the iteratively re-weighted least squares (IRLS) algorithm.
20
MLE and Deviance Loss Functions
$$\ell_Y(\beta) = \sum_{i=1}^n \frac{Y_i h(\mu_i) - \kappa(h(\mu_i))}{\varphi/v_i} + a(Y_i; v_i/\varphi).$$
• The deviance loss of the Gaussian model is the square loss function; other members of the EDF have deviance losses different from the square loss.
21
Examples of Deviance Loss Functions
• Gaussian case:
$$D^*(Y, \beta) = \sum_{i=1}^n \frac{v_i}{\varphi}\, (Y_i - \mu_i)^2 \;\ge\; 0.$$
• Gamma case:
$$D^*(Y, \beta) = \sum_{i=1}^n 2\, \frac{v_i}{\varphi} \left[ \frac{Y_i}{\mu_i} - 1 + \log \frac{\mu_i}{Y_i} \right] \;\ge\; 0.$$
• Poisson case:
$$D^*(Y, \beta) = \sum_{i=1}^n 2\, \frac{v_i}{\varphi} \left[ \mu_i - Y_i - Y_i \log \frac{\mu_i}{Y_i} \right] \;\ge\; 0.$$
22
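A minimal R sketch of these three deviance losses (assuming unit dispersion $\varphi = 1$; `y`, `mu` and `v` are vectors of observations, fitted means and exposures, and the Poisson case uses the convention $0 \log 0 = 0$):

gaussian.deviance <- function(y, mu, v = 1) {
  sum(v * (y - mu)^2)
}
gamma.deviance <- function(y, mu, v = 1) {
  2 * sum(v * (y / mu - 1 + log(mu / y)))
}
poisson.deviance <- function(y, mu, v = 1) {
  # uses 0 * log(0) := 0 for policies without claims
  2 * sum(v * (mu - y - ifelse(y > 0, y * log(mu / y), 0)))
}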
Balance Property under Canonical Link
• Under the canonical link $g = h = (\kappa')^{-1}$ we have the balance property for the MLE
$$\sum_{i=1}^n v_i\, \widehat{\mathbb{E}}[Y_i] = \sum_{i=1}^n v_i\, \kappa'\big\langle \widehat{\beta}, x_i \big\rangle = \sum_{i=1}^n v_i\, Y_i.$$
• If one does not work with the canonical link, one should correct the bias in $\widehat{\beta}_0$.
23
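A minimal R sketch verifying the balance property on hypothetical data; under the canonical log-link of the Poisson GLM, the fitted and observed totals agree:

set.seed(1)
dat <- data.frame(x = rnorm(1000), v = runif(1000, 0.5, 1))
dat$claims <- rpois(1000, lambda = dat$v * exp(-2 + 0.3 * dat$x))
fit <- glm(claims ~ x + offset(log(v)), family = poisson(), data = dat)
c(sum(fitted(fit)), sum(dat$claims))   # both sides of the balance property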
• Feature Engineering / Covariate Pre-Processing
24
Feature Engineering
• Assume monotone link function choice g
$$x_i \mapsto \mu_i = \mathbb{E}[Y_i] = g^{-1}\langle \beta, x_i \rangle = g^{-1}\Big( \beta_0 + \sum_{j=1}^q \beta_j x_{i,j} \Big).$$
(Figure: observed frequency per car brand group and observed frequency per driver's age group.)

One-hot encoding of the car brand labels:
B1 ↦ e1 = (1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
B10 ↦ e2 = (0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0)
B11 ↦ e3 = (0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0)
B12 ↦ e4 = (0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0)
B13 ↦ e5 = (0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0)
B14 ↦ e6 = (0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0)
B2 ↦ e7 = (0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0)
B3 ↦ e8 = (0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0)
B4 ↦ e9 = (0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0)
B5 ↦ e10 = (0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0)
B6 ↦ e11 = (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1)
• One-hot encoding does not lead to full rank design matrices X, because we have
a redundancy.
26
Dummy Coding of Categorical Covariates
B1 ↦ (0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
B10 ↦ (1, 0, 0, 0, 0, 0, 0, 0, 0, 0)
B11 ↦ (0, 1, 0, 0, 0, 0, 0, 0, 0, 0)
B12 ↦ (0, 0, 1, 0, 0, 0, 0, 0, 0, 0)
B13 ↦ (0, 0, 0, 1, 0, 0, 0, 0, 0, 0)
B14 ↦ (0, 0, 0, 0, 1, 0, 0, 0, 0, 0)
B2 ↦ (0, 0, 0, 0, 0, 1, 0, 0, 0, 0)
B3 ↦ (0, 0, 0, 0, 0, 0, 1, 0, 0, 0)
B4 ↦ (0, 0, 0, 0, 0, 0, 0, 1, 0, 0)
B5 ↦ (0, 0, 0, 0, 0, 0, 0, 0, 1, 0)
B6 ↦ (0, 0, 0, 0, 0, 0, 0, 0, 0, 1)
• Declare one label as reference level and drop the corresponding column.
• There are other full rank codings like Helmert’s contrast coding.
27
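A minimal R sketch contrasting the two encodings on a hypothetical brand factor; `model.matrix` with an intercept drops the reference level (dummy coding), while dropping the intercept yields one column per label (one-hot encoding):

brand <- factor(c("B1", "B10", "B2", "B1"))
model.matrix(~ brand)       # dummy coding: reference level B1 is dropped
model.matrix(~ brand - 1)   # one-hot encoding: one column per label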
Pre-Processing of Continuous Covariates (1/2)
(Figure: observed log-frequency per age of driver, 18–88, with age classes marked, e.g. age class 2: 21–25, age class 3: 26–30, age class 5: 41–50, age class 7: 61–70, age class 8: 71–90.)
• Continuous features need feature engineering, too, to bring them into the right
functional form for GLM. Assume we have log-link for g
$$x \mapsto \log(\mathbb{E}[Y]) = \langle \beta, x \rangle = \beta_0 + \sum_{j=1}^q \beta_j x_j.$$
• The number of parameters can grow very large if we have many classes.
• One may also consider other functional forms for continuous covariates, e.g.,
29
• Variable Selection
30
Variable Selection: Likelihood Ratio Test (LRT)
$$\chi_Y^2 = D^*\big(Y, \widehat{\beta}_{H_0}\big) - D^*\big(Y, \widehat{\beta}_{\text{full}}\big) \;\ge\; 0.$$
31
Variable Selection: Wald Test
• Wald test. Choose a matrix $I_p$ such that $I_p \beta_{\text{full}} = \beta_p$. Consider the Wald statistic
$$W = \big( I_p \widehat{\beta}_{\text{full}} - 0 \big)^\top \left( I_p\, \mathcal{I}\big(\widehat{\beta}_{\text{full}}\big)^{-1} I_p^\top \right)^{-1} \big( I_p \widehat{\beta}_{\text{full}} - 0 \big).$$
• $\mathcal{I}(\widehat{\beta}_{\text{full}})$ is Fisher's information matrix; the above test is based on the asymptotic normality of the MLE $\widehat{\beta}_{\text{full}}$.
32
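A minimal R sketch of the LRT for nested Poisson GLMs, reusing the hypothetical `dat` from the balance property sketch above; `anova` with `test = "Chisq"` compares the two residual deviances:

fit.H0   <- glm(claims ~ offset(log(v)), family = poisson(), data = dat)
fit.full <- glm(claims ~ x + offset(log(v)), family = poisson(), data = dat)
# chi^2_Y = D*(Y, beta.H0) - D*(Y, beta.full)
anova(fit.H0, fit.full, test = "Chisq")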
Model Selection: AIC
• When comparing models by AIC, the different models need to consider the same data on the same scale (e.g., log-normal vs. gamma models act on different scales).
33
Example: Poisson Frequency GLM
Call:
glm(formula = claims ~ powerCAT + area + log(dens) + gas + ageCAT +
    acCAT + brand + ct, family = poisson(), data = dat, offset =

Deviance Residuals:
     Min       1Q   Median       3Q      Max
 -1.1373  -0.3820  -0.2838  -0.1624   4.3856

Coefficients:
              Estimate Std. Error z value Pr(>|z|)
(Intercept) -1.903e+00  4.699e-02 -40.509  < 2e-16 ***
powerCAT2    2.681e-01  2.121e-02  12.637  < 2e-16 ***
  .
  .
powerCAT9   -1.044e-01  4.708e-02  -2.218 0.026564 *
area         4.333e-02  1.927e-02   2.248 0.024561 *
log(dens)    3.224e-02  1.432e-02   2.251 0.024385 *
gasRegular   6.868e-02  1.339e-02   5.129 2.92e-07 ***
  .
  .
ctZG        -8.123e-02  4.638e-02  -1.751 0.079900 .
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for poisson family taken to be 1)

    Null deviance: 145532  on 499999  degrees of freedom
Residual deviance: 140641  on 499943  degrees of freedom
AIC: 191132
35
Forward Parameter Selection: ANOVA
We should keep the full model according to AIC and according to the LRT at the 5% significance level.
37
• Car Insurance Frequency Example
38
Example: Poisson Frequency Model (1/2)
$$\theta \mapsto \kappa(\theta) = \exp(\theta).$$
$$\mathrm{Var}(Y_i) = \frac{\varphi}{v_i}\, \kappa''(\theta_i) = \frac{1}{v_i} \exp(\theta_i) = \frac{1}{v_i}\, \mu_i,$$
using dispersion parameter $\varphi = 1$.
39
Example: Poisson Frequency Model (2/2)
q
X
xi 7→ log (E [Ni]) = log vi + hβ, xii = log vi + β0 + βj xi,j .
j=1
40
Further Points
• Generalized additive models (GAMs) allow for more flexibility than GLMs in
marginal covariate component modeling. But they often suffer from computational
complexity.
41
References
• Barndorff-Nielsen (2014). Information and Exponential Families: In Statistical Theory. Wiley
• Charpentier (2015). Computational Actuarial Science with R. CRC Press.
• Efron, Hastie (2016). Computer Age Statistical Inference: Algorithms, Evidence, and Data Science. Cambridge UP.
• Fahrmeir, Tutz (1994). Multivariate Statistical Modelling Based on Generalized Linear Models. Springer.
• Fisher (1934). Two new properties of mathematical likelihood. Proceedings of the Royal Society A 144, 285-307.
• Hastie, Tibshirani, Friedman (2009). The Elements of Statistical Learning. Springer.
• Jørgensen (1986). Some properties of exponential dispersion models. Scandinavian Journal of Statistics 13/3,
187-197.
• Jørgensen (1987). Exponential dispersion models. Journal of the Royal Statistical Society. Series B (Methodological)
49/2, 127-145.
• Jørgensen (1997). The Theory of Dispersion Models. Chapman & Hall.
• Lehmann (1983). Theory of Point Estimation. Wiley.
• Lorentzen, Mayer (2020). Peeking into the black box: an actuarial case study for interpretable machine learning.
SSRN 3595944.
• McCullagh, Nelder (1983). Generalized Linear Models. Chapman & Hall.
• Nelder, Wedderburn (1972). Generalized linear models. Journal of the Royal Statistical Society. Series A (General)
135/3, 370-384.
• Noll, Salzmann, Wüthrich (2018). Case study: French motor third-party liability claims. SSRN 3164764.
• Ohlsson, Johansson (2010). Non-Life Insurance Pricing with Generalized Linear Models. Springer.
• Wüthrich, Buser (2016). Data Analytics for Non-Life Insurance Pricing. SSRN 2870308, Version September 10, 2020.
• Wüthrich, Merz (2021). Statistical Foundations of Actuarial Learning and its Applications. SSRN 3822407.
42
Feed-Forward Neural Networks
Mario V. Wüthrich
RiskLab, ETH Zurich
1
Contents: Feed-Forward Neural Network
• Universality theorems
• Embedding layers
2
• The Statistical Modeling Cycle
3
The Statistical Modeling Cycle
(1) data collection, data cleaning and data pre-processing (> 80% of total time)
(2) selection of model class (data or algorithmic modeling culture, Breiman 2001)
’solving’ involves:
⋆ choice of algorithm
⋆ choice of stopping criterion, step size, etc.
⋆ choice of seed (starting value)
4
• Generic Feed-Forward Neural Networks (FNNs)
5
Neural Network Architectures
• Neural networks can be understood as an approximation framework.
• FNNs have stacked hidden layers. If there is exactly one hidden layer, we call the
network shallow; if there are multiple hidden layers, we call the network deep.
(Figure: shallow vs. deep FNN architecture with inputs age, claims, ac.)
Information is processed from the input (in blue) to the output (in red).
7
Representation Learning
• A GLM with link $g$ has the structure $x \mapsto g(\mu(x)) = \langle \beta, x \rangle$.
This requires manual feature engineering to bring $x$ into the right form.
8
Fully-Connected FNN Layer
$$z_j^{(m)}(x) \overset{\text{def.}}{=} \phi\Big( w_{j,0}^{(m)} + \sum_{l=1}^{q_{m-1}} w_{j,l}^{(m)} x_l \Big) = \phi\big\langle w_j^{(m)}, x \big\rangle,$$
for given network weights (parameters) $w_j^{(m)} \in \mathbb{R}^{q_{m-1}+1}$.
• Every neuron $z_j^{(m)}(x)$ describes a GLM w.r.t. feature $x \in \mathbb{R}^{q_{m-1}}$ and activation $\phi$. The resulting function (called ridge function) reflects a compression of information.
9
Shallow and Deep Fully-Connected FNNs
(Figure: shallow and deep fully-connected FNN architectures with inputs age, claims, ac.)
Information is processed from the input (in blue) to the output (in red).
10
Activation Function
$$x \mapsto \tanh(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}} = 2\left(1 + e^{-2x}\right)^{-1} - 1 = 2\,\mathrm{sigmoid}(2x) - 1.$$
11
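A quick numerical check of this identity in R (`plogis` is the sigmoid, i.e. the standard logistic distribution function):

x <- seq(-5, 5, by = 0.1)
all.equal(tanh(x), 2 * plogis(2 * x) - 1)   # TRUE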
Sigmoid Activation Function $\phi(x) = (1 + e^{-x})^{-1}$

(Figure: sigmoid function plotted for weights w = 4, w = 1 and w = 1/4 over x ∈ [−10, 10].)
12
Fully-Connected FNN Architecture
(Figure: fully-connected FNN architecture with inputs age, claims, ac.)

• Choose the depth of the network $d \in \mathbb{N}$ and define the FNN layer composition
$$x \mapsto z^{(d:1)}(x) \overset{\text{def.}}{=} \big( z^{(d)} \circ \cdots \circ z^{(1)} \big)(x) \in \mathbb{R}^{q_d},$$
with $q_0 = q$ for $x \in \mathbb{R}^q$.
13
FNN Architecture: Interpretations
(Figure: FNN architecture with inputs age, claims, ac.)

• Network mapping
$$x_i \mapsto \mu_i = \mathbb{E}[Y_i] = g^{-1}\big\langle \beta, z^{(d:1)}(x_i) \big\rangle.$$
14
• Universality Theorems
15
Universality Theorems for FNNs
(Figure: shallow FNN with inputs age, claims, ac.)
• Cybenko (1989) and Hornik et al. (1989): Any compactly supported continuous
function can be approximated arbitrarily well (in sup- or L2-norm) by shallow
FNNs with sigmoid activation if allowing for arbitrarily many hidden neurons (q1).
• Leshno et al. (1993): The universality theorem for shallow FNNs holds if and only
if the activation function φ is non-polynomial.
• Grohs et al. (2019): Shallow FNNs with ReLU activation functions provide
polynomial approximation rates, deep FNNs provide exponential rates.
16
Simple Example Supporting Deep FNNs
(Figure: contour plot over feature components $X_1$ and $X_2$.)

A deep FNN of the form
$$\big\langle \beta, z^{(2:1)}(x) \big\rangle = \big\langle \beta, (z^{(2)} \circ z^{(1)})(x) \big\rangle$$
can perfectly approximate the function $\mu$.
• Deep FNNs allow for more complex interactions of covariates through compositions
of layers/functions: wide allows for superposition, and deep allows for composition.
17
Shallow Neural Networks
(Figure: three shallow neural network examples plotted over the feature components $x_1$ and $x_2$.)
18
• Gradient Descent Methods for Model Fitting
19
Deviance Loss Function
• FNN mapping
$$x_i \mapsto \mu_i = \mathbb{E}[Y_i] = g^{-1}\big\langle \beta, z^{(d:1)}(x_i) \big\rangle,$$
has network parameter
$$\vartheta = \big( w_1^{(1)}, \ldots, w_{q_d}^{(d)}, \beta \big) \in \mathbb{R}^r,$$
of dimension $r = \sum_{m=1}^d q_m (q_{m-1} + 1) + (q_d + 1)$.
20
Plain Vanilla Gradient Descent Method (1/2)
$$\nabla_\vartheta D^*(Y, \vartheta) = 2 \sum_{i=1}^n\, [\mu_i - Y_i]\, \nabla_\vartheta h(\mu_i).$$
21
Plain Vanilla Gradient Descent Method (2/2)
• Negative gradient −∇ϑD∗(Y , ϑ) gives the direction for ϑ of the maximal local
decrease in deviance loss.
• For a given learning rate $\varrho_{t+1} > 0$, the gradient descent algorithm updates the network parameter $\vartheta^{(t)}$ iteratively by (adapted locally optimal)
$$D^*\big(Y, \vartheta^{(t+1)}\big) = D^*\big(Y, \vartheta^{(t)}\big) - \varrho_{t+1}\, \big\| \nabla_\vartheta D^*\big(Y, \vartheta^{(t)}\big) \big\|^2 + o(\varrho_{t+1}).$$
• Using a tempered learning rate $(\varrho_t)_{t \ge 1}$, the network parameter $\vartheta^{(t)}$ converges to a local minimum of $D^*(Y, \cdot)$ as $t \to \infty$.
22
Over-Fitting in Complex FNNs
(Figure, left: observations, a homogeneous model and two neural network fits of mu(x); right: training loss and validation loss (deviance losses) over training epochs.)

Notation:
D entire data,
L learning data (in-sample),
T test data (out-of-sample),
U training data,
V validation data
25
Computational Issues and Stochastic Gradient
• Gradient descent steps
$$\vartheta^{(t)} \mapsto \vartheta^{(t+1)} = \vartheta^{(t)} - \varrho_{t+1}\, \nabla_\vartheta D^*\big(Y, \vartheta^{(t)}\big)$$
are computationally expensive if the size of the training data $\mathcal{U}$ is large.
• Partition the training data $\mathcal{U}$ at random into mini-batches $\mathcal{U}_k$ of a given size, and use one mini-batch $\mathcal{U}_k$ at a time for the gradient descent steps. This is called the stochastic gradient descent (SGD) algorithm.
• Running through all mini batches (Uk )k once is called a training epoch.
• Using the entire training data in each GDM step is called steepest gradient descent.
26
Size of Mini-Batches
• Partition the training data $\mathcal{U}$ at random into mini-batches $\mathcal{U}_1, \ldots, \mathcal{U}_K$, and use one mini-batch $\mathcal{U}_k$ at a time for each gradient descent step
$$\nabla_\vartheta D^*(\mathcal{U}_k, \vartheta) = 2 \sum_{i \in \mathcal{U}_k} [\mu_i - Y_i]\, \nabla_\vartheta h(\mu_i).$$
27
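A minimal R sketch of the random mini-batch partition (a hypothetical training set size, with the batch size 10,000 used in the fit call later in this chapter):

n <- 610206                     # hypothetical size of the training data U
batch.size <- 10000
idx <- sample(n)                # random permutation of the observations
batches <- split(idx, ceiling(seq_along(idx) / batch.size))
# one training epoch runs once through all mini-batches (U_k)_k:
# for (Uk in batches) { <one SGD step on the observations in Uk> }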
Momentum-Based Gradient Descent Methods
28
Predefined Gradient Descent Methods
• ’rmsprop’ chooses learning rates that differ in all directions by considering directional sizes (’rmsprop’ stands for root mean square propagation);
• For more details we refer to Chapter 8 of Goodfellow et al. (2016) and Section 7.2.3 in Wüthrich–Merz (2021).
29
• Generalization Loss and Cross-Validation
30
Empirical Generalization Loss
Typically, for neural network modeling one considers 3 disjoint sets of data.
Assume that $\widehat{\vartheta}_{\mathcal{U},\mathcal{V}}$ is the estimated network parameter based on $\mathcal{U}$ and $\mathcal{V}$. The test data $\mathcal{T}$ is given by $(Y_t, x_t, v_t)_{t=1}^T$. We have the (out-of-sample) generalization loss (GL)
$$D^*\big(Y, \widehat{\vartheta}_{\mathcal{U},\mathcal{V}}\big) = \sum_{t=1}^T 2\, \frac{v_t}{\varphi} \Big[ Y_t h(Y_t) - \kappa(h(Y_t)) - Y_t h\big(\widehat{\mu}_t^{\,\mathcal{U},\mathcal{V}}\big) + \kappa\big(h\big(\widehat{\mu}_t^{\,\mathcal{U},\mathcal{V}}\big)\big) \Big].$$
31
K-Fold Cross-Validation Loss
• If one cannot afford to partition the data D into 3 disjoint sets training data U,
validation data V and test data T , one has to use the data more efficiently.
• Denote by ϑb(−Dk ) the estimated network parameter based on all data except Dk .
33
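A minimal R sketch of a K-fold cross-validation loss, reusing the hypothetical `dat` and the `poisson.deviance` helper from the GLM chapter sketches (a Poisson GLM stands in for the model class, for illustration):

K <- 5
folds <- split(sample(nrow(dat)), rep(1:K, length.out = nrow(dat)))
cv.loss <- 0
for (k in 1:K) {
  Dk  <- dat[folds[[k]], ]                   # hold-out fold D_k
  fit <- glm(claims ~ x + offset(log(v)),    # estimated on all data but D_k
             family = poisson(), data = dat[-folds[[k]], ])
  mu  <- predict(fit, newdata = Dk, type = "response")
  cv.loss <- cv.loss + poisson.deviance(Dk$claims, mu)
}
cv.loss / K                                  # K-fold cross-validation loss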
Car Insurance Claims Frequency Data
34
Feature Engineering
• Categorical features: use either dummy coding or one-hot encoding.
PS: We come back to this choice below.
• Also continuous features need pre-processing. All feature components should live on a similar scale so that the GDM can be applied efficiently.
$$x_{i,l} \mapsto x_{i,l}^{\mathrm{MM}} = 2\, \frac{x_{i,l} - x_l^-}{x_l^+ - x_l^-} - 1 \;\in\; [-1, 1],$$
where $x_l^-$ and $x_l^+$ are the minimum and maximum of the domain of $x_{i,l}$.
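A minimal R sketch of this scaling, taking the empirical minimum and maximum as the domain bounds:

minmax.scale <- function(x) {
  # maps x to [-1, 1] over the domain [min(x), max(x)]
  2 * (x - min(x)) / (max(x) - min(x)) - 1
}
age.scaled <- minmax.scale(18:90)   # e.g., age of driver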
library(keras)

q0 <- 12   # dimension of input x
q1 <- 20
q2 <- 15
q3 <- 10

Design <- layer_input(shape = c(q0), dtype = 'float32', name = 'Design')

Network <- Design %>%
  layer_dense(units = q1, activation = 'tanh', name = 'hidden1') %>%
  layer_dense(units = q2, activation = 'tanh', name = 'hidden2') %>%
  layer_dense(units = q3, activation = 'tanh', name = 'hidden3') %>%
  layer_dense(units = 1, activation = 'exponential', name = 'Network')

model <- keras_model(inputs = c(Design), outputs = c(Network))
model %>% compile(optimizer = optimizer_nadam(), loss = 'poisson')

summary(model)
36
Deep FNN with (q1, q2, q3) = (20, 15, 10)
37
Poisson FNN Regression with Offset
• Poisson regression with offset and canonical link g = h = log, set Ni = viYi
39
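A minimal keras sketch (my own reconstruction, not the lecture code) of how the offset $\log v_i$ enters the architecture: a second input carries the log-exposure, it is added to the one-dimensional network output, and a frozen dense layer applies the exponential inverse link. The names `LogVol`, `X.learn` etc. follow the fit call on the callback slide below.

LogVol <- layer_input(shape = c(1), dtype = 'float32', name = 'LogVol')

Net <- Design %>%
  layer_dense(units = q1, activation = 'tanh', name = 'hidden1') %>%
  layer_dense(units = q2, activation = 'tanh', name = 'hidden2') %>%
  layer_dense(units = q3, activation = 'tanh', name = 'hidden3') %>%
  layer_dense(units = 1, activation = 'linear', name = 'Net')

# add log(v_i) to the linear predictor, then apply the exponential inverse link
Response <- list(Net, LogVol) %>% layer_add() %>%
  layer_dense(units = 1, activation = 'exponential', trainable = FALSE,
              weights = list(array(1, dim = c(1, 1)), array(0, dim = c(1))))

model <- keras_model(inputs = c(Design, LogVol), outputs = c(Response))
model %>% compile(optimizer = optimizer_nadam(), loss = 'poisson')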
Deep FNN with (q1, q2, q3) = (20, 15, 10) with Offset
40
Application to French MTPL Data
(Figure: FNN architecture for the French MTPL data with input nodes Area, VehPower, VehAge, DrivAge, BonusMalus, the one-hot encoded VehBrand labels B1–B14, VehGas, Density and the one-hot encoded Region labels R11–R94, and output node ClaimNb.)
• The best validation loss model can be retrieved with a callback, see next slide.
• Remark: AIC is not a sensible model selection criterion for FNNs (early stopping).
42
Callbacks in Gradient Descent Methods
path0 <- "./name0"
CBs <- callback_model_checkpoint(path0, monitor = "val_loss",
                                 verbose = 0, save_best_only = TRUE,
                                 save_weights_only = TRUE)

model %>% fit(list(X.learn, LogVol.learn), Y.learn,
              validation_split = 0.1, batch_size = 10000, epochs = 500,
              verbose = 0, callbacks = CBs)

load_model_weights_hdf5(model, path0)
(Figure: training loss (in-sample) and validation loss (out-of-sample) deviance losses per training epoch.)
43
• Embedding Layers for Categorical Variables
44
Categorical Variables and Dummy/One-Hot Encoding
Dummy coding (each row is in R^10, reference level B1):
B1 ↦ (0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
B10 ↦ (1, 0, 0, 0, 0, 0, 0, 0, 0, 0)
B11 ↦ (0, 1, 0, 0, 0, 0, 0, 0, 0, 0)
B12 ↦ (0, 0, 1, 0, 0, 0, 0, 0, 0, 0)
B13 ↦ (0, 0, 0, 1, 0, 0, 0, 0, 0, 0)
B14 ↦ (0, 0, 0, 0, 1, 0, 0, 0, 0, 0)
B2 ↦ (0, 0, 0, 0, 0, 1, 0, 0, 0, 0)
B3 ↦ (0, 0, 0, 0, 0, 0, 1, 0, 0, 0)
B4 ↦ (0, 0, 0, 0, 0, 0, 0, 1, 0, 0)
B5 ↦ (0, 0, 0, 0, 0, 0, 0, 0, 1, 0)
B6 ↦ (0, 0, 0, 0, 0, 0, 0, 0, 0, 1)

One-hot encoding (each row is in R^11):
B1 ↦ e1 = (1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
B10 ↦ e2 = (0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0)
B11 ↦ e3 = (0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0)
B12 ↦ e4 = (0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0)
B13 ↦ e5 = (0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0)
B14 ↦ e6 = (0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0)
B2 ↦ e7 = (0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0)
B3 ↦ e8 = (0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0)
B4 ↦ e9 = (0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0)
B5 ↦ e10 = (0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0)
B6 ↦ e11 = (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1)
45
Embeddings for Categorical Variables
• One-hot encoding uses as many dimensions as there are labels (mapping to unit
vectors in Euclidean space).
• From Natural Language Processing (NLP) we have learned that there are “better”
codings in the sense that we should try to map to low-dimensional Euclidean spaces
Rb, and similar labels (w.r.t. the regression task) should have some proximity.
$$e: \{B1, \ldots, B6\} \to \mathbb{R}^b, \qquad \text{brand} \mapsto e(\text{brand}) \overset{\text{def.}}{=} e_{\text{brand}}.$$
• $e_{\text{brand}} \in \mathbb{R}^b$ are called embeddings, and optimal embeddings for the regression task can be learned during GDM training. This amounts to adding an additional (embedding) layer to the FNN.
46
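A minimal keras sketch (my own naming, not the lecture code) of a two-dimensional embedding for the 11 vehicle brand labels, fed as integer indices 0, …, 10; the flattened embedding can then be concatenated with the remaining covariates:

b <- 2   # embedding dimension
VehBrand <- layer_input(shape = c(1), dtype = 'int32', name = 'VehBrand')
VehBrEmb <- VehBrand %>%
  layer_embedding(input_dim = 11, output_dim = b, name = 'BrandEmbed') %>%
  layer_flatten(name = 'VehBrEmb')
# e.g.: Features <- list(Design, VehBrEmb) %>% layer_concatenate()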
Deep FNN using Embedding Layers (1/2)
(Figure: the same FNN architecture where the one-hot encoded VehBrand and Region inputs are replaced by learned embedding layers VehBrEmb and RegionEmb, alongside Area, VehPower, VehAge, DrivAge, BonusMalus, VehGas and Density, with output ClaimNb.)
47
Deep FNN using Embedding Layers (2/2)
48
Results of Deep FNN Model with Embeddings
• Remark: AIC is not a sensible model selection criterion for FNNs (early stopping).
49
Learned Two-Dimensional Embeddings
(Figure: learned two-dimensional embeddings (dimension 1 vs. dimension 2) of the vehicle brand labels B1–B14, left, and of the French region labels R11–R94, right.)
50
Special Purpose Layers and Other Features
• Skip connections. Certain layers are skipped in the network architecture; this is going to be used in the LocalGLMnet chapter.
51
References
• Breiman (2001). Statistical modeling: the two cultures. Statistical Science 16/3, 199-215.
• Cybenko (1989). Approximation by superpositions of a sigmoidal function. Mathematics of Control, Signals, and
Systems 2, 303-314.
• Efron (2020). Prediction, estimation, and attribution. Journal American Statistical Association 115/539 , 636-655.
• Efron, Hastie (2016). Computer Age Statistical Inference: Algorithms, Evidence, and Data Science. Cambridge UP.
• Ferrario, Noll, Wüthrich (2018). Insights from inside neural networks. SSRN 3226852.
• Goodfellow, Bengio, Courville (2016). Deep Learning. MIT Press.
• Grohs, Perekrestenko, Elbrächter, Bölcskei (2019). Deep neural network approximation theory. IEEE Transactions on
Information Theory.
• Hastie, Tibshirani, Friedman (2009). The Elements of Statistical Learning. Springer.
• Hornik, Stinchcombe, White (1989). Multilayer feedforward networks are universal approximators. Neural Networks
2, 359-366.
• Kingma, Ba (2014). Adam: A method for stochastic optimization. arXiv:1412.6980.
• Leshno, Lin, Pinkus, Schocken (1993). Multilayer feedforward networks with a nonpolynomial activation function can
approximate any function. Neural Networks 6/6, 861-867.
• Nesterov (2007). Gradient methods for minimizing composite objective function. Technical Report 76, Center for
Operations Research and Econometrics (CORE), Catholic University of Louvain.
• Noll, Salzmann, Wüthrich (2018). Case study: French motor third-party liability claims. SSRN 3164764.
• Richman (2020a/b). AI in actuarial science – a review of recent advances – part 1/2. Annals of Actuarial Science.
• Rumelhart, Hinton, Williams (1986). Learning representations by back-propagating errors. Nature 323/6088, 533-536.
• Schelldorfer, Wüthrich (2019). Nesting classical actuarial models into neural networks. SSRN 3320525.
• Shmueli (2010). To explain or to predict? Statistical Science 25/3, 289-310.
• Wüthrich, Buser (2016). Data Analytics for Non-Life Insurance Pricing. SSRN 2870308, Version September 10, 2020.
• Wüthrich, Merz (2021). Statistical Foundations of Actuarial Learning and its Applications. SSRN 3822407.
52
Discrimination-Free Insurance Pricing
Mario V. Wüthrich
RiskLab, ETH Zurich
1
Contents: Discrimination-Free Insurance Pricing
• Direct discrimination
• Indirect discrimination
• Unawareness price
• Discrimination-free price
2
• Direct Discrimination
3
Best-Estimate Pricing
$$\mu(X, D) = \mathbb{E}[Y \mid X, D].$$
(Figure: best-estimate prices for females and males as a function of age X ∈ [20, 100].)
5
Best-Estimate Price: Direct Discrimination
(Figure: best-estimate prices for females and males as a function of age X.)
• Article 2(a):1 “direct discrimination: where one person is treated less favourably,
on grounds of sex...”
(Figure: best-estimate prices for females and males, together with the intuitive discrimination-free price, as a function of age X.)
• Article 2(a): “direct discrimination: where one person is treated less favourably,
on grounds of sex...”
7
• Unawareness Price and Indirect Discrimination
8
Unawareness Price
$$\mu(X) = \mathbb{E}[Y \mid X].$$
9
Unawareness Price: Example
(Figure: best-estimate prices for females and males, the intuitive discrimination-free price, and the unawareness price, as a function of age X.)
10
Unawareness Price: Example
(Figure, top: best-estimate prices, the intuitive discrimination-free price and the unawareness price; bottom: P[D = female | X = age] against the population average, as a function of age X.)
11
What Goes “Wrong” with the Unawareness Price?
² COUNCIL DIRECTIVE 2004/113/EC of 13 December 2004, Official Journal of the European Union L 373/37.
12
• Discrimination-Free Price
13
Discrimination-Free Pricing
14
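For orientation (my formulation of the definition from Lindholm et al. (2020), which this chapter follows): the discrimination-free price averages the best-estimate price over the discriminatory covariates $D$ with weights $\mathbb{P}^*$ that do not depend on $X$,
$$\mu^*(X) = \sum_d \mu(X, d)\, \mathbb{P}^*(D = d),$$
in contrast to the unawareness price $\mu(X) = \sum_d \mu(X, d)\, \mathbb{P}(D = d \mid X)$, which implicitly infers $D$ from $X$.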
Discrimination-Free Price: Example
(Figure, top: best-estimate prices for females and males, the intuitive discrimination-free price and the unawareness price, as a function of age X; bottom: the population distribution P[D = female | X = age] against the population average.)

Here we choose $\mathbb{P}^*(D) = \mathbb{P}(D)$, the population distribution.
15
Concluding Remarks
• For any given problem there are infinitely many choices of P*, and hence there are infinitely many discrimination-free prices.
• Discrimination-free prices may induce unwanted economic side effects like adverse
selection.
• We did not discuss fairness nor which variables are discriminatory (ethnicity, etc.).
17
References
• Chen, Guillén, Vigna (2018). Solvency requirement in a unisex mortality model. ASTIN Bulletin 48/3, 1219-1243.
• Chen, Vigna (2017). A unisex stochastic mortality model to comply with EU Gender Directive. Insurance: Mathematics
and Economics 73, 124-136.
• Frees, Huang (2020). The discriminating (pricing) actuary. SSRN 3592475.
• Lindholm, Richman, Tsanakas, Wüthrich (2020). Discrimination-free insurance pricing. SSRN 3520676. To appear in
ASTIN Bulletin 2022.
• Lorentzen, Mayer (2020). Peeking into the black box: an actuarial case study for interpretable machine learning.
SSRN 3595944.
• Pearl, Glymour, Jewell (2016). Causal Inference in Statistics: A Primer. Wiley.
• Zhao, Hastie (2019). Causal interpretations of black-box models. Journal of Business & Economic Statistics.
18
LocalGLMnet and more
Mario V. Wüthrich
RiskLab, ETH Zurich
1
Contents: LocalGLMnet and more
2
• Balance Property for Neural Networks
3
Balance Property for FNN Models
• The reason is early stopping, which prevents the estimated parameter from being a critical point of the deviance loss $D^*(Y, \cdot)$ (under the canonical link choice).
4
Critical Points of Deviance Loss Function
Under the canonical link choice, the balance property is fulfilled in the critical points.
5
Failure of Balance Property of FNNs
(Figure: average frequencies of fitted FNN models, scattering between roughly 0.096 and 0.102.)
6
Seeds and Randomness Involved in SGD
(Figure: FNN architecture with inputs age, claims, ac.)
7
Representation Learning: Additional GLM Step
(Figure: FNN architecture with inputs age, claims, ac; the last hidden layer provides the learned representation.)

• Idea (with canonical link choice $g = h$): consider a GLM with new covariates $\widehat{z}_i = \widehat{z}^{(d:1)}(x_i)$.
• Choose the MLE $\widehat{\beta}^{\mathrm{MLE}}$ for the design matrix $\widehat{Z} \in \mathbb{R}^{n \times (q_d+1)}$
$$x_i \mapsto h(\widehat{\mu}_i) = h(\widehat{\mathbb{E}}[Y_i]) = \big\langle \widehat{\beta}^{\mathrm{MLE}}, \widehat{z}^{(d:1)}(x_i) \big\rangle.$$
9
R Code for Implementing the Balance Property
• The R code considers the Poisson model with canonical link g = h = log.
• If we do not have the canonical link, we still need to adjust the intercept $\widehat{\beta}_0^{\mathrm{MLE}}$.
• The additional GLM step may lead to over-fitting; however, if the size $q_d$ of the last hidden FNN layer is not too large, there won't be over-fitting.
10
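A minimal sketch of this additional GLM step (my own reconstruction, assuming the fitted keras `model` and the training objects `X.learn`, `LogVol.learn`, `Y.learn` from the FNN chapter):

# learned representation z^(d:1)(x_i) from the last hidden layer 'hidden3'
zz <- keras_model(inputs = model$input,
                  outputs = get_layer(model, 'hidden3')$output)
Z  <- data.frame(predict(zz, list(X.learn, LogVol.learn)))

# additional GLM step: MLE on the design matrix (1, Z) with log-exposure offset
glm.step <- glm(Y.learn ~ ., data = Z, family = poisson(),
                offset = LogVol.learn)
mu.hat <- fitted(glm.step)   # satisfies the balance property in-sample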
Balance Property for FNN Models
11
Balance Property for FNN Models
(Figure: boxplots of in-sample and out-of-sample losses over 30 SGD calibrations, comparing plain SGD with the balance property correction.)
12
• The “Best” FNN Regression Model
13
Multiplicity of Equally Good FNN Models
(Figure: loss function, view 2.)

• Many network parameters ϑ produce the same loss figure for a given objective function ϑ ↦ D*(Y, ϑ), i.e. they are "equally good" (on portfolio level).
• The chosen network solution will depend on the initial seed of the algorithm.
(Figure: in-sample vs. out-of-sample losses of 400 calibrations, with a cubic spline fit.)
This example is taken from Richman–Wüthrich (2020) and the in-sample losses are
smaller than in the table above because we used different data cleaning.
15
• The Nagging Predictor
16
The Nagging Predictor
$$\bar{\mu}_i^{(M)} = \frac{1}{M} \sum_{j=1}^M \mu_i^{(j)} = \frac{1}{M} \sum_{j=1}^M \mu_{\widehat{\vartheta}^{(j)}}(x_i).$$
17
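A minimal R sketch of the nagging predictor, assuming a list `models` of $M$ networks fitted under different seeds and the input list `list(X, LogVol)` from above:

preds  <- sapply(models, function(m) predict(m, list(X, LogVol)))
mu.bar <- rowMeans(preds)   # nagging predictor: average prediction per policy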
Nagging Predictor: Car Insurance Example
(Figure: out-of-sample losses of the nagging predictor, with a one-standard-deviation band, for index M = 0, …, 100.)
19
Stability on Individual Policy Level
$$\widehat{\mathrm{CoV}}_i = \frac{\widehat{\sigma}_i}{\bar{\mu}_i^{(M)}} = \sqrt{\frac{1}{M-1} \sum_{j=1}^M \Big( \widehat{\mu}_i^{(j)} - \bar{\mu}_i^{(M)} \Big)^2} \;\Big/\; \bar{\mu}_i^{(M)}.$$
• Individual policy level: average over 400 networks to get a coefficient of variation (CoV) of $1/\sqrt{400} = 5\%$.
20
Fit Meta Model to Nagging Predictor
21
• LocalGLMnet: interpretable deep learning
22
Explainability of Deep FNN Predictors
• Network predictors are criticized for not being explainable (black box).
23
LocalGLMnet Architecture
• A GLM has the following regression structure for parameter $\beta \in \mathbb{R}^q$
$$g(\mu(x)) = \beta_0 + \langle \beta, x \rangle = \beta_0 + \sum_{j=1}^q \beta_j x_j.$$
• The LocalGLMnet replaces the fixed parameter $\beta$ by a network-learned regression attention
$$\beta: \mathbb{R}^q \to \mathbb{R}^q, \qquad x \mapsto \beta(x) = z^{(d:1)}(x) = \big( z^{(d)} \circ \cdots \circ z^{(1)} \big)(x).$$
24
Interpretation of LocalGLMnet Architecture
$$g(\mu(x)) = \beta_0 + \langle \beta(x), x \rangle = \beta_0 + \sum_{j=1}^q \beta_j(x)\, x_j.$$
25
Implementation of LocalGLMnet
One line in the implementation is needed to bring in the intercept β0; but there are other ways to do so, see also the next slide for explanation. A reconstruction is sketched below.
26
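The lecture's implementation is not reproduced here; this is my own keras reconstruction of the architecture: the attention network returns $\beta(x) \in \mathbb{R}^q$, `layer_dot` forms the scalar product $\langle \beta(x), x \rangle$ (the skip connection), and the final one-unit dense layer provides the intercept and the exponential inverse link (it corresponds to the calibration $\alpha_0 + \alpha_1 \langle \beta(x), x \rangle$ discussed on the next slide):

q <- 12   # input dimension
Design <- layer_input(shape = c(q), dtype = 'float32', name = 'Design')

Attention <- Design %>%
  layer_dense(units = 20, activation = 'tanh', name = 'hidden1') %>%
  layer_dense(units = 15, activation = 'tanh', name = 'hidden2') %>%
  layer_dense(units = q, activation = 'linear', name = 'Attention')

# skip connection: scalar product <beta(x), x> of attentions and covariates
Response <- list(Attention, Design) %>%
  layer_dot(axes = 1) %>%
  layer_dense(units = 1, activation = 'exponential', name = 'Response')

model <- keras_model(inputs = c(Design), outputs = c(Response))
model %>% compile(optimizer = optimizer_nadam(), loss = 'poisson')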
Remarks and Preparation of Example
$$g(\mu(x)) = \alpha_0 + \alpha_1 \sum_{j=1}^q \beta_j(x)\, x_j.$$
27
Example: Regression Attentions LocalGLMnet
(Figure: estimated regression attentions $\widehat{\beta}_j(x)$ of the LocalGLMnet for Area Code, Bonus–Malus Level, Density, Driver's Age, Vehicle Age and Vehicle Power.)
28
Confidence Bounds for Variable Selection
We extend the covariates by two additional (purely random) components, $x^+ = (x_1, \ldots, x_q, x_{q+1}, x_{q+2})$, and fit
$$g(\mu(x^+)) = \beta_0^+ + \big\langle \beta^+(x^+), x^+ \big\rangle = \beta_0^+ + \sum_{j=1}^{q+2} \beta_j^+(x^+)\, x_j^+.$$
• The magnitudes of $\widehat{\beta}^+_{q+1}(x^+)$ and $\widehat{\beta}^+_{q+2}(x^+)$ determine the size of insignificant components.
29
Confidence Bounds for Variable Selection
• The magnitudes of $\widehat{\beta}^+_{q+1}(x^+)$ and $\widehat{\beta}^+_{q+2}(x^+)$ determine the size of insignificant components.
30
Variable Selection with Cyan Confidence Bounds
[Figure: six panels titled "regression attention: Density", "regression attention: Vehicle Age", "regression attention: Vehicle Gas", "regression attention: Vehicle Power", "regression attention: RandU" and "regression attention: RandN"; each panel shows beta(x) (dots) against the zero line and the cyan 0.1% significance level.]
31
Covariate Contributions $\hat\beta_j(x)\, x_j$
[Figure: covariate contributions $\hat\beta_j(x)\, x_j$ — scatter panels of the feature contributions with beta(x) dots, zero line and spline fit, e.g. over Vehicle Age (0–20) and Vehicle Gas (Diesel/Regular).]
32
Interactions between Covariate Components
[Figure: interaction strengths (−2 to 2) of the feature components Driver's Age (20–90) and Vehicle Age (0–20) with Vehicle Age, Driver's Age, Bonus−Malus Level, Vehicle Gas and Density.]
$$\nabla \beta_j(x) = \left( \frac{\partial}{\partial x_1} \beta_j(x), \ldots, \frac{\partial}{\partial x_q} \beta_j(x) \right)^\top \in \mathbb{R}^q.$$
33
Calculation of Gradients in keras
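A minimal sketch of how the gradients $\nabla \beta_j(x)$ can be obtained with a GradientTape in R keras; the fitted LocalGLMnet object model, its attention layer name 'Attention' and the component index j are assumptions:

library(keras)
library(tensorflow)

# sub-network mapping the covariates x to the regression attentions beta(x);
# 'Attention' is an assumed layer name in the fitted model
beta_model <- keras_model(inputs  = model$input,
                          outputs = get_layer(model, 'Attention')$output)

grad_beta_j <- function(x_batch, j) {
  x <- tf$constant(x_batch, dtype = 'float32')
  with(tf$GradientTape() %as% tape, {
    tape$watch(x)
    beta_j <- beta_model(x)[, j]   # j-th attention component
  })
  tape$gradient(beta_j, x)         # gradients d beta_j / d x_1, ..., d x_q
}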
34
Variable/Term Importance
[Figure: variable importance bar chart, ordered: Bonus−Malus, Driver's Age, Density, Vehicle Age, Vehicle Gas, Vehicle Power, Area Code, RandN, RandU.]
$$\mathrm{VI}_j = \frac{1}{n} \sum_{i=1}^n \Big| \hat\beta_j(x_i) \Big|.$$
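A one-line empirical version; beta_hat is assumed to be the n × q matrix of fitted attentions (e.g. obtained from the attention sub-network above):

# variable importances as mean absolute fitted attentions
VI <- colMeans(abs(beta_hat))
barplot(sort(VI), horiz = TRUE, las = 1, main = "variable importance")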
35
Categorical Covariate Components
[Figure: feature contributions (−1.5 to 1.5) for the categorical covariates Vehicle Brand (B1, B3, B5, B10, B12, B14, ...) and French Regions (R11–R94).]
36
References
• Breiman (1996). Bagging predictors. Machine Learning 24, 123-40.
• Efron, Hastie (2016). Computer Age Statistical Inference: Algorithms, Evidence, and Data Science. Cambridge UP.
• Ferrario, Noll, Wüthrich (2018). Insights from inside neural networks. SSRN 3226852.
• Goodfellow, Bengio, Courville (2016). Deep Learning. MIT Press.
• Hastie, Tibshirani, Friedman (2009). The Elements of Statistical Learning. Springer.
• Lorentzen, Mayer (2020). Peeking into the black box: an actuarial case study for interpretable machine learning.
SSRN 3595944.
• Noll, Salzmann, Wüthrich (2018). Case study: French motor third-party liability claims. SSRN 3164764.
• Richman (2020a/b). AI in actuarial science – a review of recent advances – part 1/2. Annals of Actuarial Science.
• Richman, Wüthrich (2020). Nagging predictors. Risks 8/3, 83.
• Richman, Wüthrich (2021a). LocalGLMnet: interpretable deep learning for tabular data. SSRN 3892015.
• Richman, Wüthrich (2021b). LASSO regularization within the LocalGLMnet architecture. SSRN 3927187.
• Schelldorfer, Wüthrich (2019). Nesting classical actuarial models into neural networks. SSRN 3320525.
• Schelldorfer, Wüthrich (2021). LocalGLMnet: a deep learning architecture for actuaries. SSRN 3900350.
• Wüthrich (2020). Bias regularization in neural network models for general insurance pricing. European Actuarial
Journal 10/1, 179-202.
• Wüthrich, Buser (2016). Data Analytics for Non-Life Insurance Pricing. SSRN 2870308, Version September 10, 2020.
• Wüthrich, Merz (2019). Editorial: Yes, we CANN! ASTIN Bulletin 49/1, 1-3.
• Wüthrich, Merz (2021). Statistical Foundations of Actuarial Learning and its Applications. SSRN 3822407.
37
Convolutional Neural Networks
Mario V. Wüthrich
RiskLab, ETH Zurich
1
Contents: Convolutional Neural Networks
• CNN examples
2
• Spatial and Temporal Data
3
Spatial Objects
[Figure: Swiss Female raw log−mortality rates, age x (0–92) by calendar year t (1950–2016).]
• The first two components of Z give the location (i, j) in the picture.
• The 3rd component of Z gives the signals in location (i, j); this 3rd component is called the channels. Black-white pictures have q = 1 channel (gray scale), color pictures have q = 3 channels (RGB channels for red-green-blue).
4
Temporal and Time Series Objects
[Figure: driver 238, trip number 1 — speed / angle / acceleration channels over time in seconds (0–150).]
5
Using FNNs for Time Series Processing
• Assume we have time series information $Z = x_{0:T} = (x_0, \ldots, x_T)^\top \in \mathbb{R}^{(T+1) \times q}$ to predict the response $Y_{T+1}$.
6
RNNs, CNNs and Attention Layers
• There are 3 different ways in network modeling to deal with topological data.
• Attention layers move across the time series and try to pay attention to special features, e.g. by giving more or less credibility to individual observations. This is similar to the regression attentions β(x) in LocalGLMnets.
7
• Convolutional Neural Networks (CNNs)
8
Functioning of CNNs
[Figure: driver 57, driver 206 and driver 238, trip number 1 — speed / angle / acceleration channels over time in seconds (0–150).]
• b is called filter size, kernel size or band width; q is the number of channels.
• Move with this filter across the time series (in time direction t) and try to spot
specific structure with this filter (in the rolling window).
• Choose filter sizes $(b_1, b_2, q_0)^\top \in \mathbb{N}^3$ with $b_1 < I$ and $b_2 < J$.

$$z_k : \mathbb{R}^{I \times J \times q_0} \to \mathbb{R}^{(I-b_1+1) \times (J-b_2+1)}, \qquad x \mapsto z_k(x) = \big(z_{k;i,j}(x)\big)_{1 \le i \le I-b_1+1;\; 1 \le j \le J-b_2+1}.$$
10
CNNs: Convolution Operation
• Choose the corner (i, j, 1) of the tensor as base point. The CNN operation considers

$$(i, j, 1) + [0 : b_1 - 1] \times [0 : b_2 - 1] \times [0 : q_0 - 1],$$

$$z_k : \mathbb{R}^{I \times J \times q_0} \to \mathbb{R}^{(I-b_1+1) \times (J-b_2+1)}, \qquad x \mapsto z_k(x) = \phi\,(W_k \ast x).$$
• This convolution operation $\ast$ reflects one filter with filter weights $W_k$. We can now choose multiple filters (similar to neurons in FNNs); this explains the meaning of the lower index k (which plays the role of the different neurons $1 \le k \le q_1$ in FNNs).
11
CNNs: Multiple Filters
$$z^{\mathrm{CNN}}(x) \in \mathbb{R}^{(I-b_1+1) \times (J-b_2+1) \times q_1},$$

with $q_1$ filters.
12
Properties of CNNs
• CNNs generally use fewer parameters than FNNs and RNNs, because filter weights are re-used/re-located.
13
• Special Purpose Tools for CNNs
14
CNNs: Padding with Zeros
• If this is an undesired feature, padding with zeros can be applied at all edges to obtain an output tensor of the same (spatial) size as the input.
• Note that padding does not add any additional parameters; it is only used to reshape the output tensor.
15
CNNs: Stride
• Strides are used to skip part of the input tensor x to reduce the size of the output.
This may be useful if the input tensor is a very high resolution image.
16
CNNs: Dilation
• Dilation is similar to stride, though, different in that it enlarges the filter sizes
instead of skipping certain positions in the input tensor.
• This considers
$$(i, j, 1) + e_1\,[0 : b_1 - 1] \times e_2\,[0 : b_2 - 1] \times [0 : q_0 - 1].$$
17
CNNs: Max-Pooling Layers
• Pooling layers help to reduce the sizes of the tensors.
• We choose fixed window sizes $b_1$ and $b_2$ and strides $s_1 = b_1$ and $s_2 = b_2$; this gives a partition (disjoint windows)

$$z^{\max} : \mathbb{R}^{I \times J \times q_0} \to \mathbb{R}^{I' \times J' \times q_0},$$

with $I' = \lfloor I/b_1 \rfloor$ and $J' = \lfloor J/b_2 \rfloor$ (cropping the last columns by default), and where the convolution operation $\ast$ is replaced by a max operation (modulo channels).
• Consider the mapping

$$z^{\mathrm{flatten}} : \mathbb{R}^{I \times J \times q_0} \to \mathbb{R}^{q_1}, \qquad x \mapsto z^{\mathrm{flatten}}(x) = (x_{1,1,1}, \ldots, x_{I,J,q_0})^\top,$$

with $q_1 = I \cdot J \cdot q_0$.
• We have already met this operator with embedding layers for categorical features.
19
CNNs: Example (1/2)
library(keras)

shape <- c(180, 50, 3)

model <- keras_model_sequential()
model %>%
  layer_conv_2d(filters = 10, kernel_size = c(11, 6), activation = 'tanh',
                input_shape = shape) %>%
  layer_max_pooling_2d(pool_size = c(10, 5)) %>%
  layer_conv_2d(filters = 5, kernel_size = c(6, 4), activation = 'tanh') %>%
  layer_max_pooling_2d(pool_size = c(3, 2)) %>%
  layer_flatten()
20
CNNs: Example (2/2)
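The resulting tensor shapes and parameter counts can be checked with summary(model); the numbers below follow from the convolution arithmetic (output length I − b + 1 per dimension, then integer division by the pooling size); the layer names are keras defaults:

summary(model)
# conv2d           (None, 170, 45, 10)   (11*6*3 + 1) * 10 = 1,990 params
# max_pooling2d    (None,  17,  9, 10)           0 params
# conv2d_1         (None,  12,  6,  5)   (6*4*10 + 1) * 5  = 1,205 params
# max_pooling2d_1  (None,   4,  3,  5)           0 params
# flatten          (None, 60)                    0 params
# Total params: 3,195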
21
• Time Series Example: Telematics Data
22
What is Telematics Car Driving Data?
• GPS location data second by second, speed, acceleration, braking, intensity of left
and right turns, engine revolutions,
• time stamp (day time, rush hour, night, etc.), total distances at different times,
[Figure: trip in x/y coordinates (in km, −20 to 20) and distribution over the speed buckets [0], (0,5], (5,20], (20,50], (50,80], (80,130].]
24
Speed, Acceleration/Braking and Direction
[Figure: speed / angle / acceleration channels of one trip over time in seconds (0–150).]
• We have 3 channels:
  – Speed v is censored so that v ∈ [2, 50] km/h.
  – Acceleration a is censored at ±3 m/s² because of scarcity of data and data errors. Extreme acceleration is +6 m/s², extreme deceleration −8 m/s².
  – Change of direction ∆ is censored at 1/2.
25
Choose 3 Selected Drivers
[Figure: driver 57, driver 206 and driver 238, trip number 1 — speed / angle / acceleration channels over time in seconds (0–150).]
• Assume that of each trip we have 180 seconds of driving experience (chosen at random from the entire trip and pre-processed as described above).
26
Classification with CNNs
$$(v_{s,t}, a_{s,t}, \Delta_{s,t})^\top \in [2, 50]\,\text{km/h} \times [-3, 3]\,\text{m/s}^2 \times [0, 1/2],$$

$$x_s = \Big( (v_{s,1}, a_{s,1}, \Delta_{s,1})^\top, \ldots, (v_{s,180}, a_{s,180}, \Delta_{s,180})^\top \Big)^\top \in \mathbb{R}^{180 \times 3}.$$
27
Logistic Regression for Classification
$$\mathrm{softmax}_j \big\langle B, z \big\rangle = \frac{\exp\langle b_j, z \rangle}{\sum_{k=1}^{3} \exp\langle b_k, z \rangle} \in (0, 1).$$
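A quick numeric illustration with made-up scores ⟨b_j, z⟩ (hypothetical values, only to show that the softmax outputs are probabilities):

score <- c(1.2, -0.3, 0.4)          # hypothetical scores <b_j, z>, j = 1, 2, 3
p <- exp(score) / sum(exp(score))
p                                   # 0.598 0.133 0.269; sums to 1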
28
CNNs for Logistic Regression
29
R Code for CNN Architecture on Time Series Data
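A sketch of a 1D-CNN architecture consistent with the layer shapes and parameter counts on the next slide; kernel size 5 and tanh activations reproduce those counts, while the dropout rate is an assumption:

library(keras)

model <- keras_model_sequential()
model %>%
  layer_conv_1d(filters = 12, kernel_size = 5, activation = 'tanh',
                input_shape = c(180, 3)) %>%          # (5*3 + 1) * 12 = 192 params
  layer_max_pooling_1d(pool_size = 3) %>%
  layer_conv_1d(filters = 10, kernel_size = 5, activation = 'tanh') %>%  # 610 params
  layer_max_pooling_1d(pool_size = 3) %>%
  layer_conv_1d(filters = 8, kernel_size = 5, activation = 'tanh') %>%   # 408 params
  layer_global_max_pooling_1d() %>%
  layer_dropout(rate = 0.3) %>%                       # rate is an assumption
  layer_dense(units = 3, activation = 'softmax')      # (8 + 1) * 3 = 27 params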
30
Explicit CNN Architecture and Network Parameters
Layer (type)                      Output Shape      Param #
==============================================================
conv1d_1 (Conv1D)                 (None, 176, 12)   192
max_pooling1d_1 (MaxPooling1D)    (None, 58, 12)    0
conv1d_2 (Conv1D)                 (None, 54, 10)    610
max_pooling1d_2 (MaxPooling1D)    (None, 18, 10)    0
conv1d_3 (Conv1D)                 (None, 14, 8)     408
global_max_pooling1d_1 (Global)   (None, 8)         0
dropout_1 (Dropout)               (None, 8)         0
dense_1 (Dense)                   (None, 3)         27
==============================================================
Total params: 1,237
Trainable params: 1,237
Non-trainable params: 0
31
Gradient Descent Fitting
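A sketch of the fitting call that produces such traces; the loss, optimizer, epochs, batch size and validation split are assumptions, and x_train, y_train are assumed to be the pre-processed trips and one-hot driver labels:

model %>% compile(loss = 'categorical_crossentropy',
                  optimizer = 'adam', metrics = 'accuracy')

fit <- model %>% fit(x_train, y_train,
                     epochs = 500, batch_size = 32,
                     validation_split = 0.2, verbose = 0)
plot(fit)   # training vs. validation loss and accuracy per epoch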
[Figure: gradient descent fitting — training and validation traces of the loss (top) and accuracy (bottom).]
32
Out-of-sample Results
                        true labels
                     driver 57   driver 206   driver 238
predicted label 57       33           4            0
predicted label 206       8          38            6
predicted label 238       1           5           36
% correct              78.6%       80.9%        85.7%

                        true labels
                     driver 300  driver 301   driver 302
predicted label 300      61           1            3
predicted label 301       5          42           11
predicted label 302       8          11           25
% correct              82.4%       77.8%        65.8%

                        true labels
                     driver 100  driver 150   driver 200
predicted label 100      43          12            2
predicted label 150       5          64            5
predicted label 200       4           2           51
% correct              82.7%       82.1%        87.9%
34
What’s Next?
• Can this data be made useful to improve driving behavior and style?
35
• Spatial Example: Digits Recognition
36
Modified National Institute of Standards and
Technology (MNIST) Data Set
37
R Code for CNN Architecture on Spatial Data
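A sketch of a 2D-CNN architecture consistent with the architecture summary on the next slides; the ReLU activations and the stride-1 second pooling layer are assumptions chosen to reproduce the reported output shapes and parameter counts:

library(keras)

model <- keras_model_sequential()
model %>%
  layer_conv_2d(filters = 10, kernel_size = c(3, 3),
                input_shape = c(28, 28, 1)) %>%        # (3*3*1 + 1) * 10 = 100 params
  layer_batch_normalization() %>%
  layer_activation('relu') %>%                         # activation is an assumption
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_conv_2d(filters = 20, kernel_size = c(3, 3)) %>%    # 1,820 params
  layer_batch_normalization() %>%
  layer_activation('relu') %>%
  layer_max_pooling_2d(pool_size = c(2, 2), strides = c(1, 1)) %>%  # stride 1 gives (10, 10, 20)
  layer_conv_2d(filters = 40, kernel_size = c(3, 3)) %>%    # 7,240 params
  layer_batch_normalization() %>%
  layer_activation('relu') %>%
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_flatten() %>%
  layer_dense(units = 10, activation = 'softmax')      # (640 + 1) * 10 = 6,410 params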
38
Explicit CNN Architecture and Network Parameters
Layer (type)                            Output Shape         Param #
=====================================================================
conv2d_8 (Conv2D)                       (None, 26, 26, 10)   100
batch_normalization_7 (BatchNormalizat  (None, 26, 26, 10)   40
activation_7 (Activation)               (None, 26, 26, 10)   0
max_pooling2d_7 (MaxPooling2D)          (None, 13, 13, 10)   0
conv2d_9 (Conv2D)                       (None, 11, 11, 20)   1820
batch_normalization_8 (BatchNormalizat  (None, 11, 11, 20)   80
activation_8 (Activation)               (None, 11, 11, 20)   0
max_pooling2d_8 (MaxPooling2D)          (None, 10, 10, 20)   0
conv2d_10 (Conv2D)                      (None, 8, 8, 40)     7240
batch_normalization_9 (BatchNormalizat  (None, 8, 8, 40)     160
activation_9 (Activation)               (None, 8, 8, 40)     0
max_pooling2d_9 (MaxPooling2D)          (None, 4, 4, 40)     0
flatten_3 (Flatten)                     (None, 640)          0
dense_3 (Dense)                         (None, 10)           6410
=====================================================================
Total params: 15,850
Trainable params: 15,710
Non-trainable params: 140 (1/2 of batch normalizations)
40
Result: Confusion Matrix
41
Shift Invariance
42
Rotation Invariance
43
Scale Invariance
44
References
• Efron, Hastie (2016). Computer Age Statistical Inference: Algorithms, Evidence, and Data Science. Cambridge UP.
• Gao, Wüthrich (2019). Convolutional neural network classification of telematics car driving data. Risks 7/1, article 6.
• Goodfellow, Bengio, Courville (2016). Deep Learning. MIT Press.
• Hastie, Tibshirani, Friedman (2009). The Elements of Statistical Learning. Springer.
• Meier, Wüthrich (2020). Convolutional neural network case studies: (1) anomalies in mortality rates (2) image
recognition. SSRN 3656210.
• Perla, Richman, Scognamiglio, Wüthrich (2021). Time-series forecasting of mortality rates using deep learning.
Scandinavian Actuarial Journal 2021/7, 572-598.
• Wiatowski, Bölcskei (2018). A mathematical theory of deep convolutional neural networks for feature extraction.
IEEE Transactions on Information Theory 64/3, 1845-1866.
• Wüthrich, Merz (2021). Statistical Foundations of Actuarial Learning and its Applications. SSRN 3822407.
45
Recurrent Neural Networks
Mario V. Wüthrich
RiskLab, ETH Zurich
1
Contents: Recurrent Neural Networks
2
• Lee–Carter (LC) Model and Time-Series
3
Human Mortality Database (HMD)
[Figure: Swiss Female and Swiss Male raw log−mortality rates, age x (0–92) by calendar year t (1950–2016).]
4
Human Mortality Database (HMD)
• Aim: Forecast mortality rates $m_{x,t}^{(i)}$ for ages $x$, calendar years $t$ and populations $i$.
[Figure: Swiss Female and Swiss Male raw log−mortality rates, age x (0–92) by calendar year t (1950–2016).]
5
Lee–Carter (LC) Model (1992)
$$(x, t, i) \mapsto \log\big(m_{x,t}^{(i)}\big) = a_x^{(i)} + b_x^{(i)}\, k_t^{(i)},$$

– $a_x^{(i)}$: average force of mortality at age x in population i;
– $k_t^{(i)}$: mortality trend in calendar year t of population i;
– $b_x^{(i)}$: mortality trend broken down to ages x of population i.
The inputs (x, i) and (t, i) are treated as categorical variables.
We have a log-link, but this is not a GLM.
6
Lee–Carter 2-Stage Forecasting
• Center the observed log-mortality rates $\log(M_{x,t}^{(i)})$.

• Find optimal parameter values with SVD (see also PCA chapter)

$$\mathop{\arg\min}_{(b_x^{(i)})_x,\,(k_t^{(i)})_t} \; \sum_{t,x} \Big( L_{x,t}^{(i)} - b_x^{(i)}\, k_t^{(i)} \Big)^2,$$

under the side constraints for identifiability $\sum_x \hat b_x^{(i)} = 1$ and $\sum_{t \in \mathcal{D}} \hat k_t^{(i)} = 0$.

• Extrapolate the time series $(\hat k_t^{(i)})_{t \in \mathcal{D}}$ using a random walk with drift.
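A compact sketch of both stages in R; L is assumed to be the centered log-mortality matrix (ages in rows, calendar years in columns), and the 10-year horizon is an assumption:

# stage 1: rank-1 SVD approximation L ~ b k^T, with identifiability scaling
sv    <- svd(L)
b_hat <- sv$u[, 1] * sv$d[1]
k_hat <- sv$v[, 1]
k_hat <- k_hat * sum(b_hat)        # rescale so that sum(b_hat) = 1
b_hat <- b_hat / sum(b_hat)

# stage 2: extrapolate k_hat with a random walk with drift
drift  <- mean(diff(k_hat))        # MLE of the drift
k_fore <- k_hat[length(k_hat)] + drift * (1:10)   # 10-year point forecast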
[Figure: estimated processes k_t, values over the calendar years 1950–2013.]
8
• Recurrent Neural Networks (RNNs)
9
Recap: Feed-Forward Neural Networks (FNNs)
[Diagram: feed-forward network with covariate inputs such as age and claims.]
• The input of this FNN grows whenever we have a new observation xt ∈ Rτ0 .
10
Plain-Vanilla Recurrent Neural Network (RNN)
• Define a recursive structure using a single RNN layer (upper index (1))

$$z_t = z^{(1)}\big(x_t, z_{t-1}\big), \qquad \text{for } t = 1, \ldots, T.$$

• This RNN has one hidden layer with upper index (1) that is visited T times.
11
Remarks on RNNs
• Lower index t in $z_t = z^{(1)}(x_t, z_{t-1})$ is time and upper index (1) is the hidden layer.
• There are different ways of designing RNNs with multiple hidden layers. We give examples with two hidden layers, i.e. depth d = 2.
12
Variants with 2 Hidden RNN Layers
• 1st variant of a two-hidden-layer RNN:

$$z_t^{(1)} = z^{(1)}\big(x_t, z_{t-1}^{(1)}\big), \qquad z_t^{(2)} = z^{(2)}\big(z_t^{(1)}, z_{t-1}^{(2)}\big).$$
14
Long Short-Term Memory (LSTM) Networks
• The above plain-vanilla RNN architecture is of auto-regressive type of order 1.
15
LSTM Layer: The 3 Gates
• Forget gate (loss of memory rate): $f_t = \phi_\sigma\big(W_f^\top x_t + U_f^\top z_{t-1}\big) \in (0,1)^{\tau_1}$; the input gate $i_t$ and the output gate $o_t$ are defined analogously with weights $(W_i, U_i)$ and $(W_o, U_o)$.
• Network weights are given by $W_f^\top, W_i^\top, W_o^\top \in \mathbb{R}^{\tau_1 \times (\tau_0+1)}$ (including an intercept), and $U_f^\top, U_i^\top, U_o^\top \in \mathbb{R}^{\tau_1 \times \tau_1}$ (excluding an intercept).
16
LSTM Layer: The Memory Cell
• The above gates determine the release and update of the memory cell $c_t$:

$$c_t = f_t \otimes c_{t-1} + i_t \otimes \phi_{\tanh}\big(W_c^\top x_t + U_c^\top z_{t-1}\big),$$

for weights $W_c^\top \in \mathbb{R}^{\tau_1 \times (\tau_0+1)}$ (incl. intercept), $U_c^\top \in \mathbb{R}^{\tau_1 \times \tau_1}$ (excl. intercept), and Hadamard product $\otimes$ (element-wise product).
• Finally, define the updated neuron activation, given $c_{t-1}$ and $z_{t-1}$, by

$$z_t = o_t \otimes \phi_{\tanh}(c_t).$$

• The LSTM produces a latent variable $z_T$, based on the time series input $(x_1, \ldots, x_T)$.
• Network weights $W_f^\top, W_i^\top, W_o^\top, W_c^\top \in \mathbb{R}^{\tau_1 \times (\tau_0+1)}$, $U_f^\top, U_i^\top, U_o^\top, U_c^\top \in \mathbb{R}^{\tau_1 \times \tau_1}$ and $\beta \in \mathbb{R}^{(\tau_1+1) \times \dim(Y_{T+1})}$. All time $t$-independent.
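For the toy dimensions used in the code on the following slides (τ₀ = 3, τ₁ = 5), the LSTM parameter count follows directly from the four weight blocks plus intercepts:

tau0 <- 3; tau1 <- 5
4 * (tau1 * (tau0 + 1) + tau1^2)   # 180 trainable parameters for one LSTM layer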
18
• Code LSTM Layers and Networks
19
R Code for Single LSTM Layer Architecture
T    <- 10   # length of time series x_1, ..., x_T
tau0 <- 3    # dimension of inputs x_t
tau1 <- 5    # dimension of the neurons z_t and cell states c_t

Input <- layer_input(shape = c(T, tau0), dtype = 'float32', name = 'Input')

Output <- Input %>%
  layer_lstm(units = tau1, activation = 'tanh', recurrent_activation = 'tanh', name = 'LSTM1') %>%
  layer_dense(units = 1, activation = 'exponential', name = "Output")

model <- keras_model(inputs = list(Input), outputs = c(Output))
21
R Code for Deep LSTMs
tau2 <- 4

Output <- Input %>%
  layer_lstm(units = tau1, activation = 'tanh', recurrent_activation = 'tanh',
             return_sequences = TRUE, name = 'LSTM1') %>%
  layer_lstm(units = tau2, activation = 'tanh', recurrent_activation = 'tanh', name = 'LSTM2') %>%
  layer_dense(units = 1, activation = 'exponential', name = "Output")
22
• Gated Recurrent Unit (GRU) Networks
23
Gated Recurrent Unit (GRU) Networks
• Gated recurrent unit (GRU) networks were introduced by Cho et al. (2014).
• They share similar properties to LSTMs, but are based on fewer parameters.
24
GRU Layer
• Reset gate: $r_t = \phi_\sigma\big(W_r^\top x_t + U_r^\top z_{t-1}\big) \in (0,1)^{\tau_1}$.
• Update gate: $u_t = \phi_\sigma\big(W_u^\top x_t + U_u^\top z_{t-1}\big) \in (0,1)^{\tau_1}$, giving the neuron update

$$z_t = u_t \otimes z_{t-1} + (1 - u_t) \otimes \phi_{\tanh}\big(W_z^\top x_t + U_z^\top (r_t \otimes z_{t-1})\big);$$

thus, we consider a credibility weighted average for the update of $z_t$; this can also be understood as a skip connection.
• Network weights are given by $W_r^\top, W_u^\top, W_z^\top \in \mathbb{R}^{\tau_1 \times (\tau_0+1)}$ (including an intercept), and $U_r^\top, U_u^\top, U_z^\top \in \mathbb{R}^{\tau_1 \times \tau_1}$ (excluding an intercept).
25
R Code for Single GRU Layer Architecture
T    <- 10   # length of time series x_1, ..., x_T
tau0 <- 3    # dimension of inputs x_t
tau1 <- 5    # dimension of the neurons z_t

Input <- layer_input(shape = c(T, tau0), dtype = 'float32', name = 'Input')

Output <- Input %>%
  layer_gru(units = tau1, activation = 'tanh', recurrent_activation = 'tanh', name = 'GRU1') %>%
  layer_dense(units = 1, activation = 'exponential', name = "Output")

model <- keras_model(inputs = list(Input), outputs = c(Output))
27
Toy Example of LSTMs and GRUs
• Consider raw Swiss female log-mortality rates $\log(M_{x,t})$ for calendar years 1990, ..., 2001 and ages 0 ≤ x ≤ 99, with the observations shown below.
[Figure: raw Swiss female log-mortality rates over the calendar years.]
• Use shallow LSTM1, deep LSTM2 as above, and corresponding GRU1, GRU2,
and deep FNN; GDM: blue is in-sample, red is out-of-sample
[Figure: gradient descent fitting of LSTM1, LSTM2, GRU1, GRU2 and the deep FNN — MSE losses over epochs 0–500, in-sample and out-of-sample traces with the early stopping rules marked.]
31
Application to Swiss Mortality Data
                                                      in-sample          out-of-sample        run times
                                                      female    male     female    male     female   male
LSTM3 (T = 10, (τ0, τ1, τ2, τ3) = (5, 20, 15, 10))    2.5222   6.9458    0.3566   1.3507    225s    203s
GRU3  (T = 10, (τ0, τ1, τ2, τ3) = (5, 20, 15, 10))    2.8370   7.0907    0.4788   1.2435    185s    198s
LC model with SVD                                     3.7573   8.8110    0.6045   1.8152      –       –
[Figure: estimated processes k_t for females (left) and males (right), calendar years 1950–2013: in-sample values together with the LC, LSTM and GRU drift extrapolations.]
33
RNNs vs. Convolutional Neural Networks (CNNs)
• Intuitively, for CNNs we move small windows (filters) over the images to discover similar structures at different locations in the images; a minimal keras sketch follows the figure below.
[Figure: Swiss female (left) and male (right) raw log-mortality rates as heat maps, ages x against calendar years 1950–2016.]
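A minimal keras sketch of such a convolutional layer (the 100 × 100 single-channel input dimensions and the 3 × 3 filter size are assumptions for illustration):

library(keras)

# a single filter moved over a 100 x 100 single-channel image
Input <- layer_input(shape = c(100, 100, 1), dtype = 'float32', name = 'Input')
Conv  <- Input %>%
  layer_conv_2d(filters = 1, kernel_size = c(3, 3),
                activation = 'tanh', name = 'Conv1')
model_cnn <- keras_model(inputs = list(Input), outputs = c(Conv))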
34
Convolutional Neural Networks (CNNs)
• CNNs are used for image and speech recognition, natural language processing (NLP), and in many other fields; for references, see our tutorial Meier–Wüthrich (2020).
35
Convolutional Layer: Sketch of Structure
A convolutional layer (we consider a two-dimensional image here and a single filter) is a mapping

z^(m) : R^{n_1^(m−1) × n_2^(m−1)} → R^{n_1^(m) × n_2^(m)},   x ↦ z^(m)(x) = ( z^(m)_{i_1,i_2}(x) )_{1 ≤ i_1 ≤ n_1^(m), 1 ≤ i_2 ≤ n_2^(m)},

with (local) filter/window having filter sizes f_1^(m) and f_2^(m),

z^(m)_{i_1,i_2}(x) = φ( w_{0,0} + Σ_{j_1=1}^{f_1^(m)} Σ_{j_2=1}^{f_2^(m)} w^(m)_{j_1,j_2} x_{i_1+j_1−1, i_2+j_2−1} ).
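This formula can also be implemented directly in base R; the following sketch mirrors the double sum above (the function name conv2d_single and the choice φ = tanh are illustrative assumptions):

conv2d_single <- function(x, w, w0 = 0, phi = tanh) {
  f1 <- nrow(w); f2 <- ncol(w)                      # filter sizes f_1, f_2
  n1 <- nrow(x) - f1 + 1; n2 <- ncol(x) - f2 + 1    # output dimensions
  z  <- matrix(0, n1, n2)
  for (i1 in 1:n1) for (i2 in 1:n2)                 # move the window over the image
    z[i1, i2] <- phi(w0 + sum(w * x[i1:(i1 + f1 - 1), i2:(i2 + f2 - 1)]))
  z
}

# example: a (hypothetical) 3 x 3 filter applied to a random 10 x 10 image
z <- conv2d_single(matrix(rnorm(100), 10, 10), matrix(rnorm(9), 3, 3))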
Exponential Dispersion Family
• This family contains the Gauss, Poisson, binomial, negative binomial, gamma,
inverse Gaussian, and Tweedie’s models.
• The cumulant function κ determines the distribution type, and the canonical
parameter θi is estimated with MLE.
2
Generalized Linear Models
x_i ↦ g(µ_i) = g(E[Y_i]) = ⟨β, x_i⟩ = β_0 + Σ_{j=1}^q β_j x_{i,j}.
3
Generalized Linear Models: Fitting
• GLMs are fit using MLE.
• For the canonical link g = h, the fitted model fulfills the balance property

Σ_{i=1}^n v_i Ê[Y_i] = Σ_{i=1}^n v_i κ′(⟨β̂, x_i⟩) = Σ_{i=1}^n v_i Y_i.
(Feed-Forward) Neural Networks
• Neural network: a composition of hidden layers,
x ↦ z^(d:1)(x) = z^(d) ◦ · · · ◦ z^(1)(x).
5
(Feed-Forward) Neural Networks
• Time series, image and text data can be processed (in a similar fashion),
using different types of network architectures, but the general philosophy of
representation learning is the same.
6
(Feed-Forward) Neural Networks
• We have output parameter β ∈ R^{q_d+1}, and each hidden layer z^(m) has parameters (weights) (w_1^(m), . . . , w_{q_m}^(m)) ∈ R^{q_m (q_{m−1}+1)}.
• This network parameter ϑ = (w_1^(1), . . . , w_{q_d}^(d), β) is fit with gradient descent methods, and early stopping is used to prevent (in-sample) over-fitting.
• Every different seed (starting point) of the gradient descent algorithm provides a different (early stopped) network calibration ϑ̂.
7
(Feed-Forward) Neural Networks: Peculiarities
• There is no “unique best” network, but there are infinitely many sufficiently
(equally) good networks.
• These sufficiently good networks have equally good predictive power on portfolio
level, but they can be quite different on policy level.
• Typically, the balance property fails to hold. This requires an extra fitting (bias regularization) step; a crude version is sketched after this list.
• Based on the learned structure one can still try to improve a GLM.
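A minimal sketch of the crudest such re-balancing, a single multiplicative correction on the training portfolio (all vectors are hypothetical placeholders; the references below discuss more refined bias regularization approaches):

# hypothetical placeholder values standing in for a fitted network
v      <- c(1.0, 0.5, 0.8)      # exposures
N      <- c(0, 1, 0)            # observed claim counts
mu_hat <- c(0.09, 0.12, 0.10)   # network-predicted frequencies

correction  <- sum(N) / sum(v * mu_hat)  # one global multiplicative factor
mu_balanced <- correction * mu_hat       # now sum(v * mu_balanced) == sum(N)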
8
Convolutional Neural Networks
• Time series data and text data can also be processed through RNNs (not presented here, but there is an SAV tutorial). An RNN is an FNN with loops.
• Often one uses regression trees, random forests and tree boosting as competing data science models to FNNs.
• This works for tabular data; however, time series, image and text data do not have obvious non-network counterparts.
9
References: www.ActuarialDataScience.org
• Ferrario, Hämmerli (2019). On boosting: theory and applications. SSRN 3402687.
• Ferrario, Nägelin (2020). The art of natural language processing: classical, modern and contemporary approaches to
text document classification. SSRN 3547887.
• Ferrario, Noll, Wüthrich (2018). Insights from inside neural networks. SSRN 3226852.
• Lorentzen, Mayer (2020). Peeking into the black box: an actuarial case study for interpretable machine learning.
SSRN 3595944.
• Meier, Wüthrich (2020). Convolutional neural network case studies: (1) anomalies in mortality rates (2) image
recognition. SSRN 3656210.
• Noll, Salzmann, Wüthrich (2018). Case study: French motor third-party liability claims. SSRN 3164764.
• Rentzmann, Wüthrich (2019). Unsupervised learning: What is a sports car? SSRN 3439358.
• Richman, Wüthrich (2019). Lee and Carter go machine learning: recurrent neural networks. SSRN 3441030.
• Schelldorfer, Wüthrich (2019). Nesting classical actuarial models into neural networks. SSRN 3320525.
• Schelldorfer, Wüthrich (2021). LocalGLMnet: a deep learning architecture for actuaries. SSRN 3900350.
10