

Development of digital twin via generative neural networks

Kuilin Chen

June 12, 2019

June 12, 2019 1 / 11


Outline

Review of previous work
Review of generative neural networks
Dynamic generative neural network
Seq2Seq generation
Future work



Previous work

Completed a literature review of digital twins and identified research opportunities in
current digital-twin research
Current research focus: model the relationship between set points and the actual
temperature inside a combustion system
ARX, LSTM and GRU models have been developed to predict one-step-ahead
temperature from past set points and temperatures
Next step: develop new generative models for time series



Generative neural network

We want to learn a probability distribution over high-dimensional x (e.g., images and long
time series)
pD (x) is the true distribution, and pθ (x) is the modelled distribution
Direct optimization of pθ to approximate pD is very challenging (e.g.,
high dimensionality, existence of pD ...)
We define a low-dimensional z with a fixed prior distribution p(z), and pass z through gθ
(a deep neural network): Z → X
High-dimensional x can then be generated without explicitly knowing the high-dimensional density



Generative adversarial networks (GAN)

Adversarial training
\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_D(x)}[\log D(x)] + \mathbb{E}_{z \sim p_z(z)}[\log(1 - D(G(z)))]

G is a generator, D is a discriminator
Train D to discriminate between real and generated samples
Simultaneously train G to generate samples close to the real ones
p(x) is not explicitly modelled in a GAN
Generated samples from a GAN are typically evaluated subjectively by humans
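The minimax objective above can be estimated by Monte Carlo on a toy 1-D problem. Everything below (the sigmoid discriminator, the shift generator, the Gaussian data) is an illustrative stand-in, not part of the original work:

```python
import numpy as np

rng = np.random.default_rng(0)

def discriminator(x, a=1.0):
    # Toy fixed discriminator: sigmoid of a linear score (not learned here)
    return 1.0 / (1.0 + np.exp(-a * x))

def generator(z, shift=-1.0):
    # Toy generator: shifts the latent noise by a constant
    return z + shift

def value_fn(x_real, z, D=discriminator, G=generator):
    # Monte Carlo estimate of V(D, G) = E_x[log D(x)] + E_z[log(1 - D(G(z)))]
    term_real = np.mean(np.log(D(x_real)))
    term_fake = np.mean(np.log(1.0 - D(G(z))))
    return term_real + term_fake

x_real = rng.normal(loc=1.0, size=10_000)   # "real" data centred at +1
z = rng.normal(size=10_000)                 # latent prior p(z) = N(0, 1)

v_bad = value_fn(x_real, z)                          # generator far from the data
v_good = value_fn(x_real, z, G=lambda z: z + 1.0)    # generator matching the data mean
```

Since G minimizes V while D maximizes it, the generator that matches the data distribution yields a lower value of V against the same fixed discriminator (`v_good < v_bad`).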



Variational autoencoder (VAE)

Evidence lower bound (ELBO)


\mathcal{L}(x) = \underbrace{-D_{\mathrm{KL}}\big(q_\phi(z \mid x) \,\|\, p(z)\big)}_{\text{regularization}} + \underbrace{\mathbb{E}_{q_\phi(z \mid x)}[\log p_\theta(x \mid z)]}_{\text{log-likelihood}}

qφ (z|x) is a probabilistic encoder, pθ (x|z) is a probabilistic decoder


Maximize L with respect to φ and θ to train the generative model
The ELBO or log-likelihood can be maximized by overfitting x (memorizing the training
samples)
Good ELBO or log-likelihood values do not imply good inference
The ELBO or log-likelihood should therefore not be used to evaluate generated samples
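For a diagonal Gaussian encoder and standard normal prior, the regularization term has a closed form, and the log-likelihood term can be estimated by reparameterized Monte Carlo. The identity decoder and all numbers below are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_to_standard_normal(mu, log_var):
    # Closed-form KL( N(mu, diag(sigma^2)) || N(0, I) )
    return 0.5 * np.sum(mu**2 + np.exp(log_var) - 1.0 - log_var)

def elbo_estimate(x, mu, log_var, decoder_mean, n_samples=1000):
    # L(x) = -KL(q_phi(z|x) || p(z)) + E_q[log p_theta(x|z)],
    # expectation estimated via the reparameterization z = mu + sigma * eps
    sigma = np.exp(0.5 * log_var)
    eps = rng.normal(size=(n_samples, mu.size))
    z = mu + sigma * eps
    x_mean = decoder_mean(z)                 # decoder outputs the mean of p(x|z)
    # log N(x; x_mean, I): Gaussian decoder with unit variance (an assumption)
    log_lik = -0.5 * np.sum((x - x_mean)**2 + np.log(2 * np.pi), axis=1)
    return -kl_to_standard_normal(mu, log_var) + np.mean(log_lik)

mu = np.array([0.5, -0.2])
log_var = np.array([-0.1, 0.3])
x = np.array([0.4, -0.1])
elbo = elbo_estimate(x, mu, log_var, decoder_mean=lambda z: z)  # toy identity decoder
```

Note that when the encoder equals the prior (mu = 0, log_var = 0) the KL term vanishes, which is exactly the degenerate case where the ELBO stops constraining the posterior.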



RNN and SSM
Figure: Graphical models to generate x_{1:T} with a recurrent neural network (RNN) and a state space
model (SSM). Rectangle-shaped units are used for deterministic states, while circles are used for
stochastic ones.

RNN:  h_t = f(h_{t-1}, u_t), \qquad x_t = g(h_t)
SSM:  z_t \sim p_{\theta_z}(z_t \mid u_t, z_{t-1}), \qquad x_t \sim p_{\theta_x}(x_t \mid z_t)
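The contrast between the two recurrences can be sketched in a few lines. The tanh transition, the linear-Gaussian SSM with matrices A, B, C, and all dimensions and noise scales below are toy assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_step(h_prev, u_t, W_h, W_u, W_x):
    # Deterministic recurrence: h_t = f(h_{t-1}, u_t), x_t = g(h_t)
    h_t = np.tanh(W_h @ h_prev + W_u @ u_t)
    x_t = W_x @ h_t
    return h_t, x_t

def ssm_step(z_prev, u_t, A, B, C, q_std=0.1, r_std=0.1):
    # Stochastic recurrence: z_t ~ p(z_t | u_t, z_{t-1}), x_t ~ p(x_t | z_t)
    z_t = A @ z_prev + B @ u_t + q_std * rng.normal(size=z_prev.shape)
    x_t = C @ z_t + r_std * rng.normal(size=C.shape[0])
    return z_t, x_t

# Generate a short sequence with each model using random toy parameters
dim_h, dim_u, dim_x = 4, 2, 1
shapes = [(dim_h, dim_h), (dim_h, dim_u), (dim_x, dim_h)]
W_h, W_u, W_x = (rng.normal(size=s) * 0.3 for s in shapes)
A, B, C = (rng.normal(size=s) * 0.3 for s in shapes)

h, z = np.zeros(dim_h), np.zeros(dim_h)
xs_rnn, xs_ssm = [], []
for t in range(10):
    u_t = rng.normal(size=dim_u)
    h, x_r = rnn_step(h, u_t, W_h, W_u, W_x)
    z, x_s = ssm_step(z, u_t, A, B, C)
    xs_rnn.append(x_r)
    xs_ssm.append(x_s)
```

Rerunning the RNN with the same inputs reproduces the same x_{1:T}; rerunning the SSM does not, which is what makes the SSM a generative model over sequences rather than a deterministic map.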
Combination of RNN and SSM

RNN and SSM have been combined into generative models in prior work
However, those models are limited to categorical inputs and outputs (e.g., rotated-image
generation, new drug development)
A new generative model is proposed, combining a bi-directional RNN with an SSM
The objective function and output decoding distribution are re-designed to make the model
suitable for time-series generation



Variational inference for dynamic generative model

ELBO
\log p_\theta(x \mid u) - D_{\mathrm{KL}}\big(q_\phi(z \mid x, u) \,\|\, p_\theta(z \mid x, u)\big)
= \underbrace{\mathbb{E}_{z \sim q_\phi}[\log p_\theta(x \mid z, u)]}_{\text{log-likelihood}} - \underbrace{D_{\mathrm{KL}}\big[q_\phi(z \mid x, u) \,\|\, p_\theta(z \mid u)\big]}_{\text{regularization}}
= \mathcal{L}(\theta, \phi)



Algorithm

Algorithm 1 Dynamic generative model


Initialize parameters θ, φ
repeat
Sample a random minibatch of datapoints x, u
Draw Monte Carlo samples z∗ from qφ (z|x, u)
Evaluate Ez∼qφ [log pθ (x|z, u)] using z∗
Update parameters using gradients ∇θ,φ L (e.g. SGD)
until convergence of parameters θ, φ
return θ, φ
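Algorithm 1 can be exercised end-to-end on a deliberately tiny model. The model below — p(z) = N(0, 1), pθ(x|z) = N(z + θ, 1), qφ(z|x) = N(φ, 1), with the exogenous input u omitted and analytic reparameterized gradients — is a stand-in chosen so every step of the loop fits in a few lines; it is not the proposed dynamic generative model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance of Algorithm 1: scalar theta, phi; closed-form gradients of the
# reparameterized ELBO  L = E_eps[log p_theta(x | z = phi + eps)] - KL(q_phi || p(z))
x_data = rng.normal(loc=3.0, size=512)        # "observations" (toy dataset)

theta, phi = 0.0, 0.0                         # initialize parameters
lr, n_mc = 0.05, 8
for step in range(500):
    x = rng.choice(x_data, size=32)           # random minibatch of datapoints
    eps = rng.normal(size=(n_mc, x.size))
    z = phi + eps                             # Monte Carlo samples z* ~ q_phi(z|x)
    resid = x - z - theta                     # residual appearing in grad log p_theta(x|z)
    grad_theta = np.mean(resid)               # d/dtheta of the MC log-likelihood term
    grad_phi = np.mean(resid) - phi           # chain rule through z, minus d/dphi of KL
    theta += lr * grad_theta                  # gradient ASCENT on the ELBO (plain SGD)
    phi += lr * grad_phi
```

At convergence theta recovers the data mean (≈ 3) and phi returns to the prior mean 0, matching the fixed point of the two gradient equations; the loop structure (minibatch, Monte Carlo samples, gradient update, repeat) is exactly that of Algorithm 1.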



Thank You!
Questions?
