Lecture #1
Introduction to Deep Generative Modeling
HY-673 – Computer Science Dept, University of Crete
Professors: Yannis Pantazis & Yannis Stylianou
TAs: Michail Raptakis & Michail Spanakis
What is this course about?
• Statistical Generative Models
• A Generative Model (GM) is defined as a probability distribution, p(x).
• A statistical GM is a trainable probabilistic model, p_θ(x) (a minimal sketch follows this list).
• A deep GM is a statistical generative model parameterized by a neural network.
• p(x), and in many cases p_θ(x), are not analytically known; only samples are available!
• Data (x): complex, (un)structured samples (e.g., images, speech, molecules, text, etc.)
• Prior knowledge: parametric form (e.g., Gaussian, mixture, softmax), loss function (e.g., maximum likelihood, divergence), optimization algorithm, invariance/equivariance, laws of physics, prior distribution, etc.
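The sketch below (not from the slides; names such as DeepSampler are illustrative) contrasts the two notions in PyTorch, the framework used in the weekly tutorial: a statistical GM p_θ(x) with an explicit, trainable density, and a deep GM where a neural network pushes Gaussian noise to samples, so its p_θ(x) is only defined implicitly.

```python
# Minimal sketch (not course code): two ways to define p_theta(x).
import torch
import torch.nn as nn

# (1) Statistical GM with an explicit density: a 1-D Gaussian p_theta(x)
#     whose parameters theta = (mu, log_sigma) are trainable.
mu = torch.zeros(1, requires_grad=True)
log_sigma = torch.zeros(1, requires_grad=True)

def log_p_theta(x):
    """Analytic log-density of the Gaussian p_theta(x)."""
    return torch.distributions.Normal(mu, log_sigma.exp()).log_prob(x)

# (2) Deep GM: a neural network g_theta pushes Gaussian noise z to samples x.
#     Here p_theta(x) is implicit: we can sample from it but not evaluate it.
class DeepSampler(nn.Module):  # hypothetical name, for illustration only
    def __init__(self, z_dim=8, x_dim=1):
        super().__init__()
        self.z_dim = z_dim
        self.net = nn.Sequential(nn.Linear(z_dim, 64), nn.ReLU(), nn.Linear(64, x_dim))

    def forward(self, n):
        z = torch.randn(n, self.z_dim)   # latent noise z ~ N(0, I)
        return self.net(z)               # samples x = g_theta(z)

print(log_p_theta(torch.tensor([0.3])))  # explicit density evaluation
print(DeepSampler()(5))                  # 5 samples from the implicit deep GM
```

The explicit form supports likelihood evaluation (and hence maximum-likelihood training); the implicit form only supports sampling, which is one reason other distances between distributions become relevant.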
What is this course about?
[Figure: given samples x_i ∼ p_data, fit a model p_θ by minimizing a distance d(p_data, p_θ).]
We will study:
• Families of Generative Models
• Algorithms to train these GMs
• Network architectures
• Loss functions & distances between probability density functions (see the training sketch below)
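As a concrete (hedged) illustration of the picture above, with 1-D synthetic data standing in for p_data: fitting the Gaussian p_θ(x) from the previous sketch by maximum likelihood on samples x_i ∼ p_data is, up to an additive constant that does not depend on θ, the same as minimizing the KL divergence d(p_data, p_θ) = KL(p_data ∥ p_θ).

```python
# Minimal sketch (assumption: 1-D synthetic data standing in for p_data).
# Maximum likelihood == minimizing KL(p_data || p_theta) up to an additive
# constant (the entropy of p_data), which does not depend on theta.
import torch

x_data = 2.0 + 0.5 * torch.randn(1000)             # samples x_i ~ p_data
mu = torch.zeros(1, requires_grad=True)             # parameters theta of p_theta
log_sigma = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam([mu, log_sigma], lr=1e-2)

for step in range(2000):
    p_theta = torch.distributions.Normal(mu, log_sigma.exp())
    nll = -p_theta.log_prob(x_data).mean()          # empirical negative log-likelihood
    opt.zero_grad()
    nll.backward()
    opt.step()

print(mu.item(), log_sigma.exp().item())            # should approach 2.0 and 0.5
```

The same pattern (samples in, a θ-dependent loss approximating a distance, gradient steps on θ) is what the training algorithms studied in the course instantiate for richer model families.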
What is this course about?
• Discriminative model
  • Learn the conditional distribution p(y|x), e.g., mapping an image x to the label “Cat”.
• Generative Model
  • Learn the probability distribution p(x).
• Conditional GM
  • Learn p(x|y) (see the sketch below relating the three).
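A minimal sketch (illustrative toy example, not from the slides) tying the three objects together with torch.distributions: two 1-D Gaussian class-conditionals give a conditional GM p(x|y), mixing them with the class prior p(y) gives the generative model p(x), and Bayes' rule recovers the discriminative p(y|x).

```python
# Toy example (assumed numbers): two classes y in {0, 1} with 1-D Gaussian
# class-conditionals.
import torch
from torch.distributions import Categorical, MixtureSameFamily, Normal

prior = Categorical(probs=torch.tensor([0.4, 0.6]))                   # p(y)
cond = Normal(torch.tensor([-1.0, 2.0]), torch.tensor([0.5, 1.0]))    # p(x|y): conditional GM
marginal = MixtureSameFamily(prior, cond)                             # p(x) = sum_y p(y) p(x|y)

x = torch.tensor(1.5)
log_joint = prior.logits + cond.log_prob(x)        # log p(y) + log p(x|y)
posterior = torch.softmax(log_joint, dim=-1)       # p(y|x) by Bayes' rule (discriminative)

print(marginal.log_prob(x))  # log p(x) under the generative model
print(posterior)             # class probabilities given x
```

A discriminative model trained directly (e.g., a softmax classifier) would learn p(y|x) without ever modeling p(x); the generative route above models p(x|y) and p(y) and derives p(y|x) from them.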
Families of Generative Models
[Figure: taxonomy of GMs.]
• Recent advancements:
  • DALL-E 2
  • Stable Diffusion
  • Imagen
  • GLIDE
  • Midjourney
• These model p(image | text)
• Hierarchical Text-Conditional Image Generation with CLIP Latents, Ramesh et al., 2022
  • https://cdn.openai.com/papers/dall-e-2.pdf
Image2Image Translation
[Audio generation examples: Concatenative, WaveNet, Unconditional, Music.]
• GANs (2 weeks)
Logistics
• Teaching Assistant: Michail Raptakis (PhD candidate)
• Weekly tutorial (Friday 10:00–12:00): Python/PyTorch basics, neural network architectures and training, and problem-solving sessions to help with the homework, including selected homework problems.
• Textbook: Probabilistic Machine Learning: Advanced Topics by Kevin P. Murphy
  • https://probml.github.io/pml-book/book2.html
• Seminal papers will be distributed.
Grading policy
• Final exam (30% of total grade)
  • Open notes
  • NO internet
• 5–6 homework series (40% of total grade)
  • Mix of theoretical and programming problems
  • Equally weighted
• Project: paper implementation & presentation (30% of total grade)
  • Implementation: 10%
  • Final report: 10%
  • Presentation: 10%
Project
• Select from a given list of papers or propose a paper (which has to be approved).
• Categories of papers:
  • Application of deep generative models to a novel task/dataset
  • Algorithmic improvements to the learning, inference and/or evaluation of deep generative models
  • Theoretical analysis of any aspect of existing deep generative models
• Groups of up to 2 students per project
• Computational resources may be provided (Colab, local GPUs, etc.)