CGAN
By Dr Muhammad Atif Tahir
Recap
• GANs are generative models: they create new data instances that
resemble your training data
• The generator tries to fool the discriminator, and the discriminator tries to
keep from being fooled
Issues (No Control over the Type of Image)
• We have been able to build GANs that generate realistic images from a given training set
• However, we have not been able to control the type of image we would like to generate, for example a male or female face, or a large or small brick
• We can sample a random point from the latent space, but we cannot easily predict what kind of image will be produced from a given choice of latent variable
Conditional GAN
• Introduced in “Conditional Generative Adversarial Nets” by Mirza and Osindero in 2014
• The label is supplied as an extra input to both the generator and the critic
• Thus, the generator must ensure that its output agrees with the provided label, in order to keep fooling the critic
• If the generator produced perfect images that disagreed with the image label, the critic would be able to tell that they were fake, simply because the images and labels did not match
CGAN (Continued)
• The image channels and label channels are passed in separately to the
critic and concatenated
• The latent vector and the one-hot label vector are passed in separately to the generator and concatenated before being reshaped (see the sketch below)
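A minimal TensorFlow/Keras sketch of this wiring is given below; the 64 × 64 image size, the two label classes (not blond / blond), the latent dimension, and the layer sizes are illustrative assumptions, not the exact lecture architecture.

```python
# Minimal sketch of the CGAN wiring: label channels concatenated with the image
# for the critic, label vector concatenated with the latent vector for the generator.
from tensorflow.keras import layers, models

IMAGE_SIZE = 64   # spatial size of the training images (assumption)
NUM_CLASSES = 2   # e.g. not blond / blond (assumption)
Z_DIM = 100       # length of the latent vector (assumption)

# Critic: image channels and label channels enter separately and are
# concatenated along the channel axis before the convolutional stack.
critic_img_in = layers.Input(shape=(IMAGE_SIZE, IMAGE_SIZE, 3))
critic_lbl_in = layers.Input(shape=(IMAGE_SIZE, IMAGE_SIZE, NUM_CLASSES))
x = layers.Concatenate(axis=-1)([critic_img_in, critic_lbl_in])
x = layers.Conv2D(64, kernel_size=4, strides=2, padding="same")(x)
x = layers.LeakyReLU(0.2)(x)
x = layers.Conv2D(128, kernel_size=4, strides=2, padding="same")(x)
x = layers.LeakyReLU(0.2)(x)
x = layers.Flatten()(x)
critic_out = layers.Dense(1)(x)   # Wasserstein critic score, no sigmoid
critic = models.Model([critic_img_in, critic_lbl_in], critic_out)

# Generator: latent vector and one-hot label vector enter separately and are
# concatenated before being reshaped into a feature map.
gen_z_in = layers.Input(shape=(Z_DIM,))
gen_lbl_in = layers.Input(shape=(NUM_CLASSES,))
g = layers.Concatenate()([gen_z_in, gen_lbl_in])
g = layers.Dense(8 * 8 * 128)(g)
g = layers.Reshape((8, 8, 128))(g)
g = layers.Conv2DTranspose(128, kernel_size=4, strides=2, padding="same", activation="relu")(g)
g = layers.Conv2DTranspose(64, kernel_size=4, strides=2, padding="same", activation="relu")(g)
gen_out = layers.Conv2DTranspose(3, kernel_size=4, strides=2, padding="same", activation="tanh")(g)
generator = models.Model([gen_z_in, gen_lbl_in], gen_out)
```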
Training CGAN
1. The images and labels are unpacked from the input data
2. The one-hot encoded vectors are expanded to one-hot encoded images
that have the same spatial size as the input images (64 × 64)
3. The generator is now fed with a list of two inputs—the random latent
vectors and the one-hot encoded label vectors
4. The critic is now fed with a list of two inputs—the fake/real images and the one-hot encoded label channels
5. The gradient penalty function also requires the one-hot encoded label
channels to be passed through as it uses the critic
6. The changes made to the critic training step also apply to the generator training step (see the training-step sketch after this list)
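A hedged sketch of one training step following steps 1–6 is shown below. It assumes the generator and critic from the earlier sketch, a WGAN-GP critic, and hypothetical names (expand_labels, gradient_penalty, opt_c, opt_g, GP_WEIGHT) introduced here only for illustration.

```python
import tensorflow as tf

IMAGE_SIZE, Z_DIM, GP_WEIGHT = 64, 100, 10.0   # assumptions matching the earlier sketch

def expand_labels(one_hot_labels):
    # Step 2: tile each one-hot vector into a one-hot "label image"
    # with the same spatial size as the training images (64 x 64)
    lbl = one_hot_labels[:, None, None, :]               # (batch, 1, 1, classes)
    return tf.tile(lbl, [1, IMAGE_SIZE, IMAGE_SIZE, 1])  # (batch, 64, 64, classes)

def gradient_penalty(critic, real_images, fake_images, label_channels):
    # Step 5: the penalty needs the label channels too, because it calls the critic
    batch_size = tf.shape(real_images)[0]
    alpha = tf.random.uniform([batch_size, 1, 1, 1])
    interpolated = alpha * real_images + (1.0 - alpha) * fake_images
    with tf.GradientTape() as tape:
        tape.watch(interpolated)
        scores = critic([interpolated, label_channels], training=True)
    grads = tape.gradient(scores, interpolated)
    norms = tf.sqrt(tf.reduce_sum(tf.square(grads), axis=[1, 2, 3]) + 1e-12)
    return tf.reduce_mean((norms - 1.0) ** 2)

def train_step(real_images, one_hot_labels, generator, critic, opt_c, opt_g):
    batch_size = tf.shape(real_images)[0]                 # Step 1: images and labels unpacked
    label_channels = expand_labels(one_hot_labels)        # Step 2
    z = tf.random.normal([batch_size, Z_DIM])

    # Steps 3-5: critic update, with two-input generator and critic calls
    with tf.GradientTape() as tape:
        fake_images = generator([z, one_hot_labels], training=True)          # Step 3
        real_scores = critic([real_images, label_channels], training=True)   # Step 4
        fake_scores = critic([fake_images, label_channels], training=True)
        gp = gradient_penalty(critic, real_images, fake_images, label_channels)
        c_loss = tf.reduce_mean(fake_scores) - tf.reduce_mean(real_scores) + GP_WEIGHT * gp
    opt_c.apply_gradients(zip(tape.gradient(c_loss, critic.trainable_variables),
                              critic.trainable_variables))

    # Step 6: the generator update mirrors the critic step (two inputs everywhere)
    with tf.GradientTape() as tape:
        fake_images = generator([z, one_hot_labels], training=True)
        g_loss = -tf.reduce_mean(critic([fake_images, label_channels], training=True))
    opt_g.apply_gradients(zip(tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    return c_loss, g_loss
```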
Analysis of CGAN
• One can control the CGAN output by passing a particular one-hot
encoded label into the input of the generator
• For example, to generate a face with non-blond hair, the label vector [1,0] is passed in
• To generate a face with blond hair, the label vector [0,1] is passed in (see the usage sketch below)
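A small usage sketch, assuming the generator from the earlier sketches and the encoding above (index 0 = not blond, index 1 = blond):

```python
import tensorflow as tf

z = tf.random.normal([1, 100])            # one latent sample (Z_DIM = 100 assumed)
not_blond = tf.constant([[1.0, 0.0]])     # [1,0] -> non-blond hair
blond = tf.constant([[0.0, 1.0]])         # [0,1] -> blond hair

# Same latent vector, different label vectors: the label conditions the output
img_not_blond = generator([z, not_blond], training=False)
img_blond = generator([z, blond], training=False)
```

Only the label input differs between the two calls; it is the label that conditions the generated face on the hair-colour attribute.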
Figure: output from the CGAN when the Blond and Not Blond vectors are appended to the latent sample
Summary
• We built a CGAN that allows us to control the type of image that is generated
• This works by passing the label as an input to both the critic and the generator
• This gives the network the additional information it needs to condition the generated output on a given label