Generative Adversarial Networks
Generate new data with GANs.
AI that creates.
What are GANs?
Two neural networks competing:
- **Generator**: Creates fake data
- **Discriminator**: Detects fake vs real
Like a counterfeiter vs. a detective!
How GANs Work
1. The generator creates a fake image
2. The discriminator tries to identify it
3. Both improve through competition
4. Eventually, the fakes look real!
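Formally, this competition is a minimax game (the objective from Goodfellow et al.'s original GAN paper): the discriminator D maximizes its classification accuracy while the generator G minimizes it:

```latex
\min_G \max_D \; V(D, G) =
  \mathbb{E}_{x \sim p_{\text{data}}(x)}[\log D(x)]
  + \mathbb{E}_{z \sim p_z(z)}[\log(1 - D(G(z)))]
```

At the equilibrium, the discriminator can do no better than a coin flip (D(x) = 1/2 everywhere), which happens exactly when the generator's distribution matches the data.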
Simple GAN
```python
from tensorflow.keras.layers import Dense, Reshape, Flatten
from tensorflow.keras.models import Sequential
import numpy as np

# Generator: noise → image
generator = Sequential([
    Dense(128, activation='relu', input_shape=(100,)),
    Dense(256, activation='relu'),
    Dense(784, activation='tanh'),
    Reshape((28, 28, 1))
])

# Discriminator: image → real/fake
discriminator = Sequential([
    Flatten(input_shape=(28, 28, 1)),
    Dense(256, activation='relu'),
    Dense(128, activation='relu'),
    Dense(1, activation='sigmoid')  # 0 = fake, 1 = real
])

discriminator.compile(optimizer='adam', loss='binary_crossentropy')
```
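To train the generator, we also need the stacked generator-plus-discriminator model that the training loop below calls `gan`. A minimal sketch of the standard pattern (the variable name matches the loop):

```python
# Freeze the discriminator inside the combined model. Keras snapshots the
# trainable flag at compile time, so the standalone discriminator compiled
# above still updates during its own train_on_batch calls.
discriminator.trainable = False

# Stacked model: noise → generator → discriminator → real/fake score
gan = Sequential([generator, discriminator])
gan.compile(optimizer='adam', loss='binary_crossentropy')
```

Freezing at compile time means we never have to toggle `trainable` inside the loop: gradient updates through `gan` only touch the generator's weights.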
Training Loop
```python
def train_gan(generator, discriminator, gan, X_train, epochs=10000, batch_size=32):
    for epoch in range(epochs):
        # 1. Train the discriminator
        # Real images (X_train is assumed scaled to [-1, 1] to match tanh)
        idx = np.random.randint(0, X_train.shape[0], batch_size)
        real_images = X_train[idx]
        real_labels = np.ones((batch_size, 1))

        # Fake images
        noise = np.random.normal(0, 1, (batch_size, 100))
        fake_images = generator.predict(noise, verbose=0)
        fake_labels = np.zeros((batch_size, 1))

        # Train the discriminator on real and fake batches
        d_loss_real = discriminator.train_on_batch(real_images, real_labels)
        d_loss_fake = discriminator.train_on_batch(fake_images, fake_labels)

        # 2. Train the generator through the stacked model; the discriminator
        # inside `gan` was frozen at compile time, so only G's weights update
        noise = np.random.normal(0, 1, (batch_size, 100))
        misleading_labels = np.ones((batch_size, 1))  # we want D to call them real

        g_loss = gan.train_on_batch(noise, misleading_labels)

        if epoch % 1000 == 0:
            d_loss = 0.5 * (d_loss_real + d_loss_fake)
            print(f"Epoch {epoch}, D loss: {d_loss:.4f}, G loss: {g_loss:.4f}")
```
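Once trained, sampling is just a forward pass through the generator. A quick sketch (assumes matplotlib is available):

```python
import matplotlib.pyplot as plt

# Draw 16 noise vectors and decode them into images
noise = np.random.normal(0, 1, (16, 100))
samples = generator.predict(noise, verbose=0)

# Rescale from tanh's [-1, 1] back to [0, 1] for display
samples = (samples + 1) / 2

fig, axes = plt.subplots(4, 4, figsize=(4, 4))
for img, ax in zip(samples, axes.flat):
    ax.imshow(img.squeeze(), cmap='gray')
    ax.axis('off')
plt.show()
```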
DCGAN - Deep Convolutional GAN
Convolutional layers work much better for images:
```python
from tensorflow.keras.layers import Conv2D, Conv2DTranspose, BatchNormalization, LeakyReLU

# Generator with transposed-conv upsampling: 7x7 → 14x14 → 28x28
generator = Sequential([
    Dense(7*7*256, input_shape=(100,)),
    Reshape((7, 7, 256)),
    Conv2DTranspose(128, 5, strides=1, padding='same'),
    BatchNormalization(),
    LeakyReLU(),
    Conv2DTranspose(64, 5, strides=2, padding='same'),
    BatchNormalization(),
    LeakyReLU(),
    Conv2DTranspose(1, 5, strides=2, padding='same', activation='tanh')
])

# Discriminator with strided convolutions instead of pooling
discriminator = Sequential([
    Conv2D(64, 5, strides=2, padding='same', input_shape=(28, 28, 1)),
    LeakyReLU(),
    Conv2D(128, 5, strides=2, padding='same'),
    LeakyReLU(),
    Flatten(),
    Dense(1, activation='sigmoid')
])
```
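Whichever generator you use, the real data must live in the same range as its tanh output. A typical MNIST preparation with the standard Keras loader:

```python
from tensorflow.keras.datasets import mnist

(X_train, _), (_, _) = mnist.load_data()   # shape: (60000, 28, 28)
X_train = X_train.astype('float32')
X_train = (X_train - 127.5) / 127.5        # scale to [-1, 1] to match tanh
X_train = X_train.reshape(-1, 28, 28, 1)   # add the channel dimension
```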
Conditional GAN
Generate specific types:
```python
from tensorflow.keras.layers import Input, Concatenate
from tensorflow.keras.models import Model

# Input: noise + label (e.g., which digit to generate)
noise_input = Input(shape=(100,))
label_input = Input(shape=(10,))  # one-hot encoded

# Combine the inputs
combined = Concatenate()([noise_input, label_input])

# Generator
x = Dense(128, activation='relu')(combined)
output = Dense(784, activation='tanh')(x)

cgan_generator = Model([noise_input, label_input], output)

# Now we can generate specific digits!
noise = np.random.normal(0, 1, (1, 100))
label = np.array([[0, 0, 1, 0, 0, 0, 0, 0, 0, 0]])  # one-hot for the digit "2"
generated_digit = cgan_generator.predict([noise, label])
```
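For the cGAN to work end to end, the discriminator also needs to see the label, so it can judge "is this a real image *of this digit*". One minimal way to build it in the same functional style (layer sizes here are illustrative):

```python
# Discriminator also conditions on the label
image_input = Input(shape=(784,))   # matches the generator's flat 784 output
d_label_input = Input(shape=(10,))

d_combined = Concatenate()([image_input, d_label_input])
x = Dense(128, activation='relu')(d_combined)
validity = Dense(1, activation='sigmoid')(x)  # 0 = fake, 1 = real

cgan_discriminator = Model([image_input, d_label_input], validity)
cgan_discriminator.compile(optimizer='adam', loss='binary_crossentropy')
```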
Applications
- **Image Generation**: Create realistic faces
- **Image-to-Image**: Photos → Sketches, Day → Night
- **Super Resolution**: Low res → High res
- **Data Augmentation**: Create training data
- **Art Generation**: Novel artwork
Famous GANs
- **StyleGAN**: Photorealistic faces
- **Pix2Pix**: Image-to-image translation
- **CycleGAN**: Unpaired image translation
- **BigGAN**: High-resolution images
Challenges
- **Mode Collapse**: Generates the same thing repeatedly
- **Training Instability**: Hard to balance G and D
- **Evaluation**: Hard to measure quality
Tips for Stable Training
1. Use LeakyReLU instead of ReLU
2. Batch normalization helps
3. Use the Adam optimizer
4. Train the discriminator more frequently
5. Add noise to the real images
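A few of these tips translated into Keras, as a hedged sketch (the Adam settings are the DCGAN paper's commonly cited defaults; the label-smoothing value is a commonly used trick, not from this tutorial's code):

```python
import numpy as np
from tensorflow.keras.optimizers import Adam

# Tips 1 & 3: LeakyReLU activations (see the DCGAN models above) and
# Adam with the DCGAN paper's settings: lr=0.0002, beta_1=0.5
optimizer = Adam(learning_rate=2e-4, beta_1=0.5)

# Tip 5 (one common variant): one-sided label smoothing — train the
# discriminator on 0.9 instead of 1.0 so it doesn't grow overconfident
batch_size = 32
real_labels = np.full((batch_size, 1), 0.9)

# Tip 5: add a little Gaussian noise to real images before showing them
# to the discriminator (a stand-in batch here, for illustration only)
real_images = np.zeros((batch_size, 28, 28, 1), dtype='float32')
noisy_real = real_images + np.random.normal(0, 0.05, real_images.shape)
```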
Remember
- GANs are powerful but tricky to train
- Start with the DCGAN architecture
- Monitor both losses
- Expect instability