📊CNN Training with Data Augmentation, Batch Normalization, and Weight Initialization 🧑‍💻

This repository showcases a structured and progressive approach to training a Convolutional Neural Network (CNN) for binary image classification (cats vs dogs). The project demonstrates the impact of various training techniques such as data augmentation, batch normalization, and weight initialization on model performance.


📁 Repository Structure

This project contains the following key Jupyter notebooks:

Augmentation_Example.ipynb

A standalone demonstration of common image augmentation techniques using Keras' ImageDataGenerator. This includes:

  • Rotation
  • Horizontal/vertical flipping
  • Zooming
  • Shifting
  • Brightness adjustments
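The transformations above can be sketched with a single `ImageDataGenerator`; the parameter values here are illustrative and may differ from those used in the notebook.

```python
import numpy as np
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Generator combining the listed transformations (values are examples).
datagen = ImageDataGenerator(
    rotation_range=40,           # rotation (degrees)
    horizontal_flip=True,        # horizontal flipping
    vertical_flip=True,          # vertical flipping
    zoom_range=0.2,              # zooming
    width_shift_range=0.2,       # horizontal shifting
    height_shift_range=0.2,      # vertical shifting
    brightness_range=(0.7, 1.3)  # brightness adjustments
)

# One dummy RGB image (batch of 1) to demonstrate the flow.
image = np.random.rand(1, 150, 150, 3).astype("float32")
augmented = next(datagen.flow(image, batch_size=1))
print(augmented.shape)  # same shape as the input batch
```

Each call to the generator yields a freshly transformed batch, so the model never sees exactly the same image twice during training.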

cat_dog_training_CNN_with_data_augmentation.ipynb

Trains a simple CNN model on the Cats vs Dogs dataset with only data augmentation applied. Includes performance analysis using training/validation accuracy and loss.
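A minimal sketch of such a CNN is shown below; the layer sizes and input shape are assumptions and may not match the notebook's exact architecture.

```python
from tensorflow.keras import layers, models

# Simple CNN sketch for binary cats-vs-dogs classification.
model = models.Sequential([
    layers.Input(shape=(150, 150, 3)),
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # single probability: dog vs cat
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
# Training would feed augmented batches, e.g. from
# datagen.flow_from_directory(...) on the Cats vs Dogs dataset.
```

The sigmoid output and binary cross-entropy loss are the standard pairing for two-class problems.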

cat_dog_training_CNN_with_data_augmentation_and_batch_normalization.ipynb

Builds upon the previous model by incorporating batch normalization after convolutional layers to improve convergence and stability during training.
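The conv-block pattern with batch normalization can be sketched as follows; placing `BatchNormalization` between the convolution and its activation is one common arrangement, and the notebook's exact ordering may differ.

```python
from tensorflow.keras import layers, models

# Conv block with BatchNormalization after the convolution
# (architecture details are illustrative).
model = models.Sequential([
    layers.Input(shape=(150, 150, 3)),
    layers.Conv2D(32, (3, 3)),
    layers.BatchNormalization(),  # normalize activations before the nonlinearity
    layers.Activation("relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3)),
    layers.BatchNormalization(),
    layers.Activation("relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),
])
```

By normalizing each mini-batch's activations, the network becomes less sensitive to the scale of earlier layers' outputs, which is what stabilizes training.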

cat_dog_training_CNN_with_data_augmentation,_batch_normalization_and_weight_initialization.ipynb

The most advanced notebook in the repository. It applies:

  • Data Augmentation
  • Batch Normalization
  • He Normal Weight Initialization (kernel_initializer='he_normal')

This approach boosts model generalization and speeds up convergence.
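Applying He normal initialization is a one-argument change per layer, as in this hypothetical convolution:

```python
from tensorflow.keras import layers

# He normal initialization draws weights from a normal distribution with
# standard deviation sqrt(2 / fan_in), which pairs well with ReLU activations.
conv = layers.Conv2D(
    32, (3, 3),
    activation="relu",
    kernel_initializer="he_normal",
)
```

Setting `kernel_initializer='he_normal'` on each `Conv2D` and `Dense` layer keeps the variance of activations roughly constant across depth, which is what counters vanishing and exploding gradients.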

testing_the_CNN_for_Cats___Dogs.ipynb

Loads the trained CNN model and evaluates it on a set of test images. Includes visualization of predictions with labels.
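The load-and-predict flow can be sketched as below. To keep the example self-contained, a tiny stand-in model is saved and reloaded here; the notebook loads the actual trained CNN, and the file name is an assumption.

```python
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.models import load_model

# Stand-in model so the example runs end-to-end without the real weights.
tiny = models.Sequential([
    layers.Input(shape=(150, 150, 3)),
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),
])
tiny.save("cats_vs_dogs_cnn.keras")  # hypothetical file name

# Evaluation flow: load the model, run a test image, map probability to label.
model = load_model("cats_vs_dogs_cnn.keras")
img = np.random.rand(1, 150, 150, 3).astype("float32")  # stand-in test image
prob = float(model.predict(img, verbose=0)[0][0])
label = "dog" if prob >= 0.5 else "cat"
print(label, round(prob, 3))
```

In practice the test images would be loaded from disk, resized to the model's input shape, and rescaled the same way as the training data before calling `predict`.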


🧠 Key Concepts Covered

  • Data Augmentation: Mitigates overfitting by artificially expanding the training dataset.
  • Batch Normalization: Stabilizes training and allows for higher learning rates.
  • Weight Initialization: Helps prevent vanishing/exploding gradients and accelerates convergence.
  • Model Evaluation: Uses accuracy and loss plots, along with prediction visualization, to assess model performance.

🖼 Sample Augmentation Visuals

Using ImageDataGenerator, input images are augmented with:

  • Random flips
  • Brightness shifts
  • Zoom and rotation

This enhances model robustness to real-world image variations.


🚀 How to Use

  1. Clone the repository:

     ```bash
     git clone https://github.com/rusiru-erandaka/Data-Augmentation_Batch-Normalization_Weight-Initialization_Deeplearning.git
     cd Data-Augmentation_Batch-Normalization_Weight-Initialization_Deeplearning
     ```
