
Realtime Emotion Detector 😃

Welcome to the Realtime Emotion Detector repository! This project uses deep learning to analyze facial expressions captured through a webcam and classifies them into seven distinct emotions: Angry, Disgust, Fear, Happy, Neutral, Sad, and Surprise. A Convolutional Neural Network (CNN) performs the classification, while OpenCV handles real-time face detection, so the system gives on-screen emotion feedback as you watch the video feed.

Table of Contents

  • Introduction
  • Features
  • Technologies Used
  • Installation
  • Usage
  • How It Works
  • Model Architecture
  • Training the Model
  • Contributing
  • License
  • Acknowledgments

Introduction

Understanding human emotions is crucial in various fields such as psychology, marketing, and human-computer interaction. The Realtime Emotion Detector offers a way to analyze facial expressions quickly and accurately. This project combines machine learning with computer vision to provide real-time feedback on emotional states.

Features

  • Real-time Detection: Detects emotions in real-time using webcam input.
  • Multi-Emotion Classification: Classifies expressions into seven categories.
  • User-Friendly Interface: Easy to set up and use.
  • Open Source: Available for anyone to use, modify, and distribute.

Technologies Used

This project utilizes the following technologies:

  • Python: The primary programming language.
  • TensorFlow: For building and training the CNN model.
  • Keras: A high-level neural networks API.
  • OpenCV: For real-time face detection and image processing.
  • NumPy: For numerical operations.
  • Matplotlib: For data visualization.
  • Jupyter Notebook: For interactive coding and visualization.

Installation

To get started with the Realtime Emotion Detector, follow these steps:

  1. Clone the Repository:

    git clone https://github.com/carlthecoder123123/Realtime-emotion-detector.git
    cd Realtime-emotion-detector
  2. Install Dependencies: Make sure you have Python installed, then run (a fallback command is listed after these steps):

    pip install -r requirements.txt
  3. Download the Model: Download the pre-trained model from the Releases section and add the model file to your local copy of the repository.
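
If requirements.txt is not present in your checkout, the libraries listed under Technologies Used can be installed directly. The PyPI package names below are an assumption based on that list, not pinned versions taken from this repository:

    pip install tensorflow keras opencv-python numpy matplotlib jupyter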

Usage

To run the emotion detector, execute the following command in your terminal:

python emotion_detector.py

This will launch the webcam feed and start detecting emotions in real-time.

How It Works

The Realtime Emotion Detector works through the following steps (a minimal code sketch follows the list):

  1. Face Detection: OpenCV captures the video feed and detects faces in real-time.
  2. Preprocessing: Detected faces are resized and normalized for the CNN model.
  3. Emotion Classification: The CNN model predicts the emotion based on the processed face.
  4. Display Results: The detected emotion is displayed on the screen.
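
The sketch below illustrates this pipeline under a few assumptions: a Haar cascade for face detection, a Keras model saved as emotion_model.h5 (a hypothetical file name), and 48x48 grayscale inputs. The actual script in this repository may differ on any of these points.

    # Illustrative sketch of the real-time detection loop (assumptions noted above).
    import cv2
    import numpy as np
    from tensorflow.keras.models import load_model

    EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Neutral", "Sad", "Surprise"]

    # 1. Face Detection: a Haar cascade finds faces in each webcam frame.
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    model = load_model("emotion_model.h5")  # hypothetical model file name

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)

        for (x, y, w, h) in faces:
            # 2. Preprocessing: resize the face crop and normalize pixel values.
            roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
            roi = roi.reshape(1, 48, 48, 1)

            # 3. Emotion Classification: the CNN outputs one probability per emotion.
            probs = model.predict(roi, verbose=0)[0]
            label = EMOTIONS[int(np.argmax(probs))]

            # 4. Display Results: draw the bounding box and predicted label.
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, label, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)

        cv2.imshow("Realtime Emotion Detector", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()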

Model Architecture

The CNN model consists of several layer types (an illustrative sketch follows the list):

  • Convolutional Layers: Extract features from the input images.
  • Pooling Layers: Reduce the dimensionality of the feature maps.
  • Fully Connected Layers: Classify the features into the respective emotions.
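
As a rough illustration only, a CNN of this shape could be defined in Keras as follows; the layer counts, filter sizes, and compile settings are placeholders, not the repository's exact architecture:

    # Illustrative Keras architecture; hyperparameters are assumptions.
    from tensorflow.keras import layers, models

    def build_model(input_shape=(48, 48, 1), num_classes=7):
        model = models.Sequential([
            layers.Input(shape=input_shape),
            # Convolutional layers: extract features from the input images.
            layers.Conv2D(32, (3, 3), activation="relu"),
            layers.MaxPooling2D((2, 2)),   # Pooling: reduce feature-map size
            layers.Conv2D(64, (3, 3), activation="relu"),
            layers.MaxPooling2D((2, 2)),
            layers.Flatten(),
            # Fully connected layers: classify features into the seven emotions.
            layers.Dense(128, activation="relu"),
            layers.Dropout(0.5),
            layers.Dense(num_classes, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="categorical_crossentropy",
                      metrics=["accuracy"])
        return model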

Training the Model

If you want to train your own model, you can use the training scripts provided in the training directory. Ensure you have a labeled dataset of facial expressions for effective training.
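
A minimal training sketch is shown below. It assumes the build_model() helper from the architecture sketch above and a dataset arranged as one subdirectory per emotion; the paths, batch size, and epoch count are placeholders rather than the settings used by the provided training scripts:

    # Hedged training sketch; dataset paths and hyperparameters are placeholders.
    import tensorflow as tf

    train_ds = tf.keras.utils.image_dataset_from_directory(
        "data/train",                 # hypothetical path: one folder per emotion
        labels="inferred",
        label_mode="categorical",
        color_mode="grayscale",
        image_size=(48, 48),
        batch_size=64,
    )
    val_ds = tf.keras.utils.image_dataset_from_directory(
        "data/val",                   # hypothetical validation split
        labels="inferred",
        label_mode="categorical",
        color_mode="grayscale",
        image_size=(48, 48),
        batch_size=64,
    )

    normalize = lambda x, y: (x / 255.0, y)   # scale pixels to [0, 1]
    model = build_model()                      # from the architecture sketch above
    model.fit(train_ds.map(normalize),
              validation_data=val_ds.map(normalize),
              epochs=30)
    model.save("emotion_model.h5")             # file name used in the usage sketch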

Contributing

We welcome contributions! If you'd like to improve this project, please follow these steps:

  1. Fork the repository.
  2. Create a new branch (git checkout -b feature-branch).
  3. Make your changes.
  4. Commit your changes (git commit -m 'Add some feature').
  5. Push to the branch (git push origin feature-branch).
  6. Open a Pull Request.

License

This project is licensed under the MIT License. See the LICENSE file for more details.

Acknowledgments

  • OpenCV: For providing powerful tools for image processing.
  • TensorFlow and Keras: For making deep learning accessible.
  • NumPy and Matplotlib: For numerical operations and data visualization.

For more information and updates, visit the Releases section.
