# Realtime Emotion Detector 😃

A deep learning-powered real-time emotion detection system that classifies facial expressions captured from a live webcam feed into seven distinct emotions.

Welcome to the **Realtime Emotion Detector** repository! This project leverages deep learning to analyze facial expressions captured through a webcam and classifies them into seven distinct emotions: Angry 😠, Disgust 🤢, Fear 😨, Happy 😀, Neutral 😐, Sad 😢, and Surprise 😲. A Convolutional Neural Network (CNN) handles classification, OpenCV handles real-time face detection, and the predicted emotion is displayed as a label overlay on the live video feed.

## Table of Contents

- [Introduction](#introduction)
- [Features](#features)
- [Technologies Used](#technologies-used)
- [Installation](#installation)
- [Usage](#usage)
- [How It Works](#how-it-works)
- [Contributing](#contributing)
- [License](#license)
- [Acknowledgments](#acknowledgments)

## Introduction

Understanding human emotions is crucial in fields such as psychology, marketing, and human-computer interaction. The **Realtime Emotion Detector** offers a way to analyze facial expressions quickly and accurately, combining machine learning with computer vision to provide real-time feedback on emotional states.

## Features

- **Real-time Detection**: Detects faces and emotions in real time from webcam input using OpenCV.
- **Multi-Emotion Classification**: A trained CNN classifies expressions into seven categories.
- **Visual Overlay**: Displays the predicted emotion as a label on each detected face.
- **Clean Preprocessing Pipeline**: Grayscale conversion, resizing, and normalization before inference (see the sketch below).
- **Customizable**: Easy to set up, extend, and adapt to additional use cases.
- **Open Source**: Available for anyone to use, modify, and distribute.
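
The preprocessing pipeline is small enough to show in full. Below is a minimal sketch of it, assuming the model expects 48x48 grayscale inputs (as in FER-2013-style models); the function name is illustrative, not necessarily the repository's actual code.

```python
# Hypothetical preprocessing helper: grayscale -> resize -> normalize -> add batch/channel dims.
import cv2
import numpy as np

def preprocess_face(face_bgr: np.ndarray, size: int = 48) -> np.ndarray:
    """Turn a cropped BGR face image into a (1, size, size, 1) float32 tensor in [0, 1]."""
    gray = cv2.cvtColor(face_bgr, cv2.COLOR_BGR2GRAY)    # grayscale
    resized = cv2.resize(gray, (size, size))             # resize to model input
    normalized = resized.astype("float32") / 255.0       # normalize to [0, 1]
    return normalized.reshape(1, size, size, 1)          # batch and channel dimensions
```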

## Technologies Used

This project uses the following technologies:

- **Python 3.10**: The primary programming language.
- **TensorFlow**: For building and training the CNN model.
- **Keras**: A high-level neural networks API.
- **OpenCV**: For real-time face detection and image processing.
- **NumPy**: For numerical operations.
- **Matplotlib**: For data visualization (optional).
- **Jupyter Notebook**: For model training and interactive development.

## Installation

To get started with the **Realtime Emotion Detector**, follow these steps:

1. **Clone the Repository**:
   ```bash
   git clone https://github.com/carlthecoder123123/Realtime-emotion-detector.git
   cd Realtime-emotion-detector
   ```

2. **Create a Virtual Environment (optional)**:
   ```bash
   python3 -m venv env
   source env/bin/activate  # or .\env\Scripts\activate on Windows
   ```

3. **Install Dependencies**:
   Make sure you have Python installed. Then run:
   ```bash
   pip install -r requirements.txt
   ```

4. **Download the Model**:
   Download the pre-trained model and any other required files from the [Releases section](https://github.com/carlthecoder123123/Realtime-emotion-detector/releases).

## Usage

To run the emotion detector, execute the following command in your terminal:

```bash
python emotion_detector.py
```

This will launch the webcam feed and start detecting emotions in real time.

## How It Works

The **Realtime Emotion Detector** works through the following steps:

1. **Face Detection**: OpenCV captures the video feed and detects faces in real time.
2. **Preprocessing**: Detected faces are converted to grayscale, resized, and normalized for the CNN model.
3. **Emotion Classification**: The CNN model predicts the emotion from the processed face.
4. **Display Results**: The detected emotion is drawn as a label on the video frame.
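
Putting those steps together, the main loop might look roughly like the sketch below. This is an illustration, not the repository's exact `emotion_detector.py`: the model filename (`model.h5`), the Haar-cascade detector, and the label order are assumptions.

```python
# Illustrative real-time loop: capture -> detect -> preprocess -> classify -> overlay.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Neutral", "Sad", "Surprise"]  # assumed order

model = load_model("model.h5")  # assumed filename for the pre-trained CNN
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        # Preprocess the face region: crop, resize to 48x48, normalize, add batch/channel dims
        roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype("float32") / 255.0
        roi = roi.reshape(1, 48, 48, 1)
        label = EMOTIONS[int(np.argmax(model.predict(roi, verbose=0)))]
        # Overlay bounding box and predicted emotion on the frame
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
    cv2.imshow("Realtime Emotion Detector", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```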

### Model Architecture

The CNN model consists of several layers:

- **Convolutional Layers**: Four convolutional layers with ReLU activation extract features from the input images.
- **Pooling Layers**: MaxPooling layers reduce the dimensionality of the feature maps.
- **Dropout Layers**: Provide regularization to reduce overfitting.
- **Fully Connected Layers**: Dense layers classify the extracted features.
- **Output Layer**: A Softmax layer produces probabilities over the seven emotion classes.
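
As an illustration of that architecture, here is a hypothetical Keras sketch assuming 48x48 grayscale inputs; the filter counts, dense size, and dropout rates are placeholder choices, not the repository's exact configuration.

```python
# Illustrative CNN matching the description above; layer sizes are assumptions.
from tensorflow.keras import layers, models

def build_emotion_cnn(input_shape=(48, 48, 1), num_classes=7):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        # Four convolutional blocks: Conv (ReLU) + MaxPooling
        layers.Conv2D(32, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(128, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(256, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Dropout(0.25),                              # regularization
        layers.Flatten(),
        layers.Dense(256, activation="relu"),              # fully connected layer
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),   # 7 emotion classes
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```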

### Training the Model

If you want to train your own model, you can use the training scripts provided in the `training` directory. You will need a labeled dataset of facial expressions; this project uses **FER-2013** ([Kaggle link](https://www.kaggle.com/datasets/jonathanoheix/face-expression-recognition-dataset)), which contains about 35,900 labeled 48x48 grayscale images across the 7 emotion classes.
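
A minimal training sketch under those assumptions (directory layout, batch size, and epoch count are placeholders; `build_emotion_cnn` refers to the architecture sketch above):

```python
# Hypothetical training script; dataset paths and hyperparameters are assumptions.
from tensorflow.keras.preprocessing.image import ImageDataGenerator

train_gen = ImageDataGenerator(rescale=1.0 / 255).flow_from_directory(
    "data/train",                  # assumed path: one subfolder per emotion class
    target_size=(48, 48),
    color_mode="grayscale",
    class_mode="categorical",
    batch_size=64,
)
val_gen = ImageDataGenerator(rescale=1.0 / 255).flow_from_directory(
    "data/validation",             # assumed path
    target_size=(48, 48),
    color_mode="grayscale",
    class_mode="categorical",
    batch_size=64,
)

model = build_emotion_cnn()
model.fit(train_gen, validation_data=val_gen, epochs=30)
model.save("model.h5")             # filename assumed by the detection-loop sketch
```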

## Contributing

We welcome contributions! If you'd like to improve this project, please follow these steps:

1. Fork the repository.
2. Create a new branch (`git checkout -b feature-branch`).
3. Make your changes.
4. Commit your changes (`git commit -m 'Add some feature'`).
5. Push to the branch (`git push origin feature-branch`).
6. Open a Pull Request.

## License

This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for more details.

## Acknowledgments

- **OpenCV**: For providing powerful tools for image processing.
- **TensorFlow and Keras**: For making deep learning accessible.
- **NumPy and Matplotlib**: For numerical operations and data visualization.

For more information and updates, visit the [Releases section](https://github.com/carlthecoder123123/Realtime-emotion-detector/releases).