
Sign Language Detection

Overview

This project focuses on detecting sign language gestures using machine learning techniques. The implementation is provided in the Action Detection Refined.ipynb Jupyter Notebook, which covers data preprocessing, model training, and gesture recognition.
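
To make the workflow concrete, the sketch below shows one plausible form of the preprocessing step: reading a webcam frame with OpenCV, running MediaPipe Holistic on it, and flattening the detected pose and hand landmarks into a single keypoint vector. It is an illustration only; the helper name extract_keypoints, the landmark selection, and the resulting feature size are assumptions rather than values taken from the notebook.

    # Illustrative keypoint extraction with OpenCV + MediaPipe Holistic (names are assumed).
    import cv2
    import numpy as np
    import mediapipe as mp

    mp_holistic = mp.solutions.holistic

    def extract_keypoints(results):
        # Flatten pose and hand landmarks into one vector; use zeros when a part is not detected.
        pose = (np.array([[lm.x, lm.y, lm.z, lm.visibility] for lm in results.pose_landmarks.landmark]).flatten()
                if results.pose_landmarks else np.zeros(33 * 4))
        left = (np.array([[lm.x, lm.y, lm.z] for lm in results.left_hand_landmarks.landmark]).flatten()
                if results.left_hand_landmarks else np.zeros(21 * 3))
        right = (np.array([[lm.x, lm.y, lm.z] for lm in results.right_hand_landmarks.landmark]).flatten()
                 if results.right_hand_landmarks else np.zeros(21 * 3))
        return np.concatenate([pose, left, right])

    cap = cv2.VideoCapture(0)
    with mp_holistic.Holistic(min_detection_confidence=0.5, min_tracking_confidence=0.5) as holistic:
        ret, frame = cap.read()
        if ret:
            # MediaPipe expects RGB input, while OpenCV captures frames in BGR order.
            results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            keypoints = extract_keypoints(results)
            print(keypoints.shape)  # (258,) for 33 pose + 2 x 21 hand landmarks
    cap.release()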

Project Structure

  • Action Detection Refined.ipynb – Jupyter Notebook containing the complete workflow for sign language detection.

Technologies Used

  • Python
  • Jupyter Notebook
  • OpenCV
  • TensorFlow / MediaPipe (a possible model sketch is shown below)
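
Because the stack pairs TensorFlow with sequences of MediaPipe keypoints, a natural choice of model is a small LSTM classifier over fixed-length keypoint sequences. The sketch below is only a guess at the architecture: the layer sizes, the 30-frame sequence length, the 258-value feature vector, and the example gesture labels are all assumptions, not values read from the notebook.

    # Illustrative LSTM gesture classifier in TensorFlow/Keras (all hyperparameters are assumed).
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM, Dense

    SEQUENCE_LENGTH = 30   # frames per gesture sample (assumed)
    NUM_FEATURES = 258     # keypoints per frame, matching the extraction sketch above
    actions = ["hello", "thanks", "iloveyou"]  # placeholder gesture labels

    model = Sequential([
        LSTM(64, return_sequences=True, activation="relu",
             input_shape=(SEQUENCE_LENGTH, NUM_FEATURES)),
        LSTM(128, return_sequences=True, activation="relu"),
        LSTM(64, return_sequences=False, activation="relu"),
        Dense(64, activation="relu"),
        Dense(32, activation="relu"),
        Dense(len(actions), activation="softmax"),  # one probability per gesture class
    ])
    model.compile(optimizer="Adam", loss="categorical_crossentropy",
                  metrics=["categorical_accuracy"])

    # Training on prepared sequences would then look roughly like:
    # model.fit(X_train, y_train, epochs=200)  # X_train shape: (samples, 30, 258), y_train one-hot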

How to Run

  1. Clone the repository:
    git clone https://github.com/MPradeep-08/Sign_Language_Detection.git
    cd Sign_Language_Detection
    

  2. Open the notebook:
    jupyter notebook "Action Detection Refined.ipynb"

  3. Run the cells step by step to preprocess the data, train the model, and test sign detection. A sketch of a possible real-time detection loop follows these steps.
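
For a rough idea of what the final testing step can look like, the sketch below runs a trained model on a live webcam feed: it keeps a rolling window of the most recent 30 keypoint vectors and predicts a gesture whenever the window is full. It reuses the hypothetical extract_keypoints, mp_holistic, SEQUENCE_LENGTH, actions, and model names from the sketches above, so treat it as an outline rather than the notebook's actual loop.

    # Illustrative real-time detection loop (reuses the assumed helpers defined in the sketches above).
    import cv2
    import numpy as np

    sequence = []      # rolling window of the most recent keypoint vectors
    threshold = 0.7    # minimum confidence before a prediction is displayed (assumed)

    cap = cv2.VideoCapture(0)
    with mp_holistic.Holistic(min_detection_confidence=0.5, min_tracking_confidence=0.5) as holistic:
        while cap.isOpened():
            ret, frame = cap.read()
            if not ret:
                break
            results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            sequence.append(extract_keypoints(results))
            sequence = sequence[-SEQUENCE_LENGTH:]  # keep only the last 30 frames

            if len(sequence) == SEQUENCE_LENGTH:
                probs = model.predict(np.expand_dims(sequence, axis=0))[0]
                if probs.max() > threshold:
                    # Overlay the most likely gesture label on the frame.
                    cv2.putText(frame, actions[int(np.argmax(probs))], (10, 30),
                                cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)

            cv2.imshow("Sign Language Detection", frame)
            if cv2.waitKey(10) & 0xFF == ord("q"):  # press q to quit
                break
    cap.release()
    cv2.destroyAllWindows()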
