
Sign Language Detector 🤙☝️👆👌🤞


Introduction

Welcome to the American Sign Language Detector!

The ASL Sign Language Detector is a tool designed to recognize and interpret American Sign Language gestures. In this project, we generated hand landmarks using Mediapipe, then trained a classifier with the Support Vector Machine (SVM) implementation provided by scikit-learn. We used this Kaggle dataset to train the model.

Key Features

  • Real-time Recognition: The detector is capable of recognizing ASL signs in real-time, making it suitable for live interactions.
  • User-Friendly Interface: The application comes with a user-friendly interface, making it accessible to users of all backgrounds.

Demo

Click here to try the Sign Language Detector on Streamlit:

  • Open the link.

  • Click the Start button to turn on the camera, and make sure there is enough light.

  • Show some ASL signs for it to detect.

  • Click Stop to turn off the camera.

Run Locally

Clone the project

  git clone https://github.com/aashishops/ASL-Detector

Go to the project directory

  cd ASL-Detector

Install Prerequisites

  pip install -r requirements.txt 

Run the app with Streamlit

  streamlit run app.py

Or run the app with OpenCV

  python asl_gesture.py

How it Detects

  • Hand Tracking: It employs hand tracking to locate and track the user's hand gestures.

  • Gesture Recognition: A machine learning model trained on a dataset of ASL signs recognizes and interprets the gestures.

How it was done

  • Landmark Generation:

The hand landmarks were generated using the Mediapipe library.

These landmark coordinates are collected as a dataset in Dataset.csv for training and classification.
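The collection step can be sketched roughly as follows. This is a minimal sketch, not the repository's actual code: the `Landmark` class stands in for a Mediapipe landmark, and the row layout (label first, then flattened x/y coordinates) is an assumption.

```python
import csv
from dataclasses import dataclass

# Illustrative stand-in for a Mediapipe hand landmark (normalized coordinates).
@dataclass
class Landmark:
    x: float
    y: float

def landmarks_to_row(landmarks, label):
    """Flatten 21 hand landmarks into one CSV row: label, x0, y0, ..., x20, y20."""
    row = [label]
    for lm in landmarks:
        row.extend([lm.x, lm.y])
    return row

# Example: 21 dummy landmarks labeled with the sign "A".
hand = [Landmark(x=i / 21, y=i / 42) for i in range(21)]
row = landmarks_to_row(hand, "A")

# Append the row to the dataset file (hypothetical layout for Dataset.csv).
with open("Dataset.csv", "a", newline="") as f:
    csv.writer(f).writerow(row)
```

Each captured hand pose becomes one such row, so the dataset grows to one labeled feature vector per sample.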

The dataset was then used to train a Support Vector Machine (SVM) model with scikit-learn.
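The training step can be sketched with scikit-learn's `SVC` as follows. Synthetic data stands in for Dataset.csv here, and the two-class setup is purely illustrative; the real dataset has one row of landmark coordinates per captured hand pose.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for Dataset.csv: 42 features (x, y for 21 landmarks) per row,
# with two well-separated clusters pretending to be two ASL signs.
X = np.vstack([
    rng.normal(0.3, 0.05, size=(100, 42)),  # rows labeled "A"
    rng.normal(0.7, 0.05, size=(100, 42)),  # rows labeled "B"
])
y = np.array(["A"] * 100 + ["B"] * 100)

# Fit an SVM classifier (scikit-learn's SVC, RBF kernel by default).
clf = SVC(kernel="rbf")
clf.fit(X, y)

# At detection time, a new landmark vector is classified the same way.
pred = clf.predict(np.full((1, 42), 0.3))
```

At runtime, the same `predict` call maps each freshly extracted landmark vector to a sign label, which is what the app displays over the camera feed.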

License

MIT License