I wanted to experiment with MediaPipe's gesture recognition and thought it would be fun to detect American Sign Language.
Create a program that uses the user's webcam to recognize hand gestures and allows the user to type with them.
Using MediaPipe's gesture recognition, a custom ASL detector was created that allows the user to write text to a .txt file using only hand gestures.
In order to train my own gesture recognition model, I first needed to create a dataset of example gestures. I took approximately 100 pictures of each hand gesture in the ASL alphabet and fed these images into a Python script that trained the model. A second Python script then loads that model, reads frames from the user's webcam, and lets the user type using their hand gestures.
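As a rough illustration of the training step, the sketch below uses MediaPipe Model Maker's gesture recognizer, which is the standard way to train a custom gesture model from a folder of labeled images. The folder name `asl_images` and export directory `exported_model` are placeholders, not the repo's actual paths.

```python
from mediapipe_model_maker import gesture_recognizer

# Each subfolder of "asl_images" is assumed to hold the ~100 example
# photos for one ASL letter, with the folder name as the label.
data = gesture_recognizer.Dataset.from_folder(
    dirname="asl_images",
    hparams=gesture_recognizer.HandDataPreprocessingParams(),
)
train_data, rest = data.split(0.8)
validation_data, test_data = rest.split(0.5)

# Train and export a .task model file for use at recognition time.
hparams = gesture_recognizer.HParams(export_dir="exported_model")
options = gesture_recognizer.GestureRecognizerOptions(hparams=hparams)
model = gesture_recognizer.GestureRecognizer.create(
    train_data=train_data,
    validation_data=validation_data,
    options=options,
)
model.export_model()
```

And here is a minimal sketch of the typing loop, assuming the exported model is saved as `asl_recognizer.task`, each gesture label is the corresponding ASL letter, and the output goes to `output.txt`; the key bindings are illustrative rather than the repo's exact controls.

```python
import cv2
import mediapipe as mp
from mediapipe.tasks import python
from mediapipe.tasks.python import vision

# Load the custom gesture model ("asl_recognizer.task" is a placeholder name).
base_options = python.BaseOptions(model_asset_path="asl_recognizer.task")
options = vision.GestureRecognizerOptions(base_options=base_options)
recognizer = vision.GestureRecognizer.create_from_options(options)

cap = cv2.VideoCapture(0)  # open the default webcam
typed = []                 # letters accepted so far

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # MediaPipe expects an RGB mp.Image; OpenCV delivers BGR frames.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    mp_image = mp.Image(image_format=mp.ImageFormat.SRGB, data=rgb)

    result = recognizer.recognize(mp_image)
    if result.gestures:
        # The gesture label is assumed to be the ASL letter itself.
        letter = result.gestures[0][0].category_name
        cv2.putText(frame, letter, (30, 60),
                    cv2.FONT_HERSHEY_SIMPLEX, 2, (0, 255, 0), 3)

    cv2.imshow("ASL Typing", frame)
    key = cv2.waitKey(1) & 0xFF
    if key == ord(" ") and result.gestures:
        typed.append(letter)   # space accepts the currently shown letter
    elif key == 27:            # Esc finishes the session
        break

cap.release()
cv2.destroyAllWindows()

# Write the accumulated letters to a .txt file.
with open("output.txt", "w") as f:
    f.write("".join(typed))
```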
| Skills Demonstrated | Project Artifacts |
|---|---|
| OpenCV, MediaPipe, Gesture Recognition | GitHub Repo |
This program can be used by anyone looking to practice ASL or train custom gesture models of their own.