Capstone
ABSTRACT
Disabled individuals often rely on assistance for their daily needs and tasks. For deaf and mute individuals, sign language serves as the primary mode of communication, yet many cannot read or write. To bridge this gap, a novel system has been developed that translates sign language gestures into English text and then converts that text into speech. The system is trained on image samples of the 26 American Sign Language (ASL) alphabet signs. During testing, hand gestures are captured from a live video feed, and class labels are predicted using several trained models: CNN, FRCNN, and YOLO. Among these, the CNN model exhibits the highest accuracy and was therefore adopted for this project. Real-time data is incorporated into the training process to further improve accuracy, yielding impressive results. The translator is implemented and trained on a dataset of 15,600 hand-sign image samples, with 600 images for each letter of the alphabet, ensuring robust performance.
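The classification pipeline described above can be illustrated with a minimal forward pass: a convolution layer, ReLU activation, max pooling, and a fully connected softmax layer over the 26 alphabet classes. This is a NumPy sketch with randomly initialized weights, shown only to make the architecture concrete; the actual project's layer sizes, input resolution, and trained parameters are not specified in this report, so every dimension below is an assumption.

```python
import numpy as np

def conv2d(x, kernels):
    """Valid convolution of one grayscale image with a bank of kernels."""
    kh, kw = kernels.shape[1], kernels.shape[2]
    h, w = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((kernels.shape[0], h, w))
    for k in range(kernels.shape[0]):
        for i in range(h):
            for j in range(w):
                out[k, i, j] = np.sum(x[i:i + kh, j:j + kw] * kernels[k])
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling over each feature map."""
    c, h, w = x.shape
    h2, w2 = h // size, w // size
    return x[:, :h2 * size, :w2 * size].reshape(c, h2, size, w2, size).max(axis=(2, 4))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(image, kernels, weights, bias):
    feat = np.maximum(conv2d(image, kernels), 0)   # conv + ReLU
    pooled = max_pool(feat)                        # downsample feature maps
    return softmax(weights @ pooled.ravel() + bias)

# Hypothetical sizes: 28x28 grayscale frame, 8 kernels of 3x3, 26 output classes.
rng = np.random.default_rng(0)
image = rng.random((28, 28))                       # one hand-sign frame
kernels = rng.standard_normal((8, 3, 3)) * 0.1
flat_dim = 8 * 13 * 13                             # 8 maps of (28-3+1)//2 = 13
weights = rng.standard_normal((26, flat_dim)) * 0.01
bias = np.zeros(26)

probs = forward(image, kernels, weights, bias)
predicted = chr(ord('A') + int(np.argmax(probs)))  # class index -> ASL letter
```

In the real system the kernels and weights would be learned from the 15,600-image training set rather than drawn at random, and a deeper stack of convolution/pooling layers would typically be used.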
Figure 1: System diagram.
Novelty of Project
This project introduces a novel approach to translating sign language into English speech, aimed particularly at deaf and mute individuals who cannot read or write. Unlike existing technologies, it focuses on real-time detection of static ASL signs, leveraging a CNN for high accuracy. Its emphasis on inclusivity and real-time operation distinguishes it from conventional solutions.
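Real-time recognition from a live feed produces one noisy prediction per frame, so some form of smoothing is needed before letters are appended to the output text. The report does not describe how the project handles this, so the sketch below shows one common, assumed technique: debouncing, where a letter is accepted only after it has been the top prediction for several consecutive frames. The `GestureBuffer` class and the `hold` parameter are hypothetical names introduced for illustration.

```python
from collections import deque

class GestureBuffer:
    """Debounce noisy per-frame predictions: commit a letter only after it
    has been the top prediction for `hold` consecutive frames."""

    def __init__(self, hold=5):
        self.hold = hold
        self.recent = deque(maxlen=hold)
        self.text = []

    def update(self, letter):
        self.recent.append(letter)
        if len(self.recent) == self.hold and len(set(self.recent)) == 1:
            # Stable prediction; avoid re-emitting the same held gesture.
            if not self.text or self.text[-1] != letter:
                self.text.append(letter)
            self.recent.clear()
        return "".join(self.text)

# Simulated per-frame model outputs for the signs H and I.
buf = GestureBuffer(hold=3)
for frame_label in ["H", "H", "H", "I", "I", "I"]:
    sentence = buf.update(frame_label)
# sentence == "HI"
```

In the deployed system, `frame_label` would come from the CNN's prediction on each captured frame, and the committed text would be passed to a text-to-speech engine.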
Impact on society/environment
This project has a profound impact on society by enhancing communication accessibility for disabled individuals,
fostering inclusivity and empowerment. By providing real-time translation of sign language into speech, it
facilitates greater participation in social and professional spheres, promoting equality and understanding among
diverse communities.
Conclusion
In conclusion, our Sign Language Translation System project stands as a beacon of inclusivity and accessibility, significantly advancing communication for the hearing-impaired community. By achieving a remarkable 99.94 percent accuracy rate in sign language recognition, this project underscores our commitment to empowering deaf individuals to engage more fully in all aspects of life.