Breaking barriers: Study uses AI to interpret American Sign Language in real-time

By Kavish | December 16, 2024
    Bader Alsharif, first author and a Ph.D. candidate in the FAU Department of Electrical Engineering and Computer Science. Credit: Florida Atlantic University

    Sign language serves as a sophisticated means of communication vital to individuals who are deaf or hard-of-hearing, relying on hand movements, facial expressions, and body language to convey nuanced meaning. American Sign Language exemplifies this linguistic complexity with its distinct grammar and syntax.

    Sign language is not universal; rather, there are many different sign languages used around the world, each with its own grammar, syntax and vocabulary, highlighting the diversity and complexity of sign languages globally.

    Various methods are being explored to convert sign language hand gestures into text or spoken language in real time. To improve communication accessibility for people who are deaf or hard-of-hearing, there is a need for a dependable, real-time system that can accurately detect and track American Sign Language gestures. This system could play a key role in breaking down communication barriers and ensuring more inclusive interactions.

    To address these communication barriers, researchers from the College of Engineering and Computer Science at Florida Atlantic University conducted a first-of-its-kind study focused on recognizing American Sign Language alphabet gestures using computer vision. They developed a custom dataset of 29,820 static images of American Sign Language hand gestures.

    Using MediaPipe, each image was annotated with 21 key landmarks on the hand, providing detailed spatial information about its structure and position.
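The annotation step described above can be sketched in code. MediaPipe's hand tracker emits 21 normalized (x, y) landmarks per detected hand; one simple way to derive a YOLO-format bounding-box label from them is shown below. This is an illustrative sketch, not the authors' pipeline: the padding margin, class id, and the fake landmark data are assumptions.

```python
# Sketch: deriving a YOLO label line from MediaPipe-style hand landmarks.
# The 21-landmark layout with normalized (x, y) coordinates follows
# MediaPipe Hands; the padding value and class id are illustrative.

def landmarks_to_yolo_label(landmarks, class_id=0, pad=0.05):
    """Convert 21 normalized (x, y) hand landmarks into one YOLO label
    line: 'class x_center y_center width height', all values in [0, 1]."""
    xs = [x for x, _ in landmarks]
    ys = [y for _, y in landmarks]
    # Tight box around the hand, expanded by a small margin and
    # clamped to the image boundaries.
    x_min = max(0.0, min(xs) - pad)
    x_max = min(1.0, max(xs) + pad)
    y_min = max(0.0, min(ys) - pad)
    y_max = min(1.0, max(ys) + pad)
    x_c = (x_min + x_max) / 2
    y_c = (y_min + y_max) / 2
    w = x_max - x_min
    h = y_max - y_min
    return f"{class_id} {x_c:.6f} {y_c:.6f} {w:.6f} {h:.6f}"

# Example with a fabricated, roughly centered hand:
fake_hand = [(0.4 + 0.01 * i, 0.5 + 0.005 * i) for i in range(21)]
print(landmarks_to_yolo_label(fake_hand))
```

In a real pipeline, the landmark coordinates themselves (not just the box) would also be stored, since they carry the fine-grained pose information the study credits for the accuracy gains.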

    These annotations played a critical role in enhancing the precision of YOLOv8, the deep learning model the researchers trained, by allowing it to better detect subtle differences in hand gestures.

    Results of the study, published in Franklin Open, reveal that by leveraging this detailed hand pose information, the model achieved a more refined detection process, accurately capturing the complex structure of American Sign Language gestures.

    Combining MediaPipe for hand-movement tracking with YOLOv8 for gesture detection resulted in a powerful system for recognizing American Sign Language alphabet gestures with high accuracy.
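For concreteness, training a YOLOv8 detector of this kind is typically driven by a small dataset description file (for example, with the widely used ultralytics implementation). The sketch below is a hedged illustration only: the directory layout and the letter classes are assumptions, not details taken from the paper.

```yaml
# Hypothetical YOLOv8 dataset config (asl_alphabet.yaml).
# Paths and class names are illustrative, not the authors' actual setup.
path: datasets/asl_alphabet
train: images/train
val: images/val

names:
  0: A
  1: B
  2: C
  # ... one class per alphabet gesture ...
  25: Z
```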

    “Combining MediaPipe and YOLOv8, along with fine-tuning hyperparameters for the best accuracy, represents a groundbreaking and innovative approach,” said Bader Alsharif, first author and a Ph.D. candidate in the FAU Department of Electrical Engineering and Computer Science. “This method hasn’t been explored in previous research, making it a new and promising direction for future advancements.”

    Findings show that the model performed with an accuracy of 98%, the ability to correctly identify gestures (recall) at 98%, and an overall performance score (F1 score) of 99%. It also achieved a mean Average Precision (mAP) of 98% and a more detailed mAP50-95 score of 93%, highlighting its strong reliability and precision in recognizing American Sign Language gestures.
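For readers unfamiliar with these metrics: precision is the fraction of detections that are correct, recall is the fraction of true gestures that are detected, and F1 is their harmonic mean. The small sketch below shows the relationship; the counts are invented for illustration and are not from the study.

```python
# How precision, recall, and F1 relate. The counts (true positives,
# false positives, false negatives) are made up for illustration.

def precision_recall_f1(tp, fp, fn):
    precision = tp / (tp + fp)   # correct detections / all detections
    recall = tp / (tp + fn)      # correct detections / all true gestures
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return precision, recall, f1

# E.g., 980 correct detections, 20 false alarms, 20 missed gestures:
p, r, f1 = precision_recall_f1(tp=980, fp=20, fn=20)
print(f"precision={p:.2%} recall={r:.2%} F1={f1:.2%}")
# → precision=98.00% recall=98.00% F1=98.00%
```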

    “Results from our research demonstrate our model’s ability to accurately detect and classify American Sign Language gestures with very few errors,” said Alsharif. “Importantly, findings from this study emphasize not only the robustness of the system but also its potential to be used in practical, real-time applications to enable more intuitive human-computer interaction.”

    The successful integration of landmark annotations from MediaPipe into the YOLOv8 training process significantly improved both bounding box accuracy and gesture classification, allowing the model to capture subtle variations in hand poses. This two-step approach of landmark tracking and object detection proved essential in ensuring the system’s high accuracy and efficiency in real-world scenarios.

    The model’s ability to maintain high recognition rates even under varying hand positions and gestures highlights its strength and adaptability in diverse operational settings.

    “Our research demonstrates the potential of combining advanced object detection algorithms with landmark tracking for real-time gesture recognition, offering a reliable solution for American Sign Language interpretation,” said Mohammad Ilyas, Ph.D., co-author and a professor in the FAU Department of Electrical Engineering and Computer Science.

    “The success of this model is largely due to the careful integration of transfer learning, meticulous dataset creation, and precise tuning of hyperparameters. This combination has led to the development of a highly accurate and reliable system for recognizing American Sign Language gestures, representing a major milestone in the field of assistive technology.”

    Future efforts will focus on expanding the dataset to include a wider range of hand shapes and gestures to improve the model’s ability to differentiate between gestures that may appear visually similar, thus further enhancing recognition accuracy. Additionally, optimizing the model for deployment on edge devices will be a priority, ensuring that it retains its real-time performance in resource-constrained environments.

    “By improving American Sign Language recognition, this work contributes to creating tools that can enhance communication for the deaf and hard-of-hearing community,” said Stella Batalama, Ph.D., dean, FAU College of Engineering and Computer Science.

    “The model’s ability to reliably interpret gestures opens the door to more inclusive solutions that support accessibility, making daily interactions—whether in education, health care, or social settings—more seamless and effective for individuals who rely on sign language. This progress holds great promise for fostering a more inclusive society where communication barriers are reduced.”

    More information:
    Bader Alsharif et al, Transfer learning with YOLOV8 for real-time recognition system of American Sign Language Alphabet, Franklin Open (2024). DOI: 10.1016/j.fraope.2024.100165

    Provided by
    Florida Atlantic University


    Citation:
    Breaking barriers: Study uses AI to interpret American Sign Language in real-time (2024, December 16)
    retrieved 16 December 2024
    from https://techxplore.com/news/2024-12-barriers-ai-american-language-real.html

