Curated by THEOUTPOST
On Fri, 21 Feb, 12:04 AM UTC
7 Sources
[1]
Nvidia has built a free AI-led platform to help teach American Sign Language with '400,000 video clips representing 1,000 signed words' so far
If you've been meaning to learn American Sign Language (ASL), or just want to brush up on your vocabulary, Nvidia has announced a free new tool, built alongside the American Society for Deaf Children and agency Hello Monday, that harnesses AI to teach prospective users how to sign. The announcement post on the Nvidia Blog laments the fact that, despite ASL being the "third most prevalent language in the United States", AI tools aren't teaching Americans ASL as prominently as they are English and Spanish.

Signs, Nvidia's new ASL learning platform, uses footage from your webcam to instruct you and correct your signing. Notably, you can't currently use Signs without allowing it access to your webcam. This is because the app gives you direct feedback on how you sign and how to move your hands more clearly.

Nvidia is building a database of "400,000 video clips representing 1,000 signed words", which is then validated by ASL speakers. Effectively, Nvidia's AI is used to interpret signs and categorise them before passing them on to real-life users who can validate the findings. Though we don't yet have any confirmation on other forms of sign language, the press release states: "NVIDIA teams plan to use this dataset to further develop AI applications that break down communication barriers between the deaf and hearing communities."

ASL is more complex than simple hand gestures and can be influenced by facial expressions, so the team behind Signs' next step is to figure out how to interpret those in its corrections and teachings. A future version of the tool is also intended to incorporate "regional variations and slang terms". The dataset is due for release later this year, but for now, you can learn ASL or contribute to the honing of the app by visiting the Signs website.
The mention of AI may throw up warning signs, but this use of it seems to amount to more than just compiling (or scraping copyrighted) data, as common parlance of the term may suggest. Signs actively attempts to interpret gestures, something AI is uniquely good at. The fact that it all goes through a human at the end of the chain, who can verify authenticity, should help stop the data from skewing or the model from hallucinating. And the fact that it's being used to teach an oft-overlooked language, rather than to generate glossy pictures of emojis, makes its use of AI all the better. If Nvidia can build meaningfully on this start to include more gestures and dialects, it could have a real winner on its hands.
[2]
Nvidia uses AI to release Signs, a sign language teaching platform - SiliconANGLE
To increase accessibility to American Sign Language learning, Nvidia Corp. today announced the launch of a new artificial intelligence-powered platform that offers instruction day and night. The platform, called Signs, was developed in partnership between Nvidia, the American Society for Deaf Children and the creative agency Hello Monday. It is an interactive web platform that uses a camera to give AI-enabled feedback and a voiced 3D avatar to teach sign language.

ASL is the third largest language in the United States, yet it is underrepresented in AI datasets in comparison to spoken and written languages such as English and Spanish. To correct this disparity, Nvidia said, it became necessary to build a set of data and validate it. The company said it trained the AI, which instructs users by reading their gestures and providing a visual guide, with a dataset it aims to grow to over 400,000 video clips representing around 1,000 signed words. Each sign will be validated by fluent ASL users and interpreters to ensure its accuracy.

"Most deaf children are born to hearing parents. Giving family members accessible tools like Signs to start learning ASL early enables them to open an effective communication channel with children as young as six to eight months old," said Cheri Dowling, executive director of the American Society for Deaf Children.

Along with teaching sign language, the platform also allows signers of any skill level to contribute by signing specific words to add to Nvidia's growing open-source ASL video dataset. The company said the dataset is planned for release later this year. Although Signs currently focuses on the user's hand movements and finger positions for signing, ASL also incorporates facial expressions to convey meaning.
For example, in the sign for "hot," the signer places clawed fingers near their mouth and then thrusts them away as if removing something hot from their mouth; this is accompanied by a facial expression as if they're exhaling. Raised eyebrows incorporated into signs can signal questions, and gaze can direct attention to a subject or action. Changing facial expressions during a sentence can convey intensity or stress on a certain part of a statement. The team behind the app said it is exploring how to track these signals and integrate them into future versions.

Nvidia further said that the team is exploring how regional dialects, slang terms and other variations can be represented in Signs. It should be noted that ASL, just like American English and Spanish, is a living language with a diverse vocabulary that can vary between different communities. It is a distinct language that is continuously changing, with new signs being created to convey concepts as the need arises.

"Improving ASL accessibility is an ongoing effort," said Anders Jessen, founding partner of Hello Monday. "Signs can serve the need for advanced AI tools that help transcend communication barriers between the deaf and hearing communities."
[3]
Nvidia helps launch AI platform for teaching American Sign Language
Nvidia has unveiled a new AI platform for teaching people how to use American Sign Language to help bridge communication gaps. The Signs platform is creating a validated dataset for sign language learners and developers of ASL-based AI applications.

It so happens that American Sign Language is the third most prevalent language in the United States -- but there are vastly fewer AI tools developed with ASL data than data representing the country's most common languages, English and Spanish. Nvidia, the American Society for Deaf Children and creative agency Hello Monday are helping close this gap with Signs, an interactive web platform built to support ASL learning and the development of accessible AI applications.

Sign language learners can access the platform's validated library of ASL signs to expand their vocabulary with the help of a 3D avatar that demonstrates signs -- and use an AI tool that analyzes webcam footage to receive real-time feedback on their signing. Signers of any skill level can contribute by signing specific words to help build an open-source video dataset for ASL. The dataset -- which Nvidia aims to grow to 400,000 video clips representing 1,000 signed words -- is being validated by fluent ASL users and interpreters to ensure the accuracy of each sign, resulting in a high-quality visual dictionary and teaching tool.

"Most deaf children are born to hearing parents. Giving family members accessible tools like Signs to start learning ASL early enables them to open an effective communication channel with children as young as six to eight months old," said Cheri Dowling, executive director of the American Society for Deaf Children, in a statement. "And knowing that professional ASL teachers have validated all the vocabulary on the platform, users can be confident in what they're learning."
Nvidia teams plan to use this dataset to further develop AI applications that break down communication barriers between the deaf and hearing communities. The data is slated to be available to the public as a resource for building accessible technologies including AI agents, digital human applications and video conferencing tools. It could also be used to enhance Signs and enable ASL platforms across the ecosystem with real-time, AI-powered support and feedback. Whether novice or expert, volunteers can record themselves signing to contribute to the ASL dataset.

Supporting ASL education and exploring language nuance

During the data collection phase, Signs already provides a powerful platform for ASL language acquisition, offering opportunities for individuals to learn and practice an initial set of 100 signs so they can more effectively communicate with friends or family members who use ASL.

"The Signs learning platform could help families with deaf children quickly search for a specific word and see how to make the corresponding sign. It's a tool that can help support their everyday use of ASL outside of a more formal class," Dowling said. "I see both kids and parents exploring it -- and I think they could play with it together."

While Signs currently focuses on hand movements and finger positions for each sign, ASL also incorporates facial expressions and head movements to convey meaning. The team behind Signs is exploring how these non-manual signals can be tracked and integrated in future versions of the platform. They're also investigating how other nuances, like regional variations and slang terms, can be represented in Signs to enrich its ASL database -- and working with researchers at the Rochester Institute of Technology's Center for Accessibility and Inclusion Research to evaluate and further improve the user experience of the Signs platform for deaf and hard-of-hearing users.
"Improving ASL accessibility is an ongoing effort," said Anders Jessen, founding partner of Hello Monday/DEPT, which built the Signs web platform and previously worked with the American Society for Deaf Children on Fingerspelling.xyz, an application that taught users the ASL alphabet. "Signs can serve the need for advanced AI tools that help transcend communication barriers between the deaf and hearing communities."

The dataset behind Signs is planned for release later this year. Start learning or contributing with Signs at signs-ai.com, and learn more about Nvidia's trustworthy AI initiatives. Attendees of Nvidia GTC, a global AI conference taking place March 17-21 in San Jose, will be able to participate in Signs live at the event.
[4]
Nvidia wants to teach you sign language with its new AI tool
Together with the American Society for Deaf Children and creative agency Hello Monday, Nvidia developed an interactive web platform called Signs to support ASL learning. The platform will also provide a dataset, validated by fluent ASL users and interpreters, that developers can use to build more accessible AI applications.

Signs features a library of ASL signs for learners to improve their vocabulary, as well as a 3D avatar teacher. Learners can get real-time feedback on their signing via an AI tool that analyzes webcam footage. The platform initially has 100 signs and is focused on hand movements and finger positions. Users can also learn the meanings of different facial expressions and head movements.

Nvidia said it wants to grow the signs library to 400,000 video clips covering 1,000 signed words. Signs users can contribute to the video dataset, which is open source. The company added that it plans to make the dataset available to the public for building accessible AI agents, video conferencing features, and other AI tools.

"Most deaf children are born to hearing parents. Giving family members accessible tools like Signs to start learning ASL early enables them to open an effective communication channel with children as young as six to eight months old," Cheri Dowling, executive director of the American Society for Deaf Children, said in a statement. "And knowing that professional ASL teachers have validated all the vocabulary on the platform, users can be confident in what they're learning."

The company is currently working with the Rochester Institute of Technology's Center for Accessibility and Inclusion Research to improve the platform, and is looking to include regional and slang terms in the Signs library. The dataset will be released sometime this year, Nvidia said. Meanwhile, Nvidia is preparing for its annual GPU Technology Conference in March, where attendees will get to use Signs. The company is set to report its fiscal fourth-quarter earnings next week.
[5]
It's a Sign: AI Platform for Teaching American Sign Language Aims to Bridge Communication Gaps
American Sign Language is the third most prevalent language in the United States -- but there are vastly fewer AI tools developed with ASL data than data representing the country's most common languages, English and Spanish. NVIDIA, the American Society for Deaf Children and creative agency Hello Monday are helping close this gap with Signs, an interactive web platform built to support ASL learning and the development of accessible AI applications.

Sign language learners can access the platform's validated library of ASL signs to expand their vocabulary with the help of a 3D avatar that demonstrates signs -- and use an AI tool that analyzes webcam footage to receive real-time feedback on their signing. Signers of any skill level can contribute by signing specific words to help build an open-source video dataset for ASL. The dataset -- which NVIDIA aims to grow to 400,000 video clips representing 1,000 signed words -- is being validated by fluent ASL users and interpreters to ensure the accuracy of each sign, resulting in a high-quality visual dictionary and teaching tool.

"Most deaf children are born to hearing parents. Giving family members accessible tools like Signs to start learning ASL early enables them to open an effective communication channel with children as young as six to eight months old," said Cheri Dowling, executive director of the American Society for Deaf Children. "And knowing that professional ASL teachers have validated all the vocabulary on the platform, users can be confident in what they're learning."

NVIDIA teams plan to use this dataset to further develop AI applications that break down communication barriers between the deaf and hearing communities. The data is slated to be available to the public as a resource for building accessible technologies including AI agents, digital human applications and video conferencing tools.
It could also be used to enhance Signs and enable ASL platforms across the ecosystem with real-time, AI-powered support and feedback.

During the data collection phase, Signs already provides a powerful platform for ASL language acquisition, offering opportunities for individuals to learn and practice an initial set of 100 signs so they can more effectively communicate with friends or family members who use ASL.

"The Signs learning platform could help families with deaf children quickly search for a specific word and see how to make the corresponding sign. It's a tool that can help support their everyday use of ASL outside of a more formal class," Dowling said. "I see both kids and parents exploring it -- and I think they could play with it together."

While Signs currently focuses on hand movements and finger positions for each sign, ASL also incorporates facial expressions and head movements to convey meaning. The team behind Signs is exploring how these non-manual signals can be tracked and integrated in future versions of the platform. They're also investigating how other nuances, like regional variations and slang terms, can be represented in Signs to enrich its ASL database -- and working with researchers at the Rochester Institute of Technology's Center for Accessibility and Inclusion Research to evaluate and further improve the user experience of the Signs platform for deaf and hard-of-hearing users.

"Improving ASL accessibility is an ongoing effort," said Anders Jessen, founding partner of Hello Monday/DEPT, which built the Signs web platform and previously worked with the American Society for Deaf Children on Fingerspelling.xyz, an application that taught users the ASL alphabet. "Signs can serve the need for advanced AI tools that help transcend communication barriers between the deaf and hearing communities."

The dataset behind Signs is planned for release later this year.
[6]
Want to learn American Sign Language? AI will teach you now - here's how
Developed by NVIDIA and the American Society for Deaf Children, a new AI-driven website known as Signs invites people to learn how to sign.

Have you ever wanted to learn sign language to communicate with family members, friends, or other people who are deaf? If so, you might want to try a new interactive website that uses AI to train you in American Sign Language (ASL). Known as Signs, the site shows you how to sign and then uses the camera on your PC or mobile device to make sure you're shaping your hand and fingers correctly.

How does Signs work?

Assuming you're brand new to ASL, head to the Signs website. After getting past the initial screen, choose the option to Learn ASL and start the tutorial. Make sure your camera is activated and adjusted properly and that you have enough space to sign.

An online 3D avatar then shows you how to sign your first word, namely Hello. With your own face and hand visible on the screen, follow the avatar's gestures to sign the word. The training then moves on to a couple of other words and phrases, including Thank you and Who.

After completing the tutorial, you can choose the level you want to tackle. The higher the level, the more complex the words you'll be asked to sign. If you're a beginner, you might want to start with the first level; if you already know some ASL, you may consider jumping ahead to one of the higher levels.

As the avatar shows you the signs, it also calmly and patiently describes how your hand and fingers should move. Each time you replicate a sign correctly, a ding sounds to indicate that you're catching on, and you're awarded points to keep track of your progress. You can keep going all the way until you complete level four, giving you a good foundation. You may also want to repeat the training for each level until you become skilled enough that you no longer need the lessons.
The website is decidedly a team effort. NVIDIA developed the open ASL database in collaboration with the American Society for Deaf Children. New York-based creative studio Hello Monday/DEPT, which designed a similar site called Fingerspelling, built the Signs platform to use AI to help people learn interactively.

The site is open to different kinds of participants. Those who want to learn ASL can take advantage of the interactive training, while those already familiar with signing can contribute their own videos to expand the number of words and signs accessible in the site's database. Even people just getting started with ASL can upload videos of their signs to enhance the collection. Each submitted video is verified by Deaf individuals and certified interpreters to make sure the hand and finger movements and positions are accurate.

Though people tend to think of sign language as using just the hands and fingers, facial expressions and head movements also play a role in communicating with the Deaf community. Toward that end, the developers are thinking of ways to incorporate face tracking and other motion-capture methods.

"NVIDIA and Hello Monday/DEPT believe everyone has the right to expression," the website says on its About page. "At the same time, NVIDIA and Hello Monday/DEPT recognize that some communities lack equitable access to means of expression. We created this platform primarily to help teach hearing parents that may not know how to sign, connect and communicate with Deaf children and family members. We see this as an opportunity to use AI for Good, to teach and engage the Deaf community meaningfully."
[7]
Nvidia Unveils 'Signs' AI Platform For Teaching American Sign Language, Expanding Into Assistive Tech - NVIDIA (NASDAQ:NVDA), Tesla (NASDAQ:TSLA)
NVIDIA Corp. NVDA has unveiled Signs, an innovative artificial intelligence-powered platform designed to enhance American Sign Language (ASL) learning and accessibility, entering the assistive technology space as companies like Neuralink make strides in neural communication.

What Happened: The platform, developed in partnership with the American Society for Deaf Children and Hello Monday, aims to create a comprehensive validated dataset of 400,000 video clips representing 1,000 signed words.

"Most deaf children are born to hearing parents. Giving family members accessible tools like Signs to start learning ASL early enables them to open an effective communication channel with children as young as six to eight months old," said Cheri Dowling, executive director of the American Society for Deaf Children.

Why It Matters: While NVIDIA focuses on external AI-driven communication solutions, Tesla Inc. CEO Elon Musk's Neuralink is developing complementary neural technology. The company's upcoming "Blindsight" product aims to restore vision in individuals born blind, demonstrating the expanding possibilities in neural communication assistance.

The Signs platform features a 3D avatar demonstrating signs and provides real-time AI feedback on users' signing through webcam analysis. Currently offering an initial set of 100 signs, Nvidia's research team is exploring the integration of facial expressions and head movements in future versions, while also investigating regional variations and slang terms.

Nvidia plans to make the dataset publicly available later this year, potentially catalyzing the development of new accessible technologies including AI agents and video conferencing tools. The platform is currently accessible at signs-ai.com, with live demonstrations planned for the upcoming NVIDIA GTC conference in San Jose, March 17-21.
Nvidia, in collaboration with the American Society for Deaf Children and creative agency Hello Monday, has launched 'Signs', an innovative AI-powered platform designed to teach American Sign Language (ASL) [1]. This free, interactive web-based tool aims to bridge communication gaps and increase accessibility to ASL learning.
Signs utilizes AI to interpret and provide feedback on users' sign language gestures captured via webcam. The platform features a 3D avatar that demonstrates signs and offers real-time feedback to learners [2]. Currently, the system focuses on hand movements and finger positions, but the team plans to incorporate facial expressions and head movements in future versions to capture the full complexity of ASL.
A key component of the Signs platform is its growing dataset, which Nvidia aims to expand to 400,000 video clips representing 1,000 signed words [3]. This dataset is being meticulously validated by fluent ASL users and interpreters to ensure accuracy, creating a high-quality visual dictionary and teaching tool.
The development of Signs involves collaboration with researchers from the Rochester Institute of Technology's Center for Accessibility and Inclusion Research to improve the user experience for deaf and hard-of-hearing individuals [4]. Future enhancements may include regional variations, slang terms, and the integration of facial expressions to convey meaning more accurately.
Nvidia plans to make the Signs dataset publicly available later this year, providing a valuable resource for developers to build more accessible AI applications, including AI agents, digital human applications, and video conferencing tools [5]. This initiative aims to address the underrepresentation of ASL in AI datasets compared to spoken languages like English and Spanish.
The Signs platform offers significant potential for ASL education, particularly for families with deaf children. Cheri Dowling, executive director of the American Society for Deaf Children, emphasized the importance of early ASL learning: "Giving family members accessible tools like Signs to start learning ASL early enables them to open an effective communication channel with children as young as six to eight months old" [5].
Signs encourages community participation by allowing users of all skill levels to contribute to the open-source video dataset. This collaborative approach aims to enrich the platform's ASL database and support the development of more inclusive AI technologies [3].
As Nvidia prepares to showcase Signs at its annual GPU Technology Conference in March, the company continues to refine and expand this innovative tool, potentially revolutionizing ASL education and accessibility in the digital age [4].
Reference
[5] The Official NVIDIA Blog | It's a Sign: AI Platform for Teaching American Sign Language Aims to Bridge Communication Gaps
© 2025 TheOutpost.AI All rights reserved