{"id":104669,"date":"2025-05-15T22:38:15","date_gmt":"2025-05-15T22:38:15","guid":{"rendered":"https:\/\/www.europesays.com\/uk\/104669\/"},"modified":"2025-05-15T22:38:15","modified_gmt":"2025-05-15T22:38:15","slug":"how-were-advancing-accessibility-at-meta","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/uk\/104669\/","title":{"rendered":"How We\u2019re Advancing Accessibility at Meta"},"content":{"rendered":"<p>Today, I\u2019m recognizing Global Accessibility Awareness Day by reflecting on Meta\u2019s continued efforts to create products that promote a more accessible future. Building and improving accessibility features helps ensure we deliver meaningful impact for everyone, and I\u2019m proud to share some of our latest developments.\u00a0\u00a0<\/p>\n<p>Helping People Navigate Life Hands-Free<\/p>\n<p>Ray-Ban Meta glasses offer a hands-free form factor and Meta AI integrations \u2014 features that help everyone navigate daily life, but can be especially useful to the blind and low vision community. With Ray-Ban Meta glasses, you can capture and share photos, send text or voice messages, make phone calls, take video calls, listen to music, translate speech in real time, and interact with Meta AI for in-the-moment help.\u00a0<\/p>\n<p>Since Ray-Ban Meta launched, people have captured and shared millions of moments with loved ones, and as we expand availability around the world, we\u2019ve enjoyed seeing the many ways communities use the glasses to live more connected lives.\u00a0<\/p>\n<p>Starting today, we\u2019re introducing the ability to customize Meta AI to provide detailed responses on Ray-Ban Meta glasses based on what\u2019s in front of you. With this new feature, Meta AI will give more descriptive answers when people ask about their environment. This feature will begin to roll out to all users in the U.S. 
and Canada in the coming weeks and expand to additional markets in the future. To get started, go to the Device settings section in the Meta AI app and toggle on detailed responses under Accessibility.<\/p>\n<p>I\u2019m also excited to share that our <a href=\"https:\/\/www.bemyeyes.com\/news\/be-my-eyes-is-rolling-out-on-ray-ban-meta-glasses-starting-today\/\" target=\"_blank\" rel=\"noopener\">Call a Volunteer<\/a> feature, created in partnership with Be My Eyes, will launch later this month in all 18 countries where Meta AI is supported. Call a Volunteer connects blind or low vision individuals to a network of sighted volunteers in real time to help them complete everyday tasks.<\/p>\n<p>Designing for Better Human-Computer Interaction<\/p>\n<p>Wristband devices can facilitate human-computer interaction (HCI) for people with diverse physical abilities, including those with hand paralysis or tremor. We\u2019re exploring these capabilities through our work to develop sEMG (surface electromyography) wristbands at scale for on-the-go interactions with computing systems. Wristbands that use sEMG, or muscle signals, as a form of input are particularly promising for accessible HCI. This is because muscle signals at the wrist can provide control input even if someone can\u2019t produce large movements (due to a spinal cord injury, stroke or another disabling event), experiences too much movement (due to tremor), or has fewer than five fingers on their hand.\u00a0\u00a0<\/p>\n<p>The sEMG wristband used for <a href=\"https:\/\/www.meta.com\/blog\/orion-ar-glasses-augmented-reality\/?srsltid=AfmBOoqzxKlduFeUH2IeQ5ooTH2VDUX-c2a24QWiF-zMvUIoLpT3m_U1\" target=\"_blank\" rel=\"noopener\">Orion<\/a>, our AR glasses product prototype, is our latest iteration of this technology. 
As part of our journey to develop sEMG wristbands for a diverse range of people, we\u2019ve been investing in <a href=\"https:\/\/www.meta.com\/blog\/surface-emg-wristband-electromyography-human-computer-interaction-hci\/\" target=\"_blank\" rel=\"noopener\">collaborative research<\/a> that focuses on accessibility use cases.<\/p>\n<p>In April, we completed data collection with a Clinical Research Organization (CRO) to evaluate the ability of people with hand tremors (due to Parkinson\u2019s disease and essential tremor) to use sEMG-based models for computer controls (like swiping and clicking) and for sEMG-based handwriting. We also have an <a href=\"https:\/\/www.youtube.com\/watch?v=edAUrOcN4E4\" target=\"_blank\" rel=\"noopener\">active research collaboration with Carnegie Mellon University<\/a> to enable people with hand paralysis due to spinal cord injury to use sEMG-based controls for human-computer interaction. Although these individuals retain very few motor signals, our high-resolution technology can detect them, and we can teach people to use those signals quickly, enabling HCI as early as Day 1 of system use.<\/p>\n<p>Removing Barriers to Communication<\/p>\n<p>We\u2019re working to make the metaverse more accessible by providing live captions and <a href=\"https:\/\/www.meta.com\/help\/quest\/530990386475072\/?srsltid=AfmBOop4QJQKCNWpNYzkXqWsLKyTO-OadQZbpHtVH_f_Zn9hLrkRZEqO\" target=\"_blank\" rel=\"noopener\">live speech<\/a> in our extended reality products. Live captions work by converting spoken words into text in real time, allowing users to read the content as it\u2019s being delivered. 
This feature is available at the <a href=\"https:\/\/www.meta.com\/help\/quest\/674999931400954\/?srsltid=AfmBOorO5SY97H73As-t4C6zVIbqO80pEzCxwN9-djt1O1Kjp2tqW-JL\" target=\"_blank\" rel=\"noopener\">Quest system level<\/a>, Meta Horizon <a href=\"https:\/\/www.meta.com\/help\/quest\/3507156169537057\/?srsltid=AfmBOopsbNwawrihU82AulttdsNb7M_YULlr5w_4HzauHIeRyEpKwHPM\" target=\"_blank\" rel=\"noopener\">call level<\/a> and in <a href=\"https:\/\/www.meta.com\/help\/quest\/273423262445175\/?srsltid=AfmBOop3HtTQyAMLLPB2LD28vBK1j2SuSZCkmiVgoUOLA-1hfWVBCx25\" target=\"_blank\" rel=\"noopener\">Meta Horizon Worlds<\/a>.<\/p>\n<p>Live speech converts text into synthetic audio, providing an alternative means of communication for people who may struggle with verbal interactions or prefer not to use their voice. Since launch, we\u2019ve observed extremely high retention of the live speech feature and have rolled out enhancements, including the ability to personalize and save frequently used messages.\u00a0<\/p>\n<p><a href=\"https:\/\/about.fb.com\/wp-content\/uploads\/2025\/05\/03_SignSpeak.gif\" target=\"_blank\" rel=\"noopener\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-43580\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2025\/05\/03_SignSpeak.gif\" alt=\"GIF of a person signing into their device and SignSpeak avatar responding\" width=\"960\" height=\"836\"\/><\/a><\/p>\n<p>We\u2019re also excited by the ways <a href=\"https:\/\/www.llama.com\/\" target=\"_blank\" rel=\"noopener\">Llama<\/a>, our collection of open-source AI models, is being used to promote accessibility. Developers at <a href=\"https:\/\/www.sign-speak.com\/\" target=\"_blank\" rel=\"noopener\">Sign-Speak<\/a> have paired their API with Llama to create a WhatsApp chatbot that translates American Sign Language (ASL), facilitating communication between Deaf people and hearing people. 
With this software, a Deaf person can sign ASL into a device, and the software translates it into English text for the hearing person. The hearing person can reply by voice or text, and the software signs the response to the Deaf person through an avatar.\u00a0\u00a0<\/p>\n<p>We\u2019re committed to investing in features and products that make connection easier for all, and we\u2019ll continue to evolve to address the needs of the billions of people around the world who use our products. <\/p>\n","protected":false},"excerpt":{"rendered":"Today, I\u2019m recognizing Global Accessibility Awareness Day by reflecting on Meta\u2019s continued efforts to create products that promote&hellip;\n","protected":false},"author":2,"featured_media":104670,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3162],"tags":[45634,48152,27113,48153,53,16,15,3243,3244],"class_list":{"0":"post-104669","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-virtual-reality","8":"tag-accessibility","9":"tag-artificial-intelligence-and-machine-learning","10":"tag-augmented-reality","11":"tag-metaverse","12":"tag-technology","13":"tag-uk","14":"tag-united-kingdom","15":"tag-virtual-reality","16":"tag-vr"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@uk\/114514251196676928","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/104669","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/comments?post=104669"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com
\/uk\/wp-json\/wp\/v2\/posts\/104669\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media\/104670"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media?parent=104669"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/categories?post=104669"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/tags?post=104669"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}