AI to enable paralysed woman to 'speak' through a digital avatar; here’s how

Using the AI interface, the patient can now formulate her responses in writing, much like the system developed by Synchron for individuals with locked-in syndrome

A brain-computer interface translates the study participant’s brain signals into the speech and facial movements of an animated avatar. (Credit: Noah Berger)
SUMMARY
  • This system empowered a stroke-paralysed woman to communicate freely by utilising a digital avatar under her control
  • BCIs are devices that monitor the analogue signals generated by the brain and translate them into digital signals comprehensible to computers
  • The researchers worked with Speech Graphics, the company behind the facial animation technology seen in games like Halo Infinite and The Last of Us Part II, to create a digital avatar for the patient

A collaborative team of researchers from UC San Francisco and UC Berkeley, working with Edinburgh-based Speech Graphics, has introduced a communication system that uses artificial intelligence and machine learning. The system enabled a stroke-paralysed woman to communicate freely through a digital avatar under her control, operated via a brain-computer interface.

Brain-computer interfaces (BCIs) are devices that monitor the analogue signals generated by the brain and translate them into digital signals that computers can process, akin to an analogue-to-digital converter in a mixing soundboard, but designed to fit within the confines of the skull.
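As a rough illustration of that conversion step – and not the UCSF team's actual implementation – the short Python sketch below samples a continuous voltage trace and quantises it into integer levels a computer can work with. The sampling rate and bit depth are assumptions chosen purely for the example.

# Illustrative only: a toy analogue-to-digital step, not the study's pipeline.
# Analogue neural voltages are sampled at a fixed rate and quantised into
# integers a computer can process, much like an ADC in audio hardware.
import numpy as np

SAMPLE_RATE_HZ = 1_000     # assumed sampling rate for this sketch
ADC_BITS = 10              # assumed quantisation depth

def digitise(analogue_signal: np.ndarray, v_min=-1.0, v_max=1.0) -> np.ndarray:
    """Clip an analogue voltage trace and map it onto 2**ADC_BITS integer levels."""
    clipped = np.clip(analogue_signal, v_min, v_max)
    levels = 2 ** ADC_BITS - 1
    return np.round((clipped - v_min) / (v_max - v_min) * levels).astype(np.int32)

# Simulate one second of a noisy "neural" oscillation and digitise it.
t = np.arange(0, 1, 1 / SAMPLE_RATE_HZ)
analogue = 0.5 * np.sin(2 * np.pi * 12 * t) + 0.05 * np.random.randn(t.size)
digital = digitise(analogue)
print(digital[:10])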

In this study, overseen by Dr Edward Chang, the head of neurological surgery at UCSF, researchers implanted a 253-pin electrode array into the speech centre of the patient's brain. The electrodes captured the electrical signals that would normally trigger the muscles of her jaw, lips, and tongue. Instead of stimulating those muscles, the signals were routed via a cable port in her skull to a bank of processors. This computational stack included a machine learning model which, after several weeks of training, became proficient at recognising over 1,000 words from the patient's unique patterns of electrical activity.
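The sketch below is a deliberately simplified stand-in for that decoding stage, not the study's model: it assumes each attempted word yields a 253-channel feature vector (say, average signal power per electrode) and trains an off-the-shelf classifier on synthetic data to map those vectors onto a toy vocabulary.

# A minimal decoding sketch with made-up data, not the study's actual model.
import numpy as np
from sklearn.linear_model import LogisticRegression

N_ELECTRODES = 253
VOCAB = ["hello", "water", "yes", "no", "thank you"]   # toy vocabulary

rng = np.random.default_rng(0)
# Synthetic training data standing in for weeks of recorded speech attempts.
X_train = rng.normal(size=(500, N_ELECTRODES))
y_train = rng.integers(0, len(VOCAB), size=500)

decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# At "run time", a new feature vector from the electrode array is decoded
# into the most likely word from the vocabulary.
new_trial = rng.normal(size=(1, N_ELECTRODES))
print(VOCAB[decoder.predict(new_trial)[0]])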

However, this is only part of the breakthrough. Using the AI interface, the patient can now compose her responses in writing, much like the system developed by Synchron for individuals with locked-in syndrome. Yet she can also "speak", in a manner of speaking, using a synthesised voice trained on recordings of her natural voice from before her paralysis – similar to the approach taken with digitally recreated celebrities.


The research team collaborated with Speech Graphics, the company behind the facial animation technology seen in games like Halo Infinite and The Last of Us Part II, to create a digital avatar for the patient. Speech Graphics' technology reverse-engineers the intricate musculoskeletal movements a real face would make, based on an analysis of audio input. That data is then fed in real time to a game engine, producing a seamless animation of the avatar's facial expressions. Because the patient's brain signals are linked directly to the avatar, she can convey emotions and communicate non-verbally as well.
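The sketch below gestures at that audio-driven animation idea rather than reproducing Speech Graphics' proprietary pipeline: it reduces each short audio frame to a single loudness feature, maps it to invented facial "muscle" weights, and hands them to a placeholder function standing in for the real-time link to a game engine. Every function and parameter name here is hypothetical.

# Hypothetical audio-to-animation loop; not Speech Graphics' system.
import numpy as np

FRAME_MS = 20
SAMPLE_RATE = 16_000
FRAME_LEN = SAMPLE_RATE * FRAME_MS // 1000

def audio_to_muscle_weights(frame: np.ndarray) -> dict:
    """Crude stand-in: louder frames open the jaw wider and round the lips more."""
    energy = float(np.sqrt(np.mean(frame ** 2)))
    return {"jaw_open": min(1.0, energy * 5.0),
            "lip_round": min(1.0, energy * 3.0)}

def send_to_engine(weights: dict) -> None:
    # Placeholder for a real-time link to a game engine's animation rig.
    print(weights)

# Drive the loop with synthetic audio standing in for the synthesised voice.
audio = 0.3 * np.sin(2 * np.pi * 220 * np.arange(SAMPLE_RATE) / SAMPLE_RATE)
for start in range(0, audio.size - FRAME_LEN, FRAME_LEN):
    send_to_engine(audio_to_muscle_weights(audio[start:start + FRAME_LEN]))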

Michael Berger, the CTO and co-founder of Speech Graphics, remarked, "Creating a digital avatar that can speak, emote, and articulate in real-time, connected directly to the subject’s brain, shows the potential for AI-driven faces well beyond video games. Restoring voice alone is impressive, but facial communication is so intrinsic to being human, and it restores a sense of embodiment and control to the patient who has lost that."

BCI technology was pioneered in the early 1970s and has experienced gradual evolution over the ensuing decades. Recent exponential progress in processing and computing systems has reinvigorated the field, with several well-funded startups currently in competition to secure the first FDA regulatory device approval. Notably, Synchron, headquartered in Brooklyn, garnered attention last year as the first company to successfully implant a BCI in a human patient. Elon Musk's Neuralink initiated controlled FDA trials earlier this year, following previous testing rounds that involved numerous animal subjects.



Published on: Aug 25, 2023, 11:35 AM IST