Around 5,000 people in Switzerland become aphasic every year. This disorder is characterized by the loss of the ability to find the words to express one’s thoughts. Using brain-machine interfaces, research teams are attempting to decode the brain’s activity to restore speech to aphasic people.
A language disorder

At the origin of aphasia lie lesions affecting the brain regions involved in language, located in the left hemisphere in most people. The most common cause is a stroke (cerebrovascular accident): around 30% of those affected develop a language disorder. Head trauma, a neurodegenerative disease such as Alzheimer’s or a brain tumor can also be responsible. “In aphasia, it is not thought itself that is affected, but language,” explains Pierre Mégevand, resident physician in the Department of Neurology at the Geneva University Hospitals (HUG) and assistant professor in the Department of Clinical Neurosciences at the University of Geneva (UNIGE). Anything that goes through words no longer works properly, whether spoken or written, because the part of the brain that manages the connection between words and their meaning is damaged.

Depending on the brain areas affected and their extent, aphasia progresses differently. It affects, to varying degrees, the ability to understand (read, hear) or produce (speak, write) language, with serious consequences for the quality of life of those affected and their families. “Aphasia often has a direct impact on their work, but also on social and personal relationships. It can trigger depression and, in people living alone, lead to difficulties with everyday activities such as reading the mail, paying bills or shopping,” the specialist notes.

New ethical questions

With the exploration of brain-machine interfaces for speech reconstruction, new ethical questions have arisen, particularly about the reliability of decoding. “You have to make sure that the device expresses what the person wants to say, and not the other way around,” notes Pierre Mégevand. The person must also have a way to keep their thoughts private. Questions also arise with regard to brain data: how can its security be guaranteed? Who does it belong to? “From a philosophical point of view, one can ask whether the complete recording of brain activity reveals information that the individual themselves does not know, for example about their identity. Finally, with these interfaces we can certainly restore lost functions, but also move towards a form of transhumanism. Something to think about,” says the specialist.
Decoding the brain to restore speech
Speech relies on motor commands sent by the brain to move the tongue, lips, lungs, vocal cords and everything else that allows us to articulate sounds. A research group has shown that it is possible, using electrodes implanted in the brain and connected to a computer (a brain-machine interface), to decode the electrical activity of the regions that control the articulatory apparatus and reconstruct what a person is trying to say. “This research led to the idea that we could restore the speech of people who had lost this ability by using a computer that synthesizes speech from brain activity in real time. This interface system was implanted in a first patient, who was quadriplegic and unable to speak. By refining and accelerating its functions, it could become very promising,” says the specialist.
At the same time, the University of Geneva’s auditory and language neuroscience group, led by Professor Anne-Lise Giraud, has studied the brain’s activity when a person imagines words or syllables. “We have been working towards implanting a new type of electrode for this interface,” says the researcher. “To do this, we recorded the brain activity of people already implanted with electrodes (used to treat epilepsy) while they mentally pronounced certain syllables or words. We were then able to decode those data using algorithms trained to recognize what the patients were thinking.” This allowed the team to determine which brain regions are the most promising sites for implanting future electrodes so as to obtain the best electrical signals. It also identified the frequencies of brain activity, and the combinations of frequencies, most useful for the decoding algorithm. In the future, the system should be able to decode syllables one by one and assemble them into words, an approach that is proving more reliable and easier than learning to decipher every word in the dictionary or the articulated motor sequences. “We also want people to be able to train the algorithm themselves to optimize the signal sent to the decoder. We will test this step very soon,” says Anne-Lise Giraud.
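To give a flavor of what “decoding syllables one by one” involves, here is a deliberately simplified sketch. It is not the team’s actual method: the syllables, feature layout and classifier are invented for illustration, and real intracranial signals are replaced by synthetic “band-power” feature vectors. A nearest-centroid classifier stands in for the trained decoding algorithm.

```python
# Toy sketch of syllable decoding from neural features (illustrative only).
# Real systems use intracranial recordings and far more sophisticated models;
# here, synthetic noisy feature vectors stand in for band-power measurements.
import random
import math

random.seed(0)

SYLLABLES = ["ba", "da", "ga"]   # hypothetical imagined syllables
N_FEATURES = 8                   # e.g. power in a few frequency bands per electrode

# Each syllable gets a hidden "true" neural pattern; each trial is a noisy copy.
true_patterns = {
    s: [random.gauss(0, 1) for _ in range(N_FEATURES)] for s in SYLLABLES
}

def simulate_trial(syllable, noise=0.3):
    """Return a noisy feature vector for one imagined-syllable trial."""
    return [v + random.gauss(0, noise) for v in true_patterns[syllable]]

# "Training": average 20 trials per syllable to obtain a centroid.
train = {s: [simulate_trial(s) for _ in range(20)] for s in SYLLABLES}
centroids = {
    s: [sum(t[i] for t in trials) / len(trials) for i in range(N_FEATURES)]
    for s, trials in train.items()
}

def decode(features):
    """Classify a trial as the syllable whose centroid is nearest."""
    def dist(s):
        return math.sqrt(sum((a - b) ** 2
                             for a, b in zip(features, centroids[s])))
    return min(SYLLABLES, key=dist)

# Evaluate decoding accuracy on fresh synthetic trials.
correct = sum(decode(simulate_trial(s)) == s
              for s in SYLLABLES for _ in range(30))
accuracy = correct / (len(SYLLABLES) * 30)
```

Decoded syllables could then be concatenated into candidate words, which hints at why a small, closed set of syllables is easier to decode reliably than an open dictionary.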
Partial recovery from aphasia
Immediately after a stroke, aphasia is often severe. In the following weeks, and depending on the size and location of the lesion, the brain recovers part of its language function naturally, as surrounding neurons take over some of the work. “The first line of treatment is to prevent the person’s condition from deteriorating further due to complications of the stroke, such as fever, pneumonia or urinary tract infection,” explains Pierre Mégevand. The second axis of treatment concerns the recovery of speech itself, through speech therapy rehabilitation. Despite this rehabilitation, some people do not fully recover and must learn to live with a language disorder. “To promote this recovery, research is being carried out on electrical stimulation of the brain, either to boost the activity of certain regions crucial for language or, on the contrary, to reduce the activity of regions that could be counterproductive.”
Published in Le Matin Dimanche on 06/19/2022
Source: https://aphasie.org/wp-content/uploads/2021/02/annual-report-2017-f.pdf