Researchers study links between facial recognition and Alzheimer’s

Alzheimer’s disease has been on the rise worldwide in recent years and is rarely diagnosed at an early stage when it can still be effectively controlled. Using artificial intelligence, KTU researchers conducted a study to determine whether human-computer interfaces could be adapted to allow people with memory impairments to recognize an object visible in front of them.

Rytis Maskeliūnas, researcher at the Department of Multimedia Technology at Kaunas University of Technology (KTU), considers the classification of information visible on the face to be an everyday human function: “In communication, the face ‘tells’ us the context of the conversation, especially from an emotional point of view, but can we identify visual stimuli from brain signals?”

The visual processing of the human face is complex. By analyzing a face, we can perceive information such as a person’s identity or emotional state. The aim of the study was to analyze how a person processes contextual facial information and how they react to it.

The face can indicate the first symptoms of the disease

According to Maskeliūnas, degenerative brain diseases affect not only memory and cognitive functions but also the cranial nerves associated with eye and facial movements, and numerous studies show that brain disorders can potentially be analyzed by examining facial muscle and eye movements.

Dovilė Komolovaitė, a graduate of KTU’s Faculty of Mathematics and Science who co-authored the study, explained that the research examined whether a patient with Alzheimer’s disease processes visible faces in the brain in the same way as people without the disease.

“The study uses data from an electroencephalograph, which measures electrical impulses in the brain,” says Komolovaitė, who is currently pursuing a master’s degree in artificial intelligence at the Faculty of Computer Science.

The experiment was performed on two groups of participants: healthy individuals and people with Alzheimer’s disease.

“The brain signals of a person with Alzheimer’s are usually much noisier than those of a healthy person,” says Komolovaitė, pointing out that this correlates with the difficulty in concentrating and paying attention that people experience when symptoms of Alzheimer’s disease are present.
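The article does not describe the analysis pipeline itself, but the general idea of comparing how “noisy” the EEG recordings of the two groups are can be sketched as follows. This is a minimal illustration only: the placeholder arrays, channel counts, and the simple variance-based noise measure are assumptions, not the authors’ method.

```python
import numpy as np

# Illustrative placeholders standing in for band-pass-filtered EEG epochs,
# shaped (n_epochs, n_channels, n_samples), one array per participant group.
rng = np.random.default_rng(0)
healthy_epochs = rng.normal(0.0, 1.0, size=(120, 32, 512))    # assumed data
alzheimer_epochs = rng.normal(0.0, 1.6, size=(120, 32, 512))  # assumed data

def mean_epoch_variance(epochs: np.ndarray) -> float:
    """Average per-epoch signal variance, a crude proxy for 'noisiness'."""
    return float(epochs.var(axis=-1).mean())

print("healthy    :", mean_epoch_variance(healthy_epochs))
print("alzheimer's:", mean_epoch_variance(alzheimer_epochs))
```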

Photos of faces were shown during the study

For the study, an older group of women over 60 years of age was selected: “Advanced age is one of the main risk factors for dementia, and because gender effects were detected in the brainwaves, the study is more accurate when only one gender group is chosen.”

During the study, each participant took part in experiments lasting up to an hour in which photographs of human faces were shown. According to the researchers, these photos were selected according to several criteria: to analyze the influence of emotion, neutral and anxious faces were shown, while to analyze the familiarity factor, participants were shown faces of people known to them and of randomly selected strangers.

To determine whether a person correctly sees and understands a face, study participants were asked to press a button after each stimulus to indicate whether the face displayed was inverted or upright.

“Even at this stage, an Alzheimer’s patient makes mistakes, so it is important to determine whether the impaired recognition of the object is due to memory or to visual processes,” explains the researcher.
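As a rough illustration of how EEG responses to such face stimuli could be fed to a machine-learning classifier: the study’s actual features and model are not described here, so the flattened-epoch features, the upright/inverted labels, and the logistic-regression classifier below are assumptions used only to convey the idea.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder EEG epochs (n_epochs, n_channels, n_samples) and labels:
# 0 = upright face shown, 1 = inverted face shown.
rng = np.random.default_rng(1)
epochs = rng.normal(size=(200, 32, 256))
labels = rng.integers(0, 2, size=200)

# Flatten each epoch into a feature vector and classify with a simple
# linear model; cross-validation gives a rough decoding accuracy.
X = epochs.reshape(len(epochs), -1)
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, labels, cv=5)
print("decoding accuracy: %.2f ± %.2f" % (scores.mean(), scores.std()))
```

With real recordings, an accuracy reliably above chance for one group but not the other would be one way to probe whether face processing differs between healthy participants and patients.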

Inspired by real-life interactions with patients living with Alzheimer’s disease

Maskeliūnas reveals that his work on Alzheimer’s disease began with his collaboration with the Huntington’s Disease Association, which opened his eyes to what these neurodegenerative diseases really look like.

The researcher was also in direct contact with patients suffering from Alzheimer’s disease: “I have seen that the diagnosis is usually made too late, when the brain is already irreversibly damaged.”

Today we can see how human-computer interaction is being adapted to make life easier for people with physical disabilities. Controlling a robotic hand by “thinking”, or enabling a paralyzed person to write text by imagining letters, is not a new concept. Still, trying to understand the human brain is probably one of the most difficult tasks remaining today.

In this study, the researchers worked with data from standard electroencephalographs, but Maskeliūnas points out that to create a practical tool, it would be better to use data collected from invasive microelectrodes, which can measure the activity of neurons more accurately. This would greatly improve the quality of the AI model.

“Of course, in addition to the technical requirements, there should also be a community environment that focuses on making life easier for people with Alzheimer’s. But in my personal opinion, five years from now, we will still see technologies focused on improving bodily function, and the focus on people with brain disorders in this area will come later,” says Maskeliūnas.

According to master’s student Komolovaitė, a clinical study carried out with colleagues from the medical field is necessary, so this step of the process would take a lot of time: “If we want to use this test as a medical tool, a certification procedure is also required.”
