A brain-computer interface project for people with motor and speech disabilities
- The researcher, who leads the project at INAOE, a Conahcyt Public Center, explained that the interface is personalized so that only the user can control their environment
- The tool under development is aimed at people who lack mobility and speech but whose brains are completely healthy and alert
- The goal is to control useful devices such as wheelchairs, electric beds, televisions and heaters, as well as to make phone calls, place emergency calls and write messages
Researchers and students from the National Institute of Astrophysics, Optics and Electronics (INAOE), a Public Center coordinated by the National Council of Humanities, Sciences and Technologies (Conahcyt), are developing a brain-computer interface project with applications that allow people with motor and speech disabilities to control equipment so they can move and communicate independently.
This project is part of the research and development carried out by centers of Conahcyt's National System of Public Centers (SNCP), in line with the invitation made by President Andrés Manuel López Obrador to the general director of Conahcyt, María Elena Álvarez-Buylla Roces, to build capabilities for prototype development and to generate authentic science applied to well-being.
Scientists and students from INAOE's Computational Sciences Coordination and its graduate program in Biomedical Sciences and Technologies participate in this project, a pioneering effort in the country. The objectives are to analyze and study electroencephalographic signals and to develop a friendly interface that interprets imagined words or specified actions and converts them into commands for a computer to execute.
INAOE researcher and project leader Carlos Alberto Reyes explained that a brain-computer interface is a tool that provides the brain with a non-muscular communication and control channel for transmitting messages and commands to the outside world, “without using, in this case, either sounds or movements, just the signals that you can emit or generate in your brain when you are doing or thinking about some type of activity or action that you want to express outwardly.”
He added that the most practical and economical means of extracting the information generated in the brain is electroencephalography. “We have several types of devices that allow us to see the signals generated in the brain when thinking about, carrying out or trying to issue a command or establish an action.”
He also noted that “the unspoken-words neuroparadigm is being used, since the type of tool we want to develop is aimed at people who lack mobility or speech but whose brains are totally healthy and alert.”
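As a rough illustration of the signal handling this implies, the sketch below band-pass filters a single EEG channel with SciPy before any decoding. The sampling rate, the 4-25 Hz band and the synthetic signal are illustrative assumptions, not the project's actual acquisition parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Illustrative assumptions: 256 Hz sampling and a 4-25 Hz band of interest;
# the project's real acquisition parameters are not published in the article.
FS = 256.0

def bandpass(eeg_channel: np.ndarray, low: float = 4.0, high: float = 25.0,
             fs: float = FS) -> np.ndarray:
    """Zero-phase band-pass filter for one EEG channel."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg_channel)

# Synthetic one-second segment standing in for a real recording.
t = np.arange(0, 1.0, 1.0 / FS)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
clean = bandpass(raw)
```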
Reyes commented that the interface is aimed at commanding assistive and home equipment, which will allow users to act independently, lead a life as normal as possible and operate devices such as a wheelchair or an electric hospital bed.
“The interface is designed in a personalized way, so that only the user can control their environment.”
The devices are operated through a navigation environment consisting of windows for the devices to be controlled and subwindows with options that execute actions through hardware actuators, which directly activate and drive each device.
It involves controlling useful devices such as a wheelchair, an electric bed, the television, the heater and the lighting. The user can also make phone calls, place emergency calls and write messages.
In the wheelchair option, the commands are forward, reverse, turn right, turn left and stop, plus the possibility of increasing or decreasing the speed. The electric bed control allows the user to put the bed in the desired or recommended position. All of the above can be controlled and monitored through the brain-computer interface, commanded directly by brain signals.
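As a minimal sketch of how such a command set could be dispatched to an actuator layer, the Python below defines the wheelchair options named above; the class and method names are hypothetical stand-ins for real hardware drivers.

```python
from enum import Enum, auto

class WheelchairCommand(Enum):
    """The wheelchair command set described in the article."""
    FORWARD = auto()
    REVERSE = auto()
    TURN_RIGHT = auto()
    TURN_LEFT = auto()
    STOP = auto()
    SPEED_UP = auto()
    SLOW_DOWN = auto()

class WheelchairController:
    """Hypothetical actuator layer; a real system would drive motors here."""
    def __init__(self, speed: int = 1, max_speed: int = 5):
        self.speed = speed
        self.max_speed = max_speed

    def execute(self, cmd: WheelchairCommand) -> None:
        if cmd is WheelchairCommand.SPEED_UP:
            self.speed = min(self.speed + 1, self.max_speed)
        elif cmd is WheelchairCommand.SLOW_DOWN:
            self.speed = max(self.speed - 1, 1)
        else:
            # Motion commands would be forwarded to the motor drivers.
            print(f"executing {cmd.name} at speed {self.speed}")

controller = WheelchairController()
controller.execute(WheelchairCommand.FORWARD)   # executing FORWARD at speed 1
```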
“We want this tool to be very practical, to let the user do things quickly and to need only a small vocabulary, because with the words they are thinking in their brain they establish an action. For example, we want them to move the cursor so that it moves to one side or the other.”
With a vocabulary of the words up, down, left, right and select, the user will be able to choose intuitively; for example, what they want to watch, selecting the on or off options, increasing or decreasing the volume, changing channels or choosing an entertainment channel or application.
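To make the window-and-subwindow navigation concrete, here is a minimal sketch of how those five decoded words could drive a nested menu; the menu contents and names are illustrative assumptions based on the devices mentioned in the article.

```python
# Windows (devices) and subwindows (options); contents are illustrative.
MENU = {
    "television": ["on/off", "volume up", "volume down", "change channel"],
    "wheelchair": ["forward", "reverse", "turn left", "turn right", "stop"],
    "home automation": ["light", "fan", "heating", "air conditioning"],
}

class Navigator:
    def __init__(self, menu: dict):
        self.menu = menu
        self.windows = list(menu)
        self.row = 0  # highlighted window (device)
        self.col = 0  # highlighted subwindow option

    def apply(self, word: str):
        """Map one decoded word (up/down/left/right/select) to a menu move."""
        options = self.menu[self.windows[self.row]]
        if word == "up":
            self.row, self.col = (self.row - 1) % len(self.windows), 0
        elif word == "down":
            self.row, self.col = (self.row + 1) % len(self.windows), 0
        elif word == "left":
            self.col = (self.col - 1) % len(options)
        elif word == "right":
            self.col = (self.col + 1) % len(options)
        elif word == "select":
            return self.windows[self.row], options[self.col]
        return None

nav = Navigator(MENU)
for word in ["down", "right", "select"]:  # stand-in for classifier output
    action = nav.apply(word)
print(action)  # ('wheelchair', 'reverse')
```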
The INAOE researcher explained that the interface has another option, called home automation, for controlling the living environment: turning the lights, air conditioning, fan, heating or any other connected device on or off.
"Until now we have simulated it, but we began to move the electric chair, in the contest we held at the INAOE last year we already saw that it is possible."
Finally, the scientist added that the project is in the simulation stage, but that the team is now seeking to complete it by finding, within the electroencephalogram (EEG) signal, the unspoken words that lead to a command.
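Finding unspoken words in the EEG amounts to a classification problem. The sketch below shows one common shape such a pipeline can take (simple per-channel features plus an off-the-shelf classifier) on synthetic data; it illustrates the task, not the group's actual method, and every dimension and feature choice here is assumed.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: labeled trials of a multi-channel EEG window.
rng = np.random.default_rng(0)
WORDS = ["up", "down", "left", "right", "select"]
n_trials, n_channels, n_samples = 200, 14, 256
X_raw = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, len(WORDS), n_trials)

def features(trial: np.ndarray) -> np.ndarray:
    """Toy per-channel features: mean, variance, mean absolute first difference."""
    return np.concatenate([
        trial.mean(axis=1),
        trial.var(axis=1),
        np.abs(np.diff(trial, axis=1)).mean(axis=1),
    ])

X = np.stack([features(t) for t in X_raw])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
# On random data this stays near chance (~0.2); real EEG is what makes it hard.
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```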
“This is not simple; these are unknown fields and we have to open the way, especially in Mexico. No other group in the country is developing a complete mind-controlled environment like this,” he concluded.
With these actions, INAOE, as part of Conahcyt's National System of Public Centers, complies with the provisions of the first General Law on Humanities, Sciences, Technologies and Innovation (LGHCTI), directing capabilities, talent and infrastructure toward solving the country's priority problems and guaranteeing the human right to science.
Luis Enrique Erro # 1, Tonantzintla, Puebla, México, Código Postal 72840, Tel: (222) 266.31.00, difusion@inaoep.mx
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 2.5 Mexico License.