MIT has developed a new interface system that allows operators to control robots with their minds


According to VentureBeat, controlling robots with the mind is not as far off as it sounds. Researchers at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at the Massachusetts Institute of Technology (MIT) have developed a new interface that reads the brain waves of a human operator and lets them command a machine to perform tasks through thought.

Daniela Rus, director of CSAIL, said: "We want to move away from a world where people have to adapt to the constraints of machines. Approaches like this show that it is very possible to develop robotic systems that are more natural and intuitive." The system uses a combination of electroencephalography (EEG) and electromyography (EMG) to monitor the operator: EEG detects electrical activity in the brain through electrodes attached to the scalp, while EMG measures the signals generated by motor neurons in the muscles.
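To make the two signal streams concrete, here is a toy sketch (not MIT's implementation; every function name and threshold below is invented for illustration) of how a short EEG window might be flagged for unusual activity while an EMG window is classified into a coarse gesture:

```python
# Toy illustration of the two sensing channels described above.
# Thresholds and labels are made up; real EEG/EMG pipelines use
# filtering, artifact rejection, and trained classifiers.

def detect_brain_event(eeg_window):
    """Flag a window of scalp EEG samples whose mean amplitude
    exceeds a hypothetical threshold (in arbitrary units)."""
    mean_amp = sum(eeg_window) / len(eeg_window)
    return mean_amp > 2.5  # invented threshold

def detect_gesture(emg_window):
    """Classify a muscle gesture from the average power of a
    rectified EMG window (again, invented cutoffs)."""
    power = sum(x * x for x in emg_window) / len(emg_window)
    if power > 4.0:
        return "strong_flex"
    if power > 1.0:
        return "light_flex"
    return "rest"
```

In practice both channels would be band-pass filtered and fed to trained classifiers, but the division of labor is the same: EEG for what the brain notices, EMG for what the muscles do.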

Neither EEG nor EMG is a perfect science, and neither is particularly accurate on its own. By combining the two, however, the research team was able to achieve higher accuracy than either technique alone. Joseph DelPreto, a doctoral student and first author of the project's paper, said: "By looking at both muscle and brain signals, we can start to pick up on a person's natural gestures along with their snap decisions about whether something is going wrong. This helps make communicating with a robot more like communicating with another person."
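The accuracy gain from fusing two imperfect channels can be shown with a small simulation (this is not MIT's pipeline; the noise model and threshold are invented): averaging two independent noisy scores reduces variance, so the fused decision is right more often than either channel alone.

```python
# Minimal simulation of sensor fusion: two noisy channels, each
# producing a score centered on +1 when an error truly occurred
# and -1 when it did not. All parameters are made up.
import random

def channel_score(is_error, noise_sd, rng):
    """One noisy sensor reading for a single trial."""
    center = 1.0 if is_error else -1.0
    return center + rng.gauss(0.0, noise_sd)

def accuracy(decide, trials, rng):
    """Fraction of random trials on which `decide` recovers the truth."""
    correct = 0
    for _ in range(trials):
        truth = rng.random() < 0.5
        if decide(truth, rng) == truth:
            correct += 1
    return correct / trials

rng = random.Random(0)
NOISE = 1.5  # same (invented) noise level for both channels

# Decide "error" whenever the score is positive.
single = accuracy(lambda t, r: channel_score(t, NOISE, r) > 0, 10_000, rng)
fused = accuracy(
    lambda t, r: (channel_score(t, NOISE, r) + channel_score(t, NOISE, r)) / 2 > 0,
    10_000, rng,
)
print(f"single channel: {single:.3f}, fused: {fused:.3f}")
```

Running this, the fused detector consistently beats a single channel, which is the same intuition behind pairing EEG with EMG.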

The CSAIL team's algorithm scans the EEG for "error-related potentials" (ErrPs), a pattern of neural activity that occurs naturally when people notice a mistake. The moment an ErrP is detected, for instance when the robot under control is about to make an error, the robot pauses so that the operator can correct it using a gesture-based menu interface.
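The pause-and-correct behavior described above can be sketched as a simple supervision loop (a hypothetical illustration, not the team's code; all names are invented):

```python
def supervise(robot_choices, errp_flags, gesture_corrections):
    """Hypothetical supervision loop: the robot proposes one action per
    step; if an error-related potential is flagged at that step, the
    robot pauses and the operator's gesture-menu choice replaces the
    proposed action before execution."""
    executed = []
    for action, errp, correction in zip(robot_choices, errp_flags,
                                        gesture_corrections):
        if errp:               # operator's brain flagged a mistake
            action = correction  # take the gesture-selected target instead
        executed.append(action)
    return executed
```

For example, if the robot proposes `["left", "left", "right"]` but an ErrP fires on the second step with a gesture correction of `"right"`, the executed sequence becomes `["left", "right", "right"]`.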

Rus said: "This work, combining EEG and EMG feedback, enables human-robot interaction in a much broader range of applications than we could handle before using only EEG feedback. By including muscle feedback, we can use gestures to command the robot, with much more nuance and specificity."

The researchers found that robots supervised by humans corrected errors more than 97% of the time, compared with 70% for the control group. More impressive still, the system also works for people who have never used it before. The team believes the system could prove especially useful for people with speech impairments or limited mobility.

Copyright © 2011 JIN SHI