Engineers at UCLA have created a wearable device that reads brain signals without surgery, helping users move objects or control computers. The device is a brain-computer interface, or BCI, a system that links the brain to external machines. It is noninvasive, meaning no incisions or implants are needed. The device uses artificial intelligence (AI) to infer what the user intends and help complete tasks.
The system outperforms earlier noninvasive BCIs. It could help people with limited mobility, such as those who are paralyzed or have conditions like ALS, a disease that progressively weakens the muscles. The researchers developed custom decoding algorithms for electroencephalography, or EEG, which measures the brain's electrical activity through a cap worn on the head. These algorithms translate brain signals into actions, while an AI system equipped with a camera observes the scene and adjusts in real time to match the user's goals.
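The article does not specify how the decoded brain signals and the AI's camera-based inference are combined. One common shared-control pattern, sketched below purely as an illustration (the function names, the blending rule, and the `alpha` weight are assumptions, not the published method), is to mix the user's decoded command with a direction toward the AI-inferred target:

```python
import numpy as np

def blend_commands(decoded_velocity, ai_goal_direction, alpha=0.5):
    """Hypothetical shared-control blend of an EEG-decoded velocity
    and an AI-inferred goal direction.

    The final command is a weighted mix: what the decoder says the
    user wants, nudged toward where the vision system believes the
    target is. This is an illustrative sketch, not the UCLA method.
    """
    decoded = np.asarray(decoded_velocity, dtype=float)
    goal = np.asarray(ai_goal_direction, dtype=float)
    norm = np.linalg.norm(goal)
    if norm > 0:
        # Scale the goal direction to the user's commanded speed, so the
        # AI steers toward the target without overriding the user's pace.
        goal = goal / norm * np.linalg.norm(decoded)
    return (1 - alpha) * decoded + alpha * goal

# Example: the user drives mostly rightward, while the vision system
# infers the target lies up and to the right.
command = blend_commands([1.0, 0.0], [1.0, 1.0], alpha=0.5)
```

With `alpha=0` the user has full manual control; with `alpha=1` the AI takes over completely. Tuning that balance is the central design question in assistive shared control.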
AI assistance boosts performance
Tests involved four participants: three able-bodied and one paralyzed from the waist down. Wearing the EEG cap, they controlled either a cursor on a screen or a robotic arm that stacked blocks. In the cursor task, participants moved to targets and held position briefly; in the robotic-arm task, they moved blocks to new locations. All participants completed the tasks significantly faster with AI assistance. The paralyzed participant finished the robotic-arm task in under seven minutes with the AI's help but could not complete it without it.
The BCI decodes signals for intended movements, and the AI uses computer vision, rather than eye tracking, to infer the user's goals. Future work may incorporate more capable AI systems for faster, more delicate handling of objects and draw on larger datasets for complex tasks. The lab behind the work is also pursuing patents and seeking funding from health organizations and technology firms. The approach avoids the surgical risks of implanted BCIs, which remain in early clinical trials after many years of development.
The engineers have described the methods and results of this study in a paper published in Nature Machine Intelligence.