Your spinal cord is smarter than you think

Researchers at the RIKEN Center for Brain Science (CBS) in Japan and colleagues have shown that motor (muscle-movement) learning and memory are not confined solely to brain circuits.

Published in Science on April 11, the study found two critical groups of spinal cord neurons: one necessary for new adaptive learning, and another for recalling adaptations once they have been learned.


In one experiment, mice learned to keep a dangling hindlimb raised to avoid an electric shock, a form of learning and recall that did not require the brain.

“Not only do these results challenge the prevailing notion that motor learning and memory are solely confined to brain circuits,” says Aya Takeoka of RIKEN CBS, “but we showed that we could manipulate spinal cord motor recall, which has implications for therapies designed to improve recovery after spinal cord damage.”

The findings could help scientists develop ways to assist motor recovery after spinal cord injury.

Citation: Lavaud, S., Bichara, C., Yeh, H., & Takeoka, A. (2024). Two inhibitory neuronal classes govern acquisition and recall of spinal sensorimotor adaptation. Science.

Let us know your thoughts! Sign up for a Mindplex account now, join our Telegram, or follow us on Twitter

Miniature stimulator could revolutionize deep-brain treatments

Rice University engineers say they have developed the smallest implantable brain stimulator ever demonstrated in a human patient. The pea-sized device is powered wirelessly by an external transmitter and stimulates the brain through the dura (the protective membrane attached to the underside of the skull), without penetrating the brain itself.

The “Digitally Programmable Over-brain Therapeutic” (DOT) device could revolutionize treatment for drug-resistant depression and other psychiatric or neurological disorders. This therapeutic alternative offers greater patient autonomy and accessibility than current neurostimulation-based therapies, and is less invasive than other brain-computer interfaces (BCIs), say the researchers.

No battery required

“In this paper [in open-access journal Science Advances], we show that our device, the size of a pea, can activate the motor cortex, which results in the patient moving their hand,” said Jacob Robinson, a professor of electrical and computer engineering and of bioengineering at Rice. “In the future, we can place the implant above other parts of the brain, like the prefrontal cortex, where we expect to improve executive functioning in people with depression or other disorders.”

In-home use

The researchers tested the device temporarily in a volunteer human patient, using it to stimulate the motor cortex (the part of the brain responsible for movement) and generate a hand-movement response. They next showed that the device interfaces stably with the brain over a 30-day period in pigs.

Implantation would require a minimally invasive 30-minute procedure that would place the device in the bone over the brain (not the brain itself). Both the implant and the incision would be virtually invisible, and the patient would go home the same day.

Robinson envisions the technology being used from the comfort of one’s home, where patients would retain complete control over how the treatment is administered.

No brain surgery

The equivalent treatment is deep brain stimulation (DBS), a safe procedure. But it’s still brain surgery, and its perceived risk will place a very low ceiling on the number of people who are willing to accept it and may benefit from it, according to Robinson.

For some conditions, epilepsy for example, the device may need to be on permanently or most of the time, but for disorders such as depression and OCD, a regimen of just a few minutes of stimulation per day could suffice to bring about the desired changes in the functioning of the targeted neuronal network.

Future research

Robinson said he is “really interested in the idea of creating networks of implants and creating implants that can stimulate and record, so that they can provide adaptive personalized therapies based on your own brain signatures.” An associated company, Motif Neurotech, is in the process of seeking FDA approval for a long-term clinical trial in humans. Patients and caregivers can sign up on the Motif Neurotech website to learn when and where these trials will begin.

The work was supported in part by The Robert and Janice McNair Foundation, the McNair Medical Institute, DARPA and the National Science Foundation.

Citation: Woods, J. E., Singer, A. L., Alrashdan, F., Tan, W., Tan, C., Sheth, S. A., Sheth, S. A., & Robinson, J. T. (2024). Miniature battery-free epidural cortical stimulators. Science Advances. (open-access)

‘Holodeck’ helps robots navigate the real world

Virtual interactive environments are currently used to train robots prior to real-world deployment, in a process called “Sim2Real.” But these complex environments, created by artists, are in short supply.

“Generative AI systems like ChatGPT are trained on trillions of words, and image generators like Midjourney and DALL-E are trained on billions of images,” notes Chris Callison-Burch, Associate Professor in Computer and Information Science (CIS) at the University of Pennsylvania.

“We only have a fraction of that amount of 3D environments for training ‘embodied AI.’ If we want to use generative AI techniques to develop robots that can safely navigate in real-world environments, then we will need to create millions or billions of simulated environments.”

Star Trek-inspired

Enter Holodeck, a system for generating interactive 3D environments, co-created by Callison-Burch and others at Penn, and collaborators at Stanford, the University of Washington, and the Allen Institute for Artificial Intelligence (AI2).

Holodeck generates a virtually limitless range of indoor environments, using AI to interpret users’ requests.

Holodeck engages an LLM in conversation, using a carefully structured series of hidden queries to break down each user request into specific scene parameters. It then executes the request in stages: the floor and walls are created first, then the doorways and windows.

Next, Holodeck searches Objaverse, a vast library of premade digital objects, to populate the scene. Finally, it queries a layout module, which the researchers designed to constrain the placement of objects.
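The staged pipeline can be sketched in miniature. This is an illustrative toy, not Holodeck's actual code: `plan_scene` stands in for the hidden LLM queries (its output is hard-coded here), `retrieve_assets` stands in for the Objaverse search, and `layout` is a trivial placement rule; all names and numbers are invented for illustration.

```python
def plan_scene(request):
    """Stand-in for the hidden LLM queries: derive structural scene
    parameters from the user's request (hard-coded here)."""
    return {
        "room": {"width_m": 4.0, "depth_m": 5.0},
        "openings": ["door", "window"],
        "object_query": request,
    }

def retrieve_assets(query, library):
    """Stand-in for the Objaverse search: naive keyword match."""
    keyword = query.split()[0]
    return [name for name in library if keyword in name]

def layout(objects, room):
    """Stand-in for the layout module: space objects evenly along
    the room's width, at mid-depth."""
    spacing = room["width_m"] / (len(objects) + 1)
    return {
        obj: (round(spacing * (i + 1), 2), room["depth_m"] / 2)
        for i, obj in enumerate(objects)
    }

# Pipeline: parameters -> retrieved assets -> constrained placement.
library = ["office_desk", "office_chair", "potted_plant"]
params = plan_scene("office with two desks")
assets = retrieve_assets(params["object_query"], library)
positions = layout(assets, params["room"])
```

Even in this toy form, the division of labor matters: the language model only decides *what* belongs in the scene, while a separate, rule-constrained layout step decides *where* it goes.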

Reality check

To evaluate Holodeck’s abilities, the researchers generated 120 scenes using both Holodeck and ProcTHOR, an earlier scene-generation tool created by the Allen Institute for Artificial Intelligence (AI2). Students then evaluated the results.

The researchers also tested Holodeck’s ability to generate scenes that are less typical in robotics research and more difficult to manually create than apartment interiors, like stores, public spaces and offices. The researchers then used scenes generated by Holodeck to “fine-tune” an embodied AI agent.

In June, the researchers will present Holodeck at the 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) in Seattle, Washington.

AI-powered smartglasses track gaze, facial expressions for VR/AR headsets

Cornell University researchers have developed two technologies that track a person’s gaze and facial expressions, using sonar-like sensing to improve communication.

Mounted on an eyeglass frame, the MR (mixed-reality) technology is small enough to fit on commercial smartglasses or virtual reality and augmented reality headsets like Vision Pro or Meta Quest. The design consumes significantly less power than similar tools using cameras, say the researchers.

The speakers and microphones are mounted on an eyeglass frame, bouncing sonar-like inaudible soundwaves off someone’s face and picking up reflected signals caused by face and eye movements.


GazeTrak is the first eye-tracking system that relies on acoustic signals. Its companion technology, EyeEcho, continuously and accurately detects facial expressions and recreates them in an avatar in real time. The detailed facial expressions and gaze movements could improve interactions with other users.

“It’s small, it’s cheap and super low-powered, so you can wear it on smartglasses every day—it won’t kill your battery,” said Cheng Zhang, an assistant professor of information science who directs the Smart Computer Interfaces for Future Interactions (SciFi) Lab, which created the new devices.

GazeTrak has a speaker and four microphones positioned around the inside of each eye frame of the glasses, bouncing soundwaves off the eyeball and picking up reflections from the area around the eyes. These sound signals feed into a customized deep-learning pipeline that uses AI to continuously infer the direction of the person’s gaze.


EyeEcho has a speaker and a microphone located next to the glasses’ hinges, pointing down to catch skin movement as facial expressions change. These reflected signals are also interpreted by AI.
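Both devices rest on the same acoustic idea: emit an inaudible signal and track how its reflection shifts as the eye or skin moves. Here is a minimal sketch of the underlying echo-delay estimation, with made-up signals; it makes no claim to match the SciFi Lab's deep-learning pipeline, which infers gaze and expressions from full echo profiles rather than a single delay.

```python
def cross_correlate(signal, template):
    """Return the lag at which `template` best matches `signal`."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(signal) - len(template) + 1):
        score = sum(signal[lag + i] * template[i]
                    for i in range(len(template)))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Emitted chirp (template) and a recording containing a scaled echo
# of it starting at sample 7 (both sequences are invented).
chirp = [0.0, 1.0, -1.0, 0.5]
recording = [0.0] * 7 + [0.3 * x for x in chirp] + [0.0] * 5

delay_samples = cross_correlate(recording, chirp)
```

As the eye or facial skin moves, the echo's delay and shape change; tracking those changes over time is what gives the learning model something to decode.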

EyeEcho continuous facial expression tracking on glasses
(credit: Ke Li et al.)

Imaginative uses

With this new technology, users can make hands-free video calls through an avatar, even in a noisy café or on the street. While some smartglasses can recognize faces or distinguish between a few specific expressions, none currently track expressions continuously the way EyeEcho does, say the researchers.

GazeTrak could also be used with screen readers to read out portions of text for people with low vision as they read a website or book.

GazeTrak and EyeEcho could also potentially help diagnose or monitor neurodegenerative diseases, like Alzheimer’s and Parkinson’s, in which patients often have abnormal eye movements and less expressive faces. The technologies could track disease progression at home or in a physician’s office.

Ke Li will present GazeTrak at the Annual International Conference on Mobile Computing and Networking on May 11-16, and EyeEcho at the Association for Computing Machinery CHI conference on Human Factors in Computing Systems in May.

Citations: Ke Li et al. EyeEcho: Continuous and Low-power Facial Expression Tracking on Glasses. arXiv and Ke Li et al. GazeTrak: Exploring Acoustic-based Eye Tracking on a Glass Frame. arXiv.

Mayo Clinic study finds active workstations may improve cognitive performance

What if your workstation had a walking pad, bike, stepper and/or standing desk?

A Mayo Clinic study suggests that such an active workstation could reduce your sedentary time and improve your cognitive performance at work, decreasing your risk of preventable chronic diseases without reducing job performance.

Improving work performance and health

“Active workstations may offer a way to potentially improve cognitive performance and overall health, simply by moving at work,” says Francisco Lopez-Jimenez, M.D., a preventive cardiologist at Mayo Clinic and senior author of the study, which was published in the Journal of the American Heart Association.

The randomized clinical trial involved 44 participants, evaluated across four office settings over four consecutive days at Mayo Clinic’s Dan Abraham Healthy Living Center.

The settings included a stationary or sitting station on the first day, followed by three active workstations (standing, walking or using a stepper) in a randomized order.

Researchers analyzed participants’ neurocognitive function based on 11 assessments that evaluated reasoning, short-term memory and concentration. Fine motor skills were assessed through an online typing speed test and other tests.  

Improved reasoning

When participants used the active workstations, their brain function either improved or stayed the same. Their typing speed slowed down only a bit, but the accuracy of their typing was not affected. The study revealed improved reasoning scores when standing, stepping and walking as compared with sitting. 

Office workers may spend a large part of their eight-hour workday sitting at a computer screen and keyboard, a sedentary pattern linked to poorer cardiovascular health.

“These findings indicate that there are more ways to do that work while remaining productive and mentally sharp. We would do well to consider an active workstation in the prescription for prevention and treatment of conditions like obesity, cardiovascular disease and diabetes,” says Lopez-Jimenez.

Citation: Miguel A. Gomez Ibarra et al. 4 Apr 2024. Effect of Active Workstations on Neurocognitive Performance and Typing Skills: A Randomized Clinical Trial. Journal of the American Heart Association. (open-access)

World’s most powerful MRI scanner images the living brain with unrivaled clarity

The Iseult MRI scanner, with a magnetic field intensity of 11.7 tesla (the most powerful in the world), has been announced by CEA (French Alternative Energies and Atomic Energy Commission).

CEA said they have acquired “some of the most remarkable anatomical images of the brain. The same image quality would require hours with MRI scanners currently available in hospitals (1.5 or 3 teslas), but that’s not realistic in practice (any movement would blur the image).”

Detecting weak brain signals

CEA said the scanner will also facilitate detection of some chemical species whose weak signals are hard to capture at lower magnetic fields, such as lithium (a drug used to treat bipolar disorders), and glucose and glutamate (molecules actively involved in brain metabolism and associated with many brain diseases, such as gliomas and neurodegenerative disorders).

The ultra-detailed anatomical information will also support diagnosis and care for neurodegenerative diseases such as Alzheimer’s and Parkinson’s.

“Cognitive sciences will also be of key importance in our research,” said Nicolas Boulant, the Head of the Iseult project and Director of Research at CEA.

AI writing, illustration emit less carbon than humans

A study by researchers at the University of Kansas and the University of California, Irvine suggests that writing and illustrating with AI emits hundreds of times less carbon than humans performing the same tasks.

To calculate the carbon footprint of a person writing, the researchers measured the “energy budget”—the amount of energy used in certain tasks for a set period of time.
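As a rough illustration of how such an energy budget translates into a CO2e figure, here is a minimal sketch with invented numbers; the power draws, task times, and grid carbon intensity below are all assumptions for illustration, not the study's figures.

```python
def co2e_grams(power_watts, hours, grid_g_per_kwh=400.0):
    """CO2e in grams for a device drawing `power_watts` for `hours`.

    `grid_g_per_kwh` is an assumed average grid carbon intensity
    (grams of CO2e per kilowatt-hour).
    """
    kwh = power_watts * hours / 1000.0
    return kwh * grid_g_per_kwh

# Hypothetical comparison: a laptop running for an hour while a human
# writes one page, vs. a fraction of a second of amortized server time
# per AI-generated page.
human_page = co2e_grams(power_watts=75.0, hours=1.0)
ai_page = co2e_grams(power_watts=500.0, hours=0.0005)
ratio = human_page / ai_page
```

With these made-up inputs the ratio comes out in the hundreds, which is the same order of magnitude as the study's reported range; the real analysis also amortizes model training and the embodied emissions of hardware.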

AI less energy-wasteful

They found that AI systems emit between 130 and 1,500 times less CO2e (carbon dioxide equivalent) per page of text, compared to CO2e generated by human writers. And they also found that illustration systems like DALL-E 2 and Midjourney emit 310 to 2,900 times less CO2e per image than humans.

“When we did it, the results were kind of astonishing, even by conservative estimates,” said study co-author Andrew Torrance of the University of Kansas. “AI is extremely less wasteful.”

Environmental impact and combating climate change

The research was conducted to improve understanding of AI and its environmental impact and to address the United Nations Sustainable Development Goals of ensuring sustainable consumption and production patterns; and taking urgent action to combat climate change and its impacts, the researchers wrote.

Citation: Tomlinson, B., Black, R.W., Patterson, D.J. et al. The carbon emissions of writing and illustrating are lower for AI than for humans. Sci Rep 14, 3732 (2024). (open access)

Brain-computer interface lets patients play games with just their thoughts

Engineers at the University of Texas at Austin have combined machine-learning capabilities with their brain-computer interface technology, which is normally used to help patients with motor disabilities.

The subjects wear a cap with electrodes that gather data by measuring electrical signals from the brain, and the decoder interprets that information and translates it into game action.
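As an illustration of what "decoding" means here, the following is a toy sketch, not the UT Austin decoder: the channel values, threshold, and left/right mapping are all invented, and real systems use trained machine-learning models over many EEG channels rather than a hand-set rule.

```python
def band_power(samples):
    """Mean squared amplitude of one channel's samples
    (a crude proxy for EEG band power)."""
    return sum(s * s for s in samples) / len(samples)

def decode_action(left_channel, right_channel, threshold=0.1):
    """Map relative activity between two motor-cortex channels to a
    discrete game command (the mapping is an assumption for
    illustration)."""
    diff = band_power(left_channel) - band_power(right_channel)
    if diff > threshold:
        return "steer_right"
    if diff < -threshold:
        return "steer_left"
    return "hold"

# Invented samples: stronger activity on the left channel.
action = decode_action([0.9, -0.8, 0.7], [0.1, -0.1, 0.2])
```

The calibration problem the researchers address is exactly the weak point of this sketch: a fixed threshold must normally be re-tuned per user and per session, which is what their transfer-learning approach avoids.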

Improving brain function

The experiments are designed to improve brain function in patients and to let them use devices controlled by brain-computer interfaces to make their lives easier.

The decoder worked well enough that subjects trained simultaneously for the bar game and the more complicated car racing game, which required thinking several steps ahead to make turns.

The research on the calibration-free interface is published in PNAS Nexus.

Improved clinical use

In a clinical setting, this technology will eliminate the need for a specialized team to do this calibration process, which is long and tedious, the researchers note.

The project used 18 subjects with no motor impairments. The researchers next plan to test the technology on people with motor impairments, with the goal of applying it to larger groups in clinical settings.

The researchers have also demonstrated another potential use of the technology: controlling two rehabilitation robots for the hand and arm.

Citation: Kumar, S., Alawieh, H., Racz, F. S., Fakhreddine, R., & Millán, J. D. (2024). Transfer learning promotes acquisition of individual BCI skills. PNAS Nexus, 3(2). (open access)

Generative AI used to develop potential new drugs for antibiotic-resistant bacteria

Acinetobacter is a group of bacteria commonly found in the environment. The most common cause of infections is Acinetobacter baumannii. (credit: CDC) 

Researchers at Stanford Medicine and McMaster University have devised a new AI model, SyntheMol (“synthesizing molecules”), which creates recipes for chemists to synthesize drugs in the lab. With nearly 5 million deaths linked to antibiotic resistance globally every year, new ways to combat resistant bacterial strains are urgently needed, according to the researchers.

Using SyntheMol, the researchers have so far developed six novel drugs aimed at killing resistant strains of Acinetobacter baumannii, one of the leading pathogens responsible for antibacterial resistance-related deaths, as noted in a study published March 22 in the journal Nature Machine Intelligence.

25,000 possible antibiotics and the recipes to make them in less than nine hours

The model was trained to construct potential drugs using a library of more than 130,000 molecular building blocks and a set of validated chemical reactions. It generated the final compound and the steps it took with those building blocks, giving the researchers a set of recipes to produce the drugs.

The researchers also trained the model on existing data of different chemicals’ antibacterial activity against A. baumannii. With these guidelines and its building block starting set, SyntheMol generated around 25,000 possible antibiotics and the recipes to make them in less than nine hours.

To prevent the bacteria from quickly developing resistance to the new compounds, researchers then filtered the generated compounds to only those that were dissimilar from existing compounds.
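The dissimilarity filter can be sketched as follows. This is an assumption-laden toy, not SyntheMol's code: real pipelines compare molecular fingerprints, while here each "compound" is just a named set of structural features compared by Tanimoto (Jaccard) similarity, and all names, features, and the threshold are invented.

```python
def tanimoto(a, b):
    """Jaccard similarity between two feature sets."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def filter_novel(candidates, known, max_similarity=0.5):
    """Keep only candidates dissimilar from every known compound."""
    return [
        name for name, feats in candidates.items()
        if all(tanimoto(feats, k) <= max_similarity for k in known)
    ]

known = [{"ring", "amide", "amine"}]
candidates = {
    "cand_A": {"ring", "amide", "ester"},  # similarity 0.5: kept
    "cand_B": {"ring", "amide", "amine"},  # identical: dropped
    "cand_C": {"ether", "nitro"},          # no overlap: kept
}
novel = filter_novel(candidates, known)
```

The rationale mirrors the one in the text: compounds structurally distant from known antibiotics are less likely to be hit by existing resistance mechanisms.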

“Now we have not just entirely new molecules but also explicit instructions for how to make those molecules,” said James Zou, PhD, an associate professor of biomedical data science and co-senior author on the study.

A new chemical space

The researchers chose the 70 compounds with the highest potential to kill the bacterium. A partner chemistry company was able to efficiently synthesize 58 of them, six of which killed a resistant strain of A. baumannii when researchers tested them in the lab. These new compounds also showed antibacterial activity against other kinds of infectious bacteria prone to antibiotic resistance, including E. coli, Klebsiella pneumoniae and MRSA.

The scientists further safety-tested two of the six compounds for toxicity in mice. “The next step is to test the drugs in mice infected with A. baumannii to see if they work in a living body,” Zou said. “This AI is really designing and teaching us about this entirely new part of the chemical space that humans just haven’t explored before.”

The study was funded by the Weston Family Foundation, the David Braley Centre for Antibiotic Discovery, the Canadian Institutes of Health Research, M. and M. Heersink, the Chan-Zuckerberg Biohub, and the Knight-Hennessy scholarship.

Citation: Swanson, K., Liu, G., Catacutan, D.B. et al. Generative AI for designing and validating easily synthesizable and structurally novel antibiotics. Nat Mach Intell 6, 338–353 (2024).

Implantable batteries could one day run on your body’s own oxygen

Implantable and biocompatible Na-O2 battery (credit: Chem/Lv et al.)

Implantable medical devices, such as pacemakers that keep the heart on beat, rely on batteries. But batteries eventually run low and require invasive surgery to replace.

So researchers at Tianjin University of Technology in China devised an implantable battery that runs on oxygen in the body. Their study in rats, published in the journal Chem, shows that the proof-of-concept design can deliver stable power and is compatible with the body’s biological systems.

Biocompatible electrodes

To build a safe and efficient battery, the researchers made its electrodes out of a sodium-based alloy and nanoporous gold, a material with pores thousands of times smaller than a hair’s width. Gold is compatible with living systems and sodium is an essential and ubiquitous element in the human body.

The electrodes undergo chemical reactions with oxygen in the body to produce electricity. To protect the battery, the researchers encased it within a porous polymer film that is soft and flexible.

The researchers implanted the battery under the skin on the backs of rats and measured its electricity output. Two weeks later, they found that the battery could produce stable voltages between 1.3 V and 1.4 V. Although this output is insufficient to power medical devices, the design shows that harnessing oxygen in the body for energy is possible.

No inflammatory reactions

The team also evaluated inflammatory reactions, metabolic changes, and tissue regeneration around the battery. The rats showed no apparent inflammation.

Byproducts from the battery’s chemical reactions, including sodium ions, hydroxide ions, and low levels of hydrogen peroxide, were easily metabolized by the body and did not affect the kidneys and liver. The rats healed well after implantation, with the hair on their back completely regrown after four weeks. Blood vessels also regenerated around the battery.

The team plans to increase the battery’s energy delivery by exploring more efficient materials for electrodes and optimizing battery structure and design.

Beyond powering medical devices

Corresponding author Xizheng Liu, who specializes in energy materials and devices, noted that the battery is easy to scale up in production, and that choosing cost-effective materials can further lower the price. The team’s battery may also find other purposes beyond powering medical devices.

“Because tumor cells are sensitive to oxygen levels, implanting this oxygen-consuming battery may help starve cancers. It’s also possible to convert the battery energy to heat and kill cancer cells,” says Liu.

This work was financially supported by the National Key Research and Development Program of China, the National Science Fund for Distinguished Young Scholars, and the National Natural Science Foundation of China.

Citation: Lv et al. Implantable and Bio-compatible Na-O2 battery. Chem (Cell Press). (open access)
