Making AI smarter: a multisensory, integrated artificial neuron
Sep. 16, 2023.
2 min. read.
How to create a tactile-light sensor that emulates the human brain
Sensors in robots and other devices typically run separately. However, “allowing sensors to communicate directly with each other can be more efficient in terms of energy and speed,” emulating the human brain, says Saptarshi Das, associate professor of engineering science and mechanics at Penn State.
Penn State researchers applied this synergistic approach in developing the first multisensory integrated artificial neuron, which they announced Sept. 15 in the open-access journal Nature Communications.
Creating a tactile-light neuron
The team focused on integrating a tactile sensor and a visual sensor so that the output of one sensor could modify the other's, with the help of visual memory. (For example, a flash of light could improve the chance of moving successfully through a dark room.)
The researchers fabricated the multisensory neuron by connecting a tactile sensor to a phototransistor based on a monolayer of molybdenum disulfide, a compound whose distinctive electrical and optical characteristics make it useful both for detecting light and for building transistors.
The device generates electrical spikes in a manner reminiscent of neurons processing information, allowing it to integrate both visual and tactile cues.
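The article doesn't give the device's circuit equations, but the behavior it describes, two sensory inputs summing on a single spiking element, can be sketched with a textbook leaky integrate-and-fire model. Everything below (the time constant, threshold, and drive levels) is a hypothetical illustration, not the paper's device model.

```python
import numpy as np

# Minimal leaky integrate-and-fire sketch (hypothetical parameters, not the
# paper's device model): tactile and visual drive are summed on one "membrane",
# which emits a spike each time the accumulated potential crosses threshold.
def lif_spike_times(i_tactile, i_visual, dt=1e-3, tau=20e-3, v_thresh=1.0):
    """Spike times (s) of a neuron integrating two sensory current traces."""
    v, spikes = 0.0, []
    for step, (i_t, i_v) in enumerate(zip(i_tactile, i_visual)):
        v += dt * (-v / tau + i_t + i_v)  # leak plus summed sensory drive
        if v >= v_thresh:                 # threshold crossing -> output spike
            spikes.append(step * dt)
            v = 0.0                       # reset, then keep integrating
    return spikes

t = np.arange(0.0, 0.5, 1e-3)
touch = np.full_like(t, 60.0)             # steady tactile drive (arbitrary units)
light = np.where(t > 0.25, 60.0, 0.0)     # light arrives halfway through
spikes = lif_spike_times(touch, light)
early = sum(s <= 0.25 for s in spikes)
print(f"spikes before light: {early}, after: {len(spikes) - early}")
```

In this toy run the firing rate jumps severalfold once the two cues overlap, which is the kind of joint spike encoding the passage above describes.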
Simulations
To simulate touch, the researchers relied on the tactile sensor's triboelectric effect, in which two layers slide against one another to produce electricity, so the touch stimuli were encoded as electrical impulses. To simulate visual input, they shined a light into the monolayer molybdenum disulfide photo memtransistor, which can remember visual input, much as a person can hold onto the general layout of a room after a quick flash illuminates it.
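The "remembering" half of that setup can be pictured as a photocurrent that fades slowly after each flash rather than vanishing at once. The decay constant below is a made-up stand-in for the device's measured persistence, just to show the idea.

```python
import numpy as np

# Toy model of the photo memtransistor's visual memory (the 2 s decay constant
# is hypothetical, not the device's measured persistence): a brief flash raises
# the photocurrent, which then fades slowly, so later touch input can still
# combine with the lingering trace.
def photo_memory(flash_times, t, tau_decay=2.0):
    """Photocurrent trace (arbitrary units) persisting after each flash."""
    trace = np.zeros_like(t)
    for ft in flash_times:
        trace += np.where(t >= ft, np.exp(-(t - ft) / tau_decay), 0.0)
    return trace

t = np.linspace(0.0, 10.0, 1001)
trace = photo_memory([1.0], t)          # a single flash at t = 1 s
print(f"trace 2 s after the flash: {trace[t >= 3.0][0]:.2f}")  # still well above zero
```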
They found that the sensory response of the neuron, simulated as electrical output, increased when both the visual and tactile signals were weak: two faint cues together produced a stronger output than either produced alone.
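One simple way to see why weak cues benefit most is a toy saturating response curve (a hypothetical sigmoid, not the paper's measured data): near the bottom of the curve, two faint inputs together climb its steep portion, while strong inputs are already near saturation.

```python
import math

# Illustrative sigmoid (hypothetical curve, not the paper's measurements):
# for weak inputs the combined response exceeds the sum of the individual
# responses; for strong inputs it does not.
def response(strength, midpoint=5.0):
    """Saturating response to a single stimulus of the given strength (a.u.)."""
    return 1.0 / (1.0 + math.exp(-(strength - midpoint)))

for touch, light in [(2.0, 2.0), (6.0, 6.0)]:      # weak pair vs. strong pair
    separate = response(touch) + response(light)   # each cue sensed alone
    together = response(touch + light)             # cues sensed jointly
    print(f"inputs {touch}+{light}: separate={separate:.3f}, together={together:.3f}")
```

With these numbers the weak pair yields roughly 0.27 together versus 0.09 separately, while the strong pair gains nothing, echoing the enhancement the team reports for weak signals.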
Das explained that an artificial multisensory neuron system could enhance sensor technology’s efficiency, paving the way for more eco-friendly AI uses. As a result, robots, drones and self-driving vehicles could navigate their environment more effectively while using less energy.
Combining sensors to mimic how our brains actually work
“Biology enables small organisms to thrive in environments with limited resources, minimizing energy consumption in the process,” said Das, who is also affiliated with the Materials Research Institute.
“The requirements for different sensors are based on the context. In a dark forest, you’d rely more on listening than seeing, but we don’t make decisions based on just one sense,” he noted. “We have a complete sense of our surroundings, and our decision-making is based on the integration of what we’re seeing, hearing, touching, smelling, etc.
“The senses evolved together in biology, but separately in AI. In this work, we’re looking to combine sensors and mimic how our brains actually work.”
The Army Research Office and the National Science Foundation supported this work.
Citation: Sadaf, M. U., Sakib, N. U., Pannone, A., Ravichandran, H., & Das, S. (2023). A bio-inspired visuotactile neuron for multisensory integration. Nature Communications, 14(1), 1-12. https://doi.org/10.1038/s41467-023-40686-z (open-access)