
New energy-efficient brain-like transistor mimics human intelligence

Dec. 21, 2023.

Radical new transistor design performs energy-efficient associative learning at room temperature

About the Writer

Amara Angelica


Amara Angelica is Senior Editor, Mindplex.

Moiré pattern physics allows for neuromorphic functionality at room temperature (credit: Wikipedia)

Taking inspiration from the human brain, researchers have developed a new synaptic transistor capable of higher-level thinking.

Researchers at Northwestern University, Boston College and MIT have built a device that, like the human brain, both processes and stores information. The transistor goes beyond simple machine-learning tasks: it can categorize data and perform associative learning.

Previous transistor devices of this kind cannot function outside cryogenic temperatures (-310 degrees Fahrenheit or lower). The new device works at room temperature. It also operates at fast speeds, consumes very little energy and retains stored information even when power is removed, making it ideal for real-world applications.

The study was published Dec. 20 in the journal Nature.

Co-locating memory and information processing

“The brain has a fundamentally different architecture than a digital computer,” said Northwestern’s Mark C. Hersam, who co-led the research. “In a digital computer, data move back and forth between a microprocessor and memory, consuming a lot of energy and creating a bottleneck when attempting to perform multiple tasks at the same time.

In the brain, by contrast, "memory and information processing are instead co-located and fully integrated, resulting in orders of magnitude higher energy efficiency. Our synaptic transistor similarly achieves concurrent memory and information processing functionality to more faithfully mimic the brain," he said.

High power consumption

Recent advances in artificial intelligence (AI) have motivated researchers to develop computers that operate more like the human brain. Conventional digital computing systems have separate processing and storage units, causing data-intensive tasks to devour large amounts of energy. 

Currently, the memory resistor, or “memristor,” is the most well-developed technology that can perform combined processing and memory function. But memristors still suffer from energy-costly switching.

“For several decades, the paradigm in electronics has been to build everything out of transistors and use the same silicon architecture,” Hersam said. “Significant progress has been made by simply packing more and more transistors into integrated circuits.”

But this comes “at the cost of high power consumption, especially in the current era of big data, where digital computing is on track to overwhelm the grid. We have to rethink computing hardware, especially for AI and machine-learning tasks.”

Based on moiré patterns

To rethink this paradigm, Hersam and his team explored new advances in the physics of moiré patterns, a type of geometrical design that arises when two patterns are layered on top of one another. When two-dimensional materials are stacked, new properties emerge that do not exist in one layer alone. And when those layers are twisted to form a moiré pattern, unprecedented tunability of electronic properties becomes possible.

For the new device, the researchers combined two different types of atomically thin materials: bilayer graphene and hexagonal boron nitride. When stacked and purposefully twisted, the materials formed a moiré pattern. By rotating one layer relative to the other, the researchers could achieve different electronic properties in each graphene layer, even though they are separated by only atomic-scale dimensions. With the right choice of twist, researchers harnessed moiré physics for neuromorphic functionality at room temperature.
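To get a feel for why small twists matter so much, here is a rough geometric estimate (not from the paper): the moiré superlattice period for two slightly mismatched hexagonal lattices, such as graphene on hexagonal boron nitride, shrinks quickly as the twist angle grows. The lattice constant and roughly 1.8% mismatch below are assumed textbook values, and the formula captures only the stacking geometry, not the device's electronic behavior.

```python
import math

def moire_wavelength(a=0.246, delta=0.018, theta_deg=0.0):
    """Approximate moire superlattice period (nm) for two hexagonal lattices
    with fractional lattice mismatch `delta`, twisted by `theta_deg` degrees.

    Standard geometric relation often quoted for graphene-on-hBN stacks:
        lambda = (1 + delta) * a / sqrt(2 * (1 + delta) * (1 - cos(theta)) + delta**2)

    a     -- graphene lattice constant in nm (assumed ~0.246 nm)
    delta -- graphene/hBN lattice mismatch (assumed ~1.8%)
    """
    theta = math.radians(theta_deg)
    return (1 + delta) * a / math.sqrt(2 * (1 + delta) * (1 - math.cos(theta)) + delta**2)

if __name__ == "__main__":
    # The moire period drops rapidly with twist angle, which is why tiny
    # rotations between layers produce such large changes in electronic behavior.
    for theta in (0.0, 0.5, 1.0, 2.0, 5.0):
        print(f"twist {theta:4.1f} deg -> moire period {moire_wavelength(theta_deg=theta):6.2f} nm")
```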

“With twist as a new design parameter, the number of permutations is vast,” Hersam said. “Graphene and hexagonal boron nitride are very similar structurally but just different enough that you get exceptionally strong moiré effects.”

Higher-level “associative learning”

To test the transistor, Hersam and his team trained it to recognize similar—but not identical—patterns. Earlier this month, Hersam introduced a nanoelectronic device capable of analyzing and categorizing data in an energy-efficient manner, but his new synaptic transistor takes machine learning and AI a leap further.

“If AI is meant to mimic human thought, one of the lowest-level tasks would be to classify data, which is simply sorting into bins,” Hersam said. “Our goal is to advance AI technology in the direction of higher-level thinking. Real-world conditions are often more complicated than current AI algorithms can handle, so we tested our new devices under more complicated conditions to verify their advanced capabilities.”

First, the researchers showed the device one pattern: 000 (three zeros in a row). Then they asked it to identify similar patterns, such as 111 or 101. "If we trained it to detect 000 and then gave it 111 and 101, it knows 111 is more similar to 000 than 101," Hersam explained. "000 and 111 are not exactly the same, but both are three digits in a row. Recognizing that similarity is a higher-level form of cognition, known as 'associative learning.'"
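As a purely conceptual illustration of that last point (a toy sketch, with no relation to the device's physics or the authors' methods), one can score a test pattern against the trained one using a learned feature, such as whether all digits are identical, rather than a bit-by-bit match. Under that feature, 111 ranks closer to 000 than 101 does, even though 101 shares more individual digits with 000.

```python
def uniformity(pattern: str) -> float:
    """Fraction of digits matching the pattern's most common digit.
    '000' and '111' both score 1.0; '101' scores 2/3."""
    counts = {d: pattern.count(d) for d in set(pattern)}
    return max(counts.values()) / len(pattern)

def associative_similarity(trained: str, test: str) -> float:
    """Toy 'associative' score: compare patterns by the learned feature
    (digit uniformity) instead of exact bitwise agreement."""
    return 1.0 - abs(uniformity(trained) - uniformity(test))

trained = "000"
for test in ("111", "101"):
    print(test, round(associative_similarity(trained, test), 2))
# 111 -> 1.0  (shares the feature: three identical digits in a row)
# 101 -> 0.67 (lacks the feature, despite sharing more bits with 000)
```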

In experiments, the new synaptic transistor successfully recognized similar patterns, displaying its associative memory. Even when the researchers threw curveballs, such as giving it incomplete patterns, it still successfully demonstrated associative learning.

“Current AI can be easy to confuse, which can cause major problems in certain contexts,” Hersam said. “Imagine if you are using a self-driving vehicle, and the weather conditions deteriorate. The vehicle might not be able to interpret the more complicated sensor data as well as a human driver could. But even when we gave our transistor imperfect input, it could still identify the correct response.”

Hersam co-led the research with Qiong Ma of Boston College and Pablo Jarillo-Herrero of MIT. He is the Walter P. Murphy Professor of Materials Science and Engineering at Northwestern's McCormick School of Engineering, chair of the Department of Materials Science and Engineering, director of the Materials Research Science and Engineering Center, and a member of the International Institute for Nanotechnology.

The study, “Moiré synaptic transistor with room-temperature neuromorphic functionality,” was primarily supported by the National Science Foundation.

Citation: Yan, X., Zheng, Z., Sangwan, V.K. et al. Moiré synaptic transistor with room-temperature neuromorphic functionality. Nature 624, 551–556 (2023). https://doi.org/10.1038/s41586-023-06791-1
