Twisted magnets make machine learning more adaptable, reduce energy use
Nov. 14, 2023.
Training one large AI model can generate hundreds of tons of carbon dioxide, say the researchers
An international team of researchers has found that by applying an external magnetic field to chiral (twisted) magnets and changing the temperature, the physical properties of these materials can be adapted to suit different machine-learning tasks while dramatically reducing energy use.
“This work brings us a step closer to realizing the full potential of physical reservoirs to create computers that not only require significantly less energy, but also adapt their computational properties to perform optimally across various tasks, just like our brains,” said Dr. Oscar Lee (London Centre for Nanotechnology at UCL and UCL Department of Electronic & Electrical Engineering), lead author of a paper. “The next step is to identify materials and device architectures that are commercially viable and scalable.”
Reducing energy required for machine learning
Traditional computing consumes large amounts of electricity. This is partly because it has separate units for data storage and processing, so information must be shuttled constantly between the two, wasting energy and producing heat. The problem is especially acute for machine learning, which processes vast datasets.
Physical reservoir computing is one of several neuromorphic (or brain-inspired) approaches that aim to remove the need for distinct memory and processing units, facilitating more energy-efficient ways to process data.
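The idea can be illustrated in software with an echo state network, the standard digital analogue of reservoir computing. This is a minimal sketch, not the authors' magnetic device: a fixed random "reservoir" (playing the role the chiral magnet plays in hardware) transforms the input, and only a cheap linear readout is trained.

```python
# Minimal echo-state-network sketch of reservoir computing (NumPy only).
# The reservoir weights are random and never trained; only the linear
# readout is fitted, which is what makes the approach inexpensive.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, n_steps = 1, 100, 500

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))        # fixed input weights
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # spectral radius < 1: fading memory

# Toy forecasting task: predict the next value of a sine wave
u = np.sin(0.2 * np.arange(n_steps + 1))
x = np.zeros(n_res)
states = []
for t in range(n_steps):
    x = np.tanh(W @ x + W_in @ u[t:t + 1])          # reservoir dynamics (untrained)
    states.append(x.copy())
X = np.array(states)                                # collected reservoir states
y = u[1:n_steps + 1]                                # one-step-ahead targets

# Train only the readout, via ridge regression
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
rmse = np.sqrt(np.mean((pred - y) ** 2))
print("prediction RMSE:", rmse)
```

In a physical reservoir, the matrix multiplications inside the loop are replaced by the material's own dynamics, so that part of the computation costs almost no electrical energy; the work reported here shows that tuning field and temperature changes how well the material's dynamics suit a given task.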
Customizing computation tasks
The researchers found that different magnetic phases of chiral magnets excelled at different types of computing task. The skyrmion phase, in which magnetic moments swirl in a vortex-like pattern, showed a strong memory capacity well suited to forecasting tasks. The conical phase, meanwhile, had little memory, but its non-linearity was ideal for transformation and classification tasks, for instance, identifying whether an animal is a cat or a dog.
The work also involved researchers at Imperial College London, the University of Tokyo and Technische Universität München and was supported by the Leverhulme Trust, Engineering and Physical Sciences Research Council (EPSRC), Imperial College London President’s Excellence Fund for Frontier Research, Royal Academy of Engineering, the Japan Science and Technology Agency, Katsu Research Encouragement Award, Asahi Glass Foundation, and the DFG (German Research Foundation).
Citation: Lee, O., Wei, T., Stenning, K. D., Gartside, J. C., Prestwood, D., Seki, S., Aqeel, A., Karube, K., Kanazawa, N., Taguchi, Y., Back, C., Tokura, Y., Branford, W. R., & Kurebayashi, H. (2023). Task-adaptive physical reservoir computing. Nature Materials, 1-9. https://doi.org/10.1038/s41563-023-01698-8 (open-access)