
Easy-to-use code generator for AI models could save bandwidth, memory, and computation

Feb. 04, 2025.

MIT researchers have built a code generator that makes AI models work better by exploiting redundancy in their data.

About the Writer

Giulio Prisco


Giulio Prisco is Senior Editor at Mindplex. He is a science and technology writer mainly interested in fundamental science and space, cybernetics and AI, IT, VR, bio/nano, crypto technologies.

Artificial intelligence (AI) models, such as those used for medical imaging or voice recognition, rely on data structures called tensors. A tensor generalizes a matrix to more than two dimensions, which makes it harder to work with. Operating on tensors demands a lot of computation, and therefore a lot of energy.
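As a quick illustration (mine, not from the article), a tensor is just a multi-dimensional array. In NumPy, a matrix is a 2-dimensional tensor, and stacking matrices gives a 3-dimensional one:

```python
import numpy as np

# A matrix is a 2-D tensor; adding a dimension gives a 3-D tensor,
# e.g. a batch of 4 grayscale images, each 32x32 pixels.
images = np.zeros((4, 32, 32))
print(images.ndim)   # -> 3, the tensor's number of dimensions
print(images.shape)  # -> (4, 32, 32)
```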

MIT researchers have devised a system that makes AI models work better by exploiting two kinds of data redundancy: sparsity and symmetry. Sparsity means many values in the data are zero, so only the non-zero parts need to be processed. Symmetry means parts of the data mirror each other, so only half needs to be computed and the result can be reused for the other half.
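A minimal sketch (my own illustration, not the researchers' code) of how each redundancy saves work in a matrix-vector product: symmetry lets the loop visit only entries with i ≤ j, and sparsity lets it skip zeros entirely.

```python
import numpy as np

def symmetric_sparse_matvec(A, x):
    """Multiply a symmetric matrix A by vector x, touching only the
    upper triangle (symmetry) and skipping zero entries (sparsity)."""
    n = len(x)
    y = np.zeros(n)
    for i in range(n):
        for j in range(i, n):          # symmetry: upper triangle only
            if A[i, j] == 0:           # sparsity: skip zeros
                continue
            y[i] += A[i, j] * x[j]
            if i != j:                 # mirror the off-diagonal entry
                y[j] += A[i, j] * x[i]
    return y

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [1.0, 0.0, 4.0]])        # symmetric and mostly zero
x = np.array([1.0, 2.0, 3.0])
print(symmetric_sparse_matvec(A, x))   # same result as A @ x
```

The inner loop does far fewer multiplications than the dense version, and the gap grows with the size and sparsity of the matrix.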

Until now, exploiting these redundancies has been difficult, and developers could typically use only one type at a time. The MIT system, called SySTeC, exploits both at once, making some computations up to 30 times faster.

Automatic generation of complex code

SySTeC is easy to use, so even people new to deep learning can improve their AI algorithms. The system automates the generation of complex, optimized code.

SySTeC starts by figuring out how to exploit symmetry. If a tensor is symmetric, SySTeC operates on only half of it; if intermediate results are symmetric, it skips the redundant work there too. It then handles sparsity by keeping only the non-zero data. Finally, SySTeC generates ready-to-use code.
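The two passes described above can be sketched as successive filters over a symmetric matrix (again my illustration, not SySTeC's actual output): the symmetry pass keeps only the upper triangle, and the sparsity pass then keeps only the nonzeros.

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [1.0, 0.0, 4.0]])   # symmetric, mostly zero

# Pass 1 (symmetry): keep only entries with i <= j -- half the matrix.
upper = [(i, j, A[i, j]) for i in range(3) for j in range(i, 3)]

# Pass 2 (sparsity): of those, keep only the nonzeros.
stored = [(i, j, v) for (i, j, v) in upper if v != 0]

print(len(upper), len(stored))   # 6 entries instead of 9, then only 4 nonzeros
```

Generated code would then loop over just those four stored entries, getting the benefit of both optimizations at once.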

“In this way, we get the benefits of both optimizations,” says researcher Willow Ahrens in an MIT press release. “And the interesting thing about symmetry is, as your tensor has more dimensions, you can get even more savings on computation.”

The MIT researchers have described SySTeC in a preprint posted on arXiv and will present it at the International Symposium on Code Generation and Optimization.

The MIT researchers argue that SySTeC could make AI models not just faster but also less demanding of memory and bandwidth.
