Elon Musk's xAI builds supercomputer for AI with "superhuman" speed
The company set up a computing cluster of 100,000 Nvidia H100 GPUs for Grok in just 19 days, a task that usually takes about four years.
Analog computing solves complex equations using far less energy
Memristors can run AI tasks at 1/800th the power, according to studies reported in IEEE Spectrum
'Skyrmions' move at record speeds: a step toward future computing
Nanoscale memory bits may offer high storage capacity and low energy consumption
Radical new light-wave chip design enables AI computing at speed of light
The chip uses high-speed light waves instead of electricity and could be adapted for use in GPUs
Twisted magnets make machine learning more adaptable, reduce energy use
Training a single large AI model can generate hundreds of tons of carbon dioxide, the researchers say