Scalable AGI | Mindplex Podcast – S2EP13
Mar. 23, 2024 · 1 hr 32 min 57 sec listen · 47 Interactions
Simuli CEO Rachel StClair explains how holographic compression will revolutionize data processing, and why standardized protocols are important in paving the way for superintelligent AGI and quantum computing.
15 thoughts on “Scalable AGI | Mindplex Podcast – S2EP13”
Good
Good!
wow amazing
Nice episode
Episode 26 is really awesome.
wow amazing discussion
Good project
It was impressive!
Keep going!
Interesting
Good topic👍
Really, the Mindplex podcast is just impressive.
An important protocol, and very nice.
I am impressed.
Very nice presentation and a very nice voice. I liked it.
As a programmer, the idea of self-sustaining AI that can auto-update its source code scares the snot out of me. 😉
When we compare the state of quantum computers and AI systems today, the logical conclusion is that we will have self-sustaining AI much sooner than a fully functional QC. QC is hype, and the hardware is still in its cradle stage. It won't be useful at a scale large enough to impact the AI world in our lifetime. Hmm, unless the damn AIs become self-sustaining and design better hardware.
Again, frankly speaking, AGI, or self-sustaining AI in the classical sense (conscious), is a long way from reality. Yes, we hype it, but the road ahead is long. However, an LLM is closer to being a self-sustaining system, and very soon one of the good LLMs will be able to rewrite its own code and update most software systems without any human supervision.
I have enjoyed this episode, and kudos, guys.