
What Is Truth and Its Importance in the Age of Singularity: With Michael Shermer | MCP Episode 14

Aug. 12, 2024 · 1 hr 11 min 51 sec listen · 4 Interactions

In this episode, Dr. Mihaela Ulieru interviews Michael Shermer, founding publisher of Skeptic magazine and renowned science writer, discussing his work on skepticism and his bestselling books.

About the host

Mihaela Ulieru

22.82332 MPXR

Dr. Ulieru, a Blockchain pioneer, champions equality and a fair society. As an esteemed academic researcher, she spearheads multi-stakeholder programs in Distributed Artificial Intelligence and Applications. With over 200 peer-reviewed articles, over 250 keynote speeches, and support for several non-profits and foundations, she still finds time to write poetry.


2 Comments


  1. Hugh

1 month ago
    3.91169 MPXR
    2 interactions

I enjoyed this interview; however, I cringed when Michael said he didn't believe in the alignment issue because the government would be able to tell Elon to shut off the self-driving algorithm if it got out of control. Did he not read Bostrom's Superintelligence? If the superintelligence takeoff is short enough, no one will be able to "turn it off," and if at that point it's not aligned, well, then it might be game over.

    1. I have also enjoyed this one!

First, kudos to Dr. Mihaela! Your interview with Michael is probably the best I have seen, and it is ten times better than the one with Charles Hoskinson. Keep it up.

Now I am going to address your [Hugh's] comment. Like you, I agree that ignoring the alignment issue and relying on government intervention is a fruitless plan. However, I do not agree with Bostrom's own proposal either. I mention that here because you brought up Bostrom, and since I do not know your stance on his side's proposal, I wanted to hear your view on it as well.

Like you said, Bostrom correctly identified the risk; well, to be honest, any smart person would identify it as well. Yet Bostrom's solution is also a fruitless plan, or even a much worse one, which will create tyrants. He and his group suggest that, to avoid the risks of a short takeoff, all development of AI should be monopolized, with the rights reserved for selected elites like him and his group. As you can see, this is worse than ignoring the alignment issue or hoping for benevolent government regulation.

