Microsoft is using ChatGPT to teach robots to follow commands in plain English, but what could go wrong?

Feb. 23, 2023.
1 min. read

About the Writer

Lewis Farrell

Highly curious about things that increase my awareness, expand my perception, and make me open to being a better person.

Microsoft is collaborating with OpenAI, the creators of ChatGPT, to teach robots to obey simple commands in plain English, and the implications of this development are enormous. The collaboration’s goal is to make it easier for people to interact with robots without having to learn complex programming languages or details about robotic systems.

Microsoft’s new set of design principles for using ChatGPT in robotics lets robots reason about the physical world and take their instructions in plain English. The framework starts by defining a set of high-level functions the robot can perform, then writing a prompt that describes the task and those functions so that ChatGPT can translate your instruction into code the robot understands, and finally running a simulation of the robot following your instructions before anything is deployed on real hardware.
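
To make those steps concrete, here is a minimal sketch in Python. The high-level functions (move_to, grab_object, release_object), the prompt wording, and the "generated" code are all hypothetical illustrations rather than Microsoft's actual API; in the real workflow the generated code would come back from a call to ChatGPT instead of being hard-coded.

# A toy high-level robot API plus a print-based "simulator" stub.
def move_to(x: float, y: float) -> None:
    print(f"[sim] moving to ({x}, {y})")

def grab_object(name: str) -> None:
    print(f"[sim] grabbing '{name}'")

def release_object(name: str) -> None:
    print(f"[sim] releasing '{name}'")

ROBOT_API = {"move_to": move_to, "grab_object": grab_object,
             "release_object": release_object}

# Steps 1-2: a prompt that lists the allowed functions and states the task
# in plain English. This is what would be sent to ChatGPT.
PROMPT = """You control a robot only through these Python functions:
move_to(x, y), grab_object(name), release_object(name).
Write Python code that picks up the red block at (1.0, 2.0)
and puts it down at (3.0, 0.5). Output code only."""

# Step 3: in the real workflow this code comes back from ChatGPT; a plausible
# hand-written response is used here so the sketch runs without an API key.
generated_code = """
move_to(1.0, 2.0)
grab_object("red block")
move_to(3.0, 0.5)
release_object("red block")
"""

# Step 4: "simulate" by running the generated code against the stub API and
# reading the printed actions before anything touches real hardware.
exec(generated_code, {"__builtins__": {}, **ROBOT_API})

Running the generated code against stubs first is the point of the simulation step: you can see exactly what the robot would do and catch a bad plan before it ever moves a physical arm.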

While this is an exciting development, it also raises concerns, because a robot’s behavior could effectively be scripted by the model itself rather than by a human programmer, which could have unintended consequences. Before any significant deployment, it is critical to ensure that the technology has been thoroughly tested and that any potential risks have been addressed. Nonetheless, this advance has the potential to transform how people interact with robots, making it easier to develop and deploy new capabilities.
