What to Do About a Problem Like AI

A new book argues that the solution to the AI quandary is to make it more human and sentimental.

By Matt Hrodey
Jun 11, 2023 11:00 PM
Should we make machines more like us? (Credit: VAlex/Shutterstock)

As the field of artificial intelligence (AI) has grown from vague hopes into striking realities such as ChatGPT, the alarms surrounding it have grown louder. Geoffrey Hinton, the so-called “godfather of AI,” left Google so that he could speak more freely about the threat posed by the technology, which may one day grow smarter than its creators. He firmly opposes military use of AI and worries that an artificial hyperintelligence could one day manipulate human beings.

How do we convince our machines to behave ethically, even when we’re not watching? A forthcoming book by Eve Poole, Robot Souls: Programming in Humanity, argues that we must make them more like us, imbuing them with empathy and compassion even if it reduces their efficiency. This would be no simple software update: as with humans, empathy and caring would remain specific to the individual machine, a kind of artificial subjectivity.


Copyright © 2023 Kalmbach Media Co.