Geoffrey Hinton, often called the “Godfather of AI,” is urging researchers to design AI systems with built-in nurturing instincts, arguing this is necessary to keep humanity safe as machines surpass human-level intelligence. The AI expert believes machines must be trained to “care for us, like we’re their babies.”
According to The Decoder, Hinton argued at the recent Ai4 conference in Las Vegas that trying to keep superintelligent AI permanently under human control is unrealistic. Rather than humans acting as the boss of advanced AI, he envisions a future in which people relate to hyper-capable machines much as a child depends on its mother.
“We need to make machines that are smarter than us care for us, like we’re their babies,” Hinton said in his talk. “The focus of AI development should expand beyond just making systems more and more intelligent, to also ensuring they are imbued with genuine concern for human wellbeing.”
Under Hinton’s framework, humanity’s role would shift from commanding AI to nurturing it, even as it grows to eclipse human capabilities. He drew an analogy to good parenting, where caring mothers help guide the development of children who will ultimately become more capable than they are. Hinton argues AI research should strive to hardwire a similar dynamic between people and machines.
The former Google researcher, who left the company to more freely discuss AI’s risks, believes his “mothering AI” approach could unite the international community around developing safe artificial intelligence. “Every country wants AI that augments and supports its citizens, not replaces them,” Hinton said. “Built-in nurturing instincts provide a natural path to that kind of supportive AI.”
Meta’s chief AI scientist, Yann LeCun, described Hinton’s idea as a simplified version of a safety approach he has long advocated, which he calls “objective-driven AI.” It involves architecturally constraining AI systems so they can only take actions in service of specific, hard-coded goals and values.
“You essentially define the AI equivalent of drives and instincts found in humans and animals,” LeCun explained in a LinkedIn post. “So in addition to drives for things like empathy and subservience to people, you have a large number of low-level rules like ‘don’t run over humans’ or ‘don’t swing your arm if holding a knife near people.’”
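To make the idea concrete, here is a minimal, hypothetical sketch in Python of how hard-coded guardrail objectives might filter an agent’s candidate actions. The class names, rules, and scoring below are illustrative assumptions for this article, not LeCun’s actual architecture or any published system.

```python
# Illustrative sketch only: a toy "guardrail" filter loosely inspired by the
# objective-driven idea described above. The objectives, action names, and
# scoring here are hypothetical, not drawn from any real implementation.

from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class Action:
    name: str
    task_score: float          # how well the action serves the task objective
    harms_human: bool          # flag set by a (hypothetical) safety predictor
    knife_near_people: bool    # another hypothetical low-level safety flag


# Hard-coded guardrail objectives: an action is admissible only if it
# violates none of them, regardless of how well it scores on the task.
GUARDRAILS: List[Callable[[Action], bool]] = [
    lambda a: not a.harms_human,        # "don't run over humans"
    lambda a: not a.knife_near_people,  # "don't swing your arm if holding a knife near people"
]


def choose_action(candidates: List[Action]) -> Optional[Action]:
    """Pick the highest-scoring candidate that satisfies every guardrail."""
    admissible = [a for a in candidates if all(rule(a) for rule in GUARDRAILS)]
    if not admissible:
        return None  # refuse to act rather than violate a hard constraint
    return max(admissible, key=lambda a: a.task_score)


if __name__ == "__main__":
    options = [
        Action("drive_fast_through_crowd", task_score=0.9,
               harms_human=True, knife_near_people=False),
        Action("slow_down_and_wait", task_score=0.4,
               harms_human=False, knife_near_people=False),
    ]
    best = choose_action(options)
    print(best.name if best else "no admissible action")  # -> slow_down_and_wait
```

The point of the sketch is the ordering of concerns: the hard constraints are checked before any task objective is optimized, so a high-scoring but unsafe action can never be selected.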
Read more at the Decoder here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.