Meta CEO Mark Zuckerberg claimed this week that artificial intelligence companies are trying to create "God or something," but said his company is different.
"I find it a pretty big turnoff when people in the tech industry kind of talk about building this one true AI. It's almost as if they think they're creating God or something. And it's like, that's not what we're doing," Zuckerberg said during an interview with YouTuber Kane Kallaway.
He also maintained that his company had other goals, such as allowing creators and small businesses to personalize their own AIs to suit their needs.
"So, a big part of the approach is going to be enabling every creator, and then eventually also every small business on the platform, to create an AI for themselves to help them interact with their community and their customers if they're a business," he said.
However, according to The Sun, the tech CEO made no serious attempt during the 40-minute interview to address the ethical concerns surrounding the technology, such as job losses, privacy, and misinformation.
In a study conducted by German AI ethicist Thilo Hagendorff of the University of Stuttgart, large language models (LLMs) overwhelmingly exhibited "maladaptive" behavior and "Machiavellianism."
"GPT-4, for instance, exhibits deceptive behavior in simple test scenarios 99.16% of the time," the study said.
In another study, published in the journal Patterns, researchers found that a Meta model likewise exhibited Machiavellian tendencies:
"Billed as a human-level champion in the political strategy board game 'Diplomacy,'" the tech zine Futurism wrote, "Meta's Cicero model was the subject of the Patterns study. As the disparate research group — comprised of a physicist, a philosopher, and two AI safety experts — found, the LLM got ahead of its human competitors by, in a word, fibbing."
Nick Koutsobinas, a Newsmax writer, has years of news reporting experience. A graduate from Missouri State University’s philosophy program, he focuses on exposing corruption and censorship.