Facebook CEO Mark Zuckerberg hit back Tuesday at whistleblower Frances Haugen, who testified to a Senate Commerce subcommittee that the company's platform harms children and relies too much on artificial intelligence rather than employees to keep track of its content.
"We care deeply about issues like safety, well-being, and mental health," Zuckerberg said in a lengthy statement posted on the social media platform and distributed to his company's employees Tuesday. "It's difficult to see coverage that misrepresents our work and our motives."
Haugen, a former Facebook product manager, testified earlier in the day that the company's artificial intelligence systems, used to combat misinformation, hate speech, and advertising that is inappropriate for children, catch only about 10% to 20% of the site's banned content.
She also accused Zuckerberg of prioritizing higher profits over safety and called for transparency, saying the company entices users to spend more time on the site, which gives it more opportunities to target advertising at them.
"The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people," she said, calling for congressional action and sparking outcry from lawmakers on both sides of the aisle.
Zuckerberg, though, said Facebook leads the social media industry in research on its impact and transparency and denied his company polarizes society.
"If social media were as responsible for polarizing society as some people claim, then why are we seeing polarization increase in the U.S. while it stays flat or declines in many countries with just as heavy use of social media around the world?" he argued.
He also highlighted the company's efforts to protect users from harmful content, including creating platforms such as "Messenger Kids" and introducing a feature called "Meaningful Social Interactions" to its news feed, which shows fewer viral videos and more content from friends and family.
Zuckerberg said that move was made "knowing it would mean people spent less time on Facebook, but that research suggested it was the right thing for people's well-being. Is that something a company focused on profits over people would do?"
Zuckerberg also said he believes it is important for everything the company builds to be "safe and good for kids."
"The reality is that young people use technology," he said. "Rather than ignoring this, technology companies should build experiences that meet their needs while also keeping them safe. We're deeply committed to doing industry-leading work in this area."
Zuckerberg acknowledged that it's "frustrating" for employees to see the "good work we do get mischaracterized, especially for those of you who are making important contributions across safety, integrity, research, and product," but said that he believes that over the long term if the company keeps "trying to do what's right and delivering experiences that improve people's lives, it will be better for our community and our business."
Meanwhile, he also said Facebook has advocated for updated internet regulations for several years, and that he doesn't "believe private companies should make all of the decisions on their own."
"I have testified in Congress multiple times and asked them to update these regulations," said Zuckerberg. "I've written op-eds outlining the areas of regulation we think are most important related to elections, harmful content, privacy, and competition."
Further, he addressed the lengthy outage that hit Facebook and its other platforms, Instagram and WhatsApp, on Monday, calling it the worst "we've had in years."
"This was also a reminder of how much our work matters to people," said Zuckerberg. "The deeper concern with an outage like this isn't how many people switch to competitive services or how much money we lose, but what it means for the people who rely on our services to communicate with loved ones, run their businesses, or support their communities."
© 2021 Newsmax. All rights reserved.