Facebook’s chief spokesman says the social media giant is going to institute new safeguards for younger Instagram users after a whistleblower charged that the company puts profits ahead of users’ well-being.
In an interview on CNN’s “State of the Union,” Nick Clegg, Facebook’s vice president for global affairs, said the company has always been aware of, and sensitive to, its younger users.
“But we understand the concerns of some that we need to press pause, listen to experts, explain our intentions and so on,” he said just days after Frances Haugen, a former product manager for Facebook, told a panel of the Senate Commerce Committee that the company’s pursuit of profits stoked division and harmed the mental health of young users.
“We’re going to introduce new controls for adults of teens on an optional basis, obviously, so adults can supervise what their teens are doing online,” he said.
“Secondly, we'll be doing something which I think will make a considerable difference, which is where the teen is looking at the same content over and over again, and it's content which may not be conducive to their well-being, we'll nudge them to look at other content,” he said.
“And the third thing we're introducing is something called ‘take a break,’ where we'll be prompting teens to simply take a break from using Instagram.”
Clegg also said the company is willing to ensure that its algorithms are working as intended, but warned that if they’re removed, “what you would see is more, not less, hate speech, more, not less, misinformation.”
“These work as a giant spam filter to deprecate bad content. Of course it has a downside, but it also has a very positive effect,” he declared.
He also vowed that “where we see content we think is relevant to the investigations of law enforcement, we cooperate with them.”
“The whole point, of course, of Facebook is that each person's news feed is individual to them. It's like an individual fingerprint, and that's basically determined by the interaction of your choice, your friends, your family, the groups you choose to be part of,” he said.
“We need greater transparency of the systems we have in place,” he added. “[A]lgorithm systems, machine learning systems, they should be held to account if necessary by regulation, so people can match what our systems say they should do with what actually happens.”
© 2022 Newsmax. All rights reserved.