A Facebook executive said Sunday that the company would introduce new measures on its apps to nudge teens away from harmful content, as US lawmakers scrutinize how Facebook and subsidiaries like Instagram affect young people’s mental health.
Nick Clegg, Facebook’s vice president of global affairs, also expressed openness to the idea of letting regulators have access to the Facebook algorithms used to amplify content. But Clegg said he could not answer the question of whether those algorithms had amplified the voices of people who attacked the US Capitol on January 6.
The algorithms “should be held to account, if necessary, by regulation so that people can match what our systems say they’re supposed to do from what actually happens,” Clegg told CNN’s “State of the Union.”
“We’re going to introduce something which I think will make a considerable difference, which is where our systems see that the teenager is looking at the same content over and over again and it’s content which may not be conducive to their well-being, we will nudge them to look at other content,” Clegg told CNN.
In addition, “we’re introducing something called ‘take a break,’ where we will be prompting teens to simply take a break from using Instagram,” Clegg said.
US senators last week grilled Facebook on its plans to better protect young users on its apps, drawing on leaked internal research showing the social media giant was aware that its Instagram app damaged the mental health of young people.
Senator Amy Klobuchar, a Democrat who chairs the Senate Judiciary Committee’s antitrust subcommittee, has argued for more regulation against technology companies like Facebook.
“I’m just tired of hearing ‘trust us’, and it’s time to protect those moms and dads that have been struggling with their kids getting addicted to the platform and being exposed to all kinds of bad stuff,” Klobuchar told CNN on Sunday after Clegg’s interview.
© Thomson Reuters 2021