In the News
WIBW | Sarah Motter
TOPEKA, Kan. (WIBW) - Following allegations that social media giant Facebook fed the Capitol riot on Jan. 6, Frances Haugen has detailed to Senator Jerry Moran just what the company knew and how legislative reform could make effective changes.
U.S. Senator Jerry Moran (R-Kan.), member of the U.S. Senate Commerce Subcommittee on Consumer Protection, Product Safety and Data Security, questioned Facebook Whistleblower Frances Haugen on Tuesday, Oct. 5, based on the social media giant’s potential harm to children.
Sen. Moran questioned Haugen about whether the company knew its decisions would harm children who use its platforms, including Instagram and WhatsApp, but decided to proceed with its harmful behaviors anyway.
“Facebook’s internal research is aware that there are a variety of problems facing children on Instagram,” said Haugen. “They know that severe harm is happening. For example, in the case of bullying, Facebook knows that Instagram dramatically changes the experience of high school.”
Haugen noted that with the introduction of social media, bullying now follows children home through their mobile devices instead of staying at school.
“The last thing they see before they go to bed at night is someone being cruel to them, or the first thing they see in the morning is someone being cruel to them,” Haugen said. “Kids are learning that their own friends, that people who they care about, are cruel to them.”
Further, Haugen said parents are ill-equipped to deal with such a new issue and give their children well-meaning but inaccurate advice.
“I don’t understand how Facebook can know all these things and not escalate to someone like Congress for help and support navigating these problems,” Haugen continued.
Sen. Moran proceeded to question the whistleblower about whether the company was aware of issues beyond children and body image, ones that deal a great amount of harm to adults who use the platforms.
Haugen said the company is, in fact, aware that its "meaningful social interactions" ranking, which prioritizes the content that garners the most engagement, does not account for whether that content involves bullying or hate speech. She said content on the platform began to change when publishers such as BuzzFeed admitted that some of their most successful content was the content they were most ashamed of. She also said politicians have been forced to take positions their constituents do not approve of because those are the positions that get distributed on Facebook.
“Facebook also knows that, they have admitted in public that, engagement-based ranking is dangerous without integrity and security systems, but then not rolled out integrity and security systems to most of the languages in the world,” said Haugen. “And that’s what’s causing things like ethnic violence.”
Moran then questioned Haugen about the magnitude and impacts of the sales of Facebook users' data; however, Haugen said she did not work in that department and did not have much information about it.
As for legal action Congress could take that would have the most impact on Facebook, Haugen said she strongly recommends reforming Section 230 to exempt decisions about algorithms.
“Modifying 230 around content is very complicated because user-generated content is something that companies have less control over,” said Haugen. “They have 100% control over their algorithms and Facebook should not get a free pass on choices it makes to prioritize growth, and virality, and reactiveness over public safety. They shouldn’t get a free pass on that because they’re paying for their products right now with our safety.”
Haugen went on to describe how she thought an oversight body could also bring about reform.
“Right now, the only people in the world who are trained to analyze these experiments to understand what’s happening inside of Facebook are the people who grew up inside of Facebook, or Pinterest or another social media company,” she said. “And there needs to be a regulatory home where someone like me could do a tour of duty after working at a place like this and have a place to work on things like regulation and to bring information out to the oversight boards.”
Previously, Haugen worked at Facebook as a product manager supporting counter-espionage efforts before she quit the company earlier in 2021, taking a stash of internal research with her.
After the hearing, Moran called for increased transparency in Big Tech.
“People should be able to connect with one another online without being manipulated by secret algorithms created by Big Tech that can exacerbate mental illness and thoughts of suicide,” said Moran. “There must be increased transparency provided by these tech giants so that Kansans have the information necessary to choose what services to use and have better control over their personal and private information.”
Moran said that in 2020 he introduced the Consumer Data Privacy and Security Act, landmark federal data privacy legislation to strengthen laws that govern personal data.
During the hearing, Moran said he and Sen. Richard Blumenthal (D-Conn.) recommitted their efforts to a bipartisan path forward in light of Haugen's revelations.
Moran said he is also a cosponsor of the Filter Bubble Transparency Act, which would require large-scale internet platforms to show users content that is not the result of a secret algorithm.