A former Facebook Inc employee has said the social media giant’s products harm the mental health of some young users, sow divisiveness and weaken democracy, and urged US lawmakers to regulate the company.
Whistleblower Frances Haugen told a US Senate subcommittee on Tuesday that Facebook has repeatedly misled the public about the harm it knows teenage girls suffer from its photo-sharing app Instagram, and about how its products fuel division.
“I’m here today because I believe Facebook’s products harm children, divide and weaken our democracy,” Haugen said in a statement before her testimony on Capitol Hill.
“Congressional action is needed. They will not solve this crisis without your help.”
Her testimony came a day after Facebook and two of its main services, Instagram and messaging app WhatsApp, suffered an hours-long global outage, and after weeks of mounting pressure on the social media company to explain its policies for young users.
Haugen went public in an interview with CBS on Oct. 3, revealing that she was the one who provided documents used in a Wall Street Journal investigation and a Senate hearing on the alleged harms of Instagram.
The WSJ stories showed that the company contributed to increased polarization online when it made changes to its content algorithm, failed to take steps to reduce vaccine hesitancy, and was aware that Instagram was harming the mental health of teenage girls.
“As long as Facebook operates in the shadows and hides its research from public scrutiny, it is unaccountable,” Haugen told the panel on Tuesday.
“Until the incentives change, Facebook won’t change. Left alone, Facebook will continue to make choices that are against the public good,” she said. “Facebook is hiding behind walls that keep researchers and regulators from understanding the true dynamics of their system.”
Facebook spokesman Kevin McAlister said in an email to Reuters news agency that the company believes protecting its community is more important than maximizing profits.
He also said it was incorrect to say that leaked internal research showed Instagram was “toxic” to teenage girls.
That echoed testimony from Facebook’s head of global safety, Antigone Davis, delivered last week before the same Senate committee. “We care deeply about the safety and security of the people on our platform,” Davis said at the time.
“We take the issue very seriously… We have implemented multiple safeguards to create safe and age-appropriate experiences for people between the ages of 13 and 17.”
But during Tuesday’s hearing, US senators accused Facebook CEO Mark Zuckerberg of pursuing higher profits while being cavalier about user safety. They also demanded that US regulators investigate Haugen’s allegations that the company’s products are harming children and inciting division.
In an era of deep political division in Washington, DC, both Republican and Democratic lawmakers agreed on the need for major changes.
In an opening statement, Democratic Senator Richard Blumenthal, who chairs the subcommittee holding the hearing, said Facebook knew its products were addictive, like cigarettes.
“Big Tech is now facing its Big Tobacco moment of truth,” Blumenthal said.
He called on Zuckerberg to appear before the committee, and the Securities and Exchange Commission and the Federal Trade Commission to investigate the company.
“Our children are the victims. Teens who look in the mirror these days feel doubt and insecurity. Mark Zuckerberg should look at himself in the mirror,” Blumenthal said.
Senator Marsha Blackburn, the top Republican on the subcommittee, said Facebook turned a blind eye to children under 13 on its sites. “It’s clear that Facebook puts profit above the well-being of children and all users,” Blackburn said.
Shihab Rattansi of Al Jazeera, reporting from Capitol Hill, said regulating content on Facebook and other social media platforms will be difficult for Congress because the First Amendment to the US Constitution protects free speech.
“The question becomes, ‘What criteria will be used, and who will oversee that?’” Rattansi said.
Still, Jason Kint, CEO of the trade organization Digital Content Next, said Tuesday’s hearing was important. “What’s different at this point is evidence coming out of the building,” he told Al Jazeera.
“What this hearing provides is proof that they knew, and that there was actual empirical data supporting all of this downstream harm from the way the platform works.”