Instagram Chief Says Tech Companies Must Earn Liability Shield

(Bloomberg) — The Meta Platforms Inc. executive who leads Instagram told U.S. senators that tech companies should only receive legal protection from liability for user-generated content if they adhere to kids’ safety rules established by a new oversight body — part of his pitch to defuse congressional anger at the social media giant.

The proposal from Instagram’s Adam Mosseri expands a previous call from Mark Zuckerberg, chief executive officer of Facebook parent Meta, for lawmakers to update technology companies’ legal shield, known as Section 230 of the Communications Decency Act of 1996. Critics say reforming the law would only benefit dominant platforms that have resources to devote to complying with the new legal requirements.

Under Mosseri’s proposal, Congress would form “an industry body that will determine best practices” for how tech companies should keep children and teens safe online. In consultation with outside experts and advocates, that group would develop uniform standards for how internet platforms verify the ages of their users, design age-appropriate experiences and include parental controls in their services, Mosseri said. 

“I want to assure you we do have the same goal: We all want teens to be safe online,” Mosseri said Wednesday during a Senate subcommittee hearing. “This is an industrywide challenge that requires industrywide solutions and industrywide standards.”

For years, Meta has called on Congress to pass new regulations for online platforms in the face of bipartisan criticism that the company doesn’t do enough to protect users’ safety, privacy or well-being on its social networks. While lawmakers have proposed several Section 230 bills, there is little consensus among them on the best strategy to force tech companies to improve their business practices. 

Mosseri touted a recent announcement by Instagram about a handful of new product features designed to improve teen safety and parents’ visibility into their children’s digital behavior. The changes will let users of the photo-sharing app set a reminder to take breaks from scrolling, limit the interaction between teens and people they don’t follow and provide more tools for parental control. 

Instagram’s new features have already elicited skepticism among some lawmakers, who argue the moves don’t go far enough to protect children. 

“Instagram pushed forward with what I call half-measures on the eve of the hearing, trying to show that they are willing to give parents more tools in the toolbox to regulate how children are online,” Tennessee Senator Marsha Blackburn told Bloomberg Television in an interview before the hearing. “It is not enough to satisfy parents who are concerned about this.”

Read More: Instagram to Nudge People to ‘Take a Break’ From Social App

Instagram has faced sharp criticism from regulators around the world ever since the Wall Street Journal and a consortium of media organizations earlier this year published a series of damaging reports based on internal documents disclosed by Frances Haugen, a former Facebook product manager turned whistle-blower. The documents surfaced new revelations about how the company’s social networks spread hate speech and misinformation as well as harm the mental health of vulnerable teenagers.

Speaking during Wednesday’s hearing before the Senate Commerce consumer protection subcommittee, lawmakers said recent revelations by Haugen are adding to the bipartisan momentum to set tougher rules for privacy and platform accountability. 

“Parents are asking, what is Congress doing to protect our kids?” said subcommittee Chair Richard Blumenthal, a Connecticut Democrat. “The resounding bipartisan message from this committee is: legislation is coming. We can’t rely on trust anymore. We can’t rely on self-policing.”

One of the most troubling revelations from the Facebook documents shared by Haugen was that Instagram often makes teenagers who already have body-image issues feel worse about themselves. Both Democrats and Republicans have pointed to that internal research as proof that Facebook understood potential harms for its youngest users, but hid that information from Congress and the public.

“There is such a frustration that you turn a blind eye towards taking responsibility and accepting accountability for your platform,” Blackburn said.

QuickTake: How Facebook Whistle-Blower Stoked Screen Time Debate

Mosseri repeated Facebook’s earlier defense of its internal studies, arguing that the findings were taken out of context and that some were based on small sample sizes. He added that the same research found that many teen users said Instagram helped them navigate “hard moments.”

“Our goal with all of the research that we do is to improve the services that we offer,” Mosseri said in prepared remarks. “That means our insights often shed light on problems so that we can evaluate possible solutions and work to improve.”

©2021 Bloomberg L.P.