What you need to know as Big Tech executives defend social media safety on Capitol Hill
By Brian Fung and Clare Duffy, CNN
Originally published: 31 JAN 24 06:00 ET | Updated: 31 JAN 24 11:06 ET
Washington (CNN) — A group of social media bigwigs is being grilled by Congress on Wednesday, yet again, about the risks their products pose to young people.
The chief executives of Meta, TikTok, Snap, Discord and X, formerly known as Twitter, are testifying before the Senate Judiciary Committee starting right now.
The hearing became even more relevant Wednesday, a day after a gruesome video was posted on YouTube showing a man holding what he claimed was his father’s decapitated head. The video circulated for hours on YouTube – garnering more than 5,000 views – before it was taken down.
Here are our main takeaways so far.
A rare unifying force on Capitol Hill: Hating social media companies
Wednesday’s hearing again demonstrates the breadth of criticism of social media companies among lawmakers, a rare point of bipartisan agreement on Capitol Hill.
Early in the hearing, Republican Sen. Lindsey Graham noted that he has “almost nothing in common” with his Democratic colleague Sen. Elizabeth Warren and that he holds a different political philosophy from Democratic Sen. Dick Durbin. Even so, he said, they agree on how tech is affecting society.
“Elizabeth and I see an abuse here that needs to be dealt with. Sen. Durbin and I have different political philosophies, but I appreciate what you’ve done on this committee. You’ve been a great partner. To all my Democratic colleagues, thank you very, very much. To my Republican colleagues, thank you all very, very much,” Sen. Graham said.
Despite both parties’ appetite for going after tech platforms, however, Congress has yet to pass meaningful legislation to regulate social media companies.
Most of the action has taken place in state legislatures and in the courts, which have become battlegrounds for new policies including age minimums for social media.
A South Carolina lawmaker is suing Instagram after his son died by suicide
In his opening remarks, Sen. Graham mentioned a state lawmaker who lost his eldest son and who is now suing Meta. That lawmaker is South Carolina state House Rep. Brandon Guffey.
About two weeks after his oldest son’s funeral, Guffey says he received a private Instagram message with a laughing emoji.
Gavin Guffey, 17, had fatally shot himself in a bathroom in July 2022, and the grieving father was searching for clues on what led to his suicide.
Then Guffey and his younger son began to get messages demanding money in exchange for nude photos of his late son. Anyone on Gavin’s Instagram followers list who had the last name Guffey got similar messages, his father says.
‘You have blood on your hands,’ Sen. Graham tells tech CEOs
South Carolina Republican Sen. Lindsey Graham told tech CEOs in his opening remarks, “You have blood on your hands.”
The remark drew applause and cheers from many in the audience.
“You have a product that’s killing people … You can’t be sued, you should be!” Graham added. “It is now time to repeal Section 230.”
Section 230 is the federal law that shields websites and social media platforms from lawsuits over their content moderation decisions and over user-generated content.
Wednesday’s hearing likely to get tense
Congress has done little to rein in the industry in recent years, even as consumer groups say social media puts young users at risk of everything from depression to bullying to sexual abuse. But lawmakers now cite a growing procession of whistleblowers, consumer lawsuits and new state legislation around the country to argue this time is different.
“We’re going to work hard to hold their feet to the fire,” Connecticut Democratic Sen. Richard Blumenthal told reporters on Tuesday ahead of the hearing.
The spotlight is likely to be brightest on Meta CEO Mark Zuckerberg, as the company that controls Instagram and Facebook has faced particular pressure over the issue because of its size and popularity with younger users.
Zuckerberg is expected to tout the company’s more than 30 safety controls, according to prepared testimony released ahead of the hearing. Still, lawmakers are likely to call on Meta to do more.
Each of the CEOs is expected to defend their efforts on youth safety and to point to features and policies their companies have implemented in recent years — including, in some cases, updates introduced in the weeks leading up to the hearing.
TikTok and X, for example, both pointed to their recent efforts to combat child sexual exploitation on their platforms in statements ahead of the hearing.
Snap CEO Evan Spiegel is also set to argue that Snapchat is different from other social platforms like Instagram and TikTok, because most of its users aren’t there to consume content promoted by an algorithmically run feed, according to prepared testimony released Wednesday morning.
Lawmakers say that the longer platforms drag their heels on meaningful changes, the more harm they will cause and the more victims there will be.
‘We had no idea the damage they could cause’
One of social media’s self-described victims is Rosemarie Calvoni, a woman who is suing Meta and other social media companies over her daughter’s struggle with anorexia.
Calvoni blames Meta for sending her daughter down a rabbit hole of eating disorder content, leading to inpatient treatment and multiple relapses that have caused her to miss “most of high school.”
“When these platforms were first launched, we had no idea of the damage they could cause, not only to my daughter but our family as a unit,” Calvoni told reporters on a call Tuesday.
Facebook whistleblower Arturo Béjar went public last year, telling US lawmakers that Meta executives disregarded his survey research finding that, among other things, more than 25% of 13-to-15-year-olds reported receiving unwanted sexual advances on Instagram.
Meta has since said it offers tools for users to customize their experiences on company platforms, and that it has proposed federal legislation requiring app stores to verify the ages of their users.
In recent weeks, Meta has also begun hiding more “age-inappropriate” content in teens’ feeds and restricting teens from receiving direct messages from people they don’t follow.
New Meta docs raise questions about Zuckerberg’s culpability
Senior Meta executives tried to sound the alarm internally in 2021 about Instagram and Facebook’s handling of kids’ safety, according to new documents published Wednesday by Blumenthal and Tennessee Republican Sen. Marsha Blackburn.
The new communications include warnings to Zuckerberg that “we are not on track” and that the company faces “increased regulatory risk” due to underinvestment in user safety.
The issue would culminate in a public relations disaster and harm to Meta’s future metaverse ambitions, executives warned, if the company did not urgently hire dozens of new employees to meet the challenge.
The documents were provided to lawmakers by Meta in response to earlier requests for information.
Two years after the warnings, however, Zuckerberg has instead laid off thousands of employees, including staff dedicated to user well-being. And his company is confronting a rising global tide of litigation and regulation.
The newly released communications provide some of the most specific and direct evidence to date suggesting that Zuckerberg ignored or rebuffed efforts by senior company officials, including then-COO Sheryl Sandberg and President of Global Affairs Nick Clegg, to invest more heavily in the mental health of Meta’s users.
While aspects of Meta’s 2021 email exchanges have been previously described in a lawsuit filed by the state of Massachusetts against the company, Wednesday’s release included copies of some of the correspondence. These detail how some within Meta described the issue as a brewing risk for the company.
In one August 2021 email, Clegg — presenting a proposal developed by Meta’s head of health and other VPs — told Zuckerberg a growing number of policymakers worldwide have “publicly and privately” expressed concerns about Facebook and Instagram’s possible effects on teen mental health.
Clegg and other executives called for hiring at least 45 new employees in 2022 to deal with safety issues, some of whom would form a dedicated team focused on well-being across the company’s apps.
Ideally, Clegg said, Zuckerberg would authorize as many as 124 new hires, though he acknowledged that financial pressures could make that difficult.
After months of radio silence from Zuckerberg, Clegg tried to follow up, this time with a slimmed-down proposal that envisioned either 25 new hires or, if even that was infeasible, just seven.
“This would be the bare minimum needed to meet basic policymaker inquiries,” Clegg wrote to Zuckerberg on Nov. 10, 2021.
Soon after, Sandberg told Clegg: “I am supportive of this and will follow up.”
But she also tried to manage his expectations, adding that “as you know, we have overall budgeting issues across the board so no promises on what will happen.”
The-CNN-Wire™ & © 2024 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.