Ads for Match Group and Walmart ran alongside disturbing content on Meta’s platforms, court documents claim
By Clare Duffy, CNN
New York (CNN) — Meta advertisers, including dating app company Match Group and Walmart, have raised concerns with the company about their ads appearing alongside disturbing sexual content, New Mexico Attorney General Raúl Torrez alleged in an updated complaint in his lawsuit against the tech giant.
The updated complaint, filed Wednesday in New Mexico district court, comes after Torrez sued Meta and CEO Mark Zuckerberg last month, accusing the company of creating a “breeding ground” for child predators. The lawsuit alleges that Meta exposes young users to sexual content and makes it possible for adult users they don’t know to contact them, putting children at risk of abuse or exploitation.
Wednesday’s filing alleges that Match Group cut its advertising spending on Meta’s platforms over its concerns. The claims that advertisers have raised such concerns to Meta — and that Match cut its spending — mark a potentially serious risk to the tech giant’s core advertising business.
“We don’t want this kind of content on our platforms and brands don’t want their ads to appear next to it,” Meta said in a statement from spokesperson Andy Stone. “We continue to invest aggressively to stop it — and report every quarter on the prevalence of such content, which remains very low. Our systems are effective at reducing violating content, and we’ve invested billions in safety, security and brand suitability solutions.”
Walmart said in a statement to CNN that it takes “brand safety issues extremely seriously, and protecting our customers and communities will always be a top priority.”
Match Group did not immediately respond to a request for comment.
Wednesday’s complaint cites an alleged email exchange between Match and Meta late last year in which the dating company asked how Meta prevented its ads from running next to inappropriate content. Meta tells advertisers that its technology prevents ads from appearing next to content that is “excessively controversial or offensive, such as full nudity, excessive violence, terrorist acts, or misinformation,” and touts additional “brand suitability controls” available to advertisers, the complaint states.
However, Match informed Meta that its ads had run alongside a series of Instagram Reels videos showing “young girls, including a ‘[y]oung girl provocatively dressed, straddling and caressing a Harley Davidson-style motorcycle,’” as well as next to a Facebook group “showing films of women being murdered,” according to the complaint.
Meta responded saying it had removed some of the flagged content, including the group, the complaint states.
Match Group’s CEO also allegedly reached out directly to Zuckerberg to note that “Meta is placing ads adjacent to offensive, obscene – and potentially illegal – content, including sexualization of minors and gender-based violence,” and said that Match had cut its advertising spending on Meta’s platforms. Zuckerberg has not yet responded to the message, the complaint states.
Walmart also repeatedly raised concerns to Meta last year about where its ads were being shown, including after a November report by The Wall Street Journal indicating that an advertisement for the retailer was shown after an explicit video of a woman, the complaint alleges.
“It is extremely disappointing that this type of content exists on Meta, and it is unacceptable for Walmart’s brand to appear anywhere near it,” the company said in an email to Meta, according to the complaint.
Meta moved to dismiss the New Mexico lawsuit last month, arguing the complaint failed to state a claim. The tech giant also cited its protections under a law known as Section 230, which shields tech companies from being held accountable for content that users post on their sites.
Growing scrutiny of youth safety
The New Mexico lawsuit is part of growing pressure on Meta to address alleged harm to young users of Facebook and Instagram.
The social media giant has been sued by various school districts and state attorneys general in lawsuits related to youth mental health, child safety and privacy. Former Facebook employee-turned-whistleblower Arturo Bejar also told a Senate subcommittee in November that Meta’s top executives, including Zuckerberg, ignored warnings for years about harm to teens on its platforms.
Meta has strongly and repeatedly denied claims that its platforms put children at risk. The company on Wednesday announced new youth safety efforts, including content restrictions and new blocked search terms, that add to its existing slate of more than 30 well-being and parental oversight tools.
Still, Torrez claims the company is not doing enough to protect young users.
“Mr. Zuckerberg and Meta are refusing to be honest and transparent about what is taking place on Meta’s platforms,” Torrez said in a statement to CNN Wednesday. “New Mexico’s complaint alleges that, for years, they have misled the public about children’s inappropriate and dangerous experiences on Facebook and Instagram.”
Allegations of child predators using Facebook
Wednesday’s complaint also alleges that Meta executives have for years known that child predators were using Facebook to groom victims and failed to act.
It claims that as far back as 2018, Facebook employees raised alarms to senior executives that the platform was aiding child predators in connecting with children and suggested solutions to address the issue that were rejected.
In particular, the “People You May Know” feature was responsible for suggesting to adult users the accounts of children that matched search terms they’d previously used, such as “children in their geographic area,” the complaint alleges. In a Wall Street Journal report about the issue in December, Meta’s Stone told the newspaper that the employee concerns had prompted new work at the company to avoid recommending teen accounts to suspicious adults.
The complaint also alleges that “Forums on the dark web devoted to child sex abuse … show open conversations about the role of Instagram’s and Facebook’s algorithms in delivering children to predators.”
In one instance viewed by the attorney general’s investigators, a participant in the chat forum discussed having been followed by a 10-year-old girl on Instagram and said they were considering sending her a direct message.
“Do you think I’m getting in trouble,” the participant asked on the forum, according to the complaint. Another participant responded, “you see how the baby reacts! On Instagram it is very easy to just ask for a couple of intimate photos and then send your own and if you like each other go to a meeting in a cozy place,” the complaint states.
The new claims come as Zuckerberg is set to testify later this month at a Senate subcommittee hearing on youth safety online, alongside representatives from TikTok, Snapchat, Discord and X.
The-CNN-Wire
™ & © 2024 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.