Facebook and Instagram accused of creating a ‘marketplace’ for child predators in new lawsuit

The New Mexico state attorney general claims Meta’s platforms promoted teen accounts to child predators.

Illustration by Nick Barclay / The Verge

Meta and its CEO, Mark Zuckerberg, allowed Facebook and Instagram to become a “marketplace for predators in search of children,” a new lawsuit from the New Mexico attorney general alleges, as first reported by The Wall Street Journal. The lawsuit, filed in state court on Tuesday, also claims Meta’s algorithms recommend sexual content to children.

As outlined in the complaint, the New Mexico attorney general’s office conducted an investigation that involved creating test profiles on Facebook and Instagram that appeared to belong to teenagers or preteens. The office found that each decoy received inappropriate recommendations, including an account that openly posted adult pornography, and that the decoys themselves attracted predators.

One test account claiming to be a 13-year-old girl garnered over 6,700 followers, most of whom were adult males. Some of them asked her to contact them privately on WhatsApp, Telegram, and Kik, or meet offline. The complaint says the fake 13-year-old’s account also received messages “filled with pictures and videos of genitalia, including exposed penises, which she received at least 3-4 times per week.” The account attempted to report many posts and accounts, but Meta “advised that it found no violation of Community Standards,” according to the lawsuit.

“Meta’s platforms Facebook and Instagram are a breeding ground for predators who target children for human trafficking, the distribution of sexual images, grooming, and solicitation,” the lawsuit states. “Teens and preteens can easily register for unrestricted accounts because of a lack of age verification. When they do, Meta directs harmful and inappropriate material at them.”

The Journal has published a series of reports over the past several months that found disturbing patterns on Facebook and Instagram. Most recently, the outlet published an investigation into how Facebook appears to enable and promote groups dedicated to sharing child sexual abuse material. Meta responded by expanding the child safety-related terms, phrases, and emoji it uses to find predatory networks. It also stopped recommending groups with members that “exhibit potentially suspicious behavior.”

“We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators,” a Meta spokesperson told the Journal. Meta didn’t immediately respond to The Verge’s request for comment.

New Mexico Attorney General Raúl Torrez claims Meta is downplaying the dangers children face on the platform, adding that the company continues to “prioritize engagement and ad revenue over the safety of the most vulnerable members of our society.”

Meta is currently facing dozens of lawsuits that allege its platforms harm the mental health of children. Zuckerberg and the CEOs of several other major social platforms are also set to testify before the US Senate in January “about their failure to protect children online.”

Update December 6th, 11:12AM ET: Added additional details from the complaint.
