
Meta had 'historical reluctance' to protect kids on Instagram, say court docs in New Mexico lawsuit

Newly unredacted documents from New Mexico's lawsuit against Meta underscore the company's "historical reluctance" to keep children safe on its platforms, the complaint says.

Recently unredacted complaint shows Meta aware of issues but dragged feet on addressing concerns

Newly unredacted documents from New Mexico's lawsuit against Meta released Wednesday underscore the Facebook and Instagram parent company's "historical reluctance" to keep children safe on its platforms, according to the complaint. (Godofredo A. Vásquez)


New Mexico's Attorney General Raúl Torrez sued Facebook and Instagram owner Meta in December, saying the company failed to protect young users from exposure to child sexual abuse material and allowed adults to solicit explicit imagery from them.

In the passages freshly unredacted from the lawsuit Wednesday, internal employee messages and presentations from 2020 and 2021 show Meta was aware of issues such as adult strangers being able to contact children on Instagram, the sexualization of minors on that platform, and the dangers of its "people you may know" feature that recommends connections between adults and children.

But Meta dragged its feet when it came to addressing the issues, the passages show. Instagram, for instance, did not begin restricting adults' ability to message minors until 2021.

One internal document referenced in the lawsuit shows Meta "scrambling in 2020 to address an Apple executive whose 12-year-old was solicited on the platform, noting 'this is the kind of thing that pisses Apple off to the extent of threatening to remove us from the App Store.'"

According to the complaint, Meta "knew that adults soliciting minors was a problem on the platform, and was willing to treat it as an urgent problem when it had to."

WATCH | How parents can have conversations with children about sextortion:

How can parents talk to their kids about online exploitation?

Tiana Sharifi, CEO of the Exploitation Education Institute, shares tips with BC Today host Michelle Eliot on how parents can approach conversations with their children about sextortion.

Internal document detailed potential harm

In a July 2020 document titled "Child Safety - State of Play (7/20)," Meta listed "immediate product vulnerabilities" that could harm children, including the difficulty of reporting disappearing videos, and confirmed that safeguards available on Facebook were not always present on Instagram.

At the time, Meta's reasoning was that it did not want to block parents and older relatives on Facebook from reaching out to their younger relatives, according to the complaint.

The report's author called the reasoning "less than compelling" and said Meta sacrificed children's safety for a "big growth bet."

In March 2021, though, Instagram announced it was restricting people over 19 from messaging minors.

In a July 2020 internal chat, meanwhile, one employee asked, "What specifically are we doing for child grooming (something I just heard about that is happening a lot on TikTok)?"

The response from another employee was, "Somewhere between zero and negligible. Child safety is an explicit non-goal this half" (likely meaning half-year), according to the lawsuit.

In a statement, Meta said it wants teens to have safe, age-appropriate experiences online and has spent "a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online. The complaint mischaracterizes our work using selective quotes and cherry-picked documents."

LISTEN | Why young brains are more vulnerable to social media than adult brains:
Thirty-three U.S. states are suing Meta, the parent company of Facebook and Instagram, claiming that it knowingly designed its products to hook children on social media. Yifeng Wei is an assistant professor in the department of psychiatry at the University of Alberta who specializes in child and teen mental health. She joins Edmonton AM to talk about why young brains are more vulnerable to social media than adult brains.

Inappropriate comments, sexual advances

Instagram also failed to address the issue of inappropriate comments under posts by minors, the complaint says.

That's something former Meta engineering director Arturo Béjar recently testified about. Béjar, known for his expertise on curbing online harassment, recounted his own daughter's troubling experiences with Instagram.

"I appear before you today as a dad with firsthand experience of a child who received unwanted sexual advances on Instagram," he told a panel of U.S. senators in November.

"She and her friends began having awful experiences, including repeated unwanted sexual advances, harassment."

A March 2021 child safety presentation noted that Meta is "underinvested in minor sexualization on (Instagram), notable on sexualized comments on content posted by minors. Not only is this a terrible experience for creators and bystanders, it's also a vector for bad actors to identify and connect with one another."

The documents underscore the social media giant's "historical reluctance to institute appropriate safeguards on Instagram," the lawsuit says, even when those safeguards were available on Facebook.

WATCH | Sextortion through messaging apps is on the rise in Canada:

Why sextortion on chat apps is not a parenting problem

Sextortion through messaging apps is on the rise in Canada, with young males especially at risk. CBC's Katie Pedersen investigates what's happening online and why experts say it's not a parenting problem, but a technology one.

Meta said it uses sophisticated technology, hires child safety experts, reports content to the National Center for Missing and Exploited Children, and shares information and tools with other companies and law enforcement, including state attorneys general, to help root out predators.

Meta plans to limit exposure to self-harm topics

Meta, which is based in Menlo Park, Calif., has been updating its safeguards and tools for younger users as lawmakers pressure it on child safety, though critics say it has not done enough.

Last week, the company announced it will start hiding inappropriate content from teenagers' accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders.

New Mexico's complaint follows the lawsuit filed in October by 33 states that claim Meta is harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.

"For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and sexual exploitation," Torrez said in a statement.

"While the company continues to downplay the illegal and harmful activity children are exposed to on its platforms, Meta's internal data and presentations show the problem is severe and pervasive."

Meta CEO Mark Zuckerberg, along with the CEOs of Snap, Discord, TikTok and X, formerly Twitter, are scheduled to testify before the U.S. Senate on child safety at the end of January.

LISTEN | Dozens of states launch lawsuits against Meta:
Dozens of states have launched lawsuits against Meta, claiming it has made its sites Facebook and Instagram too addictive for young people. Tech columnist Mohit Rajhans takes a look at the case.