Lawsuit: Meta allowed sex-trafficking posts on Instagram as it put profit over kids’ safety
A major new court filing alleges that Facebook and Instagram owner Meta had a strike system that allowed sex traffickers to post content related to sexual solicitation or prostitution multiple times before their accounts were suspended.

The allegation is one of many in the filing claiming Meta chose profit and user engagement over the safety and well-being of children.

The description of the purported sexual-content strike system at Instagram is contained in a new court filing by plaintiffs in an ongoing lawsuit against Meta, Google's YouTube, Snapchat owner Snap and TikTok, brought by children and parents, school districts, and states including California. They accuse the companies of intentionally addicting children to products they knew were harming them.

The filing makes the same general accusations against the four companies, namely that they targeted children and schools and misrepresented their social media products, along with company-specific claims.

Despite earning billions of dollars in annual revenue, and with its leader among the richest people in the world, Meta refused to invest in keeping kids safe, said the Friday filing in Oakland's U.S. District Court.

The lawsuit also targets products it deems harmful from Facebook, YouTube, Snapchat and TikTok. The plaintiffs seek unspecified damages and a court order requiring that the companies stop the alleged harmful conduct and warn minor users and parents that their products are addictive and dangerous.

The new filing also contends that Meta's outright lies about its products' harms prevented even the most vigilant administrators, teachers, parents and students from understanding and heading off the dangers inherent to Instagram and Facebook.

A Meta spokesperson denied the accusations. "We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions in an attempt to present a deliberately misleading picture. For more than a decade, Meta has listened to parents, researched issues
that matter most, and made real changes to protect teens, like introducing Teen Accounts with built-in protections and providing parents with controls to manage their teens' experiences," the spokesperson said.

Snap criticized the claims as misrepresenting its platform, which, unlike other social media, has no public likes or comparison metrics. "We've built safeguards, launched safety tutorials, partnered with experts, and continue to invest in features and tools that support the safety, privacy and well-being of all Snapchatters," a Snap spokesperson said in an emailed statement Tuesday.

Google and TikTok did not immediately respond to requests for comment.

The plaintiffs cited what they described as internal company communications and research reports, along with sworn depositions by current and former employees. The records are largely sealed by the court and could not be verified by this news organization.

The new filing claimed that an account-recommendation feature on Instagram recommended vast numbers of minors to adults seeking to sexually groom children, and that an internal audit found more than a million potentially inappropriate adults recommended to teen users in a single day. Facebook's recommendation feature, according to a Meta employee, was responsible for a large share of violating adult-minor connections, the filing noted.

In March, Meta CEO Mark Zuckerberg told reporters that his Menlo Park company was in fact increasing the number of people working on sensitive content, including child exploitation, but internal communications
cited in the court filing indicated that was not true, and that the company had only a fraction of the staff it needed to review child-exploitation imagery.

As of March, Instagram had no way for people to report the presence of child sexual abuse material, the filing said. When Vaishnavi Jayakumar, a former head of safety at Instagram, first started at the company, she was told that a reporting process would be too much work to build and even more work to review reports, the filing said.

Even when Meta's artificial intelligence tools identified child pornography and child-sexualization material with confidence, the company did not automatically delete it, the filing claimed. The company, which made billions of dollars in profit last year, declined to tighten enforcement for fear of false positives, but could have solved the issue by hiring more staff, the filing noted.

Jayakumar testified in a deposition that when any proposed change that might reduce user engagement went up to Zuckerberg for review, the outcome would be a decision to prioritize the existing system of engagement over other safety considerations, the filing said.

Instead of making kids' accounts private by default to protect them from adult predators, Meta dragged its feet for years before making the change, allowing literally billions of unwanted adult-minor interactions to happen, the filing claimed. Key to the delay, the filing alleged, was an internal projection that the change would cut into daily users. The company didn't apply default privacy to all teens' accounts until the end of last year, the filing said.

The lawsuit, still in an evidence-gathering discovery phase, also takes aim at Meta's approach to children's mental health and the purported damaging fallout for schools, where social media, the filing contends, has created a compromised educational environment and forced school districts to spend money and resources to address student distraction and mental health problems.

Internally, Meta researchers said
of Instagram, "We're basically pushers," and, "Teens are hooked despite how it makes them feel," the filing noted.

Meta allowed its products to infiltrate classrooms, disrupt learning environments and contribute directly to the youth mental health problem now overwhelming schools nationwide, claimed the filing, which accused Zuckerberg and Meta of lying to Congress.

Zuckerberg testified three times to Congress that he didn't give his teams goals to increase the time users spent on Meta's platforms. But several internal messages referred to goals for teens' time spent, and Zuckerberg himself wrote of growth metrics, "The most concerning of these to me is time spent," the filing said.

When Meta, in an internal review called Project Mercury, found that people who stopped using Facebook for a week reported feeling less depressed, anxious, lonely and judged socially, the company killed the project, the filing alleged. But in a U.S. Senate hearing, a company representative, who was not identified in the filing, was asked whether Facebook could tell if increased use of its platform by teen girls was connected to increased signs of anxiety, and responded, "No," the filing claimed.

Meta stated publicly that at most half a percent of users were exposed to suicide and self-harm material, but its own research found the number was far higher, the filing said.

Brian Boland, who rose over his years at Meta to a vice-president position before leaving the company, testified in a deposition: "My feeling then and my feeling now is that they don't meaningfully care about user safety."

When a Meta employee criticized the company's move, intended to protect children's mental health, to hide likes on posts, but only for users who opted in, a member of Meta's growth organization responded, "It's a social comparison app," adding, "[expletive] get used to it," the filing stated.