Meta deflects child harm scrutiny toward Apple and Google app stores
As the US Senate scrutinized the company’s lapses in protecting children, Meta urged legislation mandating parental consent for app downloads
On Wednesday, Meta urged US lawmakers to regulate Google’s and Apple’s app stores in the name of child protection. The plea came as the Senate opened an investigation into Meta’s own shortcomings in safeguarding children on its platforms.
In a blog post titled “Parenting in a Digital World Is Hard. Congress Can Make It Easier,” Antigone Davis, Meta’s global head of safety, called for federal legislation that would require app stores to notify parents and obtain their approval whenever a child aged 13 to 16 downloads an app. Children under 13 are already barred from creating accounts and downloading apps without parental consent.
While Meta’s blog post avoids naming Google or Apple explicitly, the two companies operate the world’s largest smartphone app stores: Google’s Play Store for Android and Apple’s App Store for iOS. Any legislation regulating children’s app downloads would therefore affect both platforms.
Davis argued there is “a better way” to govern smartphone and internet use than laws requiring parental approval for a child to open a social media account. Utah, for example, enacted a requirement in March that parents of anyone under 18 approve their use of apps such as TikTok, Instagram and Facebook, a measure the state’s governor, Spencer Cox, said was intended to protect young people’s mental health.
On the same day as Davis’s proposal, the Senate judiciary committee sent a letter to Mark Zuckerberg, Meta’s CEO, urging him to “provide documents related to senior executives’ knowledge of the mental and physical health harms associated with its platforms, including Facebook and Instagram.” The letter sets a deadline of November 30 for the documents to be submitted. Neither Google nor Apple had commented by the time of publication.
After this story was published, Meta said in a statement: “We’ve always said that we support internet regulation, particularly when it comes to young people. That said, we’re concerned about what is amounting to a patchwork of divergent laws across a variety of US states. Laws that hold different apps to different standards in different states will leave teens with inconsistent online experiences.” The company said it has endorsed legislation setting “clear industry standards” for parental supervision and age verification since 2021.
In response, the National Society for the Prevention of Cruelty to Children criticized Meta, saying: “It is pleasing to see that regulation of social media is already focusing the minds of top tech executives on protecting children. But Meta has sat on its hands while knowing the harm their platforms can cause children and is now seeking to pass the buck when they should be getting their own house in order.”
The Senate committee’s inquiry follows testimony last week from a former senior Meta employee about the harm Instagram can inflict on children, including his own daughter. The former Instagram engineering director, Arturo Bejar, told the panel of senators that Meta’s leaders dismissed his concerns when he raised them internally.
“I appear before you today as a dad with first-hand experience of a child who received unwanted sexual advances on Instagram,” Bejar told the senators. The same issue featured in the testimony of another Meta whistleblower, Frances Haugen, who leaked internal documents to the US government detailing how company executives ignored warnings about social media’s harmful effects on teenage girls. Haugen testified before Congress in October 2021.