YouTube faces federal investigation for allegedly violating children’s privacy

The platform is also considering shifting all children's content to a separate app in an effort to better protect young viewers.

YouTube is facing a federal investigation for allegedly violating children's privacy on its platform. (Photo credit: Olly Curtis/Future Publishing via Getty Images)

The hits just keep on coming for YouTube.

On Wednesday, The Washington Post reported that the Federal Trade Commission was in the latter stages of an investigation into YouTube for allegedly improperly collecting data on children, a practice forbidden by the 1998 Children’s Online Privacy Protection Act (COPPA).

The investigation could result in a hefty fine against YouTube, and could require the platform to make major changes in order to better protect children from malicious content. While the main thrust of the FTC investigation concerns data collection, complaints filed with the FTC against YouTube also contend that the company has failed to protect children from dangerous or unnerving content.

In late 2017, for instance, it was reported that videos which had proliferated on YouTube, ostensibly aimed at kids, had been spliced or edited to make them especially disturbing or traumatic for children. One of these videos showed the cartoon character Peppa Pig being tortured by a dentist. In December 2017, it was reported that those same exploitative YouTube channels were making hundreds of thousands of dollars thanks to YouTube’s monetization policies.


A similar scandal occurred in February 2019 when a former YouTube content creator chronicled how pedophiles were using the YouTube comments section on children’s videos to communicate or share illegal pornography.

Brands like Coca-Cola and Nestlé responded by pulling their advertising until YouTube announced that it would no longer allow comment sections on channels featuring children. But a version of the problem reappeared in June, when The New York Times reported that YouTube's algorithm had inadvertently created a catalog of videos for pedophiles to watch and share.

In response to both the FTC investigation and its continued problems moderating kids’ content, YouTube is weighing making significant platform changes.

According to The Wall Street Journal, the company is debating moving all content aimed at kids into the standalone YouTube Kids app. Others within the company are advocating turning off autoplay for all children's content, a feature that can inadvertently lead young viewers down an algorithmic rabbit hole toward more disturbing videos.

But YouTube Kids isn't without its problems either. In February, dozens of videos were discovered on the app portraying suicides, school shootings, and misogynistic violence. The same month, trolls repeatedly re-uploaded a kids' video containing a spliced-in clip of a man joking about self-harm.


The federal investigation into YouTube comes at a bad time for the company, which is still dealing with the fallout from its mishandling of the Steven Crowder controversy. Speaking at the Code Conference in Arizona last week, YouTube CEO Susan Wojcicki said that the platform very much wanted to support LGBTQ content creators, but would not take down homophobic videos from far-right media figure Crowder, because doing so would open the floodgates to other content that needed to be removed.

Wojcicki also flip-flopped as to whether YouTube’s algorithms were creating a pipeline for individuals to be radicalized into extremist views.

“Our view has been that we are offering a diverse set of content to our users and that we’re providing that diverse set,” she said. “Users will choose different types of content for them to see but we’re offering a diverse set over time.”