A new front has opened in the ongoing battle between TikTok and the U.S. government, and this time it’s about children’s online privacy.
The Justice Department sued TikTok on Friday, alleging that the social media platform violated the Children’s Online Privacy Protection Act (COPPA) by letting children create accounts and interact with adults, and by collecting and retaining their data without their guardians’ consent. COPPA, passed in 1998, requires social media platforms and other websites to get parental consent before collecting personal information from children under 13. In response, most social media platforms — including Facebook, Instagram, and Snapchat — simply don’t allow anyone under 13 to make an account. TikTok, on the other hand, offers a view-only experience for children under 13.
“This action is necessary to prevent the defendants, who are repeat offenders and operate on a massive scale, from collecting and using young children’s private information without any parental consent or control,” Brian M. Boynton, the head of the Justice Department’s Civil Division, told the Associated Press in a statement.
The lawsuit follows a 2019 FTC case against Musical.ly, the app that would later become TikTok, over COPPA violations, the AP reported; Musical.ly paid $5.7 million at the time to resolve those allegations.
According to NBC News, FTC Chair Lina Khan said in a statement, “TikTok knowingly and repeatedly violated kids’ privacy, threatening the safety of millions of children across the country. The FTC will continue to use the full scope of its authorities to protect children online, especially as firms deploy increasingly sophisticated digital tools to surveil kids and profit from their data.”
In response to these allegations, a TikTok spokesperson emailed Mashable the following statement:
“We disagree with these allegations, many of which relate to past events and practices that are factually inaccurate or have been addressed. We are proud of our efforts to protect children, and we will continue to update and improve the platform. To that end, we offer age-appropriate experiences with stringent safeguards, proactively remove suspected underage users, and have voluntarily launched features such as default screentime limits, Family Pairing, and additional privacy protections for minors.”