The TikTok COPPA fine has dramatically upped the stakes in kids' digital privacy
The FTC dramatically upped the ante on COPPA enforcement this week with a record-setting $5.7m fine against the app formerly known as Musical.ly (now TikTok). According to the settlement, TikTok breached COPPA by knowingly collecting personal information from children without first obtaining parental consent, as required by the law.
It represents a direct challenge to social media platforms that have until now relied on their Terms of Service to deny they have a kids’ audience.
This case also shows us a more assertive FTC, with important consequences for general-audience publishers with unacknowledged kids' audiences.
The settlement makes for sobering reading. It describes how TikTok:
- collected personal information from kids, including email address, phone number, first and last name, a short biography and a profile picture;
- made profiles public by default, including children's, and allowed those children to be direct-messaged and geo-located;
- ignored requests from parents to delete child accounts; and,
- knew children were using the app, but failed to seek parental consent.
The ruling requires TikTok to pay the highest-ever COPPA fine of $5.7m and to delete any personal information collected from under-13s. Going forward, TikTok will be considered a ‘mixed audience’ app, requiring it to age-gate its users and to protect those under 13. (The company announced it would block under-13 users from the main app as of yesterday.)
This ruling critically moves the COPPA goalposts for digital services that have large kids’ audiences but have ignored them. Digital platforms and services can’t just state in their Terms of Service that they’re for 13+ only, especially if kids are highly visible. The FTC showed how TikTok had ‘actual knowledge’ by virtue of the videos featuring children, letters received from parents, its own user safety guides addressing children, and even direct communication with child account holders.
Additionally, two FTC commissioners released an extraordinary statement alongside the ruling, threatening executives at companies that breach COPPA with personal prosecution:
[…] individuals at large companies have often avoided scrutiny. We should move away from this approach. [The] Commission should identify and investigate those individuals who made or ratified that decision and evaluate whether to charge them. […] we should prioritize uncovering the role of corporate officers and directors and hold accountable everyone who broke the law.
This is a material shift in how seriously the FTC is taking breaches of children’s digital privacy. If you are still using solutions designed for adults (which are most likely capturing personal data), the clock is ticking. If you are a digital service provider, you need to be honest about who your audience is and identify and protect kids through age-gating or clear sign-posting (a minimal sketch of a neutral age gate follows below).
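To make the age-gating point concrete, here is a minimal sketch of a neutral age screen in TypeScript. Everything in it (the `SignupContext` shape, `routeUser`, the experience labels) is hypothetical and for illustration only; it is not drawn from TikTok's implementation or any specific kidtech SDK.

```typescript
// A minimal, illustrative neutral age gate. All names below are hypothetical
// and are not taken from any real SDK or from TikTok's implementation.

interface SignupContext {
  birthYear: number;
  birthMonth: number; // 1-12
}

// COPPA's parental-consent requirements apply to children under 13.
const COPPA_AGE_THRESHOLD = 13;

function ageInYears(ctx: SignupContext, now: Date): number {
  const years = now.getFullYear() - ctx.birthYear;
  // Subtract a year if this year's birthday month hasn't arrived yet.
  return now.getMonth() + 1 >= ctx.birthMonth ? years : years - 1;
}

type Experience = "full" | "under13";

function routeUser(ctx: SignupContext, now: Date = new Date()): Experience {
  if (ageInYears(ctx, now) < COPPA_AGE_THRESHOLD) {
    // Under-13 users are routed to a restricted experience: no personal data
    // collection, no public profile, no direct messaging, until verifiable
    // parental consent is obtained.
    return "under13";
  }
  return "full";
}

// Example: a user born in March 2010, signing up in February 2019, is routed
// to the under-13 experience.
console.log(routeUser({ birthYear: 2010, birthMonth: 3 }, new Date(2019, 1, 27))); // "under13"
```

Two caveats: FTC guidance expects the age screen itself to be neutral, i.e. it should not nudge children to misstate their age (for example, by signalling that only 13+ users get the full experience). And a production gate would collect the full date of birth and handle edge cases properly; this sketch uses only year and month to keep the routing logic easy to read.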
Regardless of which part of the ecosystem you’re operating in (service, advertiser, publisher), kidtech is no longer optional when you’re engaging with children online.