TikTok is under scrutiny for its handling of users under the age of 13, as evidence from a Guardian investigation suggests that moderators have been instructed to allow such users to remain on the platform if they claim parental oversight.
The platform’s minimum age requirement is 13.
In one instance, a self-declared 12-year-old was permitted to stay on TikTok because their account bio said it was managed by a parent. The decision came via internal communications, in which a quality analyst responded to a moderator’s query about whether to ban the underage user.
Moderators have allegedly been advised that accounts can remain on the platform if a parent is visible in the background of a video featuring an apparently underage user, or if the account bio indicates parental management.
Suspected underage accounts are sent to an “underage” queue for further moderation, where moderators decide whether to ban or approve the account.
An anonymous TikTok staff member expressed concerns, stating that it is “incredibly easy to avoid getting banned for being underage.”
TikTok, however, denies the allegations, asserting that under-13s are not allowed on the platform, regardless of claims of parental oversight.
TikTok emphasizes its commitment to providing a safe experience for users under 18, requiring users to be at least 13 years old to have an account.
The platform says it removed more than 18 million suspected underage accounts globally between April and June of this year.
The platform has faced previous fines from regulators, including a €345 million fine from the Irish data watchdog for violating EU data law related to children’s accounts.
The UK data regulator also fined TikTok £12.7 million for allegedly misusing data of children under 13.
The Guardian’s investigation suggests that potentially underage accounts have received preferential treatment, with some tagged as “top creators.”
In one case, a child who appeared underage carried a “top creator” tag; another apparently underage account, with more than 16,000 followers, received the same label.
TikTok’s approach to age limits in the UK is governed by the children’s code, which mandates parental consent for processing the personal data of children under 13.
The UK’s newly introduced Online Safety Act requires platforms to set out measures for consistently preventing underage access.
Experts argue that TikTok, which is regulated by Ofcom under the Online Safety Act, must enforce its terms consistently.
The EU’s Digital Services Act adds further protections for children and requires age verification measures from major platforms.
TikTok maintains that its 6,000 European moderators apply community guidelines uniformly to all content.
TikTok Faces Scrutiny Over Age Verification Practices For Under-13 Users