TikTok Unveils Smarter Age Detection Tech Amid Global Child Safety Pressure

As more governments explore stricter age-based restrictions for social media use, TikTok has provided a detailed look at its evolving systems designed to detect underage users and minimize teen exposure to harmful content.
In a newly published overview, the company outlined how it’s now leveraging AI-driven age detection, alongside traditional verification methods, to ensure compliance and strengthen youth protection across its platform.
As TikTok explained:
“In most parts of the world, the minimum age to use TikTok is 13. We use a multi-layered approach to confirm someone’s age or detect when they may not actually be the age they say they are.”
These safeguards begin at the most basic level — requiring users to enter their date of birth when creating an account.
“If someone fails to meet our minimum age, we suspend their ability to immediately re-create an account using a different date of birth.”
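TikTok doesn't publish implementation details, but the gate it describes boils down to two checks: compute the age implied by the self-declared date of birth, and refuse an immediate retry with a different birthday. The minimal Python sketch below rests on those assumptions; the function names and the device-keyed block list are hypothetical, not TikTok's actual mechanism.

```python
from datetime import date

MINIMUM_AGE = 13  # TikTok's stated minimum in most regions

def age_on(birth_date: date, today: date) -> int:
    """Age in whole years as of `today`."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1  # birthday hasn't happened yet this year
    return years

# Hypothetical block list: once a signup fails the gate, remember the device
# so an immediate retry with a different date of birth is also refused.
blocked_devices: set[str] = set()

def attempt_signup(device_id: str, birth_date: date, today: date | None = None) -> bool:
    """Reject signups below the minimum age and block immediate retries."""
    today = today or date.today()
    if device_id in blocked_devices:
        return False
    if age_on(birth_date, today) < MINIMUM_AGE:
        blocked_devices.add(device_id)
        return False
    return True

# An under-13 date of birth fails, and so does the retry with an older one.
print(attempt_signup("device-1", date(2013, 5, 1), today=date(2026, 2, 1)))  # False
print(attempt_signup("device-1", date(2000, 5, 1), today=date(2026, 2, 1)))  # False
```

In practice the retry block would presumably key on stronger signals than a single device identifier and would be time-limited; the quote above only commits to stopping immediate re-creation.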
But TikTok is now expanding its approach through AI-based age assurance tools, which have already shown promising results.
“We’ve been piloting new AI technologies in the U.K. over the last year and found they’ve strengthened our efforts to remove thousands of additional accounts under 13. We’re planning to roll this technology out more widely, including in the EU, and are currently discussing it with our European privacy regulator.”
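TikTok hasn't described how these models work, but functionally the pilot implies a system that turns account signals into an estimated likelihood that a user is under 13, then decides whether to enforce, escalate, or do nothing. The sketch below is an assumption about that routing step only; the `under13_score` field, the thresholds, and the action names are all illustrative.

```python
from dataclasses import dataclass

# Thresholds are illustrative assumptions, not TikTok's actual values.
ENFORCE_THRESHOLD = 0.95  # act automatically only on very confident scores
REVIEW_THRESHOLD = 0.60   # anything in between goes to human review

@dataclass
class AccountSignals:
    account_id: str
    under13_score: float  # output of a hypothetical upstream age-estimation model

def route_account(signals: AccountSignals) -> str:
    """Route an account based on its estimated likelihood of being under 13."""
    if signals.under13_score >= ENFORCE_THRESHOLD:
        return "enforce"       # suspend, with an appeal path for mistakes
    if signals.under13_score >= REVIEW_THRESHOLD:
        return "human_review"  # escalate to the specialist age-assurance team
    return "no_action"

print(route_account(AccountSignals("acct_1", under13_score=0.97)))  # enforce
print(route_account(AccountSignals("acct_2", under13_score=0.70)))  # human_review
```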
In addition to automation, TikTok’s human moderation teams play a critical role in identifying potentially underage users.
“If [moderators are] reviewing content for another reason but suspect an account belongs to an underage user, they can send it to our specialized review team with deeper expertise on age assurance. Since judging age can be complex, our teams are instructed to err on the side of caution when making enforcement decisions. When in doubt, we will remove an account we suspect may be under 13. We also allow anyone to report an account they believe belongs to someone under 13. You don’t even need a TikTok account to do this.”
This layered approach—combined with restrictions such as disabling DMs for users under 16 and enforcing default screen time limits—represents TikTok’s broader push to create a safer digital environment for minors.
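Those restrictions can be read as age-banded defaults applied once an account's age is trusted. The sketch below illustrates that idea rather than TikTok's actual configuration; the 60-minute value mirrors the company's publicly announced default daily screen-time limit for under-18 accounts, while the structure and names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class TeenDefaults:
    direct_messages_enabled: bool
    daily_screen_time_limit_minutes: int | None

def defaults_for_age(age: int) -> TeenDefaults:
    """Illustrative age-banded defaults: DMs off under 16, a default daily
    screen-time limit for minors, and no default limit for adults."""
    if age < 16:
        return TeenDefaults(direct_messages_enabled=False,
                            daily_screen_time_limit_minutes=60)
    if age < 18:
        return TeenDefaults(direct_messages_enabled=True,
                            daily_screen_time_limit_minutes=60)
    return TeenDefaults(direct_messages_enabled=True,
                        daily_screen_time_limit_minutes=None)

print(defaults_for_age(14))  # DMs disabled, 60-minute default limit
```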
The company reports that these measures lead to the removal of around six million underage accounts globally every month.
Such efforts are becoming increasingly crucial, as nations around the world move toward more formalized regulations governing young users’ access to social media.
In Europe, France, Greece, and Denmark have all expressed support for restricting access for users under 15, Spain has proposed a minimum age of 16, and Norway is currently drafting its own rules. Elsewhere, Australia, New Zealand, and Papua New Guinea are developing similar legislation.
To be fair, most major platforms already require users to be at least 13 or 14 years old. The difference now lies in enforcement: governments want platforms to take greater responsibility for age verification, and they are increasingly attaching steep penalties to those that fail to comply.
Yet, creating a standardized, legally enforceable framework for age verification remains an open challenge. Each platform currently uses its own systems, leading to inconsistent outcomes and uneven accountability. Smaller players, in particular, struggle to match the resources and compliance capabilities of tech giants like TikTok or Meta.
TikTok recognizes this imbalance and has been working to promote industry-wide collaboration.
“Since its first session last year, TikTok has engaged in the Global Multistakeholder Dialogue on Age Assurance convened by the Centre for Information Policy Leadership (CIPL) and WeProtect Global Alliance. This dialogue aims to explore the complex challenges of age assurance and minor safety, whilst driving consensus across the sector. To that end, we have already started to explore whether the European Commission’s planned age verification app could be an effective additional tool for us. However, for any solution to be truly effective, it’s crucial to have a level playing field in which peer platforms are subject to the same regulatory requirements and are held to the same standards.”
That, ultimately, is the core of the issue: uniformity and accountability. If governments are to impose stricter rules on digital age limits, those regulations must apply evenly to all platforms. Otherwise, enforcement risks becoming fragmented, inconsistent, and unfairly burdensome for smaller companies.
TikTok’s latest transparency update serves both as a progress report and a subtle call for collaboration—because for age assurance to truly work, the entire industry will need to move in lockstep.