April 22, 2025
JAKARTA – United States tech giant Meta and Chinese-owned video sharing app TikTok have reaffirmed their commitment to cooperate with the government in implementing child online safety measures, including rolling out accounts specifically for adolescent users, following last month’s enactment of a regulation to protect children in cyberspace.
Signed by President Prabowo Subianto on March 28, the government regulation on child protection for electronic system providers (ESPs) requires platforms to implement more stringent age verification processes to prevent users under 17 from accessing harmful content.
The new regulation mirrors Australia’s Online Safety Amendment, passed in November 2024, which bars children under 16 from holding social media accounts.
Meta executives told journalists on Thursday that they were ready to work closely with the Indonesian government to ensure the newly signed regulation would support safer online experiences for teenagers.
Antigone Davis, Meta vice president and global head of safety, highlighted the social networking company’s proactive efforts to protect minors online through its Teen Accounts feature, rolled out across its platforms.
The feature comes with default protections, including accounts set to private, blocked messages from unknown users, strict limits on sensitive content, screen-break reminders and muted notifications during bedtime hours.
Meta is now expanding Teen Accounts to Facebook and Messenger following the feature’s successful debut on Instagram last year, adding further built-in protections, including a ban on users under 16 going live on Instagram without parental consent.
“Teen Accounts are designed to create that low-risk environment for a teen where the right safeguards are in place,” Davis said. “Our goal is to ensure teens can access the digital tools they love, [such as] education, community, career development, without sacrificing safety.”
According to Davis, Meta currently uses a combination of self-reported age and its own age estimation technology, which draws on signals like behavior and profile cues to identify underage users.
Still, the company believes the most effective solution would be a unified age verification system at the app store level, which would allow for consistent protection across all mobile apps.
“Instead of making parents verify age across 40 different apps, you verify it once on the phone and it can be shared with all apps,” she added.
This approach would not only ease the burden on families, Davis said, but also ensure that even smaller developers could comply with emerging online safety laws.
TikTok, a subsidiary of China’s ByteDance, has meanwhile introduced its own safety measures for teenage users.
“Prior to the regulation, TikTok had already implemented child protection and monitoring programs,” Anggini Setiawan, TikTok Indonesia’s head of communications, said on Monday.
Among these measures, which apply globally, is the Family Pairing feature that allows parents and guardians to monitor their children’s activity on the platform, including screen time, content and privacy settings.
TikTok’s age restriction policy sets a minimum age of 14 and requires new users to enter their date of birth during registration; the stated age determines access to features such as direct messages (DMs), live streaming and recommended content on the “For You” page.
“If a user attempts to change their birth date, the system can detect inconsistencies based on their behavior. Accounts found to belong to those under the permitted age will be blocked,” Anggini told The Jakarta Post.
With its existing teen safety policies, TikTok says it is ready to comply with the regulation, pending detailed guidelines on the specific adjustments needed for alignment.
The child online safety regulation takes full effect once the Communications and Digital Ministry issues a ministerial decree on technical aspects of its implementation, which is expected within two years after the regulation’s signing date.
Equality, fairness
Rafael Frankel, Meta’s public policy director for Southeast Asia, urged more open and inclusive cooperation in implementing the new regulation, criticizing its rushed rollout and the lack of public consultation during drafting. “There needs to be an improvement in the policy process,” he said.
Frankel also emphasized the importance of clearly defined guidance in implementing the regulation. “It has to be objective and fair and equally distributed across the digital ecosystem. And it has to be a transparent process, which, to be frank, we haven’t had,” he said.
Echoing this view, Davis stressed the importance of “clear and fair risk classifications”, pointing to a requirement for social media platforms to rate their products and services as either high or low risk to minors through a self-assessment.
“There’ll be apps that are considered low risk and apps that are considered high risk,” Frankel noted.
“It’s really important that the designation is done in an objective and fair way, that it not only takes into consideration the features of your app but [also] the mitigations that you put in place.”