April 17, 2025
JAKARTA – The newly issued government regulation (PP) on child digital protection, which includes a minimum age limit for using social media platforms, has drawn criticism for lacking clarity and effective safeguards.
The regulation, signed by President Prabowo Subianto on March 28, mandates both public and private electronic system providers (ESPs) to ensure that their products, services and features are free from content harmful to children younger than 17. These include pornography, violence and materials deemed to cause addiction or psychological harm.
It also requires providers to conduct a self-assessment on “impacts of child personal data protection” to determine whether their products and services pose high or low risk to underage users, and submit the results to the Communications and Digital Ministry.
This self-assessment could be problematic, according to digital rights group Southeast Asia Freedom of Expression Network (SAFEnet), as ESPs could downplay their services’ risk levels to avoid additional obligations mandated by the regulation.
“Ideally, an independent party should be involved, even though the ministry will carry out a verification afterward,” said Nenden Sekar Arum, executive director of SAFEnet.
Nenden also highlighted ambiguities in the regulation’s articles on age verification, which requires collecting children’s sensitive personal data.
The new rule mandates ESPs to “immediately delete [sensitive] data after verification, unless retention is permitted under prevailing laws”. She warned that this could create gaps in data protection, as many platforms store user credentials for identification purposes.
Overlap, loophole
Wahyudi Djafar, executive director of the Institute for Community Studies and Advocacy (ELSAM), described the rule as “rigid” due to its provisions’ lack of details and technical clarity, potentially leading to challenges in its future implementation.
The child online safety rule might also overlap with the Electronic Information and Transactions (ITE) Law and the Personal Data Protection (PDP) Law, both of which regulate the protection and processing of personal information.
For example, the new regulation contains an article on processing children’s personal data, which is considered sensitive under the PDP Law. The explanation of the regulation’s Article 9 categorizes “professional or work-related information” as part of children’s personal data, a category the PDP Law does not include.
Wahyudi noted that work-related data should not be relevant to children under 17, as minors were prohibited from working under the 2002 Child Protection Law.
While the new rule prohibits digital profiling for personalized services and market development, it still contains a clause that permits digital profiling with user consent.
“This should prevent advertising that targets children, but the consent loophole creates inconsistency. This must be addressed in a forthcoming technical regulation,” Wahyudi said.
The government regulation will take full effect after the Communications and Digital Ministry issues an implementing decree, expected within two years of the PP’s signing. The time gap will allow platforms to align their products and services with the new requirements.
Not a panacea
While welcoming the new regulation as a significant step forward in ensuring online safety for minors, Diyah Puspitarini of the Indonesian Child Protection Commission (KPAI) noted the absence of provisions on law enforcement.
“It would be very beneficial if this regulation clearly outlined the role of law enforcement institutions, such as the police’s cybercrimes division,” she said.
Diyah also emphasized the urgency of improving digital literacy, which she said was lagging behind the rapid pace of technological advancement. The government could work with various parties to build the necessary critical skills in this area, she said.
She also called for better government communication about the new regulation, since many people appeared to be unaware of it.
Some ESPs, including Google and Chinese technology firm ByteDance, which owns video sharing platform TikTok, have expressed their support and readiness to comply with the child online safety regulation.
Instagram, the photo and video sharing app of United States tech giant Meta, globally launched its Teen Accounts feature early this year.
The feature, which targets users aged 13-17 and includes protective tools and parental monitoring options, is expected to be unveiled in Indonesia this month.