October 15, 2025
KATHMANDU – A new report has revealed that one in every eight children in South Asia has experienced sexual assault before turning 18.
The findings come from the ‘Into the Light Index on Global Child Sexual Exploitation and Abuse’ (ITL Index 2025), which examined both in-person and online abuse across Western Europe and South Asia, the two regions its 2024 report identified as the most affected.
The report, released by Childlight, a global child safety data institute, shows a grim picture of both in-person and technology-facilitated sexual exploitation of children in the region. It also highlights systemic failures in protection, data collection, and response.
According to the research, roughly one in eight children in the region experiences rape or sexual assault before the age of 18. The report, which includes representative survey data from India, Nepal, and Sri Lanka, found that 12.5 percent of children (14.5 percent of girls and 11.5 percent of boys) reported such abuse. In real terms, that translates to an estimated 54 million children affected across these three countries alone.
The report cautions that these figures likely represent a conservative estimate, noting that the true prevalence could be significantly higher because of underreporting and gaps in available data.
It also sheds light on the scale of online child sexual exploitation and abuse facilitated by technology. Technology-facilitated abuse includes online grooming, coerced or non-consensual sharing of sexual images, forced exposure to pornography, and live-streamed exploitation. These forms of harm are increasingly difficult to detect as tech companies expand encrypted services without corresponding safeguards.
The findings reveal a 1,325 percent surge in AI-generated child sexual abuse material (CSAM) over the past year, including ‘deepfake’ content that superimposes real children’s faces onto sexualised images.
Child sexual abuse material (CSAM) refers to images, videos, or other visual content that depict the sexual exploitation or abuse of children.
Within South Asia, India, Bangladesh, and Pakistan host the overwhelming majority of CSAM detected online, according to the global monitoring bodies the National Center for Missing and Exploited Children (NCMEC) and INHOPE. Collectively, these three countries account for nearly all reports originating from the region.
In 2024 alone, NCMEC recorded 2,252,986 reports of CSAM linked to India, 1,112,861 to Bangladesh, and 1,036,608 to Pakistan.
Childlight’s analysis uses data from NCMEC and INHOPE to calculate a ‘CSAM availability rate’: the number of reported cases per 10,000 people, which allows comparison across countries of very different population sizes.
The Maldives recorded the highest rate in South Asia, with 94 cases per 10,000 people, followed by Bangladesh (64.1) and Pakistan (41.3). Bhutan ranked fourth at 41, trailed by Afghanistan (28.9), Sri Lanka (27.8), and Nepal (19.4). India, despite its high absolute numbers, reported the lowest proportional rate in the region, 15.5 cases per 10,000 people.
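The per-capita comparison above can be sketched in a few lines of code. The NCMEC report counts are those cited in this article; the population figures are rough round-number assumptions added for illustration (they are not taken from the Childlight report), so the computed rates only approximate the published ones.

```python
# Sketch of the 'CSAM availability rate': reported cases per 10,000 people.
# Report counts are from the article; populations are assumed approximations.

POPULATIONS = {  # approximate 2024 populations (assumed, for illustration)
    "India": 1_450_000_000,
    "Bangladesh": 174_000_000,
    "Pakistan": 251_000_000,
}

REPORTS_2024 = {  # NCMEC report counts cited in the article
    "India": 2_252_986,
    "Bangladesh": 1_112_861,
    "Pakistan": 1_036_608,
}

def availability_rate(reports: int, population: int) -> float:
    """Return reported cases per 10,000 people."""
    return reports / population * 10_000

for country, reports in REPORTS_2024.items():
    rate = availability_rate(reports, POPULATIONS[country])
    print(f"{country}: {rate:.1f} cases per 10,000 people")
```

This illustrates why India, despite by far the largest absolute count, has the region's lowest proportional rate: its report total is divided by a population several times larger than its neighbours'.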
Across available data, girls in South Asia are disproportionately affected. The majority of sexual assault survivors in India, Nepal, and Sri Lanka are female, though the report cautions that male victims remain undercounted due to cultural barriers and stereotypes that deter boys from reporting.
The report also laments the absence of sex- and gender-disaggregated national surveys in most South Asian countries.
Drawing on representative surveys and global datasets, the report identifies familial child sexual exploitation and abuse as a key but largely under-researched issue.
Metadata analysis reveals a notable rise in “self-generated” CSAM: content created or shared by children and young people themselves.
However, the report cautions that determining the intent behind such youth-produced images is often complex. In some cases, the material may reflect harmful or coercive interactions among peers; in others, it stems from adult exploitation, including online grooming, non-consensual recording and distribution, or sexual extortion.
Childlight warns that this emerging trend blurs traditional definitions of CSAM and calls for a broader understanding of what constitutes sexually abusive material involving children in the digital age.
The report singles out technology companies for policy choices that inadvertently shield predators. End-to-end encryption (E2EE), for instance, while protecting user privacy, makes it nearly impossible for platforms or law enforcement to detect child abuse imagery or grooming activity.
Childlight urges governments in South Asia to strike a balance between privacy rights and child protection, suggesting regulatory reforms similar to those being debated in Europe.