
Who benefits? Shining a light on the business of child sexual exploitation and abuse

By Childlight (Global Child Safety Institute)

The sexual exploitation and abuse of children is a global health crisis. This report – presenting a diverse set of studies into the nature of the crisis – underscores the need for urgent action, proposes solutions and shines a light on the financial networks that fuel child sexual exploitation and abuse (CSEA). The theme – who benefits? – asks a critical question: who is making money from this vile trade? The answer is as disturbing as it is clear. Organised crime groups profit, of course, but so do mainstream technology companies.

The report shows that advertising revenue increases when platforms attract high volumes of traffic, including traffic generated by offenders engaging in CSEA. The exploitation of children is not just an atrocity — it is an industry, generating billions of dollars in profits. This is a market, structured and profitable, designed to generate revenue off the backs of vulnerable children. But markets can be disrupted, and that is where change must begin. Governments, businesses and communities must shift to a prevention-focused approach that stops CSEA before it begins.

Key findings

  • Offenders are evolving, adapting and exploiting gaps in legislation and regulations.

  • Offenders groom single parents via dating apps to access their children.

  • Offenders target displaced children in conflict zones like Ukraine.

  • Images are traded using sophisticated payment methods, including cryptocurrencies, to evade detection.

Key solutions

  • Law enforcement and financial institutions can use tell-tale digital breadcrumbs to track and dismantle CSEA networks.

  • Tech companies must be held accountable, proactively detect and remove child sexual abuse material (CSAM), and make more effective use of tried and tested tools, such as blocklists, to shut down access to CSAM.

  • Policymakers must act decisively, as the United Kingdom has begun to do, by criminalising AI-generated CSAM and banning so-called ‘how-to’ manuals for paedophiles. 

Edinburgh: University of Edinburgh, Childlight, 2025. 15p.