Child Sexual Abuse Online: Tech, Social, Prevention

This research delves into online child sexual exploitation and abuse, addressing the lack of universal terminology, data analysis on severity, and categorization guidelines. It highlights the importance of aligning definitions internationally for effective prevention efforts.

  • Child abuse
  • Online exploitation
  • Prevention
  • Technology
  • Data analysis

Uploaded on Feb 19, 2025



Presentation Transcript


  1. Understanding child sexual abuse online: the tech, the social and what it means for prevention. Prof. Corinne May-Chahal, Larissa Engelman, Christine Weirich and Adam Crawford, ESRC Vulnerability and Policing Futures Research Centre (ES/W002248/1)

  2. What is online child sexual exploitation and abuse? Before we start, what definitions do you use? ECPAT uses the term Online Child Sexual Exploitation and Abuse (OCSEA), which it defines as situations involving digital, internet and communication technologies at some point during the continuum of abuse or exploitation. OCSEA can occur fully online or through a mix of online and in-person interactions between offenders and children.

  3. Prevention Issue 1: lack of universal terminology and definitions of child sexual abuse material (CSAM), leading to different definitions of, and thresholds for, crimes, and incompatible data sets between jurisdictions. ECPAT recommends that a key prevention task is to align terminology and definitions of CSAM, OCSEA and electronic evidence with international guidelines, and to ensure consistency in the definition of a child as anyone under the age of 18 for all crimes of sexual exploitation.

  4. What does the data tell us? International Child Sexual Exploitation Database (ICSE), ECPAT/INTERPOL, 2018: 2.7 million images and videos helped identify 23,564 victims worldwide. The younger the victim, the more severe the abuse. 84% of images contained explicit sexual activity. More than 60% of unidentified victims were prepubescent, including infants and toddlers. 65% of unidentified victims were girls. Severe abuse images were likely to feature boys. 92% of visible offenders were male. The majority of victims were white.

  5. How severity is categorised (Sexual Offences Definitive Guideline, Sentencing Council, 2014). Category A: images involving penetrative sexual activity, or images involving sexual activity with an animal or sadism. Category B: images involving non-penetrative sexual activity. Category C: other child sexual abuse images not falling within categories A or B, i.e. with some sexually suggestive content.

  6. Clearing houses. National Center for Missing & Exploited Children (NCMEC), Washington DC. U.S. federal law requires that U.S.-based Electronic Service Providers report instances of apparent child pornography that they become aware of on their systems to NCMEC's CyberTipline. To date, the CyberTipline has received over 82 million reports; in 2022 it received over 32 million. Since 2002, the Child Victim Identification Programme (CVIP) has reviewed images/videos and identified 19,100 children. In 2022, 89.9% of reports related to locations outside of the U.S. Other clearing centres include the Internet Watch Foundation, the Canadian Centre for Child Protection and the Child Rescue Coalition.

  7. Types of OCSEA: grooming online that leads (or intends to lead) to offline child sexual abuse; peer-on-peer coercive image and text sharing, originating in schools, gangs and within offline peer relationships; peer-to-peer exchange of child sexual abuse media produced in offline settings; live streaming of child sexual abuse; recorded or captured sexual abuse of children (particularly young children) in domestic and family environments.

  8. Prevention Issue 2: what are the implications of the varied forms of OCSEA for prevention? Responses need to be tailored to actions and behaviours. The tech industry can help, but it is not the sole source of prevention action.

  9. Grooming. In the UK, the number of instances recorded by police jumped by about 70% from 2017/2018 to 2020/2021, to an all-time high of 5,441 Sexual Communication with a Child offences recorded in 2021 (NSPCC, August 2021). New gaming platforms facilitate grooming.

  10. Grooming. Some safety solutions are available: (1) age assurance, (2) parental controls, (3) reporting and blocking, (4) AI-supported content and user moderation. Child safety is an afterthought for most platforms; most do not apply all these safety measures (Bracket Foundation, 2023).

  11. Why don't ESPs use these solutions? Competitive disadvantage: if they implement safety measures, they may increase sign-up hurdles. Liability: legal frameworks do little to impose significant legal or financial consequences. There is no comprehensive regulatory framework for gaming.

  12. Characteristics of gaming that increase risk: the immersive intensity/realness of interaction makes it easier to build relationships with minors; the competitiveness of gaming increases the risk of contact with predators; the multitude of interactions on social gaming and metaverse platforms is difficult to moderate; it is not easy to see what children are doing from outside of VR and AR; in-app currencies are needed to improve the experience.

  13. Popular gaming sites with children: Roblox, Clash of the Titans, Minecraft, Fortnite. Roblox users under 16 account for as many as 67% of the total user base. "Roblox, I was gobsmacked. I have allowed [Name] to play Roblox, and somebody shared something on Facebook: this woman had happened to go on a child's iPad or whatever it was, to go on his Roblox game, and a guy had got the character on the bed and was saying rude things, and that's a child's game. People will allow children to play Roblox because it's a child's game."

  14. Peer-on-peer material is increasing. Webpages containing self-generated images increased by 168%, from 68 thousand in 2020 to 182 thousand in 2021 (IWF). Cyberbullying is widespread and linked in research to coercive sexual behaviour; the Bracket Foundation (2023) found that 48% of children between nine and 17 are affected by cyberbullying.

  15. Historically, companies respond to safety concerns only after public reports of abuse. More fixes are required (Bracket Foundation, 2023).

  16. Safety solutions available to the tech industry. Age assurance: methods of age estimation (AI estimates based on picture/video or behaviour); provides a basis for enforcing age-based rules. AI classifiers: algorithmic classifiers identify abusive content and behaviour in text, speech, pictures and videos; allows content moderation to scale on platforms with many users. Moderating users: user actions moderated by human moderators; reporting and blocking depend on user willingness (most don't). Parental controls: allow parents to monitor and limit their children's activities; can be installed on the device or included as part of the application (effectiveness depends on the parent and on the child's capacity to circumvent them).

  17. It's not all about tech, though. Terminology: self-generated vs. youth-involved sexual imagery. Solutions offered by the Safety Tech Challenge Fund run on-device, or trust the ESP to run them. Offline prevention needs to change: "they're always telling us what not to do, but nobody says what we should be doing; we don't know what a good relationship looks like" (Scottish Violence Prevention Unit sexual harm project with young people).

  18. Peer-to-peer CSAM exchange. Primarily in dark-web forums and peer-to-peer networks; tech industry involvement is opaque, and liability unclear. "Web-based forums can be used to share Bittorrent links, and encrypted files can be shared from public cyberlockers. Peer-to-peer software that shares encrypted (and innocuously labelled) binaries can be run over the Tor network, and the decryption passwords and pointers to the content shared on Usenet newsgroups accessed via the web. The Ares peer-to-peer network client now includes Bittorrent link capabilities, an integrated image/movie viewer (allowing it to be used to view and not just download content), and an integrated chat function" (Steel et al., 2020, p. 14).

  19. CSAM characteristics focused on in the development of tech tools. [Pie chart: child age, CSA activity, LE activity, metadata, nudity, perpetrator activity, place; shares range from 6% to 31%] CSAM characteristics focused on in technical tools vary according to the research target (images, video, devices), the CSAM context (e.g., open web, dark web, P2P) and the CSAM content (children of different ages, sexual activity, file names, locations). Just under one third of tools reported in research focus on perpetrator activity.

  20. Methods and models for CSAM detection. [Bar chart, counts 0-10 per method: filename, webcrawler, survey, multi-dimensional, face recognition, crime script, CNN/deep learning, chatbot, other]

  21. Evaluations of CSAM detection models. Over a third of tools (34%) are evaluated for accuracy, which ranged from 60% for the detection of young children to 97% for filepath analysis (Project Vic, Microsoft). Recall was reported in 12% of studies, ranging from 65% for a stepwise law enforcement processing model to 94% for filepath analysis. Other methods included calculating Mean Absolute Error (for age estimation and CSAM severity detection), Goodness of Fit (network analysis) and estimation of the reduction in law enforcement processing time (94% compared with existing tools). [Pie chart: Accuracy 34%, Recall 12%, MAE, Other, N/A; remaining shares 30%, 18% and 6%]
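For readers unfamiliar with the evaluation metrics named above (accuracy, recall, Mean Absolute Error), a minimal illustrative sketch of their standard definitions follows. It uses purely synthetic toy labels and has no connection to any detection system or dataset discussed in this research; the function names are our own.

```python
def accuracy(y_true, y_pred):
    # Fraction of predictions that match the ground-truth labels.
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def recall(y_true, y_pred):
    # Fraction of actual positives (label 1) that were correctly predicted.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn)

def mean_absolute_error(y_true, y_pred):
    # Average absolute difference, e.g. for continuous estimates such as age.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Synthetic binary labels: 4 of 6 predictions correct, 3 of 4 positives found.
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 1, 1]
print(accuracy(y_true, y_pred))                    # 4/6, approx. 0.667
print(recall(y_true, y_pred))                      # 3/4 = 0.75
print(mean_absolute_error([5, 7, 9], [6, 7, 8]))   # 2/3, approx. 0.667
```

A high accuracy figure can mask poor recall when positives are rare, which is why studies reporting both, as noted above, give a fuller picture of a model's performance.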

  22. Prevention. At present, the prevention of P2P CSAM sharing is at best hypothetical. The majority of technical tools are focused on tertiary prevention (post-production CSAM). CSAM continues to increase; Tor and multiple sharing platforms, including P2P, are highly conducive to the production of more, and more severe, CSAM. The pandemic expanded opportunities for child sexual exploitation, and those opportunities remain. Current prevention approaches increase the load on law enforcement, and it is impossible for them to meet demand.

  23. Live streaming. An estimated 80% originates from the ASEAN region; it occurs elsewhere, but there is no reliable data on the extent. Laws vary across ASEAN (ages of consent, proactive evidence gathering, offences not comprehensive). Live streams can be captured by perpetrators and shared, a practice known as capping. A technical challenge is the lack of evidence on devices. Algorithmic bias has failed to detect indigenous victims and offenders. More recent actions have focused on financial transactions.

  24. Sexual abuse of children in domestic and family environments. The majority of children depicted in database images are under 13, and databases include material concerning very young children. Tor platforms and P2P networks encourage greater severity of CSAM. There are media reports of parental grooming, but it is almost impossible to detect. US studies find that family members are the largest group of perpetrators of CSAM.

  25. Prevention in familiar territory. This is not new, but it needs new eyes. All child protection professionals must include the possibility of OCSEA in their assessments. Early Years workers must be aware of the possibility of OCSEA, which can manifest in many ways. Teachers need to be accessible and aware.

  26. What do we know so far? Online and offline facilitated child sexual abuse are increasingly blurred (May-Chahal & Kelly, 2020). There is significant overlap among types of online violence against children (WHO, 2022). There is ambiguity in relation to self-generated imagery, and a need for child-led instead of offender-led responses (Skidmore, Aitkenhead & Muir, 2022). "Evidence about the success of prevention programmes for online child sexual exploitation and abuse (OCSEA) is not yet available" (WHO, 2022).

  27. UK Online Safety Act. The Online Safety Bill will improve the accountability of platform providers but has limits. Risk assessments for OCSEA are specific to provider platforms. There is a duty to protect children from encountering harmful content, and to mitigate and manage risks on their service. Under S122, providers must use accredited technology to search for CSEA (or terrorist) content and prevent users from encountering it. The debate: this technology does not exist, particularly for encrypted services, unless the privacy of all users is compromised; the UK government has said it does exist (it does not). Providers are threatening to withdraw encrypted services from UK users if this is enacted (see Meredith Whittaker, President of Signal, open letter to Lord Bethell, on X). More than 70 UK and 400 global cyber security experts challenge the mass surveillance indicated by S122.

  28. So, who protects children? The challenges for legislation mean more emphasis is needed offline. If service providers are to be held accountable, CSEA content must be routinely reported (currently rare). Early intervention and secondary prevention need funding urgently: is a levy the way forward? Community-based responses need to be developed that encourage reporting and support children and parents to prevent OCSEA.
