Shifting the dial: How Internet Matters is helping to prevent self-generated CSAM

9th May 2024

Self-generated child sexual abuse material (CSAM) continues to rise at an alarming rate, according to the not-for-profit organisation Internet Matters. CSAM includes both material that has been shared voluntarily between peers (and then re-shared without consent) and ‘self-generated’ imagery coerced through grooming, pressure or manipulation. Regardless of origin, there is a significant risk that this material ends up in the hands of adult offenders and is then distributed within offender networks.

At Nominet, we support a number of research projects, initiatives and programmes that work to prevent, respond to and eradicate practices that put young people at significant risk online. We’ve found that the creation and distribution of this material is a growing concern, and one echoed throughout our Countering Online Harms Innovation Fund grantees. In 2023, 92% of content removed from the internet by the Internet Watch Foundation (IWF) contained “self-generated” child sexual abuse material.

It was therefore important that we funded work that tackled the growing problem of self-generated CSAM. In 2023, we partnered with Internet Matters to conduct focused research into how to prevent the creation and distribution of this material.

Now, Internet Matters has published the full findings of its research into self-generated CSAM online. The report, ‘Shifting the dial: methods to prevent self-generated child sexual abuse among 11–13-year-olds’, is funded by Nominet and aims to inform not only the work of Internet Matters, but that of the wider online safety sector in combating this issue.

How widespread is the issue of self-generated CSAM?

The volume of self-generated sexual images involving 11-to-13-year-olds continues to rise, increasing by 27% from 2022 (199,363) to 2023 (254,071) alone. According to a recent Internet Matters national survey, 14% of teenagers under 16 say they have experienced a form of image-based sexual abuse. This equates to over 400,000 children in the UK. In the same survey, a quarter of teenagers under 16 said they were aware of a form of image-based abuse being perpetrated against another young person.

‘Shifting the Dial’ demonstrates the lack of programmes tailored by gender to prevent CSAM – despite girls being overwhelmingly the victims of online sexual abuse. There is also a lack of evidence about what works to deter children from sharing sexual images online. Some existing resources have been criticised as simplistic or victim-blaming, particularly towards young women and girls.

What did Internet Matters find in its research?

The research involved speaking on multiple occasions with focus groups drawn from 111 children (58 girls, 53 boys). It asked them which educational messages would be most effective in dissuading 11-to-13-year-olds from sharing sexual images. The research revealed:

  • Many children say they haven’t received specific education in relation to sexual image sharing, or only very superficial coverage in RSHE/RSE lessons. Even then, they don’t feel able to get the information they want in whole-class groups.
  • Girls in particular want smaller, gender-based groups. They said they found it hard to share or discuss issues in front of the boys in their class for fear of being teased or bullied.
  • Girls said they wanted more information earlier in secondary school – for example in Year 7 – if not in primary school. They felt that the lessons they currently receive are delivered too late.
  • Girls say they want educational resources to acknowledge the greater likelihood that boys will behave as perpetrators, while girls are more likely to experience harassment. Girls said boys should receive targeted messaging that would help them understand the harmful impact of demanding nude images.
  • Strikingly, boys saw huge value in messages which tackle ‘perpetrator’ behaviour with unequivocal and un-sensationalised information about the consequences and legality of this behaviour.
  • Girls were generally negative about “consequences of sharing” messaging or any messaging they felt was simplistic and failed to address the underlying causes of sexual image-sharing.
  • Children said that currently, they typically learn more about sexual image-sharing from sources outside of school such as friends and family, or informally in school from gossip around certain incidents, as well as from TV and social media. In many cases this information tends to minimise or normalise sexual image sharing.

Can self-generated CSAM be prevented, and if so, how?

Efforts to tackle self-generated abuse content are typically focused on removing this content once it’s already in circulation. While this is valuable, there needs to be a greater emphasis on preventing sexual content from being shared in the first place.

Internet Matters trialled two digital methods of prevention – an interactive game and an ‘in-the-moment’ nudge technique – which could be made available on digital devices. Both showed promise and received an enthusiastic response from the children’s panels. Internet Matters will develop these further, incorporating feedback from children, parents and professionals, to make them available to more children.

The children in the Internet Matters panels also appreciated the single-sex RSHE lessons that Internet Matters designed, and felt that learning in smaller groups split by gender worked well. Moving forward, this could be a more effective approach to preventing self-generated CSAM – enabling young people to feel more comfortable and have their voices heard.