Di Levine – Assistant Professor/Lecturer and Impact Lead, School of Criminology & Research Associate (visiting researcher), Centre for Social Development in Africa, University of Johannesburg
Yesterday – 19th November – was World Day for the Prevention of Child Abuse. I would normally have said that ‘this couldn’t have come at a more opportune time’ or something along those lines, but actually, every day is focused on the prevention of child abuse for social workers, teachers, organisations, policy makers, researchers, and survivors. However, while this is ‘just another day’ for many of us, it does offer an opportunity to shine a light on the risk, protective, and promotive factors inherent in child protection systems at this particular moment in time.
And this particular moment in time appears to be digitally-mediated. A number of reports in recent months reflect the increasing importance of this aspect of child protection.
Financial sexual extortion and AI-generated imagery
The We Protect Global Threat Assessment 2023 indicated that the volume of child sexual abuse material reports increased by 87% since 2019 (to 32 million reports). The nature of this abuse has also shifted in two regards. First, financial sexual extortion reports in the US increased by 7,200% between 2021 and 2022. Second, AI-generated imagery is predicted to become a serious problem within a year: one study conducted by the Internet Watch Foundation over a five-week period in early 2023 found 29 URLs containing suspected AI-generated abuse images, of which seven contained child sexual abuse imagery. There are some early indications that boys and girls may have different vulnerabilities in these emerging risk categories, with boys more at risk of financial sexual extortion than girls, and LGBTQ+ young people in the US more likely to stay in contact with someone who makes them uncomfortable online, citing friendship as the primary reason for maintaining the relationship. The report suggests three steps to reducing risks: i) investment in public health initiatives focusing on prevention, ii) centring children’s rights and perspectives in intervention design and delivery, and iii) implementation of globally-aligned legislation. If you had asked me 15 years ago whether this last step was necessary, I would have said ‘no’. Now, my personal perspective is that ‘incentivising’ platforms to reduce or eliminate harm does not go far enough. We need legislative and regulatory tools that force platforms to implement ‘safety by design’, backed by painful sanctions if they are found to breach their responsibilities.
Young survivors of CSEA want to understand themselves and others
In Disrupting Harm: Conversations with young survivors about online child sexual exploitation and abuse, End Violence Against Children found that 11 key messages emerged from young people, focusing on both prevention and response. As is often the case, the participants made some points that have been made many times before, but that we persist in ignoring or addressing only superficially in system change. Particularly striking for me were ‘Help me understand myself’, ‘Online sexual exploitation does not take place in isolation’, and ‘Do not judge me’. The first is about giving children the language and the structures both to understand themselves and to articulate that understanding in relation to themselves and others. Our work on building a ‘digital ethics of care’ model (pilot project, adolescent-focused article) takes steps in this direction. The second point exemplifies the importance of multi-systemic approaches to tackling online sexual exploitation. And the ways in which we work with ‘youth advisors’ or ‘peer workers’ will be fundamental in mitigating the risks of stigma and judgement raised by the third.
Platforms need to take more action and responsibility
Following on depressingly neatly, the most recent OECD benchmarking report on the top 50 online services – Transparency Reporting on Child Sexual Exploitation and Abuse Online – has three key findings. First, only a minority of services have suitable policies for managing child sexual abuse online. Of the 50 studied, only 10 have explicit policies on Child Sexual Exploitation and Abuse. Another 25 include general prohibitions on this kind of abuse, though without detailed explanations. The remaining 15 services address child sexual abuse only within broader prohibitions, and few define “child” or reference legal definitions. Second, of the 50 online services studied, only 20 issue Transparency Reports (TRs), and those 20 TRs vary wildly in terms of the data reported. Only 15 TRs provided proactive detection rates, with little detail on the violations in question. This lack of consistency makes getting a clear sectoral and system view very difficult. Finally, the benchmarking report found that information on content moderation was limited and inconsistent. Of the 50 online services included, 29 used a combination of staff, automated tools, and user reporting, but only 16 of these offered detailed information about their detection methods. While 28 had policies for user notifications and appeals, these may not apply to violations like child sexual exploitation and abuse.
What can we DO?
Keep the systems around children as flexible as possible, and help them to build a language to describe how they and those around them are feeling.
And put pressure on your local representative/Member of Parliament to make online platforms take the responsibility they need to take – beyond lip service and committee formation – to keep our children safe and able to make the most of the riches of the internet. The Online Safety Act (2023) is only one small, and some have argued flawed, step towards achieving the long-term goals of digital child protection.
Digitally-mediated child abuse has lifelong physical, psychological, and social consequences. On this World Day for the Prevention of Child Abuse we need to take courageous action to break the cycle and ensure that children have the opportunity to grow up in safe and nurturing digital environments.