How the international development sector can reduce the risk of paedocriminal perpetrators misusing content featuring children
A group of children smiling at the camera, playing sports or eating: Photos and videos of minors do not have to show naked skin to be of interest to paedocriminal perpetrators.
This also holds true for images created and published by international development actors, such as NGOs or various UN bodies. In this blog, I will briefly explain the problem, discuss why it ought to be taken seriously by development actors and propose ways they can address it.
What is the problem?
Since the early 2000s, paedocriminal perpetrators have been using both the clearnet, which is publicly accessible to everyone, and the darknet to abuse photos and videos of children for sexual purposes. Darknet forums often sort the disseminated content into different categories. One of these categories is named “non-nude”, meaning that the sexual characteristics of minors are not visible in the images. Even though these photos and videos only show children in everyday situations, they are misused for criminal purposes.
Paedocriminal perpetrators steal such content from private or public social media accounts and channels, as well as from websites, and upload it to specific forums or chats. There it is viewed, exchanged and commented upon. The comments made by perpetrators contain sexualised texts, sexual sounds or specific hashtags and emojis. Moreover, rapidly evolving AI tools make it easy to alter content: with so-called deepnude generators or nudifiers, everyday images of children can be sexualised with just a few clicks.
Exact data illustrating the extent of the problem is difficult to obtain. Yet, already in 2020 (p.116), the “kids” category of just one publicly accessible photo platform popular with paedocriminal perpetrators contained 58,000 albums with more than three million everyday photos of children. These were clicked on 14 billion times, and more than 600,000 comments, some of them sexualising, were posted. By 2023, the number of clicks had increased to 18.7 billion. Everyday images of children have become “an integral part of the paedocriminal scene” (p.112) – both on the darknet and the clearnet.
Why should this problem be taken seriously by international development actors?
NGOs, UN bodies and other development actors regularly create and publish everyday photos and videos featuring children. The sector considers minors as an important part of society and wants their needs and potential to be recognised. This is arguably best achieved when children remain visible and audible as rights-holders and experts of their own world. For organisations that are financed by individual donations, photos are important because they evoke empathy and understanding and encourage people to act. They also illustrate the value of development and humanitarian work.
But no matter how good the intentions, these photos and videos find their way into paedocriminal forums and chats. Given that paedocriminal acts can be based on different motives and be linked to various violent sexual fantasies or fetishes, any photo can be abused – even those used by international development actors.
This misuse violates several children’s rights, including the right to protection of privacy and honour, along with the right to one’s image. The infringements also involve the right to protection of mental health, as well as protection from violence and exploitation. These violations are not abstract – they constitute a direct threat to the children affected and can inflict tremendous harm.
Finally, the sexualisation of everyday depictions of children through AI is a growing concern. This risk is particularly high if the minors portrayed are easily recognisable, as is often the case in this sector.
How can the problem be addressed?
For various reasons, it is difficult for international development actors to search the internet for their own published photos and videos that may have been stolen and sexualised. Determining the full extent of the problem is therefore impossible. What can be done, however, is to change the way content featuring children is created and published, thereby reducing the risk of misuse by paedocriminal perpetrators.
Save the Children Germany, together with jugendschutz.net, has developed a guideline on the sensitive handling of children’s photos and videos in institutions and organisations. It is based on extensive discussions with specialists from child and youth protection organisations, as well as the police.
The 36-page digital publication provides background information on internet paedocriminality and explains how perpetrators operate. It describes the various rights of children that are violated in this context and includes illustrations that contrast risky and less risky motifs. Concrete recommendations explain what to pay attention to when creating and publishing content featuring children. Following these recommendations does not require additional resources or specific skills, but rather an overall awareness of the problem and the risk it presents to children.
Sadly, the dissemination of everyday depictions of children helps paedocriminal perpetrators to obtain even more content. When they share such photos and videos in relevant forums, their reputation increases, and they receive more material from other users. If the overall amount of potentially risky images decreases, the content made available to perpetrators will also diminish.
The aim, therefore, is to reduce the number of risky photos and videos in circulation to enhance the protection of children and their rights. Hopefully, the guideline provides concrete starting points for international development actors to contribute to this goal.
#PublishWithCare