Coming of age has never been so fraught. Many teens use ‘sexts’ to consensually explore their emerging sexuality. Yet such exchanges are legally prohibited, and teens who sext may experience gender-linked sexual shaming and victimisation, including from adults. Other teens may experience non-consensual online interactions, often categorised as image-based abuse or technology-facilitated sexual violence.
This project gathers teens’ perspectives on, and remedies for, peer-perpetrated and peer-magnified image-based sexual harassment and abuse. Reports of sextortion, sexualised AI deepfakes and blackmail of teens by adult predators are rising, even as teens worry that reporting such abuse might see them, as victims, accused of creating child exploitation material.
This project aims to develop strategies to prevent, mitigate and overcome image-based abuse. It does so by speaking with young people and co-developing responses through arts-based methodologies. The project is funded by the Australian Research Council, an Australian Government agency (Project number DP250102379), and is conducted in conjunction with Professor Jessica Ringrose at University College London (UCL).
We are hoping to recruit teenagers to take part in our research. For this study, ‘teenagers’ includes anyone enrolled in secondary school, even those aged twelve. Interviews and focus groups with teenagers will involve discussions of sending explicit messages; peer-on-peer image-based sexual harassment and abuse; sextortion; AI chatbots and companions; and sexualised AI deepfakes. These activities will also encourage teens to identify alternative educational, policy and legal settings that might minimise harm and support safe, respectful sexting.
Separately, we are also aiming to recruit parents to participate in focus groups. These focus groups will explore how parents navigate conversations with their children about intimate images, image-based sexual harassment and abuse, sextortion, and sexualised AI deepfakes.
Personnel: