
Growing Up Online: Gen Z and the Threat of Image-Based Sexual Abuse

By Asha Buerk, Summer Technology Policy Intern

CW: This post contains discussion of sexual assault and non-consensual intimate imagery.

When I was 11, I remember being absolutely giddy that my mom let me use my iPod Touch to make a Snapchat account. I didn’t have a phone, and I wanted to keep in contact with a friend who had moved schools, so she grudgingly allowed me to type in a username and make my first-ever social media account. That wasn’t by any means the first time I’d been online, but more importantly, it wasn’t even the first time my face had been posted online. Pictures of me as a toddler and elementary schooler exist scattered across abandoned Facebook walls and online Christmas cards. As members of Gen Z, my peers and I have grown up online, both as a result of our own fascination with social media and as a side effect of growing up in a society where social media became mandatory.

In 2015, there was little public awareness about the mental health harms and online phishing scams propagated by social media, much less technologies that could replicate voices and faces. My family had no reason to believe that allowing me to create a Snapchat account or posting me on their personal Facebook walls would ever reach beyond our inner circle, or even really be of consequence. 

With the explosion of artificial intelligence in the past few years, it has become incredibly easy to create and distribute high-quality AI-fabricated content like deepfakes: falsified videos that ‘stamp’ one individual’s face over another’s. This technology, combined with AI-powered voice replicators and generators that can produce video from nothing more than a text prompt, makes for a dangerous mix. 98% of all deepfakes are non-consensual intimate imagery (NCII), colloquially known as deepfake porn. As of 2023, it takes 25 minutes and $0 to create NCII using a single clear photo of someone’s face. That ease has allowed thousands of high school girls across the country to have their likenesses exploited and sexualized without their consent or even their awareness. What started out as a dark corner of the internet has expanded massively into a tool to harass and demean women.

There is no pepper spray or self-defense class I can take to alleviate my anxieties or calm my family. By growing up when I did, I’ve become a potential victim with no way to fight against image-based assault. Even if I wipe all of my personal profiles and beg my extended family to remove theirs as well, for everything I’ve ever posted, my face appears in two or three of my friends’ posts. My high school team photos, headshots from jobs, and school awards all exist online and make me vulnerable to NCII.

It is so incredibly violating to have your body used without your consent. Anyone who has experienced sexual assault knows it leaves you feeling dirty, angry, and embarrassed. Image-based sexual abuse leaves women feeling exposed and humiliated in the same way that assault does, but with the difference that now, rather than a single perpetrator, millions of individuals can view, create, and distribute that non-consensual imagery to others. The overwhelming powerlessness that comes with that understanding is nothing short of near-debilitating anxiety.

Additionally, as a young woman, I’m terrified of what this means for me and my friends in terms of our professional and personal lives. 99% of NCII features women as the ‘faked’ subject in explicit content. The intense stigma surrounding sex work is weaponized to discredit women by making them the subject of deepfake porn, in the hope that it will get them fired, excommunicated, or shunned by friends and family. Any angry individual with a computer and an internet connection can effectively ruin my personal and professional life while violating and using my body for deeply selfish and cruel purposes, destroying my mental health as well.

As a young adult, the majority of my likeness online comes from when I was under 18. From the aforementioned Facebook posts to middle school birthday parties and my brief stint on a school team in third grade, my childhood likeness exists in perpetuity online. Even if I scrub all of my current profiles off the internet, NCII made from my 14-year-old face will remain a threat to the life I’ve built for myself as an adult.

No one deserves to have their face used without their consent, especially in intimate imagery, regardless of what job, title, or amount of fame they hold. In a culture of victim blaming, we must be careful not to make the jump of “well, her face is online.” If I choose to work in public service or an outward-facing position, I should never have to ‘expect’ to have NCII made of me. I should not have to explain to my parents what NCII is, or why they keep receiving disturbing videos of me and why we can’t do anything about it.

In my home state, there is nothing to prevent someone from creating deepfake porn of me and distributing it; there are no legal consequences for perpetrators and no avenues for me to seek justice. This cannot be left to state legislators alone, acting through a patchwork of state-level bills on AI-generated NCII. Federal legislators must work quickly to create nationwide protections.

Bills like the DEFIANCE Act give victims of NCII an avenue to pursue civil action against perpetrators and seek damages. The Take It Down Act, meanwhile, requires the platform on which an image is hosted to take it down within 48 hours and makes the publication of non-consensual intimate imagery a crime.

Passing these bills as quickly as possible is imperative to protect young women in our country. Without federally mandated repercussions, members of my generation remain vulnerable to these assaults and their consequences. I urge legislators to read and support the aforementioned bills, and I send strength to young women who are afraid for their futures. I stand with you, and I can only hope our representatives do as well.