We Had to Remove This Post: Desensitisation in the Digital Age

We Had to Remove This Post serves as a reflection of Gen Z and forces the reader to consider how social media can desensitise us and ask: just how much is too much?

Esosa Otabor

desensitisation

noun   

/diˌsen.sɪ.taɪˈzeɪ.ʃən/ 

the process of causing someone to experience something, usually an emotion or a pain, less strongly than before: 

He discusses our culture's desensitization to violence from so much exposure in movies, video games, and music.


I don’t remember the exact moment I was introduced to the internet. Most of us who are Gen Z were born in the early 2000s and grew up in a world dominated by a new era of media, communication, and technological advancement. When I think of my childhood, I reminisce about the hours I spent delving into worlds that seemed to belong to no one but me: spending copious amounts of time scrolling through Tumblr and worrying about which niche to stick to for the month, garnering a cult-like following on Twitter among other early-2010s boyband-obsessed teens, and writing an ungodly amount of fanfiction on Wattpad. 

As fondly as I remember these parts of growing up, I am now starkly aware of the darker and more disturbing corners of the internet that many of us found ourselves in at the time. Sites like Tumblr were notorious for lax content moderation, which meant many of us were prematurely exposed to sexual content, violence, and offensive media from an extremely young age. As we grew older, we saw first-hand the birth of a number of digital giants, from the launch of the first iPhone in 2007 to the arrival of Instagram in 2010. It was becoming clear that this uncharted online world would irrevocably shape how we perceived everything around us. We now have the (sometimes unfortunate) privilege of unbounded information at our fingertips via a plethora of news websites, blogs and tweets. Social media gives voice not only to everyone but to everything, making room for even the most bizarre interests. But who exactly has to sift through all this content? Who separates the moral from the immoral? Who bears witness to the hidden corners of humanity’s depravity?

Exposing the darker underworld of content moderation, We Had to Remove This Post is a succinct, punchy novel from Dutch author Hanna Bervoets which follows a young content moderator working for Hexa, a third-party company employed by an unnamed social media company. Although this social media company remains anonymous in the book, the author not-so-subtly hints that it is something akin to Facebook: it is described as a site that teenagers “abandoned . . . ages ago in favor of their own dance and lip-synching apps” (p80), an obvious nod to TikTok’s recent dominance. 

Kayleigh, our protagonist, is a jaded twenty-something who finds herself working at Hexa after a troubling break-up has left her in crippling debt. Her job is to sift through the user-generated content posted on the social media platform and decide whether or not it is appropriate for it to stay up. The novel's first line asks what every reader is wondering: "So, what kind of things did you see?" We quickly learn that the company's content moderators are subjected to everything from conspiracy videos to explicit pornography and posts depicting extreme violence, torture and self-harm. Employees are made to regulate this content with meticulous and apathetic eyes. Although her colleagues seem to struggle with the traumatic nature of the work, Kayleigh quickly learns (and even seems to take pleasure in following) the complex rules set out by Hexa:

“A video of someone flinging their cat out the window is only allowed if cruelty is not a motive; a photo of someone flinging their cat out the window is always allowed; a video of people kissing in bed is allowed as long as we don’t see any genitalia or female nipples; male nipples are permitted at all times.” (p15)

“The platform doesn’t allow people to post things like “All Muslims are terrorists,” because Muslims are a PC, a “protected category,” just like women, gay people, and, believe it or not, Mr. Stitic, heterosexuals. “All terrorists are Muslims,” on the other hand, is allowed, because terrorists are not a PC and besides, Muslim isn’t an offensive term.” (p14)

Hexa serves as a backdrop for the isolating relationships Kayleigh forms at work, bonding with her colleagues through shared trauma and the unhealthy coping mechanisms they adopt. As the characters' mental health deteriorates, they become increasingly influenced by the content they are moderating, or simply desensitised to the violence they are subjected to. Several workers start to exhibit alarmingly anti-Semitic and racist beliefs and indulge in various conspiracy theories. This reflects how easily misinformation spreads on social media and how exposure to specific spheres of the internet may eventually lead someone to adopt a particular way of thinking. Kayleigh finds herself unaroused by the 'tame' pornography she used to watch, remarking that she needs to replace her "favorite porn website for an alternative search engine—the progressive kind that doesn't store your search terms" (p98), leaving the reader to wonder whether she has been numbed by the grotesque sexual imagery she sifts through daily. 

Essentially, the novel is a cautionary tale about the dangers of excessive media consumption and how the desensitisation that follows can warp our perceptions of ourselves and those around us. Kayleigh's relationship with her girlfriend and co-worker, Sigrid, is told solely from her perspective. At the end of the book, she is proven to be an unreliable narrator when it is revealed that a major aspect of their romantic relationship is not what it seems, and the reader is forced to question whether the dissociative nature of the workplace has seeped into her personal dynamics. An overarching theme of the book is the subjectivity of perception and individual interpretation: from the workers having to decide whether content is offensive, to characters falling prey to conspiracy theories, to Kayleigh's biased view of her relationship. As the boundaries between the characters' online and offline lives blur, where do the company guidelines end and human morality begin?

Bervoets crafts the novel in such a way that the burden of what the moderators must witness not only plays on your conscience as a reader but also comments on the corrosive nature of social media in real life and the damage it can cause. We Had to Remove This Post is nothing if not timely as we witness Elon Musk’s acquisition of Twitter and his slackening of content moderation rules, which is leading to an increase in hate speech and disinformation online—not to mention the various cases of TikTok employees suing the company for negligence and a failure to protect workers against emotional trauma. Through this book, Bervoets prompts us, a society that has interwoven social media into its daily life, to ask: do we need to re-evaluate how we consume content? How can we become more aware of how content affects us?


Esosa Otabor is a London-based writer, editorial assistant and MA Migration & Diaspora studies student at SOAS, University of London. She has previously written for Black Ballad UK and has a particular interest in writing from black & African diasporas and novels in translation. You can find her over on Instagram @sosa.tr


Work cited: 

Bervoets, Hanna. We Had to Remove This Post. Picador, 2022. 
