/r/ DeepFakes and Reddit’s Ongoing Misogyny Problem
Adrienne Massanari / University of Illinois at Chicago
This week (February 5), The Daily Dot reported on a Reddit community where individuals could request (and pay for) digital porn of their favorite celebrities. Unlike the 2014 celebrity photo hack colloquially known as “The Fappening,” /r/deepfakes did not trade merely in stolen images. These were AI-created, high-quality clips of celebrities (almost all women) superimposed onto pirated porn. The community was finally banned, but only after it had acquired a large number of new subscribers (around 90,000 in total) and drawn negative media coverage. /r/deepfakes’ continued existence was an anomaly; other platforms such as Twitter, Discord, and Pornhub had banned this kind of face-swapping material for some time. But such is the way of Reddit, which has a long history of allowing misogynistic and racist communities to flourish, taking action only when negative press forces administrators’ hands.
The idea of AI-created porn presents a host of ethical dilemmas, which I’m not going to cover here. Instead, I want to explore why Reddit continues to demonstrate the same pattern of behavior when it comes to hosting communities that engage in speech and behaviors that are troubling (at best) and give 4chan’s /b/ and /pol/ a run for the title of “worst internet cesspool.”
If you’re not familiar with Reddit, here’s a crash course: anyone can create a community around a topic of interest (such as science, cats, or hip hop) called a subreddit, which allows users to submit content that is then voted on by the community. The most popular content eventually floats to the top of each user’s front page (a curated list of material that’s popular across their subscribed subreddits). Reddit members (also known as Redditors) can also get a sense of what’s popular across the platform by accessing reddit.com/r/all. Subreddits are moderated by community members, and there’s very little intervention from the administrators in terms of what is acceptable. The few rules that do exist are mostly the result of ongoing issues with spam, bullying, sexualized images of minors, and the release of private information about individuals.
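For the curious, the “floats to the top” mechanic is Reddit’s “hot” ranking, a version of which Reddit has open-sourced. Here is a minimal Python sketch of that formula (simplified from the public version; the constants and function name follow that code, not anything official to this article):

```python
from datetime import datetime, timezone
from math import log10

UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def hot(ups: int, downs: int, posted: datetime) -> float:
    """Rank a post; higher scores float toward the top of a listing."""
    score = ups - downs
    # Votes count logarithmically: the first 10 upvotes matter
    # as much as the next 90.
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    # Newer posts get a steadily growing time bonus, so fresh content
    # can displace older, more heavily upvoted posts.
    seconds = (posted - UNIX_EPOCH).total_seconds() - 1134028003
    return round(sign * order + seconds / 45000, 7)
```

Because the time term grows linearly while the vote term grows logarithmically, a brand-new post with a few dozen upvotes can outrank a day-old post with thousands.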
Reddit has a long history of allowing its platform to house a number of racist and misogynistic subreddits. It wasn’t until mid-2015 that Reddit administrators banned racist subreddits affiliated with “The Chimpire” (which included /r/coontown, probably the most notorious member). But /r/coontown was merely a new instantiation of /r/n***rs, which was banned in 2013 for vote manipulation. Both bans were undertaken only after significant negative media coverage, despite longstanding public opposition to these communities from other Reddit members (often voiced during the Ask-Me-Anything (AMA) sessions regularly held with site administrators). The same goes for the infamous /r/jailbait (a subreddit for sexualized images of teenage girls), /r/creepshots (a subreddit for sexualized images of women taken without their consent), and /r/fatpeoplehate (a subreddit that mocked and harassed overweight women). In all cases, Reddit’s administrators seemed to suggest that while they themselves found the content of these subreddits repugnant, “every man [sic] is responsible for his own soul.”
But in several cases, such as with /r/TheFappening, it wasn’t until there was a threat of celebrities suing Reddit for DMCA violations or the possibility that underage individuals were included in the photo dump that the subreddit was banned.
However, these are only a small handful of the objectionable subreddits that exist on the platform – and it’s clear that people who espouse these views find the platform welcoming. Plenty of subreddits are actively misogynistic, and some regularly make it to the front page of /r/all. These include communities affiliated with the Manosphere, such as /r/mensrights, /r/TheRedPill (dedicated to pick-up artistry and nonconsensual sexual activity), and /r/incels (only recently banned, but which hosted discussions of “involuntary celibacy,” a philosophy said to have inspired the 2014 Isla Vista shootings).
Then there are those subreddits that merely trade in sexist memes, such as /r/pussypassdenied, which promotes the idea that women are held less accountable for their actions than men are (and celebrates moments when their “pussy pass” is denied in some way). And who can forget that Reddit still hosts a #Gamergate subreddit, /r/KotakuInAction, despite the fact that even 4chan banned #Gamergate discussions from its boards in 2014.
Not to mention that /r/The_Donald is arguably the most popular forum dedicated to misogynist-in-chief and “alt-right” favorite Donald Trump.
Up to this point, Reddit’s approach has mostly involved sticking its collective head in the sand in the hopes that subreddits like /r/KotakuInAction burn themselves out and go away. This doesn’t work. These are actually very popular spaces on the platform, and as I have argued elsewhere, it’s not as if the terrible denizens of /r/TheRedPill don’t also interact with more “mainstream” subreddits on the site. One of the common sentiments I heard while researching my book was that Reddit served as a recruiting ground for Stormfront (a white nationalist site), with racist memes and image macros being posted to popular subreddits such as /r/AdviceAnimals. And recent media reports suggest that Reddit may have been instrumental in the spread of Russian propaganda during the last US presidential election. So why do site administrators continue to do nothing, even when research suggests that banning toxic subreddits actually reduces the overall amount of hate speech on the site?
I suspect two factors are at work here, both of which involve (surprise!) money. The first is related to Reddit’s governance structure. As a company, Reddit has very few employees, and most of them are dedicated to site development and marketing rather than community moderation. Instead, the company relies on a fleet of volunteer moderators to monitor content on the site. This makes Reddit deeply dependent on its moderators: without them, there’s literally no way the site can continue running effectively. It also means it’s very easy for a small number of moderators to become incredibly powerful, even if they’re in charge of problematic subreddits. Such was the case with notorious Redditor Violentacrez, who, while running /r/jailbait, regularly collaborated with the site’s administrators to keep the platform free of child pornography and other illegal content. And yet the relationship between administrators and moderators remains fraught, leaving moderators frustrated with the lack of meaningful tools to do their jobs well.
The second reason is more directly tied to revenue. Often the Redditors who engage in the most objectionable behavior on the site are also the most invested and active – and they demonstrate it with their dollars. During the two weeks that /r/TheFappening was live, for example, many of its subscribers purchased Reddit Gold (a subscription service that helps the site survive, as Reddit is still not profitable) as a kind of “thank you” for hacked celebrity nudes. This meant that /r/TheFappening essentially funded the entire platform for a month.
At the same time, it’s clear that Reddit’s leaders are hoping its planned redesign will make the site more appealing to advertisers, and perhaps tacitly encourage these “problematic” subreddits to find a home elsewhere. One can only hope, but I’m not holding my breath.
Please feel free to comment.