Is TikTok Censoring Pro-Abortion Content?

Photographed by Anna Jay
Sex and censorship have always been a divisive topic, almost always met with pearl-clutching protest. 'Think of the children' attitudes are infiltrating social media feeds and causing a spree of virtuous violence against educators, activists and sex workers. Abortion is no different. Now, pro-choice groups focused on drawing attention to abortion access and empowering their communities are falling victim to conservative policing on one of the largest and fastest-growing social media apps on the market: TikTok.
This censorship is endemic across the app but doesn’t seem to be affecting right-wing groups or, for that matter, pro-lifers. One activist group, Rogue Valley Pepper Shakers (@rvpeppershakers), which defends Planned Parenthood clinics from so-called 'pro-life' harassment and posts information on abortion resources, is one of many groups facing unnecessary censorship from TikTok. 
The group, which amassed 190.4k followers and 2.5 million likes, has now left the app. Its absence from TikTok is a signifier of something even more insidious: the weaponisation of social media platforms’ terms of service against women’s rights. Rogue Valley Pepper Shakers has been hidden, shut down and had content removed because of apparent 'breaches' of TikTok’s terms of service. On one occasion, the group had a post removed for supposed violations of “minor safety”, which suggests that the post was promoting harmful content geared towards minors (people under the age of 18). In fact, the content had been about LGBTQIA rights, firmly rubbing salt in the wound and highlighting antiquated bias against LGBTQIA parents.
This is a pattern. Another pro-choice, pro-abortion account, Reproductive Rights Coalition (@reprorightscoalition), is experiencing similar issues on the app. In one video they discuss getting banned for 'violating community guidelines', when it appears no violations have taken place. 
Jack Cooperman, who was running the TikTok and Instagram accounts for the group, tells Refinery29:
"We have no way of evaluating our own actions to make sure we’re compliant and able to conduct our social media presence in a manner which doesn’t cause censorship. 
"We want to show what's happening in our corner of the world. The truth of the matter. We want to gain support in defending women and the rights of anyone with a uterus...I wish the apps would spend more time looking at what they censor and giving better feedback so we can all get what we want. We have enough to fight for. We shouldn’t have to fight them too."
TikToker Whitney (@prochoicewithheart) has experienced worsening censorship since TikTok’s move to automated communication and moderation. Incidentally, this is also how Facebook – which owns Instagram – moderates its platforms, and it has resulted in pro-women content (including the use of the phrase 'men are trash') being clumsily censored by algorithms in a process which is often referred to as 'shadow banning'.
Whitney regularly posts about America’s landmark abortion case Roe v. Wade and highlights politicians across the US who are voting against abortion rights. She tells me that she’s finding it harder and harder to get help when her work is censored. "I used to talk to someone. Now, the automated system is silencing and hurting small creators, like myself, who are vulnerable to troll attacks," she tells me. "Reporting should be used to keep the community safe, not silencing creators for sharing information on human rights." 
For many users, like Whitney, getting accounts reinstated on TikTok is starting to feel like a battle of Sisyphean proportions.
"Each time I’m banned I contact TikTok immediately and wait until they 'unban' me. Each time it’s resolved they tell me they have reviewed my account and there are no community violations," Whitney laments. "It takes me, my friends and followers emailing repeatedly to get my account reinstated. It’s an exhaustive process. The fact that I have been silenced for the past five weeks amid abortion bans sweeping the USA is concerning and also suspect." 
Instagram isn’t any better for Whitney, either. In a post, she revealed a DM in which her infant son had been the subject of rape threats and threats were made to her life. The account that sent the abuse remains active. It raises the question: how come these threats and activities don’t violate terms of service but providing information on where to access safe sexual healthcare does?  
Dr Carolina Are, an online moderation researcher at City, University of London and pole dancing instructor, is no stranger to the apparent hypocrisy of social media apps: her TikTok account has been banned four times since February 2021. She tells Refinery29: 
"Rather than a double standard, I think there’s a moderation mechanism that’s trying to be conservative at all times and that has painted nudity and anything associated with sex as ‘risky’. It quickly becomes a witch hunt at the hands of the most conservative users who are deciding what should go up and what shouldn’t. All the while the platform acts completely powerless against it when clearly, they’re not. 
"Users are not always going to be the best judge of things. The fact that the platform enforces what these users are doing is really worrying. Especially when their terms of service are unclear and miscommunicated," she adds. 
This issue concerning clarity could have something to do with pervasively puritanical views about abortion and sex on both sides of the Atlantic, as well as the fact that nudity seems easier for these platforms to detect than hate speech. 
Portia Brown, a sex educator and sex coach, says that "the juice isn’t worth the squeeze" when it comes to TikTok, and Instagram isn’t much better. 
"Not ever in the history of me having this account has Instagram ever supported me, at all, in any form from abuse, be it men in my DMs saying inappropriate things and harassing me and threatening me, or anyone else who is saying that I’m going to hell and so on and so forth, even when I report it. Nothing happens," she tells me.
There has been speculation that this banning process is felt more by marginalised communities. Portia confirms this: "Sex workers, sex educators, people who talk about sex on the internet, people who are fat, people who are Black, people who are visibly queer, are experiencing an increased level of censorship and wrist slaps from Instagram and their moderators, versus cis hetero white thin able-bodied people. And with that information, I know for a fact it’s not the community that’s reporting my content, it’s Instagram doing what Instagram does."
Like many of her peers, Portia has experienced unregulated censorship on TikTok, too. 
"I pretty consistently had everything I tried to create taken down and marked as inappropriate, which I didn’t anticipate because I know other sex educators are using TikTok in this way. I felt like I was following the guidelines. I was censoring my language. Even still, a lot of my content was taken down save for a few that remain. It got so bad that I haven’t posted anything in six months because it was just insufferable." 
The rules and regulations designed to keep users safe are now doing the opposite. These restrictions are preventing experts, educators and activists from striving to better their community. All the while, users are left to fend for themselves against an onslaught of online abuse and censorship.
"If algorithms, our government and the people running Instagram and TikTok can’t tell the difference between [progressive and problematic content], we have a real fucking problem," adds Portia.
Knowingly or otherwise, TikTok, once praised for its educational resources, is de-platforming the most vulnerable in society while oppressing those who are trying to raise awareness of systemic inequality. In short, the removal of this content is preventing access to education, resources and community. It's a stark contradiction of TikTok's entire UK TV advertising campaign, which is built on the premise of learning, sharing and educating.
TikTok denies the claim that 'mass reporting' leads to automatic removal of content or an account. The company told Refinery29 that it reviews all reports in line with its community guidelines and that moderators remove those which violate its policies. 
A spokesperson for TikTok told Refinery29: "Our community of creators is vibrant and diverse, and everything we do at TikTok is about providing a safe space for people to express their ideas and creativity, no matter who they are.  We are open about the fact that we don't always get every decision right, which is why we continue to invest at scale in our safety operations."
Recently TikTok has introduced new tools to promote kindness on the platform. A 'filter all comments' feature gives creators the power to decide which comments will appear on their videos. When enabled, comments aren't displayed unless the video's creator approves them using the new comment management tool.
In addition to empowering creators with more tools, they want to encourage TikTokers to treat one another with kindness and respect. A prompt now asks people to reconsider posting a comment that may be inappropriate or unkind. It also reminds users about the community guidelines and allows them to edit their comments before sharing. 
However, these opaque guidelines leave users to self-police and moderate comment sections without support, manually typing in the kind of abuse they don’t want to receive. And although asking people to be nice to one another is, well, nice, it doesn’t prevent hate speech, harassment and abuse. It also leaves decisions about what is and what is not appropriate to moderators who, judging by the number of accounts which are reinstated time and time again, are consistently getting it wrong.
When approached, Instagram said it didn’t find any evidence of reduced visibility on any of the accounts spoken to for this article. Regarding sex educators, activists and workers, Instagram had the following to say: 
"We understand that sex workers often disagree with what we do and don’t allow on Instagram, but with people as young as 13 using our service, we need to consider the responsibility we have to our youngest users. We don’t want to marginalise sex workers, and we’ll continue to listen and respond to their concerns."
