Facebook just announced one of its most comprehensive bans to date. Yesterday, in a press release, the company shared that it is expanding previous measures designed to disrupt the ability of QAnon — a conspiracy theory turned movement that claims there is a "deep state" mechanism out to get President Trump — and other so-called "militarised social movements" to operate and organise on its platforms. Now, Facebook will be removing any Facebook Pages, Groups, and Instagram accounts representing QAnon, even if they do not contain any violent content.
Facebook says it has removed over 1,500 Pages and Groups related to QAnon and containing discussions of potential violence, and over 6,500 Pages and Groups tied to more than 300 militarised social movements in the first month following the announcement of its initial plans to curb QAnon content. However, perhaps because of growing outside criticism, the company has decided that isn't enough.
According to the Associated Press, Facebook will make decisions on which Pages, Groups, and accounts to ban based on a variety of factors, including their names, "about" sections, and discussions within them. Facebook says it isn't only relying on user reports to discover Pages, Groups, and accounts associated with QAnon. Its Dangerous Organisations Operations team is proactively searching for content that warrants removal, per the press release.
Facebook isn't the only social media company working to crack down on the spread of QAnon on its platforms. In July, Twitter banned thousands of accounts linked to QAnon content for violating the platform's terms and conditions, specifically its rules against targeted harassment. Then, in August, Twitter removed a tweet that was retweeted by President Trump because it violated the platform's misinformation policies. The tweet in question cited a QAnon-linked Facebook post, which downplayed the actual number of coronavirus deaths. Over the summer, too, TikTok removed the QAnon hashtag, which at one point had 82 million posts. Still, with a critical election quickly approaching, more QAnon supporters making their way into politics, and alternative facts growing ever more mainstream, these measures from Facebook and others may be too little, too late.