The Big Problem With The Viral ‘Propaganda I’m Not Falling For’ Trend
TW: This article discusses disordered eating.
Posts with the text “Propaganda I'm not falling for” and a list that might include diet culture, the rise of conservatism and alternative milk choices, to name a few, have been dominating our social feeds of late. The videos seem to suggest that if you can spot a “trend” or “problem”, you can choose whether or not to participate in it. But in a world where we seem to be at the mercy of the algorithm, and with social media increasingly becoming an avenue for mental health support, how much say do we really have?
Let's start with the algorithm, which, despite what we're sometimes led to believe, is not inherently evil. It can actually be great, showing us content and advertisements we find engaging based on specific signals: which posts we like, comment on and share, how long we spend viewing a post, and what we have in common with the users we interact with. However, when platforms tune their algorithms to serve their own ends, such as maximising engagement, those same algorithms can steer us towards content we never chose.
For example, in 2018, Facebook saw a decline in engagement and reconfigured its recommendation algorithms. A 2021 study found that the reconfiguration increased the virality of outrageous and sensationalised content. And no, this was not a coincidence: Facebook's internal documents stated that "Misinformation, toxicity, and violent content are inordinately prevalent among reshares.” Similarly, a 2020 study found that YouTube's recommendations can lead people towards more extremist content.
The bigger issue is that stepping away from polarising content isn't always enough. A New York Times article reported that after a user views 20 widely shared TikToks sowing doubt about election systems, the platform pushes more “election disinformation, polarising content, far-right extremism, QAnon conspiracy theories and false Covid-19 narratives”, even when the user sticks to neutral search terms. We can click or scroll away, but the algorithm can still encroach on our autonomy.
Besides misinformation and polarising content, we can't talk about social media algorithms without discussing thinness culture, or “thinspo”. This content, also known as “pro-ana”, once lived on Tumblr; it has since made its way to TikTok, carving out a niche as “SkinnyTok”. This side of TikTok is filled with low-calorie recipe videos, What I Eat In A Day videos, exercise routines and more, all of which glorify thinness and disordered eating veiled as a "healthy" lifestyle.
In June, TikTok "blocked search results for #skinnytok since it has become linked to unhealthy weight loss content". However, how much impact can that have when the algorithm delivers this content to us without our needing to search for it? Users currently diagnosed with an eating disorder are 4,137% more likely to have the next video the TikTok algorithm delivers be eating disorder-related. Similarly, for those struggling with disordered eating, the next video is 322% more likely to be diet-oriented. The recent Butterfly Body Kind Youth Survey also found that 57.2% of young people (aged 12-18) report that social media makes them feel dissatisfied with their bodies.
At the same time, 73% of young people use social media for mental health support. That so many turn to social media rather than the mental health care system is worrying: in June 2025, an investigation by The Guardian found that over half of the top 100 mental health TikToks contained misinformation. From suggesting that everyday experiences are symptoms of borderline personality disorder to misusing therapeutic language, this content has users increasingly self-diagnosing based on a 10-second sound bite.
As someone who could not access mental health support for an eating disorder for a couple of years, I've experienced the pipeline where helpful recovery tips turn into glorifying low-calorie, high-protein diets and strength training in the name of "health". At 16, it was impossible not to fall for this, and even now, at 22, it can be a battle; on hard days I have to delete the apps altogether.
While we like to think we have full control over what we do or do not “fall for”, our algorithms clearly also have a say. But the long-term answer to extremist or thinspo content is not to delete or ban social media. We should all be cautious of what we see on the internet, take things with a pinch of salt and do our own research, of course, but social media platforms and governments also have a role to play in safeguarding us from these risks. We need more than a hashtag ban; we need platforms to stop pushing the content that leads us down this path in the first place. We need our government to regulate content such as deepfakes and misinformation and, instead of banning social media, to implement education on how to use it safely.
We need to address the systemic issues. Funding the mental health system so that fewer people turn to social media for help would be a great place to start.
If you are struggling with an eating disorder and need support, please call Butterfly at 1800 33 4673.