
The Financial Math Behind Cleaning Up Your Instagram Feed

Updated: May 12


The unseen cost of keeping your feed squeaky clean


It was a sunny Saturday afternoon, and I was doing what many of us do when we’re half-bored and half-curious: mindlessly scrolling through Instagram. 


Brunches, Bali trips, babies in designer gear — the endless carousel of curated perfection spun by without much thought. My biggest concern was whether I’d ever master latte art.


Meanwhile, across the world, someone else was scrolling through a very different Instagram. Their feed was flooded not with sunsets and smoothie bowls but with torture videos, graphic violence, and the worst humanity has to offer - and they were being paid less per day than the cost of a Pret sandwich to clean it all up.

Keeping Instagram "safe" is a billion-dollar operation. In 2024, Meta - the parent company of Instagram and Facebook - earned over $164.5 billion in revenue, mostly from advertisers who demand squeaky-clean, brand-safe platforms. The secret most users don't realise is that while AI does some of the heavy lifting, the truly grim work still falls to human hands. Thousands of them. Sitting in nondescript office parks in places like Accra, Ghana, these workers spend their days - and sometimes their nights - sifting through the internet's ugliest content.


Their reward? According to contracts seen by the Guardian and the Bureau of Investigative Journalism, salaries start at around £64 a month. Even with “performance bonuses” that could, in theory, push earnings up to £243, it’s still a paycheck that barely covers basic living expenses. In cities like Accra, monthly costs for rent, food, and transport often hit £400-£500. To put it bluntly: some moderators make less in a month than you’d pay to fix a cracked iPhone screen.


Yet their job is one of the most psychologically brutal in the tech industry. Imagine spending eight hours a day looking at images you can't unsee.


And here's the kicker: most of these workers don’t even technically work for Meta. They’re employed by outsourcing firms like Majorel and Samasource, shielded from Meta by layers of contracts and legalese. Inside their workplaces, they’re constantly reminded: “You don't work for Meta.” Even though they follow Meta’s guidelines, use Meta’s tools, and protect Meta’s brand.


This outsourcing is no accident. It's a ruthlessly efficient business move. If Meta hired London or New York-based moderators, it would cost tens of thousands of pounds per worker each year - and that’s before factoring in real mental health support. By outsourcing to regions where labour is cheaper and protections are weaker, Meta cuts costs dramatically. Some moderators in Ghana make about 10 times the local minimum wage - but that "minimum" is so low, it’s hardly a meaningful improvement.


On paper, there are mental health resources - “licensed” psychologists, resiliency training, mandatory therapy sessions. In practice, moderators describe support that feels superficial at best and invasive at worst. Some report that private therapy conversations were shared with their managers. Privacy, even in the most basic sense, becomes a luxury.


Now, a growing movement of African moderators is pushing back. In Ghana, a legal team backed by the UK nonprofit Foxglove and Ghanaian firm Agency Seven Seven is preparing to sue Majorel and Meta. If successful, it would set a major precedent: recognising psychological trauma as a legitimate workplace injury and forcing Big Tech to reckon with the true costs of content moderation.


This isn’t just a fight about wages. It’s a fight about visibility. About recognising that every polished ad you swipe past, every feel-good Reel you binge-watch, only exists because someone else shielded you from the internet’s most horrific realities.


Next time you’re scrolling through brunches and beach shots, take a second to think about who’s making that possible. The brutal math behind Instagram’s clean, happy façade isn’t just about low pay or legal loopholes - it’s about a system that turns human suffering into just another line item on a quarterly report.

Maybe it’s time we started looking a little closer.

© 2025 by The LighHouse Review. All Rights Reserved
