Charity's Facebook Ban: AI Mistake Flags Heroin Name - Hundred Heroines Reinstated (2025)

Imagine a charity dedicated to celebrating female photographers, only to be mistaken for a drug-promoting organization by a tech giant’s AI. This is the baffling reality faced by Hundred Heroines, a UK-based charity that saw its Facebook page wrongly flagged and removed not once, but twice, for allegedly violating community guidelines on drugs. But here’s where it gets even more frustrating: the AI mistook the charity’s name for a reference to the opioid heroin, despite the organization’s clear focus on art and photography. After a month-long battle, the page was finally reinstated—with no explanation or apology from Facebook.

Founded in 2020, Hundred Heroines operates from a physical space in Nailsworth, near Stroud, boasting a collection of 8,000 items that highlight the contributions of women in photography. The charity’s Facebook group is a lifeline, accounting for about 75% of its visitors. Yet its founder, Dr. Del Barrett, a former president of the Royal Photographic Society, describes the takedowns as ‘devastating’. ‘AI technology picks up the word “heroin” without an “e”, and suddenly we’re banned,’ she explains. ‘It’s nearly impossible to reach anyone at Facebook, and it’s deeply frustrating because we rely on the platform to connect with our audience.’
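Meta has not disclosed how its moderation models actually work, but the failure Barrett describes resembles naive substring matching: ‘heroin’ is literally contained inside ‘Heroines’. A minimal sketch (with a hypothetical blocklist; the term list and function names are illustrative, not Meta’s) shows how a substring filter trips on the charity’s name while a simple word-boundary check does not:

```python
import re

BANNED_TERMS = ["heroin", "fentanyl"]  # hypothetical blocklist for illustration

def naive_flag(text: str) -> bool:
    """Flags any post whose text merely *contains* a banned term."""
    lower = text.lower()
    return any(term in lower for term in BANNED_TERMS)

def boundary_flag(text: str) -> bool:
    """Flags only whole-word matches, so 'Heroines' no longer trips 'heroin'."""
    return any(re.search(rf"\b{re.escape(term)}\b", text, re.IGNORECASE)
               for term in BANNED_TERMS)

post = "Hundred Heroines celebrates women in photography."
print(naive_flag(post))     # True  -- 'heroin' is a substring of 'Heroines'
print(boundary_flag(post))  # False -- the word boundary blocks the false positive
```

Production systems are far more sophisticated than either function, but the example illustrates why a name like ‘Hundred Heroines’ is exactly the kind of input that exposes a classifier built without enough linguistic nuance.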

The irony? In 2024, Meta ramped up its efforts to combat drug-related content amid the US opioid crisis, which claimed 80,000 lives last year. While Meta claims to have ‘robust measures’ to detect and remove drug-related posts, its AI tools have proven fallible—and the consequences can be Kafkaesque. Users often find themselves trapped in a loop of automated responses, with little recourse for human intervention. ‘Should we change our name?’ Barrett asks. ‘Why should we have to alter our brand just because of Facebook?’

And this is the part most people miss: Meta’s AI moderation tools, while central to its content review process, have been criticized for their lack of nuance. Earlier this year, the company faced backlash for mass bannings and suspensions, which it attributed to a ‘technical error.’ Yet, incidents like Hundred Heroines’ ordeal raise questions about the balance between automation and accountability. Is it fair for organizations to bear the brunt of AI mistakes? And more importantly, what steps should tech giants take to ensure their algorithms don’t harm innocent users?

Meta’s statement on its website emphasizes its commitment to safety and community standards, but cases like this highlight the gaps in its system. As Barrett puts it, ‘It’s both scary and laughable. These bots are running the world, yet they can’t distinguish between a woman and an opioid. Heaven help us.’

What do you think? Should Meta be held more accountable for AI errors? Or is this simply the cost of relying on automated systems? Let us know in the comments—this conversation is far from over.
