Meta censored a post on lesbian relationships, proving its priorities are all wrong
Tech

Scoopico
Published: May 11, 2026
Last updated: May 11, 2026 11:02 pm


In April, Meta quietly reversed itself after removing an Instagram post honoring older lesbian relationships in Brazil. The removed post was not sexual and contained nothing harmful to minors. It documented a moment in history when lesbians were forced to hide their relationships as “roommates” or “gal pals,” and their love was scrubbed from the public record. Meta removed it anyway.

Meta cited its hate speech rules. The Oversight Board later acknowledged what should have been obvious from the start: The Brazil case was an instance of over‑enforcement against a marginalized community, driven by automated systems that could not read context, reclaimed language, or even the full post itself. The content was restored only after outside intervention and advocacy from the LGBTQ+ community.


This case is now being treated as a narrow content moderation error, but it is a clear warning about what happens when lawmakers push platforms to police content instead of fixing design. Across the country, states are rushing to “protect kids online” by restricting access to social media or pressuring companies to remove vaguely defined “harmful” content. What happened in Brazil shows the human cost of that approach.

When platforms are incentivized to remove speech quickly and at scale, they do not become better judges of nuance. Social media becomes a blunt instrument, and the first people hit are those whose stories require human context and radical empathy to be understood.

If lawmakers actually want to protect kids, they should stop asking platforms to decide which stories are acceptable and start regulating core design choices that cause harm in the first place, like endless scroll, engagement‑based recommendations, and surveillance‑driven feeds.


Here’s why that distinction matters, especially for LGBTQ+ kids and other marginalized communities, like neurodivergent kids. LGBTQ+ young people are far more likely than their peers to rely on online spaces to find community, information, and support, often because those things are unavailable or unsafe at home or school. But they are also significantly more likely to end up in unsafe online interactions: harassment, grooming, doxxing, or being pushed into high‑risk spaces they didn’t seek out. 

In Australia, after a social media ban on anyone under 16 was enacted, disability rights advocates noted that autistic youth were cut off from some of the only support and peer networks available to them. 


Recommendation systems don’t understand vulnerability, but they understand engagement. When a queer kid searches for community, platforms often respond by aggressively amplifying whatever keeps them clicking. Usually, this means increasingly sexualized content, adult strangers, extremist rhetoric, or predatory accounts that know exactly how to exploit isolation. 

Infinite scroll makes disengagement much harder for adolescents, according to the Electronic Privacy Information Center, and harder still for those in vulnerable communities. Algorithmic “friend” or “account” suggestions collapse the boundaries between teens and adults. Weak default settings make it difficult to block, mute, or step away.

Young people, not just LGBTQ+ young people, are exposed to harm online because platforms are built to extract attention, not protect users. Parents are right to be worried and to advocate for change. But a content-based framing misses the real problem. 

The greatest risks kids face online don’t come from a single bad post slipping through moderation, but from automated systems that push content at kids they didn’t ask for, connect them to people they don’t know, and keep them scrolling long after warning signs appear.

Policymakers at both the state and federal levels need to design regulations that address those risks directly. Age‑appropriate design codes don’t tell platforms what speech to allow, but they can tell platforms how to behave. Design codes require safer defaults, like limits on behavioral profiling, stronger blocking tools, reduced amplification of unsolicited recommendations, and guardrails that slow down virality and compulsive use. 

The public should advocate for product reform rather than measures that infringe First and Fourth Amendment rights. Design codes reduce the chance that a curious or lonely kid is algorithmically funneled into danger, as I was: searching for community and nudged toward risk by systems that did not care who I was.

Age‑appropriate design codes offer a way out of this mess. By regulating how platforms are built rather than what people are allowed to say, design code laws reduce harm without turning companies into cultural censors. They don’t require platforms to interpret reclaimed slurs, queer history, or political speech. Companies should instead be required to stop engineering addiction and risk. 

We don’t need more content or platform bans. We need fewer harmful systems. If we’re serious about protecting kids online, especially the ones already most at risk, this case reminds us exactly where to start.

This article reflects the opinion of the writer.

Lennon Torres is the Movement Director at the Heat Initiative and Founding Partner of The Attention Studio. 


