Fun Facts

Recent Content

You’ve Been Doing This Wrong… Sleeping Longer Isn’t Helping

For years we’ve heard: “Just get more sleep.” But new sleep data shows something surprising.

This Sounds Fake… But Your Groceries Are Secretly Shrinking

You’re not imagining it. That cereal box feels lighter. That chip bag seems emptier. That snack pack looks… smaller.

How Monopoly Games Helped POWs Escape Nazi Camps

British intelligence hid maps, compasses, and real money inside WWII Monopoly games sent to POW camps. Hundreds escaped—Germans never discovered it.

The Space Pen Myth (And What Really Happened)

The space pen myth is backwards. Fisher spent his own $1M, sold pens to NASA for $6 each. Russia bought them too—pencils were too dangerous in space.

The Truth About Red Fire Trucks

Fire trucks are red from 1800s tradition, but studies show lime-yellow trucks have 3x fewer accidents. Most departments chose tradition over proven safety.

Facebook's Secret Mind Control Experiment

In 2012, Facebook secretly conducted a psychological experiment on 689,003 users without their explicit consent - and proved they could control human emotions on a massive scale.

For one week, the social media giant deliberately filtered users' news feeds, reducing either the positive or the negative posts they saw from friends and pages they followed.

The results were chilling. Users who saw fewer positive posts began writing more negative and fewer positive updates of their own, while those shielded from negativity became more positive in theirs.

Facebook had demonstrated that emotional states could be transmitted through social networks - essentially proving they had the power to make people happy, angry or sad at will.
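
To make the mechanism concrete, here is a minimal sketch in Python of the study design described above. It is an illustration only, not the researchers' actual pipeline: a tiny word list stands in for automated sentiment scoring, and every name in it is invented for the example.

```python
# Toy sketch of the experiment's design as described above (not Facebook's
# actual code). A crude word-list score stands in for the automated text
# analysis the researchers used; the word lists and names are invented here.
POSITIVE_WORDS = {"great", "happy", "love", "wonderful", "fun"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "awful", "hate"}


def sentiment(post: str) -> int:
    """Very rough score: +1 per positive word, -1 per negative word."""
    words = post.lower().split()
    return sum(w in POSITIVE_WORDS for w in words) - sum(w in NEGATIVE_WORDS for w in words)


def build_feed(candidate_posts: list[str], condition: str) -> list[str]:
    """Filter a user's feed according to the assigned experimental arm.

    "reduce_negative" drops negative posts, "reduce_positive" drops positive
    posts; anything else leaves the feed untouched (the control group).
    """
    if condition == "reduce_negative":
        return [p for p in candidate_posts if sentiment(p) >= 0]
    if condition == "reduce_positive":
        return [p for p in candidate_posts if sentiment(p) <= 0]
    return candidate_posts


candidates = [
    "What a wonderful, happy day",
    "I hate this terrible, awful weather",
    "Meeting friends for lunch",
]
print(build_feed(candidates, "reduce_negative"))
# -> ['What a wonderful, happy day', 'Meeting friends for lunch']
```

The contagion finding then amounts to comparing the sentiment of what users in each arm posted afterward.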

The study was conducted in collaboration with Cornell University and published in the prestigious Proceedings of the National Academy of Sciences in 2014. When the research became public, it sparked international outrage about digital consent and corporate ethics.

Critics pointed out that Facebook's data policy at the time only vaguely mentioned using information for "research" - there was no specific mention of psychological manipulation experiments.

What made this particularly disturbing was the scale and secrecy. Nearly 700,000 people had unknowingly become test subjects in a corporate psychology experiment. The revelation raised serious questions about the power tech companies wield over users' mental states and daily experiences.

The controversy forced Facebook to revise their internal research review policies and drew scrutiny from regulators and lawmakers over social media ethics.

But the damage was done - Facebook had proven that, through algorithmic manipulation, they could pull at human emotions like invisible puppet strings, making millions of people feel however they wanted them to feel.

This revelation highlights why getting news and forming opinions based on social media feeds is very dangerous. Every post you see, every article that appears in your timeline, and every "trending" topic has been algorithmically selected to provoke specific emotional responses.

You're not seeing neutral information - you're seeing content designed to make you angry, sad, happy, or outraged in ways that benefit the platform's engagement metrics. Your political views, purchasing decisions, and even personal relationships could be shaped by corporate algorithms rather than genuine information or authentic human interaction.
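
To see why that matters, here is a toy sketch in Python of engagement-weighted ranking in the abstract. The weights, field names, and functions are invented for illustration; they do not describe any platform's actual formula, only the general incentive: content that provokes reactions rises to the top.

```python
# A toy illustration of engagement-driven feed ranking in general (an
# assumption for this example, not any platform's real scoring formula).
# Posts predicted to provoke stronger reactions get boosted.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    clicks: int
    comments: int
    angry_reactions: int


def engagement_score(post: Post) -> float:
    """Illustrative weights: strong reactions count more than quiet clicks."""
    return post.clicks + 3 * post.comments + 5 * post.angry_reactions


def rank_feed(posts: list[Post]) -> list[Post]:
    """Order the feed by predicted engagement, highest first."""
    return sorted(posts, key=engagement_score, reverse=True)


feed = rank_feed([
    Post("Calm local news report", clicks=120, comments=4, angry_reactions=1),
    Post("Outrage-bait headline", clicks=80, comments=60, angry_reactions=90),
])
print([p.text for p in feed])
# -> ['Outrage-bait headline', 'Calm local news report']
```

Under a scoring rule like this, the calmer post loses even when more people actually clicked on it.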
