Why the Internet Punishes You for Being Kind
Empathy is a losing strategy on platforms built to reward conflict.
A confession about loneliness appears in a comment section. A question asked in good faith. A defense of someone being piled on.
Within minutes, the replies arrive. Sarcasm. Deliberate misreadings. Screenshots shared to larger accounts for mockery. The original words disappear. Maybe the account disappears with them.
This happens thousands of times a day. It is so common that most people have stopped noticing it. They scroll past the wreckage the way commuters pass a car accident.
The question people keep asking is: why is everyone online so angry?
The answer is structural. The platforms are designed to make anger the dominant visible behavior, to suppress everything else, and to punish anyone who breaks the pattern.
How Social Media Algorithms Reward Cruelty Over Kindness
The platforms that host public conversation are built on advertising revenue. Advertising revenue depends on engagement. Engagement increases when people are upset.
The data backs this up. William Brady and colleagues at NYU found that each moral-emotional word in a tweet increased its spread by roughly 20 percent. Conflict travels further than kindness on every major platform. The algorithms learned this on their own.
The platforms know this is happening. Frances Haugen, a former Facebook product manager, disclosed an internal presentation from 2018 that read: "Our algorithms exploit the human brain's attraction to divisiveness." When Facebook changed its feed to prioritize "meaningful social interactions," divisive posts were the ones generating the interactions the algorithm rewarded. The company saw the data and kept the system in place.
The mechanism is simple. The platform measures what people do. People react to conflict. The platform shows more conflict. People grow more combative. The cycle accelerates.
Why Online Toxicity Is a Design Problem
The problem extends past angry comment sections. The platforms are changing which behaviors survive online and which ones die.
Think of it as selection pressure. Every person posting online is competing for visibility. The algorithm decides what gets seen. The algorithm favors outrage, mockery, and combative framing. So the posts that gain traction are the ones that attack, ridicule, or provoke. The posts that fade into silence are the ones that extend grace, ask genuine questions, or defend unpopular people.
Over time, this trains everyone. Outrage begets engagement. Engagement begets more outrage. The people most susceptible to this conditioning are moderates. Post something kind and nothing happens. Post something cruel and the numbers go up. Moderate voices learn that moderation gets them ignored, so they either escalate or leave.
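The selection loop described above can be sketched as a toy model. Every number and name here is an illustrative assumption, not platform data: posts get a made-up "provocation" score, visible posts convert impressions into engagement at a rate that rises with provocation, and the feed shows only the current engagement leaders.

```python
# Toy model of engagement ranking. All numbers are illustrative
# assumptions, not measurements from any real platform.
posts = [{"id": i, "provocation": i / 9, "engagement": 0.0} for i in range(10)]

def engagement_rate(post):
    # Assumed conversion: calm posts still earn a little engagement,
    # provocative posts earn far more per impression.
    return 0.1 + 0.8 * post["provocation"]

def run_feed(posts, rounds=100, feed_size=5):
    # Cold start: every post is shown once.
    for post in posts:
        post["engagement"] += engagement_rate(post)
    # After that, only the top of the engagement ranking is visible,
    # so engagement compounds for whatever is already winning.
    for _ in range(rounds):
        ranked = sorted(posts, key=lambda p: p["engagement"], reverse=True)
        for post in ranked[:feed_size]:
            post["engagement"] += engagement_rate(post)
    return sorted(posts, key=lambda p: p["engagement"], reverse=True)

final = run_feed(posts)
print([p["id"] for p in final[:3]])          # → [9, 8, 7]: the most provocative posts win
print([p["engagement"] for p in final[5:]])  # the calmer half never resurfaces
```

Two things fall out of even this crude sketch: the most provocative posts dominate the visible feed, and the calmer posts are frozen at their cold-start numbers forever, not because anyone disliked them, but because the ranking never showed them again.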
The visible internet looks far more hostile than the population it represents. The loudest and meanest voices are the survivors of an algorithmic selection process that filters for aggression. The quiet majority is still there. The architecture just made them invisible.
When researchers have tested what happens if you change the curation, attitudes change with it. Push hostile content down in the feed and people feel less hostile toward the other side. The anger is a product of the ranking. Change the ranking and the anger recedes.
The platform chooses conflict on behalf of its users. The users would choose something else if given the option.
What Happens When Kind People Stop Posting
The people who leave are the ones the platforms can least afford to lose.
Most people have watched someone get torn apart online and decided to keep their own mouth shut. The ones most likely to self-censor are the ones who were going to say something honest, something vulnerable, something worth reading. They type it out, look at it, and delete it. The platform never sees the best version of what its users had to say.
The spiral of silence is real. The stage rewards performers, and performers learn that cruelty gets applause. Everyone else watches from the seats and absorbs the show.
The effects reach people who never post at all. A 2014 experiment on 689,003 Facebook users demonstrated that emotional states transfer through text alone, without any in-person contact. Seeing cruelty makes people feel worse even when the cruelty is aimed at someone else. The bystanders absorb the damage from fights they never joined.
The trend has one destination. The kind people leave. The moderate people go quiet. The people who remain visible are the ones willing to perform cruelty for an audience.
The Other Side
The strongest counterargument is that the internet also connects people who have nowhere else to go. For people in isolated or hostile environments, the internet can be the only space where they encounter others like them. That is real and it is worth protecting.
The platforms can be lifelines. They can also be machines that grind down the people most willing to be open. Both are true. The question is whether the architecture can be changed to preserve the first without accelerating the second.
Right now, the architecture is choosing conflict. The engineers know it. The internal documents confirm it.
The internet is mean because meanness is profitable. Kindness generates less engagement. And engagement is the only metric that pays.
In Crushed Between, the gods who feed on conflict never have to start fights. The architecture does it for them. All they do is watch the numbers climb. If the machinery described here sounds familiar in a way that feels mythological, the serialized fiction explores what it looks like when the machinery wakes up. Read the latest chapter on Substack.
This article is for informational purposes only and does not constitute medical, financial, or professional advice.