How the Internet Ruined Our Passion for Politics — And How We Can Get It Back
If we blame each other, we’re looking the wrong way.
The generation that grew up on the internet is turning silent online.
When K*, 22, organized a queer collective at her college, she didn’t expect it to turn against her. There were objections to her presence as someone who wasn’t queer. But here’s the catch: K is closeted. The backlash – particularly on social media – was swift. Her peers began calling the collective queerphobic – not knowing that they were silencing a queer person in the process. “I stopped saying anything online… Because I always look at myself through the lens of these young queer activists because I feel they’ll cancel me,” she said.
There’s a righteousness to The Discourse online that’s led to a lot of in-fighting, and everyone’s tired now. Even for the most vociferous contributors to it, the passion for social issues has fizzled out over time. Many tend to blame one another; think pieces and pundits place the onus of bad discourse on those participating in it. But in an era of speech being mediated through social media algorithms, the real problem is how undemocratic platforms erode our agency by controlling how our passions are expressed.
We’d like to be heard online. But is anyone really listening? What used to be a productive space for engagement has quickly devolved into an impulse to do anything to gain traction. The shift exacts a heavy mental toll that ultimately distances us from our own passions – or weaponizes them against our better judgment.
Whistleblower Frances Haugen spoke about how Facebook’s algorithms incentivize angry, polarizing engagement for the benefit of advertisers. This is also why social media makes us feel worse about the world – it mediates what we see of it and promotes what makes us angry or upset. Research, too, shows that anger is a powerful motivator online: it induces people to seek out more information. The algorithm boosts bombastic, loud content that elicits outrage, as a 2021 report on Facebook prioritizing the ‘angry’ emoji showed. Rage baiting, then, has become a specific tool that has spread to other platforms: TikTok creators are even incentivized to employ it for more views.
But this rage has come to shape the speech of progressives too – destroying the discourse from within.
Sneha Rooh, 34, who is bisexual, wrote an article suggesting that a character in the film Maja Ma could be bisexual – and was called lesbian-phobic online. She didn’t counter the charge – instead, she kept her subsequent opinions to herself. “It’s almost as if my standpoint on issues is questioned.”
For others, internet-fuelled backlash has made them more passionate – but without trust in their own language and tone to express it.
“Given my network is majorly upper caste/class, privileged cis- het people, my views on almost all topics… have received dissent. This affected my tone of sharing my views, if not the views themselves,” said Kevin Jain, 25.
There was a time when talking about issues online held promise: it was the new, revolutionary technological ace up a political public’s sleeve: aiding protestors in mobilizing support, providing information where traditional news outlets didn’t, and disseminating ideas for causes one believed in that ultimately aimed to speak truth to power. It was a passionate, lively place – rippling with the frenetic energy of people who cared, who had things to say, and now had a platform to say them.
Today, all rhetoric must be couched in rage and abrasiveness in order to make its way through to an audience. Ironically, this alienates people, and creates an environment of fear – even in safe spaces. The result is that political expression is directed against someone rather than for something. Especially if that someone is an individual whose politics largely align with ours, save for a few finer points.
When the surest way to be amplified online is to use the most jarring, attention-grabbing vocabulary or stats possible, it speaks to a larger problem with the environment of the internet, where attention is mined and weaponized against our will.
Prashanth Kanduri, 31, once engaged passionately on issues. He no longer bothers, because he believes that “winning arguments online has zero consequences on the voting patterns on the ground.” He’s speaking to a sensibility that’s come to pervade all those who used to engage online but are now simply too tired: what’s the point? For a generation that grew up on the internet and was the most vocal about its politics, this miasma of dispassion is a bleak conclusion. For Anusha Sreekant, 24, the circularity of the conversation has turned her initial passion for issues into irony, tiredness, rage, and “plain despair.”
All this is happening amid an unprecedented rise in hate speech, cyber-bullying, and alt-right trolling online – which means that people who engage in bad-faith call-outs of people ostensibly on “their side” are adding to the problem. What gives?
The language of exaggeration, or even “reaching” to make a political point isn’t new. It’s in fact as old as politics itself. What’s new is the volume, access, and the democratization of the pulpit, who owns the pulpit, and what’s being extracted at the pulpit. In other words, we may be tempted to blame one another for bad takes and ruined discourse – but we’re looking the wrong way.
“I think that the focus on content and speech on social media platforms is misplaced. Instead, we need to be looking much closer at the role that algorithms play in amplifying certain kinds of content,” says Urvashi Aneja, the founder of Digital Futures Lab.
“The way that even we’re talking about regulating speech… focuses too much on the individual, and not enough on the structural causes that are contributing to this.”
Emerging research also shows that algorithms can influence our behavior in tangible ways offline too – from whom we vote for to dating decisions. “It’s not a bug – it’s a feature,” says Aneja.
Behavioral data extraction based on the ad tech revenue model isn’t just allowing companies to profit from us – it’s changing how we think, engage, and talk. For a generation that saw social media platforms as a promise, it’s a devastating blow: “What is abrogated here is our right to the future tense,” said Shoshana Zuboff, who coined the term “surveillance capitalism.” The generation that grew up online is now left with less autonomy to speak than ever.
According to media studies professor Andrew Dodd, people are walled into information bubbles more than ever in the social media age. And because behavioral data shapes what our feeds look like, the valid passion and anger around issues is often directed at entirely the wrong people – those who are on “our side” – rather than at the systems that perpetuate those issues.
For New York Times journalist Charlie Warzel, this is engineered context collapse – platforms are designed in a way that isn’t conducive to nuance or adequate context before opinions form. That unknown individuals become Twitter’s “main character” for a time because of relentless dogpiling isn’t any one individual’s fault – it’s how the algorithm picks up a tweet that’s generating discourse and puts it in people’s “trending” sections. This is why conspiracy theories and dogpiling over harmless tweets fall into the same category as far as the algorithm is concerned: both become trending topics, and both fuel rage in a way that shuts down conversations entirely.
But there’s another issue at play.
“Beyond algorithmic amplification, there’s certain design choices that have been made that enable this kind of toxicity to grow and spread. So something as small as allowing people to comment on each other’s comments,” Aneja adds.
Take the quote-tweet feature on Twitter, which allows users to pile onto a single tweet with ease – and to disseminate bad ones far and wide. Quote-tweets often create multiple loops of reactions-to-reactions that, by the end of it, turn the discourse into a soup of rage-baity words and indignant call-outs.
How do we do things differently, and reclaim our passion for the things we care about?
For some, the answer simply seems to be to log off. Shreya Thaker, 26, has disengaged completely. She used to be active and vocal online during the anti-CAA protests in 2019, but it’s different now. Further, views shift as we grow – picking a side and sticking to it doesn’t help, in her opinion.
In the end, “all sides behave in similar ways.”
This is true of social media, where progressive discourse gets so impassioned and desperate, it sometimes ends up taking a horse-shoe turn toward the right.
Ultimately, social media platforms as they currently are don’t answer our need to be heard. Instead, they manipulate our speech such that we’re convinced outrage at someone else is the only way to speak online. But what if the algorithms stop rewarding some kinds of speech over others?
Changing platforms’ content moderation strategies, algorithms, and design is not only essential to reclaiming how we speak online – it’s also possible, and tech activists have long been calling for exactly these reforms.
“I don’t think it’s so bleak that one cannot imagine an internet or social media platforms where we don’t have space for genuine, meaningful political engagement,” says Aneja.
Transparency in design, moderation practices, and algorithmic decisions, she says, are key.
The lack of transparency about how algorithms work has unintended effects of its own: paranoia about algorithmic manipulation can itself breed suspicion, distrust, and conspiracy theories, further distorting the discourse.
“This opacity births a lot of myths about the ‘algorithmic overlord’ controlling us or manipulating our experiences online, mostly because opacity breeds distrust,” says Aliya Bhatia, a policy researcher from the Center for Democracy and Technology.
The Santa Clara Principles, moreover, call on platforms to give users more agency in determining their online experience – by giving them control over their feed and what they see in it.
But even for the everyday social media user, knowing how big tech shapes how we speak can restore some agency. If tech influences what we see, whom we engage with, and rewards our anger – the best way to destabilize the machine, for now, is to recalibrate anger itself. Anger is essential to passion – it’s what powers movements and revolutions. Big tech diverts its path – leading to destructive in-fighting that ruins movements from within. What if the political commitment in the age of surveillance capitalism, then, is to resist individualized anger?
In isolation, anger about social injustices – like sexism, racism, homophobia, ableism – comes from a place of deep hurt, vulnerability, and a vision for something better than what we have. But it’s not our fault that we’re compelled to communicate badly in order to be heard. It is still within our control, however, to redirect the hurt when we feel it wrested away from us.
Rohitha Naraharisetty is a Senior Associate Editor at The Swaddle. She writes about the intersection of gender, caste, social movements, and pop culture. She can be found on Instagram at @rohitha_97 or on Twitter at @romimacaronii.