
The Power of Web-Based Media: How Algorithms Shape Public Perception and Control Participation

Have you ever surfed TikTok or Instagram “just for five minutes,” only to realise hours have passed? You think you’re the one choosing what to watch — but what if, in reality, something else is choosing for you?

In today’s digital media world, the gatekeepers are no longer just journalists or editors. Algorithms have quietly taken over that role.

The web has turned us from passive viewers into active participants — filming, sharing, and posting because every moment feels worth remembering. It feels wonderfully democratic, doesn’t it? Everyone gets a voice, a stage.

But here’s the twist: behind that freedom lies a quiet referee, the algorithmic “gatekeeper”, deciding whose voice gets heard and whose disappears into the scroll abyss. As Gillespie (2014) highlights, algorithms aren’t neutral; they’re the result of human choices. In other words, our feeds come with opinions already built in.

By rewarding posts with high engagement, algorithms amplify what gets the most clicks, not necessarily what’s true or diverse. The result? A “filter bubble” that limits what we see and traps us in loops of familiar opinions. Allcott and Gentzkow (2017) find that during the 2016 U.S. election, inaccurate stories on social media received massive engagement, spreading faster than factual ones.

So how did something so invisible, so voiceless, end up calling the shots? This article takes us inside the quiet empire of algorithms: how they shape what we see, guide what we believe, and blur the line between “just data” and deliberate control. And hopefully, you’ll see yourself in this story too: not just as a user being influenced, but as someone who can begin to take that power back.

Algorithms: The New Gatekeepers of the Media World

In the past, media power was visible. It belonged to editors, journalists, and major outlets, the “gatekeepers” who decided which stories were worth telling and which would remain in the shadows. These individuals shaped public discourse through deliberate editorial decisions, guided by professional norms, deadlines, and human bias. Readers might have disagreed with them, but at least the source of control was clear.

In contrast, today’s news ecosystem is ruled by something far less tangible: the algorithm. On platforms like Facebook, YouTube, and TikTok, code has replaced the newsroom as the editor-in-chief. Algorithms determine which posts rise to the top of your feed, which videos appear on your “For You” page, and which headlines are quietly buried. What once required human judgment has been automated through data, engagement metrics, and predictive modelling.

“Algorithms are not just tools for sorting information — they shape the very reality we live in.” – Tarleton Gillespie (2014)

Every like, scroll, and pause whispers something to the system. From these signals, it learns what catches our eye, how long we linger, and what makes us click again. The algorithm gradually constructs a world customised to us: familiar, comforting, and emotionally resonant. We’re surrounded by content that mirrors our assumptions and reinforces our biases. It feels like personal choice, even empowerment, but in fact it is subtle curation. We’re not so much choosing; we’re being chosen.
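As a rough illustration of that loop, here is a toy sketch in Python that turns implicit signals into a per-topic interest profile. The signal names and weights are invented for demonstration; real systems learn far richer models, but the principle, watch what the user does and serve more of it, is the same.

```python
from collections import defaultdict

interest = defaultdict(float)  # topic -> learned weight

# Illustrative signal weights (assumptions, not any platform's values):
# explicit actions count most, but even a pause teaches the system.
SIGNALS = {"like": 1.0, "rewatch": 0.8, "pause": 0.4, "skip": -0.5}

def record(topic: str, signal: str) -> None:
    # Every micro-action nudges the profile; the user never states a preference.
    interest[topic] += SIGNALS[signal]

def next_topic(candidates: list[str]) -> str:
    # Serve whatever the profile currently favours most.
    return max(candidates, key=lambda t: interest[t])

# A short browsing session: skips, pauses, likes, rewatches.
for topic, signal in [("politics", "skip"), ("cats", "pause"),
                      ("cats", "like"), ("news", "skip"), ("cats", "rewatch")]:
    record(topic, signal)

print(next_topic(["politics", "cats", "news"]))  # -> cats
```

Nothing here asks what we want; it only watches what we do, then shows us more of it.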

This shift in media authority is both thrilling and unsettling. At first glance, social media appears to democratize the means of communication: anyone can post, share, and go viral. The entry barriers are lower than they have ever been. Yet behind this openness lies a new hierarchy of visibility. Algorithms act as invisible editors, deciding whose stories rise and whose sink without trace. As a result, the power to be heard is no longer about journalistic authority: it’s about algorithmic compatibility.

“News no longer comes to us; we find ourselves within the news.” – Alfred Hermida (2010)

Hermida (2010) describes this phenomenon as “ambient journalism”: the constant background hum of information we consume without even seeking it. News now flows around us, not through scheduled broadcasts or printed pages, but through endless feeds optimised for engagement. In this ambient environment, what goes viral is rarely what is most accurate or socially vital; it is what is most clickable.

TikTok, for instance, has been repeatedly criticised for privileging dance trends, challenges, and entertainment over educational or political content. Its recommendation system, designed to maximise “watch time,” promotes whatever keeps users on the app longest. The purpose is not enlightenment; it is retention. The algorithm’s success is measured not in truth but in attention.
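A hypothetical “next clip” chooser makes that incentive visible: if the only objective is predicted watch time, whatever is likeliest to hold you wins, regardless of substance. The predictor below is a stub with made-up numbers, not TikTok’s actual system.

```python
def predicted_watch_seconds(video: dict, history: list[str]) -> float:
    # Illustrative assumptions: clips resembling past viewing get a large
    # bonus, and at most ~30 seconds of anything is assumed to be watched.
    similarity_bonus = 20.0 if video["topic"] in history else 0.0
    completion_estimate = min(video["length_s"], 30)
    return completion_estimate + similarity_bonus

def pick_next(candidates: list[dict], history: list[str]) -> dict:
    # The single objective is retention: maximise predicted watch time.
    return max(candidates, key=lambda v: predicted_watch_seconds(v, history))

candidates = [
    {"title": "Explainer: how elections work", "topic": "civics", "length_s": 240},
    {"title": "Dance challenge #4128", "topic": "dance", "length_s": 15},
]
print(pick_next(candidates, ["dance", "dance", "comedy"])["title"])
# -> Dance challenge #4128: retention wins; enlightenment never entered the score.
```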

The result is a reconfigured media ecosystem where attention is currency and data is power. The pen and microphone have been replaced by silent lines of code that govern what we see, when we see it, and how long we stay. Every user becomes both audience and raw material in a system built to monetise engagement.

“We don’t just use algorithms. We live within them.”

This invisible architecture of power has transformed the public sphere. We imagine ourselves free to choose, yet our decisions are constantly steered by what the algorithm decides should catch our attention. The gatekeepers are not gone; they’ve just been made invisible, written into the platforms that set the pace of our online lives.

How Algorithms Shape What We Think and Believe

If it ever feels like the whole internet secretly agrees with you, you are probably inside what Eli Pariser (2011) calls a “filter bubble”. It is an invisible wall built by digital platforms that constantly feed you content matching what you already believe. Inside this bubble, your online experience begins to feel comfortable and familiar. Your opinions are echoed back at you, your worldview affirmed by every like, share, and recommendation. What used to feel like a varied chorus of voices gradually becomes a mirror that just reflects back your own opinions.

“The Internet is showing us what it thinks we want to view, but not necessarily what we should see.” – Eli Pariser (2011)

This does more than make us “hear the same song on repeat.” It actively shapes how we view and perceive the world. Its implications became evident during the 2016 United States presidential election: Allcott and Gentzkow (2017) found that fake news on social media was shared far more widely than real news. The explanation is straightforward: algorithms favour engagement over accuracy. They prefer content that generates emotion, such as outrage, humour, or curiosity, to content that educates. Emotional reactions generate clicks, comments, and longer watch time, which are exactly what keep users on the platform.
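To make that bias concrete, here is a minimal sketch in Python of how engagement-based ranking could work. The weights and fields are illustrative assumptions, not any platform’s real formula; the point is simply that accuracy never enters the score.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    shares: int
    comments: int
    is_accurate: bool  # known to fact-checkers, invisible to the ranker

def engagement_score(post: Post) -> float:
    # Shares and comments outweigh likes: they signal the strongest
    # (often the most emotional) reactions, so they move posts furthest.
    return post.likes + 3 * post.shares + 5 * post.comments

posts = [
    Post("Careful fact-check of a viral claim", 120, 10, 8, True),
    Post("Outrageous (and false) rumour", 900, 400, 350, False),
]

feed = sorted(posts, key=engagement_score, reverse=True)
for post in feed:
    print(f"{engagement_score(post):>6}  {post.title}")
```

Sorted this way, the false but outrageous post tops the feed every time; the careful fact-check never stood a chance.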

As Tandoc et al. (2018) explain, digital platforms are not designed to promote fairness or balanced communication. Their business model revolves around one thing only: attention. In the digital economy, attention is the most valuable resource. Every scroll, pause, or click feeds the system, teaching it how to keep us hooked.

“We are not the customers of social media; we are the product.”

This algorithmic bubble extends its effects far beyond the realm of politics. It shapes the way we communicate about climate change, health, and other social issues. Schäfer (2012) finds that online debate about climate change has become polarised because people are largely exposed to information sources that uphold their existing views. Instead of expanding knowledge, these selective interactions limit it, creating fragmented realities in which productive conversation between disparate perspectives is difficult.
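One way to watch those fragmented realities form is to simulate one. The toy loop below starts with balanced interests, serves content in proportion to them, and lets everything seen reinforce its own topic. The update rule is an invented assumption, but the rich-get-richer dynamic it produces is the filter bubble in miniature.

```python
import random

topics = ["politics", "science", "sports", "art", "music"]
weights = {t: 1.0 for t in topics}  # start with a balanced worldview

for day in range(30):
    # The feed serves topics in proportion to current interest weights...
    feed = random.choices(topics, weights=[weights[t] for t in topics], k=10)
    # ...and everything seen reinforces its own topic (rich get richer).
    for topic in feed:
        weights[topic] += 0.2

total = sum(weights.values())
print({t: f"{100 * w / total:.0f}%" for t, w in weights.items()})
```

Each run drifts toward different favourites, but it always drifts: whichever topics get a little early luck keep compounding it, and the once-even distribution narrows.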

“Personalisation is liberating, until you realise it’s a trap.”

Algorithms quietly reshape the lens through which we see the world, giving us a comforting illusion that our choices are truly our own. In reality, those choices were predicted and coded long before we made them. Bucher (2018) calls this a form of “soft power,” the kind that “guides actions and perceptions without ever saying a word.”

“Power today doesn’t need to shout; it only needs to suggest.”

The result is a quiet but powerful shift in media control. Algorithms do not merely recommend what we see; they define the limits of what we can imagine. They shape our sense of what is normal, what is important, and what is true. So the next time your feed feels too calm, too familiar, and too agreeable, pause and ask yourself: Is this really the world, or just the one my algorithm has built for me?

The Illusion of Participation — When We Think We’re in Control

Social media loves to say that “everyone has a voice.” It presents itself as a global stage where anyone can speak, share, and be heard. But in reality, not every voice travels far. Some echo endlessly, while others vanish into silence. Behind every feed, it is not people deciding what rises to the top; it is algorithms quietly determining who gets to shine and who fades into the background.

“Visibility is not a right; it’s a privilege granted by the algorithm.”

At first glance, the system feels democratic. You can post a photo, write your thoughts, or share a story with the world. Yet what we see online is shaped less by freedom and more by design. It is an elegant illusion of participation because we believe we are in control, but our choices exist within invisible limits drawn by the platform itself. Srnicek (2017) calls this platform capitalism, a structure where user data is the most precious resource. Every tap, scroll, or watch becomes raw material for profit. What rises to the surface online is not necessarily what matters most; it is what keeps us watching.

TikTok is perhaps the clearest example of this design. Its never-ending cascade of short videos is engineered to spark what psychologists call the “infinite scroll reflex.” One more clip, one more laugh, one more dance trend, until you suddenly realise hours have passed. Montag et al. (2021) found that TikTok’s recommendation system builds what they term a dopamine loop, a cycle of anticipation and reward that keeps us locked in, not because we choose to stay, but because the app has learned what keeps us hooked.

“What feels like choice is often just a pattern predicted by the platform.”

And it is not only users who are caught in this loop; creators are too. YouTubers, streamers, and TikTokers understand that they are not being seen by accident. To survive, they must feed the algorithm: post at the right time, keep up with trending sounds, copy winning formulas, and always adapt to what gets the best results. Creativity becomes a game of strategy. As Abidin (2021) notes, the algorithm acts as an invisible partner that every creator must satisfy to stay relevant.

“Creators no longer perform for audiences alone; they perform for algorithms.”

Couldry and Mejias (2019) take this one step further, calling it data colonialism, a new kind of digital exploitation where personal data is extracted, traded, and turned into profit without genuine consent. The more we post, react, and engage, the more value we create for the system but not for ourselves. Privacy becomes a luxury, and even identity is algorithmically defined.

“Data is not just extracted; it is colonised.” – Couldry & Mejias (2019)

In the end, social media may look like a democracy, but underneath, it functions like a marketplace where attention is currency and we are both the product and the participant. Every act of visibility depends on systems built to monetise engagement. What feels like freedom of expression is often just a script written in code. Until we understand how these systems shape our choices, our freedom remains a beautifully designed illusion.

“In the age of algorithms, freedom is not about speaking; it is about being heard.”

In the digital age, algorithms have become a new form of power, not by commanding, but by suggesting. They guide who gets seen, what becomes popular, and quietly shape how we understand the world. Their control is subtle, woven into our daily routines: a click, a swipe, a share. Everything we do feeds the system, teaching it what to show us next, creating a loop where behaviour becomes both cause and effect.

But this power isn’t absolute. Once we begin to understand how algorithms work, we can start to take back control. Awareness changes behaviour. We can diversify our feeds, follow voices outside our echo chambers, or pause before rewarding sensationalism with a like. As Bucher (2018) reminds us, recognising algorithmic power is “the first step toward digital freedom.”

Taking back control doesn’t mean rejecting technology; it means learning to live consciously with it. The more we understand the patterns behind recommendations, the freer we become.

In the end, the challenge of our time isn’t how smart algorithms have become. It’s how aware we choose to be.

“In a world run by code, awareness is our quiet revolution.”

By Clara
