The whole thing about “populist leaders with strong personal branding and social media as their main weapon” isn’t just happening in one country—it’s a global trend. In the West, you see people like Trump, Johnson, Berlusconi, even Zelenskyy all following a similar playbook. And yeah, there are already a bunch of serious books that dig into this from the angles of politics, communication, and mass psychology.

Donald Trump (USA) basically turned Twitter into his personal megaphone. He used it to talk directly to people, skip the mainstream media, and control his own narrative. He also built a super strong personal brand—so strong that “Trump” became its own kind of symbol.

Boris Johnson (UK) played the populist card too. His campaign style was super casual, avoided the mainstream press whenever possible, and leaned on simple, emotional messages to win over regular folks—even though he’s actually from a super elite background.

Silvio Berlusconi (Italy) was a media tycoon before he became Prime Minister. He literally owned a bunch of media outlets and used them to shape how the public saw him. He was all about flashy visuals, quirky gestures, and personal storytelling to come off as the “strongman” who’s also one of the people.

Volodymyr Zelenskyy (Ukraine) used his background as an actor and comedian to his advantage. He built a relatable, powerful image through social media, TV, and memes—pushing a message of fighting corruption and going against the elites.

Older dictators like Stalin and Mao ruled by fear. Today’s “spin dictators” rule by confusing, entertaining, and flattering the public. These leaders often maintain democratic-looking institutions—elections, courts, a free press (in appearance)—but subtly rig them in their favour. Rather than censoring all dissent, spin dictators flood the public sphere with distractions, conspiracy theories, and carefully crafted propaganda.

Spin dictators understand the power of image. They hug children, pet animals, shed tears on camera, and stage “unscripted” moments that go viral. Social media, instead of democratising information, becomes a powerful tool of manipulation, allowing leaders to micro-target messages and discredit critics through online troll armies or “buzzers.”

Populist leaders come in many forms, but their playbook is surprisingly similar. Take the billionaire who claimed to be “anti-elite”—a man with golden towers and private jets who convinced millions he was one of them. He raged against the system while quietly benefiting from it, proving that nothing says “man of the people” like tax breaks and reality TV.

Then there was the leader who handed out free rice on camera while democracy quietly crumbled behind the scenes. His gestures were powerful, symbolic—and perfectly timed for the news cycle. Meanwhile, opposition voices were muted, and the constitution was tweaked like a settings menu on a smartphone.

In Brazil, a populist shouted “God, Family, Brazil!” while pushing divisive politics that split society down the middle. His slogans hit hard, especially in online echo chambers, but his governance left the country polarised and public trust in ruins.

Another one, a master of catchy phrases, reduced complex issues to hashtags and bumper-sticker wisdom. For every real problem, there was a viral quote. For every question, there was a performance.
Despite cultural differences, the common thread remains: image over substance. They pose as outsiders, blame “the elite,” offer easy answers, and build loyalty through charisma, not competence. And while they promise to fight for the people, they often dismantle the very systems meant to protect them.

Unlike old-school dictators who jailed dissidents and shut down newspapers, spin dictators allow a degree of openness—but it’s carefully stage-managed. They permit some media and opposition to exist, but only within tightly controlled boundaries, giving the illusion of democracy while neutralising real dissent. The goal isn’t to persuade everyone of a single truth, but to create so much noise that people give up trying to figure out what’s real. It’s not about narrative control, but narrative overload.

This is where “the Algorithmic Army” comes in. It gets deployed whenever political leaders, regimes, or interest groups want to influence public opinion, suppress dissent, or control narratives, especially in digital spaces. It’s most common in populist or authoritarian contexts, but it also appears in democratic settings during critical events.

Modern authoritarian or populist leaders maintain influence and control not through traditional brute force, but through media manipulation and digital propaganda. They rely heavily on media—especially digital media—to shape how people perceive reality. Rather than surrounding themselves with armies of soldiers, they rely on a virtual army made up of social media accounts, trending hashtags, viral content, and emotionally charged commentary (“hot takes”).

These digital tools and keyboard warriors aren’t just for show—they’re the megaphones for the leader’s agenda. They crank up the volume on propaganda, drown out dissenting voices, fake public support with bot armies and echo chambers, and expertly distract the masses from scandals and screw-ups. Instead of old-school censorship where the truth gets locked away, this playbook floods your feed with so much noise, spin, and drama that the truth gets lost in the scroll. It’s gaslighting at scale. By hijacking online conversations, spin dictators keep a grip on power—while still looking “democratic” enough to fool the global stage.

The appeal of the Algorithmic Army is that it shapes public opinion without resorting to obvious repression. Instead of using police, prisons, or overt censorship, these leaders fight their battles in the digital arena—using social media accounts, bots, influencers, and trolls to flood timelines, control narratives, and drown out dissent.

During elections, these armies are mobilised to push propaganda, promote the ruling party, and attack political rivals. They make a candidate appear more popular than they are by artificially boosting likes, shares, and trending hashtags. It’s about making noise that looks like genuine public support, even if it’s fake or paid for.
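To see why this is cheap and effective, here’s a toy sketch in Python. The data and the “posts per hashtag per hour” trending score are both invented for illustration (no real platform ranks this naively), but it shows how a small buzzer farm can out-shout thousands of genuine users:

```python
from collections import Counter

# Toy model: a naive trending score that just counts posts per hashtag
# in a one-hour window. Real platforms use richer signals; this only
# illustrates why raw volume is easy to game.

def trending(posts):
    """posts: list of (account_id, hashtag) tuples from one hour."""
    return Counter(tag for _, tag in posts).most_common(3)

# 5,000 genuine users, each posting once about whatever they care about.
organic = [(f"user{i}", f"#topic{i % 50}") for i in range(5000)]
# => roughly 100 posts per organic hashtag.

# A buzzer farm: just 200 coordinated accounts, each posting the same
# hashtag 25 times. 200 * 25 = 5,000 posts from a tiny group.
buzzers = [(f"buzz{i}", "#LeaderIsGreat") for i in range(200) for _ in range(25)]

print(trending(organic + buzzers))
# [('#LeaderIsGreat', 5000), ('#topic0', 100), ('#topic1', 100)]
# 200 accounts out-shout 5,000 real people. Counting unique accounts
# per hashtag, instead of raw posts, would blunt this particular trick.
```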
When a scandal breaks out—such as corruption, human rights abuses, or policy failures—the algorithmic army is used to distract the public. They might start unrelated trending topics, spam critical voices, or spread counter-narratives to muddy the waters. The goal is to make sure the scandal disappears under a sea of viral noise.

In many countries, journalists, activists, and opposition voices are systematically attacked online. These attacks are often coordinated—sending waves of trolls and bots to harass or discredit them. Sometimes it’s about overwhelming them with hate; other times, it’s about manipulating the algorithm to push their posts out of sight.

The algorithmic army is also useful for promoting nationalist or ideological narratives. It celebrates government achievements, amplifies patriotic messaging, and attacks anyone who questions the official story. It gives the illusion of grassroots passion, even when it’s all orchestrated.

Sometimes, governments even use these tactics to shape global perception. Through multilingual bot networks and coordinated campaigns, they try to counter criticism from international media or human rights organisations. They want to make their country look good—or at least confuse people enough to doubt the critics.

At its core, the algorithmic army is about manipulating attention. In an age when people live online, controlling what’s seen, liked, and shared is as powerful as controlling newspapers or TV used to be. It’s a new form of censorship—not by silencing people, but by shouting over them.

In the world of modern politics, especially in digitally active societies, a “buzzer” is not just someone who makes noise—it’s a strategic digital agent. A buzzer is a person or account, often paid or organised, whose job is to promote political content, manipulate online narratives, or attack critics on social media platforms. What makes buzzers powerful is not just their reach, but their coordination. They operate in packs, acting as digital soldiers to shape the battlefield of public opinion.

The people who use buzzers are typically political actors—political parties, state institutions, or influential elites with political interests. Buzzers may include public figures, influencers, social media consultants, or everyday internet users who are recruited and paid to post content that aligns with a certain agenda.

Buzzers become most active during key political moments. During election seasons, they push candidates into the spotlight, making them seem more popular than they are. When a scandal erupts—be it corruption, policy backlash, or criticism of a leader—buzzers work to distract the public, change the narrative, or flood platforms with alternative views. Their activity intensifies whenever the ruling class’s reputation is at stake.

These buzzers operate mostly on social media platforms where information spreads rapidly and public perception is shaped in real time. They are especially common in countries where traditional media is either distrusted or controlled, and where people rely on platforms like Twitter (X), Facebook, TikTok, and YouTube for news and political conversation.

In Indonesia, buzzers played a prominent role in both the 2019 and 2024 presidential elections. Supporters of candidates such as Joko Widodo and Prabowo Subianto were mobilised online to promote positive narratives, attack critics, and spread hashtags. A well-known example is the existence of “BuzzeRp” accounts—trolls and paid influencers who target activists, journalists, and academics critical of the government. During the Omnibus Law protests in 2020, buzzers flooded Twitter with hashtags like #IndonesiaButuhKerja to counter anti-law sentiment and suppress student-led movements.

In the Philippines, former President Rodrigo Duterte’s rise and reign relied heavily on a buzzer-like army. According to Maria Ressa (Nobel laureate and co-founder of Rappler), Duterte’s team used Facebook troll farms and influencer networks to amplify his tough-on-crime persona, spread disinformation, and silence dissenters. During elections, critics of Duterte or his allies (like Bongbong Marcos) were overwhelmed with harassment, fake news, and troll attacks.
Social media, especially Facebook, became a weaponised tool for narrative control. In both countries, these buzzers are not just random fans—they are often funded or coordinated, working as part of a broader political strategy.

Why are they used? Buzzers offer a modern solution for controlling public discourse without using overt censorship. By faking consensus or manufacturing popularity, they can influence how people think and vote. They also serve to silence opposition by overwhelming critics with noise or hate.

Buzzers operate through well-planned digital tactics. Some work alone, while others are part of organised “buzzer farms” coordinated through apps like WhatsApp or Telegram. These groups receive daily talking points, hashtags, memes, and even fake news articles to distribute. Their mission is to flood the algorithm—ensuring the message reaches as many people as possible and drowns out opposing views.

In The Hype Machine (2020, Currency), Sinan Aral explains that social media platforms like Facebook, Twitter, Instagram, and YouTube are powered by engagement-driven algorithms. These algorithms are designed to maximise user attention by showing content that is most likely to trigger strong emotional reactions—such as anger, fear, outrage, or excitement. The longer users stay engaged, the more ads they see, which means more revenue for the platforms.

This algorithmic design unintentionally creates a perfect playground for actors like buzzers. Since buzzers specialise in crafting emotionally charged content—clickbait headlines, shocking memes, polarising narratives—they are able to “hack” the algorithm. Their content gets boosted not because it’s truthful or balanced, but because it’s emotionally contagious and engagement-heavy.

Aral highlights how these dynamics create what he calls a “hype loop”: attention fuels emotional response, which fuels more sharing, which fuels even more visibility. Buzzers exploit this loop to dominate the conversation, drown out critics, and manipulate public sentiment. As a result, the public sphere becomes less about truth and more about virality. Engagement-driven algorithms don’t just amplify what’s popular—they amplify what’s provocative. And that’s exactly what buzzers feed on.
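To make the hype loop concrete, here’s a minimal simulation sketch. The scoring rule, the reach multiplier, and the numbers are all assumptions for illustration only (real ranking systems are far more complex), but it shows the reinforcing structure Aral describes: shares buy visibility, and visibility buys more shares.

```python
import random

# Toy "hype loop": each round, posts with more shares get seen more,
# and seen posts get reshared in proportion to how provocative they
# are. Every parameter here is invented for illustration.

random.seed(42)

posts = [
    # "charge" = assumed probability that a viewer reshares the post
    {"name": "sober policy analysis", "charge": 0.02, "shares": 1},
    {"name": "outrage clickbait",     "charge": 0.20, "shares": 1},
]

VIEWS_PER_SHARE = 10  # assumed reach: each existing share earns 10 views

for round_ in range(5):
    for post in posts:
        viewers = post["shares"] * VIEWS_PER_SHARE
        # More emotional charge => more of those viewers reshare,
        # which raises next round's visibility: the loop feeds itself.
        new_shares = sum(random.random() < post["charge"] for _ in range(viewers))
        post["shares"] += new_shares
    ranked = sorted(posts, key=lambda p: p["shares"], reverse=True)
    print(f"round {round_ + 1}: " +
          ", ".join(f"{p['name']}={p['shares']}" for p in ranked))

# After a few rounds the provocative post dominates the ranking, not
# because it is truer, but because each share buys it more views.
```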
According to Sinan Aral, breaking the hype loop starts with reforming how social media algorithms work. These platforms should prioritise authenticity, accuracy, and meaningful engagement instead of just amplifying whatever gets the most emotional reaction. Aral suggests increasing transparency in how content is promoted, labelling bots and paid accounts, and giving users more control over what they see.

On an individual level, people can fight the loop by slowing down before sharing, fact-checking sources, and resisting emotional manipulation. Using tools like browser extensions that label misinformation, following diverse voices, and curating your feed intentionally can reduce the emotional hijack effect. In short, we need a combination of smarter tech design and more mindful media habits.

Globally, countries like the Philippines have become case studies. Under Rodrigo Duterte, entire “troll farms” were used to spread disinformation, praise the president, and harass journalists. Similar tactics were seen in Brazil under Jair Bolsonaro and even in the United States during the Trump era, where right-wing digital influencers and trolls weaponised engagement for political gain. Buzzers and their algorithms are not just local tools—they’ve become a global strategy to game democracy through attention-hacking.

By the way, a troll is someone who intentionally posts provocative, offensive, or misleading content online to stir up emotions, cause conflict, or derail conversations. Trolls don’t necessarily believe in what they say—they just enjoy upsetting others or watching people argue.

There are different types of trolls. Some are just doing it for fun or attention (“for the lulz”), while others have more serious motives, like discrediting activists, dividing communities, or pushing propaganda. Unlike buzzers—who usually have a political or paid agenda—trolls often act out of personal amusement, frustration, or malice. However, in today’s digital world, some trolls are part of organised disinformation campaigns, especially during elections or crises.

Trolls thrive on reactions. They feed off outrage, replies, and arguments. The best defence against a troll? Don’t feed them—ignore, block, or report.

In The Twittering Machine (2019, The Indigo Press), Richard Seymour argues that users of social media platforms are not just passive consumers of information, but often become active participants in systems of performance, persuasion, and manipulation—whether knowingly or not.

Seymour describes how social media platforms transform users into performers. People are encouraged—through likes, retweets, and algorithmic visibility—to constantly curate and present an engaging, clickable version of themselves. This “performance” isn’t always authentic; instead, it’s optimised to trigger emotional reactions, amplify visibility, or attract attention.

As users become immersed in this culture, they unknowingly act as tools of persuasion. Even without being paid or officially affiliated with any political cause, users may spread memes or slogans, retweet emotionally charged content, attack opposing views, or reinforce narratives that benefit certain groups. These acts often align with political or commercial agendas, whether the user intends it or not.

Seymour argues that social media rewards outrage, controversy, and tribal loyalty—all behaviours that make users susceptible to manipulation. Buzzers, trolls, and coordinated propaganda networks exploit this structure by blending in with regular users, creating a performance-driven space where emotion trumps reason.

In this sense, even ordinary individuals—just by seeking likes or defending their beliefs passionately—become “soldiers” in digital culture wars. The line between sincere expression and strategic influence gets blurred. And thus, the Twittering Machine doesn’t just broadcast content—it drafts users into its logic.
Spotting a buzzer isn’t always easy, especially when they’re disguised as regular users. But there are several signs you can watch for to identify them from a distance:
- They post excessively about one political figure or issue—often with glowing praise or aggressive defence. Their timelines look like fan accounts or digital PR teams.
- They repeat the same phrases, hashtags, or talking points, sometimes copy-pasted word-for-word across many accounts. That’s a red flag for coordinated messaging (see the sketch after this list).
- They attack dissenting voices instead of engaging in honest debate. Instead of discussing ideas, they insult, mock, or gang up on critics.
- They appear suddenly and disappear just as fast—often only active during political moments like elections or protests. Between big events, their activity is nearly zero.
- They rarely post personal content. You won’t see family photos, hobbies, or random thoughts—just pure propaganda.
- Their accounts are often anonymous or use fake profile pictures. It’s hard to find any trace of their real-life identity.
- They follow and are followed by other suspicious or similarly political accounts. Buzzer networks often move in digital herds.
- Some more advanced buzzers may use AI-generated images, carefully crafted bios, and even fake “relatable” tweets to seem authentic. But if you sense their only purpose online is to defend or attack specific people or ideas 24/7—it’s probably a buzzer.
- The best defence? Stay curious, ask questions, and check for patterns. If something feels scripted, it probably is.
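One of those patterns, word-for-word repetition across many accounts, is simple enough to check for yourself. Here’s a minimal Python sketch (the sample posts are invented) that normalises post text and flags messages posted verbatim by several distinct accounts:

```python
import re
from collections import defaultdict

# Flag clusters of near-identical posts from distinct accounts.
# Sample data is invented; in practice you'd load scraped posts.

posts = [
    ("acct_a", "Our leader ALWAYS delivers! #GreatLeader"),
    ("acct_b", "our leader always delivers!!  #greatleader"),
    ("acct_c", "Our leader always delivers #GreatLeader"),
    ("acct_d", "Honestly not sure the new policy helps farmers."),
]

def normalise(text):
    """Lowercase, strip punctuation, and collapse whitespace so
    trivial edits don't hide a copy-paste."""
    text = re.sub(r"[^\w\s#]", "", text.lower())
    return re.sub(r"\s+", " ", text).strip()

clusters = defaultdict(set)
for account, text in posts:
    clusters[normalise(text)].add(account)

# Many distinct accounts posting one normalised message is the
# coordination red flag described in the list above.
for message, accounts in clusters.items():
    if len(accounts) >= 3:
        print(f"possible coordination ({len(accounts)} accounts): {message!r}")
```

This only catches the crudest copy-paste operations; buzzer farms that paraphrase their talking points need fuzzier text-similarity checks. But the underlying signal is the same: too many unrelated accounts saying exactly the same thing at the same time.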
How to take back the remote? Taking back the remote means reclaiming control over what influences your attention, emotions, and beliefs in the digital world. In today’s algorithm-driven society, this “remote” is often in the hands of social media companies, political actors, and attention-hacking buzzers and trolls.

To take it back:
- Be intentional with your attention. Don't let algorithms decide what you see. Follow diverse and credible sources. Mute or unfollow accounts that thrive on outrage.
- Pause before you share. Think twice before forwarding that shocking tweet or angry video. Ask: “Who benefits if I spread this?”
- Verify before believing. Use fact-checking tools, reverse image search, or trusted platforms like Snopes or MAFINDO to check if something’s real.
- Customise your feed. Most platforms let you control what shows up. Use that feature! Block buzzers, silence trolls, and boost voices that add real value.
- Educate your circle. Help your friends and family recognise manipulation. Teach them how to sniff out buzzers and trolls, too.
- Support platforms and creators who value truth over hype. Reward quality over clicks.
- Disconnect when necessary. If the noise is too much, log off. Digital boundaries are a form of resistance.
Taking back the remote is about refusing to be a passive consumer of rage and manipulation. It’s choosing thoughtfulness over virality, and connection over chaos.