Opinion

Elections and info disorders: Sri Lanka’s experience holds global significance


By Buddhika Samaraweera

Sri Lanka’s 2024 Presidential and Parliamentary Elections took place in an information environment that, at first glance, appeared familiar. Social media feeds were filled with political commentary, memes, and campaign messaging much like in previous elections. However, the nature of electoral influence had changed in ways that were less visible and far more difficult to track. What once showed up as obvious falsehoods has given way to subtler forms of manipulation that rely on emotional cues, selective amplification, and coordinated activity that looks organic to the ordinary user. 

This shift mirrors trends seen in elections across the world, where influence campaigns increasingly operate in the space between fact and persuasion rather than through outright lies. To examine how these dynamics have played out in Sri Lanka, what they mean for electoral integrity, and why existing oversight mechanisms are struggling to keep pace, The Daily Morning spoke to digital rights researcher and disinformation analyst Dr. Sanjana Hattotuwa, whose work spans multiple countries and political systems.

Following are excerpts of the interview:

You have studied information-related disorders across different regions and political systems. In the context of Sri Lanka’s elections, how has the nature of disinformation changed in recent years?

The difference between the 2019 and 2024 election cycles is striking and concerning. In 2019, the patterns were more visible. There were proxy Facebook pages, mock polls designed to spark emotional reactions, and misleading voter education content. Researchers and fact-checkers could follow these activities, trace their connections, and understand their effects, even if the tools for tracking them were far from perfect.

By 2024, however, the threat had evolved into something far more difficult to define or counter. The central challenge is that modern campaigns no longer need to rely on outright falsehoods. Instead, they operate as influence operations, focusing on shaping perception rather than making claims that can be fact-checked. Emotional priming, selective amplification, and the strategic use of ready-made audiences have replaced obvious lies as the primary tools of manipulation. Pages initially built for entertainment, gossip, or viral content are repurposed overnight to serve partisan ends. Administrators are paid to post content that never enters platform ad libraries, making it invisible to regulators and outside researchers.

This shift represents a fundamental transformation in the ecosystem of disinformation. It now exists in the grey space between truth and manipulation, where the ordinary voter cannot easily distinguish between authentic and orchestrated content. Election commissions, civil society groups, and platform companies find their existing tools insufficient because they were designed to identify false claims rather than monitor the complex interactions of emotion, network effects, and algorithmic amplification.

The problem has become one of scale and subtlety. The content does not need to lie to succeed; it only needs to manipulate the lens through which voters perceive reality. The danger of this evolution is that it allows campaigns to operate under the radar, bypassing scrutiny from the institutions and frameworks designed to protect electoral integrity. It combines technical sophistication with an understanding of human psychology. 

By leveraging the biases and emotional responses of ordinary users, these campaigns can shape collective perception in ways that feel organic. The result is a system where influence is pervasive, subtle, and exceptionally difficult to combat through traditional oversight mechanisms.

What are the most common forms of election-related disinformation today, and who benefits most from their spread?

The forms of manipulation have grown far more diverse, and that diversity itself is a challenge. Coordinated inauthentic behaviour remains central. Networks of Facebook pages, mostly repurposed from entertainment or gossip content, post identical or nearly identical material simultaneously. Administrators are often paid in cash to ensure that these posts never appear in ad transparency libraries. That means that regulators cannot track the spending or distribution of the material. Layered on top of this are chameleon ads that exploit gaps in moderation systems, websites that simultaneously produce disinformation and appear to fact-check it, and fabricated institutional reports designed to borrow legitimacy from authority. Short message service (SMS) campaigns targeting millions of users without consent also play a role, as do professionally designed infographics and reports that appear credible even when entirely false. These tactics allow manipulation to reach large audiences quickly. They shape perception long before any corrective information can intervene.

Who benefits most depends on the structural advantage within a given system. Incumbents often have easier access to State resources, media infrastructure, and partisan networks capable of amplifying messaging.

Dark campaign financing also favours those with deep pockets and established networks. At the same time, organic engagement can counteract financial advantage. In 2024, Anura Kumara Dissanayake, a challenger with strong grassroots support, generated five times more Facebook shares than the incumbent and main Opposition candidate combined. This demonstrates that genuine popular momentum can amplify a candidate’s reach in ways that paid campaigns cannot fully replicate. Even though financial and institutional resources provide a structural advantage, the unpredictable nature of viral content means that even challengers without access to the State or deep-pocketed networks can achieve disproportionate influence. However, those with control over infrastructure, media access, and technical know-how retain the ability to manipulate narratives at scale.

Social media platforms play a central role in political communication. From your research, how does platform design and algorithms shape the visibility and impact of false or misleading election content?

Platform architecture is not neutral. Every design choice shapes what voters see, how they react, and what spreads. My research across multiple Sri Lankan election cycles shows that algorithms prioritise content that generates engagement. Emotional reactions, shares, and comments drive visibility. Content that triggers anger, fear, or excitement travels further than sober, fact-based reporting. 

In the 2024 elections, coordinated networks exploited this systematically. Proxy pages posted identical content across multiple entertainment, gossip, and meme pages. Because engagement was high, the algorithm rewarded the content with increased organic reach. These posts were never paid advertisements and therefore remained invisible to ad libraries and most oversight tools. When CrowdTangle, the primary tool for tracking public page activity, was shut down in 2024, independent researchers were left almost entirely blind to these networks.

Language also plays a critical role. Sinhala-language content largely escapes platform moderation because the algorithms and human reviewers are poorly resourced for less commercially dominant languages. Coordinated campaigns that shape perception without making clear false statements operate in a grey zone. They remain undetected by standard moderation. Platforms profit from engagement, which means that content designed to provoke outrage or fear is naturally amplified. Until these incentive structures change, platforms will continue to function as accelerators of content that undermines electoral integrity rather than protecting it.
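The coordination pattern described in this answer — the same content posted across many pages within minutes, then boosted by engagement-driven ranking — can be illustrated with a small detection heuristic. This is a minimal sketch under invented data; the page names, timestamps, and thresholds are all assumptions, not any platform's real tooling or API:

```python
from collections import defaultdict

def flag_coordinated_posts(posts, window_seconds=300, min_pages=3):
    """Flag clusters of identical text posted by many distinct pages
    within a short time window -- a simple coordination heuristic.

    `posts` is a list of (page_id, timestamp_seconds, text) tuples.
    The field layout and thresholds are illustrative assumptions.
    """
    # Group posts by normalised text so exact duplicates cluster together.
    by_text = defaultdict(list)
    for page, ts, text in posts:
        by_text[text.strip().lower()].append((ts, page))

    flagged = []
    for text, items in by_text.items():
        items.sort()
        timestamps = [ts for ts, _ in items]
        pages = {page for _, page in items}
        # Coordinated if enough distinct pages posted the same text
        # and the whole cluster fits inside the time window.
        if len(pages) >= min_pages and timestamps[-1] - timestamps[0] <= window_seconds:
            flagged.append((text, sorted(pages)))
    return flagged

# Hypothetical data: three proxy pages post the same line within two minutes.
posts = [
    ("gossip_lk", 0, "Candidate X surges ahead!"),
    ("memes_lk", 60, "Candidate X surges ahead!"),
    ("viral_lk", 120, "Candidate X surges ahead!"),
    ("news_lk", 90, "Weather update for Colombo"),
]
print(flag_coordinated_posts(posts))
# → [('candidate x surges ahead!', ['gossip_lk', 'memes_lk', 'viral_lk'])]
```

A production system would need fuzzy text matching and sliding time windows rather than exact duplicates over a whole cluster, but the underlying signal — many distinct pages, an identical message, a narrow window — is the one researchers relied on before public-page tracking tools were withdrawn.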

Sri Lanka has experienced conflict, political instability, and democratic erosion. How do these factors make the country more vulnerable to disinformation during elections?

Sri Lanka’s post-war information landscape carries long-lasting vulnerabilities. The military defeat of the Liberation Tigers of Tamil Eelam in 2009 suppressed underlying ethno-political fractures but did not resolve them. The unresolved tensions left fertile ground for narratives built on fear, anxiety, and appeals to security and order. Incidents such as the anti-Muslim violence in Kandy in 2018 demonstrated how quickly online content could transform tensions into physical conflict. 

By 2024, networks and pages created during anti-authoritarian protests were repurposed for political ends. Distrust in institutions, including the Election Commission, the Judiciary, and mainstream media, made alternative narratives, even completely fabricated ones, seem plausible. Political instability reinforces a constant campaign environment. Incumbents with access to State broadcasting, telecommunications, and SMS infrastructure can push content to millions of voters. These structural conditions amplify vulnerability to disinformation because they create channels of power that do not exist in neutral, competitive systems. The broader context also matters.

Sri Lanka’s long history of political volatility means that citizens are often sceptical of official information, which creates fertile ground for alternative narratives to spread. Manipulated or selective content can easily exploit existing tensions, ethnic anxieties, and distrust in institutions. The combination of weak oversight, concentrated media influence, and emotionally charged historical narratives makes disinformation particularly effective.

There is often a focus on false content itself. In your view, how important are emotional triggers such as fear, anger, or identity in driving the spread of election disinformation? 

Focusing only on false content is misleading. It draws attention to what can be fact-checked and ignores the true levers of influence. My research shows that manipulation rarely relies on outright lies. Instead, it works through what I call ‘limbic engagement’. Content is designed to trigger fear, outrage, or identity-based solidarity before rational thought can intervene. Memes, short videos, and selectively framed narratives do not need to be false to be deeply persuasive. They only need to evoke strong emotional responses that prompt people to react, share, or engage without questioning their accuracy. During the 2024 Presidential Election, most posts appearing on Facebook were election-related.

But a large majority of users only read headlines. The architecture of persuasion has adapted to this reality. Mock polls and coordinated campaigns create emotional contagion. They generate a perception that a candidate is winning or losing, which can influence voter behaviour independent of the actual facts. Entertainment and gossip pages were pivoted to partisan content precisely because their audiences were already primed for engagement.

In Sri Lanka, these triggers carry unique potency. They map onto historical ethnic and political fractures, majoritarian anxiety, post-war insecurity, and distrust of institutions. Fact-checking is ineffective against feelings. Once a strong emotional response has been triggered, corrective content struggles to reach the same audience or elicit the same intensity of engagement. Emotional manipulation has become the backbone of modern disinformation campaigns because it is invisible to traditional oversight and powerfully shapes perception and behaviour.

What role do political actors and organised networks play in amplifying disinformation, compared to ordinary users who may share content unknowingly?

The distinction between organised networks and ordinary users is crucial, but the interaction between the two is what makes modern electoral manipulation so effective. In Sri Lanka, political actors and organised networks create the architecture of influence, and ordinary users, acting in good faith, provide the amplification. Coordinated networks of proxy Facebook pages form the backbone of these campaigns.

During the 2024 Presidential Election, the campaigns of major candidates operated through dozens of accounts, with each account posting content in tightly synchronised bursts. The numbers captured in Meta’s Ad Library represent only a fraction of the actual spending. Far more content was paid for in cash to page administrators and posted as organic content, completely invisible to regulators. 

Political actors exploit legal loopholes. Individuals running nominally independent campaigns in support of a candidate face no reporting obligation. This creates a shadow infrastructure that exists outside formal oversight. But the influence operation only achieves its full effect when ordinary users engage with the content.

Pages that were originally built for entertainment or gossip were repurposed for partisan content. Users who followed these pages for neutral or anti-authoritarian reasons became unwitting distributors, sharing, reacting to, and amplifying content without realising that it was part of a coordinated campaign. Seventy per cent of users engaging with election content only read headlines before sharing. The campaigns are designed for exactly this type of interaction.

The relationship between organised networks and ordinary users is therefore parasitic. Political actors create the infrastructure, determine the timing, and engineer the content for maximum emotional effect. Ordinary users, seeking information or entertainment, supply the reach and credibility. This combination of top-down coordination and bottom-up dissemination makes influence operations extremely difficult to detect or counter.

Fact-checking initiatives exist, but disinformation continues to circulate widely. What limits their effectiveness during election periods?

Fact-checking addresses symptoms rather than the underlying problem. Its primary limitation is timing. By the time a correction is issued, the original content has often already achieved its purpose. In the 2024 Presidential Election, fabricated postal vote results circulated widely in professionally designed infographics. Fact-checking platforms issued corrections, but the reach of the original content had already far surpassed the corrective message. Emotional engagement, algorithmic amplification, and rapid network effects mean that corrections almost never achieve equivalent visibility.

A new challenge is the rise of “pink slime” website operations that simultaneously produce disinformation and publish fact-checks of their own content. This creates a closed loop where the act of correction becomes part of the manipulation, confusing audiences further. Civil society organisations often rely on the vocabulary of “disinformation”, “bots”, and “fake news” without understanding the operational mechanics of campaigns. Fact-checking is necessary but insufficient by itself.

How should Governments and institutions such as election authorities, media organisations, and civil society respond to disinformation without undermining freedom of expression?

Content regulation during elections is risky and counterproductive. Granting authorities the power to decide acceptable political speech is inherently fraught and can easily be misused. The focus should instead be on making the machinery of influence visible. This includes exposing networks, coordination, and hidden financing. 

In 2024, the Election Commission (EC) issued social media guidelines for the first time, but these had no practical effect. Page administrators ignored them, and the Commission lacked the technical capacity to monitor compliance. A proper response would involve dedicated units within the electoral authorities, capable of real-time monitoring of content spread, coordinated networks, and expenditure flows. This capacity does not require restricting speech. It requires visibility, accountability, and competence. Civil society also has an important role. Organisations need to move beyond issuing statements about “disinformation” and build the technical literacy to track, document, and expose influence operations. Journalists and watchdogs should be equipped to analyse how content spreads, identify networks, and understand funding mechanisms.

Based on your experience across different countries, are there lessons that Sri Lanka can draw from other contexts that have faced similar challenges?

There are lessons, but Sri Lanka’s willingness to adopt them is uncertain. Mexico has developed a formal memorandum of understanding with Facebook for flagging content during election periods; civil society and journalistic organisations participate in fact-checking with genuine technical capacity rather than performative oversight. Estonia has a tripartite coordination system between its electoral body, platforms, and private technical firms, an acknowledgement that no single institution can address the problem alone. Panama’s Digital Ethical Pact encourages citizens to commit to responsible information sharing. Sri Lanka, however, is not just a recipient of lessons; it is also a laboratory whose strategies now appear in Canada, the United Kingdom, and New Zealand. Chameleon ads, proxy page networks, and pink slime operations first documented in Sri Lanka have since emerged in other democracies. This demonstrates that the country’s experience holds global significance.

What are the most urgent steps that Sri Lanka needs to take to protect electoral integrity in an era of rapidly evolving digital technologies?

The most urgent intervention is institutional. The EC entered the 2024 elections without the technical capacity to monitor campaigns, detect coordinated networks, or verify online expenditure. A dedicated digital monitoring unit staffed with analysts capable of tracing content flows, proxy networks, and unreported spending is essential.

Campaign finance reform is also urgent. Loopholes allowing individuals to run independent campaigns without reporting expenditure must be closed. The definition of campaign spending must include coordinated organic content paid for in cash, which currently escapes all oversight.

The Data Protection Authority also has a role. The 2024 SMS campaign reached all 27.6 million registered mobile numbers without consent. Existing laws provide a basis to prevent future misuse if enforced rigorously.

Finally, platforms must be engaged formally. Agreements with Meta, modelled on international precedents, should establish real-time communication during elections and extend transparency requirements to organic, coordinated content. Civil society and election monitors need genuine technical literacy to track and expose influence operations. None of this requires suppressing speech.

Source: The Daily Morning

Disclaimer: The views and opinions expressed in this column are those of the Interviewee, and do not necessarily reflect those of this publication.

