Humans who never lived now shape public opinion in Bangladesh’s politics
The election season arrived with familiar noise. Loudspeakers, convoys, street rallies, and a flood of digital campaigning that now shapes the political mood more than any roadside poster.
But this year, something else slipped into the mix. New faces appear on your phone screen like neighbours, uncles, fruit sellers, or elderly women.
They talk about voting, scold one party, praise another, and speak in the cadence of everyday Bangladesh.
They look human. They sound human. But they are not.
They are AI-crafted voters. And they are multiplying, finds a study by the online verification and media research platform dismislab.
The page that became a machine
The trail leads to a busy Facebook page that reinvented itself four times before settling on the name “Uttar Banga Television.” Over 90,000 people now follow it, most without knowing that the page is quietly pushing out synthetic citizens.
The page once posted charity appeals and motivational content. Today, it broadcasts political persuasion. Seven admins in Bangladesh operate it. The shift began only days after the election schedule was announced.
On December 15, 2025, the page uploaded its first AI-generated political video.
A grandmotherly figure declared her unwavering support for Jamaat-e-Islami. The tone was casual, the backdrop familiar, the dialogue delivered with certainty. Many viewers responded with blessings and support. Few suspected the woman was fabricated.
But a closer look told a different story. Her skin shifted unnaturally between frames. Her nasolabial folds smoothed out mid-sentence. She never blinked. Google’s SynthID later confirmed the suspicion: the audio and visuals were machine-made.
That was only the beginning.
By the next morning, four more AI “voters” appeared. Each had a different age, a different background, a different dialect, but all pledged allegiance to the same symbol.
As December passed, the page posted at least 35 such videos.
When fiction starts speaking like fact
The pattern sharpened. Supporters who didn’t exist began promoting candidates who did. Others cropped up criticising BNP, twisting narratives and delivering sharp attacks in the tone of everyday folk.
The most viral one featured a fruit seller. He lectured on politics, standing beside his cart, as if caught in an honest moment on a busy street. The clip exploded, drawing more than eight million views. People praised his “wisdom” in the comments.
Yet the truth was hiding in plain sight.
Behind him, a CNG-run autorickshaw morphed into something physically impossible. A rickshaw-puller’s face blurred into unnatural shapes. A signboard displayed gibberish in Bangla. SynthID confirmed it again: AI-generated from start to finish.
This wasn’t accidental misinformation. It was an engineered illusion.
The assault on rival parties intensifies
The attacks widened.
A fake disabled woman accusing BNP leaders of bribery over benefit cards.
A slick video mocking Tarique Rahman ahead of his return from London.
Another clip claiming BNP insiders were plotting against their own supporters.
All widely shared. All believed by many.
And all made by artificial intelligence.
One video, framed as a “warning message” from a disabled citizen, reached more than 800,000 views. Commenters blessed her, prayed for her, and echoed her words. They didn’t notice her lip movements glitching. They didn’t notice the shadow inconsistencies. They didn’t know she wasn’t real.
A fact-check later confirmed it. But the damage was already done.
AI that pretends to be Hindu voters
Then came the next wave. Elderly Hindu men in dhotis, middle-aged women wearing conch bracelets, and young Hindu men speaking in local accents. All declared their loyalty to Jamaat-e-Islami. All framing it as a rejection of BNP.
The most popular one surged past 1.6 million views.
Comments poured in:
“Right, uncle. Well said.”
“You are brave.”
“May your words come true.”
Yet the “uncle” didn’t exist. His eyebrows changed shape across frames. His fingers bent unnaturally. His lips floated slightly out of sync. His backdrop contained signs written in broken Bangla. SynthID found its digital watermark again.
A synthetic Hindu voter, manufactured for political leverage.
A strategy seen across Asia
Researchers warn that this isn’t an isolated case. A study in Frontiers in Political Science outlines how AI-generated voters were used in the Philippines and Thailand to manipulate audiences with customised messaging. The tactic works because the videos feel intimate. They appear candid, personal, and unpolished. And that style is effective.
Bangladeshi media researcher Fahmidul Haque says the danger is real and immediate. “If someone sees only one piece of content, there is a chance to be confused,” he says. “And that confusion can influence how someone votes.”
Elections begin, the lines blur
The Election Commission has rules forbidding the use of AI for misleading or defamatory content. Those rules now sit on paper while deepfake videos spread by the hour. The CEC himself warned that artificial intelligence is being weaponised to distort public opinion.
But warnings alone cannot outrun algorithms.
dismislab attempted to contact Election Commission officials for comment on enforcement. No reply came.
How to tell a human from a synthetic one
Despite improvements in AI, the cracks still show. Experts offer some simple checks:
• Blinking: AI characters often blink too little or too mechanically.
• Skin texture: Too smooth, too shiny, or inconsistent.
• Lips and audio: Slight delays or mismatches.
• Background text: Gibberish letters, distorted signs.
• Movement: Robotic turns, unnatural stillness.
• Light reflection: Glasses without natural illumination changes.
And the simplest check of all:
If an unfamiliar page suddenly posts dozens of similar videos with the same political message, something is off.
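For readers who want to apply the checklist systematically, the manual checks above can be sketched as a simple red-flag counter. This is a hypothetical illustration, not a real detection tool: the field names, thresholds, and the `suspicion_score` helper are assumptions for the sake of the example, and the observations would have to be judged by eye or measured with separate video-analysis software.

```python
# Hypothetical red-flag counter for the deepfake checklist above.
# All thresholds are illustrative assumptions, not calibrated values.
from dataclasses import dataclass

@dataclass
class ClipObservations:
    blinks_per_minute: float   # humans typically blink well over 5 times/min
    lip_sync_offset_ms: float  # visible lip/audio mismatch, in milliseconds
    has_gibberish_text: bool   # distorted signboards, broken Bangla lettering
    has_morphing_objects: bool # vehicles or faces changing shape mid-frame

def suspicion_score(obs: ClipObservations) -> int:
    """Count how many checklist red flags a clip triggers (0-4)."""
    flags = 0
    if obs.blinks_per_minute < 5:    # too little or too mechanical blinking
        flags += 1
    if obs.lip_sync_offset_ms > 100: # lips drifting out of sync with audio
        flags += 1
    if obs.has_gibberish_text:       # background text check
        flags += 1
    if obs.has_morphing_objects:     # movement / physical-consistency check
        flags += 1
    return flags

# A clip resembling the fruit-seller video described above:
clip = ClipObservations(2.0, 150.0, True, True)
print(suspicion_score(clip))  # prints 4: every red flag triggered
```

The point of the sketch is that no single artefact is decisive; it is the accumulation of small inconsistencies, exactly as the experts’ checklist suggests, that marks a clip as likely synthetic.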
The new shape of influence
The troubling part is not just that the videos exist. It’s how ordinary people respond as though they are real voices. An AI grandmother can convince thousands. A synthetic Hindu elder can shape community sentiment. A manufactured fruit seller can stir nationwide debate.
These digital ghosts are not voters. They are tools.
They don’t cast ballots. They sway the ones who do.
As Bangladesh heads toward the polls, the battle is no longer only on the ground or even in the media. It is unfolding in invisible layers of code, in smiling faces that never lived, and in voices designed to sound like ours.
In a season marked by uncertainty, one thing is clear:
The biggest influencer in this election may be something that cannot vote at all.