Dance videos of Modi, rival turn up AI heat in India election

The Indian prime minister reshared the video on X, saying 'such creativity in peak poll season is truly a delight'

India's Prime Minister Narendra Modi shows the Bharatiya Janata Party (BJP) symbol during a roadshow as part of an election campaign, in Varanasi, India, on May 13, 2024. — Reuters

By Reuters

Published: Thu 16 May 2024, 2:50 PM

An AI video shows an ecstatic Narendra Modi sporting a trendy jacket and trousers, grooving on a stage to a Bollywood song as the crowd cheers. The Indian prime minister reshared the video on X, saying "such creativity in peak poll season is truly a delight."

Another video, with the same stage setting, shows Modi's rival Mamata Banerjee dancing in a saree-like outfit, but the background audio is part of a speech of hers criticising those who quit her party to join Modi's. State police have launched an investigation, saying the video can "affect law and order."

The different reactions to videos created using artificial intelligence (AI) tools underscore how the use and abuse of the technology are increasing, creating worries for regulators and security officials as the world's most populous nation holds a mammoth general election.


Easy-to-make AI videos, which contain near-perfect shadow and hand movements, can at times mislead even digitally literate people. But the risks are higher in a country where many of the 1.4 billion people are tech-challenged and where manipulated content can easily stir sectarian tensions, especially at election time.

According to a World Economic Forum survey published in January, the risk to India from misinformation over the next two years is seen as higher than the risk from infectious diseases or illicit economic activity.

"India is already at a great risk of misinformation — with AI in picture, it can spread at the speed of 100X," said New Delhi-based consultant Sagar Vishnoi, who is advising some political parties on AI use in India's election.

"Elderly people, often not a tech savvy group, increasingly fall for fake narratives aided by AI videos. This could have serious consequences like triggering hatred against a community, caste or religion."

The 2024 national election – being held over six weeks and ending on June 1 – is the first in which AI is being deployed. Initial examples were innocent, restricted to some politicians using the technology to create videos and audio to personalize their campaigns.

But major cases of misuse hit the headlines in April including deepfakes of Bollywood actors criticizing Modi and fake clips involving two of Modi's top aides that led to the arrest of nine people.

DIFFICULT TO COUNTER

India's Election Commission last week warned political parties against AI use to spread misinformation and shared seven provisions of information technology and other laws that attract jail terms of up to three years for offences including forgery, promoting rumours and enmity.

A senior national security official in New Delhi said authorities are concerned about the possibility of fake news leading to unrest. The easy availability of AI tools makes it possible to manufacture such fake news, especially during elections, and it's difficult to counter, the official said.

"We don't have an (adequate monitoring) capacity... The ever-evolving AI environment is difficult to keep track of," said the official.

A senior election official said: "We aren't able to fully monitor social media, forget about controlling content."

Both officials declined to be identified because they were not authorised to speak to the media.

AI and deepfakes are being increasingly used in elections elsewhere in the world, including in the U.S., Pakistan and Indonesia. The latest spread of the videos in India shows the challenges faced by authorities.

For years, an Indian IT ministry panel has been in place to order blocking of content that it feels can harm public order, at its own discretion or on receiving complaints. During this election, the poll watchdog and police across the nation have deployed hundreds of officials to detect and seek removal of problematic content.

While Modi's reaction to his AI dancing video - "I also enjoyed seeing myself dance" - was light-hearted, the Kolkata city police in West Bengal state launched an investigation against an X user, SoldierSaffron7, for sharing the Banerjee video.

Kolkata cybercrime officer Dulal Saha Roy shared a typed notice on X asking the user to delete the video or "be liable for strict penal action."

"I am not deleting that, no matter what happens," the user told Reuters via X direct messaging, declining to share their number or real name as they feared police action. "They can't trace (me)."

Election officers told Reuters authorities can only tell social media platforms to remove content and are left scrambling if the platforms say the posts don't violate their internal policies.

VIGGLE VIDEOS

The Modi and Banerjee dancing videos, with 30 million and 1.1 million views respectively on X, were created using a free website, Viggle. From a photograph and a few basic prompts detailed in a tutorial, the site generates videos within minutes that show the person in the photograph dancing or making other lifelike movements.

Viggle co-founder Hang Chu and Banerjee's office did not respond to Reuters queries.

Other than the two dancing AI videos, another 25-second Viggle video spreading online shows Banerjee appearing in front of a burning hospital and blowing it up with a remote. It is an AI-altered clip of a scene from the 2008 movie The Dark Knight that shows Batman's foe, the Joker, wreaking havoc.

The video post has 420,000 views.

The West Bengal police believe the video violates Indian IT laws, but X has not taken any action as it "strongly believes in defending and respecting the voice of our users", according to an email notice sent by X to the user, which Reuters reviewed.

"They can't do anything to me. I didn't take that (notice) seriously," the user told Reuters via X direct messaging.
