
From Tom Hanks to Alia Bhatt, the celebrities that have been targeted by AI-generated deepfakes

As AI takes over more and more areas of our lives, we take a look at the need for legal safeguards to protect privacy and reputation

Published: Fri 1 Dec 2023, 4:51 PM

By Yasser Usman


What do Tom Hanks, Alia Bhatt, Katrina Kaif, and Sara Tendulkar have in common? They’ve all recently had a taste of the deepfake treatment, courtesy of modern technology’s mischievous side. On October 1, Hollywood actor Tom Hanks warned fans that an AI-generated video of him pitching a dental plan was not real. This week, Indian actress Alia Bhatt became the latest deepfake target, her face morphed onto a clip that originally showed another woman making obscene gestures at the camera. Earlier this month, actress Rashmika Mandanna was the victim of a similarly disturbing incident when a deepfake video featuring her went viral. This was followed by the circulation of manipulated images of Katrina Kaif, Sara Tendulkar, and cricketer Shubman Gill, all falling prey to the perils of deepfake technology.

For those unaware, deepfakes are fake images or videos made using artificial intelligence, typically by replacing a person in an existing image or video with someone else’s face. The recent outcry in Bollywood over the misuse of deepfake technology is understandable, but there is a classic paradox at play: the same entertainment industry has extensively used this technology to its advantage. Film stars regularly enjoy its benefits, then cry foul when it is misused. Let’s look at a couple of examples from the last few years.

In 2020, Shah Rukh Khan, then brand ambassador for a major chocolate company, featured in AI-powered, highly personalised advertisements for over 2,000 grocery and retail stores. The campaign used machine learning to replicate his appearance and voice so that he appeared to endorse these local stores across India. In 2022, Salman Khan appeared in a soft drink ad in which a younger version of him was created using deepfake technology.

Photo: Instagram/Rashmika Mandanna

In fact, all the top Indian actors have used deepfake technology for ‘de-ageing’ on screen to play their younger selves: superstar Rajinikanth in almost all his films in the last few years, Salman Khan in Bharat (2019), SRK in Fan (2016), Zero (2018), and now in his upcoming Dunki (2023), and Aamir Khan in Laal Singh Chaddha (2022). In Hollywood, the most notable case is Paul Walker, who tragically passed away after partially filming Furious 7 (2015); he was brought back to life on screen using his brother as a body double and deepfake technology. Closer to home, a brief scene featuring the late Amrish Puri was recreated in this year’s blockbuster Gadar 2 using the same technology. Isn’t it ironic that there is a widespread effort to vilify deepfake technology when it has firmly established its presence in entertainment industries worldwide?

Katrina Kaif

If celebrities (or their families) willingly permit their digital personas to be employed in deepfakes for movies and lucrative advertisements, why is there such a fuss about the technology? Has there ever been a technology without both advantages and disadvantages? So the problem is not with the technology. What, then, is the underlying issue? It ultimately boils down to a simple yet vital factor: consent. When deepfakes are made without a person’s consent, they violate that person’s privacy and can have serious consequences.

Sharing the Rashmika Mandanna fake video, legendary actor Amitabh Bachchan called for urgent legal and regulatory action to tackle the spread of fake content online. Even before that, the veteran actor had been the first Bollywood star to voice fears about this technology. In an episode of Kaun Banega Crorepati in September, Bachchan said, “I am scared, I might be replaced with hologram. In films, such things are happening. We are taken to a room and around 40 cameras rotate around and we’re made to make several expressions by making faces and looking all around. I didn’t know for what, but later I learned that they would be used accordingly in my absence. Even if I haven’t given the shot, it will seem that it is me. So, I get scared that AI will take our jobs.” So it seems he was fine using the technology and recording with 40 cameras; his main worry was how AI might be misused in the future and its potential impact on his earnings.

In September 2023, Indian actor Anil Kapoor emerged victorious in a unique legal battle over the use of his likeness in AI applications. He now has the authority to seek an injunction if his image, voice, or even his famous catchphrase jhakaas is used in GIFs, deepfakes, or unauthorised merchandise. AI can indeed distort and harm reputations, highlighting the need for enforceable regulations to protect public figures from image-damaging AI attacks. Many actors are now adding clauses to their contracts to protect themselves from potential misuse through AI or similar technologies.

However, these crimes extend beyond celebrities. Anyone, regardless of public profile, can fall prey to malicious deepfakes, and the threat is especially acute for women, who are most often the targets of such content. What’s the solution? While it might seem simple to suggest deleting online profiles to avoid AI abuse, that is unrealistic in today’s world. Celebrities may have the resources to fight misuse, but the average person needs legal safeguards. It is equally important to adapt: as the technology advances, our ability to address and counter these threats should evolve in tandem, leaving us less shocked and better prepared to tackle them.

Uncle Ben was right: with great power comes great responsibility. And deepfakes have a lot of power. What we need are laws and regulations to make sure people use that power responsibly. Otherwise, we’re all in for a world of hurt.

wknd@khaleejtimes.com


