Scammers are getting more sophisticated and convincing day by day. Cybersecurity experts warned that one of the methods now employed by fraudsters is audio deepfake, where AI (artificial intelligence) is used to replicate voices and even faces to make scams more realistic and believable.
Irene Corpuz, founding partner and board member at Women in Cybersecurity Middle East, cited a case in May this year in which a British engineering company in Hong Kong lost around HK$200 million (Dh94 million) to criminals who used an AI-generated video call.
“Scammers will engage you in phone conversations so that they can record your voice and use it in a future scam,” said Corpuz, adding that this can also be done in Zoom meetings where there are multiple participants.
“When a victim hears the voice or sees a video of a friend or a loved one, the scam becomes more believable,” she explained.
For audio deepfakes, Corpuz said the public should pay attention to the words used. She told Khaleej Times: “Be cautious if you receive a call from an unknown number, especially if the caller initiates a conversation with questions that require a ‘yes’ or ‘no’ answer.”
How do scammers use this? Corpuz explained: “Scammers can initiate calls with a chatbot, and the chatbot will confirm a transaction request with a question such as: ‘Would you like to initiate a payment? Is this correct?’ This is when the scammers can use the recorded ‘yes’ or ‘no’ answer.”
“So, avoid answering with affirmative phrases like ‘yes’ or ‘no’ to unknown callers. Scammers may record your voice and use it to authorise fraudulent transactions or trick automated systems that use voice recognition for identity verification,” Corpuz reiterated.
Scammers can also use verification tactics to make the other person on the line believe it is a legitimate call. “The scammer would say ‘the first digits of your Emirates ID are 784-19… and then ….???’ Take note that the scammer is trying to make you fall into a trap by supplying the remaining digits of your Emirates ID,” Corpuz said, adding, “Once they convince you that they are legit, then they can execute their modus. Most scam callers also pretend they are from banks, central banks, the police, and utility companies.”
JD Ackley, CEO at Raizor, a conversational AI deployment and services organisation, told Khaleej Times: “There are things to look out for but first, be aware of any unsolicited calls. Typically, scammers are looking for any reason you might take a call from them and their premise will be generic and they will zero in on things you mention — because bots are programmed to take direction based on your responses.”
“Scammers always have a vague idea of your demographic and nothing more specific — except what you tell them,” added Ackley, explaining: “If they are asking probing questions about who you are or your habits, be cautious and ask them why they need to know.”
A second thing to be vigilant about is a request for payment in unusual forms. Ackley said: “A legitimate business will not request payment in any type of gift card or money transfer.”
“And finally, if it is something you really need to address, ask for a call back number and tell them you will call them back. Scammers will do anything to keep you on the call and extract payment during that call, but only a legitimate business will have a proper way for you to call them back,” added Ackley, a veteran of the contact centre space with more than 25 years of experience.
Barney Almazar, director of corporate-commercial department at Gulf Law, noted: “Scammers often target you during times when you're most likely to have your guard down, such as during your commute to work or at dinner time.”
“This tactic makes it easier for them to conceal their fraudulent intentions. Additionally, during these periods, it can be challenging to reach bank hotlines, providing scammers with a crucial window to exploit before you can report the fraud,” he added.
Almazar said education and awareness play a crucial role in combating the impact of audio deepfake and other related scams. He noted: “Under the UAE Cybercrime Law (Federal Decree-Law No. 5 of 2012), there are stringent measures in place to combat these abuses. Article 2 explicitly criminalises electronic fraud and impersonation, imposing severe penalties including imprisonment, fines and subsequent deportation.”
“Furthermore, Article 21 prohibits the recording, sharing, or publishing of personal data without consent, highlighting the legal ramifications of unauthorised voice recordings,” he added.
Almazar continued: “Vigilance and critical thinking play important roles when dealing with deepfakes or any kind of scam. Don’t just believe what your eyes see or what your ears hear. Check and verify before doing anything.”