August 16, 2024
PETALING JAYA – A 15-second audio clip is all it takes to clone the voice of a person to near perfection using artificial intelligence (AI) tools, says Bukit Aman Commercial Crime Investigation Department director Comm Datuk Seri Ramli Mohamed Yoosuf (pic).
He said that as AI technology continues to develop, the software and tools used to replicate a person’s voice are becoming more sophisticated too.
He said this has proven to be a boon to online scam artists, who have become even more aggressive in promoting non-existent investment schemes using deepfake images and videos of prominent politicians and businessmen spliced with their AI-generated voices.
“With just a 15-second audio file of a person’s voice, AI can turn it into a lengthy conversation and manipulate that voice to say whatever the scammers want. This is how advanced AI has become.
“Some of these deepfakes are so real that only the person in the video can tell that it is not genuine,” Comm Ramli said.
He added that Internet users should always be wary and sceptical of such videos, which are often posted on social media platforms.
“Surely it is far-fetched and highly unlikely for a high-ranking official such as the Prime Minister to be promoting an investment scheme even when the video and audio bear a striking resemblance to him and his voice.
“Moreover, these prominent figures cannot be stepping up with a rebuttal each time such fake videos of them emerge,” he said.
Comm Ramli was responding to the latest warning issued by the Securities Commission Malaysia (SC) on investment scams using deepfakes created with AI to mimic prominent people and reputable companies.
The SC said scammers were using the deepfake videos to deceive unsuspecting social media users into signing up for non-existent investment schemes, then siphoning funds from them.
It said it was working with social media platforms and enforcement agencies to remove such videos and take action against those behind them.
Universiti Sains Malaysia Centre for Policy Research criminologist Datuk Dr P. Sundramoorthy stressed the importance of educating the public about the risks associated with deepfakes, and of recognising and reporting them to the authorities.
“These deepfake scams can exploit the trust people have in well-known individuals and companies that the scammers have manipulated for their gain.
“The authorities should engage tech companies and AI developers to create solutions that can mitigate the misuse of deepfake technology,” he said.
Cybersecurity specialist Fong Choong Fook said that with the ongoing rise of online scams, the public should be wary of anything and everything they see on social media.
“Stop, think and process before trusting such content. Verify it with the relevant agencies.
“The sophistication of AI today makes it almost impossible to ascertain if such videos are real or fake by simply viewing them. To play it safe, we have to get into the practice of verifying and validating everything we come across in cyberspace,” he said.
Certified fraud examiner Raymon Ram said there are AI-driven tools and software designed to detect deepfakes and analyse videos for signs of manipulation. He urged Internet users to read up on AI-related matters, as awareness is their “first line of defence”.
“First, understand what deepfakes are and how they work. Approach unsolicited offers with caution, especially those that seem too good to be true. Cross-check information from multiple reliable sources,” he added.
Among the deepfake videos that had been posted on social media in recent months were those of Prime Minister Datuk Seri Anwar Ibrahim, business tycoon Robert Kuok, and various investment gurus.