December 28, 2023
SINGAPORE – A deepfake video of Deputy Prime Minister Lawrence Wong promoting an investment scam has been circulating on Facebook and Instagram.
In the video, DPM Wong’s mouth has been noticeably altered to synchronise with a fake voice-over, which mimics the pitch and intonation of his real voice. The Straits Times’ logo appears in the top right-hand corner of the video.
The video uses modified footage of DPM Wong from a media doorstop interview recorded by ST.
An SPH Media spokeswoman said the video in question was not created or published by the company or ST.
“It has come to our attention that there is a video attributed to The Straits Times, featuring Deputy Prime Minister Lawrence Wong endorsing commercial projects, circulating online,” said the spokeswoman.
“We urge members of the public to stay vigilant and not circulate videos of unknown sources.”
In response to queries from ST, the police said a report had been made about the video.
In a Facebook post on Dec 11, DPM Wong said he was aware of deepfake scam posts and messages showing him endorsing products, as well as ones spreading misinformation that the Government was planning to implement a Covid-19 circuit breaker.
“These are all falsehoods. Let’s stay vigilant and discerning online,” said DPM Wong.
The use of deepfake technology to spread disinformation or dupe scam victims is a growing concern.
National University of Singapore associate professor Terence Sim, whose research covers deepfakes and other digitally altered images, told ST in June that deepfake technology has become easier to use over the years. “All a scammer needs is a few photos of the target’s face, which can be taken from social media, to create a deepfake. That is scary,” he said then, adding that deepfakes using audio alone may also be used to trick victims.
“A video clip of the target’s voice, which can be as short as 10 to 15 seconds, can be used to create an audio deepfake.”