March 7, 2022
SINGAPORE – Online platforms will be legally required to act promptly on user reports of harmful content and to implement safeguards, such as content filters, to protect children, as Singapore takes steps to counter growing online harm.
Unveiling plans for the new codes of practice, Minister for Communications and Information Josephine Teo told Parliament on Friday (March 4) that the aim is to raise the standard of online safety, in line with new laws enacted in recent times in Germany, Australia and Britain.
She said Singapore’s new codes will focus on three areas: child safety, user reporting and platform accountability.
Like existing codes of practice administered by the Infocomm Media Development Authority, the new codes will have the force of law, Mrs Teo said during the debate on her ministry’s budget.
Under the new codes, platforms will have to ensure robust systems are in place to minimise the exposure of children and young people to harmful content, such as content filters for child accounts and ways for parents to supervise and guide their children online.
They will also have to set up easily accessible ways for users to report harmful content. The platforms will have to be responsive in evaluating and acting on these reports, and inform users of actions taken in a timely manner.
The codes of practice will also require platform operators to regularly publish reports on the effectiveness of their measures, including information on how prevalent harmful content is on their platforms, user reports they received and acted on, and the systems and processes they have in place to address such content.
Said Mrs Teo: “Users can then compare the approaches taken by platforms and make informed decisions about which to engage or disengage.”
In her speech, the minister cited a survey commissioned by The Straits Times, which found that while two-thirds of children aged seven to nine use smartphones daily, a third of parents do not know who their children interact with on social media.
Mrs Teo also noted a National Youth Council poll which found that two-thirds of youth had experienced online harm such as harassment and unwanted advances, causing many to develop distrust towards others and experience stress and anxiety.
Dangerous social media “challenges” are one example of harmful online trends.
Mrs Teo cited how a 10-year-old Italian girl died last year after taking part in an online “blackout challenge”, which encouraged users to choke themselves until they passed out while live-streaming on the viral video platform TikTok.
Another is the use of online platforms for criminal activities.
For instance, a gunman live-streamed himself firing on Muslims in a New Zealand mosque in 2019, and rioters who stormed Capitol Hill in the United States last year used social media to organise themselves and amplify their messages, Mrs Teo said.
In response to the growing prevalence of online harm, many governments around the world have enacted new laws.
In 2017, for instance, Germany enacted its Network Enforcement Act, which requires platforms to act on unlawful content reported by users.
Last July, Australia enacted an Online Safety Act, which introduces basic safety expectations for online service providers.
Meanwhile, Britain’s draft Online Safety Bill aims to create a duty of care for online platforms towards their users, including requirements to take action against harmful content.
Singapore ranked fourth in a 2020 study of online safety for children in 30 countries by international think-tank DQ Institute, Mrs Teo noted.
She said this brings some comfort, but Singapore, too, must step up efforts to keep online spaces safe, especially for children.
“Online platforms accessible by users in Singapore can, and must, take greater responsibility for user safety.
“They should endeavour to keep online spaces free from harmful content, including age-inappropriate content, such as violent and graphic content, and content that promotes sexual violence.”