The second proposed Content Code for Social Media Services covers areas of content assessed to be "egregious online harms" on social media platforms such as sexual harm, self-harm and racial or religious intolerance. It will allow the IMDA to direct social media services to take action against harmful online content to protect users.
A public consultation on these Codes is expected to take place in July.
Both Codes will have the force of law and be enforced through appropriate legislative updates.
In more detail
The first proposed Code is intended to target social media services that have been designated as having high reach or high risk.
Such social media services would be expected to do the following:
- Implement community standards and content moderation mechanisms to mitigate users' exposure to certain harmful content, which should at least include child sexual exploitation and abuse material, as well as terrorism content.
- Offer tools enabling users to reduce and mitigate their exposure to unwanted content, for example, by including content filters for child accounts and mechanisms for parents to supervise and guide their children online.
- Enable users to report harmful content and unwanted interactions via an easy-to-use and permanently available reporting mechanism.
- Assess and act appropriately in response to harmful content reports.
- Proactively detect and remove child sexual exploitation and abuse material, as well as terrorism content.
- Produce an annual accountability report for publication on the IMDA's website.
The second proposed Code targets content areas assessed to be "egregious online harms" on social media platforms. These include areas such as sexual harm, self-harm, public health, public security and racial or religious intolerance.
Under this Code, the IMDA will be granted powers to direct any social media service accessible from Singapore to disable access to specified types of egregious harmful content or disallow specified accounts to communicate such content and interact with users in Singapore.
The prevalence of online harms has been a major concern for regulators and platforms alike, given their divisive potential and detriment to individual well-being.
The government has taken several initiatives to address online harms thus far, including:
- Requiring internet content providers to comply with the Internet Code of Practice.
- Empowering the regulatory authority to take down content that goes against public interest, public morality, public order and national harmony.
- Directing internet service providers to block access to prohibited websites, and requiring them to offer filtering services for parents to subscribe to.
- Requiring over-the-top and video-on-demand streaming services offering content rated NC-16 or higher to provide parental controls.
The proposed Codes are, therefore, a welcome addition that would help raise Singapore's baseline standard for online safety.
They are also consistent with efforts currently being undertaken in other jurisdictions such as Australia and the UK, which have recently introduced, or are about to introduce, online safety legislation.
© 2022 Baker & McKenzie.Wong & Leow. All rights reserved. Baker & McKenzie.Wong & Leow is incorporated with limited liability and is a member firm of Baker & McKenzie International, a global law firm with member law firms around the world. In accordance with the common terminology used in professional service organizations, reference to a "principal" means a person who is a partner, or equivalent, in such a law firm. Similarly, reference to an "office" means an office of any such law firm. This may qualify as "Attorney Advertising" requiring notice in some jurisdictions. Prior results do not guarantee a similar outcome.