Social media services in Singapore will soon have to impose system-wide processes to enhance online safety for users, especially young users. This Code of Practice for Online Safety is one of the two proposed Codes of Practice by the government to beef up online safety.
Josephine Teo, minister for communications and information and minister-in-charge of smart nation and cybersecurity, said in a Facebook post that under the first code, social media services should put in place measures such as community standards to ensure that younger users' exposure to harmful or inappropriate content, including content promoting dangerous self-harm acts, is minimised.
She added that users who encounter or suffer from online harms - such as the non-consensual distribution of their intimate videos - should have access to user reporting mechanisms to flag such content to social media services for appropriate action. Separately, Channel NewsAsia reported that social media platforms must also produce an annual accountability report for publication on IMDA's website and that these platforms should "proactively detect and remove child sexual exploitation and abuse material and terrorism content".
Additionally, the government is proposing a Content Code for Social Media Services to protect users from "egregious harms" - such as sexual harm and self-harm, or content that can threaten public health. Racial or religious disharmony and intolerance are also among the areas of concern, according to CNA. Teo said that this second proposed code will allow IMDA to direct social media services to take action against harmful online content so as to protect users.
This comes months after Teo announced in March that the government will introduce new Codes of Practice to deal with harmful online content accessible by users in Singapore, especially children. "We acknowledge the efforts of the tech companies to better support their users, and are now engaging them on the codes," her Facebook post said. A public consultation exercise will commence next month.
Citing a survey done in January this year by the Sunlight Alliance for Action (AfA), a public-private-people partnership to tackle online harms, Teo said that one in two Singaporeans has personally experienced online harms, with teenagers and young adults forming the majority of those affected. Through various engagement sessions with over 300 stakeholders, the government has also noted calls to develop more support mechanisms and infrastructure for victims of online harms.
According to The Straits Times, the codes are likely to be added to the Broadcasting Act after the consultations. If passed, IMDA will have the authority to direct social media services to "disable access" to harmful online content for users in Singapore.
Under the new codes, content such as livestreams of mass shootings will be blocked, ST said, along with viral social media challenges that urge younger audiences to perform dangerous stunts. At the same time, the codes will also keep in mind "Singapore's unique context and sensitive issues such as race and religion", ST reported.
ST cited the example of a man posing as a Chinese woman named Sharon Liew and posting racially offensive tweets. He was eventually charged for stirring up racial tensions. Another example given by MCI was of an individual using a profile titled "NUS Atheist Society" which portrayed the Bible and the Quran in an offensive manner two years ago. Meanwhile, in 2021, there was also a poll which ranked local female Muslim religious teachers according to their sexual attractiveness. MCI's spokesperson told ST that it is still too early to provide specifics on what other consequences errant platforms could face, as the details are still being ironed out with the tech industry.
Across the border, Malaysia recently unveiled its Content Code 2022, aimed at fostering a robust content landscape in the country where freedom of expression and responsibility can coexist. First established in 2004, the code's latest iteration identifies eight key focus areas.
They are: upholding the rights of children in advertising; upholding the rights of persons with disabilities (PWD); ensuring ethical reporting of suicide cases; addressing the abuse of religion in ads; prohibiting online abuse and gender-based violence; addressing false content and its impact on the community; ensuring influencers and online marketplaces are guided by ad guidelines; and requiring disclosure of ads from influencers and paid-for space in news.