“Platforms Now Responsible For Online Safety” — How Malaysia’s Online Safety Act Will Work
The act shifts the burden from user reports or parental controls to the platforms.
This Spotlight is sponsored by the Malaysian Communications and Multimedia Commission (MCMC).
The Online Safety Act is expected to come into force in January 2026.

It will fundamentally change how online harm, especially child sexual abuse material (CSAM), is dealt with in Malaysia. Instead of relying mainly on user reports or parental controls, the law places the legal and operational responsibility squarely on digital platforms themselves.
According to Dr Fauziah Mohd Saad, the act represents a clear shift away from reacting after harm has already happened.
"The act fundamentally shifts the burden of online safety from users to platforms. Governance moves away from a 'react after harm' model towards preventing and reducing harm by design, making platforms accountable for how their systems amplify risk," said the counselling and psychology expert from Universiti Pendidikan Sultan Idris (UPSI).
Under the act, platforms will be required to take active and preventive measures to detect, block, and remove CSAM in a timely and effective way.
This includes building child protection directly into how platforms are designed and operated, not treating safety as an optional add-on.
"Safety-by-design and child-centric product changes become part of governance. Features such as recommendation systems, direct messaging, live streaming, search, and discoverability are no longer purely product choices but safety considerations," said Dr Fauziah.

In practice, this could mean safer default settings for minors, stricter controls on who can message children, and tighter limits on how harmful content spreads.
According to Dr Fauziah, the act will hold platforms accountable and set clearer expectations around internal decision-making and governance.
"Platforms are likely to face more structured oversight, including requests for information, compliance audits, mandated reporting and penalties for systemic failures, not just individual illegal posts. They will also be expected to demonstrate risk management, safety governance, internal controls, and documented decision-making, rather than relying on ad hoc takedowns," added Dr Fauziah.
While the law is ambitious, the expert warned that improvements won't appear overnight, and progress may vary between platforms.

"In the first one to three months, the public may see faster removals, clearer reporting channels and early safety-default changes for minors, particularly on large platforms with mature moderation systems," said Dr Fauziah.
More durable improvements, however, will take time.
"The biggest gains are likely to emerge in the nine- to 18-month window, driven by product redesign and risk reduction in algorithms and features such as discoverability, recommendations and direct messaging," she explained.
Dr Fauziah cautioned that public trust will depend heavily on whether platforms can show measurable results.
"Without clear thresholds and metrics such as time-to-action and re-upload rates, compliance may not translate into real child safety outcomes," she said.
