Singapore regulator studying age assurance for social media services

Singapore has already slapped app stores with age verification requirements, and is among the nations exploring age assurance measures for social media. A newly released government report assesses how the country’s six designated social media services – Facebook, Instagram, TikTok, X, YouTube and HardwareZone – are complying with Singapore’s Code of Practice for Online Safety for social media services.
It finds that while most “have largely put in place required user safety measures,” some are lagging in specific areas, suggesting that age assurance requirements for social platforms may be on the near horizon.
X is for Xtra child sexual exploitation: report
The Online Safety Assessment Report on Designated Social Media Services (DSMS) singles out X, the social network formerly known as Twitter, on the issue of child sexual exploitation material (CSEM). “X needs to improve the effectiveness of its efforts to detect and remove CSEM on its service,” it says. “In its annual report, X stated that it proactively detected and removed 6 pieces of CSEM originating from Singapore. However, our tests detected considerably more cases of CSEM originating from Singapore on X during the same period.”
Elon Musk’s pet project will thus be required to provide Singapore’s regulator, the Infocomm Media Development Authority (IMDA), with an update on “steps taken to improve the effectiveness of its measures against CSEM.”
Overall, the report says, DSMSs “need to take greater responsibility to protect children.” And while the online safety code does not require biometric age assurance measures, it notes that “age assurance technology has improved considerably in recent years,” and points to the IMDA’s decision to implement age assurance measures for app stores, starting in April 2025.
App store age check regulation sets stage for social intervention
Among the six platforms, TikTok has the highest overall rating, at four checks out of five, although it scores low in the specific category of user reporting and resolution.
HardwareZone, Instagram, Facebook and YouTube share second place with three and a half checks apiece. Both Instagram and TikTok are assessed to have the strongest user safety measures for children.
In resounding last place is X, with two and a half checks.
While all of the platforms had appropriate community guidelines in place, there were gaps in practice that allowed kids to view restricted material. Notably, “X did not effectively enforce its policies to restrict children’s accounts from viewing adult sexual content.” Tests, it says, “found that children’s accounts could easily find and access explicit adult sexual content, especially hardcore pornography, with simple search terms.”
IMDA’s Code of Practice for Online Safety for App Distribution Services goes into effect on March 31, 2025. And while the regulator has deemed the social media giants’ online safety measures “generally comprehensive,” its report will offer little comfort to firms that may have believed the app store regulation spared them further scrutiny in Singapore, but now find themselves in the crosshairs. IMDA says it is studying how social media services “should use age assurance technology to better protect children and youth online,” and it seems only a matter of time before the platforms join the app stores in browsing the age assurance aisle of the global biometrics market.
Article Topics
age verification | biometrics | children | digital identity | Singapore | social media