Wikipedia says OSA categories ensnare it in IDV regulations targeting social media

The Wikimedia Foundation, which runs Wikipedia, has launched a legal challenge to the UK’s Online Safety Act (OSA), claiming that its categories are too broad and that a digital identity verification requirement could put its contributors at risk.
A Medium article written by the Foundation’s lead counsel, Phil Bradley-Schmieg, argues that the OSA’s Categorization Regulations “place Wikipedia and its users at unacceptable risk of being subjected to the OSA’s toughest ‘Category 1’ duties, which were originally designed to target some of the UK’s riskiest websites,” including social media platforms.
Enforcement by Ofcom at that level, Bradley-Schmieg says, “would undermine the privacy and safety of Wikipedia volunteer users, expose the encyclopedia to manipulation and vandalism, and divert essential resources from protecting and improving Wikipedia and the other Wikimedia Projects.”
In effect, it means Wikipedia would have to collect personal information from contributors, which could chill participation.
In comments to the BBC, Wikimedia Foundation VP of global advocacy Rebecca MacKinnon notes that “we’ve seen in other parts of the world, when people do not feel safe contributing to Wikipedia, then they shy away from controversial topics that may be challenging to people who are powerful, and that reduces the quality and the usefulness of the encyclopedia.”
Nets to catch harmful content risk putting smaller, benign fish in danger
The request for a judicial review is significant in itself, coming from a non-profit that is typically seen as one of the more morally neutral or even benevolent names in big tech. But its larger implications point to a potential issue with the scope of the OSA and other online safety legislation worldwide: the question of what sites need to be policed, and what sites should be exempt because their value outweighs their risks.
The most notable instance of this to date concerns YouTube, which has been exempted from age assurance laws in Australia on account of its value for (and widespread use in) education.
Yet while the “carve-out” for YouTube raises questions about that site’s darker corners, and kids’ ability to access them, there would appear to be little ground on which to attack Wikipedia – which remains, at base, an encyclopedia – for hosting content that is harmful to children.
One major issue is that, if Wikipedia is included in Category 1, it will need to verify the identity of certain Wikipedia users, and a linked rule would “need to allow other (potentially malicious) users to block all unverified users from fixing or removing any content they post,” which “could mean significant amounts of vandalism, disinformation or abuse going unchecked on Wikipedia, unless volunteers of all ages, all over the world, undergo identity verification.”
The Foundation maintains that Wikipedia is not at all like social media sites, and that such a limitation would effectively hobble its model, which “relies on empowered volunteer users working together to decide what appears on the website.”
“This new duty would be exceptionally burdensome (especially for users with no easy access to digital ID),” it says. “Worse still, it could expose users to data breaches, stalking, vexatious lawsuits or even imprisonment by authoritarian regimes.”
The Wikimedia Foundation, says Bradley-Schmieg, is “not bringing a general challenge to the OSA as a whole, not even to the existence of the Category 1 duties themselves.” Rather, it focuses specifically on the scope of Category 1 duties coming into force either this year or in 2026.
We are not like social media: Wikimedia Foundation
Bradley-Schmieg argues that the Categorization Regulations were intentionally left vague to avoid the risk of loopholes. But the idea that casting a wider net will result in a better catch is “based on three flawed concepts.”
The first is an increasingly urgent issue for those working with algorithms and machine learning technologies: the generalization of “content recommender systems” under the banner of so-called AI. Bradley-Schmieg writes that “having any ‘algorithm’ on the site that ‘affects’ what content someone might ‘encounter,’ is seemingly enough to qualify popular websites for Category 1.”
That means features like Wikipedia’s Translation Recommendations or New Pages Feed, used by Wikipedia article reviewers to monitor for rule-breaking content, could earn the site a Category 1 designation.
Also at issue is content forwarding and sharing functionality – which the Foundation says dramatically increases the likelihood of a Category 1 classification, but which is poorly defined in the regulations – along with platform popularity.
Bradley-Schmieg says that “in assessing popularity, the regulations seemingly do not differentiate between users who visit the site just once a month, however briefly – for example, just to look up a date of birth on Wikipedia – versus those who spend hours each day ‘doomscrolling’ potentially harmful content on social media. All that matters is whether a website or app has several million UK visitors a month, total. Ofcom’s own research shows enormous differences in how educational services, like Wikipedia, are actually used in practice.”
Adding insult to injury, “as designed, the regulations will also fail to catch many of the services UK society is actually concerned about, like misogynistic hate websites.”
Time short to contest vague and excessive classification
One of the Foundation’s major complaints is that it has been banging this drum for some time now. “Despite widespread calls from academics and civil society, years of engagement with rulemakers, and even a general red-tape-reduction command from the UK’s Prime Minister and its Chancellor, other interests are still being prioritized. Eighteen months following the OSA’s entry into force, the official priority is still ‘the swift implementation of the Act’s duties.’”
There have certainly been calls for Ofcom to act swiftly in enforcing the OSA. The Wikimedia Foundation, by contrast, is asking Parliament to slow down and use its existing powers to “tailor the scope of Category 1,” exempting nonprofit organizations and educational projects and keeping other low-risk sites out of scope.
It is pushing against an enforcement campaign that is gaining momentum. “Ofcom is expected to make its first decisions about Category 1 status very soon,” the lawyer writes. “For services like Wikipedia to thrive, it is essential that new laws do not endanger charities and public interest projects.”