A thousand small safety acts


“The safest place in the world to be online.”

I think I remember that slogan from Tony Blair’s 1990s government, when it primarily related to ecommerce. It later morphed into child safety – in 2010, for example, when the first Digital Economy Act was passed, and in 2017, when the Online Safety Act, which passed in 2023 and entered into force in March 2025, was still just a green paper. Now, Ofcom is charged with making it reality.

As prior net.wars posts attest, the 2017 green paper began with the idea that social media companies could be forced to pay, via a levy, for the harm they cause. What remains of that idea is the focus on large, dominant companies. The green paper nodded toward proportionate design for small businesses and startups, but the large platforms pull the attention: rich, powerful, and huge. Yet the law that’s emerged from these years of debate takes in hundreds of thousands of divergent services.

On Mastodon, I’ve been watching lawyer Neil Brown scrutinize the OSA with a particular eye on its impact on the wide ecosystem of what we might call “the community Internet” – the thousands of web boards, blogs, chat channels, and who-knows-what-else with no business model because they’re not businesses. As Brown keeps finding in his attempts to provide these folks with tools they can use, they are struggling to understand and comply with the act.

First things first: everyone agrees that online harm is bad. “Of course I want people to be safe online,” Brown says. “I’m lucky, in that I’m a white, middle-aged geek. I would love everyone to have the same enriching online experience that I have. I don’t think the act is all bad.” Nonetheless, he sees many problems with both the act itself and how it’s being implemented. In contacts with organizations critiquing the act, he’s been surprised to find how many agree with him about the problems for small services. However, “Very few agreed on which was the worst bit.”

Brown outlines two classes of problem: the act is “too uncertain” for practical application, and the burden of compliance is “too high for insufficient benefit”.

Regarding the uncertainty, his first question is, “What is a user?” Is someone who reads net.wars a user, or just a reader? Do they become a user if they post a comment? Do they begin interacting with the site when they read a comment, when they post one, or only when they reply to another user’s comment? In the fediverse, is someone who reads the postings he makes via his private Mastodon instance a user of that instance? Is someone who replies from a different instance to such a posting a user of his instance?

His instance has two UK users – surely insignificant. Parliament didn’t set a threshold for the “significant number of UK users” that brings a service into scope, so Ofcom says it has no answer to that question. But if you go by percentage, 100% of his user base is in Britain. Does that make Britain his “target market”? Does having a domain name in the UK namespace? What is a target market for the many community groups running infrastructure for free software projects? They just want help with planning or translation; they’re not trying to sign up users.

Regarding the burden, the act requires service providers to perform a risk assessment for every service they run. A free software project will probably have a dozen or so – a wiki, messaging, a documentation server, and so on. Brown, admittedly not your average online participant, estimates that he himself runs 20 services from his home. Among them is a photo-sharing server, for which the law would have him write contractual terms of service for the only other user – his wife.

“It’s irritating,” he says. “No one is any safer for anything that I’ve done.”

So this is the mismatch. The law and Ofcom imagine a business with paid staff signing up users to profit from them. What Brown encounters is more like a stressed-out woman managing a small community for fun after she puts the kids to bed.

Brown thinks a lot could be done to make the act less onerous for the many sites that are clearly not the problem Parliament was trying to solve. Among his suggestions: carve out low-risk services. This isn’t just a question of size, since a tiny terrorist cell or a small ring sharing child sexual abuse material can pose acres of risk. But Brown thinks it shouldn’t be too hard to come up with criteria to rule services out of scope, such as a limited user base coupled with a service “any reasonable person” would consider low risk.

Meanwhile, he keeps an In Memoriam list of the law’s casualties to date. Some have managed to move or find new owners; others are simply gone. Not on the list are non-UK sites that now simply block UK users. Others, as Brown says, just won’t start up. The result is an impoverished web for all of us.

“If you don’t want a web dominated by large, well-lawyered technology companies,” Brown sums up, “don’t create a web that squeezes out small low-risk services.”

Illustrations: Early 1970s cartoon illustrating IT project management.

Wendy M. Grossman is an award-winning journalist. Her Web site has extensive links to her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast. Follow on Mastodon or Bluesky.
