Revival

There appears to be media consensus: “Bluesky is dead.”

At The Commentary, James Meigs calls Bluesky “an expression of the left’s growing hypersensitivity to ideas leftists find offensive”, and says he accepts exTwitter’s “somewhat uglier vibe” in return for “knowing that right-wing views aren’t being deliberately buried”. Then he calls Bluesky “toxic” and a “hermetically sealed social-media bubble”.

At New Media and Marketing, Rich Meyer says Bluesky is in decline and engagement is dropping, and exTwitter is making a comeback.

At Slate, Alex Kirshner and Nitish Pahwa complain that Bluesky feels “empty”, say that its too-serious users are abandoning it because it isn’t fun, and compare it to a “small liberal arts college” and exTwitter to a “large state university”.

At The Spectator, Sean Thomas regrets that “Bluesky is dying” – and claims to have known it would fail from his first visit to the site, “a bad vegan cafe, full of humorless puritans”.

Many of these pieces – Mark Cuban at Fortune, for example, and Megan McArdle at the Washington Post – blame a “lack of diversity of thought”.

As Mike Masnick writes on TechDirt in its defense (Masnick is a Bluesky board member), “It seems a bit odd: when something is supposedly dying or irrelevant, journalists can’t stop writing about it.”

Have they so soon forgotten 2014, when everyone was writing that Twitter was dead?

Commentators may be missing that success for Bluesky looks different: it’s trying to build a protocol-driven ecosystem, not a site. Twitter once had such an ecosystem, but destroyed it as its ad-based business model took over. Both Bluesky and Mastodon, which media largely ignores, aim to let users create their own experience and are building tools that give users as much control as possible. It seems to offend some commentators that one of those tools lets you block people you don’t want to deal with, which is an odd complaint, since blocking is the one feature every social site has.

All social media have ups and downs, especially when they’re new (I really wonder how many of these commentators experienced exTwitter in its early days or have looked at Truth Social’s user numbers). Settling into a new environment and rebuilding take time – it may look like the old place, but its affordances are different, and old friends are missing. Meanwhile, anecdotally, some seem to be leaving social media entirely, driven away by privacy issues, toxic behavior, distaste for platform power and its owners, or simply distracted by life. Few of us *have* to use social media.

***

In 2002, the UK’s Financial Services Authority was the first to implement an EU directive allowing private organizations to issue their own electronic money without a banking license if they could meet the capital requirements. At the time, the idea seemed kind of cute, especially since there was a plan to waive some of the requirements for smaller businesses. Everyone wanted micropayments; here was a framework of possibility.

And then nothing much happened. The Register’s report (the first link above) said that organizations such as the Post Office, credit card companies, and mobile operators were considering launching emoney offerings. If they did, the results sank without trace. Instead, we’re all using credit/debit cards to pay for stuff online, just as we were 23 years ago. People are reluctant to trust weird, new-fangled forms of money.

Then, in 2008, came cryptocurrencies – money as lottery ticket.

Last week, the Wall Street Journal reported that Amazon, Wal-Mart, and other multinationals are exploring stablecoins as a customer payment option – in other words, issuing their own cryptocurrencies, pegged to the US dollar. As Andrew Kassel explains at Investopedia, the result could be to bypass credit cards and banks, saving billions in fees.

It’s not clear how this would work, but I’m suspicious of the benefits to consumers. Would I have to buy a company’s stablecoin before doing business with it? And maintain a floating balance? At Axios, Brady Dale explores other possibilities. Ultimately, it sounds like a return to the 1970s, before multipurpose credit cards, when people had store cards from the retailers they used frequently, and paid a load of bills every month. Dale seems optimistic that this could be a win for consumers as well as retailers, but I can’t really see it.

In other words, the idea seems less cute now, less fun technological experiment, more rapacious. There’s another, more disturbing, possibility: the return of the old company town. Say you work for Amazon or Wal-Mart, and they offer you a 10% bonus for taking your pay in their stablecoin. You can’t spend it anywhere but their store, but that’s OK, right, because they stock everything you could possibly want? A modern company town doesn’t necessarily have to be geographical.

I’ve long thought that company towns, which allowed companies to effectively own employees, are the desired endgame for the titans. Elon Musk is heading that way with Starbase, Texas, now inhabited primarily by SpaceX employees, as Elizabeth Crisp reports at The Hill.

I don’t know if the employees who last month voted enthusiastically for the final incorporation of Starbase realize how abusive those old company towns were.

Illustrations: The Starbase sign adjoining Texas Highway 4, in 2023 (via Jenny Hautmann at Wikimedia).

Wendy M. Grossman is an award-winning journalist. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast. Follow on Mastodon or Bluesky.

A thousand small safety acts

“The safest place in the world to be online.”

I think I remember that slogan from Tony Blair’s 1990s government, when it primarily related to ecommerce. It morphed into child safety – for example, in 2010, when the first Digital Economy Act was passed, or 2017, when the Online Safety Act, passed in 2023 and entering into force in March 2025, was but a green paper. Now, Ofcom is charged with making it reality.

As prior net.wars posts attest, the 2017 green paper began with the idea that social media companies could be forced to pay, via a levy, for the harm they cause. The key remaining element of that is a focus on the large, dominant companies. The green paper nodded toward designing proportionately for small businesses and startups. But the large platforms pull the attention: rich, powerful, and huge. The law that’s emerged from these years of debate takes in hundreds of thousands of divergent services.

On Mastodon, I’ve been watching lawyer Neil Brown scrutinize the OSA with a particular eye on its impact on the wide ecosystem of what we might call “the community Internet” – the thousands of web boards, blogs, chat channels, and who-knows-what-else with no business model because they’re not businesses. As Brown keeps finding in his attempts to help provide these folks with tools they can use, they are struggling to understand and comply with the act.

First things first: everyone agrees that online harm is bad. “Of course I want people to be safe online,” Brown says. “I’m lucky, in that I’m a white, middle-aged geek. I would love everyone to have the same enriching online experience that I have. I don’t think the act is all bad.” Nonetheless, he sees many problems with both the act itself and how it’s being implemented. In contacts with organizations critiquing the act, he’s been surprised to find how many agree with him about the problems for small services. However, “Very few agreed on which was the worst bit.”

Brown outlines two classes of problem: the act is “too uncertain” for practical application, and the burden of compliance is “too high for insufficient benefit”.

Regarding the uncertainty, his first question is, “What is a user?” Is someone who reads net.wars a user, or just a reader? Do they become a user if they post a comment? Do they start interacting with the site when they read a comment, when they make a comment, or only when they reply to another user’s comment? In the fediverse, is someone who reads postings he makes via his private Mastodon instance its user? Is someone who replies from a different instance to that posting a user of his instance?

His instance has two UK users – surely insignificant. Parliament didn’t set a threshold for the “significant number of UK users” that brings a service into scope, so Ofcom says it has no answer to that question. But if you go by percentage, 100% of his user base is in Britain. Does that make Britain his “target market”? Does having a domain name in the UK namespace? What is a target market for the many community groups running infrastructure for free software projects? They just want help with planning, or translation; they’re not trying to sign up users.

Regarding the burden, the act requires service providers to perform a risk assessment for every service they run. A free software project will probably have a dozen or so – a wiki, messaging, a documentation server, and so on. Brown, admittedly not your average online participant, estimates that he himself runs 20 services from his home. Among them is a photo-sharing server, for which the law would have him write contractual terms of service for the only other user – his wife.

“It’s irritating,” he says. “No one is any safer for anything that I’ve done.”

So this is the mismatch. The law and Ofcom imagine a business with paid staff signing up users to profit from them. What Brown encounters is more like a stressed-out woman managing a small community for fun after she puts the kids to bed.

Brown thinks a lot could be done to make the act less onerous for the many sites that are clearly not the problem Parliament was trying to solve. Among them: carve out low-risk services. This isn’t just a question of size, since a tiny terrorist cell or a small ring sharing child sexual abuse material can pose acres of risk. But Brown thinks it shouldn’t be too hard to come up with criteria that rule services out of scope, such as a limited user base coupled with a service “any reasonable person” would consider low risk.

Meanwhile, he keeps an In Memoriam list of the law’s casualties to date. Some have managed to move or find new owners; others are simply gone. Not on the list are non-UK sites that now simply block UK users. Others, as Brown says, just won’t start up. The result is an impoverished web for all of us.

“If you don’t want a web dominated by large, well-lawyered technology companies,” Brown sums up, “don’t create a web that squeezes out small low-risk services.”

Illustrations: Early 1970s cartoon illustrating IT project management.

Wendy M. Grossman is an award-winning journalist. Her Web site has extensive links to her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast. Follow on Mastodon or Bluesky.

Negative externalities

A sheriff’s office in Texas searched a giant nationwide database of license plate numbers captured by automatic cameras to look for a woman they suspected of self-managing an abortion. As Rindala Alajazi writes at EFF, that’s 83,000 cameras in 6,809 networks belonging to Flock Safety, many of them in states where abortion is legal or protected as a fundamental right until viability.

We’ve known something like this was coming ever since 2022, when the US Supreme Court overturned Roe v. Wade and returned the power to regulate abortion to the individual US states. The resulting unevenness made it predictable that the strongest opponents to legal abortion would turn their attention to interstate travel.

The Electronic Frontier Foundation has been warning for some time about Flock’s database of camera-captured license plates. Recently, Jason Koebler reported at 404 Media that US Immigration and Customs Enforcement has been using Flock’s database to find prospects for deportation. Since ICE does not itself have a contract with Flock, it’s been getting local law enforcement to perform searches on its behalf. “Local” refers only to the law enforcement personnel; they have access to camera data that’s shared nationally.

The point is that once the data has been collected it’s very hard to stop mission creep. On its website, Flock says its technology is intended to “solve and eliminate crime” and “protect your community”. That might have worked when we all agreed what was a crime.

***

A new MCTD Cambridge report makes a similar point about menstrual data, when sold at scale. Now, I’m from the generation that managed fertility with a paper calendar, but time has moved on, and fertility tracking apps allow a lot more of the self-quantification that can be helpful in many situations. As Stephanie Felsberger writes in introducing the report, menstrual data is highly revealing of all sorts of sensitive information. Privacy International has studied period-tracking apps, and found that they’ve improved but still pose serious privacy risks.

On the other hand, I’m not so sure about the MCTD report’s third recommendation – that government build a public tracker app within the NHS. The UK doesn’t have anything like the kind of divisive rhetoric around abortion that the US does, but the fact remains that legal abortion is a 1967 carve-out from an 1861 law. In the UK, procuring an abortion is criminal *except* during the first 24 weeks, or if the mother’s life is in danger, or if the fetus has a serious abnormality. And even then, sign-off is required from two doctors.

Investigations and prosecutions of women under that 1861 law have been rising, as Shanti Das reported at the Guardian in January. Pressure in the other direction from US-based anti-choice groups such as the Alliance for Defending Freedom has also been rising. For years it’s seemed like this was a topic no one really wanted to reopen. Now, health care providers are calling for decriminalization, and, as Hannah Al-Oham reported this week, there are two such proposals currently in front of Parliament.

Also relevant: a month ago, Phoebe Davis reported at the Observer that in January the National Police Chiefs’ Council quietly issued guidance advising officers to search homes for drugs that can cause abortions in cases of stillbirths and to seize and examine devices to check Internet searches, messages, and health apps to “establish a woman’s knowledge and intention in relation to the pregnancy”. There was even advice on how to bypass the requirement for a court order to access women’s medical records.

In this context, it’s not clear to me that a publicly owned app is much safer or more private than a commercial one. What’s needed is open source code that can be thoroughly examined and that keeps all data on the device itself, encrypted, in a segregated storage space over which the user has control. And even then… you know, paper had a lot of benefits.
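For concreteness, “all data on the device, encrypted, under the user’s control” might look something like the minimal sketch below. It assumes the widely used open-source cryptography package for the symmetric encryption; the storage path and record format are invented for illustration, not taken from any existing app.

```python
# Minimal sketch of on-device, user-controlled storage for cycle data.
# Illustrative only: assumes the open-source "cryptography" package;
# the storage path and record format are invented, not any real app's.

import json
from pathlib import Path
from cryptography.fernet import Fernet

STORE = Path.home() / ".cycle-tracker" / "entries.enc"   # stays local, never synced

def create_key() -> bytes:
    """Generate a key the user keeps, e.g. in the device's own keystore."""
    return Fernet.generate_key()

def save_entries(key: bytes, entries: list[dict]) -> None:
    """Encrypt the full record set and write it to local storage only."""
    STORE.parent.mkdir(parents=True, exist_ok=True)
    STORE.write_bytes(Fernet(key).encrypt(json.dumps(entries).encode("utf-8")))

def load_entries(key: bytes) -> list[dict]:
    """Decrypt and return the records; without the key the file is noise."""
    return json.loads(Fernet(key).decrypt(STORE.read_bytes()))

key = create_key()
save_entries(key, [{"date": "2025-06-01", "note": "cycle start"}])
print(load_entries(key))
```

Nothing leaves the device, and without the key the stored file is unreadable – though a seized phone plus a compelled passcode defeats even this, which is part of why paper keeps its appeal.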

***

This week the UK Parliament passed the Data (Use and Access) bill, which now just needs a royal signature to become law. At its site, the Open Rights Group summarizes the worst provisions, mostly a list of ways the bill weakens citizens’ rights over their data.

Brexit was sold to the public on the basis of taking back national sovereignty. But, as then-MEP Felix Reda said the morning after the vote, national sovereignty is a fantasy in a globalized world. Decisions about data privacy can’t be made imagining they are only about *us*.

As ORG notes, the bill has led European Digital Rights to write to the European Commission asking for a review of the UK’s adequacy status. This decision, granted in 2020, was due to expire in June 2025, but the Commission granted a six-month extension to allow the bill’s passage to complete. In 2019, when the UK was at peak Brexit chaos and it seemed possible that the Conservative then-government would allow the UK to leave the EU with no deal in place, net.wars noted the risk to data flows. The current Labour government, with its AI and tech policy ambitions, ought to be more aware of the catastrophe losing adequacy would present. And yet.

Illustrations: Map from the Center for Reproductive Rights showing the current state of abortion rights across the US.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast and a regular guest on the TechGrumps podcast. Follow on Mastodon or Bluesky.

Nephology

For an hour yesterday (June 5, 2025), we were treated to the spectacle of the US House Judiciary Committee, both Republicans and Democrats, listening – really listening, it seemed – to four experts defending strong encryption. The four: technical expert Susan Landau and lawyers Caroline Wilson-Palow, Richard Salgado, and Gregory Nojeim.

The occasion was a hearing on the operation of the Clarifying Lawful Overseas Use of Data Act (2018), better known as the CLOUD Act. It was framed as collecting testimony on “foreign influence on Americans’ data”. More precisely, the inciting incident was a February 2025 Washington Post article revealing that the UK’s Home Office had issued Apple with a secret demand that it provide backdoor law enforcement access to user data stored using the Advanced Data Protection encryption feature it offers for iCloud. This type of demand, issued under S253 of the Investigatory Powers Act (2016), is known as a “technical capability notice”, and disclosing its existence is a crime.

The four were clear, unambiguous, and concise, incorporating the main points made repeatedly over the last 35 years. Backdoors, they all agreed, imperil everyone’s security; there is no such thing as a hole only “good guys” can use. Landau invoked Salt Typhoon and, without ever saying “I warned you at the time”, reminded lawmakers that the holes in the telecommunications infrastructure that they mandated in 1994 became a cybersecurity nightmare in 2024. All four agreed that with so much data being generated by all of us every day, encryption is a matter of both national security and privacy. Referencing the FBI’s frequent claim that its investigations are going dark because of encryption, Nojeim dissented: “This is the golden age of surveillance.”

The lawyers jointly warned that other countries such as Canada and Australia have similar provisions in national legislation that they could invoke. They made sensible suggestions for updating the CLOUD Act to set higher standards for nations signing up to data sharing: set criteria for laws and practices that they must meet; set criteria for what orders can and cannot do; and specify additional elements countries must include. The Act could also be amended to protect encryption, on which it is currently silent.

The lawmakers reserved particular outrage for the UK’s audacity in demanding that Apple provide that backdoor access for *all* users worldwide. In other words, *Americans*.

Within the UK, a lot has happened since that February article. Privacy advocates and other civil liberties campaigners spoke up in defense of encryption. Apple soon withdrew ADP in the UK. In early March, the UK government and security services removed advice to use Apple encryption from their websites – a responsible move, but indicative of the risks Apple was being told to impose on its users. A closed-to-the-public hearing was scheduled for March 14. Shortly before it, Privacy International, Liberty, and two individual claimants filed a complaint with the Investigatory Powers Tribunal asking for the hearing to be held in public and disputing the lawfulness, necessity, and secrecy of TCNs in general. Separately, Apple appealed against the TCN.

On April 7, the IPT released a public judgment summarizing the more detailed ruling it provided only to the UK government and Apple. Short version: it rejected the government’s claim that disclosing the basic details of the case will harm the public interest. Both this case and Apple’s appeal continue.

As far as the US is concerned, however, that’s all background noise. The UK’s claim to be able to compel the company to provide backdoor access worldwide seems to have taken Congress by surprise, but a day like this has been on its way ever since 2014, when the UK included extraterritorial power in the Data Retention and Investigatory Powers Act (2014). At the time, no one could imagine how they would enforce this novel claim, but it was clearly something other governments were going to want, too.

This Judiciary Committee hearing was therefore a festival of ironies. For one thing, the US’s own current administration is hatching plans to merge government departments’ carefully separated databases into one giant profiling machine for US citizens. Second, the US has always regarded foreigners as less deserving of human rights than its own citizens; the notion that another country similarly privileges itself went down hard.

More germane, subsidiaries of US companies remain subject to the PATRIOT Act, under which, as the late Caspar Bowden pointed out long ago, the US claims the right to compel them to hand over foreign users’ data. The CLOUD Act itself was passed in response to Microsoft’s refusal to violate Irish data protection law by fulfilling a New York district judge’s warrant for data relating to an Irish user. US intelligence access to European users’ data under the PATRIOT Act has been the big sticking point that activist lawyer Max Schrems has used to scuttle a succession of US-EU data sharing arrangements under GDPR. Another may follow soon: in January, the incoming Trump administration fired most of the Privacy and Civil Liberties Oversight board tasked to protect Europeans’ rights under the latest such deal.

But, no mind. Feast, for a moment, on the thought of US lawmakers hearing, and possibly willing to believe, that encryption is a necessity that needs protection.

Illustrations: Gregory Nojeim, Richard Salgado, Caroline Wilson-Palow, and Susan Landau facing the Judiciary Committee on June 5, 2025.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast. Follow on Mastodon or Bluesky.