Email to Ofgem

So, the US has claimed victory against the UK.

Regular readers may recall that in February the UK’s Home Office secretly asked Apple to put a backdoor in the Advanced Data Protection encryption it offers as a feature for iCloud users. In March, Apple challenged the order. The US objected to the requirement that the backdoor should apply to all users worldwide. How dare the Home Office demand the ability to spy on Americans?

On Tuesday, US director of national intelligence Tulsi Gabbard announced the UK is dropping its demand for the backdoor in Apple’s encryption “that would have enabled access to the protected encrypted data of American citizens”. The key here is “American citizens”. The announcement – which the Home Office is refusing to comment on – ignores everyone else and also the requirement for secrecy. It’s safe to say that few other countries would succeed in pressuring the UK in this way.

As Bill Goodwin reports at Computer Weekly, the US deal does nothing to change the situation for people in Britain or elsewhere. The Investigatory Powers Act (2016) is unchanged. As Parmy Olson writes at Bloomberg, the Home Office can go on issuing Technical Capability Notices to Apple and other companies demanding information on their users, and the criminalization of disclosure will keep the companies silent. The Home Office can still order technology companies operating in the UK to weaken their security. And we will not know they’ve done it. Surprisingly, support for this point of view comes from the Federal Trade Commission, which has posted a letter to companies deploring foreign anti-encryption policy (ignoring how often undermining encryption has been US policy, too) and foreign censorship of Americans’ speech. This is far from over, even in the US.

Within the UK, the situation remains as dangerously uncertain as ever. With all countries interconnected, the UK’s policy risks the security of everyone everywhere. And, although US media may have forgotten, the US has long spied on its citizens by getting another country to do it.

Apple has remained silent, but so far has not withdrawn its legal challenge. Also continuing is the case filed by Privacy International, Liberty, and two individuals. In a recent update, PI says both legal cases will be heard over seven days in 2026, as much as possible in the open.

***

For non-UK folk: The Office of Gas and Electricity Markets (Ofgem) is the regulator for Britain’s energy market. Its job is to protect consumers.

To Ofgem:

Today’s Guardian (and many others) carries the news that Tesla EMEA has filed an application to supply British homes and businesses with energy.

Please do not approve this application.

I am a journalist who has covered the Internet and computer industries for 35 years. As we all know, Tesla is owned by Elon Musk. Quite apart from his controversial politics and actions within the US government, Elon Musk has shown himself to be an unstable personality who runs his companies recklessly. Many who have Tesla cars love them – but the cars have higher rates of quality control problems than those from other manufacturers, and Musk’s insistence on marketing the “Full Self-Driving” feature has cost lives, according to the US National Highway Traffic Safety Administration, which launched yet another investigation into the company just yesterday. In many cases, when individuals have sought data from Tesla to understand why their relatives died in car fires or crashes, the company has refused to help them. During the covid emergency, thousands of Tesla workers got covid because Musk insisted on reopening the Tesla factory. This is not a company people should trust with their homes.

With Starlink, Musk has exercised his considerable global power by turning off communications in Ukraine while it was fighting off Russian attacks. SpaceX launches continue to crash. According to the Children’s Commissioner’s latest report, far more children encounter pornography online on Musk’s X than on pornography sites, a problem that has gotten far worse since Musk took the platform over.

More generally, he is an enemy of workers’ rights. Misinformation on X helped fuel the Southport riots, and Musk himself has considered trying to oust Keir Starmer as prime minister.

Many are understandably awed by his technological ideas. But he uses these to garner government subsidies and undermine public infrastructure, which he is then able to wield as a weapon to suit his latest whims.

Musk is already far too powerful in the world. His actions in the White House have shown he is either unable to understand or entirely uninterested in the concerns and challenges that face people living on sums that to him seem negligible. He is even less interested in – and often actively opposes – social justice, fairness, and equity. No amount of separation between him and Tesla EMEA will be sufficient to counter his control of and influence over his company. Tesla’s board, just weeks ago, voted to award him $30 billion in shares to “energise and focus” him.

Please do not grant him a foothold in Britain’s public infrastructure. Whatever his company is planning, it does not have British interests at heart.

Ofgem is accepting public comments on Tesla’s application until close of business on Friday, August 22, 2025.

Illustration: Artist Dominic Wilcox’s Stained Glass Driverless Sleeper Car.

Wendy M. Grossman is an award-winning journalist. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast. Follow on Mastodon or Bluesky.

Magic math balls

So many ironies, so little time. According to the Financial Times (and syndicated at Ars Technica), the US government, which itself has traditionally demanded law enforcement access to encrypted messages and data, is pushing the UK to drop its demand that Apple weaken its encryption. Normally, you want to say, Look here, countries are entitled to have their own laws whether the US likes it or not. But this is not a law we like!

This all began in February, when the Washington Post reported that the UK’s Home Office had issued Apple with a Technical Capability Notice. Issued under the Investigatory Powers Act (2016) and supposed to be kept secret, the TCN demanded that Apple undermine the end-to-end encryption used for iCloud’s Advanced Data Protection feature. Much protest ensued, followed by two legal cases in front of the Investigatory Powers Tribunal, one brought by Apple, the other by Privacy International and Liberty. WhatsApp has joined Apple’s legal challenge.

Meanwhile, Apple withdrew ADP in the UK. Some people argued this didn’t really matter, as few used it, which I’d call a failure of user experience design rather than an indication that people didn’t care about it. More of us saw it as setting a dangerous precedent for both encryption and the use of secret notices undermining cybersecurity.

The secrecy of TCNs is clearly wrong, and it presents a moral hazard for governments, which may prefer to keep vulnerabilities secret so they can exploit them for surveillance. Hopefully, the Tribunal will eventually agree and force a change in the law. The Foundation for Information Policy Research (obDisclosure: I’m a FIPR board member) has published a statement explaining the issues.

According to the Financial Times, the US government is applying a sufficiently potent threat of tariffs to lead the UK government to mull how to back down. Even without that particular threat, it’s not clear how much the UK can resist. As Angus Hanton documented last year in the book Vassal State, the US has many well-established ways of exerting its influence here. And the vectors are growing; Keir Starmer’s Labour government seems intent on embedding US technology and companies into the heart of government infrastructure despite the obvious and increasing risks of doing so. When I read Hanton’s book earlier this year, I thought remaining in the EU might have provided some protection, but Caroline Donnelly warns at Computer Weekly that they, too, are becoming dangerously dependent on US technology, specifically Microsoft.

It’s tempting to blame everything on the present administration, but the reality is that the US has long used trade policy and treaties to push other countries into adopting laws regardless of their citizens’ preferences.

***

As if things couldn’t get any more surreal, this week the Trump administration *also* issued an executive order banning “woke AI” in the federal government. AI models are in future supposed to be “politically neutral”. So, as Kevin Roose writes at the New York Times, the culture wars are coming for AI.

The US president is accusing chatbots of “Marxist lunacy”, while the rest of the world calls them inaccurate, biased toward repeating and expanding historical prejudices, and inconsistent. We hear plenty about chatbots adopting Nazi tropes; I haven’t heard of one promoting workers’ and migrants’ rights.

If we know one thing about AI models it’s that they’re full of crap all the way down. The big problem is that people are deploying them anyway. At the Canary, Steve Topple reports that the UK’s Department for Work and Pensions admits in a newly-published report that its algorithm for assessing whether benefit claimants might commit fraud is ageist and racist. A helpful executive order would set must-meet standards for *accuracy*. But we do not live in those times.

The Guardian reports that two more Trump EOs expedite building new data centers, promote exports of American AI models, expand the use of AI in the federal government, and intend to solidify US dominance in the field. Oh, and Trump would really like it if people would stop calling it “artificial” and find a new name. Seven years ago, “aspirational intelligence” seemed like a good idea. But that was back when we heard a lot about incorporating ethics. So…”magic math ball”?

These days, development seems to proceed ethics-free. DWP’s report, for example, advocates retraining its flawed algorithm but says continuing to operate it is “reasonable and proportionate”. In 2021, for European Digital Rights Initiative, Agathe Balayn and Seda Gürses found, “Debiasing locates the problems and solutions in algorithmic inputs and outputs, shifting political problems into the domain of design, dominated by commercial actors.” In other words, no matter what you think is “neutral”, training data, model, and algorithms are only as “neutral” as their wider context allows them to be.

Meanwhile, nothing to curb the escalating waste. At 404 Media, Emanuel Maiberg finds that Spotify is publishing AI-generated songs from dead artists without anyone’s permission. On Monday, MSNBC’s Rachel Maddow told viewers that there’s so much “AI slop” about her that they’ve posted Is That Really Rachel? to catalog and debunk the fakes.

As Ed Zitron writes, the opportunity costs are enormous.

In the UK, the US, and many other places, data centers are threatening the water supply.

But sure, let’s make more of that.

Illustrations: Magic 8 ball toy (via frankieleon at Wikimedia).

Wendy M. Grossman is an award-winning journalist. Her website has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast. Follow on Mastodon or Bluesky.

Nephology

For an hour yesterday (June 5, 2025), we were treated to the spectacle of the US House Judiciary Committee, both Republicans and Democrats, listening – really listening, it seemed – to four experts defending strong encryption. The four: technical expert Susan Landau and lawyers Caroline Wilson-Palow, Richard Salgado, and Gregory Nojeim.

The occasion was a hearing on the operation of the Clarifying Lawful Overseas Use of Data Act (2018), better known as the CLOUD Act. It was framed as collecting testimony on “foreign influence on Americans’ data”. More precisely, the inciting incident was a February 2025 Washington Post article revealing that the UK’s Home Office had issued Apple with a secret demand that it provide backdoor law enforcement access to user data stored using the Advanced Data Protection encryption feature it offers for iCloud. This type of demand, issued under S253 of the Investigatory Powers Act (2016), is known as a “technical capability notice”, and disclosing its existence is a crime.

The four were clear, unambiguous, and concise, incorporating the main points made repeatedly over the last 35 years. Backdoors, they all agreed, imperil everyone’s security; there is no such thing as a hole only “good guys” can use. Landau invoked Salt Typhoon and, without ever saying “I warned you at the time”, reminded lawmakers that the holes in the telecommunications infrastructure that they mandated in 1994 became a cybersecurity nightmare in 2024. All four agreed that with so much data being generated by all of us every day, encryption is a matter of national security as well as privacy. Referencing the FBI’s frequent claim that its investigations are going dark because of encryption, Nojeim dissented: “This is the golden age of surveillance.”

The lawyers jointly warned that other countries, such as Canada and Australia, have similar provisions in national legislation that they could invoke in the same way. They made sensible suggestions for updating the CLOUD Act to set higher standards for nations signing up to data sharing: set criteria for laws and practices that they must meet; set criteria for what orders can and cannot do; and specify additional elements countries must include. The Act could be amended to include protecting encryption, on which it is currently silent.

The lawmakers reserved particular outrage for the UK’s audacity in demanding that Apple provide that backdoor access for *all* users worldwide. In other words, *Americans*.

Within the UK, a lot has happened since that February article. Privacy advocates and other civil liberties campaigners spoke up in defense of encryption. Apple soon withdrew ADP in the UK. In early March, the UK government and security services removed advice to use Apple encryption from their websites – a responsible move, but indicative of the risks Apple was being told to impose on its users. A closed-to-the-public hearing was scheduled for March 14. Shortly before it, Privacy International, Liberty, and two individual claimants filed a complaint with the Investigatory Powers Tribunal asking for the hearing to be held in public and disputing the lawfulness, necessity, and secrecy of TCNs in general. Separately, Apple appealed against the TCN.

On April 7, the IPT released a public judgment summarizing the more detailed ruling it provided only to the UK government and Apple. Short version: it rejected the government’s claim that disclosing the basic details of the case would harm the public interest. Both this case and Apple’s appeal continue.

As far as the US is concerned, however, that’s all background noise. The UK’s claim to be able to compel the company to provide backdoor access worldwide seems to have taken Congress by surprise, but a day like this has been on its way ever since 2014, when the UK included extraterritorial power in the Data Retention and Investigatory Powers Act (2014). At the time, no one could imagine how they would enforce this novel claim, but it was clearly something other governments were going to want, too.

This Judiciary Committee hearing was therefore a festival of ironies. For one thing, the US’s own current administration is hatching plans to merge government departments’ carefully separated databases into one giant profiling machine for US citizens. Second, the US has always regarded foreigners as less deserving of human rights than its own citizens; the notion that another country similarly privileges itself went down hard.

More germane, subsidiaries of US companies remain subject to the PATRIOT Act, under which, as the late Caspar Bowden pointed out long ago, the US claims the right to compel them to hand over foreign users’ data. The CLOUD Act itself was passed in response to Microsoft’s refusal to violate Irish data protection law by fulfilling a New York district judge’s warrant for data relating to an Irish user. US intelligence access to European users’ data under the PATRIOT Act has been the big sticking point that activist lawyer Max Schrems has used to scuttle a succession of US-EU data sharing arrangements under GDPR. Another may follow soon: in January, the incoming Trump administration fired most of the Privacy and Civil Liberties Oversight Board tasked with protecting Europeans’ rights under the latest such deal.

But, no mind. Feast, for a moment, on the thought of US lawmakers hearing, and possibly willing to believe, that encryption is a necessity that needs protection.

Illustrations: Gregory Nojeim, Richard Salgado, Caroline Wilson-Palow, and Susan Landau facing the Judiciary Committee on June 5, 2025.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast. Follow on Mastodon or Bluesky.

Unsafe

The riskiest system is the one you *think* you can trust. Put it in terms of encryption: the least secure encryption is encryption with unknown flaws. Because, in the belief that your communication or data is protected, you feel it’s safe to indulge in what in other contexts would be obviously risky behavior. Think of it like an unseen hole in a condom.

This has always been the most dangerous aspect of the UK government’s insistence that its technical capability notices remain secret. Whoever alerted the Washington Post to the notice Apple received a month ago, commanding it to weaken its Advanced Data Protection, performed an important public service. Now, Carly Page reports at TechCrunch, based on a blog posting by security expert Alec Muffett, that the UK government is recognizing that principle by quietly removing from its web pages advice – directed at people whose communications are at high risk, such as barristers and other legal professionals – to use that same encryption. Apple has since withdrawn ADP in the UK.

More important long-term, at the Financial Times, Tim Bradshaw and Lucy Fisher report that Apple has appealed the government’s order to the Investigatory Powers Tribunal. This will be, as the FT notes, the first time government powers under the Investigatory Powers Act (2016) to compel the weakening of security features will be tested in court. A ruling that the order was unlawful could be an important milestone in the seemingly interminable fight over encryption.

***

I’ve long had the habit of doing minor corrections on Wikipedia – fixing typos, improving syntax – as I find them in the ordinary course of research. But recently I have had occasion to create a couple of new pages, with the gratefully-received assistance of a highly experienced Wikipedian. At one time, I’m sure this was a matter of typing a little text, garlanding it with a few bits of code, and garnishing it with the odd reference, but standards have been rising all along, and now if you want your newly-created page to stay up it needs a cited reference for every statement of fact – a minimum of one per sentence. My modest pages had ten to twenty references, some servicing multiple items. Embedding the page matters, too, so you need to link mentions of your subject on related pages to the new one. Even then, some review editor may come along and delete the page if they think the subject is not notable enough or violates someone’s copyright. You can appeal, of course…and fix whatever they’ve said the problem is.

It should be easier!

All of this detailed work is done by volunteers, who discuss the decisions they make in full view on the talk page associated with every content page. Studying the more detailed talk pages is a great way to understand how the encyclopedia, and knowledge in general, is curated.

Granted, Wikipedia is not perfect. Its policy on primary sources can be frustrating, and errors in cited secondary sources can be difficult to correct. The culture can be hostile if you misstep. Its coverage is uneven. But, as Margaret Talbot reports at the New Yorker and Amy Bruckman writes in her 2022 book, Should You Believe Wikipedia?, all those issues are fully documented.

Early on, Wikipedia was often the butt of complaints from people angry that this free encyclopedia made by *amateurs* threatened the sustainability of Encyclopaedia Britannica (which has survived though much changed). Today, it’s under attack by Elon Musk and the Heritage Foundation, as Lila Shroff writes at The Atlantic. The biggest danger isn’t to Wikipedia’s funding; there’s no offer anyone can make that would lead to a sale. The bigger vulnerability is the safety of individual editors. Scold they may, but as a collective they do important work to ensure that facts continue to matter.

***

Firefox users are expressing more and more unhappiness about the direction Mozilla is taking with Firefox. The open source browser’s historic importance is outsized compared to its worldwide market share, which as of February 2025 is 2.63%, according to Statcounter. A long tail of other browsers is based on it, such as LibreWolf, Waterfox, and the privacy-protecting Tor Browser.

The latest complaint, as Liam Proven and Thomas Claburn write at The Register, is that Mozilla has removed its commitment not to sell user data from Firefox’s terms and conditions and privacy policy. Mozilla responded that the company doesn’t sell user data “in the way that most people think about ‘selling data'” but needed to change the language because of jurisdictional variations in what the word “sell” means. Still, the promise is gone.

This follows Mozilla’s September 2024 decision, reported by Richard Speed at The Register, to turn on by default a “privacy-preserving feature” to track users that led the NGO noyb to file a complaint with the Austrian data protection authority. And a month ago, Mark Hachman reported at PC World that Mozilla is building access to third-party generative AI chatbots into Firefox, and there are reports that it’s adding “AI-powered tab grouping”.

All of these are basically unwelcome, and of all organizations Mozilla should have been able to foresee that. Go away, AI.

***

Molly White is expertly covering the Trump administration’s proposed “US Crypto Reserve”. It remains only to add Rachel Maddow, who compared it to having a strategic reserve of Beanie Babies.

Illustrations: Beanie baby pelican.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast. Follow on Mastodon or Bluesky.

Isolate

Yesterday, the Global Encryption Coalition published a joint letter calling on the UK to rescind its demand that Apple undermine (“backdoor”) the end-to-end encryption on its services. The Internet Society is taking signatures until February 20.

The background: on February 7, Joseph Menn reported at the Washington Post (followed by Dominic Preston at The Verge) that in January the office of the Home Secretary sent Apple a technical capability notice under the Investigatory Powers Act (2016) ordering it to provide access to content that anyone anywhere in the world has uploaded to iCloud and encrypted with Apple’s Advanced Data Protection.

Technical capability notices are supposed to be secret. It’s a criminal offense to reveal that you’ve been sent one. Apple can’t even tell users that their data may be compromised. (This kind of thing is why people publish warrant canaries.) Menn notes that even if Apple withdraws ADP in the UK, British authorities will still demand access to encrypted data everywhere *else*. So it appears that if the Home Office doesn’t back down and Apple is unwilling to cripple its encryption, the company will either have to withdraw ADP across the world or exit the UK market entirely. At his Odds and Ends of History blog, James O’Malley calls the UK’s demand stupid, counter-productive, and unworkable. At TechRadar, Chiara Castro asks who’s next, and quotes Big Brother Watch director Silkie Carlo: “unprecedented for a government in any democracy”.

When the UK first began demanding extraterritorial jurisdiction for its interception rules, most people wondered how the country thought it would be able to impose it. That was 11 years ago; it was one of the new powers codified in the Data Retention and Investigatory Powers Act (2014) and kept in its replacement, the IPA in 2016.

Governments haven’t changed – they’ve been trying to undermine strong encryption in the hands of the masses since 1991, when Phil Zimmermann launched PGP – but the technology has, as Graham Smith recounted at Ars Technica in 2017. Smartphones are everywhere. People store their whole lives on them, and giant technology companies encrypt both the device itself and the cloud backups. Government demands have changed to reflect that, shifting focus from the individual (key escrow, key lengths) to the technology provider (client-side scanning, encrypted messaging – see also the EU – and now cloud storage).

At one time, a government could install a secret wiretap by making a deal with a legacy telco. The Internet’s proliferation of communications providers changed that for a while. During the resulting panic the US passed the Communications Assistance for Law Enforcement Act (1994), which requires Internet service providers and telecommunications companies to install wiretap-ready equipment – originally for telephone calls, later broadband and VOIP traffic as well.

This is where the UK government’s refusal to learn from others’ mistakes is staggering. Just four months ago, the US discovered Salt Typhoon, a giant Chinese hack into its core telecommunications networks that was specifically facilitated by…by…CALEA. To repeat: there is no such thing as a magic hole that only “good guys” can use. If you undermine everyone’s privacy and security to facilitate law enforcement, you will get an insecure world where everyone is vulnerable. The hack has led US authorities to promote encrypted messaging.

Joseph Cox’s recent book, Dark Wire, touches on this. It’s a worked example of what law enforcement internationally can do if given open access to all the messages criminals send across a network when they think they are operating in complete safety. Yes, the results were impressive: hundreds of arrests, dozens of tons of drugs seized, masses of firearms impounded. But, Cox writes, all that success was merely a rounding error in the global drug trade. Universal loss of privacy and security versus a rounding error: it’s the definition of “disproportionate”.

It remains to be seen what Apple decides to do and whether we can trust what the company tells us. At his blog, Alec Muffett is collecting ongoing coverage of events. The Future of Privacy Forum celebrated Safer Internet Day, February 11, with an infographic showing how encryption protects children and teens.

But set aside for a moment all the usual arguments about encryption, which really haven’t changed in over 30 years because mathematical reality hasn’t.

In the wider context, Britain risks making itself a technological backwater. First, there’s the backdoored encryption demand, which threatens every encrypted service. Second, there’s the impact of the onrushing Online Safety Act, which comes into force in March. Ofcom, the regulator charged with enforcing it, is issuing thousands of pages of guidance that make it plain that only large platforms will have the resources to comply. Small sites, whether businesses, volunteer-run Fediverse instances, blogs, established communities, or web boards, will struggle even if Ofcom starts to do a better job of helping them understand their legal obligations. Many will likely either shut down or exit the UK, leaving the British Internet poorer and more isolated as a result. Ofcom seems to see this as success.

It’s not hard to predict the outcome if these laws converge in the worst possible timeline: a second Brexit, this one online.

Illustrations: T-shirt (gift from Jen Persson).

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast. Follow on Mastodon or Bluesky.

Review: Dark Wire

Dark Wire
by Joseph Cox
PublicAffairs (Hachette Group)
ISBNs: 9781541702691 (hardcover), 9781541702714 (ebook)

One of the basic principles that emerged as soon as encryption software became available to ordinary individuals on home computers was this: everyone should encrypt their email so the people who really need the protection don’t stick out as targets. At the same time, the authorities were constantly warning that if encryption weren’t controlled by key escrow, an implanted back door, or restrictions on its strength, it would help hide the activities of drug traffickers, organized crime, pedophiles, and terrorists. This same argument continues today.

Today, billions of people have access to encrypted messaging via WhatsApp, Signal, and other services. Governments still hate it, but they *use* it; the UK government is all over WhatsApp, as multiple public inquiries have shown.

In Dark Wire: The Incredible True Story of the Largest Sting Operation Ever, Joseph Cox, one of the four founders of 404 Media, takes us on a trip through law enforcement’s adventures in encryption, as police try to identify and track down serious criminals making and distributing illegal drugs by the ton.

The story begins with Phantom Secure, a scheme that stripped down BlackBerry devices, installed PGP to encrypt emails, and ensured the devices could exchange emails only with other Phantom Secure devices. The service became popular among all sorts of celebrities, politicians, and other non-criminals who value privacy – but not *only* them. All perfectly legal.

One of my favorite moments comes early, when a criminal debating whether to trust a new contact decides he can because he has one of these secure BlackBerries. The criminal trusted the supply chain; surely no one would have sold him one of these things without thoroughly checking that he wasn’t a cop. Spoiler alert: he was a cop. That sale helped said cop and his colleagues in the United States, Australia, Canada, and the Netherlands infiltrate the network, arrest a bunch of criminals, and shut it down – eventually, after setbacks, and with the non-US forces frustrated and amazed by US Constitutional law limiting what agents were allowed to look at.

Phantom Secure’s closure made a hole in the market as security-conscious criminals scrambled to find alternatives. It was rapidly filled by competitors working with modified phones: Encrochat and Sky ECC. As users migrated to these services and law enforcement worked to infiltrate and shut them down as well, former Phantom Secure salesman “Afgoo” had a bright idea, which he offered to the FBI: why not build their own encrypted app and take over the market?

The result was Anom. From the sounds of it, some of its features were quite cool. For example, the app itself hid behind an innocent-looking calculator, which acted as a password gateway. Type in the right code, and the messaging app appeared. The thing sold itself.
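Anom’s internals have never been published, but the behavior Cox describes – a working calculator that doubles as a password gateway – is easy to picture in code. A minimal sketch in Python; the passcode, the hash check, and the function names are all invented for the example:

    # Minimal sketch of a decoy-calculator gateway. Purely illustrative:
    # Anom's real internals are not public; the passcode and hashing
    # scheme here are invented.
    import hashlib

    UNLOCK_HASH = hashlib.sha256(b"1984*42=").hexdigest()  # hypothetical passcode

    def open_messaging_app() -> str:
        return "[hidden messaging app launched]"

    def calculator(expression: str):
        """Behave like a calculator unless given the unlock code."""
        if hashlib.sha256(expression.encode()).hexdigest() == UNLOCK_HASH:
            return open_messaging_app()
        try:
            # Evaluate plain arithmetic only; no builtins available.
            return eval(expression, {"__builtins__": {}}, {})
        except Exception:
            return "Error"

    print(calculator("2+2"))       # 4 - looks like an ordinary calculator
    print(calculator("1984*42="))  # the hidden app appears

To anyone who idly opens the app, it really is a calculator; only the exact passcode reveals anything else.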

Of course, the FBI insisted on some modifications. Behind the scenes, Anom devices sent copies of every message to the FBI’s servers. Eventually, the floods of data the agencies harvested this way led to 500 arrests on one day alone, and the seizure of hundreds of firearms and dozens of tons of illegal drugs and precursor chemicals.

Some of the techniques the criminals used are fascinating in their own right. One method of in-person authentication involved the unique serial number on a bank note, sent in advance; the mule delivering the money simply had to show they had the bank note – a physical one-time pad. Banks themselves were rarely used. Instead, cash would be stored in safe houses in various countries, and the money would never have to cross borders. So: no records, no transfers to monitor. All of this spilled open for law enforcement because of Anom.
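In effect, the bank note is a one-time shared secret attached to a physical token. A toy sketch of the logic, assuming the serial number is agreed in advance over a separate channel; the hashing step and the serial itself are invented for illustration:

    # Toy model of the bank-note handshake: the serial number, sent ahead
    # over a separate channel, acts as a one-time authentication token.
    import hashlib

    def commit(serial: str) -> str:
        """What the recipient stores in advance: a digest, not the serial."""
        return hashlib.sha256(serial.encode()).hexdigest()

    def verify(presented_serial: str, stored_digest: str) -> bool:
        """The mule shows the physical note; its serial must match."""
        return hashlib.sha256(presented_serial.encode()).hexdigest() == stored_digest

    expected = commit("AB12345678")        # hypothetical serial, shared in advance
    print(verify("AB12345678", expected))  # True: hand over the cash
    print(verify("ZZ99999999", expected))  # False: walk away

Like any one-time pad, it only works once: a serial number that has been used is burned.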

And yet. Cox waits until the end to voice reservations. All those seizures and arrests barely made a dent in the world’s drug trade – a “rounding error”, Cox calls it.

Return of the Four Horsemen

The themes at this week’s Scrambling for Safety, hosted by the Foundation for Information Policy Research, are topical but not new since the original 1997 event: chat control, the Online Safety Act, and AI in government decision making.

The EU’s chat control proposal would require platforms served with a detection order to scan people’s phones for both new and previously known child sexual abuse material – that is, client-side scanning. Robin Wilton prefers to call this “preemptive monitoring” to clarify that it’s an attack.

Yet it’s not fit even for its stated purpose, as Claudia Peersman showed, based on research conducted at REPHRAIN. They set out to develop a human-centric evaluation framework for the AI tools needed at the scale chat control would require. Their main conclusion: AI tools are not ready to be deployed on end-to-end-encrypted private communications. This was also Ross Anderson‘s argument in his 2022 paper on chat control (PDF) showing why it won’t meet the stated goals. Peersman also noted an important oversight: none of the stakeholder groups consulted in developing these tools include the children they’re supposed to protect.

This led Jen Persson to ask: “What are we doing to young people?” Children may not understand encryption, she said, but they do know what privacy means to them, as numerous researchers have found. If violating children’s right to privacy by dismantling encryption means ignoring the UN Convention on the Rights of the Child, “What world are we leaving for them? How do we deal with a lack of privacy in trusted relationships?”

All this led Wilton to comment that if the technology doesn’t work, that’s hard evidence that it is neither “necessary” nor “proportionate”, as human rights law demands. Yet, Persson pointed out, legislators keep passing laws that technologists insist are unworkable. Studies in both France and Australia have found that there is no viable privacy-preserving age verification technology – but the UK’s Online Safety Act (2023) still requires it.

In both examples – and in introducing AI into government decision making – a key element is false positives, which swamp human adjudicators in any large-scale automated system. In discussing the practicalities of the Online Safety Act, Graham Smith cited the recent case of Marieha Hussein, who carried a placard at a pro-Palestinian protest that depicted former prime minister Rishi Sunak and former home secretary Suella Braverman as coconuts. After two days of evidence, the judge concluded the placard was (allowed) political satire rather than (criminal) racial abuse. What automated system can understand that the same image means different things in different contexts? What human moderator has two days? Platforms will simply remove content that would never have led to a conviction in court.
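The arithmetic behind “swamp” is simple base-rate math. A back-of-envelope sketch in Python with invented numbers – a scanner that is right 99% of the time in both directions, hunting content that occurs once in 10,000 items:

    # Base-rate arithmetic with invented numbers: even an accurate
    # classifier buries reviewers in false positives when the target is rare.
    items = 10_000_000            # messages scanned (assumed)
    prevalence = 1 / 10_000       # rarity of the target content (assumed)
    true_positive_rate = 0.99     # sensitivity (assumed)
    false_positive_rate = 0.01    # innocent items wrongly flagged (assumed)

    actual_bad = items * prevalence                            # 1,000
    caught = actual_bad * true_positive_rate                   # 990
    false_alarms = (items - actual_bad) * false_positive_rate  # 99,990

    share = false_alarms / (caught + false_alarms)
    print(f"correct flags: {caught:,.0f}; false alarms: {false_alarms:,.0f}")
    print(f"{share:.0%} of what human moderators see is a false alarm")

However you vary those assumed numbers, rarity dominates: almost everything flagged is innocent.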

Or, Monica Horten asked, how does a platform identify the new offense of coercive control?

Lisa Sugiura, who campaigns to end violence against women and girls, had already noted that the same apps parents install so they can monitor their children (and are reluctant to give up later) are openly advertised with slogans like “Use this to check up on your cheating wife”. (See also Cindy Southworth, 2010, on stalker apps.) The dots connect to reports Persson heard at last week’s Safer Internet Forum that young women find it hard to refuse when potential partners want parental-style monitoring rights, and then find it even harder to extricate themselves from abusive situations.

Design teams don’t count the cost of this sort of collateral damage, just as their companies have little liability for the human cost of false positives, and the narrow lens of child safety also ignores these wider costs. Yet they can be staggering: the 1990s US law requiring ISPs to facilitate wiretapping, CALEA, created the vulnerability that enabled widescale Chinese spying in 2024.

Wilton called laws that essentially treat all of us as suspects “a rule to make good people behave well, instead of preventing bad people from behaving badly”. Big organized crime cases like the Silk Road, Encrochat, and Sky ECC relied on infiltration, not breaking encryption. Once upon a time, veterans know, there were four horsemen always cited by proponents of such laws: organized crime, drug dealers, terrorists, and child abusers. We hear little about the first three these days.

All of this will take new forms as the new government adopts AI in decision making with the same old hopes: increased efficiency, lowered costs. Government is not learning from the previous waves of technoutopianism, which brought us things like the Post Office Horizon scandal, said Gavin Freeguard. Under data protection law we were “data subjects”; now we are becoming “decision subjects” whose voices are not being heard.

There is some hope: Swee Leng Harris sees improvements in the reissued data bill, though she stresses that it’s important to remind people that the “cloud” is really material data centers that consume energy (and use water) at staggering rates (see also Kate Crawford’s book, Atlas of AI). It’s no help that UK ministers and civil servants move on to other jobs at pace, ensuring there is no accountability. As Sam Smith said, computers have made it possible to do things faster – but also to go wrong faster at a much larger scale.

Illustrations: Time magazine’s 1995 “Cyberporn” cover, the first children and online pornography scare, based on a fraudulent study.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast. Follow on Mastodon.

The safe place

For a long time, people feared that technical decisions – new domain names ($), cooption of open standards or software, laws mandating data localization – would splinter the Internet. “Balkanize” was heard a lot.

A panel at the UK Internet Governance Forum a couple of weeks ago focused on this exact topic, and was mostly self-congratulatory. Which is when it occurred to me that the Internet may not *be* fragmented, but it *feels* fragmented. Almost every day I encounter some site I can’t reach: email goes into someone’s spam folder, the site or its content is off-limits because it’s been geofenced to conform with copyright or data protection laws, or the site mysteriously doesn’t load, with no explanation. The most likely explanation for the latter is censorship built into the Internet feed by the ISP or the establishment whose connection I’m using, but they don’t actually *say* that.

The ongoing attrition at Twitter is exacerbating this feeling, as the users I’ve followed for years continue to migrate elsewhere. At the moment, it takes accounts on several other services to keep track of everyone: definite fragmentation.

Here in the UK, this sense of fragmentation may be about to get a lot worse, as the long-heralded Online Safety bill – written and expanded until it’s become a “Frankenstein bill”, as Mark Scott and Annabelle Dickson report at Politico – hurtles toward passage. This week saw fruitless debates on amendments in the House of Lords, and it will presumably be back in the Commons shortly thereafter, where it could be passed into law by this fall.

A number of companies have warned that the bill, particularly if it passes with its provisions undermining end-to-end encryption intact, will drive them out of the country. I’m not sure British politicians are taking them seriously; so often such threats are idle. But in this case, I think they’re real, not least because post-Brexit Britain carries so much less global and commercial weight, a reality some politicians are in denial about. WhatsApp, Signal, and Apple have all said openly that they will not compromise the privacy of their masses of users elsewhere to suit the UK. Wikipedia has warned that including it in the requirement to age-verify its users will force it to withdraw rather than violate its principles about collecting as little information about users as possible. The irony is that the UK government itself runs on WhatsApp.

As Ian McRae, the director of market intelligence for prospective online safety regulator Ofcom, showed in a presentation at UKIGF, Wikipedia would be just one of the estimated 150,000 sites within the scope of the bill. Ofcom is ramping up to deal with the workload, an effort the agency expects to cost £169 million between now and 2025.

In a legal opinion commissioned by the Open Rights Group, barristers at Matrix Chambers find that clause 9(2) of the bill is unlawful. This, as Thomas Macaulay explains at The Next Web, is the clause that requires platforms to proactively remove illegal or “harmful” user-generated content. In fact: prior restraint. As ORG goes on to say, there is no requirement to tell users why their content has been blocked.

Until now, the impact of most badly-formulated British legislative proposals has been sort of abstract. Data retention, for example: you know that pervasive mass surveillance is a bad thing, but most of us don’t really expect to feel the impact personally. This is different. Some of my non-UK friends will only use Signal to communicate, and I doubt a day goes by that I don’t look something up on Wikipedia. I could use a VPN for that, but if the only way to use Signal is to have a non-UK phone? I can feel those losses already.

And if people think they dislike those ubiquitous cookie banners and consent clickthroughs, wait until they have to age-verify all over the place. Worst case: this bill will be an act of self-harm that one day will be as inexplicable to future generations as Brexit.

The UK is not the only one pursuing this path. Age verification in particular is catching on. The US states of Virginia, Mississippi, Louisiana, Arkansas, Texas, Montana, and Utah have all passed legislation requiring it; Pornhub now blocks users in Mississippi and Virginia. The likelihood is that many more countries will try to copy some or all of its provisions, just as Australia’s law requiring the big social media platforms to negotiate with news publishers is spawning copies in Canada and California.

This is where the real threat of the “splinternet” lies. Think of requiring 150,000 websites to implement age verification and proactively police content. Many of those sites, as the law firm Mishcon de Reya writes, may not even be based in the UK.

This means that any site located outside the UK – and perhaps even some that are based here – will be asking, “Is it worth it?” For a lot of them, it won’t be. Which means that however much the Internet retains its integrity, the British user experience will be the Internet as a sea of holes.

Illustrations: Drunk parrot in a Putney garden (by Simon Bisson; used by permission).

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Follow on Mastodon.