The censorship-industrial complex

In a sign of the times, the Academy of Motion Picture Arts and Sciences has announced that in 2029 the annual Oscars ceremony will move from ABC to YouTube, where it will be viewable worldwide for free. At Variety, Clayton Davis speculates about how advertising will work – perhaps mid-roll? The obvious answer is to place the ads between the announcement of the nominees and the opening of the envelope to reveal the winner. Cliff-hanger!

The move is notable. Ratings for the awards show have been declining for decades. In 1960, 45.8 million people in the US watched the Oscars live, before home video recording existed. The peak came in 1998, with 55.2 million viewers – after VCRs, but before YouTube. In 2024 the audience was 19.5 million; this year, the Oscars drew under 18.1 million viewers.

On top of that, broadcast TV itself is in decline. One of the biggest audiences ever gathered for a single episode of a scripted show was in 1983: 100 million, for the series finale of M*A*S*H. In 2004, the Friends finale drew 52.5 million. In 2019, the Big Bang Theory finale drew just 17.9 million. YouTube has more than 2.7 billion active users a month. Whatever ABC was paying for the Oscars, reach may matter more than money, especially in an industry that is also threatened by shrinking theater audiences. In the UK, YouTube is the second most-watched TV service ($), after only the BBC.

The move suggests that the US audience itself may no longer be as uniquely important as it once was. The Academy's decision fits a broader pattern of similar trends.

***

During this week’s San Francisco power outage, an apparently unexpected consequence was that non-functioning traffic lights paralyzed many of the city’s driverless Waymo taxis. In its blog post, the company says, “While the Waymo Driver is designed to handle dark traffic signals as four-way stops, it may occasionally request a confirmation check to ensure it makes the safest choice. While we successfully traversed more than 7,000 dark signals on Saturday, the outage created a concentrated spike in these requests. This created a backlog that, in some cases, led to response delays contributing to congestion on already-overwhelmed streets.”

Friends in San Francisco note that the California Driver’s Handbook (under “Traffic Control”) is specific about what to do in such situations: treat the intersection as if it had all-way stop signs. It’s a great example of trusting human social cooperation.

Robocars are, of course, not in on this game. In an uncertain situation they can’t read us. So the volume of requests overwhelmed the remote human controllers and the cars froze, blocking intersections and even sidewalks. Waymo suspended the service temporarily, and says it is updating the cars’ software to make them act “more decisively” in such situations in future.

Of course, all these companies want to do away with the human safety drivers and remote controllers as they improve cars’ programming to incorporate more edge cases. I suspect, however, that we’ll never really reach the point where humans aren’t needed; there will always be new unforeseen issues. Driving a car is a technical challenge. Sharing the roads with others is a social effort requiring the kind of fuzzy flexibility computers are bad at. Getting rid of the humans will mean deciding what level of dysfunction we’re willing to accept from the cars.

Self-driving taxis are coming to London in 2026, and I’m struggling to imagine it. It’s a vastly more complex city to navigate than San Francisco, and has many narrow, twisty little streets to flummox programmers used to newer urban grids.

***

The US State Department has announced sanctions barring five people and potentially their families from obtaining visas to enter or stay in the US, labeling them radical activists and weaponized NGOs. They are: Imran Ahmed, an ex-Labour advisor and founder and CEO of the Center for Countering Digital Hate; Clare Melford, founder of the Global Disinformation Index; Thierry Breton, a former member of the European Commission, whom under secretary of state for public diplomacy Sarah B. Rogers called “a mastermind” of the Digital Services Act; and Josephine Ballon and Anna-Lena von Hodenberg, managing directors of the independent German organization HateAid, which supports people affected by digital violence. Ahmed, who lived in Washington, DC, has filed suit to block his deportation; a judge has issued a temporary restraining order.

It’s an odd collection to cast as a “censorship-industrial complex”. Breton is no longer in a position to make laws calling US Big Tech to account; his inclusion is presumably a warning shot to anyone seeking to promote further regulation of this type. GDI’s site’s last “news” posting was in 2022. In August 2025, HateAid helped a client file suit against Google, and in July it sued X for failing to remove criminal antisemitic content. The Center for Countering Digital Hate has also been in court to oppose antisemitic content on X and Instagram; in 2024 Elon Musk called it a “criminal organization”. There was more logic to “the three people in hell” taught to an Irish friend as a child (Cromwell, Queen Elizabeth I, and Martin Luther).

Whatever the Trump administration’s intention, the result is likely to simply add more fuel to initiatives to lessen European dependence on US technology.

Illustrations: Christmas tree in front of the US Capitol in 2020 (via Wikimedia).

Wendy M. Grossman is an award-winning journalist. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast. Follow on Mastodon or Bluesky.

Watching YouTube

One of the reasons it’s so difficult to figure out what to do about misinformation, malinformation, and disinformation online is the difficulty of pinpointing how online interaction translates to action in the real world. The worst content on social media has often come from traditional media or been posted by an elected politician.

At least, that’s how it seems to text-based people like me. This characteristic, along with the quick-hit compression of 140 (later 280) characters, was the (minority) appeal of Twitter. It’s also why legacy media pays so little attention to what’s going on in game worlds, struggles with TikTok, and underestimates the enormous influence of YouTube. The notable exception is the prolific Chris Stokel-Walker, who’s written books about both YouTube and TikTok.

Stokel-Walker has said he decided to write YouTubers because the media generally only notices YouTube when there’s a scandal. Touring those scandals occupies much of filmmaker Alex Winter’s now-showing biography of the service, The YouTube Effect.

The film begins by interviewing co-founder Steve Chen, who giggles a little uncomfortably as he admits that he and co-founders Chad Hurley and Jawed Karim thought it could be a video version of Hot or Not?. In 2006, Google bought the year-old site for $1.65 billion in Google stock, to derision from financial commentators certain it had overpaid.

Winter’s selection of clips from early YouTube is reminiscent of the earliest movies, which pulled people into theaters with footage of little girls having a pillow fight. Winter moves on through pioneering stars like Smosh and K-Pop, 2010’s Arab Spring, the arrival of advertising and monetization, the rise of alt-right channels, Gamergate, the 2016 US presidential election, the Christchurch shooting, the horrors lurking in YouTube Kids, George Floyd, the multimillion-dollar phenomenon of Ryan Kaji, January 6, the 2020 Congressional hearings. Somewhere in the middle is the arrival of the Algorithm that eliminated spontaneous discovery in favor of guided user experience, and a brief explanation of the role of Section 230 of the Communications Decency Act in protecting platforms from liability for third-party content.

These stories are told through still images and video clips interlaced with interviews with talking heads like Caleb Cain, who was led into right-wing extremism and found his way back out; Andy Parker, father of Alison Parker, who has been unable to get footage of his daughter’s murder expunged; successful YouTuber (“ContraPoints”) Natalie Wynn; technology writer and video game developer Brianna Wu; Jillian C. York, author of Silicon Values; litigator Carrie Goldberg, who works to remediate online harms one lawsuit at a time; Anthony Padilla, co-founder of Smosh; and YouTube then-CEO Susan Wojcicki.

Not included among the interviewees: political commentators (though we see short clips of Alex Jones) or free speech fundamentalists. In addition, Winter sticks to user-generated content, ignoring the large percentage of YouTube’s library that is copies of professional media, many otherwise unavailable. Countries outside the US are mentioned only by York, who studies censorship around the world. Also missing is anyone from Google who could explain how YouTube fits into its overall business model.

The movie concludes by asking commentators to recommend changes. Parker wants families of murder victims to be awarded co-copyright and therefore standing to get footage of victims’ deaths removed. Hany Farid, a UC Berkeley professor who studies deepfakes, thinks it’s essential to change the business model from paying with data and engagement to paying with money – that is, subscriptions. Goldberg is afraid we will all become captives of Big Tech. A speaker whose name is illegible in my notes mentions antitrust law. Cain notes that there’s nothing humans have built that we can’t destroy. Wojcicki says only that technology offers “a tremendous opportunity to do good in the long-term”. York notes the dual-use nature of these technologies; their effects are both good and bad, so what you change “depends what you’re looking for”.

Cain gets the last word. “What are we speeding towards?” he asks, as the movie’s accelerating crescendo of images and clips stops on a baby’s face.

Unlike predecessors Coded Bias (2021) and The Great Hack (2019), The YouTube Effect is unclear about what it intends us to understand about YouTube’s impact on the world beyond the sheer size of audience a creator can assemble via the platform. The array of scandals, all of them familiar from mainstream headlines, makes a persuasive case that YouTube deserves Facebook- and Twitter-level scrutiny. What’s missing, however, is causality. In fact, the film is wrongly titled: there is no one YouTube effect. York had it right: “fixing” YouTube requires deciding what you’re trying to change. My own inclination is to force change to the business model. The algorithm distorts our interactions, but it’s driven by the business model.

Perhaps this was predictable. Seven years on, we still struggle to pinpoint exactly how social media affected the 2016 US presidential election or the UK’s EU referendum vote. Letting it ride is dangerous, but so is government regulation. Numerous governments are leaning toward the latter.

Even the experts assembled at last week’s Cambridge Disinformation Summit reached no consensus. Some saw disinformation as an existential threat; others argued that disinformation has always been with us and humanity finds a way to live through it. It wouldn’t be reasonable to expect one filmmaker to solve a conundrum that is vexing so many. And yet it’s still disappointing not to have found greater clarity.

Illustrations: YouTube CEO (2014-2023) Susan Wojcicki (via The YouTube Effect).

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Follow on Mastodon.