Scott C. Richmond

Computing without "AI"

In June, while I was visiting my parents' house on the beach in Massachusetts, Apple announced Apple Intelligence. For the first time in a long while, I wanted to throw my Apple devices into the sea. I had a very vivid picture of yeeting my barely-a-year-old, very-nearly-maxed-out M2 MacBook Pro, like a frisbee, off a cliff and into the Atlantic.

For reasons I'm not entirely sure how to explain, Apple's announcement finally flipped a switch in my head: I need to get away from corporate, and consumer, computing culture.


Since 2000, I've been a very happy, and occasionally very smug, Apple user. (The smug has even been recent: did somebody say CrowdStrike?) I bought a Cube with money from my first "real" job (as a web developer at the tail end of the dot-com era, taking time off from university). I got the first-gen iPod and iPad (and several iPods afterwards, although I wouldn't get a second iPad until this past fall). My hatred for AT&T overcame my love of Apple; I didn't get my first iPhone until 2013, when I could get an iPhone 4 on a different wireless carrier in the US.

Before I was an Apple user, I ran Linux. I first installed Linux on my PC in 1999, in my second semester at university. Part of why I was so enthused about Apple in 2000, beyond the lucite dreams of the Cube, was OS X. The new Apple operating system was unix on a personal workstation, but without the care work. Linux was a pet you needed to tend to.

I was fired from that first "real" job (I am not kidding) for writing down that Microsoft was evil (through no fault of my own, my words ended up on a client website; that is a different story for another time). I am still allergic to every user interface Microsoft designs. And Microsoft is all-in on AI bullshit. Even so, the dream of a well-lubricated, "just works" operating system in MacOS is beginning to unravel. Not because MacOS isn't well-lubricated; indeed, precisely because it is.


Current mainstream computing culture is marked less by questions of software and affordances and more by the consumer cultures of computing. What kinds of relationships do you have to the corporations who sell the hardware, operating systems, and software that you use? From all accounts, Windows 11 (is that the current version?) is just adware; most people who use Windows, by now, are so habituated to being sold as a receptacle for advertising that such embedded advertising registers (if it registers at all) as an ambient background annoyance, just the faintly off-putting experience of using computers. Meanwhile, I admit I have no actual theory of mind for what it might be like to be a Windows power-user nerd; perhaps it's about gaming and hardware? (I have watched a Linus Tech Tips video from time to time.)

Most of the people I know have made something like the same deal I have with Apple: don't be visibly evil (or, for that matter, tacky or thirsty), make sure things stay well-lubricated, and I'll hang around, buying new hardware from time to time. Apple devices come with just enough of a privacy-focused patina that Apple has seemed to be a defender of its users from the more unseemly predations of surveillance capitalism. My own sense of my situation went something like this:

Well, these days, you've got to be tethered to a large corporation to do computing. You're necessarily in some kind of relation to Apple, Microsoft, Google, Facebook. Of the big actors, because it is largely a mass-affluence, hardware-and-infrastructure sort of company, Apple has the most stable and least-extractive relationship to its customers. Therefore, Scott, you've made the right choice, and you can feel as good as one might about your consumer disposition inside the wretched political economy of intensively computational racial capitalism we live under.

That's actually bullshit! A textbook example of ideology as Louis Althusser worked it out: an imaginary relation to my real conditions of existence. Its faintly, knowingly cynical character is part of what makes it effective as ideology (thanks, Žižek).


As I said, I don't really know why Apple Intelligence was the straw that broke this particular camel's back. Maybe it's because the whole AI bubble seems to me to be transparently bullshit from top to bottom. It's where the lubrication of computing—the development of computing environments as highly integrated scenes of productivity and consumption, rather than tinkering and exploration—most fully shows its character. It's a mad grab for value in an industry that is, in kind terms, decadent. It makes very little of value. It does not know where new value might lie. There's a string of recent embarrassments: cryptocurrency and the blockchain, the metaverse, and now artificial intelligence (nota bene, these are all misnomers—no cryptocurrency is actually a currency; the metaverse simply doesn't exist; artificial intelligence is not, in fact, intelligent). We haven't seen the AI bubble pop yet, but I really don't see how this could work out well for anybody. Except Nvidia, I guess.

The problem, it seems to me, is not Apple's vision for AI, at least not exactly. Unlike approximately everybody else, their vision is for AI to lubricate acts of productivity and consumption. Train your own personal LLM on your email, locally, on device (or using their secure [we promise] cloud), and let it draft emails for you. Or recall when that appointment was. Or whatever. The same as always, but better, somehow.

The Apple vision for computing, from at least 1984 on, has turned on the idea that a computer should be a personally expressive medium, one whose mode of expression is consumption. Not expert use of a finely-tuned system that requires learning, training, feedback. That was the model of Doug Engelbart and many of the people working on early interactive systems, including the folks at Xerox PARC, whose Smalltalk system was not only the first explicitly object-oriented programming system, but a complete environment for computing. Alan Kay & co.'s mental model for computing included programming, for all users. Apple and its software ecosystem tended, instead, to de-skill computational activity. You can read about the forces that articulated the commercial software system in Laine Nooney's wonderful The Apple II Age. Apple's turn to AI is the most recent expression of its tendency to want to lubricate everything, re-tooled for the 2020s.


In other words, the turn away from Apple—which, to be clear, is still the best of the tech-behemoth lot in terms of treating its customers something like humanely—will almost necessarily look like increasing the friction in how I use my computer. The sheer number of things I will need to do to move away from the Apple ecosystem is, frankly, daunting. The software, sure. (I hate Excel and use Numbers.) I use Safari so that Firefox doesn't chew through battery. But there are also the services. Feeling vaguely icky about my use of Spotify, I was, until recently, contemplating moving to Apple Music. I use iCloud for file sync & sharing, and backup; Photos; iMessage; Passwords; etc. Each one of these needs a new thing. And if, in 2024, the idea is to decrease reliance on corporate cloud offerings, each one probably needs to be self-hosted.

The good news is that these things are all possible to self-host. For the next while—and it will be a while—I plan on documenting my moves away from Apple here on this lil blog. The hope is that what I learn can help people. But because the goal is, actually, to increase friction, not to lubricate all the things, this won't document how to install Linux the friendly way. The idea here is to think a bit with computing as bricolage, tinkering, futzing, making; not computing as consumption, productivity, all glassy and smooth.

Part of this intersects with what we're doing at the Centre for Culture and Technology. We'll be running Computer Club this academic year, dedicated to Computing without "AI". This blog series supplements that, and is maybe a bit more navel-gazey and analytical. I don't purport to be the first person to walk this path; I'll just be doing it here, a little bit in public, and talking about what I learn and think along the way. Come along, if you want. And if you're in Toronto, come join us at CCT.