If you move fast and break things, do you ever come back to clean up your mess? Or do you just look for the next thing to smash?

The October 2023 cover of Wired magazine irked me the moment I saw it.

Cover of Wired Magazine featuring the leaders of OpenAI, with the caption: 'Dear AI Overlords, Don't F*ck This Up'

On one hand, the cover irked me because it seemed to be saying that we, the commoners, are at the mercy of the lords of AI (let’s just scratch out ‘overlords’ for the sake of accuracy). And it bothered me, on the other hand, because there seems to be truth in the sentiment.

Why shouldn’t the lords of AI mold our future, since the tech industry has had its way so far in the 21st century?

But don’t we have enough evidence of why it’s a bad idea to let tech call all the shots?

We’ve already seen what happens when AI has free rein. All we have to do is look at the algorithmic wasteland that is now social media¹. Tech moved fast and broke a lot as it formed social media. But tech has yet to go back and fix the mess it created along the way.

And why should they? What’s their incentive? Companies exist to make money. Tech companies are no different. Nor should they be. But when you consider the reach of the industry’s influence (empowered by a hands-off approach from regulators), is it wrong to ask tech to be a better steward?

Leaning on AI in the form of algorithms has flooded the internet with example after example of misinformation and disinformation, making respectable journalism even harder to find. And as a recent lawsuit from The New York Times brings to light, the tech industry is at risk of doubling down on its prior negligence². But, as is the case with social media, it’s not worth their time to go back and pick up the pieces. So, they never will.

Why should we trust these same companies to break more stuff with generative AI?

Tech needs a better guiding principle than ‘Move fast and break things’, one that recognizes the responsibility that comes with disruption.

Remember when your elders told you to leave things better than you found them? Why shouldn’t that wisdom apply to tech as well? Or when your mother told you it’s not what you say, but how you say it?

The mantra ‘Move fast and break things’ has horrible implications. Why not focus on fixing things, a far more constructive act? Breaking for breaking’s sake doesn’t serve anyone, especially if we’re never coming back to build something better.

Tech needs better philosophy

So many of tech’s problems seem to come down to matters of philosophy, in that the tech industry doesn’t properly value people beyond their potential to become customers who buy tech’s ‘solutions’ that may or may not actually solve a problem³.

It’s easy for tech to adopt the philosophy of moving fast and breaking things when the results will benefit them. The tech industry is like a toddler who runs around smashing vases and busting windows, with a parent trailing close behind to clean up and apologize for the mess. Who wouldn’t love to operate in such a fashion?

AI in particular could benefit from adopting the simple philosophy below:

Helping humans > replacing humans

When we talk about creating or improving company cultures, many of us utter the phrase ‘It starts at the top,’ meaning it starts with the people in charge. But I’d argue that truly great companies go one step further and start with their ideals, which can outlast any individual employee. And everyone who joins that company should be expected to adopt those ideals, because the ideals themselves, not who’s in charge, are the focus.

Tech needs better philosophy. Stoicism is great, but it’s not enough.

  1. Social media engagement algorithms and the illusion of choice on jakelacaze.com ↩︎

  2. The New York Times is suing OpenAI and Microsoft for copyright infringement on The Verge ↩︎

  3. Is AI just a solution looking for a problem? on jakelacaze.com ↩︎