This week we had the pleasure of speaking with Ed Zitron. In addition to being the founder of his own media relations firm, Zitron has a tech-focused Substack (“Where’s Your Ed At”), and is also a contributing writer for Insider. This week, Zitron wrote an op-ed humorously suggesting that companies should replace their CEOs with AI. Executives didn’t love it. We spoke with Zitron about AI, labour, and the current foibles of corporate governance.
This interview has been edited for brevity and clarity.
For people who haven’t read your op-ed, they should obviously just do that. But I wanted to give you an opportunity to make your case. So, just briefly, what argument are you making in this piece? And why should we replace corporate executives with ChatGPT?
The argument I’m mostly making is that the CEO has become an extremely vague role. It’s become one with very little accountability, very little in the sense of a definitive set of responsibilities. If you look at the basic literature around the CEO role, it’s actually not that obvious what they do. There was a Harvard study from 2018 where they looked into what they were doing and it was like “people,” “meetings,” “strategy.” That could mean anything—quite literally anything! “Strategy”? What does that mean? So, CEOs appear to be just going into meetings and saying, “We should do this” or “we shouldn’t do that.” The problem is that if your only role in an organization is to take information and go “eh, we should do this,” and you’re not a lawyer or a doctor or someone with a real, actual skill set, what’s the goddamn point?
What sort of responses have you gotten from your piece so far?
Everybody on Twitter seemed happy with it, whereas people on LinkedIn were split 50-50. If you say anything negative about executives on LinkedIn, a lot of guys who aren’t executives get very pissed off. (And it’s always guys, by the way—men seem really sensitive about this subject.) But there are still a good number of people who think, yeah, if there’s a chief executive who has a vague role where they don’t actually execute—where they do stuff that isn’t actually connected to the product but they still get paid a ridiculous amount of money—maybe we do need to automate them! Or maybe we need to more clearly define their role, hold them accountable for that role, and fire them if they perform poorly.
What do you think the chances are that companies will take you up on your suggestions here?
Oh, extremely low. Just to be abundantly clear, I do not think a single goddamn company will do this. That’s why I offer an alternative in the piece, which is that we need working CEOs. Me, personally, I do a lot of the legwork at my own business. I would say I do more than my fair share. But, also, why would you work for me if I didn’t? That’s what I’ve never understood about these CEOs who don’t work. I can understand an editor who doesn’t write much, but an editor who’s never written and never writes? An editor who just sits there and makes calls? Or an executive editor? Or, I don’t know, some kind of private equity guy who buys a large organization but doesn’t seem to have any appreciation for what goes on there, and then proceeds to make a bunch of really stupid calls…that’s where you run into problems.
That’s what my Insider piece was about, basically. Executives seem disconnected from work-product. It’s a fundamental issue.
I’m curious what you make of generative AI, and of how the executive class seems to be weaponizing it against workers.
Generative AI is hilarious because it has the appearance of intelligence without actually having any. It’s the perfect kind of McKinsey-level consultant; it just regurgitates content based on a certain subset of data. It does not bring life experience to what it does. It doesn’t create anything new. It’s not learning or thinking. It’s basically taking a big box of Legos and, with no actual creativity, assembling a rough approximation of what it thinks a house looks like.
There’s a lot of mystification around AI and there’s all this rhetoric about how it’s going to “change the world.” But really, when you get right down to it, AI is basically being pitched to companies as a cost-saver, because it offers them the opportunity to automate a certain percentage of their workforce.
This relates back to what we were talking about earlier. When you have executives and managers who are disconnected from the means of production—or the process of production—they will make calls based entirely on cost, output, and speed, because they don’t actually understand the production process. They don’t know what’s going on inside the machine. The only things they see are what goes into the pipeline and what comes out the end, and they pay attention to how fast it’s happening.