


First, a confession: I tried writing this essay with A.I.
I started with ChatGPT’s “deep research” mode, asking it to compile a report on what new jobs for humans might be created by the rise of A.I. It asked a few follow-up questions and then set off, returning with a 6,000-word report, broken down by industry. I fed that report into ChatGPT 4o — along with the original assignment memo from my editor and a few other recent industry reports on the future of work — and asked for an article in the style of The New York Times Magazine.
It was done within 90 minutes. The article was lively and informative, and while some of its imagined future careers were a bit fanciful (a “synthetic relationship counselor” apparently will be someone who can step in when you’re in love with your A.I.), it also covered an interesting spectrum of plausible jobs and featured some delightful turns of phrase. To the average reader, it likely would have come across as a breezy Sunday read with just enough interesting points to warrant a bit of reflection.
So why aren’t you reading that version? Well, for starters, it would have gotten me fired: Almost all the quotes and experts in the article were entirely made up. But I had a deeper, more philosophical concern. Even if the A.I.-written version of this piece were entirely factual, submitting it to my editors would have represented a fundamental misunderstanding of why they hired me. In freelance journalism, as in many fields where the work product is written text, you aren’t just being paid for the words you submit. You’re being paid to be responsible for them: the facts, the concepts, the fairness, the phrasing. This article runs with my byline, which means that I personally stand behind what you’re reading; by the same token, my editor is responsible for hiring me, and so on up the chain. That kind of responsibility inherently can’t be delegated to a machine.
Commentators have become increasingly bleak about the future of human work in an A.I. world. The venture-capitalist investor Chris Sacca recently went on Tim Ferriss’s podcast and declared that “we are super [expletive].” He suggested that computer programmers, lawyers, accountants, marketing copywriters and most other white-collar workers were all doomed. In an email to his staff, Fiverr’s chief executive, Micha Kaufman, added designers and salespeople to the list of the soon-to-be-damned.
Such laments about A.I. have become common, but they rarely explore how A.I. clears the responsibility hurdle I’m describing. It’s already clear that A.I. is more than capable of handling many human tasks. But in the real world, our jobs are about much more than the sum of our tasks: They’re about contributing our labor to a group of other humans — our bosses and colleagues — who can understand us, interact with us and hold us accountable in ways that don’t easily transfer to algorithms.