AI is rapidly automating many white-collar jobs, from designing t-shirts to diagnosing cancers. So I wondered: will it steal mine, writing about issues that are important to executives? After some thinking and research, I believe AI will indeed hijack a slew of business-writing jobs.
But not mine.
Businesses have been using technology to automate the professional workplace for more than 30 years, starting with accounting transactions on mainframes and progressing to enterprise risk management (ERM) systems and Office 365 in the cloud. To build these tools, programmers automated rules that people could articulate: for example, an invoice can be queued for payment if it has a corresponding purchase order and delivery note, according to the payment terms for that vendor. But AI is different. Given inputs and outputs, it can automate tasks where the rules are complex and hard to describe. For instance, given enough MRI scans and corresponding diagnoses, an AI machine can learn to screen for tumors on its own, without anyone telling it how.
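The invoice rule above is the kind of logic a programmer can write down explicitly. A minimal sketch might look like the following; the function and field names are hypothetical and purely illustrative, not taken from any particular ERM system:

```python
def can_queue_for_payment(invoice, purchase_orders, delivery_notes):
    """An articulated rule: pay only if the invoice matches both a
    purchase order and a delivery note (illustrative logic only)."""
    po = invoice["po_number"]
    return po in purchase_orders and po in delivery_notes

# Illustrative records, not real data.
invoice = {"po_number": "PO-1001", "amount": 250.0}
print(can_queue_for_payment(invoice, {"PO-1001"}, {"PO-1001"}))  # True
print(can_queue_for_payment(invoice, {"PO-1001"}, set()))        # False
```

The contrast with AI is that no one writes a function like this for tumor screening; the machine infers the decision boundary from examples instead of following rules a person spelled out.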
Machines are already writing articles for The Washington Post, which has built a tool called Heliograf that produces routine reports on elections and high school football games. The Associated Press, Bloomberg, and others are using AI tools for similar purposes. So far, these machines haven’t replaced any reporters (that we know of), but they are enabling these publications to report more stories, more accurately. Forbes is testing a tool that will provide authors with rough drafts and story templates.
But can AI produce something fresh? Insightful? Creative?
Well, it can certainly appear to. When an AI machine first defeated the world champion of Go (an Asian strategy board game), commentators said they had never seen a human make some of its winning moves. They called those moves “creative.”
But that, I’d argue, is creative problem-solving. What about creativity for creativity’s sake? AI machines already compose music and paint pictures. Beauty, of course, is in the eye of the beholder, and to many beholders these works are terrible. They mimic things that people have done, and it shows. The tools will get better, and probably quickly. But could AI ever produce something truly original? For example, could it draft a thought leadership article that captures the attention of busy executives and gives them insights they’ve not read before?
In articles we help our clients develop, that usually requires a better way to solve a pressing problem, supported by research or field experience. For the article’s point of view to be new (and therefore valuable), it must be based on experience, knowledge, and thinking that only the author has.
For example, our client Karen Brown explained in a recent HBR article why the “inclusion” part of diversity and inclusion is key to employee retention, but often overlooked. Based on her experience working at companies like Baker McKenzie, Baxter, and Rockwell Collins, she describes how business leaders can help minorities feel welcomed rather than merely employed. Her point of view, and the real-life examples she uses to support it, come from her personal experience; neither was already in the public domain. No AI machine could have found her examples, and it is therefore hard to see how one could have produced her point of view.
AI is already changing some aspects of the thought leadership business. We often use Grammarly, which relies on AI, to proofread pieces before submitting them for publication. We already use our proofreaders less than we did a few months ago.
Writing tools will get better, and I expect it will become increasingly difficult to tell whether an article was written by a machine, a person, or both. The tools will help us all write more and better, and they may take away much of the routine work. But I don’t see how a robot could ever develop truly original content without human help, no matter how well it plays Go.