A trope in content and marketing circles says: “AI won’t replace you, but someone using AI will.”
It’s stupid. Pointless. And entirely based on fear.
Now, to be clear: though I think the trope is stupid, that doesn’t mean AI won’t replace you. If it does, however, it won’t be because of the reasoning in that trope. A truer version would be: “AI shouldn’t replace you, but some short-sighted, misguided person will think someone using AI can.”
That conversation in the marketing community usually urges content writers, artists, and designers to ramp up their skills to get “good” at using generative AI to create content. I don’t disagree, but I don’t know what “getting good” at generative AI means just yet.
AI courses, toolkits, and templates fill my social media feeds. They claim to help teach you how to get to exactly the right “prompts” and use AI to “10x your marketing ROI” (yes, that’s an actual headline). But if I had a dollar for every time some magic tech promised to 10x my marketing ROI over the last 20 years, I would have multiplied my income by more than 10.
But I digress.
Tech evolution may slow, but fear won’t
Generative AI technology and capabilities evolved rapidly and recently. It’s easy to forget that OpenAI only debuted ChatGPT in November 2022. You’re forgiven if you haven’t become an expert at weaving generative AI into your creative workflow in the past six months.
In that short time, ChatGPT has become the tool people associate with the idea of creating marketing content on demand. Now, most new marketing content-focused apps and new features in enterprise products promise to automatically generate your blog articles, email copy, and new ad campaign creative. They’re simply front-end systems for the underlying ChatGPT functionality.
Amazed by the whirlwind of evolving quality, you may assume the current trajectory and pace of innovation will lead AI-generated content to rival your skill level. The fear of being replaced by these tools is not completely unfounded.
Over the next few months, the pace of quality-improvement releases for generative content may slow. However, the number of tools to generate content – and the more innovative ways to manage those tools – will probably increase. The successive iterations will likely include customizable language models, easier-to-use interfaces, and technologies such as Auto-GPT – tools designed to autonomously perform tasks as they interact with software and services online. For example, a tool could not only write the content for your new website but build the website itself.
“Wait a minute,” you say. (Or is it a yell?) “I thought you were going to convince me that AI isn’t going to take my job.”
I’m pretty convinced it won’t (your misguided boss notwithstanding), at least not in the short term.
Goals act as the human differentiator
A core concept in artificial intelligence centers on the idea of “goals” – what the “intelligent agent” (i.e., AI) pursues. Two types – final goals and instrumental goals – are commonly discussed.
A final goal represents the end objective. It’s the purpose. You want it because you want it. For example, I might say, “I want to go to Paris.” I don’t need to offer further reasons. I just want to go. It is my end goal, not a stop along a longer journey.
Instrumental goals or objectives are the means to achieve the final goals. In my example, an instrumental goal could be to get on a plane. Why? A plane is the most efficient and affordable way of getting to my final goal of Paris.
Now, instrumental goals can be what I’d call stupid or ill-advised as they relate to the end goals. For example, suppose my instrumental goal is to get in a rowboat and go to Paris (my final goal). In that case, you can rightfully judge that the instrumental goal is stupid based on its relationship to reality.
However, final goals can’t be stupid. Why? Because you can’t judge them against anything. You might think my final goal of watching every Star Trek episode in a month-long binge is stupid. But that’s just your perspective. To me, your thinking about that might be stupid.
What does all that talk of goals and stupidity have to do with AI-generating marketing content?
Let me explain.
End goals can’t be stupid, but people can
Different AI tools are imbued with final goals. A chess-playing AI, like IBM’s Deep Blue, has a final goal of playing chess to win. Tesla’s self-driving car AI may have a final goal of driving safely on roads. An AI-enhanced thermostat may have the final goal of maintaining a comfortable home temperature.
ChatGPT (and the associated generative AI tools used by marketers) has a final goal: Respond to prompts to allow for human-like conversations, answer questions, and create communications (i.e., content). Then, a human issues other goals (final and instrumental) to it. The software responds if your final goals fit within the AI’s final goals.
For example, I could ask ChatGPT (or any similar tool) to write a blog article (final goal) with these instrumental goals:
- Keep it to 500 words or less
- Include these keywords …
- Write in the style of Robert Rose
The system will dutifully try to comply. What it will never do is respond with, “You know, Robert, that’s kind of a stupid goal. What you ought to think about is a white paper and associated marketing campaign on the topic of customer experience instead.”
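To make the final-vs.-instrumental distinction concrete, here’s a minimal sketch of how that same prompt and its constraints might be sent to a generative AI tool programmatically. It assumes the OpenAI Python SDK and an API key; the model name, topic, and keyword list are placeholders for illustration, not a recommendation.

```python
# A minimal sketch, assuming the OpenAI Python SDK (pip install openai)
# and an OPENAI_API_KEY environment variable. The model name, topic,
# and keywords below are placeholders for illustration only.
from openai import OpenAI

client = OpenAI()

# Final goal: write a blog article. Instrumental goals: the constraints.
prompt = (
    "Write a blog article about content marketing strategy.\n"
    "- Keep it to 500 words or less\n"
    "- Include these keywords: content marketing, generative AI\n"
    "- Write in the style of Robert Rose"
)

response = client.chat.completions.create(
    model="gpt-4",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

# The tool dutifully tries to comply. It never pushes back on whether
# the final goal was worth pursuing in the first place.
print(response.choices[0].message.content)
```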
You can assess whether the original idea might contribute to your content marketing goals. You also can come up with something different and unrelated that might be better at helping your program achieve its goals. That’s one reason why the value of humans is not yet threatened by generative AI.
Let me share one more reason why marketing leaders shouldn’t think of replacing good human content creators with ChatGPT any time soon.
Cognitive bias affects AI users
Part of the stupidity and oversimplification of the “AI won’t replace you, but someone using AI will” trope is that it implies good content creators can be replaced by someone who can simply download those prompt toolkits and push the buttons of the AI system.
However, a cognitive bias called the Dunning-Kruger effect comes into play in this scenario: People with low ability, expertise, or knowledge of a task tend to overestimate their ability or knowledge. In other words, knowing how good you are at something requires the same skills as being good at that thing in the first place.
You get AI content only as good as the human prompter is at creating original content. So, the ultimate response to anyone who throws around that trope is, “Who will replace the talented writer needed to create the prompts and assess whether what the AI spits back is valuable content?”
The answer should be someone skilled at writing who can create the right prompts with the right goals and recognize the strengths and weaknesses of the content the AI machine spits back.
Now, of course, as generative AI develops, these ideas may change. The day may come when an AI tool can challenge your assumptions about what you should do.
But if those transformative challenges come to pass, the least of your problems will be whether a ChatGPT-based tool can replace your ability to create blog posts, social media ads, manuals, and emails for your company.
So, until then, I don’t think it’s ever been more appropriate to say:
It’s your story. Tell it well.
Cover image by Joseph Kalinowski/Content Marketing Institute