AI and ChatGPT make (not only) consultants lazy thinkers!
ChatGPT cannot think "outside the box". Many consultants - of whatever stripe - forget this when using the chat program.

Since OpenAI released its ChatGPT program for general free use at the end of 2022, hype has built up around the topic of artificial intelligence (AI). The consulting scene, too, has recognized the benefits of chat programs such as ChatGPT - and rightly so: they can quickly and easily generate at least first drafts of promotional texts such as blog posts, sales letters, or social media posts, which can then be edited further.
Consultants often have texts written by ChatGPT
Sometimes, however, consultants' use of ChatGPT takes strange forms - for example, when we, as a PR and marketing agency, are asked to write an article for a consultant on a current trend topic (artificial intelligence, transformation, hybrid teams, sustainability, blended learning, Generation Z, etc.) and place it in print and online media, say because the consultant has just developed a new product on this topic and wants to promote it.
Suppose we then reply, "We'd be happy to. But please give us some input first, so we know your core messages and the direction you want the content to take." Not infrequently, a text arrives a short time later that was recognizably created by ChatGPT. Typically it consists of some rather general statements on, say, "artificial intelligence" or "transformation" - statements we could have found ourselves by googling. Of the consultant's own thinking there is no trace.
Consultants often do not think through issues
Quite often, if we offered these texts to trade journals without refocusing their content, the editors would consider them outright nonsense - for example, because they ignore the fact that small companies have fewer resources than corporations, or that the logistics sector ticks quite differently from the financial sector, so many problems call for different solutions. In other words, the articles lack differentiation - even though that is precisely where a consultant's expertise shows.
Here's an example. A few weeks ago, a personnel consultant specializing in SMEs, who had evidently also read somewhere that "the future belongs to AI," asked us to write an article for him on the use of AI in the personnel selection process. After I asked him to send me some keywords, I received a text of about 30 lines a short time later. It described a possible AI application for pre-selecting applicants - without any reference to small and medium-sized enterprises.
The consultant had told me beforehand that most of his customers were currently struggling with the following problem: their job postings attracted at most one or two applicants, if any, so for lack of alternatives they often had to hire candidates who only partially met their requirements in order to remain operational. When I called the consultant and asked what benefit an AI system for pre-selecting applicants would offer SMEs in such a labor market, his answer, after a moment's thought, was: "Actually, none - because if there is only one applicant at the door, then..."
Consultants often regurgitate phrases and clichés
I had a similar experience when we were asked to write an article on intergenerational collaboration for a larger consulting firm. The draft text I received gave the impression that the majority of employees and managers in companies today are still digital immigrants who are at war with IT and harbor strong emotional reservations about IT solutions, leading to friction in their collaboration with digital natives.
When I then asked the text's supplier to what extent this was still true today - since many members of the Generations X and Y cited in the article were already 35 or even 40 years old and had often been among their companies' top performers for years - his answer was: "You could be right about that." Evidently, however, he had never considered to what extent these clichés, valid a decade or two ago, still hold today. So they didn't bother him in ChatGPT's draft text either.
Consultants reflect too little: who is my audience?
We increasingly have similar experiences when we are asked to write new pages for consultants' websites or articles for their blogs - for example, because they have developed a new product or want potential customers to find them on the web for a keyword that is currently "in". Here too, when we sift through their copy, we often ask ourselves: "What was the consultant thinking here?" And quite often the answer is: "Nothing - he just entered a few prompts into ChatGPT."
The reason: the texts are so banal and generic that they betray no independent thinking on the consultant's part, let alone his field and practical experience. The trouble is: why should potential customers who come across the consultant's website while googling contact him at all? Many consultants obviously don't ask themselves this when using ChatGPT - any more than they ask themselves, when writing articles: why should a trade journal publish an "expert article" by me that its editors could produce themselves by entering a few prompts into ChatGPT?
ChatGPT cannot think outside the box
The above is not an argument against consultants of any kind using ChatGPT. It is and remains a very helpful tool. What ChatGPT cannot do for consultants, however, is think - in all its facets: thinking things through, thinking them over, thinking laterally - and develop tailor-made solutions to their target customers' problems.
Ultimately, the program can only reproduce a more or less meaningful distillation of the information it finds on the web. It cannot (to use a current consultant buzzword) think "outside the box" and find completely new solutions to problems. That is and remains the consultants' job - alone or in dialogue with their customers.
What applies to the consulting profession naturally also applies to the use of AI in companies. Here, too, there is a danger that users become lazy and blindly trust the solutions AI systems propose instead of asking themselves: to what extent do these actually serve our goals?
Qualifying staff for appropriate AI use
Incidentally, sensitizing and training company employees in this regard could itself become a consulting or training offering for providers in the education and consulting sector. Googling, I have not yet found such an offering. After the publication of this article, however, it is at most a question of time - if only because a corresponding reference will then appear in ChatGPT-generated texts for consultants.
About the author:
Bernhard Kuntz is managing director of the marketing and PR agency Die PRofilBerater GmbH, Darmstadt, which specializes in consultants. He is the author of, among others, the books "Selling a Cat in a Bag," "Fat Booty for Trainers and Consultants," and "Why Does Everyone Know Him?" (Internet: www.die-profilberater.de).