An OpenAI employee says prompts are useless, which is true, but not entirely true

Mondo Social Updated on 2024-01-31

I believe anyone who has used ChatGPT, Wenxin Yiyan, New Bing, or other AI models has had a similar experience: AI models are a bit like children, and you have to coax them to get the results you want. A large model can only produce high-quality output if you feed it prompts the AI can understand; this has long been the consensus among users of products such as ChatGPT and Wenxin Yiyan.

However, some employees at OpenAI, the company behind ChatGPT and GPT-4, don't seem to think so. Recently, OpenAI developer advocate Logan Kilpatrick posted on social media: "Hot take: there are many people who believe that to remain competitive in the future, you need to master prompt engineering skills. In fact, there is no essential difference between prompting an AI system and communicating effectively with another person."

Logan Kilpatrick further explained that while prompt engineering is an increasingly popular specialty, the three essential skills that really matter in 2024 are reading, writing, and speaking, and that as AI technology continues to evolve, honing these skills is what will give humans a competitive edge over highly intelligent machines in the future.

There is no doubt that this is a claim that runs against popular perception, but is there any truth to what this OpenAI "evangelist" says?

In fact, Logan Kilpatrick is right, but only in a limited set of scenarios. After more than half a year of iteration, GPT-4, OpenAI's most advanced product for the consumer market, can now understand human intentions and motivations to a certain extent. Compared with the GPT-3.5-based ChatGPT, when you communicate with GPT-4 in more colloquial, emotional language, it can often discern what you need and then produce a result, whereas ChatGPT tends to respond indifferently.

This means the difference between communicating with GPT-4 and communicating with a human is getting smaller and smaller, and in this case Kilpatrick's argument, that there is no essential difference between prompting an AI and communicating effectively with another person, clearly holds.

But the problem is that not everyone subscribes to ChatGPT Plus. The vast majority of people still use the free ChatGPT or other large AI models, and it is obviously unrealistic to expect those to reach GPT-4's level of natural language understanding.

What's more, as ChatGPT Plus subscriptions continue to grow, GPT-4 inevitably suffers performance degradation. In the summer of 2023, researchers at Stanford University found that, compared with the March version, the June version of GPT-4 had declined to some degree at solving math problems, answering sensitive questions, code generation, and visual reasoning. Counting on GPT-4 to maintain its current performance requires OpenAI to keep investing in computing power.

Speaking of prompts, is there any need to learn them? In the long run, no. As the technology advances, not only GPT-4 but other large AI models will keep improving in a spiral, and sooner or later these models will let users get results just by chatting as they would with a real person. But for now, if you don't understand prompts and haven't mastered a few tricks for communicating with AI, today's AI applications may well shut their doors on you.

The significance of prompts is that today's AI models are not yet AI agents and cannot set goals independently; they are more like efficient tools than intelligent assistants. Prompts are an effective way to get the desired output from an AI tool, and they take a variety of forms, written either in natural language or in code.

In a sense, prompts are like magic spells: only by pronouncing the incantation exactly can you get the AI to unleash the corresponding magic. A prompt, in other words, distills the essence and meaning of what you want into a few precise words.

In the words of Xiao Yang, group vice president and head of the search platform, a prompt = task + role + context + supplementary details. At present, without prompt optimization, the answers given by large AI models tend to be comprehensive but generic; after prompt optimization, answer quality improves markedly.

For example, if you simply ask a large model to produce a corporate website for you, the result may be mediocre. But if you first tell the model to act as a product manager, the quality will be much better.
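The "task + role + context + supplementary details" formula above can be sketched as a simple template function. This is an illustrative assumption about how to compose such a prompt; the field labels and wording are made up for the example, not any model's required format:

```python
def build_prompt(task: str, role: str = "", context: str = "", details: str = "") -> str:
    """Compose a prompt from the four parts of the formula:
    prompt = task + role + context + supplementary details.
    Empty parts are simply skipped."""
    parts = []
    if role:
        parts.append(f"You are {role}.")
    if context:
        parts.append(f"Context: {context}")
    parts.append(f"Task: {task}")
    if details:
        parts.append(f"Details: {details}")
    return "\n".join(parts)

# A bare prompt vs. an optimized one, mirroring the corporate-website example:
bare = build_prompt("Write copy for a corporate website.")
optimized = build_prompt(
    task="Write copy for a corporate website.",
    role="an experienced product manager",
    context="The client is a B2B SaaS startup.",
    details="Keep the tone professional; include a tagline and three feature blurbs.",
)
```

The point is not the exact wording but the structure: giving the model a role and context up front narrows the space of plausible answers, which is why the "optimized" prompt tends to produce better output than the bare one.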

In addition, today's large models tend to emphasize generality, which means their capabilities don't fully cover certain specialized scenarios, and that is where prompt engineering comes into play. Its role is to design the best prompts to guide an AI model in helping users solve the problems they encounter in these niche scenarios. For that matter, the currently hot chain-of-thought (CoT) technique can itself be regarded as an extension of prompting.
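In its simplest zero-shot form, the chain-of-thought technique mentioned above amounts to appending a reasoning instruction to an ordinary prompt. A minimal sketch, assuming the well-known "Let's think step by step" phrasing from zero-shot CoT research; the helper name is hypothetical:

```python
def with_cot(prompt: str) -> str:
    """Turn a plain prompt into a zero-shot chain-of-thought prompt
    by asking the model to reason before answering."""
    return prompt + "\nLet's think step by step."

plain = "A store sells pens at 3 yuan each. How much do 12 pens cost?"
cot = with_cot(plain)
```

The extra line nudges the model to write out intermediate reasoning steps before its final answer, which measurably improves accuracy on multi-step problems, exactly the kind of niche gain that prompt engineering is after.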

So, seen this way, Logan Kilpatrick was not talking nonsense, but his words should be taken with a grain of salt. If you are interested in today's AI applications and want to use AI to improve your efficiency at work or in study, then understanding and fluently using prompts is still very necessary.
