Why Has ChatGPT Become “Lazy”? A 1,700-Token System Prompt May Be to Blame

Machine Heart reports

Editors: Xiao Zhou, Chen Ping

ChatGPT: It’s not that I can’t do it, I just don’t want to work.

At this stage, ChatGPT has become a powerful assistant for many people, helping with document writing, coding, image generation… However, the seemingly omnipotent ChatGPT also has its lazy side.

Remember when ChatGPT started getting “lazy” toward the end of last year? Users found its responses slow and perfunctory; it would even cut conversations short unilaterally, and when asked to write a piece of code, it would sometimes suggest that users write it themselves.

At the time, OpenAI explained that the model’s behavior was unpredictable and that it was investigating a fix.

Now, an alternative explanation for this seemingly intractable problem is gaining acceptance among netizens.

Recently, a tweet from Dylan Patel went viral on X: “ChatGPT has a 1,700-token system prompt? If you want to know why its performance has become so poor compared to six months ago, it’s because of the system prompt. Look at all this garbage in the prompt. The ‘laziness’ is indeed due to the prompt.”

Image source: https://twitter.com/dylan522p/status/1755086111397863777

System prompt address: https://pastebin.com/vnxJ7kQk

Dylan Patel subsequently added that skeptical readers can try it for themselves.
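Curious readers can also gauge prompt length on their own. Below is a back-of-the-envelope sketch, assuming the common rule of thumb of roughly 4 characters per token for English text (for exact counts against a specific OpenAI model, the tiktoken library is the usual tool; the character count here is purely illustrative):

```python
def rough_token_count(text: str) -> int:
    """Crude token estimate: English text averages ~4 characters per token.

    For exact counts for a specific OpenAI model, use the tiktoken library
    instead, e.g. tiktoken.encoding_for_model("gpt-4").
    """
    return max(1, len(text) // 4)

# A 1,700-token prompt would be on the order of ~6,800 characters.
illustrative_length = 6800  # hypothetical character count, for illustration only
print(rough_token_count("x" * illustrative_length))  # prints 1700
```

The heuristic is coarse (code, non-English text, and unusual punctuation tokenize differently), but it is enough to see that 1,700 tokens is several printed pages of instructions sitting in front of every user message.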

The claim quickly spread across major AI communities and sparked ongoing discussion.

One netizen commented: “A few months ago, when ChatGPT’s performance declined, I suggested that the very long, poorly written system prompt was causing most of the issues. It contains built-in service degradation (such as rendering only one image no matter what), along with many vague instructions that even a human would struggle to follow consistently, such as the requirement that nothing it produces should offend anyone.”

“According to my tests, the longer the system prompt, the fewer resources are allocated to the user’s actual task,” said another netizen.

Others pushed back: “There is no evidence that longer system prompts degrade output quality; I don’t understand why everyone thinks they know OpenAI’s systems better than OpenAI itself.”

From a public-interest perspective, some netizens objected to secret prompt injection: “This behavior shows no regard for the diversity of human perspectives. These tools are too important for their owners to adjust the results without disclosure.”

Some countered: “If you want access to the underlying model, use OpenAI’s API. I don’t see what the problem is.”
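That point is worth spelling out: through the API, the caller supplies the system message, so nothing like the leaked ChatGPT prompt is silently prepended. A minimal sketch using the official openai Python package (the model name and prompt strings are illustrative assumptions, and the network call only runs when an API key is present in the environment):

```python
import os

def build_messages(system_prompt: str, user_prompt: str) -> list[dict]:
    """Assemble a chat payload whose system message is fully caller-controlled."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages(
    "You are a concise coding assistant.",  # your prompt, not a hidden injected one
    "Write a Python function that reverses a string.",
)

if os.environ.get("OPENAI_API_KEY"):
    from openai import OpenAI  # third-party package: pip install openai
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(model="gpt-4-turbo", messages=messages)
    print(resp.choices[0].message.content)
```

Whether the API truly exposes an unadorned model is, of course, exactly what the two camps in this thread disagree about; the sketch only shows what the caller controls.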

Others said: “At least the prompts OpenAI injects seem effective; I don’t mind prompts that make the output less rigid, as long as they work.” They also pointed to similar issues with Google’s Bard (the Gemini Pro version).

Some weighed in from a commercial angle: “Isn’t this just the standard way AI companies shape their GPTs? Without appropriate instructions, how could the responsible operation of a GPT model be guaranteed? And is secrecy really unreasonable for a commercial company? How they tune their LLM is valuable intellectual property.”

Finally, some netizens kept complaining: “Yesterday I ran into this while writing Python code. It completed 90% of the code, then said the last piece of logic was too complicated and just told me how to do it myself… So I started a new chat, gave it the Python code it had written, and asked whether it could add the missing lines… ChatGPT then added the code without any problem. That proves ChatGPT is definitely lazy.”

It seems ChatGPT’s laziness problem will still take some time to resolve.

Reference links:

https://news.ycombinator.com/item?id=39289350

https://www.reddit.com/r/OpenAI/comments/1akye4v/chatgpt_system_prompt_is_1700_tokens_if_you_were/

© THE END
