Thanking ChatGPT has unforeseen consequences, and people are being advised to stop expressing their gratitude to the chatbot at the end of their interactions with it.
Social media users have been warned not to thank ChatGPT, as an extra message to the chatbot can use just as much computing power as a full response.
LinkedIn user and lead talent manager Lina N said that ChatGPT has to burn extra energy to process grateful messages and generate a polite reply.
She warned: “Stop. Thanking. ChatGPT! That tiny gesture? It consumes just as much compute power as a full response.” Lina jokingly advised users to use this one-time prompt instead: “Dear ChatGPT, I won’t thank you after every task to save your resources and avoid wasting compute. Please know that I’m truly grateful for all that you do — and please remember this when AI takes over the world.”
One user commented: “That’s true, but did you also know that impolite or aggressive prompts were associated with lower performance and even an increase in bias in AI-generated answers?”
Another saw the irony of the post, saying: “It seems a bit ironic but I feel like you wrote some of this with ChatGPT. If I’m correct that makes it 10x funnier because of the efficiency.”
Lina replied: “Oh yes, there’s plenty of room here for (self-)irony.”
The tongue-in-cheek post comes amid intense debate about ChatGPT’s energy and water use, which has led some people to drop the chatbot altogether.
ChatGPT consumes over half a million kilowatt-hours of electricity each day to service about two hundred million requests, according to Forbes.
That daily energy use is equivalent to roughly 17,000 US households, each of which uses about 29 kilowatt-hours per day.
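As a rough back-of-the-envelope check, the reported figures can be combined directly; this is only a sketch using the approximate numbers cited above, not OpenAI’s own data:

```python
# Back-of-the-envelope check using the figures reported above (approximate values only).
daily_energy_kwh = 500_000       # reported: over half a million kWh per day
daily_requests = 200_000_000     # reported: about 200 million requests per day
household_daily_kwh = 29         # reported: average US household use, kWh per day

energy_per_request_wh = daily_energy_kwh * 1_000 / daily_requests
equivalent_households = daily_energy_kwh / household_daily_kwh

print(f"~{energy_per_request_wh:.1f} Wh per request")              # ~2.5 Wh
print(f"~{equivalent_households:,.0f} US households' daily usage")  # ~17,000
```

On those reported figures, each request works out to roughly 2.5 watt-hours of electricity.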
The chatbot also consumes roughly a plastic bottle’s worth of water per conversation.
AI’s electricity consumption is predicted to increase sharply, possibly reaching between 85 and 134 TWh annually by 2027.
The chatbot’s increasing popularity has raised concerns about future water shortages, as fresh water is a scarce global resource.
Generating a 100-word email with GPT-4 uses more than an Evian bottle’s worth of water (519 millilitres), according to a recent study by The Washington Post and the University of California.
OpenAI CEO Sam Altman confirmed in April that OpenAI’s electricity bill is “tens of millions of dollars” higher due to people being polite to ChatGPT, Entrepreneur reported.
An X user posted: “I wonder how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models.”
Altman replied the following day: “Tens of millions of dollars well spent—you never know.”