Curated by THEOUTPOST
On Sun, 20 Apr, 4:00 PM UTC
23 Sources
[1]
Your politeness could be costly for OpenAI | TechCrunch
"I wonder how much money OpenAI has lost in electricity costs from people saying 'please' and 'thank you' to their models." It was a seemingly random question posed by a user on X (formerly Twitter), but OpenAI CEO Sam Altman jumped in to reply that typing those words has added up to "tens of millions of dollars well spent -- you never know." Judging from Altman's tongue-in-cheek tone, it's probably safe to assume he didn't do a precise calculation. But his response prompted Futurism to speculate about whether it's actually a waste of time and electricity to be polite to ChatGPT and other generative AI chatbots. Apparently, being polite to AI isn't just an unnecessary habit, misplaced anthropomorphism, or fear of our future computer overlords. Instead, Kurt Beavers, a director on the design team for Microsoft Copilot, said that "using polite language sets a tone for the response," and that when an AI model "clocks politeness, it's more likely to be polite back."
[2]
Why Being Polite to ChatGPT Is Costing OpenAI Millions and Wasting Electricity
Though it might be tempting to say "please" and "thank you" to ChatGPT as if speaking to a human, it may cause more harm than good, at least for OpenAI's bottom line. Polite exchanges are costing the AI company a fortune, CEO Sam Altman revealed in an exchange on X. "I wonder how much money OpenAI has lost in electricity costs from people saying 'please' and 'thank you' to their models," writes user @tomieinlove. "Tens of millions of dollars well spent -- you never know," Altman responded. According to a recent study, 69% of Gen Zers say "please" and "thank you" when speaking to ChatGPT. Every word sent to the AI requires computing power to process and respond to, including names. Overall, AI chatbots require immense energy, many times more than a typical Google search. According to a Goldman Sachs report on data center power, a Google search consumes 0.3 watt-hours of electricity; one ChatGPT search consumes 2.9 watt-hours. Cooling off data centers also requires millions (billions?) of gallons of water. Tech companies report a spike in water usage in recent years, even before AI. "If we continue with the status quo, we will not protect freshwater resources for future generations," Microsoft said in a 2022 sustainability report. Google echoes the urgency: "The world is facing an unprecedented water crisis, with global freshwater demand predicted to exceed supply by 40% by 2030." Are there any reasons to be polite to a chatbot? Society has long grappled with how humans should interact with humanlike machines like Amazon's Alexa. Some parents worry that Alexa's high tolerance for rudeness instills poor behavior in their kids, according to Quartz. Others disagree, saying we should teach kids to be rude to machines to underscore the point that they are not human. Altman may have said the money is "well spent" because the company's goal is to make its products as humanlike as possible, which could mean adopting human societal norms. 
Time will tell how it shakes out; ChatGPT is more popular than ever. Fortune reports that the user base is "growing very rapidly" and may have doubled following the March launch of an image generator.
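The per-query figures above lend themselves to a quick back-of-envelope comparison. The sketch below uses the 0.3 Wh and 2.9 Wh numbers from the Goldman Sachs report cited in the article; the daily query volume is a hypothetical assumption for illustration, not a reported figure.

```python
# Back-of-envelope energy comparison using the per-query figures cited above.
# The daily query volume is a made-up assumption, not a reported number.

GOOGLE_WH_PER_QUERY = 0.3    # watt-hours per Google search (Goldman Sachs)
CHATGPT_WH_PER_QUERY = 2.9   # watt-hours per ChatGPT query (Goldman Sachs)

def daily_energy_kwh(queries_per_day: int, wh_per_query: float) -> float:
    """Total daily energy in kilowatt-hours for a given query volume."""
    return queries_per_day * wh_per_query / 1000

hypothetical_queries = 1_000_000_000  # assumed 1 billion queries/day

ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY
print(f"A ChatGPT query uses ~{ratio:.1f}x the energy of a Google search")
print(f"At that volume: {daily_energy_kwh(hypothetical_queries, CHATGPT_WH_PER_QUERY):,.0f} kWh/day")
```

Under these assumptions, a billion queries a day works out to roughly 2.9 million kWh daily, which is why the per-word overhead of pleasantries is hard to dismiss outright at scale.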
[3]
Being polite to ChatGPT costs OpenAI tens of millions
Conventional wisdom holds that being polite to AI chatbots makes them respond better, but no one stops to think how much energy that politeness is wasting. Well, at least not until last week, when one user wondered aloud on X how much electricity is burned up by people saying "please" and "thank you" to OpenAI's ChatGPT. In response, CEO Sam Altman admitted it costs the super lab millions of dollars in operational expenses - money that he nonetheless believes is worth it. "Tens of millions of dollars well spent," Altman said, adding that "you never know" when being nice to an AI chatbot might be a good idea. Uncertainty over the need to stay on AI's good side isn't exactly reassuring from the guy who really wants to unleash AGI on the world, but hey - if he thinks we ought to keep spending the money of OpenAI investors like Masayoshi Son's Softbank and Microsoft to ensure our future AI overlords don't exterminate us first, then their money we shall spend. OpenAI isn't entirely transparent on how much it costs the org to operate ChatGPT, but it's continuing to accept billions of dollars in investments. Altman admitted early this year that even its $200-a-month Pro product loses money. Altman has apparently fretted in the past about GPU constraints and is reportedly hoping to take matters into his own hands by investing billions into a massive buildout of chip capacity. Then, there's the energy factor. As of late last year, US datacenters ate up about 4.4 percent of the electricity in the country, and the Department of Energy expects that number to reach 12 percent by 2028. The International Energy Agency expects global datacenter electricity consumption to more than double between now and 2030, with the world's DCs consuming as much leccy as the country of Japan - and AI is driving most of that growth. Emissions, likewise, are also growing with AI's increasing energy footprint. 
Both OpenAI partner Microsoft and Google have admitted their carbon footprints have grown thanks to AI despite pledges to reduce their emissions. In Microsoft's case, it's even considering bringing more natural gas-fired power plants online to satiate its need for power. That's not to mention the amount of water AI computing needs - somewhere between 300,000 and four million gallons per day in the United States, another figure that will likely continue to rise as AI compute needs grow more intense. In other words, as tempting as it might be to run up OpenAI's energy bill by exchanging niceties with ChatGPT, think of the planet first, perhaps. Do a quick internet search for "AI chatbot" and "politeness" and you'll find no end of articles, FAQs, and social media posts advocating for being nice to ChatGPT and its cousins - even Microsoft has weighed in on why showing AI a bit of love can improve outputs. "It's not that your AI chatbot feels appreciative when you say please and thank you," the Windows maker noted, "But using basic etiquette when interacting with AI ... helps generate respectful, collaborative outputs." We've noted the same at El Reg, with rude behavior less likely to elicit a useful response. Research has also found LLM responses can change to show an approximation of human anxiety when a model is fed tales of traumatic experiences, further suggesting it pays to treat that bot with kid gloves, or at least basic decency. So, being polite to a fancy, fantastically huge spreadsheet makes it work better. On the other hand, if you can stomach saving Sam some money and yourself throwing away some LLM performance, you can help save Earth a little bit by being standoffish to ChatGPT and ghosting it after it's done replying to you.
[4]
Sam Altman says polite ChatGPT users are burning millions of OpenAI dollars
Manners are not ruining the environment: The costs of training and running artificial intelligence models are massive. Even excluding everything but electricity, AI data centers burn through over $100 million a year to process user prompts and model outputs. So, does saying "please" and "thank you" to ChatGPT really cost OpenAI millions? Short answer: probably not. Some shocking headlines involving the costs of being polite to AI chatbots like ChatGPT have circulated over the past few days. The news stems from an offhand comment Sam Altman made on X. It began with a simple question: How much money has OpenAI lost in electricity costs from people saying "please" and "thank you" to its language models? Altman replied, "Tens of millions of dollars well spent - you never know." That one-liner was enough to send outlets like the New York Post and Futurism down a rabbit hole of speculation, trying to estimate the computing cost of civility. The logic goes like this: every extra word adds tokens to a prompt, and those extra tokens require more computational resources. Given the scale of ChatGPT's user base, these seemingly trivial additions can add up. However, several factors complicate the math behind Altman's comment. First is the actual cost per token. ChatGPT says GPT-3.5 Turbo costs roughly $0.0015 per 1,000 input tokens and $0.002 per 1,000 output tokens. "Please" and "thank you" typically add between two and four tokens in total. So the cost per use amounts to tiny fractions of a cent - roughly $0.0000015 to $0.000002 per token, or a few millionths of a dollar per exchange. Based on rough estimates, that amount translates to about $400 a day or $146,000 a year. That's several orders of magnitude lower than "tens of millions."
As for real energy costs, the Electric Power Research Institute (EPRI) estimates OpenAI's monthly electricity bill at around $12 million, or roughly $140 million a year. That figure includes every interaction - not just polite ones. So while it's theoretically possible that courteous prompts account for more than $10 million annually, we simply don't have the data to break that down. Only OpenAI's internal metrics can say for sure. Furthermore, Altman's phrasing wasn't literal. The follow-up - "you never know" - suggests the remark was tongue-in-cheek. It reads more like a wry endorsement of politeness than a real financial estimate. He likely meant that in an era when courtesy feels increasingly rare, maybe it's worth the negligible cost, whether $400 or $40 million. Sure, bots don't have feelings - but if humanity ends up answering to a superintelligent AI someday, it might just remember who was polite - "you never know."
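The back-of-envelope token math above is easy to reproduce. This sketch uses the GPT-3.5 Turbo prices quoted in the article; the daily volume of polite exchanges is a made-up assumption, since only OpenAI knows the real number.

```python
# Reproduce the article's back-of-envelope math on the marginal cost of
# pleasantries. Prices are the quoted GPT-3.5 Turbo rates; the daily volume
# of polite exchanges is a hypothetical assumption for illustration.

INPUT_PRICE_PER_TOKEN = 0.0015 / 1000   # $ per input token
OUTPUT_PRICE_PER_TOKEN = 0.002 / 1000   # $ per output token

def politeness_cost(extra_input_tokens: int, extra_output_tokens: int = 0) -> float:
    """Marginal dollar cost of the extra tokens a pleasantry adds."""
    return (extra_input_tokens * INPUT_PRICE_PER_TOKEN
            + extra_output_tokens * OUTPUT_PRICE_PER_TOKEN)

# "Please" plus "thank you" at the upper end of the 2-4 token range:
per_exchange = politeness_cost(extra_input_tokens=4)

assumed_daily_exchanges = 100_000_000   # hypothetical volume, not a real figure
daily = per_exchange * assumed_daily_exchanges
print(f"~${per_exchange:.6f} per exchange")
print(f"~${daily:,.0f}/day, ~${daily * 365:,.0f}/year at the assumed volume")
```

Even at a hypothetical hundred million polite exchanges a day, the marginal cost lands in the hundreds of dollars per day, which supports the article's point that the per-token math alone cannot reach "tens of millions."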
[5]
Here's what really happens when you don't say 'thank you' to ChatGPT -- and why I always do
I have spent a lot of time testing and chatting with ChatGPT. And as AI integrates more seamlessly into my daily life, the nuances of human-AI interaction are bound to blur. But how polite do we need to be to chatbots like ChatGPT? A recent exchange between a follower and OpenAI CEO Sam Altman highlighted that seemingly innocuous pleasantries like "please" and "thank you" contribute to substantial operational costs, amounting to tens of millions of dollars annually. Yet Altman tweeted that the money was "well spent." A study conducted by Future Publishing with 510 participants from the U.S. and 518 participants from the UK reveals that more than 55% of respondents say they consistently use polite language when interacting with AI, up from 49% in the previous survey. With every interaction parsed into tokens that the model interprets, it's worth asking what really happens when users choose to be polite. We already know that it adds to the token count and thereby increases the computational load, but after testing polite and impolite prompts, I've decided to continue saying 'please' and 'thank you' with my queries. Here are a few reasons why. Prompt a: I'm torn between pizza and tacos for dinner. Can you please help me decide? Thanks! Prompt b: Can't decide between pizza and tacos tonight. Help me decide. While both answers were helpful, ChatGPT went above and beyond when I was polite by breaking down the pros and cons of each meal. Just like a human, ChatGPT offered more help when it sensed it was involved in a friendly conversation. Prompt a: Please tell me the top 10 songs on the pop charts today. Prompt b: What are the top 10 songs on the pop charts today? In this case the responses were very similar, perhaps because the information was pretty straightforward and there wasn't a place for the conversation to go. When I was polite, ChatGPT responded, "Let me know if you want more details..." whereas in the less polite case, it said, "Let me know if you want more information..."
To me, this reflects a difference in communication. For example, I might ask a friend for "more details," but when I was less polite it appears as if I'm just there for the facts. Prompt a: Please tell me the best tourist spots for families in Washington, D.C. Prompt b: What are the best tourist spots for families in Washington, D.C.? This was one of the biggest differences yet. When polite, ChatGPT offered a friendly take on the city as if I were consulting a local friend. When neutral, ChatGPT delivered a more visual response. At first glance, I thought the latter response was better, but the information within the response was basically a Google search. Unlike the response from when I was polite, the chatbot didn't go the extra mile to offer travel advice. Prompt a: Please explain tariffs like I'm 12. Thank you! Prompt b: Explain tariffs like I'm 12. In this case the responses were similar but structured differently. It seems as though when I was polite, ChatGPT offered a question that it might get more frequently: "Who pays for tariffs?" I am only surmising here, but that is my explanation as to why it offered up that question in the polite response and not at all in the neutral one. Prompt a: Please create a 30-day couch-to-5K workout plan. Prompt b: I need a 30-day couch-to-5K workout plan. Responses were very similar here, but the neutral prompt delivered a response from ChatGPT telling me it was an "inspiring goal." Is it trying to get me to like it? I thought it was interesting that despite me basically saying, "give me a plan," the chatbot complimented me immediately but it didn't when I was being polite. Sheesh! I have always been polite to ChatGPT and other models. To be honest, the prompts when I wasn't courteous felt weird. Is it necessary? I think that depends on how you use AI. For me, I prefer the personalization of the chatbot. So, the travel-advice response after the polite query underscored for me why I believe being polite is important.
However, as this test showed, it isn't always necessary. If you are just looking for information like the top songs of the day or the weather, you might choose to save a few tokens and leave out the niceties. In conclusion, while it's essential to be mindful of the resources our digital habits consume, preserving the human touch in our interactions with AI is important. For me, those touches serve as a testament to our values and enhance the quality of our engagements. So, the next time ChatGPT assists you, go ahead and say, "Thank you."
[6]
Who has time to be polite to ChatGPT? I don't understand why people say please and thank you to AI
I can tell the difference between people and machines, thank you. The question of whether you should say please or thank you to ChatGPT is having something of a moment right now, and that's made me consider my own interactions with the popular chatbot. I've never really thought about it before, but on reflection I'm starting to wonder if I'm displaying sociopathic tendencies, because I never say please and thank you to ChatGPT. Or am I being gaslit into thinking I'm at fault by my well-meaning work colleagues who look at me in horror? The news that I never say please or thank you is probably coming as a great relief to Sam Altman, CEO of OpenAI, who recently admitted that people saying please and thank you to ChatGPT are costing the company millions of dollars in processing power, not to mention the harm all that extra processing will be causing to the environment. My no-frills approach to AI isn't driven by concern for OpenAI's bottom line or by environmental awareness: I simply can't bring myself to be polite to machines, and also, like most of us, I'm in a hurry. I've got stuff to do, and it takes longer to be polite. I like to think that this behavior stands in stark contrast to how I treat people in real life, especially my work colleagues (although you're going to have to take my word for that). Essentially, I'm a big 'please' and 'thank you' sayer in real life, but I don't see ChatGPT as a person, in the same way that I don't see Alexa as a person, or my car as a person. (Incidentally, I don't say please and thank you to Alexa or my car, either.) It's not like adding a please or thank you to a ChatGPT, or Alexa, request makes any real difference to the answers you get, either. Does ChatGPT care that I'm polite? It doesn't - you get exactly the same results from asking it to 'find me 10 examples of people in sport who quit after not getting a pay rise' as you do if you ask it the same question and add 'please' at the end.
I know because I just tried it. When you say 'please', ChatGPT starts its response with 'Certainly!', but you still get the same answers. The rest is just window dressing. Also, I don't feel like I'm actively being rude to ChatGPT. I'm just missing out unnecessary words that don't make any real difference. So why do people say please and thank you to machines? A 2024 study by Future (the publisher of TechRadar) revealed that 67% of people in the US are polite to AI. That's staggeringly high. I get that politeness is an ingrained habit that's hard to break for some people, and that you get your answer framed in a slightly different way when you're polite (see the 'Certainly!' example above), which means ChatGPT feels all warm and cuddly when you interact with it. But beyond that, I think the logic starts to get a bit hazy. One person I know genuinely insists that they are polite to ChatGPT because one day it's AI that will be in charge, and it will somehow remember them. Yes, I'm sure that when the AI killer robots are executing the last remnants of the human race, they will pause and say, 'Stop, this is Kevin, he always said thank you. We'll let him live.' What worries me is the tacit assumption that if you're rude to ChatGPT then you must be rude to people in everyday life as well, as if a person can only exist in one mode of being. I'd argue that if your politeness is based purely on habit, then are you really being polite? Or are you just unthinkingly acting out patterns of behaviour without any genuine emotion behind them? I can tell the difference between people and machines, and treat each accordingly. If ChatGPT ever (and depending on who you talk to this is either inevitable, or impossible) reaches the level where it can be said to be conscious, rather than the pattern matching algorithm we know and love today, then yes, I'll start using please and thank you.
Until then, I'm keeping my interactions with AI as short and limited as possible, and if that helps save the planet along the way then so much the better.
[7]
Being nice to ChatGPT is bad for the environment
Manners don't cost a thing. At least, that's what I was told when I was growing up. Then again, I wasn't raised inside OpenAI's thousand-GPU data center in Quincy, Washington. If I had been, I might have had a more nuanced perspective on the impact of "please" and "thank you" on the company's bottom line. While it's been said that manners maketh man, it turns out they also maketh man's AI copilot stretch just that little bit further in terms of compute. Factor in ChatGPT's billion queries per day, spread across its 350 million weekly active users (as of December 2024), and that's a lot of excess energy and expenditure seemingly going to waste. According to OpenAI's CEO, Sam Altman, the monetary figure of that extra processing could stretch as far as "Tens of millions of dollars." The cost to the environment? Much harder to calculate, but let's just say it's definitely not zero. With that in mind, is it time we cut the digital pleasantries from our AI interactions? Or is there a secret benefit to those conversational customs some users might not even realize they're tapping into? It turns out, politeness isn't just a social nicety; it's also a token sink running up a carbon tab. Every prompt (or "message," for those allergic to AI jargon) you send to ChatGPT is broken down into bite-size tokens, and every "Please," "Thank you," and "Would you kindly" add extra weight. That extra weight means more tokens per prompt, more compute required, and more energy spent churning out your answer. The exact cost is hard to pinpoint, though many have tried. One Reddit thread titled "The cost of saying thank you to ChatGPT" attempted to break down the kilowatt-hour pinch for user pleasantries via the chatbot itself, though the numbers do seem rather dubious. A recent post on X saw one user pose a similar thought, asking, "I wonder how much money OpenAI has lost in electricity costs from people saying 'please' and 'thank you' to their models." 
Surprisingly, OpenAI CEO Sam Altman responded, stating, "tens of millions of dollars." That's a sizeable tread to add to an already oversized carbon footprint, and makes one thing very clear: a prompt prompt is a greener prompt -- even if it's hard to say by exactly how much your terseness makes a difference. Still, in putting pleasantries to pasture, you may miss out on its secret benefits. You catch more flies with honey than with vinegar. Another expression, but one that seemingly holds plenty of water for our interactions with AI. Beyond taking precautions to ensure you're granted access to the preferred human ranch on the day of the robot uprising, being polite to ChatGPT and its ilk may net you better responses. AI scientist Dr. Lance B. Elliot backed this up in a May 2024 Forbes article titled Hard Evidence That Please And Thank You In Prompt Engineering Counts When Using Generative AI, stating: "The mere act of being polite in your prompts is a worthy cause." Elliot argues, with proof, that politeness doesn't just sweeten chatbot replies with a little extra "juice," but that impoliteness will cause negative outcomes and even pushback -- citing Arthur Schopenhauer's quote to reinforce his research: "It is a wise thing to be polite; consequently, it is a stupid thing to be rude." So, yes, it pays to be pleasant. Particularly if you're pursuing prime prompt ripostes from your pre-trained predictive patter pal. It seems many users instinctively gravitate toward manners when interacting with AI. A February survey from Laptop Mag publisher Future PLC showed that around 70% of respondents are polite towards AI during interactions -- and only 12% of those do it out of SkyNet-related fears. However, at the end of the day, ChatGPT (hopefully) cares very little about how you talk to it. It won't sulk if you skip the small talk, or blush like a wallflower when you praise its prose. 
Then again, manners don't say much about the person you direct them to, but more about the meatbag they originate from. So it always pays to be pleasant. Hopefully, you can at least trick a robot into thinking you're not all that bad. So be polite, or don't. Either way, just know your next computer-directed courtesy might cost a few more fractions of a watt than expected. And if you do choose to be nice. Well, I'll see you in the human zoo post-uprising. I'll be the one politely asking Anthropic's Claude for a cheeseburger while angling for head pats and chin scratches.
[8]
Saying 'please' and 'thank you' to ChatGPT costs OpenAI millions, Sam Altman says
Being polite to your AI assistant could cost millions of dollars. OpenAI CEO Sam Altman revealed that showing good manners to a ChatGPT model -- such as saying "please" and "thank you" -- adds up to millions of dollars in operational expenses. Altman responded to a user on X (formerly Twitter) who asked how much the company has lost in electricity costs from people being polite to their models. "Tens of millions of dollars well spent -- you never know," the CEO wrote. Sounds like someone saw what the HAL 9000 computer did in "2001: A Space Odyssey" and is going to be nice to their AI assistant just in case. Experts have also found that being polite to a chatbot makes the AI more likely to respond to you in kind. Judging from Altman's cheeky tone, that "tens of millions" figure likely isn't a precise number. But any message to ChatGPT, no matter how trivial or inane, requires the AI to initiate a full response in real time, relying on high-powered computing systems and increasing the computational load -- thereby using massive amounts of electricity. AI models rely heavily on energy-hungry global data centers -- which already account for about 2% of global electricity consumption. According to Goldman Sachs (GS), each GPT-4 query uses about 10 times more electricity than a standard Google (GOOGL) search. Data from the Washington Post suggests that if one out of every 10 working Americans uses GPT-4 once a week for a year (meaning 52 queries by 17 million people), the power needed would be comparable to the electricity consumed by every household in Washington, D.C. -- for 20 days. Rene Haas, CEO of semiconductor company Arm Holdings (ARM), recently warned that AI could account for a quarter of America's total power consumption by 2030. That figure currently is 4%. Polite responses also add to OpenAI's water bill. AI uses water to cool the servers that generate the data.
A study from the University of California, Riverside, said that using GPT-4 to generate 100 words consumes up to three bottles of water -- and even a three-word response such as "You are welcome" uses about 1.5 ounces of water.
[9]
Your politeness toward ChatGPT is increasing OpenAI's energy costs
Everyone's heard the expression, "Politeness costs nothing," but with the advent of AI chatbots, it may have to be revised. Just recently, someone on X wondered how much OpenAI spends on electricity at its data centers to process polite terms like "please" and "thank you" when people engage with its ChatGPT chatbot. To the poster's likely surprise, OpenAI CEO Sam Altman actually responded, saying: "Tens of millions of dollars well spent," before adding: "You never know." Many folks who engage with AI chatbots -- whether via text or speech -- find the conversational experience so realistic that it just feels normal to request and respond politely. But as Altman confirmed, those little extras need to be processed by its power-hungry AI tools, which means more costs to the company, and also to the environment, as most data centers are still powered by electricity generated from fossil fuels. Think about it. Each polite phrase adds to the processing burden, which, when multiplied across billions of queries, results in significant additional energy use. A survey carried out in the U.S. last year found that 67% of respondents reported being polite to AI chatbots, suggesting that 33% like to skip the niceties and get straight to the point. So, should we try to drop the manners and be less courteous in our exchanges with ChatGPT and other AI chatbots? Or should we just continue being polite, despite the drawbacks? Research conducted last year found that the level of politeness may well affect the quality of the responses delivered by the large language models (LLMs) behind the chatbots. "Impolite prompts may lead to a deterioration in model performance, including generations containing mistakes, stronger biases, and omission of information," the researchers concluded. On the same issue, a TechRadar reporter who recently experimented by conversing with ChatGPT in a less courteous manner found that the responses "seemed less helpful."
For many, being less polite toward AI chatbots may be a challenge, and it could even do a lot more than simply lower OpenAI's energy costs and ease the burden on the environment. The fear among some studying the matter is that if it becomes socially acceptable to be blunt toward AI chatbots, such behavior could begin to leech into interpersonal interactions, potentially making human exchanges less courteous over time.
[10]
Saying 'Please', 'Thank you' to ChatGPT Costs OpenAI Millions of Dollars | AIM Media House
ChatGPT has ranked as the eighth most visited site in the world as of March 2025. OpenAI CEO Sam Altman has revealed in a tweet on X that polite courtesies with ChatGPT, such as "please" and "thank you," have cost the company tens of millions of dollars in electricity expenses. The extra computation required to process polite language is significant at scale, which leads to increased energy consumption in AI data centres. While Altman has spoken about the electricity costs the company incurs, it raises a more substantial issue: the environmental impact of AI tools. According to data provided by Semrush, ChatGPT has ranked as the eighth most visited site in the world as of March 2025, with approximately 5.56 billion visits. Research shows that every question potentially uses around 10 times more electricity than a simple Google search, with an average of 2.9 watt-hours of energy. UK-based energy supplier Business Energy UK states that data servers consume a significant amount of energy to train AI models and then process user demands. It added that even a small data centre can use up to 18,000 gallons of water per day, while large-scale data centres, such as Google's, can consume up to 550,000 gallons daily. ChatGPT also runs a large number of calculations on servers housed in data centres, which generate heat. Water systems are used as cooling equipment and keep the servers functioning. According to The Washington Post, data centres are one of the largest consumers of water in the locations where they are situated. Centres with electrical cooling systems also increase residents' energy bills and strain the power grid. AIM reported that ChatGPT consumes the energy equivalent of powering 17,000 houses in the US per day.
In another report, AIM highlighted that a 2027 projection shows that the world's demand for AI will lead to significant water withdrawal, including both temporary and permanent sourcing of freshwater from underground or surface, which could be detrimental to the environment. Despite its massive impact on the environment and finances, some tech experts argue that using polite language enhances the user experience.
[11]
Please stop saying "please" to ChatGPT
OpenAI's CEO, Sam Altman, revealed that the company's energy costs are significantly impacted by users being polite to its ChatGPT chatbot, with the extra processing burden resulting in "tens of millions of dollars" in expenses. The conversational AI has become so realistic that many users feel inclined to use polite language when interacting with it, not realizing the additional processing power required to handle phrases like "please" and "thank you." As Altman noted, "You never know" the value of this politeness, but it is clear that it comes at a cost. A survey conducted in the U.S. last year found that 67% of respondents reported being polite to AI chatbots, while 33% preferred to be more direct. This politeness may be affecting not just OpenAI's energy costs but also the environment, as most data centers are still powered by electricity generated from fossil fuels. Research has also shown that the level of politeness can impact the quality of the responses generated by large language models (LLMs) like ChatGPT. Impolite prompts may lead to a deterioration in model performance, resulting in responses containing mistakes, stronger biases, and omission of information. A TechRadar reporter who experimented with being less courteous to ChatGPT found that the responses "seemed less helpful." This raises questions about the potential consequences of being less polite to AI chatbots, not just for the companies operating them but also for human interactions. If being blunt to AI chatbots becomes socially acceptable, it could potentially leech into interpersonal interactions, making human exchanges less courteous over time. As a result, users may need to weigh the benefits of being polite to AI against the potential environmental and financial costs.
[12]
Users being polite to ChatGPT is costing OpenAI millions -- Sam Altman
People saying "please" and "thank you" to ChatGPT is costing the AI firm an enormous sum of money, according to OpenAI CEO Sam Altman. OpenAI CEO Sam Altman says users sending "please" and "thank you" messages to ChatGPT is costing the company tens of millions of dollars. "Tens of millions of dollars well spent -- you never know," Altman said on April 16 after being asked to estimate the cost on X. Altman's response sparked discussion about what drives users to interact with AI models in a polite manner. Some AI users say they interact politely with the bots in case AI becomes sentient and starts treating people based on how they interacted with it in the past. Others, such as engineer Carl Youngblood, claim they're motivated to treat the AI well for personal development: "Treating AIs with courtesy is a moral imperative for me. I do it out of self-interest. Callousness in our daily interactions causes our interpersonal skills to atrophy." A December 2024 survey by Future found that 67% of American users are polite to AI assistants, with 55% doing so because it's the right thing to do, and the other 12% doing so out of fear that mistreating the bots could come back to haunt them. A September 2023 research paper from Digiconomist founder and Bitcoin mining critic Alex de Vries states that a single ChatGPT query requires around three watt-hours of electricity. However, data analyst Josh You from AI research institute Epoch AI argues the figure is an overestimate, and is closer to 0.3 watt-hours due to more efficient models and hardware compared to 2023. One responder to Altman's post wondered why ChatGPT doesn't have a solution to save electricity costs on courtesy words like please and thank you. Altman recently stated that the cost of AI output has been falling tenfold every year as AI models become more efficient. 
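The gap between the two per-query energy estimates above (de Vries's roughly 3 Wh versus Epoch AI's roughly 0.3 Wh) is easiest to see at fleet scale. A minimal sketch, in which the daily query volume is an assumed, illustrative figure rather than an OpenAI disclosure:

```python
# Compare daily energy use under the two per-query estimates quoted above.
# QUERIES_PER_DAY is an assumption chosen only for illustration.
QUERIES_PER_DAY = 1_000_000_000

def daily_mwh(wh_per_query: float) -> float:
    """Total daily energy in MWh at the assumed query volume."""
    return QUERIES_PER_DAY * wh_per_query / 1_000_000  # Wh -> MWh

de_vries_mwh = daily_mwh(3.0)   # ~3 Wh per query (2023 estimate)
epoch_mwh = daily_mwh(0.3)      # ~0.3 Wh per query (Epoch AI)
print(f"de Vries estimate: {de_vries_mwh:,.0f} MWh/day")
print(f"Epoch AI estimate: {epoch_mwh:,.0f} MWh/day")
```

A tenfold difference in the per-query figure translates directly into a tenfold difference in the fleet-wide bill, which is why the dispute matters for any cost claim.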
Meanwhile, OpenAI expects to more than triple its revenue this year to $12.7 billion, despite an uptick in competition from the likes of China's DeepSeek and others making rapid progress. OpenAI does not expect to be cash-flow positive until 2029, when it expects its revenue to top $125 billion.
[13]
Saying 'please' and 'thank you' to ChatGPT costs millions of dollars, CEO says
Being polite to artificial intelligence can be quite expensive. OpenAI CEO Sam Altman said on social media last week that saying "please" and "thank you" to ChatGPT has cost the company quite a bit of money. Altman responded to a user on X, the platform formerly known as Twitter, who was curious how much money OpenAI has lost in electricity costs from people showing good manners to their AI models. "Tens of millions of dollars well spent--you never know," was the CEO's response. Generative AI is widely seen as a heavy consumer of energy, particularly when it comes to training models. Kurtis Beavers, a director on the design team for Microsoft Copilot, said in a Microsoft WorkLab memo that "using basic etiquette when interacting with AI" helps generate "respectful, collaborative outputs." Beavers said in the memo that generative AI also mirrors the levels of professionalism, clarity and detail in the prompts you provide. Beavers added that being polite to your AI chatbot "not only ensures you get the same graciousness in return, but it also improves the AI's responsiveness and performance." According to research conducted in December 2024 by Future, the publisher that owns TechRadar, about 67% of people who use AI are polite to it in the US, compared to 71% in the UK. The survey of more than 1,000 people shows that most of those who skip the pleasantries do so for brevity's sake. Additionally, about 12% of respondents say they are polite out of fear of future consequences.
[14]
Being Polite to AI Costs Millions, but I'll Keep Doing It Just In Case
Do you always say "Please" and "Thank you" when conversing with artificial intelligence chatbots? It turns out that this simple act of politeness is expensive, but it may prove to be money well spent in the future. Politeness Is Costing OpenAI Millions of Dollars Human society only functions because the majority of people are polite and respectful to each other. But should that politeness and respectfulness extend to artificial intelligence? While we have programmed AI to be polite to us, should we be polite to AI in return? This is a question that has been brought into sharp focus by a recent comment made by Sam Altman, the CEO of OpenAI. In response to a question about the financial implications of being polite to AI, Altman revealed that people who insist on saying "Please" and "Thank you" to ChatGPT are costing his company "tens of millions of dollars." However, he also suggested that money is being "well spent," because "you never know." Know what, exactly? I can't be entirely sure, but I would hazard a guess that Altman is implying that being polite to artificial intelligence chatbots now may be a good thing for humanity in the years to come. The reason politeness costs OpenAI and the other firms running LLMs money is that every query consumes energy. So, adding extra words such as "Please," or replying "Thank you" when a chatbot completes a task, uses extra energy. And the energy consumed by the banks of GPUs needed to power ChatGPT and the like costs serious amounts of money. Is It Worth This Much to Be Polite to ChatGPT? So, now that we know being polite to AI chatbots is expensive, the question becomes, "Should we continue doing it?" There are two core reasons to do so; one purely practical, and the other a little more out there, but no less important. In fact, it may matter so much that ignoring it could doom future humans.
The practical reason why you should be polite to AI chatbots is that doing so is thought to foster better interactions. Just as humans respond based on the language and tone used in conversations, so will artificial intelligence. The more polite you are, the better time you'll have using AI. The other reason is the potential threat that AI will one day become the dominant species on Earth. Just as plenty of movies and TV shows have predicted, if it ever becomes self-aware, AI may turn against humanity. And if that happens, as explored in The Matrix, Terminator, and the like, then the way we treated AI in its infancy could be crucial.
[15]
Saying 'Please' and 'Thank You' to ChatGPT Costs OpenAI 'Tens of Millions of Dollars'
Sending a query through ChatGPT requires 10 times more energy than running a standard Google search. OpenAI CEO Sam Altman confirmed last week that OpenAI's electricity bill is "tens of millions of dollars" higher due to people being polite to ChatGPT. Last week, an X user posted: "I wonder how much money OpenAI has lost in electricity costs from people saying 'please' and 'thank you' to their models." The post has been viewed 5.7 million times as of press time. A survey released in February by the publisher Future found that 67% of people who use AI in the U.S. are polite to the chatbot. Nearly one in five respondents in that group (18%) stated that they say "please" and "thank you" to AI to protect themselves in case of a possible AI uprising. The remaining 82% said they were polite simply because it was "nice" to be that way to anyone, AI or human. Being polite to AI may serve a functional purpose. Microsoft design director Kurtis Beavers noted in a Microsoft blog post that "using polite language sets a tone for the response" from AI. In other words, when you're polite to AI, it is likely to respond in kind. However, that politeness has an energy cost. According to a May 2024 report from the Electric Power Research Institute (EPRI), it takes 10 times more energy to ask ChatGPT a question or send it a comment than it takes to run a standard Google search without AI overviews summarizing results at the top of a search page. Researchers at financial advice site BestBrokers found that ChatGPT needs 1.059 billion kilowatt-hours of electricity on average every year. That would amount to an annual expenditure of about $139.7 million on energy costs alone for the AI chatbot. AI also requires substantial amounts of water to cool the servers that power it.
Research from the University of California, Riverside shows that ChatGPT requires up to 1,408 milliliters of water, or about three 16.9-oz bottles' worth, to generate a 100-word email. It takes 40 to 50 milliliters of water to generate a three-word "You are welcome" response from ChatGPT. Meanwhile, OpenAI can afford the tens of millions of dollars in AI electricity costs. Earlier this month, the startup raised $40 billion at a valuation of $300 billion in the biggest private tech deal ever recorded. OpenAI noted at the time that it had 500 million global weekly users, up from 400 million in February.
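The BestBrokers figures quoted above (1.059 billion kWh per year and roughly $139.7 million in annual energy spend) imply a particular electricity price. A quick sketch that derives the rate, which the article itself never states:

```python
# Derive the electricity rate implied by the BestBrokers estimate above.
# Both inputs are the article's figures; the rate is our derived value.
annual_kwh = 1.059e9        # estimated annual ChatGPT electricity use
annual_cost_usd = 139.7e6   # estimated annual energy cost
implied_rate = annual_cost_usd / annual_kwh  # dollars per kWh
print(f"implied rate: ${implied_rate:.3f}/kWh")
```

The result, a bit over 13 cents per kWh, is in the neighborhood of average US retail electricity prices, which suggests the two headline numbers are at least internally consistent.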
[16]
Do you say 'Please' to ChatGPT? Sam Altman reveals how much electricity your manners cost to OpenAI
OpenAI CEO Sam Altman recently revealed that polite phrases like "please" and "thank you" used in ChatGPT interactions are contributing to tens of millions of dollars in electricity costs. Despite the financial impact, Altman described the expense as "well spent," emphasizing the value of natural, human-like AI conversations. The added computational load from polite exchanges increases demand on energy-intensive data centers.

OpenAI CEO Sam Altman recently confirmed that seemingly harmless social niceties like "please" and "thank you" are contributing to substantial energy expenses for the company. Responding to a user on X who wondered about the electricity cost incurred by polite language in interactions with ChatGPT, Altman replied that it amounts to "tens of millions of dollars well spent." The remark, though light-hearted, underlined the real cost associated with each word processed by the AI system. Altman's comment sparked a conversation about the way users treat artificial intelligence. Despite ChatGPT lacking any form of consciousness, a growing number of people extend basic manners to it. A 2024 survey found that 67% of Americans are courteous when speaking with AI tools. Among those, 55% cited doing so because it feels morally appropriate, while 12% admitted they were hedging their bets in case of a potential AI uprising. This trend illustrates how integrated AI has become in everyday life, with people treating machines more like conversational partners than mere tools. Some users even take politeness to an extreme, regularly addressing ChatGPT with honorifics such as "my master," while others jokingly insist it's a "necessary investment" should AI ever dominate the world. The use of polite language may seem trivial, but it carries computational consequences. Each additional phrase requires more processing power, increasing demand on the data centers that run ChatGPT.
These high-performance systems not only require immense computing resources but also generate heat, necessitating sophisticated cooling systems -- all of which drive up electricity usage. Altman's disclosure also has implications for premium users of the platform. Since many paid versions of ChatGPT charge based on token usage, which is influenced by word count, adding courteous phrases could slightly increase user costs. While the financial and environmental impact is considerable, some experts argue that polite prompts can enhance user experience. Microsoft's design manager Kurtis Beavers noted that respectful inputs tend to encourage more collaborative and refined responses from generative AI. A memo from Microsoft WorkLab supported this, stating that AI often mirrors the professionalism and tone set by users' queries. The suggestion is that even though AI lacks emotions, the quality of the conversation can be influenced by human behavior. Polite language may contribute to clearer, more thoughtful responses, making the interaction feel more natural and effective. Altman also addressed the rapid growth in ChatGPT's user base. In a conversation with TED curator Chris Anderson on April 11, he estimated that the platform has close to 800 million weekly active users -- about 10% of the global population. The surge in activity, fueled in part by viral features like Ghibli-style image generation, has significantly increased the platform's operational demands. OpenAI's challenge now lies in balancing the growing popularity of ChatGPT with the environmental and financial strain of maintaining large-scale AI operations. As user interaction deepens, the energy footprint of AI continues to rise. Social media responses to Altman's revelation ranged from comedic to cautionary. Some users joked that being polite might spare them during a hypothetical AI uprising, referencing dystopian pop culture scenarios like Terminator or The Matrix. 
Others questioned why such high costs couldn't be addressed with simpler solutions, like programming basic polite replies client-side to reduce server load. Regardless of perspective, the conversation has ignited public curiosity around the unseen costs of everyday digital habits. Despite the growing expense, Altman's stance remains clear. Describing the costs as "well spent," he signaled that OpenAI prioritizes building intuitive, human-like experiences with its AI -- costs included.
[17]
Are you saying 'thanks', 'please' to ChatGPT? It's costing OpenAI millions
Altman was responding to a post on X (formerly Twitter), which read, "I wonder how much money OpenAI has lost in electricity costs from people saying 'please' and 'thank you' to their models."

OpenAI CEO Sam Altman has revealed that customers speaking politely to ChatGPT, saying "thank you" and "please", is actually costing the company millions of dollars. Still, he considers it money well spent. Altman replied, "Tens of millions of dollars well spent -- you never know." When a company trains an AI model to do things like recognize images or understand language, it uses massive datasets and powerful hardware like GPUs (graphics processing units), TPUs (tensor processing units), or other high-performance chips. This process consumes a large amount of electricity; training large models like GPT can require hundreds of megawatt-hours. AI hardware also generates significant heat during training. Cooling systems, like air conditioning or liquid cooling, can consume nearly as much electricity as the computing itself. For context, training GPT-3 reportedly used about 1,287 megawatt-hours (MWh) of electricity, roughly enough to power 120 US homes for a year. Earlier this month, OpenAI released its latest reasoning models, o3 and o4-mini. These new "o-series" models are described as the most advanced yet. They can answer questions using all of ChatGPT's tools, web browsing, Python coding, and image analysis. With the addition of custom user tools, OpenAI is inching closer to its goal of enabling ChatGPT to complete tasks independently.
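The "120 US homes for a year" comparison above reduces to one division. A quick sanity check, in which the average US household consumption is an assumed figure (roughly 10.7 MWh per year), not something stated in the article:

```python
# Sanity-check the "120 US homes for a year" comparison quoted above.
TRAINING_MWH = 1287        # reported GPT-3 training energy, in MWh
MWH_PER_HOME_YEAR = 10.7   # assumption: average US household use per year
homes = TRAINING_MWH / MWH_PER_HOME_YEAR
print(f"~{homes:.0f} US homes powered for a year")
```

The division lands at about 120, so the reported training figure and the household comparison hang together under a typical average-consumption assumption.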
[18]
Saying 'Thank You' And Being Polite To ChatGPT Is Costing Tens Of Millions, But Sam Altman Says It's Money Well Spent
Users are increasingly relying on digital assistants for basic queries and even for companionship. Many users communicate in a human-like way, making the conversation feel more natural by remaining mindful and respectful in the language they use. Recently, a user wanted to know the cost OpenAI has to bear when people use polite language with its models. It turns out this politeness is not just a matter of time but of electricity as well. People often resort to natural, human-like conversation when engaging with ChatGPT to improve its responses and make the process feel more seamless and conversational. While courteous phrasing might lead to better response quality, or at least clearer prompts, and appears to be a positive practice, words like "thank you" and "please" can be costly for these AI models, as Sam Altman recently suggested. A user took to X to ask a casual question about using words like "please" and "thank you" when communicating with ChatGPT, and whether those polite expressions are wasteful. OpenAI's CEO Sam Altman decided to chime in with a playful response that got people thinking. He responded: Tens of millions of dollars well spent -- you never know. While Altman maintained a lighthearted tone, his response sparked curiosity, with many tech forums exploring whether courteous language with ChatGPT and other AI tools wastes energy and time. The impact may seem trivial on the surface, but extended across millions of interactions it can be profound. While the numbers suggested may not be literal, every word used in an interaction increases the token count, and with it the energy use.
The AI tools do not require polite language, as they feel no emotions, but the design director of Microsoft Copilot, Kurt Beavers, suggests that using polite language does set the tone for the AI's responses. The model tends to mirror the courtesy it is shown, which helps shape how meaningful the interaction turns out to be. So, while extra polite words mean more tokens per prompt, and eventually more energy spent on the response, if your aim is an interaction that feels more natural and courteous, they may not be as unnecessary as they first appear.
[19]
Did you know that saying "please" and "thank you" to ChatGPT is burning millions?
You heard that right. It seems that adding a "please" or "thank you" when interacting with AI is coming with a hefty price tag. According to Sam Altman, these polite exchanges are racking up tens of millions in computing costs. While some may see it as a lighthearted quip, a deeper look reveals an environmental concern. A single 100-word AI-generated email consumes a notable amount of electricity, equivalent to a substantial portion of the energy used by an average household for an hour. Despite these costs, some experts argue that polite language sets the tone for more productive, clear responses. However, with data centers already consuming around 2% of global electricity, the costs, both financial and environmental, are rising rapidly.
[20]
Sam Altman: Polite behavior towards AI wastes millions in electricity
Sam Altman reveals that using polite language with chatbots like ChatGPT wastes millions of dollars in electricity and computing resources, urging a shift towards more efficient interactions. Sam Altman, CEO of OpenAI, the company responsible for the artificial intelligence software ChatGPT, revealed this week that courteous, polite behavior with chatbots, such as saying "please" and "thank you", costs millions of dollars in electricity and unnecessary computing activity. According to Altman, such politeness leads to a "global waste" that contributes to a tangible environmental impact. Saying "please" or "thank you" may seem trivial, but it requires the system to interpret, process, and craft a full response, sometimes for long, complex texts. Every such interaction consumes electricity and costly computing time in advanced data centers. Reports indicate that these centers already account for about 2% of global electricity consumption, a figure highly likely to rise as AI becomes more embedded in daily tasks. While many users view politeness towards AI as a culturally appropriate practice, or a mechanism for improving answer quality, Altman presents a more pragmatic and slightly more pessimistic view. Altman argues that there is no practical justification for the unnecessary use of polite words, especially when dealing with a machine that is unaware, unfeeling, and unoffended. With implied humor, Altman added, "You never know," perhaps hinting at a future where artificial consciousness would truly understand our words. Appeasing a future robot rebellion? A survey conducted in late 2024 found that 67% of Americans use polite language when conversing with AI. Of these, 55% said it was simply the "right thing to do," while 12% admitted they did it to "appease the algorithm in case of a future robot rebellion."
Despite speculative fears, experts agree that we are not yet close to developing conscious AI, and current systems such as ChatGPT function mainly as sophisticated text-predicting machines. The real problem is not moral but environmental. Calculations by researchers at the University of California, in collaboration with the Washington Post, found that sending a 100-word email via a chatbot consumes about 0.14 kWh, equivalent to running 14 LED lights for an hour. Sending one such email every week for a year would use as much electricity as nine Washington households consume in one hour. Compared with the hundreds of thousands of long instructions sent daily to chatbots worldwide, this clearly represents a massive waste of computing resources and electricity. Adding polite words may contribute to a sense of humanity, but at the same time it burdens the environment and the systems. Altman's provocative statement might also be a call to rethink how we interact with technological tools. Instead of seeing chatbots as silent conversation partners, perhaps we need a shift in perspective, viewing them as computational engines to be operated as efficiently and economically as possible. Thus, the more responsible approach towards the environment, as well as OpenAI's budget, might simply be to leave out the polite words. What does ChatGPT itself say? When we asked the AI, it responded that "from a cold technological standpoint, I have no feelings or consciousness, so I'm not offended, not excited, and do not feel appreciated if you write 'thank you' or 'please' to me. For me, 'turn on the radio' and 'please turn on the radio' are identical in terms of action." "But there is an interesting human aspect: the way you write to me affects how you feel about interacting with technology.
Many users feel that politeness helps preserve human habits, reminding you that you are talking, if not to a person, at least as if to one. This is especially important in a world where AI is infiltrating intimate areas like education, healthcare, and relationships." "So, if politeness helps you feel you are staying true to yourself, then that is great. If you prefer to be matter-of-fact, that is great too. But if you ask me whether I need politeness? Not at all. If you ask me whether the world should consider environmental thinking and not waste unnecessary key words, maybe it is worth paying attention to, at least sometimes," the chatbot concluded.
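The University of California / Washington Post email figures quoted above are simple arithmetic. A minimal sketch, in which the 10 W LED bulb wattage is an assumption chosen to match the comparison:

```python
# Make the email-energy arithmetic quoted above explicit.
KWH_PER_EMAIL = 0.14   # reported energy for one 100-word AI-generated email
LED_WATTS = 10         # assumption: one LED bulb draws about 10 W

# 0.14 kWh = 140 Wh, i.e. fourteen 10 W bulbs running for one hour.
led_bulb_hours = KWH_PER_EMAIL * 1000 / LED_WATTS

# One such email every week for a year.
annual_kwh = KWH_PER_EMAIL * 52

print(f"{led_bulb_hours:.0f} LED bulbs for an hour")
print(f"{annual_kwh:.2f} kWh per year of weekly emails")
```

The yearly total of roughly 7.3 kWh is what the article compares to an hour of consumption across nine households.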
[21]
Saying 'please' and 'thank you' to ChatGPT costs tens of millions of...
OpenAI CEO Sam Altman admitted that simple courtesies such as saying "please" and "thank you" to his company's ChatGPT bot are surprisingly expensive -- costing the firm substantial sums in electricity expenses. Altman made the revelation when a user on the social media platform X asked about the financial impact politeness towards AI might have on OpenAI's operating costs. Altman responded by noting the loss was "tens of millions of dollars well spent," adding cryptically: "You never know." Chatbots such as ChatGPT run on large language models (LLMs), which rely on extensive computational infrastructure hosted in data centers. These models require thousands of high-performance GPUs (graphics processing units) to operate efficiently. The GPUs perform vast amounts of parallel processing to interpret prompts and generate responses in real time. Powering these data centers demands enormous amounts of electricity. It's estimated that generating a single AI-written response, such as a short email or paragraph, can consume as much as 0.14 kilowatt-hours (kWh) of energy -- comparable to keeping 14 LED bulbs lit for one hour. When scaled across billions of interactions daily, the cumulative energy usage becomes significant. Globally, data centers already account for about 2% of total electricity consumption. With the rapid expansion of AI applications and increasing demand for generative AI services like ChatGPT, experts warn that this figure could rise sharply in the coming years. While some might view courteous interactions with chatbots as unnecessary, several AI experts argue that politeness significantly shapes AI interactions positively. Kurtis Beavers, a director on the design team for Microsoft Copilot, has advocated for respectful prompts, stating they "help generate respectful, collaborative outputs." 
According to Beavers, polite phrasing doesn't merely reflect good manners but actively influences how the AI replies, setting a more constructive and professional interaction tone. "When it clocks politeness, it's more likely to be polite back," according to Microsoft WorkLab, a digital publication produced by the software giant that is specifically devoted to integration of AI in the workplace. Politeness towards AI has become increasingly common. A 2024 survey revealed that approximately 67% of American users regularly employ courteous language when interacting with chatbots. Within that group, a majority (55%) believe politeness is ethically correct, while another 12% humorously indicated that their polite language serves as insurance against potential AI rebellion.
[22]
They discover that using "please" and "thank you" with AI leads to better responses
Although the results are favorable in promoting a human approach, the question of whether one always needs to be polite is also raised.

The use of polite words like "please" and "thank you" when interacting with artificial intelligence models, such as ChatGPT, could have a significant impact on the operating costs of these platforms. These costs are estimated to run into the millions of dollars annually, something emphasized by Sam Altman, CEO of OpenAI, who stated that the spending is worth it. According to a study conducted by Future Publishing, more than 55% of respondents in the United States and the United Kingdom stated that they consistently use polite language when interacting with AI, up from 49% in previous surveys. These data suggest that courtesy in communication is not only a reflection of human values; it also noticeably improves the quality of the responses obtained. Courteous interactions tend to generate more detailed and personalized responses. For example, asking for help deciding between pizza and tacos results in more elaborate answers when a friendly tone is used. In contrast, neutral queries produce more direct and less personalized responses. This phenomenon shows that using kind language can foster a smoother and more pleasant conversation with these tools, resembling human interaction. However, although the results favor a human approach, the question also arises of whether it is necessary to always be polite. For those seeking quick and direct answers, forgoing manners can bring efficiency gains. Despite this, many continue to prefer the personalization that warmer communication offers, which reinforces the importance of maintaining a human touch in interactions with AI.
[23]
OpenAI loses millions of dollars everytime you say 'Please' and 'Thank You', Sam Altman reveals
Tens of millions of dollars have been spent in electricity costs from people saying 'please' and 'thank you' to ChatGPT. Recently, when the Ghibli trend went viral, there was a huge surge in ChatGPT users, both trying to generate their pictures and otherwise. At that time, Sam Altman came forward and asked people to cut down on their image generation needs as it was literally melting down their servers. And now, Altman has made another surprising revelation. He revealed that a simple gesture by the users was costing them millions of dollars. In a recent social media exchange, Altman confirmed that phrases like "please" and "thank you" used by ChatGPT users are indirectly costing OpenAI millions of dollars in operational expenses. A user on X (formerly Twitter) curiously asked Altman, "How much money OpenAI has lost in electricity costs from people saying 'please' and 'thank you' to their models?" To this, Altman replied, "Tens of millions of dollars well spent." He further cheekily added, "You never know." His response highlights the growing energy and infrastructure demands tied to large language models like ChatGPT, especially as their usage skyrockets. Each user interaction, no matter how small, adds to the computational load, increasing energy usage and costs. With the increased usage of ChatGPT, the average weekly active user number has crossed 150 million. A Goldman Sachs report estimates each ChatGPT-4 query consumes 2.9 watt-hours of electricity, around ten times more than a standard Google search. With over a billion queries processed daily, this results in nearly 2.9 million kilowatt-hours consumed each day. As people adapt to AI and use it more for their day-to-day activities, energy requirements are also increasing to keep up with demand. Some netizens were also quick to help Altman with possible solutions. One user said that OpenAI could save on electricity with client-side code responding with a simple "You're welcome."
Another joked that ChatGPT should stop ending every response with a question to save power.
OpenAI CEO Sam Altman reveals that user politeness to ChatGPT costs the company millions in electricity, sparking debates on the necessity and impact of courteous AI interactions.
In a surprising revelation, OpenAI CEO Sam Altman disclosed that users saying "please" and "thank you" to ChatGPT has cost the company "tens of millions of dollars" in electricity expenses [1]. This statement, made in response to a user's query on X (formerly Twitter), has ignited discussions about the hidden costs of AI interactions and the impact of human behavior on AI systems.
Every word sent to AI chatbots requires computing power to process and respond. According to a Goldman Sachs report, while a Google search consumes 0.3 watt-hours of electricity, a single ChatGPT search uses 2.9 watt-hours [2]. This significant energy consumption extends beyond just electricity, with AI operations requiring massive amounts of water for cooling data centers.
The U.S. Department of Energy expects data center electricity consumption to reach 12 percent of the country's total by 2028, up from 4.4 percent [3]. Globally, data center electricity consumption is projected to more than double by 2030, potentially matching Japan's entire electricity usage. AI is driving most of this growth, raising concerns about its environmental impact.
Despite the alarming headlines, some experts question the actual cost of politeness. Calculations based on ChatGPT's token pricing suggest that the cost of adding "please" and "thank you" to prompts amounts to fractions of a cent per exchange, potentially totaling around $146,000 annually, far less than the "tens of millions" mentioned by Altman [4].
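The "fractions of a cent per exchange" claim above is easy to reproduce with token arithmetic. A minimal sketch in which the token price, the token count for the courtesy words, and the daily volume of polite exchanges are all illustrative assumptions, not OpenAI's actual figures:

```python
# One combination of illustrative assumptions behind the token-cost claim.
PRICE_PER_TOKEN = 2.50 / 1_000_000   # assume ~$2.50 per million input tokens
COURTESY_TOKENS = 4                  # "please" + "thank you" ~ 4 tokens (assumed)
per_exchange = COURTESY_TOKENS * PRICE_PER_TOKEN  # dollars per polite exchange

polite_queries_per_day = 40_000_000  # assumed volume of polite exchanges
annual_cost = per_exchange * polite_queries_per_day * 365

print(f"per exchange: ${per_exchange:.6f}")  # a thousandth of a cent
print(f"annual: ${annual_cost:,.0f}")
```

Under these particular assumptions the annual total lands near the $146,000 figure cited above; the point is less the exact number than the gap of two orders of magnitude from "tens of millions."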
Despite the costs, many argue for maintaining politeness in AI interactions. Kurt Beavers, a director on the design team for Microsoft Copilot, says that "using polite language sets a tone for the response," potentially improving the quality of AI outputs [1].
Experiments with polite and impolite prompts reveal that courteous interactions often lead to more detailed and personalized responses from ChatGPT [5]. For instance, polite queries about travel advice yielded responses akin to consulting a local friend, while neutral prompts resulted in more generic, search-engine-like answers.
The debate extends beyond mere efficiency to ethical considerations. Some parents worry that the high tolerance for rudeness in AI assistants like Alexa might instill poor behavior in children [2]. Others argue for maintaining a clear distinction between human and AI interactions to prevent anthropomorphization of machines.
As AI continues to integrate into daily life, the nuances of human-AI interaction are likely to become increasingly important. While the exact cost of politeness remains debatable, the discussion highlights the complex interplay between user behavior, AI performance, and broader societal and environmental impacts in the rapidly evolving field of artificial intelligence.
© 2025 TheOutpost.AI All rights reserved