ChatGPT may cost up to $700,000 per day to operate

  • ChatGPT’s operational expenses exceed its training costs 
  • Microsoft’s own chips could reduce the cost of running generative AI models 

OpenAI's chatbot, ChatGPT, could cost up to $700,000 a day to operate because of its expensive tech infrastructure, according to Dylan Patel, chief analyst at the semiconductor research firm SemiAnalysis. Patel attributed the high cost primarily to the expensive servers required to run the AI. While training ChatGPT's large language models already costs tens of millions of dollars, operational expenses far surpass training costs once the model is deployed at scale, said Patel and fellow SemiAnalysis researcher Afzal Ahmad. In fact, the chatbot's inference costs exceed its entire training expenses on a weekly basis, they added.
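To put the figure in perspective, a rough back-of-envelope calculation can translate the daily operating cost into a per-query cost. The $700,000/day number is the SemiAnalysis estimate cited above; the daily query volume below is a purely hypothetical assumption for illustration, not a figure from the article.

```python
# Back-of-envelope sketch: per-query inference cost and annualized spend.
# DAILY_OPERATING_COST_USD comes from the SemiAnalysis estimate; the
# query volume is an assumed, illustrative number only.

DAILY_OPERATING_COST_USD = 700_000       # SemiAnalysis estimate
ASSUMED_QUERIES_PER_DAY = 10_000_000     # hypothetical traffic figure

cost_per_query = DAILY_OPERATING_COST_USD / ASSUMED_QUERIES_PER_DAY
annualized_cost = DAILY_OPERATING_COST_USD * 365

print(f"Cost per query: ${cost_per_query:.2f}")           # $0.07 under these assumptions
print(f"Annualized operating cost: ${annualized_cost:,}")  # $255,500,000
```

Even under a generous traffic assumption, each answered query costs a few cents, which is why inference spending at this scale quickly dwarfs the one-time training bill.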

Companies using OpenAI's language models, such as Latitude, the developer of the game AI Dungeon, have been paying steep prices for years. Latitude CEO Nick Walton revealed that running the model cost the company $200,000 a month in 2021, on top of payments for Amazon Web Services servers, as it answered millions of user queries. As a result, Walton switched to language software backed by AI21 Labs, which cut his company's AI expenses in half, to $100,000 a month.

Aiming to decrease the cost of running generative AI models, Microsoft has reportedly been working on an AI chip called Athena since 2019. The chip could be introduced for internal use by Microsoft and OpenAI as early as next year, according to sources familiar with the matter. Microsoft acknowledged that it was falling behind Google and Amazon in efforts to build its own in-house chips and was seeking cheaper alternatives to the Nvidia chips its AI models currently run on. More than 300 Microsoft employees are now reportedly working on the chip.

In summary, OpenAI’s ChatGPT is among the most powerful AI tools available today, and its popularity keeps gaining momentum as it helps millions of people find answers to their questions. It is no wonder that it requires expensive servers to run. However, switching to Microsoft’s own chips carries significant risk, given the company’s acknowledged lag behind its competitors in chip development.

You can read about Apple’s M-series chips that give Mac devices more capabilities.

Besides, we wrote about Musk’s intention to build Truth ChatGPT amid fears that AI might annihilate humans.

Also, in our previous news, we mentioned Qualcomm, which presented new solutions to boost the IoT ecosystem.

Nataliia Huivan
Professional author in IT Industry

Author of articles and news for Atlasiko Inc. I do my best to create qualified and useful content that helps our website visitors understand more about software development and modern IT trends and practices. Constant innovation in the IT field and communication with top specialists inspire me to seek knowledge and share it with others.
