$10 billion, then $6.6 billion… OpenAI, the company behind ChatGPT, is raising funds at lightning speed, and its valuation now exceeds $150 billion. But it is never enough, and more cash injections will soon be needed. The race for generative AI is currently burning through enormous sums of money while bringing in far less revenue. And it is not about to get better. Here is why.

Very expensive AI

In a fascinating article, our colleagues at the New York Times explored this issue. According to OpenAI's own estimates, by 2029 no less than $37.5 billion in annual spending will be needed to train and run its various AI products, compared to $5.4 billion today.

The reasons for these exorbitant costs are not hard to find: these technologies do not fall from the sky. To run, language models need computing power supplied by gigantic data centers.

These facilities house staggering numbers of servers fitted with thousands of the latest chips, each of which can cost more than $30,000. And prices are pushed even higher by raging demand in this market, as the tech giants race to get a head start on the competition.

On top of this, these data centers consume enormous amounts of energy, which further inflates the bill and raises real sustainability concerns. The American daily also rightly notes that these chips spend months running the calculations that allow ChatGPT to identify patterns in its training data. As a result, the price of a single training run can reach hundreds of millions of dollars.
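To give a sense of how those figures compound, here is a minimal back-of-envelope sketch in Python. Every number in it (cluster size, training duration, hourly cost) is an illustrative assumption rather than a figure reported by OpenAI or the New York Times; only the roughly $30,000 per-chip price comes from the article above.

# Back-of-envelope sketch of why a single training run can reach
# hundreds of millions of dollars. All figures are illustrative
# assumptions, not reported numbers.
num_chips = 25_000      # assumed size of the training cluster
chip_price = 30_000     # dollars per chip, order of magnitude cited above
training_days = 90      # "months" of continuous computation
hourly_cost = 4.0       # assumed all-in cost per chip-hour (amortization + energy)

hardware_outlay = num_chips * chip_price
training_run_cost = num_chips * training_days * 24 * hourly_cost

print(f"Up-front hardware outlay: ${hardware_outlay / 1e6:.0f} million")
print(f"Cost of one training run: ${training_run_cost / 1e6:.0f} million")
# Roughly $750 million in hardware and about $216 million per run
# under these assumptions.

Different but equally plausible assumptions shift the total, yet the order of magnitude stays in the hundreds of millions, which is consistent with the estimates cited above.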

The worst is yet to come

It goes without saying that free users of OpenAI's AI tools cost the company a significant amount of money, since they are not yet monetized. The company says it is considering adding advertising, though it has not yet made a decision. And while it charges around 20 euros per month for its premium subscription, that is still not enough to make its massive investment profitable.

Things are unlikely to improve as OpenAI bets on models such as o1. Launched at the end of the year, it "thinks" before responding: it explores many possibilities before generating content that is supposedly more reliable than what it would produce without this "thinking" time. This feature requires even more computing power, which will further inflate the company's expenses.
