OpenAI, the creator of ChatGPT, has been on a tear since publicly releasing the generative AI tool last year.
The company's value has skyrocketed amid the hype cycle around all things AI.
CEO Sam Altman is reportedly going to the investment well again, this time at an estimated corporate valuation of $80 billion.
This would make the company the third-largest unicorn (a startup valued at over $1 billion) in history, behind the Chinese company ByteDance and Elon Musk's SpaceX.
While all things generative AI are hot, they are also extremely expensive to create.
Last year, OpenAI posted losses in excess of half a billion dollars, largely due to the computing power needed to train the model that underlies ChatGPT.
Microsoft, OpenAI’s biggest data center partner, pays upwards of $800 million just for the power to run its data centers.
With thousands of computers powered by Nvidia’s very expensive AI chips, the only way that OpenAI could afford to build out the infrastructure needed to grow its models was to accept funding from one of the big tech companies.
Microsoft bought in with $13 billion for a 49% stake in the company, and in return OpenAI got the use of Microsoft's AI data centers.
Microsoft reportedly has 150 such data centers already, with 80 to 100 more in the works.
The difference between last year's offering, ChatGPT 3.5, and this year's version 4.0 points to the problem.
ChatGPT 3.5 was built on a model with 175 billion parameters and delivers impressive answers to questions and prompts, yet it already seems antiquated next to the newer version.
The 4.0 version released this year expanded the parameter count roughly tenfold, to over 1 trillion parameters, resulting in a much wider and deeper ability to respond.
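To give a rough sense of what that jump means in hardware terms, here is a back-of-envelope sketch. It assumes 16-bit weights at 2 bytes per parameter, which is an illustrative assumption rather than anything OpenAI has published, and it counts only the raw weights; real training also needs memory for gradients, optimizer state, and activations, multiplying the footprint several times over.

```python
# Back-of-envelope sketch: memory just to hold the model weights.
# Assumption (not a published figure): 16-bit weights, i.e. 2 bytes per parameter.
# Training also requires gradients, optimizer state, and activations,
# which multiply this footprint several times over.

def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate gigabytes needed to store the raw weights."""
    return num_params * bytes_per_param / 1e9

models = {
    "ChatGPT 3.5 (175 billion parameters)": 175e9,
    "Version 4.0 (reportedly over 1 trillion)": 1e12,
}

for name, params in models.items():
    print(f"{name}: ~{weight_memory_gb(params):,.0f} GB of weights alone")
```

Even under these generous assumptions, the newer model's weights alone would not fit on any single machine, which is why the build-out requires thousands of Nvidia-equipped servers spread across entire data centers.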
To reach the goal of general AI, it is widely guessed that OpenAI will have to increase that parameter count exponentially, several times over.
That kind of exponential growth can only happen with a lot more investment dollars.