The AI Market is Firming Up Fast
There are three major requirements for building a foundational AI model, each of which is becoming harder to meet by the day:
1. Access to compute is getting harder.
It’s a challenge to acquire the best NVIDIA cards to build your model. You’re competing with the budgets of OpenAI, Anthropic, Amazon Web Services (AWS), Microsoft Azure, Google, Meta, and more.
Further, those parties make up an ouroboros of funding, where invested funds quickly come back to investors in the form of cloud compute fees. This allows Amazon, Microsoft, and Google to stomach sky-high valuations for these rounds, as they’re essentially buying market share for their cloud offerings. This dynamic is a major barrier standing between you and the latest GPUs.
2. Access to data is getting harder.
Sure, there are great open datasets from the Common Crawl Foundation and others. But highly valuable and differentiating sources like Reddit, Twitter, and many major publishers were spurred by ChatGPT’s release to revise their terms of service and data agreements. Further, lawsuits by creators against major model makers will hang over your future until they’re settled.
3. Security, privacy, and trust requirements are growing.
This week’s Executive Order requires builders of sufficiently large models to perform red-team testing and other risk assessments prior to their release. Unlike the rest of the order, this requirement falls under the authority of the Defense Production Act, giving it much firmer legal standing. The other elements of the Executive Order might not survive court challenges, but the guidelines they raise will be hard to fully adhere to without significant investment. (And as we’ve said before: privacy and AI is incredibly hard and potentially not compatible with existing legislation.)
The AI market is still young, but the above requirements suggest we might see the leaders of this field locked in earlier than usual. Where it took years for winners to be crowned in the social, search, and mobile markets, the AI market might be locked up in a matter of months.
Here’s hoping for a breakthrough in the training of models – allowing them to be built with less compute and less data – enabling more challengers to enter and create a more dynamic field during these formative years.