OpenAI: Consumer & Cloud
Sam Altman, partway into his understated keynote at the first-ever OpenAI DevDay:
Even though this is a developer conference, we can’t resist making some improvements to ChatGPT.
Before this moment, I only fuzzily understood that OpenAI operates in two separate but related markets. They have:
- A consumer product called ChatGPT. This includes mobile apps, websites, secondary models like DALL·E 3, and integrated tools like the web browser and code interpreter. ChatGPT has 100M weekly active users; the paid Plus tier costs $20/month. Despite Monday being “DevDay”, OpenAI launched major new consumer-facing features. Most notably, it now allows anyone to build custom “GPTs”.
- A growing suite of cloud services. Before OpenAI DevDay, I might have referred to this as the “OpenAI API”: a thin programmatic interface to OpenAI’s foundation models. That framing now seems too simplistic, as the API quickly evolves into multiple value-add (and margin-add!) services like the Assistants API (see the sketch after this list). This is similar to how AWS adds value (and margin) on top of its few core services. OpenAI’s cloud, with 2M registered developers, undoubtedly generates significant revenue.
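To make the distinction between “a thin interface” and a value-add service concrete, here is a minimal sketch using the post-DevDay Python SDK (openai>=1.0). The model name and the beta Assistants endpoints reflect what was announced at DevDay; treat the specifics as illustrative, not canonical.

```python
import time
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The old "OpenAI API" view: one stateless call to a foundation model.
completion = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[{"role": "user", "content": "Summarize DevDay in one sentence."}],
)
print(completion.choices[0].message.content)

# The value-add view: the beta Assistants API keeps conversation state
# (threads) and runs tools like the code interpreter on OpenAI's side.
assistant = client.beta.assistants.create(
    name="Data helper",
    instructions="Answer questions by writing and running Python.",
    model="gpt-4-1106-preview",
    tools=[{"type": "code_interpreter"}],
)
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="What is 2**32?"
)
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)

# Runs are asynchronous: poll until the server-side tool loop finishes.
while run.status in ("queued", "in_progress"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

for message in client.beta.threads.messages.list(thread_id=thread.id):
    print(message.role, message.content[0].text.value)
```

Everything in the second half (state, polling, server-side tool execution) is plumbing developers previously built themselves, and that is exactly where the margin-add comes in.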
Finding oneself in two very different product and market segments — suddenly and at scale — is no small feat! Yes, these markets utilize the same underlying technologies. However, their pricing models, sales strategies, and value propositions are quite different and are likely to diverge over time. This isn’t exactly unheard of in the tech industry, but it’s a significant undertaking for a company that’s been around for less than a decade.
Sam Altman, with charming understatement:
About a year ago […] we shipped ChatGPT as a “low-key research preview”… and that went pretty well.
Very few consumer services reach ChatGPT’s scale. In under a year, OpenAI accidentally defined a new consumer product category and became its heavyweight. I don’t think we need to look any further to find the future of consumer chatbots. Especially with the introduction of custom GPTs, it’s hard to see much room for creating differentiated consumer-facing chat services. Instead, developers will need to bring their unique data assets and external compute capabilities to the ChatGPT interface. The Canva demo is a compelling example: Canva is a decacorn, but the expectation is that users will still start in ChatGPT.
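For a sense of what “bringing external compute to ChatGPT” looks like mechanically: custom GPTs can call out to third-party HTTP APIs described by an OpenAPI schema (“Actions”). The endpoint below is entirely hypothetical, a sketch of the kind of service a Canva-like vendor might expose; the route, fields, and URLs are made up for illustration.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class DesignRequest(BaseModel):
    prompt: str            # e.g. "a poster for a jazz night"
    format: str = "poster"

@app.post("/designs")
def create_design(req: DesignRequest) -> dict:
    """Hypothetical Action endpoint: ChatGPT forwards the user's intent here,
    and the vendor's own compute and data produce the result."""
    # ... proprietary design generation happens on the vendor's side ...
    return {
        "preview_url": "https://example.com/designs/123/preview.png",
        "edit_url": "https://example.com/designs/123/edit",
    }
```

The conversation starts in ChatGPT; the differentiated work happens behind the vendor’s endpoint.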
As for the growth of a new kind of cloud, focused not on virtual compute and object storage but instead on foundation models, we should expect to see many more value-add services in the future. The logic behind this year’s launches seems straightforward: OpenAI simply observed the emerging architectural patterns behind LLM applications and implemented versions that are better and easier to use, readily available from the same vendor that provides the LLM itself. Like AWS, OpenAI is the heavyweight in this emerging cloud segment. While there will always be room for smaller players along competitive axes like model customization and privacy, and I hope we see plenty of innovation there, the default place to start building services that need LLMs will probably be OpenAI for the foreseeable future.
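As one example of a pattern being absorbed: a hand-rolled retrieval-augmented generation loop looks roughly like the sketch below. The document set, similarity metric, and prompt template are my own simplified choices, not anything OpenAI prescribes; the point is that the Assistants API’s retrieval tool collapses most of these steps into a file upload.

```python
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts: list[str]) -> np.ndarray:
    """Embed a batch of texts with an OpenAI embedding model."""
    resp = client.embeddings.create(model="text-embedding-ada-002", input=texts)
    return np.array([item.embedding for item in resp.data])

# 1. Index: embed your private documents once, up front.
docs = [
    "Q3 revenue grew 40% year over year.",
    "The 2024 roadmap prioritizes mobile and enterprise SSO.",
]
doc_vectors = embed(docs)

# 2. Retrieve: embed the query, rank documents by cosine similarity.
query = "How did revenue do last quarter?"
q = embed([query])[0]
scores = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
context = "\n".join(docs[i] for i in np.argsort(scores)[-2:])

# 3. Generate: stuff the retrieved context into the prompt.
answer = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[
        {"role": "system", "content": f"Answer using only this context:\n{context}"},
        {"role": "user", "content": query},
    ],
)
print(answer.choices[0].message.content)
```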