S32’s Wesley Tillu: Investing in the Age of AI Acceleration
AI adoption is accelerating within enterprises, creating a growing opportunity for application software companies....
At Lux, my investment thesis centers on the computational sciences: companies leveraging disruptive new technologies to power enterprise infrastructure. Specifically, I’m excited about AI/ML/data infrastructure and applications, especially in financial services and healthcare, as well as blockchain and network infrastructure.
There are a lot of exciting ‘why nows’ that have been compounding not just for the past few months but for several years, making 2022 an inflection point for the industry.
Computers are powerful enough to run large models, and there has been an explosion of training data, both organically available and machine generated. Processing power, specifically GPUs developed by NVIDIA, has improved meaningfully. We’ve also seen compounding advances in neural networks, starting with Google’s Transformer architecture paper in 2017, and a growing maturation of the MLOps stack, including companies like Hugging Face.
It’s now possible to build higher-quality models that are both more parallelizable, requiring significantly less time to train, and more easily customized to specific domains. Costs to train image-classification systems have decreased, and large language models like GPT-3 have improved and been democratized, becoming even more ‘human-like’ on complex language tasks, while generative models like Stable Diffusion have done the same for images.
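As a rough illustration of what “customized to specific domains” looks like in practice, here is a minimal sketch of fine-tuning a pretrained Hugging Face Transformer on a labeled dataset. The checkpoint, dataset, and hyperparameters are illustrative assumptions, not details from the interview.

```python
# Minimal sketch: adapting a pretrained Transformer to a domain-specific
# classification task with the Hugging Face libraries.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2)

# Any labeled, domain-specific corpus would slot in here; IMDB is just a
# readily available stand-in.
dataset = load_dataset("imdb")
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length"),
    batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
)
trainer.train()
```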
Q: What is the time horizon for adoption of the trends that underline this thesis?
A: “The time horizon is now. We are already seeing massive Fortune 500 companies implementing ML within their organizations, along with increasing sophistication in the ML tooling, compute, and storage offered by AWS, GCP, and Azure.
While it will likely take a few years for large language models to deeply penetrate every possible use case, within the next year or two I predict we will see wide-scale large language model adoption, and within three to five years I expect to see the emergence of a highly sophisticated AI/ML infrastructure and compute stack that any organization can easily adopt at scale.”
Q: What is something that other market participants often misunderstand about this category?
A: "One misunderstanding is that there are multiple buyers and revenue streams for generative AI applications. The market is still early and there aren’t massive budgets earmarked for generated AI plays yet, with many enterprises still figuring out how to leverage and deploy the technology internally."
We will see players at the application layer create products that leverage existing, larger models and build out derivative products via specialized datasets. Few players within the ecosystem will need to train language models from the ground up themselves. “In reality, only bespoke use cases will train models from scratch. Instead, many companies will just retrain on a much smaller proportion of data for their use cases. LLMs can be pruned smaller, which makes them less cumbersome to deploy, while still working about 90% as well at a smaller scale. Also, there are a lot of really exciting AI and ML use cases that don’t need to leverage large language models or unsupervised learning, especially in financial services and healthcare, that are already in production today,” says Isford.
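To make the “pruned smaller” point concrete, here is a minimal sketch of magnitude-pruning a Transformer’s linear layers with PyTorch’s built-in pruning utilities. The model choice and the 30% sparsity level are assumptions for illustration, not figures from the interview.

```python
# Minimal sketch: L1 (magnitude) pruning of every linear layer in a
# pretrained Transformer using torch.nn.utils.prune.
import torch
import torch.nn.utils.prune as prune
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2)

# Zero out the 30% smallest-magnitude weights in each linear layer.
for module in model.modules():
    if isinstance(module, torch.nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # make the pruning permanent

# The architecture is unchanged, but the zeroed weights can be exploited
# by sparsity-aware runtimes to cut memory and latency at deployment.
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"Overall weight sparsity: {zeros / total:.1%}")
```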
Carli Stein, Investor at M12 focused on AI startups across the stack, explores how smaller language models, robust security tools, and improved testing methodologies will enable broader AI adoption....
Vertical AI SaaS presents a prime opportunity for startups, fueled by increasing customer dissatisfaction with legacy tools. Karthik Ramakrishnan, Partner at IVP, assesses opportunities for startups to disrupt traditional workflows that offer tremendous market potential....