
For the most direct exposure to the AI boom, maintain a core position in NVIDIA (NVDA), as it remains the "gold standard" for data center infrastructure and anchors a dominant market ecosystem. Shift your focus from high-priced AI model providers like OpenAI to "Neo-Cloud" infrastructure plays like CoreWeave and Lambda, which are currently undervalued relative to their massive revenue growth. Keep a close watch for the upcoming Cerebras IPO and monitor Groq, as the market shifts from training models to the multi-hundred-billion-dollar "inference" sector. Consider private or secondary market exposure to Base 10, a "hidden gem" optimizing GPU efficiency that is already generating nearly $1 billion in revenue. For a long-term "picks and shovels" play, invest in a basket of small nuclear reactor companies to capitalize on the massive electricity demands of next-generation AI data centers.
• The speaker emphasizes that NVIDIA is the primary source of the world's AI chips and holds an extremely dominant market position.
• NVIDIA often sits on the "cap table" (as an investor) of the smaller AI infrastructure companies it supplies, creating a powerful ecosystem.
• Mentioned as the "gold standard"; data center providers are hesitant to buy competing chips (like Google's TPU) because they fear losing their allocation of NVIDIA GPUs.
• Pure Play Exposure: If you want the simplest exposure to the AI boom, identify the companies that are successfully securing NVIDIA chips at scale.
• The "Safe" Choice: In the enterprise world, "no one gets fired for buying NVIDIA." This suggests continued dominance in the near term despite emerging competitors.
• These are "Neo-Cloud" providers that specialize in GPU compute, often with direct relationships with NVIDIA.
• Lambda is specifically noted for its potential to deliver compute "locally" (physical server stacks for homes or offices) to bypass high cloud costs.
• These companies are currently valued in the $7 billion to $8 billion range, which the speaker believes is undervalued relative to their revenue growth.
• Infrastructure over Models: The speaker suggests focusing on these infrastructure providers rather than the famous AI model companies (like OpenAI).
• Valuation Gap: There is a perceived opportunity for these companies to grow from sub-$10 billion valuations to $40–$100 billion over the next 5–10 years.
• A private company that acts as a "meta-layer" for AI compute.
• It uses an API to route AI tasks to various data centers based on cost and efficiency (utilizing idle GPUs).
• Reported to be doing nearly $1 billion in revenue with a secondary market valuation of approximately $4.7 billion.
• Efficiency Play: Because GPU utilization is a "hard problem" (currently only ~40% efficient in some labs), companies that optimize this usage are highly valuable.
• Hidden Gem: The speaker highlights this as a prime example of an "incredible risk-adjusted return" because it is generating massive revenue but remains largely unknown to the general public.
• Discussed as the leading "model providers."
• The speaker notes a shift in the market: companies may "pick a partner" for specific workflows (e.g., using Claude for coding and ChatGPT for other tasks).
• Risk factor: If a model provider runs out of compute, it loses revenue "in perpetuity," as customers quickly switch to competitors like xAI or ChatGPT.
• Wait for the IPO: The speaker advises against buying these in private secondary markets. Because they are already valued so high, the "upside" might only be 30%.
• Public Market Strategy: Recommendation is to wait and buy these in a standard brokerage account (like Schwab or Fidelity) once they go public, rather than dealing with the fees and risks of private equity.
• Cerebras: Mentioned as a semiconductor company for "inference" (running AI models) that is expected to IPO very soon.
• Groq: Highlighted as a key player in the inference chip space.
• Positron: A company the speaker is watching closely for its work in inference compute.
• Apple (AAPL): Noted as a hardware play; early adopters are already buying Mac Minis and Mac Studios to run AI locally rather than paying for cloud subscriptions.
• The Shift to Inference: While training models was the first phase, "inference" (the actual daily use of the models) will be a massive, multi-hundred-billion-dollar sector.
• Local Compute: There is a growing trend of "in-housing" compute. Investors should look for companies enabling users to run AI on their own hardware to avoid "compute inflation."
• Compute is described as a finite, highly valuable resource. Unlike oil, the world is currently "running out" of available compute, creating a massive supply-demand imbalance.
• Action: Look for "AI infrastructure" as a primary investment theme.
• Companies like Coinbase and Block are reducing headcount due to AI efficiencies.
• Insight: The capital previously spent on human salaries is being redirected into "compute" budgets. This is a fundamental shift in corporate spending that favors infrastructure providers.
• AI data centers require massive amounts of electricity.
• Insight: For investors with "guts and patience," small nuclear fission companies are a long-term play to power the AI revolution.
• The speaker believes we are seeing the formation of a new "Magnificent Seven" group of companies centered around AI infrastructure.
• Strategy: Instead of trying to pick one winner, buy a "basket" of these infrastructure and semiconductor companies.

By @aaronrosspreipo