The Cap Table — Pre IPO Podcast — Episode 5
YouTube · 26 min 24 sec
Note: AI-generated summary based on third-party content. Not financial advice.
Quick Insights

• For the most direct exposure to the AI boom, maintain a core position in NVIDIA (NVDA), as it remains the "gold standard" for data center infrastructure and holds a dominant market ecosystem.
• Shift your focus from high-priced AI model providers like OpenAI to "Neo-Cloud" infrastructure plays like CoreWeave and Lambda, which are currently undervalued relative to their massive revenue growth.
• Keep a close watch for the upcoming Cerebras IPO and monitor Groq, as the market shifts from training models to the multi-billion-dollar "inference" sector.
• Consider private or secondary market exposure to Base 10, a "hidden gem" optimizing GPU efficiency that is already generating nearly $1 billion in revenue.
• For a long-term "picks and shovels" play, invest in a basket of small nuclear reactor companies to capitalize on the massive electricity demands of next-generation AI data centers.

Detailed Analysis

NVIDIA (NVDA)

• The speaker emphasizes that NVIDIA is the primary source of the world's AI chips and holds an extremely dominant market position.
• NVIDIA often sits on the "cap table" (as an investor) of the smaller AI infrastructure companies it supplies, creating a powerful ecosystem.
• Mentioned as the "gold standard"; data center providers are hesitant to buy competing chips (like Google's TPU) because they fear losing their allocation of NVIDIA GPUs.

Takeaways

• Pure Play Exposure: If you want the simplest exposure to the AI boom, identify the companies that are successfully securing NVIDIA chips at scale.
• The "Safe" Choice: In the enterprise world, "no one gets fired for buying NVIDIA." This suggests continued dominance in the near term despite emerging competitors.


Lambda & CoreWeave

• These are "Neo-Cloud" providers that specialize in GPU compute, often with direct relationships with NVIDIA.
• Lambda is specifically noted for its potential to deliver compute "locally" (physical server stacks for homes or offices) to bypass high cloud costs.
• These companies are currently valued in the $7 billion to $8 billion range, which the speaker believes is undervalued compared to their revenue growth.

Takeaways

• Infrastructure over Models: The speaker suggests focusing on these infrastructure providers rather than the famous AI model companies (like OpenAI).
• Valuation Gap: There is a perceived opportunity for these companies to grow from sub-$10 billion valuations to $40–$100 billion over the next 5–10 years.


Base 10

• A private company that acts as a "meta-layer" for AI compute.
• It uses an API to route AI tasks to various data centers based on cost and efficiency (utilizing idle GPUs).
• Reported to be doing nearly $1 billion in revenue with a secondary market valuation of approximately $4.7 billion.

Takeaways

• Efficiency Play: GPU utilization is a "hard problem" (currently only ~40% efficient in some labs), so companies that optimize this usage are highly valuable.
• Hidden Gem: The speaker highlights this as a prime example of an "incredible risk-adjusted return" because it generates massive revenue yet remains largely unknown to the general public.


OpenAI & Anthropic (Claude)

• Discussed as the leading "model providers."
• The speaker notes a shift in the market: companies may "pick a partner" for specific workflows (e.g., using Claude for coding and ChatGPT for other tasks).
• Risk factor: If a model provider runs out of compute, it loses revenue "in perpetuity" as customers quickly switch to competitors like xAI or ChatGPT.

Takeaways

• Wait for the IPO: The speaker advises against buying these in private secondary markets. Because these companies are already valued so highly, the remaining "upside" might only be 30%.
• Public Market Strategy: The recommendation is to wait and buy these in a standard brokerage account (like Schwab or Fidelity) once they go public, rather than dealing with the fees and risks of private equity.


Emerging Semiconductor & Hardware Opportunities

• Cerebras: Mentioned as a semiconductor company for "inference" (running AI models) that is expected to IPO very soon.
• Groq: Highlighted as a key player in the inference chip space.
• Positron: A company the speaker is watching closely for its work in inference compute.
• Apple (AAPL): Noted as a hardware play; early adopters are already buying Mac Minis and Mac Studios to run AI locally rather than paying for cloud subscriptions.

Takeaways

• The Shift to Inference: While training models was the first phase, "inference" (the actual daily use of the models) will be a massive, multi-hundred-billion-dollar sector.
• Local Compute: There is a growing trend of "in-housing" compute. Investors should look for companies enabling users to run AI on their own hardware to avoid "compute inflation."


Investment Themes & Sector Insights

Compute as the "Next Oil"

• Compute is described as a finite, highly valuable resource. As with oil, we are currently "running out" of available compute, creating a massive supply-demand imbalance.
• Action: Look to "AI Infrastructure" as a primary investment theme.

Human Calories to Computer Tokens

• Companies like Coinbase and Block are reducing headcount due to AI efficiencies.
• Insight: The capital previously spent on human salaries is being redirected into "compute" budgets. This is a fundamental shift in corporate spending that favors infrastructure providers.

Small Nuclear Reactors (Fission)

• AI data centers require massive amounts of electricity.
• Insight: For investors with "guts and patience," small nuclear fission companies are a long-term play to power the AI revolution.

The "Next Mag 7"

• The speaker believes we are seeing the formation of a new "Magnificent Seven" group of companies centered around AI infrastructure.
• Strategy: Instead of trying to pick one winner, buy a "basket" of these infrastructure and semiconductor companies.

Video Description
On this episode, Aaron Dillon and Aaron Ross break down: OpenAI may have missed its revenue target, but it is still leading in compute capacity. Anthropic is still bottlenecked on compute. Could compute capacity be the key advantage that drives its next phase of revenue growth? xAI's Colossus is currently using only 11% of its 550,000 NVIDIA GPUs (just ~60k active), compared to Meta and Google (hitting 43–46%). Is this SpaceX's secret revenue weapon? Is compute the new oil? Which companies is Aaron Dillon excited about?
About Aaron Ross
By @aaronrosspreipo