
Investors should prioritize NVIDIA (NVDA) and AMD as the industry shifts from training models to "inference-time compute," where hardware optimized for real-time reasoning becomes the most valuable asset. The emergence of "chain of thought" reasoning suggests massive, underestimated demand for energy and data-center infrastructure to support AI that "thinks" constantly. Look for "AI-first" software companies that integrate reasoning-based models like OpenAI’s o1 to solve complex problems, as traditional SaaS providers face obsolescence from hyper-deflation. Expect a significant market disconnect in 2025: technological capabilities are projected to grow at an exponential 1,000x pace, while most investors anticipate only linear growth. Focus on companies capable of achieving massive scale by leveraging the 40x year-over-year drop in the cost of compute to embed AI into every product.
The discussion highlights a fundamental shift in how AI models are developed and scaled. The focus has moved from "training-time compute" (the power used to initially build the model) to "inference-time compute" (the power used when the AI is actually reasoning and generating an answer).
The shift toward inference-time compute and reasoning models changes the requirements for the hardware and energy sectors supporting AI.
The "1,000x increase in capability per unit price" makes previously impossible software applications suddenly viable and affordable.

By @peterdiamandis
Tracking the future of technology and how it impacts humanity. Named by Fortune as one of the “World's 50 Greatest Leaders,” ...