Elon Musk - "In 36 months, the cheapest place to put AI will be space"
Podcast · 2 hr 49 min
Note: AI-generated summary based on third-party content. Not financial advice.
Quick Insights

  • Memory is identified as the biggest bottleneck for scaling AI, with prices reportedly "going ballistic," making memory chip makers like Micron (MU) worth considering.
  • The entire semiconductor sector, including foundries like TSMC (TSM), is experiencing demand that far outstrips supply, creating a bullish outlook.
  • A major theme for the next few years will be the electricity shortage facing AI data centers, a systemic risk to hardware growth.
  • This power constraint creates a long-term opportunity in the energy sector, particularly within the gas turbine supply chain, which is reportedly backlogged through 2030.
  • While demand for NVIDIA (NVDA) chips is strong, this electricity bottleneck could impose a growth ceiling on the company toward the end of this year.

Detailed Analysis

AI in Space (Investment Theme)

  • Elon Musk predicts that within 30 to 36 months, space will become the most economically compelling place to put AI data centers.
  • The rationale is based on two key factors:
    • Energy Efficiency: Solar panels in space are about 5 times more effective than on the ground because there is no day/night cycle, no clouds, and no atmospheric energy loss.
    • Cost Savings: This 24/7 solar power eliminates the need for massive, expensive battery systems that terrestrial solar farms require to provide power through the night.
  • This approach also bypasses the significant regulatory, permitting, and land-use challenges associated with building massive power plants and data centers on Earth.
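The "about 5 times more effective" claim can be sanity-checked with a rough back-of-envelope calculation. The irradiance and capacity-factor figures below are common public estimates (solar constant above the atmosphere, typical utility-scale ground capacity factor), not numbers from the episode:

```python
# Back-of-envelope comparison of average solar power per square meter
# in orbit versus on the ground. All inputs are rough public figures.

SOLAR_CONSTANT = 1361        # W/m^2, irradiance above the atmosphere
GROUND_PEAK = 1000           # W/m^2, typical clear-sky peak at the surface

SPACE_CAPACITY_FACTOR = 0.99   # near-continuous sunlight in a suitable orbit
GROUND_CAPACITY_FACTOR = 0.22  # typical utility-scale solar (night, clouds, sun angle)

space_avg = SOLAR_CONSTANT * SPACE_CAPACITY_FACTOR    # average W/m^2 in orbit
ground_avg = GROUND_PEAK * GROUND_CAPACITY_FACTOR     # average W/m^2 on the ground

ratio = space_avg / ground_avg
print(f"Average power per m^2: space ≈ {space_avg:.0f} W, ground ≈ {ground_avg:.0f} W")
print(f"Ratio ≈ {ratio:.1f}x")
```

With these assumed inputs the ratio comes out near 6x, in the same ballpark as the "about 5x" figure cited in the episode; the exact multiple depends heavily on the ground site and panel setup assumed.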

Takeaways

  • This is a long-term, revolutionary investment theme that could reshape the data center and energy industries.
  • It points to a future with massive, sustained demand for:
    • Launch Services: To get the data centers into orbit.
    • Space-Grade Hardware: Including solar panels and radiation-tolerant computer chips.
  • The primary company positioned to enable this vision is SpaceX, which would need to achieve an unprecedented launch cadence.

SpaceX (Private)

  • SpaceX is positioned as the critical enabler for the "AI in Space" vision, with plans to scale its Starship rocket to conduct 10,000 or more launches per year.
  • This would transform SpaceX from a launch company into a dominant AI hyperscaler, similar to Amazon's AWS or Microsoft's Azure, but with infrastructure in orbit.
  • Musk strongly hinted that SpaceX might go public to raise the enormous amount of capital needed for this build-out, noting that the public markets have "at least 100 times more capital" than private markets. He chose his words carefully, noting that hyping a company before it goes public can cause delays.

Takeaways

  • An IPO for SpaceX could be a major future event for investors to watch for, driven by the immense capital needs of building an orbital data center network.
  • The company's long-term total addressable market could expand from launch services to the multi-trillion-dollar cloud and AI computing market.
  • This represents a highly bullish, albeit long-term and capital-intensive, vision for the company's future.

Tesla (TSLA)

  • The discussion framed Tesla as a diversified technology company with major initiatives in AI, robotics, and energy.
  • Robotics (Optimus):
    • Musk referred to the humanoid robot as an "infinite money glitch," envisioning a future where robots can build more robots, leading to a "supernova" of economic growth.
    • He stated that the current version, Optimus 3, is the design intended for mass manufacturing, with a target of around 1 million units per year.
    • The AI for the robot will leverage the same principles and technology developed for Tesla's Full Self-Driving, applying it to real-world manipulation tasks.
  • Competition Risk:
    • Musk directly addressed the threat from Chinese competitors, stating that in the absence of breakthrough US innovations like robotics, "China will utterly dominate" manufacturing. He noted the "massive flood of Chinese vehicles" as a sign of their competitiveness.
  • In-House AI Chips:
    • Tesla is developing its own AI chips (AI5 and AI6) for its cars and robots, a key vertical integration strategy to reduce reliance on third-party suppliers.

Takeaways

  • For investors, the key insight is to evaluate Tesla as an AI and robotics company, not just an automotive manufacturer. The Optimus robot represents a potential future growth driver that could be significantly larger than the car business.
  • The intense and growing competition from Chinese EV makers like BYD is a significant and explicitly stated risk factor for Tesla's core automotive segment.
  • Tesla's vertical integration into chip design is a crucial strategic advantage to monitor.

xAI (Private)

  • Musk's AI company is focused on creating a "digital human emulator" capable of performing any task a human can do on a computer.
  • The "secret plan" to win in the competitive AI space is to apply the same methodology Tesla used for self-driving, which involves training on vast quantities of real-world action data.
  • The business model is to first target massive, accessible markets like customer service (estimated as a trillion-dollar industry) before moving up the difficulty curve to more complex digital tasks.

Takeaways

  • xAI's success is presented as being deeply linked to the AI progress and data engine developed at Tesla.
  • The company's strategy is to bypass complex integrations by creating an AI that can use existing software tools just like a human employee, allowing for rapid deployment and revenue generation.
  • The vision is that the revenue potential from digital labor is in the trillions of dollars, making current AI company revenues seem like "rounding errors" in comparison.

NVIDIA (NVDA)

  • NVIDIA is mentioned as the producer of the current state-of-the-art AI chips, like the GB300.
  • Musk's primary insight is that a major bottleneck is emerging. He predicts that towards the end of this year, the limiting factor for AI will shift from the supply of chips to the supply of electricity to power them.
  • He believes AI chips will soon be "piling up" because companies will not be able to secure enough power for their large data center clusters.

Takeaways

  • While long-term demand for NVIDIA's chips remains incredibly strong, a near-term ceiling could be imposed by the slow-moving utility and power generation industries.
  • This electricity shortage is a systemic risk that could affect the entire AI hardware sector's growth rate in the short to medium term.
  • The company that can solve the power problem fastest (which Musk argues is his own) will have a significant advantage.

Semiconductor Sector (TSMC, Samsung, ASML)

  • The demand for advanced chips is described as far outstripping supply. Musk stated that major foundries like TSMC (TSM) and Samsung are building new fabs "as fast as they can," but it's "still not fast enough."
  • He has offered to guarantee the purchase of their output to encourage faster expansion, but notes they are conservative due to historical boom-and-bust cycles in the industry.
  • Because of this, Musk is planning to build his own massive chip factory, a "TerraFab," to produce both logic and memory chips at a scale of "millions of wafers a month."

Takeaways

  • The discussion paints an extremely bullish picture for the entire semiconductor supply chain, from equipment makers like ASML to foundries like TSMC.
  • The demand for AI is so large that it is forcing new, large-scale players to consider entering the capital-intensive business of chip manufacturing.
  • The memory chip sub-sector was highlighted as a particularly acute bottleneck.

Memory Chips (e.g., Micron - MU)

  • Musk identified memory as his "biggest concern actually" when it comes to scaling AI compute.
  • He gave a strong bullish signal by noting that DDR prices are "going ballistic" and that there are popular internet memes about the desperate need for memory chips.

Takeaways

  • This is a direct and powerful endorsement of the investment thesis for memory manufacturers like Micron (MU), Samsung, and SK Hynix.
  • As AI models grow, the need for high-bandwidth memory (HBM) and other advanced memory solutions to support the logic chips will continue to grow exponentially.

Energy Sector (Solar & Gas Turbines)

  • The central theme is that electricity will be the primary constraint for AI growth on Earth over the next few years.
  • Gas Turbines: The ability to build new gas power plants is severely limited by a bottleneck in the supply of specialized turbine blades and vanes. The few companies that make these components are backlogged through 2030, constraining the delivery capacity of turbine manufacturers like GE and Siemens.
  • Solar Power: Terrestrial solar is a viable path, but its growth in the U.S. is being severely hindered by "gigantic" import tariffs and slow permitting processes. Musk views these tariffs as a major policy error preventing the U.S. from scaling up electricity production quickly.

Takeaways

  • The explosive growth of AI is creating unprecedented demand for electricity, which will be a major tailwind for the entire energy sector.
  • Companies in the gas turbine supply chain have significant pricing power due to extreme supply constraints.
  • The U.S. solar industry faces a conflict: tariffs protect domestic manufacturing but significantly slow down the overall deployment of cheap power needed for AI. Any change in this policy could have major impacts on solar stocks.
Episode Description
In this episode, John and I got to do a real deep-dive with Elon. We discuss the economics of orbital data centers, the difficulties of scaling power on Earth, what it would take to manufacture humanoids at high volume in America, xAI's business and alignment plans, DOGE, and much more. Watch on YouTube; read the transcript.

Sponsors

  • Mercury just started offering personal banking! I'm already banking with Mercury for business purposes, so getting to bank with them for my personal life makes everything so much simpler. Apply now at mercury.com/personal-banking
  • Jane Street sent me a new puzzle last week: they trained a neural net, shuffled all 96 layers, and asked me to put them back in order. I tried but… I didn't quite nail it. If you're curious, or if you think you can do better, you should take a stab at janestreet.com/dwarkesh
  • Labelbox can get you robotics and RL data at scale. Labelbox starts by helping you define your ideal data distribution, and then their massive Alignerr network collects frontier-grade data that you can use to train your models. Learn more at labelbox.com/dwarkesh

Timestamps

  00:00:00 - Orbital data centers
  00:36:46 - Grok and alignment
  00:59:56 - xAI's business plan
  01:17:21 - Optimus and humanoid manufacturing
  01:30:22 - Does China win by default?
  01:44:16 - Lessons from running SpaceX
  02:20:08 - DOGE
  02:38:28 - Terrafab

Get full access to Dwarkesh Podcast at www.dwarkesh.com/subscribe
About Dwarkesh Podcast

By Dwarkesh Patel

Deeply researched interviews. www.dwarkesh.com