
• Investors should closely monitor upcoming IPO news from leading AI companies OpenAI and Anthropic, which could produce some of the largest public offerings in history.
• Google's (GOOGL) new "personal intelligence" feature in Gemini creates a powerful competitive moat, strengthening the long-term investment case for the stock.
• Microsoft's (MSFT) partnership with Anthropic is a prudent move that reduces its reliance on a single partner, reinforcing its position as a core AI holding.
• The massive Cerebras deal highlights soaring demand for specialized AI inference chips, a key growth area within the semiconductor sector beyond Nvidia (NVDA).
• Be prepared for significant post-IPO volatility in new AI stocks, which could present buying opportunities for patient investors.
• The podcast highlights a New York Times report suggesting that leading AI companies OpenAI and Anthropic, as well as SpaceX, are taking early steps toward going public.
• These could be some of the largest IPOs in history, with valuations that could produce "multiple trillion-dollar IPOs" this year.
• For most retail investors, this would be the first opportunity to invest directly in pure-play companies developing foundational AI models, as opposed to large public tech companies that have AI divisions.
• A historical comparison was made to the Facebook (META) IPO in 2012, which was initially considered a "fiasco" as the stock fell significantly before recovering and becoming a massive success. It serves as a reminder of potential post-IPO volatility.
• Monitor IPO News: Investors interested in direct AI exposure should watch closely for official announcements from OpenAI and Anthropic.
• Expect Hype and Volatility: Given the intense interest in AI, these IPOs will likely be surrounded by significant hype. Be prepared for price volatility in the months following the public offering, as seen with other major tech IPOs.
• A New Investment Category: The arrival of these companies on the public market will establish a new, distinct "pure-play AI" investment category that will likely dominate financial discussions.
• Google has introduced a major new feature for its AI called "personal intelligence" in the Gemini app.
• The feature connects Gemini to a user's personal data across Google's ecosystem, including Gmail, Google Photos, YouTube, and Search.
• The podcast describes this as Google's "AI moat nobody can replicate," since it leverages over a decade of a user's digital life to provide highly personalized AI assistance.
• Examples include Gemini reading your emails and photos to identify your car's make and model and recommend new tires, or using travel dates from Gmail to offer personalized trip suggestions.
• Bullish on Competitive Advantage: This deep integration of personal data is a significant competitive advantage for Google in the consumer AI race. If personalization becomes the key driver of user adoption, Google is positioned extremely well.
• Potential for Increased Engagement: By making Gemini more practically useful day to day, Google could see a significant increase in user engagement and solidify Gemini as a daily-driver AI for many consumers. This strengthens the long-term investment case for Google's AI strategy.
• Microsoft has been quietly deepening its partnership with Anthropic, one of OpenAI's main competitors.
• Microsoft now uses Anthropic's models (such as Claude Sonnet 4.5 and Claude Opus 4.1) to power features in GitHub Copilot and its broader Copilot productivity suite.
• The report says Microsoft's spending with Anthropic started at over $40 million per month, indicating a significant financial and strategic commitment.
• The move represents a diversification away from what had appeared to be an exclusive partnership with OpenAI.
• Strategic Diversification is a Plus: Microsoft is not putting all its eggs in the OpenAI basket. By integrating Anthropic's models, it can use the best tool for each job (e.g., coding, complex Excel tasks), reducing reliance on a single partner and strengthening its product offerings.
• Reduced Risk: This multi-supplier strategy de-risks Microsoft's overall AI ambitions. For investors, it shows prudent management and a commitment to maintaining a competitive edge by leveraging the entire AI ecosystem, not just one part of it.
• Apple is positioned as a key player in the "battle for personal context" thanks to its vast and unique data sources.
• Apple has access to data that Google does not, most notably a user's iMessages, which can contain gigabytes of valuable personal context.
• Ownership of the hardware ecosystem, especially devices like AirPods, gives Apple a potential entry point for AI to interact with the physical world, a type of context others lack.
• However, the podcast notes that Google's Gemini is now set to power Apple Intelligence, which complicates the competitive landscape and raises questions about the progress of Apple's in-house AI models.
• Untapped Potential: Apple holds powerful assets for the AI race in its hardware ecosystem and unique data. The key for investors is to watch how, and whether, Apple successfully leverages these assets.
• Partnership with Google is Key: The reliance on Gemini is a critical factor to monitor. It could be a pragmatic move to bring best-in-class features to users quickly, or it could signal that Apple is lagging in its own model development. The outcome will significantly affect Apple's long-term competitiveness in AI.
• The podcast highlights a massive $10 billion, three-year deal between AI chip startup Cerebras and OpenAI.
• Cerebras will provide OpenAI with compute specifically for AI inference, the process of running an already-trained model to generate responses. The host notes, "Revenue is made at inference."
• This deal, along with the mention of Nvidia (NVDA) acquiring another chip company (Groq), points to a growing trend toward specialized chips designed for speed and efficiency in specific AI tasks.
• The Market is Bigger Than Nvidia: While Nvidia dominates the AI chip market, this major Cerebras deal proves there is significant room for specialized competitors.
• Focus on Inference: The emphasis on inference suggests that as AI models become more widespread, demand for fast, cost-effective inference chips will soar. This is a key growth area within the semiconductor industry.
• Watch for Emerging Players: Investors should pay attention to private, specialized chip companies like Cerebras. They represent the next wave of innovation in AI hardware and could become major players or prime acquisition targets in the future.

By Nathaniel Whittemore
A daily news analysis show on all things artificial intelligence. NLW looks at AI from multiple angles: from the explosion of creativity brought on by new tools like Midjourney and ChatGPT, to the potential disruptions to work and industries as we know them, to the great philosophical, ethical, and practical questions of advanced general intelligence, alignment, and x-risk.