Why Are Palantir and OpenAI Scared of Alex Bores?
Podcast · 1 hr 32 min
Note: AI-generated summary based on third-party content. Not financial advice.
Quick Insights

  • Monitor Palantir (PLTR) closely: its heavy reliance on government contracts makes it highly sensitive to political cycles and to emerging state-level regulations like New York’s RAISE Act.
  • To hedge against infrastructure bottlenecks, consider increasing exposure to utility and green energy stocks positioned to profit from private-sector funding of power grid upgrades for new data centers.
  • Alphabet (GOOGL) remains a long-term play in autonomous transit via Waymo, but expect slower ROI as cities introduce restrictive medallion requirements and labor protections.
  • Across the broader AI sector, prioritize companies that proactively adopt third-party safety audits; these firms will navigate the likely shift toward binding federal and state oversight more efficiently.
  • Be cautious of long-term fiscal risks to high-margin AI firms like OpenAI, as "token taxes" or "windfall taxes" are increasingly proposed to offset potential white-collar labor displacement.

Detailed Analysis

Palantir (PLTR)

The transcript discusses Palantir's historical role in government data integration and its current stance on AI regulation. Key points include:

  • Historical Context: Originally marketed as a tool to expand government capacity while protecting civil liberties, on the theory that government failure breeds fascism.
  • Shift in Sentiment: Former employees and critics highlight a shift toward supporting more controversial government activities, such as immigration enforcement and deportations under the Trump administration.
  • Internal Friction: Reports of internal dissent regarding the lack of guardrails in contracts with agencies like ICE.
  • Political Influence: Co-founder Joe Lonsdale is mentioned as a donor to the "Leading the Future" super PAC, which opposes AI regulation.

Takeaways

  • Regulatory Risk: Palantir faces potential headwinds from state-level regulations like New York’s RAISE Act, which mandates safety plans and incident reporting.
  • Reputational Volatility: The company’s brand is closely tied to political cycles; a shift in administration can drastically change its contract priorities and internal employee morale.
  • Data Integration vs. AI: Historically, Palantir prioritized "data integration" (organizing data) over "AI" (analysis), suggesting that the foundational work of data cleaning remains a primary value driver before advanced AI can be effective.

OpenAI

The discussion focuses on the tension between OpenAI’s public calls for regulation and the political actions of its leadership.

  • Leadership Dissonance: While CEO Sam Altman publicly says the democratic process should be more powerful than companies, co-founder Greg Brockman is cited as a major donor to a super PAC working to defeat legislators who propose AI regulations.
  • Safety Standards: OpenAI has signed voluntary commitments with the White House, but critics argue they lobby against binding state-level safety standards (like third-party audits).
  • Future Business Model: The transcript suggests OpenAI’s path to profitability relies on replacing white-collar labor and high-cost subscription models.

Takeaways

  • Lobbying Power: OpenAI is becoming a dominant political force, capable of deploying significant capital to influence the legislative environment in which it operates.
  • Market Positioning: The company is moving toward a "utility" model, which typically invites heavy government oversight and price controls in the long term.
  • Labor Displacement: Investors should monitor the "token tax" concept—a proposed tax on AI usage that displaces human labor—as it could weigh on OpenAI’s future margins.

AI Infrastructure & Data Centers

The transcript highlights a growing bipartisan "AI backlash" that could impact the physical expansion of AI.

  • Data Center Moratoriums: Mentions of proposals by Bernie Sanders, AOC, and Ron DeSantis to restrict or pause data center construction until safety regulations are passed.
  • Energy Grid Constraints: AI expansion is limited by an outdated electric grid. There is a proposal to move "green" data centers to the front of the interconnection queue if they fund grid upgrades.
  • Public Sentiment: Polling shows more Americans are worried about AI than enthusiastic, which may lead to local resistance against new infrastructure projects.

Takeaways

  • Infrastructure Bottlenecks: The "speed to market" for AI companies is increasingly dependent on energy grid capacity and local zoning laws rather than just software breakthroughs.
  • Investment Opportunity: Private capital entering the energy sector to "fast-track" data centers could lead to a modernized, more resilient power grid, benefiting utility and green energy stocks.

Investment Themes & Sectors

AI Regulation (The RAISE Act)

  • Context: New York’s RAISE Act is a blueprint for future state-level oversight, requiring safety plans, incident reporting, and potentially third-party audits.
  • Insight: Companies that proactively adopt "SOC 2-style" AI audits may be better positioned to navigate the coming patchwork of state regulations.

Autonomous Vehicles (Waymo / Alphabet)

  • Context: Discussion of Waymo's expansion in New York City and the potential displacement of taxi/rideshare drivers.
  • Insight: The transition to autonomous fleets is a "question of speed, not yes or no." Regulatory hurdles (like requiring medallions for autonomous cars) could slow ROI for companies like Alphabet (GOOGL).

The "AI Dividend" & Universal Basic Income (UBI)

  • Context: A proposal to give citizens a stake in the AI economy via warrants in AI companies or a token tax on automated labor.
  • Insight: If AI leads to mass job displacement, "windfall" taxes on highly successful AI firms are a significant long-term fiscal risk for investors in the sector.

Education & Pedagogy

  • Context: Generative AI has made traditional take-home essays obsolete, requiring a total overhaul of how writing and critical thinking are taught.
  • Insight: There is a growing market for "AI-proof" educational tools and platforms that can verify human keystrokes or provide personalized, secure tutoring.
Episode Description
Leading the Future, a super PAC whose funders include the founders of companies like Palantir and OpenAI, is spending millions of dollars this election cycle, and a considerable amount of that money is going toward attack ads against Alex Bores – even though Bores himself used to work for Palantir. Bores is a New York state assemblyman who is running for Congress to represent New York’s 12th District. His campaign includes an extensive A.I. policy platform, including demands for A.I. companies to be more transparent about safety, and an idea for an “A.I. dividend” that would redistribute some of the profits of A.I. companies to the public. So his race has turned into a central battleground over the future of the A.I. industry and who has the power to shape it. In this conversation, we discuss how Bores went from working for Palantir to running a campaign that would regulate the A.I. industry, the major issues he thinks A.I. policy needs to address, and his response to the attacks against him.

Mentioned:

  • Give People Money by Annie Lowrey
  • “Alex Bores’ AI Policy Framework For Congress”
  • “NY Congressional Candidate Faced Palantir Sexual Comments Claim” by Laura Nahmias
  • “AI populism’s warning shots” by Jasmine Sun

Book Recommendations:

  • A Theory of Justice by John Rawls
  • World Eaters by Catherine Bracy
  • Bird by Bird by Anne Lamott

Thoughts? Guest suggestions? Email us at ezrakleinshow@nytimes.com. You can find transcripts (posted midday) and more episodes of “The Ezra Klein Show” at nytimes.com/ezra-klein-podcast, and you can find Ezra on Twitter @ezraklein. Book recommendations from all our guests are listed at https://www.nytimes.com/article/ezra-klein-show-book-recs.

This episode of “The Ezra Klein Show” was produced by Annie Galvin. Fact-checking by Lori Segal. Our senior engineer is Jeff Geld, with additional mixing by Aman Sahota and Isaac Jones. Our recording engineer is Aman Sahota. Our executive producer is Claire Gordon. The show’s production team also includes Marie Cascione, Michelle Harris, Rollin Hu, Kristin Lin, Emma Kehlbeck, Jack McCordick, Marina King and Jan Kobal. Original music by Pat McCusker. Audience strategy by Shannon Busta and Lauren Reddy. The director of New York Times Opinion Audio is Annie-Rose Strasser. And special thanks to Brianna Johnson.

Subscribe today at nytimes.com/podcasts or on Apple Podcasts and Spotify. You can also subscribe via your favorite podcast app at https://www.nytimes.com/activate-access/audio?source=podcatcher. For more podcasts and narrated articles, download The New York Times app at nytimes.com/app.
About The Ezra Klein Show
The Ezra Klein Show
By New York Times Opinion

Ezra Klein invites you into a conversation on something that matters. How do we address climate change if the political system fails to act? Has the logic of markets infiltrated too many aspects of our lives? What is the future of the Republican Party? What do psychedelics teach us about consciousness? What does sci-fi understand about our present that we miss? Can our food system be just to humans and animals alike? Unlock full access to New York Times podcasts and explore everything from politics to pop culture. Subscribe today at nytimes.com/podcasts or on Apple Podcasts and Spotify.