The market's rapid pivot from AI's promised benefits to its disruptive potential reveals a fundamental misunderstanding of technological adoption. While many focus on immediate gains or the fear of deflation, the real story lies in the non-linear progression of AI capabilities and the critical, often overlooked infrastructure bottlenecks that will shape its deployment. This conversation, featuring insights from Stephen Byrd and Josh Baer, highlights how companies that quantify AI benefits are setting a new standard, while also exposing the hidden costs and systemic shifts that rapid advancement entails. Investors and business leaders who look beyond the immediate hype and anticipate these downstream consequences, particularly around infrastructure and labor, will gain a significant competitive advantage.
The Double-Edged Sword of AI's Tangible Wins
The narrative surrounding Artificial Intelligence has shifted with startling speed. What began as a focus on quantifiable adoption benefits and impressive stock performance has rapidly morphed into widespread concern about dramatic deflation and disruption. Stephen Byrd observes this rapid pivot, noting, "The mapping work suggests significant benefits. But the market is fast forwarding to very powerful AI that is very disruptive and deflationary, and that's been a surprise to me." This anxiety, however, risks obscuring the deeper, more nuanced story of AI's integration. The real advantage lies not in simply adopting AI, but in understanding its cascading effects and the infrastructure challenges that will dictate its pace and scale.
Josh Baer's perspective on software provides a crucial layer to this understanding. He frames AI not as a revolution but as an evolution of existing software capabilities, expanding the total addressable market (TAM) for enterprise software by an estimated $400 billion by 2028. This expansion is fueled by large language models (LLMs) and diffusion models unlocking new features, with incumbents playing a vital role. Companies are already leveraging AI internally for R&D efficiency and faster product innovation, using the same developer tools to accelerate their own progress. This internal adoption is a precursor to external monetization strategies: separate AI-focused suites, standalone offerings, or embedded functionality that enhances core platforms, leading to better retention and growth.
However, the rapid advancement of LLMs presents a significant, and perhaps underestimated, dynamic. Byrd points to a "continued non-linear improvement of LLMs," driven by a tenfold increase in the compute used for training. This scaling, he explains, can yield a doubling of model capabilities. This isn't just about incremental gains; it's about unlocking entirely new levels of creativity and problem-solving. We've seen LLMs contribute to breakthroughs in physics and mathematics, demonstrating a capacity for discovery that extends beyond mere data processing.
"A doubling from here in a relatively short period of time is difficult to predict. It's obviously very significant and I think several of the LLM execs at our event sounded to me extremely bullish on what that will be."
-- Stephen Byrd
This exponential capability growth, while promising immense benefits, also amplifies the risks of disruption and deflation that the market is already concerned about. The implication is that the pace of AI's impact will likely outstrip many organizations' ability to adapt, creating a widening gap between the technology's potential and its practical, controlled implementation.
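Byrd's rule of thumb above can be sketched as a simple function. This is an illustrative extrapolation only: the functional form (capability doubling per tenfold compute increase, i.e. capability scaling as 2 to the power of log10 of the compute ratio) and the `capability_multiplier` name are assumptions made for this sketch, not a fitted scaling law from the discussion.

```python
import math

def capability_multiplier(compute_ratio: float) -> float:
    """Illustrative rule of thumb: each 10x increase in training compute
    roughly doubles model capability, so capability scales as
    2 ** log10(compute_ratio). A simplification, not a fitted law."""
    return 2 ** math.log10(compute_ratio)

# A 10x compute increase doubles capability under this rule of thumb;
# a 100x increase quadruples it.
print(capability_multiplier(10))   # 2.0
print(capability_multiplier(100))  # 4.0
```

The point of the sketch is the shape of the curve: under this assumption, each successive doubling requires ten times more compute, which is exactly why the infrastructure constraints discussed below bind so hard.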
The Unseen Infrastructure Chokehold
While the capabilities of AI models are accelerating, the physical infrastructure required to power them is facing severe bottlenecks. Stephen Byrd highlights power as a primary constraint in the U.S., estimating a need for 74 gigawatts of data center capacity by 2028, with a significant portion requiring unconventional, off-grid solutions. This shortfall, estimated at 10-20% of needed capacity, is not just an issue of grid access but also of labor. The demand for skilled trades, particularly electricians, is immense, with hundreds of thousands of additional workers needed.
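The back-of-the-envelope arithmetic behind Byrd's shortfall estimate is straightforward. The 74 GW demand figure and the 10-20% shortfall range come from the discussion; applying the latter to the former gives the implied gap:

```python
# Projected 2028 US data center demand (from the discussion) and the
# stated 10-20% shortfall range applied to it.
demand_gw = 74.0
shortfall_low, shortfall_high = 0.10, 0.20

gap_low_gw = demand_gw * shortfall_low    # ~7.4 GW
gap_high_gw = demand_gw * shortfall_high  # ~14.8 GW
print(f"Projected 2028 capacity gap: {gap_low_gw:.1f}-{gap_high_gw:.1f} GW")
```

For scale, a gap of that size is equivalent to the output of several large power plants, which is why off-grid and unconventional supply solutions enter the picture.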
This labor shortage is a critical downstream effect that conventional planning often overlooks. The immediate impulse is to build more data centers, but the lack of skilled labor to construct and maintain them creates a multi-year delay. This is precisely where discomfort now creates advantage later. Companies that invest in training and securing labor for infrastructure development, or those that can leverage existing, repurposed infrastructure, will be significantly better positioned.
The economics of repurposing existing sites, such as former Bitcoin mining facilities, offer a compelling example of this dynamic. These sites, previously trading at $1-$2 per watt, can now command $10-$18 per watt by leasing to hyperscalers for AI data center hosting. This dramatic rerating demonstrates how a scarcity in one area (power and suitable locations) creates immense value for those who can provide it.
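The scale of that rerating is easiest to see with a concrete, hypothetical site. The $/watt ranges come from the discussion; the 100 MW site size and the use of range midpoints are assumptions chosen for round numbers:

```python
# Illustrative rerating math for a repurposed Bitcoin-mining site.
# Site size (100 MW) is a hypothetical assumption; $/watt figures are
# midpoints of the $1-$2/W and $10-$18/W ranges from the discussion.
site_watts = 100e6  # 100 MW

old_value = site_watts * 1.5    # midpoint of $1-$2 per watt
new_value = site_watts * 14.0   # midpoint of $10-$18 per watt

print(f"Before: ${old_value/1e6:.0f}M, after: ${new_value/1e9:.2f}B "
      f"(~{new_value/old_value:.1f}x rerating)")
```

Under these assumptions, the same physical asset moves from roughly $150M to $1.4B in implied value, a rerating of nearly an order of magnitude driven purely by the scarcity of powered, grid-connected sites.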
"We think the US is likely to be 10 to 20 percent short of the data center capacity that will need to be."
-- Stephen Byrd
This power and labor bottleneck is not merely a technical hurdle; it's a systemic issue that will influence the geographic distribution of AI development and the competitive landscape. Companies that can navigate or alleviate these constraints will gain a substantial lead. The narrative around AI is too often focused on the algorithms and the software, neglecting the fundamental physical realities that will govern its deployment and, consequently, its economic impact.
Navigating the AI Frontier: Actionable Insights
The rapid evolution of AI, coupled with infrastructural constraints, demands a strategic approach that looks beyond immediate gains. The insights from this conversation point to several key actions for businesses and investors aiming to capitalize on AI while mitigating its disruptive potential.
- Quantify AI Benefits Rigorously: As Stephen Byrd notes, quantifying adoption benefits is becoming "table stakes." Companies must move beyond qualitative assessments to detailed, data-driven projections of AI's impact on efficiency, productivity, and revenue. This provides a clear roadmap for internal investment and external communication.
  - Immediate action: Establish clear metrics for AI ROI within the next quarter.
- Integrate AI into Software Evolution: For software companies, AI is an opportunity to expand TAM and enhance existing products. Focus on how LLMs and diffusion models can unlock new features and improve core platform value, leading to better customer retention.
  - Over the next 6-12 months: Develop and pilot new AI-enhanced features within existing software suites.
- Anticipate Non-Linear LLM Advancements: Recognize that LLM capabilities are improving at an accelerating rate. Current predictions may quickly become outdated, and breakthrough applications and disruptive capabilities will emerge faster than anticipated.
  - Ongoing: Continuously monitor LLM research and development for step-change improvements.
- Address Infrastructure Bottlenecks Proactively: The shortage of power and skilled labor for data center development is a significant constraint. Companies that can secure power solutions or invest in labor development will gain a substantial advantage.
  - Over the next 12-18 months: Explore partnerships for unconventional power solutions or invest in vocational training programs for critical infrastructure roles.
- Leverage Repurposed Infrastructure: The economics of converting sites like former Bitcoin data centers for AI hosting are highly attractive due to scarcity. Identifying and securing such opportunities can offer significant cost advantages.
  - Over the next quarter: Investigate the feasibility and economics of repurposing underutilized infrastructure for AI workloads.
- Prepare for Disruption and Deflationary Pressures: While focusing on AI's benefits, acknowledge and plan for its potential to drive significant market disruption and deflationary forces. This requires scenario planning and building resilience into business models.
  - Over the next 6-12 months: Conduct scenario planning exercises to assess the impact of AI-driven deflation and disruption on your market.
- Develop Agentic Capabilities: As LLMs become more sophisticated, focus on developing "agentic" capabilities--AI systems that can autonomously perform complex tasks. This represents a significant leap in AI's practical application and value creation.
  - Over the next 18-24 months: Begin R&D into agentic AI applications relevant to your core business processes.