AI is no longer just disrupting workflows – it’s redefining how entire industries operate. That was a key takeaway from Mizuho’s 2025 Technology Conference, a two-day event in New York City that brought together more than 60 leading technology companies and nearly 300 investors representing over 125 investment firms.
In his keynote address, Sam Altman, CEO of OpenAI, forecasted that AI would “transform nearly every industry over the next five years.” The future is already taking shape, with demand surging for software automation, secure deployment of AI agents, and continued investment in next-generation data infrastructure.
Despite ongoing macroeconomic headwinds, including tariff-related uncertainties, discussions pointed to a strong spending environment. Across the board, AI has emerged as a top budget priority, with firms aiming to boost productivity and drive operational scale.
This transformation hinges on hardware. As speakers emphasized, the paradigm across data center, enterprise, and cloud computing is shifting from raw compute toward training and inference. The ability to glean intelligence from data, make predictions, and react instantly relies entirely on the capabilities of modern chips – making semiconductors the core enabler of AI’s next phase.
A Resilient Consumer
From payments to chips to software, speakers underscored the strength of the consumer, consistent spending volumes, and limited revenue impact from macro uncertainty following Liberation Day. Tariffs and related costs have caused minimal disruption, and management teams appear more seasoned in navigating economic volatility after years of practice.
Yet this business resilience isn't just about market conditions – it’s also a reflection of how AI is reshaping operations behind the scenes.
Across the board, firms are deploying AI to enhance productivity and unlock margin. In R&D, coding assistants are accelerating development cycles. In sales and marketing, AI is optimizing lead targeting and personalizing outreach. On the back end, automation is streamlining finance, legal, and customer service – often through AI-powered chatbots and tools that reduce manual work and cut costs.
Though few speakers addressed headcount reductions directly, many noted slower hiring plans. As roles evolve, AI is enabling leaner, more efficient teams. The margin expansion this unlocks is real and still underappreciated.
The Emergence of AI Agents
Many speakers revealed their companies are already taking the next step in AI adoption: building and deploying AI agents tailored to specific business functions. While still in the early stages, the transition from conversational AI to purpose-built agents is underway – designed not just to interact, but to act on behalf of users, both internally and externally.
Most firms emphasized customer-facing use cases, including developing agents to anticipate needs, recommend actions, or even complete tasks autonomously. One of the most compelling applications discussed was agentic commerce – an emerging field where AI agents act on behalf of consumers or businesses to initiate, manage, and complete transactions with minimal or no human input. By embedding AI agents into their apps and ecosystems, firms can predict consumer intent and facilitate seamless purchases, personalized offers, and intelligent financial decisions.
Currently, AI agents still require significant human oversight to fine-tune models and manage workflows. But that’s starting to change. Advances in self-improving models are reducing the need for constant human input. Foundation models are becoming more capable of learning from real-time data, training other models, and iterating on their own outputs – unlocking exponential efficiency gains.
The Hardware Powering the Revolution
As the "arms dealers" of the AI revolution, chips will be needed not only to enable more powerful training and inference, but also to handle the underlying infrastructure needs: energy efficiency, bandwidth, high-speed memory, and massive data throughput.
From core data centers and robotics to edge devices like PCs, wearables, vehicles, and industrial machines, the demand for compute, storage, power management, and networking is expanding rapidly.
Two of the most essential chips enabling this growth are DRAM and NAND.
- DRAM (Dynamic Random-Access Memory) delivers the ultra-fast working memory needed to train and run AI models, storing real-time inputs, weights, and activations.
- NAND flash provides high-capacity, non-volatile storage that supports the massive datasets used in large language models and other AI systems.
As AI models grow larger and more complex, NAND’s role will become even more important. With large language models and autonomous systems generating and consuming unprecedented volumes of data, demand for fast, scalable storage is surging. Some analysts now project that NAND’s total addressable market could surpass $100 billion, fueled by AI’s insatiable appetite for data.
What’s Next
The pace of AI advancement remains relentless – and not without growing pains. Several leaders pointed to ongoing infrastructure constraints, including real-time compute bottlenecks and system outages. One panelist noted that scaling AI sustainably will require major investment in next-generation energy sources, particularly solar and nuclear, to meet the long-term power demands of training and inference at scale.
Despite these challenges, companies are pressing ahead. Many are actively embedding AI capabilities across their product portfolios, relying on foundation models from OpenAI, Anthropic, and other providers. Agentic AI, once a niche concept, is moving toward the mainstream, and some are already thinking beyond software – toward “physical AI” or edge-native intelligence embedded in devices like laptops, headsets, and robots.
There’s also a growing belief that AI will democratize software development. As Sam Altman suggested, developers across the world – not just in Silicon Valley – may soon build highly specialized apps using AI as their foundation. If that materializes, it could shift the power balance toward firms with valuable proprietary data or the infrastructure to help customers fine-tune models securely.
Agentic systems also have potential to fill critical skill gaps. Several speakers pointed to use cases in architecture, civil engineering, and manufacturing – fields where technical talent is in short supply. AI could help automate complex workflows and enable fewer workers to do more, faster.
Some sectors, however, remain relatively shielded from near-term AI disruption. The payments sector, for example, continues to move with consumer adoption cycles rather than AI breakthroughs. Still, innovation is emerging in adjacent areas, particularly with stablecoins as a fast, low-cost method for cross-border transactions, especially in remittance-heavy markets.
The next chapter of AI will be defined by scale, specialization, and integration. The winners won’t just be those who adopt AI – but those who adapt their businesses around it, from infrastructure to interface.