Power Takes Center Stage in the AI Infrastructure Race at IMN Data Centers Power Capital 2026

As AI infrastructure demand collides with grid constraints, the IMN Data Centers Power Capital event revealed how developers, utilities, and investors are rapidly transforming data centers into integrated energy infrastructure platforms built around behind-the-meter power, private generation, battery storage, and new approaches to financing, resiliency, and regulatory risk.

Held May 4 at the historic Union League Club in New York City, the IMN Data Centers Power Capital event confirmed that the AI infrastructure boom is no longer being framed as simply a data center story. It is increasingly being framed as a power story, and more specifically a capital allocation and infrastructure risk story centered on electricity, generation, transmission, and speed to energization.

This was not a traditional data center conference focused on facility operations or colocation strategy. Instead, the event functioned as a convergence point for developers, infrastructure investors, utilities, financiers, energy specialists, and operators grappling with the same emerging reality: AI-scale compute demand is colliding head-on with the physical limitations of the grid. Across nearly every session, the conversation shifted away from data centers as pure real estate products and toward a new conception of AI facilities as integrated energy infrastructure platforms, where generation strategy, fuel access, battery storage, regulatory policy, and interconnection timing now directly shape project viability.

The dominant intellectual throughline of the conference was the normalization of Bring Your Own Power (BYOP) as a defining development and underwriting model for the next generation of AI infrastructure. Speakers repeatedly described a market where the utility grid is no longer merely a constraint, but increasingly a development roadblock that's forcing hyperscalers, developers, and capital providers to rethink how projects are financed, structured, and delivered. Natural gas generation, microgrids, captive power systems, hybrid energy stacks, batteries, and long-duration resiliency architectures were discussed not as speculative side conversations, but as mainstream infrastructure finance categories.

Just as importantly, the event revealed that risk, not hype, has become the industry’s central preoccupation. Beneath the optimism surrounding AI demand sat recurring concerns about interconnection delays, speculative load requests, stranded capacity, fuel supply risk, permitting friction, regulatory uncertainty, equipment shortages, labor constraints, and the possibility that today’s infrastructure assumptions could rapidly become obsolete. The resulting conversations were unusually candid, exposing an industry still actively inventing the operational and financial playbook for the AI infrastructure era in real time.

AI Demand Expands, but Power Dictates Deployment

The opening panel at IMN’s Data Centers Power Capital event made clear how far the data center industry’s center of gravity has shifted toward power. AI demand may be driving the headlines, but power availability is now determining where capacity can actually be built.

Madelyn Brennan, Principal of DigitalBridge Investment Management, opened the conference by framing the discussion around the growing convergence of data centers and energy infrastructure. The panel quickly moved to one of the defining questions of the current market: whether AI infrastructure still depends on traditional data center geography, or whether the hunt for power has rewritten the map.

Mark McComiskey, Partner at Avaio Capital, argued that AI has weakened the location sensitivity that shaped the cloud buildout. Training workloads, and increasingly many inference workloads, can move to the markets where power, tax incentives and speed to market align. While a limited set of ultra-low-latency inference applications may still require proximity to dense urban markets, McComiskey said much of the next wave of AI infrastructure can be placed wherever power is available.

That shift has major implications. If data center demand is no longer anchored primarily to legacy cloud regions, then power becomes the first-order siting variable. In McComiskey’s view, the industry is moving toward a “democratization of geography,” with developers and hyperscalers following available electricity rather than traditional market logic.

The panel also addressed the inevitable question of whether the market is experiencing an AI power bubble. Pramit Mukherjee, Managing Director-Investment Solutions at SLC Management, said institutional investors remain focused on residual value, asset useful life, lease quality, tenant credit, and technology obsolescence. The concern is not simply whether demand exists today, but whether today’s high-density campuses will retain value if compute architectures, chip designs, or cooling requirements evolve over the next decade.

McComiskey pushed back on the bubble framing, at least for data center construction. Unlike the telecom bubble, he said, the current market is not characterized by large-scale speculative construction without customers. Capital markets still require signed tenants before major projects move forward. Colocation providers, build-to-suit developers and major platform operators may begin site work early, but heavy construction generally depends on contracted demand.

That discipline, he argued, makes a near-term stranded-asset scenario unlikely. With reserve margins tightening across many U.S. power markets and AI training and inference demand still expanding, newly built data centers with grid access are likely to retain value.

The harder question is where the power comes from.

On that point, McComiskey was blunt about small modular reactors. Despite intense interest from the data center sector, he argued that SMRs are unlikely to provide meaningful capacity in the time frame that matters for current AI infrastructure demand. Regulatory readiness, manufacturing capacity, public acceptance, cost and development timelines all remain formidable barriers. In his view, the industry will have to solve its power problem well before SMRs can scale.

That leaves natural gas as the practical near-term answer. Renewables will remain part of the mix, and storage will play an important role in grid support and backup architectures. But for large-scale baseload power delivered quickly, McComiskey said the path points toward gas-fired generation, including behind-the-meter power plants.

The economics driving that shift are stark. McComiskey compared Meta’s projected multibillion-dollar AI data center campus spending with the capital budget of a major utility, arguing that hyperscaler capex now dwarfs the capacity of utilities to build generation and transmission at the same pace. The result is a structural mismatch: technology companies want to deploy faster than the regulated utility model is designed to move.

That mismatch is pushing more developers toward behind-the-meter power. The panel described a market in which hyperscalers remain reluctant to embrace fossil generation but increasingly face a binary choice: secure power or delay capacity. The open question is whether behind-the-meter generation becomes a temporary bridge to the grid, or whether large technology companies develop a lasting appetite for private power infrastructure.

Mukherjee offered a somewhat more constructive view of utilities, noting that grid operators and utilities are experimenting with expedited interconnection pathways, high-load service frameworks, distributed energy resources, net metering and vertically integrated development models. But he also acknowledged that interconnection delays, permitting timelines, supply chain problems and weak demand forecasting remain major obstacles.

The final bottleneck may be the equipment itself. McComiskey said turbine availability is only part of the problem; the broader issue is the supply of prime movers, including reciprocating engines, transformers, switchgear and other balance-of-plant equipment needed to bring new power capacity online. OEMs have little incentive to dramatically expand manufacturing capacity until they are confident that the current wave of orders is financeable and tied to real projects.

That creates a paradox. The data center industry needs generation equipment faster than suppliers can deliver it, but some suppliers remain wary that portions of their backlog are not yet backed by committed capital or executable projects.

The session closed by returning to the question that will likely define the next phase of AI infrastructure development: not whether demand exists, but whether the power ecosystem can scale quickly enough to meet it. In that framing, AI does not merely create more data center demand. It forces the data center sector to behave more like the energy industry.

Behind-the-Meter Power Becomes the New Investment Filter

The second panel sharpened the conference’s opening thesis: power is now much more than just a utility input to the data center business. It is the primary investment filter.

Moderated by Michael Ortiz, Chief Commercial Officer of Innova Cloud Data Centers, the session examined where capital is actually moving across the data center power ecosystem. The discussion quickly converged around one conclusion: the old “power plus land” playbook has largely expired. A few years ago, developers could secure land near a substation, obtain a will-serve letter from a utility, and create meaningful speculative value. That window has narrowed dramatically.

Andrew Symons, Managing Director of Sustainable Development Capital (SDCL), said the investable opportunity is no longer simply identifying land in a desirable data center market. The more durable opportunity is delivering behind-the-meter power directly to customers, whether as a bridge to the grid or as a permanent islanded solution. For hyperscalers and large data center operators, time to power has become the central underwriting issue.

Taylor Malfitano, Managing Principal for Broadwaters Digital, described this as the move from “Power Land 1.0” to “Power Land 2.0.” In the first phase, developers chased substations and excess utility capacity. In the second, they are chasing gas pipelines, fuel deliverability, air permits, onsite generation equipment, and the practical ability to convert a site into an executable power platform.

That distinction matters because capital markets have become far less interested in theoretical power. Ortiz framed the shift as the move from power optionality to power certainty. Investors no longer want to hear only what a developer believes it may eventually receive from a utility. They want to know what power is actually secured, how it will be delivered, what infrastructure must be built, what long-lead equipment is required, and whether the public, permitting agencies and utilities will allow the project to proceed.

The panel repeatedly returned to public acceptance as an underpriced risk. Malfitano noted that some of the loudest opposition to large data center projects is now appearing in rural communities, where developers may have assumed resistance would be minimal. Symons said his firm has seen both sides of that dynamic, including a project in Scotland where early public engagement was central to execution, and a U.S. project where the team had to catch up after community concerns emerged.

That is an important development for the AI infrastructure buildout. The industry is learning that a technically viable site can still fail if the social license is not managed early. Public affairs, local government engagement and community transparency are no longer peripheral workstreams. They are becoming part of the underwriting package.

The panel also drew a useful distinction between large AI training campuses and smaller inference-oriented opportunities. Ortiz noted that while gigawatt-scale campuses attract the most attention, cloud and inference demand will continue to require capacity closer to established markets. Malfitano pointed to opportunities in the 10- to 30-megawatt range, including brownfield data center assets that may be upgraded for denser AI workloads without becoming full-scale greenfield campuses.

That point broadens the AI infrastructure story. Not every opportunity is a 500-megawatt or gigawatt campus. Some of the more executable investments may come from taking existing facilities from legacy density to higher-density inference capacity. Those projects may be smaller, but they may also be closer to demand and less exposed to the full risk stack of a new mega-campus.

On the asset side, Symons said clean power PPAs have become quieter in the U.S. market as demand for power begins to override prior corporate sustainability commitments. Behind-the-meter generation, microgrids and backup systems are increasingly converging into a single category of energy-as-a-service infrastructure. A generation asset may begin as bridge power, become permanent site power, or later convert into backup capacity if grid service arrives.

That flexibility is becoming central to the investment case. Malfitano argued that reciprocating engines and similar onsite generation assets should not necessarily be viewed as stranded if grid power eventually becomes available. Properly maintained, they can have long useful lives, serve as backup systems, or potentially adapt to future fuel sources such as hydrogen blends.

The strongest takeaway from the session was that power infrastructure is becoming a specialized asset class inside the data center economy. The best opportunities may belong not to firms that simply control land, but to those that can align siting, fuel, permitting, public support, equipment, capital and offtake into one executable package.

In that sense, the panel captured a broader market transition. Data center developers are discovering the complexity of the power business. Energy investors are discovering the scale of data center demand. The winners will likely be the firms that can speak both languages.

Regulation Becomes the Next Constraint in the BYOP Era

The third session moved the conversation from capital and siting into the regulatory architecture that will determine how fast the AI power buildout can actually move.

Moderated by Steven Shparber, Member/Co-Chair of the Digital Infrastructure Practice at Mintz, the panel focused on government policy, incentives and regulatory risk in a bring-your-own-power environment. That framing is useful because BYOP is no longer just a procurement strategy. It is now a regulatory challenge that touches federal jurisdiction, state utility authority, interconnection rules, transmission access, local land use and ratepayer politics.

Josh Price, Director-Intelligence & Research at Crux, laid out the emerging federal framework in three buckets. First, regulators are looking at how data centers can contract directly with generators, including front-of-the-meter and behind-the-meter structures. Second, FERC is examining whether bringing power, or agreeing to curtail load, can help large-load projects move more quickly through interconnection studies. Third, regulators are considering how utilities should provide non-firm transmission service while projects wait for permanent service.

That triad captures the policy problem now facing the industry. Data centers need speed to power. Utilities and grid operators need reliability. Regulators need to protect ratepayers and preserve jurisdictional boundaries. The challenge is building rules that accelerate infrastructure without creating a political backlash.

Tom Skinner, Managing Partner of Redbrick LMD, argued that the Trump administration has sent a clear signal that power and AI infrastructure are now being treated as national security priorities. He pointed to executive orders emphasizing energy dominance, AI infrastructure and coordinated federal action across agencies. In his view, the federal government is trying to use its own land and policy machinery to accelerate energy and data center development.

That federal-land strategy was one of the session’s sharper themes. Skinner said Redbrick has been working on power generation and data center development on federal land, with secure land development agreements expected to be announced. His argument was that federal sites may offer a way to move around some state and local friction by working with a single federal counterparty.

The panel did not treat that as a universal solution. Sophie Karp, MD/Senior Equity Research Analyst at KeyBanc, noted that federal urgency has not necessarily translated into faster implementation across regional grid operators. In PJM, for example, proposed timelines for transmission tariff changes still stretch into the 2030s, which is misaligned with the immediacy of data center demand. She contrasted that with MISO, where vertically integrated utilities may have more ability to plan and deliver capacity in a coordinated way.

That distinction is becoming central to the market. In regions dominated by vertically integrated utilities, planning can be more direct, even if not easy. In competitive markets such as PJM, independent power producers, utilities, hyperscalers, state regulators and grid operators are all fighting over economics, authority and timing. That makes the interconnection and power procurement process slower and more contested.

Jared Toothman, EVP and Market Leader of Lincoln Property Company, offered a developer’s caution against relying too heavily on state-level generalizations. Texas and Georgia may look more business-friendly than California or Washington, but every market with significant data center growth is now exposed to backlash. Virginia, Ohio and Arizona have all drawn scrutiny precisely because they attracted so much development.

For Toothman, the more important question is whether a specific site has the right ingredients: existing infrastructure, local support, available transmission, water, zoning, and a utility or power provider that wants the project to happen. He described Lincoln’s work at a former IBM chip manufacturing site in New York as an example of a location where the state, local government and power supplier are aligned around redevelopment.

The session also underscored the limits of BYOP as a quick fix. Toothman warned that bringing your own power does not remove the need for equipment, permits, interconnection strategy or political acceptance. Even if a developer can solve generation, it still needs transformers, generators, switchgear and other physical infrastructure that remain subject to long lead times.

The FERC discussion added another layer. Shparber explained that federal regulators are considering whether large-load interconnections directly to the transmission system should fall under FERC jurisdiction, rather than being left entirely to state and utility processes. That could create more standardized rules of the road for large data center loads, but it would also invite legal and political challenges from states that want to preserve authority.

Price said the key is balancing speed with disruption. A more standardized large-load framework could help, but a sweeping federal override of existing state and utility processes could slow projects if it triggers litigation. Karp made a similar point: the more aggressively FERC asserts jurisdiction over areas it has not historically regulated, the more pushback it will invite.

The panel also returned to ratepayer protection. The political question is not only whether data centers can get power, but whether local customers will believe they are subsidizing the infrastructure required to serve them. Karp noted that large-load customers can help absorb fixed system costs, but that benefit is difficult to explain to ordinary ratepayers. Some jurisdictions are beginning to show data center-related credits or offsets on customer bills, which may become one tool for improving public understanding.

The closing discussion pointed to the policy changes most likely to matter over the next year: FERC’s large-load rulemaking, PJM’s reliability and interconnection reforms, permitting reform in Congress, expanded use of federal land, and clearer treatment of storage. Battery storage was described as an increasingly logical way to smooth peaks and make better use of existing grid capacity, especially alongside renewables and onsite generation.

The broader message was that the industry’s next bottleneck may be institutional rather than technological. Capital is available. Demand is real. Developers are learning to pair data centers with power assets. But regulation, jurisdiction, equipment lead times and public acceptance will determine whether projects move at AI speed or utility speed.

Underwriting the New Risk Stack for AI Power

The afternoon session on underwriting power risk widened the lens from whether AI data center demand is real to how investors, operators and suppliers should manage the risks of building into that demand.

Moderated by Phil Lookadoo of Haynes Boone, the panel brought together perspectives from Wesco, Generate Capital, King Street and Nuveen Energy Infrastructure Credit. The result was a more granular look at the physical, financial and technology risks now embedded in AI data center development.

David Speidelsbach, VP Sales & Strategic Initiatives - Electrical & Electronic Solutions at Wesco, began with the electrical infrastructure implications of AI density. As rack and chip power requirements rise, he said the industry is moving toward higher-voltage DC applications, including 800 VDC distribution. That transition creates new complexity for switchgear, breakers, labor skills and onsite electrical design. Data center power is no longer only about securing megawatts at the site boundary. It is also about safely and reliably distributing far more power inside the building.

Sharin Valia, Managing Director of Capital Markets at King Street, framed the same issue from inside the data hall. Power density, he said, has changed the nature of future-proofing. A few years ago, 10 kilowatts per cabinet represented a high-density deployment. Now the industry is discussing hundreds of kilowatts per cabinet, with future architectures pushing far beyond legacy assumptions. That creates a basic underwriting question: is a data center being built today already at risk of obsolescence?
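To see why that question matters, consider a rough, illustrative calculation. The capacity and density figures below are assumptions chosen for the example, not numbers cited on the panel; the point is simply that a fixed electrical envelope supports dramatically fewer cabinets as rack densities climb.

```python
# Illustrative only: assumed facility capacity and rack densities,
# not figures cited by the panel.
FACILITY_IT_CAPACITY_KW = 10_000  # hypothetical 10 MW of IT load

rack_densities_kw = {
    "legacy enterprise": 10,   # roughly the old "high-density" benchmark
    "dense cloud": 30,
    "liquid-cooled AI": 130,   # assumed near-term AI rack density
}

for label, kw_per_rack in rack_densities_kw.items():
    racks = FACILITY_IT_CAPACITY_KW // kw_per_rack
    print(f"{label:>18}: {kw_per_rack:>4} kW/rack -> ~{racks} racks")
```

Under those assumed figures, the same 10 MW envelope that once served roughly a thousand enterprise cabinets supports fewer than eighty liquid-cooled AI racks.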

The answer depends on design flexibility. Valia argued that operators must build thermal and electrical infrastructure capable of supporting mixed-density environments, from lower-density enterprise workloads to high-density liquid-cooled AI deployments. The challenge is not simply building for the highest possible density. It is building a facility that can adapt as tenant requirements change.

Logan Goldie-Scot, VP of Research & Impact for Generate Capital, focused on the gap between announced demand and executable demand. In ERCOT, he noted, large-load interconnection requests have ballooned far beyond the grid’s actual peak load, with data centers accounting for the overwhelming share of the queue. The result is a familiar problem: the load side is beginning to resemble the generation interconnection queue, where speculative requests can overwhelm system operators and make it harder to identify viable projects.

That distinction between headlines and reality ran throughout the session. Capital is trying to determine which announced campuses are real, which power requests are speculative, and which projects can actually secure interconnection, equipment, financing and offtake. Higher deposits and stronger financial commitments may help clean up the queue, but they could also knock out viable projects that lack the right capital structure at the right moment.

On nuclear, the panel was cautious. Goldie-Scot said new nuclear technologies are unlikely to play a meaningful role before 2030, and even first commercial deployments will need operating history before investors commit at scale. Valia added that communities already push back on water use, enclosures and noise around conventional data centers, making nuclear siting even more difficult. Tom Pollock, President of Nuveen Energy Infrastructure Credit, noted that nuclear has never gone away in the U.S. generation stack, but said the near-term comeback is more visible in countries restarting or extending existing nuclear assets than in new U.S. nuclear capacity for data centers.

The panel’s near-term power outlook was more practical: gas, storage, grid upgrades, demand management and better use of existing infrastructure. Pollock said Nuveen sees meaningful incremental load growth over the next five years, with data centers representing roughly half of that increase and industrial reshoring accounting for much of the rest. Speidelsbach pointed to natural gas pipeline operators and industrial suppliers now positioning themselves around data center demand, as maps of gas infrastructure become as strategically important as maps of fiber.

Software also entered the power conversation. Valia said data center operators are likely to use more software tools to manage power variability, both at the grid interface and inside the facility. That includes systems that help respond to utility requests, optimize power draw, and manage cooling and electrical loads more dynamically.
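The panel did not point to any specific product, but the kind of control logic Valia alluded to can be sketched in a few lines: a facility-level supervisor that honors a utility curtailment request by leaning on a battery first and capping GPU power draw only for whatever shortfall remains. Every interface, name, and number below is an assumption for illustration, not a description of a deployed system.

```python
# Hypothetical sketch of a facility power supervisor; all interfaces
# and figures are assumed for illustration.

def plan_dispatch(it_load_kw: float, utility_cap_kw: float,
                  battery_available_kw: float) -> dict:
    """Decide how to meet a utility curtailment request.

    Returns how much load stays on the grid, how much the battery
    covers, and how much must be curtailed at the GPUs.
    """
    shortfall = max(0.0, it_load_kw - utility_cap_kw)
    from_battery = min(shortfall, battery_available_kw)
    to_curtail = shortfall - from_battery  # e.g., lower GPU power limits
    return {
        "grid_kw": it_load_kw - shortfall,
        "battery_kw": from_battery,
        "curtail_kw": to_curtail,
    }

# Example: 42 MW of IT load, a utility request to stay under 38 MW,
# and 3 MW of available battery discharge capacity.
# Result: the grid sees 38 MW, the battery covers 3 MW, and the GPUs
# shed the final 1 MW.
print(plan_dispatch(42_000, 38_000, 3_000))
```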

Still, the physical supply chain remains a central constraint. Speidelsbach said electrical equipment lead times, including switchgear, cables and transformers, are forcing suppliers and customers to think harder about standardization. Standard designs can allow material to be redirected between projects when delays occur, helping reduce the risk that one stalled site strands equipment needed elsewhere.

The panel also addressed what happens when projects slip. Valia pointed to satellite-based analysis suggesting that many data center projects are already delayed, with labor, chips, power and supply chain constraints all contributing. Goldie-Scot said some projects may be commissioned and then fail operationally, limiting the developer’s ability to secure follow-on sites. That risk is especially acute when equipment providers commit to delivery volumes beyond anything they have historically executed.

For credit investors, optionality is now a key underwriting pillar. Pollock said Nuveen examines whether turbines or other power assets could be redeployed to LNG terminals, gas processing facilities, feeder plants or other use cases if a data center project fails to proceed. Goldie-Scot said Generate often underwrites without assuming residual value for equipment after the contracted period, which creates protection if the counterparty falls away.

The session’s core message was that AI power risk is no longer one risk. It is a stack of interdependent risks: grid access, onsite distribution, density, cooling, equipment availability, fuel supply, interconnection, tenant durability, technology change, labor, community acceptance and capital structure.

For investors, the opportunity remains substantial. But the underwriting bar is rising. In the AI infrastructure market, power is not just what enables the asset. Increasingly, power is the asset being underwritten.

Batteries Move From Backup to AI Load Management

The battery and storage session brought the conference’s power discussion down to the operational layer of the data center itself.

The panel made clear that batteries are no longer being discussed only as short-duration backup systems inside UPS architectures. In AI data centers, storage is increasingly part of a broader resilience strategy: smoothing load swings, supporting behind-the-meter generation, enabling renewable integration, and helping facilities manage the electrical behavior of GPU-dense workloads.

Rich Hallahan, Head of Solutions Engineering for CleanArc Data Centers, traced the evolution from traditional UPS battery strings to today’s more complex storage conversation. In conventional colocation, batteries historically supported Tier III-style concurrent maintainability and bridged the gap until generators came online. But AI has changed the discussion. Some customers are now asking whether facilities really need the same level of redundancy across the entire building, while others are pushing batteries deeper into the architecture, from centralized UPS systems toward rack-level, site-level and behind-the-meter applications.

The most important shift is the AI load profile. GPU clusters can create rapid power fluctuations that conventional electrical systems may not handle smoothly. Batteries can help absorb those micro-surges and prevent utilities from seeing unstable load behavior at the end of the line.
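One way to picture that buffering role is a simple ramp-rate limiter: the battery absorbs or injects whatever power is needed so the load the utility or onsite generator sees never swings faster than a set limit. The numbers and the smoothing rule below are illustrative assumptions, not a description of any particular deployed architecture.

```python
# Illustrative ramp-rate limiting: the battery covers the gap between a
# spiky GPU load and a grid-facing draw that may only change slowly.
# All figures are assumed for illustration.

MAX_RAMP_KW_PER_STEP = 200  # how fast the grid-facing draw may change per step

gpu_load_kw = [5_000, 5_000, 9_000, 9_000, 4_000, 4_000, 8_500]  # spiky AI load
grid_draw_kw = gpu_load_kw[0]

for load in gpu_load_kw:
    # Move the grid draw toward the actual load, but no faster than the ramp limit.
    delta = max(-MAX_RAMP_KW_PER_STEP, min(MAX_RAMP_KW_PER_STEP, load - grid_draw_kw))
    grid_draw_kw += delta
    battery_kw = load - grid_draw_kw  # positive = discharging, negative = charging
    print(f"load={load:>6} kW  grid={grid_draw_kw:>6} kW  battery={battery_kw:>6} kW")
```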

Cynthia Thompson, Co-Founder, Executive Chair and Chief Corporate Development Officer of CloudBurst Data Centers, said that dynamic is especially important for fully islanded, behind-the-meter AI data centers. In those environments, batteries are not simply backup. They become a buffer between gas generation and highly variable AI loads. For large AI deployments, she said, operators may still need diesel generators for ultimate resiliency, but batteries are increasingly necessary to manage the swings.

That creates an interesting tension. Technically, batteries are becoming more important. Financially, they remain difficult to underwrite. The credit investor on the panel noted that battery revenue can come from several sources: availability payments, arbitrage, fixed-price contracts or data center backup/resilience use cases. But each model carries different levels of cash flow predictability, which matters for debt investors seeking stable, pension-like returns.

The session also underscored the conservative nature of data center adoption. Hyperscalers may want innovation, but they do not want their AI infrastructure to become the test bed for unproven battery chemistry, controls software or resiliency architecture. Thompson said hyperscalers are deeply involved in reviewing behind-the-meter power designs, but they want proven technology before deploying it at scale.

That is where Nvidia’s role becomes notable. Hallahan and Thompson both pointed to Nvidia as a force pushing the data center industry forward by identifying the facility-level impacts of GPU systems and working to reduce power instability in future generations. In effect, the chip road map is now forcing innovation in the power room.

The key takeaway: batteries are not replacing generators wholesale, at least not yet. But in AI data centers, they are becoming something more strategic than backup. They are part of the electrical control layer required to run high-density compute reliably.

 

At Data Center Frontier, we talk the industry talk and walk the industry walk. In that spirit, DCF Staff members may occasionally use AI tools to assist with content. Elements of this article were created with help from OpenAI's GPT5.

Keep pace with the fast-moving world of data centers and cloud computing by connecting with Data Center Frontier on LinkedIn, following us on X/Twitter and Facebook, as well as on BlueSky, and signing up for our weekly newsletters using the form below.

About the Author

Matt Vincent

Matt Vincent is Editor in Chief of Data Center Frontier, where he leads editorial strategy and coverage focused on the infrastructure powering cloud computing, artificial intelligence, and the digital economy. A veteran B2B technology journalist with more than two decades of experience, Vincent specializes in the intersection of data centers, power, cooling, and emerging AI-era infrastructure. Since assuming the EIC role in 2023, he has helped guide Data Center Frontier’s coverage of the industry’s transition into the gigawatt-scale AI era, with a focus on hyperscale development, behind-the-meter power strategies, liquid cooling architectures, and the evolving energy demands of high-density compute, while working closely with the Digital Infrastructure Group at Endeavor Business Media to expand the brand’s analytical and multimedia footprint. Vincent also hosts The Data Center Frontier Show podcast, where he interviews industry leaders across hyperscale, colocation, utilities, and the data center supply chain to examine the technologies and business models reshaping digital infrastructure. He has also served as Head of Content for the Data Center Frontier Trends Summit since its inception. Before becoming Editor in Chief, he served in multiple senior editorial roles across Endeavor Business Media’s digital infrastructure portfolio, with coverage spanning data centers and hyperscale infrastructure, structured cabling and networking, telecom and datacom, IP physical security, and wireless and Pro AV markets. He began his career in 2005 within PennWell’s Advanced Technology Division and later held senior editorial positions supporting brands such as Cabling Installation & Maintenance, Lightwave Online, Broadband Technology Report, and Smart Buildings Technology. Vincent is a frequent moderator, interviewer, and keynote speaker at industry events including the HPC Forum, where he delivers forward-looking analysis on how AI and high-performance computing are reshaping digital infrastructure. He graduated with honors from Indiana University Bloomington with a B.A. in English Literature and Creative Writing and lives in southern New Hampshire with his family, remaining an active musician in his spare time.

You can connect with Matt via LinkedIn or email.

