
Stefan Pastine is CEO of Thintronics, a semiconductor materials company.
From my vantage point leading a semiconductor materials company, I see a growing tension between artificial intelligence and the power grid. The technology sector is expanding data centers at record speed, but the electricity to run them must come from somewhere. Unless utilities and policymakers plan carefully, the costs of that expansion will fall on ordinary customers.
The problem is visible today in several states. In Virginia, the legislature’s watchdog agency, the Joint Legislative Audit and Review Commission, or JLARC, has documented nearly $1 billion a year in sales tax exemptions for data centers, along with mounting pressure on grid infrastructure. In Georgia, utility filings show that roughly 80% of projected load growth over the next decade will come from data centers. Regulators are already debating how much of that bill should be carried by households.
The scale matters. A single 100 MW data center can consume as much electricity as 75,000 homes, and cooling systems can draw millions of gallons of water per day. These facilities are essential to the digital economy, but their local impacts on rates, water, and infrastructure cannot be ignored.
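For readers who want to check that comparison, here is a rough back-of-envelope sketch. The load factor and the average household consumption are illustrative assumptions, not figures from utility filings.

```python
# Back-of-envelope check of the 100 MW comparison above.
# The load factor and household figure are assumptions for illustration.

DATA_CENTER_MW = 100              # assumed facility load
LOAD_FACTOR = 0.9                 # data centers run near-continuously (assumption)
AVG_HOME_KWH_PER_YEAR = 10_500    # rough U.S. average household use (assumption)
HOURS_PER_YEAR = 8_760

annual_mwh = DATA_CENTER_MW * LOAD_FACTOR * HOURS_PER_YEAR
equivalent_homes = annual_mwh * 1_000 / AVG_HOME_KWH_PER_YEAR

print(f"Annual consumption: {annual_mwh:,.0f} MWh")
print(f"Equivalent households: {equivalent_homes:,.0f}")
# Roughly 790,000 MWh a year, or on the order of 75,000 homes,
# consistent with the comparison in the text.
```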
Industry projections suggest that AI demand could push data centers from about 4.4% of U.S. electricity use today to as much as 12% by 2028. But, as Amory Lovins has argued in this magazine, forecasts are not destiny. Demand may grow more slowly than the most aggressive scenarios suggest. That uncertainty makes it even more important to avoid locking customers into paying for overbuilt capacity. Grid investment should be based on holistic planning, not on subsidies and tax incentives that prioritize speed over fairness.
So what can be done?
First, diversify supply strategically. Utilities need portfolios that balance renewables and natural gas with long-duration storage. Existing nuclear plants can continue to contribute, but the focus should be on building systems that can deliver firm, affordable power through all seasons. The metric is reliability per dollar, not adherence to one favored technology.
Second, align costs with beneficiaries. If new substations, transmission lines, or water systems are required to support a data center cluster, the companies driving that demand should share proportionally in the cost. Virginia’s JLARC has recommended tying tax benefits to efficiency standards. Extending that approach nationally would help prevent large transfers of wealth from households to trillion-dollar corporations.
Third, make efficiency and reuse part of the bargain. My own industry is focused on materials innovations, such as better chip insulation and cooling technologies, that can reduce energy use per calculation by 30-40%. At scale, those savings are equivalent to adding several power plants to the grid. Data center operators can also make better use of waste heat, as Microsoft and Google have piloted in Scandinavia. Utilities and regulators should link incentives to those performance outcomes.
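A rough sanity check on the "several power plants" claim: the U.S. generation total, the plant size, the capacity factor, and the assumption that the savings apply to the full data center load are illustrative figures, not data from the article.

```python
# Rough sanity check on the "several power plants" comparison above.
# All constants here are assumptions chosen for illustration.

US_TOTAL_TWH = 4_000           # approximate annual U.S. electricity use (assumption)
DATA_CENTER_SHARE = 0.044      # ~4.4% of U.S. electricity, per the figure cited earlier
SAVINGS_RANGE = (0.30, 0.40)   # 30-40% reduction in energy per calculation

PLANT_GW = 1.0                 # a large baseload plant (assumption)
CAPACITY_FACTOR = 0.9
plant_twh = PLANT_GW * 8_760 * CAPACITY_FACTOR / 1_000   # ~7.9 TWh per year

data_center_twh = US_TOTAL_TWH * DATA_CENTER_SHARE
for frac in SAVINGS_RANGE:
    saved_twh = data_center_twh * frac
    print(f"{frac:.0%} savings: {saved_twh:.0f} TWh/year, "
          f"about {saved_twh / plant_twh:.1f} large plants")
# Even if only part of the load sees these gains, the savings are
# on the order of several gigawatt-scale plants.
```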
The stakes extend beyond local rate cases. Cloud infrastructure underpins banking, logistics, and advanced manufacturing. If electricity supply becomes a constraint, the U.S. risks losing competitiveness to regions that can offer reliable, affordable power. That is why national coordination matters. Leaving states and municipalities to negotiate piecemeal agreements with tech giants is not a sustainable strategy.
AI is not going away, and neither are data centers. The question is whether utilities, regulators, and technology firms can strike a bargain that supports growth without overburdening customers or compromising grid reliability. That means planning for diverse supply, sharing costs fairly, and demanding efficiency and reuse.
The alternative is unmanaged expansion, higher bills for households, and a patchwork of subsidies that fail to deliver long-term value. Utilities and regulators have an opportunity — and a responsibility — to avoid that outcome.