
From code to current: How to keep AI data centers in check for a sustainable grid


Manav Mittal is a senior project manager at Consumers Energy.

As artificial intelligence continues to transform industries, from healthcare and finance to autonomous vehicles and smart cities, the demand for data processing is skyrocketing. AI-driven data centers, which power the algorithms behind these innovations, are the backbone of this revolution. However, with the expansion of AI capabilities comes a growing concern: how will these energy-hungry facilities affect our already strained power grids?

Take Meta’s $10 billion AI-optimized data center in Louisiana, for example. This enormous facility, designed to handle the massive computational load required by AI, will demand a staggering amount of electricity. As AI becomes more integrated into our everyday lives, the strain on the power grid is only set to increase. But here’s the thing — AI doesn’t have to be a burden on the grid. With thoughtful strategies and a proactive approach, we can minimize the environmental and infrastructural costs of these data centers. The question isn’t whether AI will disrupt the grid, but how we can make it work for us without sacrificing sustainability.

Energy efficiency: The first line of defense

It’s easy to think of data centers as undifferentiated consumers of energy, but they’re not all created equal. There’s plenty of room for improvement when it comes to energy efficiency. The first step in minimizing AI data center impacts on the grid is simply making these centers run more efficiently.

Cooling systems alone account for a large share of energy consumption in data centers, often cited at roughly 30 to 40 percent of total facility load. Traditionally, large HVAC systems keep servers at optimal temperatures, but these systems are often inefficient. Thankfully, innovative cooling methods — like liquid cooling and even immersion cooling — are beginning to replace outdated systems. These newer technologies can significantly reduce energy usage, which is crucial when every watt counts.
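To put numbers on that intuition, here is a minimal sketch, using purely illustrative figures, of how cooling overhead shows up in power usage effectiveness (PUE), the ratio of total facility power to IT power; cutting cooling load moves the ratio toward the ideal of 1.0.

```python
# Illustrative PUE comparison: all loads are assumptions, not measurements.
# PUE = total facility power / IT equipment power (ideal is 1.0).

def pue(it_load_kw: float, cooling_kw: float, other_overhead_kw: float) -> float:
    """Power usage effectiveness for a data center."""
    return (it_load_kw + cooling_kw + other_overhead_kw) / it_load_kw

it_load = 10_000.0  # kW of servers, storage, and networking (assumed)

air_cooled = pue(it_load, cooling_kw=4_000.0, other_overhead_kw=1_000.0)     # legacy HVAC (assumed)
liquid_cooled = pue(it_load, cooling_kw=1_200.0, other_overhead_kw=1_000.0)  # liquid/immersion (assumed)

print(f"Air-cooled PUE:    {air_cooled:.2f}")     # ~1.50
print(f"Liquid-cooled PUE: {liquid_cooled:.2f}")  # ~1.22

# Annual energy avoided by the cooling upgrade alone, at the assumed loads:
hours_per_year = 8_760
saved_mwh = (4_000.0 - 1_200.0) * hours_per_year / 1_000
print(f"Cooling energy saved: {saved_mwh:,.0f} MWh/year")
```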

And it’s not just cooling that needs to be rethought. Advances in hardware, such as more energy-efficient processors and GPUs, are improving the performance-to-energy ratio of data centers. These small innovations might not make the headlines, but their cumulative impact on energy consumption could be profound. Data centers should be incentivized to adopt these energy-saving technologies, not only to reduce their operating costs but to lessen their impact on the grid.

Renewable energy: A cleaner, greener future

Let’s be clear — data centers don’t have to rely on fossil fuels to power their operations. In fact, many major tech companies, including Meta, have made ambitious commitments to run their data centers on 100% renewable energy. This shift to clean energy is one of the most impactful ways to reduce the strain on the grid. If AI data centers can be powered by wind, solar and other renewable sources, we’re looking at a win-win situation: energy demand is met without contributing to greenhouse gas emissions.

However, making this transition requires more than just goodwill — it requires collaboration with renewable energy developers and utilities. Power purchase agreements are a vital tool here. These long-term contracts allow data centers to secure renewable energy directly from producers, ensuring that their electricity needs are met without disrupting the grid. The beauty of this approach is that it supports the broader goal of transitioning to a clean energy economy, all while minimizing the impact on local power infrastructure.

But let’s not stop there. Data centers should also consider on-site renewable energy generation. Installing solar panels or wind turbines at their facilities can reduce their reliance on the grid during peak demand periods. In fact, on-site energy production, combined with energy storage, could allow data centers to be largely self-sufficient, alleviating much of the pressure on local grids.

Modernizing the grid: Building for the future

While improving the energy efficiency of data centers and shifting to renewable energy are essential steps, we can’t ignore the infrastructure itself. The grid, as it exists today, was not built to handle the enormous, and sometimes unpredictable, energy demands of AI data centers. As data centers become larger and more prevalent, the grid needs to evolve to accommodate them.

Here’s where smart grids come into play. These modernized grids use sensors and real-time data to better manage energy distribution. With a smart grid, utilities can dynamically adjust power flow based on demand, ensuring that energy is directed where it’s needed most. By integrating AI into grid management, utilities can anticipate and respond to shifts in energy demand caused by data centers, ensuring a more stable grid overall.
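As a rough, hypothetical illustration of that anticipate-and-respond loop, the toy sketch below forecasts near-term demand from recent readings and flags when flexible loads such as data centers might be asked to curtail. The capacity figure, threshold and demand history are assumptions, and real grid operations rely on far richer forecasting models.

```python
# Toy demand-anticipation loop: a stand-in for the forecasting a real
# smart grid would use. All figures and thresholds are assumptions.
from collections import deque

CAPACITY_MW = 1_000.0    # assumed regional supply limit
CURTAIL_THRESHOLD = 0.9  # ask flexible loads to shed above 90% of capacity

def forecast_next_hour(history: deque) -> float:
    """Naive forecast: average of the last few hours of observed demand."""
    return sum(history) / len(history)

history = deque([860.0, 890.0, 915.0, 940.0], maxlen=4)  # recent demand in MW (assumed)

predicted = forecast_next_hour(history)
if predicted > CAPACITY_MW * CURTAIL_THRESHOLD:
    print(f"Forecast {predicted:.0f} MW: signal data centers to defer flexible workloads")
else:
    print(f"Forecast {predicted:.0f} MW: normal operation")
```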

In addition to smart grids, we need to consider energy storage. Renewable energy is intermittent by nature — solar panels don’t generate electricity at night, and wind turbines are silent on calm days. By incorporating energy storage systems, such as large-scale batteries, data centers can store excess energy generated during off-peak hours and use it when demand is high. This will help to smooth out the fluctuations in energy supply and ensure that data centers are less reliant on the grid during peak times.
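Here is a minimal sketch of that charge-off-peak, discharge-at-peak logic, with an assumed site load, battery size and peak window; a real system would also model round-trip losses, degradation limits and price signals.

```python
# Simplified battery dispatch: charge during off-peak hours, discharge at peak.
# Capacity, rates, loads, and the peak window are illustrative assumptions.

BATTERY_CAPACITY_MWH = 40.0
MAX_RATE_MW = 10.0
PEAK_HOURS = range(16, 21)  # 4 pm to 9 pm, assumed peak window

def dispatch(hour: int, site_load_mw: float, state_of_charge_mwh: float):
    """Return (grid_draw_mw, new_state_of_charge_mwh) for one hour."""
    if hour in PEAK_HOURS and state_of_charge_mwh > 0:
        discharge = min(MAX_RATE_MW, site_load_mw, state_of_charge_mwh)
        return site_load_mw - discharge, state_of_charge_mwh - discharge
    if hour not in PEAK_HOURS and state_of_charge_mwh < BATTERY_CAPACITY_MWH:
        charge = min(MAX_RATE_MW, BATTERY_CAPACITY_MWH - state_of_charge_mwh)
        return site_load_mw + charge, state_of_charge_mwh + charge
    return site_load_mw, state_of_charge_mwh

soc = 0.0
for hour, load in enumerate([50.0] * 24):  # flat 50 MW site load, assumed
    grid_draw, soc = dispatch(hour, load, soc)
    print(f"{hour:02d}:00  grid draw {grid_draw:5.1f} MW  battery {soc:5.1f} MWh")
```

At the assumed loads, the battery trims the facility's grid draw during the evening peak and shifts that energy purchase into off-peak hours, which is exactly the smoothing effect described above.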

Demand response: A shared responsibility

But why stop there? AI-driven facilities also have a responsibility to participate in demand response programs. These programs incentivize businesses and consumers to reduce their energy usage during periods of peak demand, which helps prevent grid overloads. Data centers are prime candidates for demand response because they can adjust their operations — such as shifting workloads to off-peak hours — without negatively impacting performance. By participating in these programs, AI data centers can significantly ease pressure on the grid, especially during high-demand periods, like hot summer afternoons when air conditioning use is at its peak.
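As one hedged illustration of what "shifting workloads to off-peak hours" can look like in practice, the sketch below uses invented job names and an assumed demand-response window to defer time-flexible batch work while letting latency-sensitive services run as usual.

```python
# Toy demand-response scheduler: defer deferrable jobs during a grid peak event.
# The job list, peak window, and deferral policy are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    deferrable: bool  # can this workload tolerate a delay of a few hours?

PEAK_EVENT = range(14, 19)  # 2 pm to 7 pm, assumed demand-response window

def schedule(jobs: list[Job], current_hour: int) -> None:
    for job in jobs:
        if current_hour in PEAK_EVENT and job.deferrable:
            print(f"Deferring '{job.name}' until after the peak event")
        else:
            print(f"Running '{job.name}' now")

jobs = [
    Job("model-training-batch", deferrable=True),   # hypothetical batch job
    Job("nightly-index-rebuild", deferrable=True),  # hypothetical maintenance job
    Job("inference-api", deferrable=False),         # user-facing, must run
]
schedule(jobs, current_hour=15)
```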

The key here is that grid stability is a shared responsibility. While AI data centers are heavy consumers of electricity, they also have the tools to manage their consumption intelligently. Rather than adding to the grid’s burden, these facilities can be part of the solution. Through demand response, they can reduce their energy use when it’s most needed, helping to balance supply and demand and prevent power outages.

Collaboration: A holistic approach to grid sustainability

It’s clear that minimizing the impact of AI data centers on the power grid isn’t a task for data center operators alone. This challenge requires collaboration among technology companies, utilities, policymakers and local communities. Governments must provide the right incentives to encourage the adoption of clean energy and energy-efficient technologies. At the same time, utility companies must modernize the grid to accommodate the growing demands of AI data centers and other large energy consumers.

We also need to prioritize transparency and dialogue with communities. Local governments and residents should be included in conversations about how AI data centers impact energy infrastructure. Through collaboration, we can ensure that these facilities contribute positively to both the local economy and the environment.

Conclusion: A vision for a sustainable future

The rise of AI presents enormous opportunities for innovation, but it also poses significant challenges, particularly when it comes to energy consumption. AI data centers are indispensable to the future of technology, but they must be built in a way that minimizes their impact on the power grid and the environment.

By focusing on energy efficiency, incorporating renewable energy, modernizing grid infrastructure and participating in demand response programs, we can reduce the strain AI data centers place on the grid. Ultimately, it’s about balancing progress with sustainability. As we move toward a cleaner, smarter and more connected future, we must ensure that the rise of AI doesn’t come at the expense of our planet — or our power systems.
