Why everyone in AI is freaking out about DeepSeek

As of a few days ago, only the nerdiest of nerds (I say this as one) had ever heard of DeepSeek, a Chinese A.I. subsidiary of the equally evocatively named High-Flyer Capital Management, a quantitative analysis (or quant) firm that initially launched in 2015.

Yet within the last few days, it’s been arguably the most discussed company in Silicon Valley. That’s largely thanks to the release of DeepSeek R1, a new large language model that performs “reasoning” similar to OpenAI’s current best-available model o1 — taking multiple seconds or minutes to answer hard questions and solve complex problems as it reflects on its own analysis in a step-by-step, or “chain of thought” fashion.

Not only that, but DeepSeek R1 scored as high as or higher than OpenAI’s o1 on a variety of third-party benchmarks (tests that measure AI performance on questions across various subjects), and was reportedly trained at a fraction of the cost (around $5 million), with far fewer graphics processing units (GPUs), under a strict chip embargo imposed by the U.S., OpenAI’s home turf.

But unlike o1, which is available only to paying ChatGPT subscribers of the Plus tier ($20 per month) and more expensive tiers (such as Pro at $200 per month), DeepSeek R1 was released as a fully open source model, which also explains why it has quickly rocketed up the charts of AI code sharing community Hugging Face’s most downloaded and active models.

Also, because it is fully open source, people have already fine-tuned and trained many variations of the model for different task-specific purposes, such as making it small enough to run on a mobile device or combining it with other open source models. And if you want to use it for development purposes, DeepSeek’s API costs are more than 90% cheaper than the equivalent o1 model from OpenAI.
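For developers, DeepSeek’s API follows the same chat-completions conventions popularized by OpenAI, so switching is largely a matter of pointing a client at a different base URL. The sketch below builds a request payload; the endpoint URL and the “deepseek-reasoner” model name are assumptions based on DeepSeek’s public documentation, and a real DeepSeek API key would be needed to actually send it.

```python
import json
from urllib import request

# Minimal sketch of calling DeepSeek R1 via its OpenAI-compatible API.
# The endpoint URL and "deepseek-reasoner" model name are assumptions
# drawn from DeepSeek's public docs; verify them before relying on this.
API_URL = "https://api.deepseek.com/chat/completions"

def build_r1_request(prompt: str) -> dict:
    """Build a chat-completion payload in the OpenAI-compatible format."""
    return {
        "model": "deepseek-reasoner",  # assumed identifier for the R1 model
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def send(payload: dict, api_key: str) -> dict:
    """POST the payload to the API; requires a real DeepSeek API key."""
    req = request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with request.urlopen(req) as resp:  # network call, not executed here
        return json.load(resp)

payload = build_r1_request("Why is the sky blue?")
print(payload["model"])
```

Because the request shape matches OpenAI’s, existing client libraries can typically be reused by overriding their base URL, which is what makes the large cost difference so easy to act on.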

Most impressively of all, you don’t even need to be a software engineer to use it: DeepSeek has a free website and mobile app even for U.S. users with an R1-powered chatbot interface very similar to OpenAI’s ChatGPT. Except, once again, DeepSeek undercut or “mogged” OpenAI by connecting this powerful reasoning model to web search — something OpenAI hasn’t yet done (web search is only available on the less powerful GPT family of models at present).

An open and shut irony

There’s a pretty delicious, or maybe disconcerting irony to this given OpenAI’s founding goals to democratize AI to the masses. As NVIDIA Senior Research Manager Jim Fan put it on X: “We are living in a timeline where a non-US company is keeping the original mission of OpenAI alive – truly open, frontier research that empowers all. It makes no sense. The most entertaining outcome is the most likely.”

Or as X user @SuspendedRobot put it, referencing reports that DeepSeek appears to have been trained on question-answer outputs and other data generated by ChatGPT: “OpenAI stole from the whole internet to make itself richer, DeepSeek stole from them and give it back to the masses for free I think there is a certain british folktale about this”

But Fan isn’t the only one to sit up and take note of DeepSeek’s success. The open source availability of DeepSeek R1, its high performance, and the fact that it seemingly “came out of nowhere” to challenge the former leader of generative AI have sent shockwaves throughout Silicon Valley and far beyond, based on my conversations with, and reading of, various engineers, thinkers, and leaders. Even if not “everyone” is freaking out about it, as my hyperbolic headline suggests, it’s certainly the talk of the town in tech and business circles.

A message posted to Blind, the app for sharing anonymous gossip in Silicon Valley, has been making the rounds suggesting Meta is in crisis over the success of DeepSeek because of how quickly it surpassed Meta’s own efforts to be the king of open source AI with its Llama models.

‘This changes the whole game’

X user @tphuang wrote compellingly: “DeepSeek has commoditized AI outside of very top-end. Lightbulb moment for me in 1st photo. R1 is so much cheaper than US labor cost that many jobs will get automated away over next 5 yrs,” later noting why DeepSeek’s R1 is more enticing to users than even OpenAI’s o1:

“3 huge issues w/ o1:
1) too slow
2) too expensive
3) lack of control for end user/reliance on OpenAI
R1 solves all of them. A company can buy their own Nvidia GPUs, run these models. Don’t have to worry about additional costs or slow/unresponsive OpenAI servers”

@tphuang also posed a compelling analogy as a question: “Will DeepSeek be to LLM what Android became to OS world?”

Web entrepreneur Arnaud Bertrand didn’t mince words about the startling implications of DeepSeek’s success, either, writing on X: “There’s no overstating how profoundly this changes the whole game. And not only with regards to AI, it’s also a massive indictment of the US’s misguided attempt to stop China’s technological development, without which Deepseek may not have been possible (as the saying goes, necessity is the mother of inventions).”

The censorship issue

However, others have sounded cautionary notes on DeepSeek’s rapid rise, arguing that as a startup operated out of China, it is necessarily subject to that country’s laws and content censorship requirements.

Indeed, my own usage of DeepSeek’s iOS app here in the U.S. found it would not answer questions about Tiananmen Square, the site of the 1989 pro-democracy student protests and uprising that were violently crushed by the Chinese military, killing at least 200 and possibly thousands of people, a crackdown Western media outlets dubbed the “Tiananmen Square Massacre.”

Ben Hylak, a former Apple human interface designer and co-founder of AI product analytics platform Dawn, posted on X how asking about this subject caused DeepSeek R1 to enter a circuitous loop.

As a member of the press myself, I of course take freedom of speech and expression extremely seriously; it is one of the most fundamental causes I champion.

Yet I would be remiss not to note that OpenAI’s models and products including ChatGPT also refuse to answer a whole range of questions about even innocuous content — especially pertaining to human sexuality and erotic/adult, NSFW subject matter.

It’s not an apples-to-apples comparison, of course. And there will be some for whom the resistance to relying on foreign technology makes them skeptical of DeepSeek’s ultimate value and utility. But there’s no denying its performance and low cost.

And at a time when 16.5% of all U.S. goods are imported from China, it’s hard for me to caution against using DeepSeek R1 on the basis of censorship concerns or security risks — especially when the model code is freely available to download, take offline, run on-device in secure environments, and fine-tune at will.

I definitely detect some existential crisis about the “fall of the West” and “rise of China” motivating some of the animated discussion around DeepSeek, however. Others have already connected it to how U.S. users joined the app Xiaohongshu (aka “Little Red Book”) when TikTok was briefly banned in this country, only to be amazed at the quality of life in China depicted in the videos shared there. DeepSeek R1’s arrival occurs in this narrative context — one in which China appears (and by many metrics is clearly) ascendant while the U.S. appears (and by many metrics, also is) in decline.

The first but hardly the last Chinese AI model to shake the world

And it won’t be the last Chinese AI model to threaten the dominance of Silicon Valley giants — even as they, like OpenAI, raise more money than ever for their ambitions to develop artificial general intelligence (AGI): programs that outperform humans at most economically valuable work.

Just yesterday, another Chinese model from TikTok parent company Bytedance — called Doubao-1.5-pro — was released with performance matching OpenAI’s non-reasoning GPT-4o model on third-party benchmarks, but again, at 1/50th the cost.

Chinese models have gotten so good, so fast, even those outside the tech industry are taking note: The Economist magazine just ran a piece on DeepSeek’s success and that of other Chinese AI efforts, and political commentator Matt Bruenig posted on X that: “I have been extensively using Gemini, ChatGPT, and Claude for NLRB document summary for nearly a year. Deepseek is better than all of them at it. The chatbot version of it is free. Price to use it’s API is 99.5% below the price of OpenAI’s API. [shrug emoji]”
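Bruenig’s “99.5% below” figure is simply a price ratio. With hypothetical per-million-token prices plugged in, the arithmetic looks like this (the dollar amounts below are illustrative assumptions, not quoted rates from either company):

```python
def pct_cheaper(incumbent_price: float, challenger_price: float) -> float:
    """How much cheaper the challenger is than the incumbent, in percent."""
    return (incumbent_price - challenger_price) / incumbent_price * 100

# Hypothetical prices per million output tokens: $60.00 vs. $0.30.
savings = pct_cheaper(60.00, 0.30)
print(round(savings, 1))  # 99.5
```

A 200x price gap, as here, works out to a 99.5% discount, which is why the framing of “percent below” can understate just how large the multiple is.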

How does OpenAI respond?

Little wonder OpenAI co-founder and CEO Sam Altman said today that the company would bring its yet-to-be-released second reasoning model family, o3, to ChatGPT even for free users. OpenAI still appears to be carving its own path with more proprietary, advanced models — setting the industry standard.

But the question becomes: with DeepSeek, ByteDance, and other Chinese AI companies nipping at its heels, how long can OpenAI remain in the lead at making and releasing new cutting-edge AI models? And if and when it falls, how hard and how fast will its decline be?

OpenAI does have another historical precedent going for it, though. If DeepSeek and other Chinese AI models do indeed become to LLMs what Google’s open source Android became to mobile operating systems — taking the lion’s share of the market for a while — consider how Apple’s iPhone, with its locked-down, proprietary, all-in-house approach, managed to carve off the high end of the market and steadily expand downward from there, especially in the U.S., to the point that it now owns nearly 60% of the domestic smartphone market.

Still, for all those spending big bucks to use AI models from leading labs, DeepSeek shows the same capabilities may be available for much cheaper and with much greater control. And in an enterprise setting, that may be enough to win the ballgame.

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

Want to transform networking? Empower the missing users

Nokia seems to have the same goal, but it is taking a different route to reach it. Rather than trying to assemble the ingredients of the kind of IoT needed for empowerment, they start with a recipe—the digital twin. Digital twins are computer models of real-world systems, designed to assemble

Read More »

Chevron VP Confirms Job Cuts

In a statement sent to Rigzone by the Chevron team, Chevron Corporation Vice Chairman Mark Nelson confirmed that the company expects to cut up to 20 percent of its workforce. “Chevron is taking action to simplify our organizational structure, execute faster and more effectively, and position the company for stronger

Read More »

Palo Alto Networks firewall bug being exploited by threat actors: Report

The issue doesn’t affect the company’s Cloud NGFW or Prisma Access software. Greynoise said exploitation began around Tuesday of this week. Assetnote published research about the hole on Wednesday. Palo Alto Networks published its advisory the same day. ‘Weird path-processing behavior’ The vulnerability, Assetnote said, is a “weird path-processing behavior”

Read More »

National Gas appoints Murphy to deliver major St Fergus project

National Gas has appointed engineering and construction company Murphy to deliver a critical infrastructure project at the St Fergus Gas Terminal on the Northeast coast of Scotland. The Medium Combustion Plant Directive (MCPD) Compressor Project will help keep the facility compliant with emissions regulations. Under the multi-million pound contract, Murphy will design, construct and commission three low-emission compressor units. The project includes all associated civil, electrical, control and instrumentation, process, mechanical, and piping work, along with OEM rotating equipment. Murphy will work alongside its engineering partner Worley as part of the project, with completion expected in 2029. Murphy operations director Andy Harding said: “As one of National Gas’s priority investment sites to achieve MCPD emissions targets by 2030, we are proud to lead the delivery of this essential project at St Fergus. “With a proven track record in delivering compressor projects safely and successfully, Murphy, supported by our trusted engineering partner Worley, will provide resource certainty, safety excellence and on-time delivery. Our in-house team will bring unparalleled compressor expertise, enabling us to deliver this critical infrastructure project and meet all stakeholder requirements while maintaining a strong focus on cost efficiency.” The St Fergus Gas Terminal provides access to gas from the UK Continental Shelf (UKCS) and Norway, which it feeds into the National Transmission System (NTS). It regularly supplies between 25% and 50% of the country’s natural gas. The main terminal also receives treated gas from three sub-terminals, currently owned by Shell, Ancala and North Sea Midstream Partners (NSMP). Murphy has previously delivered a diverse range of complex engineering and construction projects for National Gas at St Fergus. These include asset health interventions, actuator replacements, aftercooler upgrades and cathodic protection enhancements. 
National Gas project director Darren Christie said the work will contribute “to our business’s wider 2030 emissions-reduction commitments.” “Our

Read More »

Texas GulfLink Oil Export Project Gets Federal Approval

Sentinel Midstream has received approval from the Department of Transportation for a project to build a crude oil export facility with loading rates of up to one million barrels a day off the coast of Brazoria County, Texas. The DOT’s Maritime Administration (MARAD) said the approval for Texas GulfLink supports President Donald Trump’s “Unleashing American Energy” Executive Order. “With this approval, we are increasing our energy revenue and unlocking our vast oil resources—not just for domestic security, but to dominate the global market”, Transport Secretary Sean P. Duffy said in an online statement. “This plan opens the floodgates for American oil exports, putting our producers in the driver’s seat and ensuring that the world looks to the United States—not foreign adversaries—for energy supply”, Duffy added. “By expanding production and giving American companies the ability to compete on the world stage, we are advancing American energy security, bringing more money into our country, and driving down costs for consumers. This plan is a declaration that American energy will fuel not just our own economy, but the global market—on our terms”. Dallas, Texas-based Sentinel Midstream said in a statement Monday, “Texas GulfLink will immediately turn its attention to satisfying the license conditions and is eager to work with the U.S. Maritime Administration, the U.S. Coast Guard, and the Trump administration, to bring this transformative project to fruition”. The deepwater port terminal will rise about 26.6 nautical miles off Brazoria’s side of the Texan Gulf Coast in a water depth of about 104 feet. The project involves a deepwater port, one fixed offshore platform, about 45 statute miles of pipeline infrastructure and a booster station, according to information on MARAD’s website. The design would allow up to two very large crude carriers (VLCC) to moor at single-point mooring (SPM) buoys and connect with

Read More »

Imperial Names New CEO

Imperial Oil Ltd. will have a new chair, president and chief executive from April 1, 2025, as the board approved the appointment of John Whelan to the position. The company said in a media release that Whelan will replace Brad Carson, the current chairman, president, and chief executive, who has declared plans to retire from Imperial after 42 years of service.   “On behalf of the Imperial board of directors, I would like to thank Brad Corson for his incredible leadership and dedication over the past five years”, David Cornhill, lead director, said. “Brad steered the company through the challenges of the global pandemic, with the organization emerging to deliver the strongest financial years in company history. “During his tenure, Imperial achieved repeated records for upstream production and exceptional downstream performance, delivering significant shareholder value through unprecedented share price appreciation and cash returns through record share repurchases and dividend growth. “Importantly, Brad has also positioned the company and its employees for future competitive success with strategic projects, including growth projects at Kearl and Cold Lake, the Strathcona renewable diesel facility and Low Carbon Solutions business, and as a founding member of the Pathways Alliance initiative to reduce emissions from oil sands operations”. “John brings extensive experience at both Imperial and on global ExxonMobil portfolios to successfully deliver exceptional operational performance and enhanced competitiveness, which will build on this strong momentum and continue to grow shareholder value going forward, and we welcome him back to Imperial”, Cornhill added. Whelan, originally from St. John’s, Newfoundland and Labrador, began his career with ExxonMobil in 1988 in Alberta. He has held various roles across Canada, the U.S., and Norway, Imperial said. 
After returning to Canada in 2013, he served as Imperial’s vice president of upstream production and became senior vice president of upstream in

Read More »

Aker Solutions Logs 42 Percent YoY Growth in Q4 Revenue

Aker Solutions ASA has reported NOK 15.71 billion ($1.41 billion) in revenue for the fourth quarter (Q4) of 2024, up 41.97 percent from the same three-month period last year, attributing the increase to high activity across segments and locations. Revenue excluding special items totaled NOK 15.708 billion for Q4 2024, up 42.53 percent year-on-year (YoY). The adjustments were mainly acquisition costs, restructuring costs, impairments, costs linked to the impact of currency derivatives not qualifying for hedge accounting and changes in the value of SLB shares, Aker Solutions said in its quarterly report. Aker Solutions, SLB and Subsea7 are co-venturers in subsea supplier One Subsea, which contributed NOK 166 million to Aker Solutions’ Q4 net profit. Net profit excluding special items rose 117.4 percent year-over-year to NOK 837 million, or NOK 1.7 per share. Earnings before interest, taxes, depreciation and amortization (EBITDA) climbed 99.78 percent year-on-year to NOK 1.19 billion. EBITDA excluding special items increased 5.6 percent year-on-year to NOK 1.2 billion. “This was driven by continued strong performance in our Life Cycle segment, while negatively affected by additional losses in the legacy renewables projects, which will be delivered in 2025”, the energy engineering company said in its quarterly report. EBIT increased 209.85 percent year-on-year to NOK 849 million. Excluding special items, EBIT increased 125.95 percent to NOK 888 million. Net cash flow from operating activities was NOK 1.18 billion, up 99.86 percent year-on-year. Oslo-listed Aker Solutions distributed NOK 10 billion in extraordinary dividends during Q4. The board is proposing NOK 3.3 dividend per share for 2024. “The dividend payment represents approximately 50 percent of net income excl. special items, in line with the ordinary dividend policy”, the company said. Aker Solutions ended 2024 with NOK 17.92 billion in current assets including NOK 2.86 billion in cash and cash equivalents. 
Its

Read More »

Petro-Victory to Acquire Oil Assets in Brazil’s Potiguar Basin

Petro-Victory Energy Corp. is pushing through with the acquisition of 13 oil fields in Rio Grande do Norte, located in the Potiguar Basin of Brazil, from Brava Energia S.A. Petro-Victory signed a sale purchase agreement, with a 50/50 partnership with Azevedo & Travassos Petroleo (ATP), for the acquisition of Polo Porto Carão and Polo Barrinha from Brava Energia subsidiaries 3R RNCE S.A. and 3R Potiguar S.A. The Calgary, Alberta-based company signed an exclusivity agreement with Brava Energia S.A. for the assets in December 2024. The total acquisition value for the assets is $15 million, net $7.5 million for Petro-Victory, it said in a news release. The payment will be made in four tranches plus a gross overriding royalty, of which Petro-Victory’s contribution will be pro-rated at 50 percent working interest. The transaction is expected to close in the second half of 2025, subject to customary closing conditions and regulatory approvals from the National Agency of Petroleum in Brazil, the company said. The 13 oil fields, with fully operational production facilities comprising 38,301 acres, are “strategically located adjacent to Petro-Victory’s existing assets,” it said. The assets have current production of 250 barrels of oil per day, with a high-impact work program underway to increase oil production. Brazil’s National Agency of Petroleum has reported a volume of oil in place of 125 million barrels, with a recovery factor of 13.3 percent, according to the release. “The proximity of the new oil fields to our existing assets enables us to streamline logistics and share services such as maintenance, transportation, administrative support, and resource allocation for personnel, equipment, and technologies. This shared infrastructure significantly reduces overall operational costs,” Petro-Victory said. The Porto Carão Cluster is located in the Potiguar onshore basin, in the state of Rio Grande do Norte, near the municipality of Guamaré. It

Read More »

North America Drops Rigs Again WoW

North America dropped two rigs week on week, according to Baker Hughes’ latest North America rotary rig count, which was released on February 14. Although the U.S. added two rigs week on week, Canada dropped four rigs during the same period, taking the total North America rig count down to 833, comprising 588 rigs from the U.S. and 245 rigs from Canada, the count outlined. Of the total U.S. rig count of 588, 572 rigs are categorized as land rigs, 14 are categorized as offshore rigs, and two are categorized as inland water rigs. The total U.S. rig count is made up of 481 oil rigs, 101 gas rigs, and six miscellaneous rigs, according to the count, which revealed that the U.S. total comprises 524 horizontal rigs, 51 directional rigs, and 13 vertical rigs. Week on week, the U.S. land rig count increased by two, and the country’s offshore and inland water rig counts remained unchanged, the count highlighted. The U.S. oil and gas rig counts each increased by one week on week, and the country’s miscellaneous rig count remained unchanged during the same timeframe, the count showed. Baker Hughes’ count revealed that the U.S. horizontal and directional rig counts each increased by one week on week, while the country’s vertical rig count remained unchanged during the period. A major state variances subcategory included in the rig count showed that, week on week, Texas added two rigs, and Oklahoma and Utah each added one rig. Louisiana and North Dakota each dropped one rig week on week, according to the count. A major basin variances subcategory included in Baker Hughes’ rig count showed that the Granite Wash and Permian basins each added one rig and the Williston basin dropped one rig, week on week. Canada’s total rig count of 245 is

Read More »

Data center spending to top $1 trillion by 2029 as AI transforms infrastructure

His projections account for recent advances in AI and data center efficiency, he says. For example, the open-source AI model from Chinese company DeepSeek seems to have shown that an LLM can produce very high-quality results at a very low cost with some clever architectural changes to how the models work. These improvements are likely to be quickly replicated by other AI companies. “A lot of these companies are trying to push out more efficient models,” says Fung. “There’s a lot of effort to reduce costs and to make it more efficient.” In addition, hyperscalers are designing and building their own chips, optimized for their AI workloads. Just the accelerator market alone is projected to reach $392 billion by 2029, Dell’Oro predicts. By that time, custom accelerators will outpace commercially available accelerators such as GPUs. The deployment of dedicated AI servers also has an impact on networking, power and cooling. As a result, spending on data center physical infrastructure (DCPI) will also increase, though at a more moderate pace, growing by 14% annually to $61 billion in 2029.  “DCPI deployments are a prerequisite to support AI workloads,” says Tam Dell’Oro, founder of Dell’Oro Group, in the report. The research firm raised its outlook in this area due to the fact that actual 2024 results exceeded its expectations, and demand is spreading from tier one to tier two cloud service providers. In addition, governments and tier one telecom operators are getting involved in data center expansion, making it a long-term trend.

Read More »

The Future of Property Values and Power in Virginia’s Loudoun County and ‘Data Center Alley’

Loudoun County’s FY 2026 Proposed Budget Is Released This week, Virginia’s Loudoun County released its FY 2026 Proposed Budget. The document notes how data centers are a major driver of revenue growth in Loudoun County, contributing significantly to both personal and real property tax revenues. As noted above, data centers generate almost 50% of Loudoun County property tax revenues. Importantly, Loudoun County has now implemented measures such as a Revenue Stabilization Fund (RSF) to manage the risks associated with this revenue dependency. The FY 2026 budget reflects the strong growth in data center-related revenue, allowing for tax rate reductions while still funding critical services and infrastructure projects. But the county is mindful of the potential volatility in data center revenue and is planning for long-term fiscal sustainability. The FY 2026 Proposed Budget notes how Loudoun County’s revenue from personal property taxes, particularly from data centers, has grown significantly. From FY 2013 to FY 2026, revenue from this source has increased from $60 million to over $800 million. Additionally, the county said its FY 2026 Proposed Budget benefits from $150 million in new revenue from the personal property tax portfolio, with $133 million generated specifically from computer equipment (primarily data centers). The county said data centers have also significantly impacted the real property tax portfolio. In Tax Year (TY) 2025, 73% of the county’s commercial portfolio is composed of data centers. The county said its overall commercial portfolio experienced a 50% increase in value between TY 2024 and TY 2025, largely driven by the appreciation of data center properties. RSF Meets Positive Economic Outlook The Loudoun County Board of Supervisors created the aformentioned Revenue Stabilization Fund (RSF) to manage the risks associated with the county’s reliance on data center-related revenue. 
The RSF targets 10% of data center-related real and personal property tax

Read More »

Deep Diving on DeepSeek: AI Disruption and the Future of Liquid Cooling

We know that the data center industry is currently undergoing a period of rapid transformation, driven by the increasing demands of artificial intelligence (AI) workloads and evolving cooling technologies. And it appears that the recent emergence of DeepSeek, a Chinese AI startup, alongside supply chain issues for NVIDIA’s next-generation GB200 AI chips, may be prompting data center operators to reconsider their cooling strategies. Angela Taylor, Chief of Staff at LiquidStack, provided insights to Data Center Frontier on these developments, outlining potential shifts in the industry and the future of liquid cooling adoption. DeepSeek’s Market Entry and Supply Chain Disruptions Taylor told DCF, “DeepSeek’s entry into the market, combined with NVIDIA’s GB200 supply chain delays, is giving data center operators a lot to think about.” At issue here is how DeepSeek’s R1 chatbot came out of the box positioned an energy-efficient AI model that reportedly requires significantly less power than many of its competitors. This development raises questions about whether current data center cooling infrastructures are adequate, particularly as AI workloads become more specialized and diverse. At the same time, NVIDIA’s highly anticipated GB200 NVL72 AI servers, designed to handle next-generation AI workloads, are reportedly facing supply chain bottlenecks. Advanced design requirements, particularly for high-bandwidth memory (HBM) and power-efficient cooling systems, have delayed shipments, with peak availability now expected between Q2 and Q3 of 2025.  This combination of a new AI player and delayed hardware supply has created uncertainty, compelling data center operators to reconsider their near-term cooling infrastructure investments. A Temporary Slowdown in AI Data Center Retrofits? 
Taylor also observed, “We may see a short-term slowdown in AI data center retrofits as operators assess whether air cooling can now meet their needs.” The efficiency of DeepSeek’s AI models suggests that some AI workloads may require less power and generate less heat, making air

Read More »

Georgia Follows Ohio’s Lead in Moving Energy Costs to Data Centers

The rule also mandates that any new contracts between Georgia Power and large-load customers exceeding 100 MW be submitted to the PSC for review. This provision ensures regulatory oversight and transparency in agreements that could significantly impact the state’s power grid and ratepayers. Commissioner Lauren “Bubba” McDonald points out that this is one of a number of actions that the PSC is planning to protect ratepayers, and that the PSC’s 2025 Integrated Resource Plan will further address data center power usage. Keeping Ahead of Anticipated Energy Demand This regulatory change reflects Georgia’s proactive approach to managing the increasing energy demands associated with the state’s growing data center industry, aiming to balance economic development with the interests of all electricity consumers. Georgia Power has been trying very hard to develop generation capacity to meet it’s expected usage pattern, but the demand is increasing at an incredible rate. In their projection for increased energy demand, the 2022 number was 400 MW by 2030. A year later, in their 2023 Integrated Resource Plan, the anticipated increase had grown to 6600 MW by 2030. Georgia Power recently brought online two new nuclear reactors at the Vogtle Electric Generating Plant, significantly increasing its nuclear generation capacity giving the four unit power generation station a capacity of over 4.5 GW. This development has contributed to a shift in Georgia’s energy mix, with clean energy sources surpassing fossil fuels for the first time. But despite the commitment to nuclear power, the company is also in the process of developing three new power plants at the Yates Steam Generating Plant. According to the AJC newspaper, regulators had approved the construction of fossil fuel power, approving natural gas and oil-fired power plants. Designed as “peaker” plants to come online at times of increased the demand, the power plants will

Read More »

Chevron, GE Vernova, Engine No.1 Join Race to Co-Locate Natural Gas Plants for U.S. Data Centers

Other Recent Natural Gas Developments for Data Centers

As of February 2025, the data center industry has seen a host of significant developments in natural gas plant technologies and strategic partnerships aimed at meeting the escalating energy demands driven by AI and cloud computing. In addition to the partnership between Chevron, Engine No. 1, and GE Vernova, other consequential initiatives include the following:

ExxonMobil’s Entry into the Electricity Market

ExxonMobil has announced plans to build natural gas-fired power plants to supply electricity to AI data centers. The company intends to leverage carbon capture and storage technology to minimize emissions, positioning its natural gas solutions as competitive alternatives to nuclear power. This announcement in particular seemed to herald a notable shift in the industry, as fossil fuel companies venture into the electricity market to meet the rising demand for low-carbon power.

Powerconnex Inc.’s Natural Gas Plant in Ohio

An Ohio data center in New Albany, developed by Powerconnex Inc., plans to construct a natural gas-fired power plant on-site to meet its electricity needs amid the AI industry’s increasing energy demands. The New Albany Energy Center is expected to generate up to 120 megawatts (MW) of electricity, with construction beginning in Q4 2025 and operations commencing by Q1 2026.

Crusoe and Kalina Distributed Power Partnership in Alberta, Canada

AI data center developer Crusoe has entered into a multi-year framework agreement with Kalina Distributed Power to develop multiple co-located AI data centers powered by natural gas power plants in Alberta, Canada. Crusoe will own and operate the data centers, purchasing power from three Kalina-owned 170 MW gas-fired power plants through 15-year Power Purchase Agreements (PPAs).

Entergy’s Natural Gas Power Plants for Data Centers

Entergy plans to deploy three new natural gas power plants, providing over 2,200 MW of capacity over 15 years, pending approval
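Tallying the capacity figures quoted in these announcements gives a rough sense of the scale involved. This is an illustrative back-of-the-envelope sketch, not a figure from any of the companies:

```python
# Capacity figures quoted in the developments above (all MW).
kalina_plants = 3
kalina_mw_each = 170
crusoe_total_mw = kalina_plants * kalina_mw_each  # Crusoe's PPAs: 510 MW

new_albany_mw = 120   # Powerconnex New Albany Energy Center
entergy_mw = 2_200    # Entergy's three planned plants, combined ("over 2,200 MW")

total_mw = crusoe_total_mw + new_albany_mw + entergy_mw
print(f"Combined new gas capacity cited above: at least {total_mw} MW")
```

That is well over 2.8 GW of gas-fired capacity from just these three projects, before counting the Chevron/GE Vernova/Engine No. 1 effort or ExxonMobil’s plans.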

Read More »

Podcast: Phill Lawson-Shanks, Chief Innovation Officer, Aligned Data Centers

In the latest episode of the Data Center Frontier Show podcast, DCF Editor-in-Chief Matt Vincent sits down with Phill Lawson-Shanks, Chief Innovation Officer at Aligned Data Centers, for a wide-ranging discussion that touches on some of the most pressing trends and challenges shaping the future of the data center industry. From the role of nuclear energy and natural gas in addressing the sector’s growing power demands to the rapid expansion of Aligned’s operations in Latin America (LATAM), Lawson-Shanks provides deep insight into where the industry is headed.

Scaling Sustainability: Tracking Embodied Carbon and Scope 3 Emissions

A key focus of the conversation is sustainability, where Aligned continues to push boundaries in carbon tracking and energy efficiency. Lawson-Shanks highlights the company’s commitment to monitoring embodied carbon, an effort that began four years ago and has since positioned Aligned as an industry leader. “We co-authored and helped found the Climate Accord with iMasons—taking sustainability to a whole new level,” he notes, emphasizing how Aligned is now extending its carbon traceability standards to ODATA’s facilities in LATAM. By implementing lifecycle assessments (LCAs) and tracking Scope 3 emissions, Aligned aims to provide clients with a detailed breakdown of their environmental impact. “The North American market is still behind in lifecycle assessments and environmental product declarations. Where gaps exist, we look for adjacencies and highlight them—helping move the industry forward,” Lawson-Shanks explains.

The Nuclear Moment: A Game-Changer for Data Center Power

One of the most compelling segments of the discussion revolves around the growing interest in nuclear energy, particularly small modular reactors (SMRs) and microreactors, as a viable long-term power solution for data centers. Lawson-Shanks describes the recent industry buzz surrounding Oklo’s announcement of a 12-gigawatt deployment with Switch as a significant milestone, calling the move “inevitable.” “There are dozens of nuclear

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one ramping up its investments in AI-enabled data centers. Rival cloud service providers are all investing in upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs).

In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are far higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads.

Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.
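The figures quoted above can be put side by side with simple arithmetic. This is an illustrative comparison using only the numbers cited in the report and claims above:

```python
# Capex figures quoted above (all US$ billions).
capex_2023_combined = 110     # Bloomberg Intelligence: six companies, 2023
capex_2025_combined = 200     # Bloomberg Intelligence estimate for 2025

msft_bloomberg_cy2025 = 62.4  # Bloomberg estimate, calendar 2025
msft_claim_fy2025 = 80        # Smith's figure, fiscal year to June 30, 2025
msft_2020 = 17.6              # Microsoft's 2020 capital expenditure

print(f"Combined capex growth, 2023 to 2025: "
      f"{capex_2025_combined / capex_2023_combined:.2f}x")
print(f"Microsoft FY2025 claim vs. 2020 capex: "
      f"{msft_claim_fy2025 / msft_2020:.1f}x")
```

Even using the lower Bloomberg estimate, Microsoft’s AI spending would be more than triple its 2020 capex; the $80 billion claim is roughly four and a half times that baseline.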

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas, and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech.

The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.)

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as the frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for other companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models using these two techniques and more.

The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model, because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see whether knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »