Deep Research by OpenAI: A Practical Test of AI-Powered Literature Review


“Conduct a comprehensive literature review on the state-of-the-art in Machine Learning and energy consumption. […]”

With this prompt, I tested the new Deep Research function, which has been integrated into the OpenAI o3 reasoning model since the end of February — and conducted a state-of-the-art literature review within 6 minutes.

This function goes beyond a normal web search (for example, with ChatGPT 4o): The research query is broken down & structured, the Internet is searched for information, which is then evaluated, and finally, a structured, comprehensive report is created.

Let’s take a closer look at this.

Table of Contents
1. What is Deep Research from OpenAI and what can you do with it?
2. How does Deep Research work?
3. How you can use Deep Research: A practical example
4. Challenges and risks of the Deep Research feature
Final Thoughts
Where can you continue learning?

1. What is Deep Research from OpenAI and what can you do with it?

If you have an OpenAI Plus account (the $20 per month plan), you have access to Deep Research with 10 queries per month. The Pro subscription ($200 per month) gives you extended access to Deep Research plus the research preview of GPT-4.5, with 120 queries per month.

OpenAI promises that we can perform multi-step research using data from the public web.

Duration: 5 to 30 minutes, depending on complexity. 

Previously, such research usually took hours.

It is intended for complex tasks that require a deep search and thoroughness.

What do concrete use cases look like?

  • Conduct a literature review: Conduct a literature review on state-of-the-art machine learning and energy consumption.
  • Market analysis: Create a comparative report on the best marketing automation platforms for companies in 2025 based on current market trends and evaluations.
  • Technology & software development: Investigate programming languages and frameworks for AI application development, with performance and use case analysis.
  • Investment & financial analysis: Conduct research on the impact of AI-powered trading on the financial market based on recent reports and academic studies.
  • Legal research: Conduct an overview of data protection laws in Europe compared to the US, including relevant rulings and recent changes.

2. How does Deep Research work?

Deep Research uses various Deep Learning methods to carry out a systematic and detailed analysis of information. The entire process can be divided into four main phases:

1. Decomposition and structuring of the research question

In the first step, the tool processes the research question using natural language processing (NLP) methods. It identifies the most important key terms, concepts, and sub-questions.

This step ensures that the AI understands the question not only literally, but also in terms of content.
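
To make this phase more concrete, here is a minimal sketch of what such a decomposition step could look like in principle, using the public OpenAI Python SDK. The model name, the instruction, and the sample question are my own assumptions for illustration; OpenAI has not published how Deep Research actually implements this step.

```python
# Hypothetical sketch: breaking a research question into sub-questions with an LLM.
# This is NOT OpenAI's internal implementation, just an illustration of the idea.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

question = "What is the state of the art on machine learning and energy consumption?"

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system",
         "content": "Split the research question into 3-5 focused sub-questions, "
                    "one per line, and list the key terms for each."},
        {"role": "user", "content": question},
    ],
)

sub_questions = response.choices[0].message.content.splitlines()
print(sub_questions)
```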

2. Obtaining relevant information

Once the tool has structured the research question, it searches specifically for information. Deep Research uses a mixture of internal databases, scientific publications, APIs, and web scraping. These can be open-access databases such as arXiv, PubMed, or Semantic Scholar, but also public websites or news sites such as The Guardian, The New York Times, or the BBC. In short, it can draw on any content that is publicly accessible online.
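
As an illustration of this retrieval phase, the sketch below queries one of the mentioned open-access sources, the public arXiv API, for candidate papers. This is only an assumption about how such a search could be done; the actual source mix, APIs, and ranking used by Deep Research are not publicly documented.

```python
# Sketch: fetching candidate papers from one open-access source (arXiv's public API).
import urllib.parse
import feedparser  # pip install feedparser

query = "machine learning energy consumption"
url = (
    "http://export.arxiv.org/api/query?search_query=all:"
    + urllib.parse.quote(query)
    + "&start=0&max_results=5&sortBy=relevance"
)

feed = feedparser.parse(url)  # the arXiv API returns an Atom feed
for entry in feed.entries:
    print(entry.title, "-", entry.link)
```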

3. Analysis & interpretation of the data

The next step is for the AI model to summarize large amounts of text into compact and understandable answers. Transformers and attention mechanisms ensure that the most important information is prioritized, so the tool does not simply create a summary of all the content found. The quality and credibility of the sources are also assessed, and cross-validation methods are normally used to identify incorrect or contradictory information: the AI tool compares several sources with each other. However, it is not publicly known exactly how this is done in Deep Research or what criteria are applied.
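
Since the exact criteria are not public, the following toy sketch only illustrates the general idea of cross-checking: claims extracted from several sources are grouped by topic, and topics with contradictory stances or only a single supporting source are flagged. The sources and claims are invented for the example.

```python
# Toy sketch of cross-checking claims across sources (illustrative only).
from collections import defaultdict

# Hypothetical extracted claims: (source, topic, stance)
claims = [
    ("arXiv paper A",   "training cost of large models", "rising"),
    ("IEA report 2024", "training cost of large models", "rising"),
    ("Blog post X",     "training cost of large models", "falling"),
    ("Journal study Y", "inference energy per query",    "falling"),
]

by_topic = defaultdict(list)
for source, topic, stance in claims:
    by_topic[topic].append((source, stance))

for topic, entries in by_topic.items():
    stances = {stance for _, stance in entries}
    if len(stances) > 1:
        print(f"Contradiction on '{topic}': {entries}")
    elif len(entries) == 1:
        print(f"Only one source for '{topic}': {entries[0][0]}")
```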

4. Generation of the final report

In the last step, the final report is generated and displayed to us. This is done using natural language generation (NLG) so that we see easily readable text.

The AI system generates diagrams or tables if requested in the prompt and adapts the response to the user’s style. The primary sources used are also listed at the end of the report.
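
A minimal sketch of what this assembly step could produce, assuming the intermediate results are already available as section summaries and a source list (both invented here), might look like this:

```python
# Minimal sketch of the final assembly step: turning section summaries and the
# sources they rely on into a readable report with references at the end.
sections = {
    "Energy cost of training": "Summary text for this section ...",
    "Efficiency techniques":   "Summary text for this section ...",
}
sources = ["arXiv paper A", "IEA report 2024"]

report = ["# Literature review: ML and energy consumption", ""]
for heading, body in sections.items():
    report += [f"## {heading}", body, ""]
report += ["## Sources"] + [f"- {s}" for s in sources]

print("\n".join(report))
```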

3. How you can use Deep Research: A practical example

In the first step, it is best to use one of the standard models to ask how you should optimize the prompt for Deep Research. I did this in ChatGPT 4o with the following prompt:

“Optimize this prompt to conduct a deep research:
Carrying out a literature search: Carry out a literature search on the state of the art on machine learning and energy consumption.”

The 4o model suggested the following prompt for the Deep Research function:

Deep Research screenshot (German and English)
Screenshot taken by the author

The tool then asked me to clarify the scope and focus of the literature review. I therefore provided some additional specifications:

Deep research screenshot
Screenshot taken by the author

ChatGPT then confirmed the clarifications and started the research.

In the meantime, I could see the progress and how more sources were gradually added.

After 6 minutes, the state-of-the-art literature review was complete, and the report, including all sources, was available to me.

Video: Deep Research example

4. Challenges and risks of the Deep Research feature

Let’s take a look at two definitions of research:

“A detailed study of a subject, especially in order to discover new information or reach a new understanding.”

Reference: Cambridge Dictionary

“Research is creative and systematic work undertaken to increase the stock of knowledge. It involves the collection, organization, and analysis of evidence to increase understanding of a topic, characterized by a particular attentiveness to controlling sources of bias and error.”

Reference: Wikipedia, “Research”

The two definitions show that research is a detailed, systematic investigation of a topic — with the aim of discovering new information or achieving a deeper understanding.

Basically, the deep research function fulfills these definitions to a certain extent: it collects existing information, analyzes it, and presents it in a structured way.

However, I think we also need to be aware of some challenges and risks:

  • Danger of superficiality: Deep Research is primarily designed to efficiently search, summarize, and provide existing information in a structured form (at least at the current stage). Absolutely great for overview research. But what about digging deeper? Real scientific research goes beyond mere reproduction and takes a critical look at the sources. Science also thrives on generating new knowledge.
  • Reinforcement of existing biases in research & publication: Papers are already more likely to be published if they have significant results. “Non-significant” or contradictory results, on the other hand, are less likely to be published. This is known as publication bias. If the AI tool now primarily evaluates frequently cited papers, it reinforces this trend, and rare or less widespread but possibly important findings are lost. A possible solution would be a mechanism for weighted source evaluation that also takes into account less cited but relevant papers (a minimal sketch follows after this list). Presumably, this effect also applies to us humans.
  • Quality of research papers: While it is obvious that a bachelor’s, master’s, or doctoral thesis cannot be based solely on AI-generated research, the question I have is how universities and scientific institutions will deal with this development. Students can get a solid research report with just a single prompt. Presumably, the solution here must be to adapt assessment criteria to give greater weight to in-depth reflection and methodology.
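
As a rough illustration of the weighted source evaluation mentioned above, the sketch below scores papers with a log-dampened citation term combined with relevance and recency, so raw citation counts do not dominate the ranking. The weights and example papers are made up; this is not how Deep Research actually ranks sources.

```python
# Hypothetical weighting scheme that dampens the influence of raw citation
# counts, so relevant but less-cited papers are not drowned out.
import math

def source_score(citations: int, relevance: float, years_old: int) -> float:
    citation_term = math.log1p(citations)           # log-dampened, not linear
    recency_term = max(0.0, 1.0 - 0.1 * years_old)  # mild preference for recent work
    return 0.3 * citation_term + 0.5 * relevance + 0.2 * recency_term

papers = [
    {"title": "Highly cited survey",      "citations": 2500, "relevance": 0.60, "years_old": 6},
    {"title": "Niche but on-topic study", "citations": 12,   "relevance": 0.95, "years_old": 1},
]
ranked = sorted(
    papers,
    key=lambda p: source_score(p["citations"], p["relevance"], p["years_old"]),
    reverse=True,
)
for p in ranked:
    print(p["title"])
```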

Final thoughts

Other companies and platforms have integrated similar functions, some even before OpenAI: Perplexity AI, for example, has introduced a deep research function that independently conducts and analyzes searches, and Gemini by Google has integrated a similar deep research feature.

The function gives you an incredibly quick overview of an initial research question. It remains to be seen how reliable the results are. Currently (as of early March 2025), OpenAI itself lists several limitations: the feature is still at an early stage, can sometimes hallucinate facts into answers or draw false conclusions, and has trouble distinguishing authoritative information from rumors. In addition, it is currently unable to accurately convey uncertainty.

But it can be assumed that this function will be expanded further and become a powerful tool for research. If you have simpler questions, it is better to use the standard GPT-4o model (with or without search), where you get an immediate answer.

Where can you continue learning?

Want more tips & tricks about tech, Python, data science, data engineering, machine learning and AI? Then regularly receive a summary of my most-read articles on my Substack — curated and for free.

Click here to subscribe to my Substack!
