Inching towards AGI: How reasoning and deep research are expanding AI from statistical prediction to structured problem-solving


AI has evolved at an astonishing pace. What seemed like science fiction just a few years ago is now an undeniable reality. Back in 2017, my firm launched an AI Center of Excellence. AI was certainly getting better at predictive analytics, and many machine learning (ML) algorithms were being used for voice recognition, spam detection, spell checking and other applications — but it was early. We believed then that we were only in the first inning of the AI game.

The arrival of GPT-3 and especially GPT-3.5 — which was tuned for conversational use and served as the basis for the first ChatGPT in November 2022 — was a dramatic turning point, now forever remembered as the “ChatGPT moment.”

Since then, there has been an explosion of AI capabilities from hundreds of companies. In March 2023, OpenAI released GPT-4, which researchers described as showing “sparks of AGI” (artificial general intelligence). By that time, it was clear that we were well beyond the first inning. Now, it feels like we are in the final stretch of an entirely different sport.

The flame of AGI

Two years on, the flame of AGI is beginning to appear.

On a recent episode of the Hard Fork podcast, Dario Amodei — who has been in the AI industry for a decade, formerly as VP of research at OpenAI and now as CEO of Anthropic — said there is a 70 to 80% chance that we will have a “very large number of AI systems that are much smarter than humans at almost everything before the end of the decade, and my guess is 2026 or 2027.”

Anthropic CEO Dario Amodei appearing on the Hard Fork podcast. Source: https://www.youtube.com/watch?v=YhGUSIvsn_Y 

The evidence for this prediction is becoming clearer. Late last summer, OpenAI launched o1 — the first “reasoning model.” It has since released o3, and other companies have rolled out their own reasoning models, including Google and, famously, DeepSeek. Reasoners use chain-of-thought (CoT), breaking down complex tasks at run time into multiple logical steps, just as a human might approach a complicated task. Sophisticated AI agents, including OpenAI’s deep research and Google’s AI co-scientist, have recently appeared, portending huge changes to how research will be performed.

Unlike earlier large language models (LLMs) that primarily pattern-matched from training data, reasoning models represent a fundamental shift from statistical prediction to structured problem-solving. This allows AI to tackle novel problems beyond its training, enabling genuine reasoning rather than advanced pattern recognition.
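To make that distinction concrete, the sketch below contrasts a direct prompt with a reasoning-style prompt that asks the model to decompose a problem into explicit steps before answering. It is a minimal illustration only: call_llm is a hypothetical placeholder for whichever LLM API you use, and the prompt wording is not drawn from any vendor’s documentation.

```python
# Minimal sketch contrasting direct prompting with chain-of-thought (CoT)
# prompting. call_llm is a hypothetical placeholder for any LLM client;
# the prompt wording is illustrative only.

def call_llm(prompt: str) -> str:
    # Placeholder: wire this up to the LLM provider of your choice.
    raise NotImplementedError

def direct_answer(question: str) -> str:
    # Single-shot prediction: the model maps the question straight to an
    # answer, leaning on patterns absorbed during training.
    return call_llm(f"Answer concisely: {question}")

def reasoned_answer(question: str) -> str:
    # Reasoning-style request: the model is asked to break the problem
    # into explicit, numbered intermediate steps at run time before
    # committing to a final answer.
    prompt = (
        f"Question: {question}\n"
        "Work through the problem step by step, numbering each step, "
        "then give the final answer on its own line prefixed with 'Answer:'."
    )
    return call_llm(prompt)
```

Dedicated reasoning models such as o1 and o3 internalize this step-by-step decomposition during training, generating their intermediate reasoning automatically at run time rather than relying on the prompt to request it.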

I recently used Deep Research for a project and was reminded of the quote from Arthur C. Clarke: “Any sufficiently advanced technology is indistinguishable from magic.” In five minutes, this AI produced what would have taken me 3 to 4 days. Was it perfect? No. Was it close? Yes, very. These agents are quickly becoming truly magical and transformative and are among the first of many similarly powerful agents that will soon come onto the market.

The most common definition of AGI is a system capable of doing almost any cognitive task a human can do. These early agents of change suggest that Amodei and others who believe we are close to that level of AI sophistication could be correct, and that AGI will be here soon. This reality will lead to a great deal of change, requiring people and processes to adapt in short order. 

But is it really AGI?

There are various scenarios that could emerge from the near-term arrival of powerful AI. It is challenging and frightening that we do not really know how this will go. New York Times columnist Ezra Klein addressed this in a recent podcast: “We are rushing toward AGI without really understanding what that is or what that means.” He claims there is little critical thinking or contingency planning around the implications, such as what this would truly mean for employment.

Of course, there is another perspective on this uncertain future and lack of planning, as exemplified by Gary Marcus, who believes deep learning generally (and LLMs specifically) will not lead to AGI. Marcus issued what amounts to a takedown of Klein’s position, citing notable shortcomings in current AI technology and suggesting it is just as likely that we are a long way from AGI.

Marcus may be correct, but this might also be simply an academic dispute about semantics. As an alternative to the AGI term, Amodei simply refers to “powerful AI” in his Machines of Loving Grace essay, as it conveys a similar idea without the imprecise definition or the “sci-fi baggage and hype.” Call it what you will, but AI is only going to grow more powerful.

Playing with fire: The possible AI futures

In a 60 Minutes interview, Alphabet CEO Sundar Pichai said he thought of AI as “the most profound technology humanity is working on. More profound than fire, electricity or anything that we have done in the past.” That certainly fits with the growing intensity of AI discussions. Fire, like AI, was a world-changing discovery that fueled progress but demanded control to prevent catastrophe. The same delicate balance applies to AI today.

A discovery of immense power, fire transformed civilization by enabling warmth, cooking, metallurgy and industry. But it also brought destruction when uncontrolled. Whether AI becomes our greatest ally or our undoing will depend on how well we manage its flames. To take this metaphor further, there are various scenarios that could soon emerge from even more powerful AI:

  1. The controlled flame (utopia): In this scenario, AI is harnessed as a force for human prosperity. Productivity skyrockets, new materials are discovered, personalized medicine becomes available for all, goods and services become abundant and inexpensive and individuals are freed from drudgery to pursue more meaningful work and activities. This is the scenario championed by many accelerationists, in which AI brings progress without engulfing us in too much chaos.
  2. The unstable fire (challenging): Here, AI brings undeniable benefits — revolutionizing research, automation, new capabilities, products and problem-solving. Yet these benefits are unevenly distributed — while some thrive, others face displacement, widening economic divides and stressing social systems. Misinformation spreads and security risks mount. In this scenario, society struggles to balance promise and peril. It could be argued that this description is close to present-day reality.
  3. The wildfire (dystopia): The third path is one of disaster, the possibility most strongly associated with so-called “doomers” and “probability of doom” assessments. Whether through unintended consequences, reckless deployment or AI systems running beyond human control, AI actions become unchecked, and accidents happen. Trust in truth erodes. In the worst-case scenario, AI spirals out of control, threatening lives, industries and entire institutions.

While each of these scenarios appears plausible, it is discomforting that we really do not know which is most likely, especially since the timeline could be short. We can see early signs of each: AI-driven automation increasing productivity, misinformation spreading at scale and eroding trust, and concern over disingenuous models that resist their guardrails. Each scenario would require its own adaptations by individuals, businesses, governments and society.

Our lack of clarity on the trajectory for AI impact suggests that some mix of all three futures is inevitable. The rise of AI will lead to a paradox, fueling prosperity while bringing unintended consequences. Amazing breakthroughs will occur, as will accidents. Some new fields will appear with tantalizing possibilities and job prospects, while other stalwarts of the economy will fade into bankruptcy. 

We may not have all the answers, but the future of powerful AI and its impact on humanity is being written now. What we saw at the recent Paris AI Action Summit was a mindset of hoping for the best, which is not a smart strategy. Governments, businesses and individuals must shape AI’s trajectory before it shapes us. The future of AI won’t be determined by technology alone, but by the collective choices we make about how to deploy it.

Gary Grossman is EVP of technology practice at Edelman.

