
This quantum computer built on server racks paves the way to bigger machines

A Canadian startup called Xanadu has built a new quantum computer it says can be easily scaled up to achieve the computational power needed to tackle scientific challenges ranging from drug discovery to more energy-efficient machine learning.

Aurora is a “photonic” quantum computer, which means it crunches numbers using photonic qubits—information encoded in light. In practice, this means combining and recombining laser beams on multiple chips using lenses, fibers, and other optics according to an algorithm. Xanadu’s computer is designed in such a way that the answer to an algorithm it executes corresponds to the final number of photons in each laser beam. This approach differs from one used by Google and IBM, which involves encoding information in properties of superconducting circuits. 

Aurora has a modular design that consists of four similar units, each installed in a standard server rack that is slightly taller and wider than the average human. To make a useful quantum computer, “you copy and paste a thousand of these things and network them together,” says Christian Weedbrook, the CEO and founder of the company. 

Ultimately, Xanadu envisions a quantum computer as a specialized data center, consisting of rows upon rows of these servers. This contrasts with the industry’s earlier conception of a specialized chip within a supercomputer, much like a GPU.

But this work, which the company published last week in Nature, is just a first step toward that vision. Aurora used 35 chips to construct a total of 12 quantum bits, or qubits. Any useful applications of quantum computing proposed to date will require at least thousands of qubits, or possibly a million. By comparison, Google’s quantum computer Willow, which debuted last year, has 105 qubits (all built on a single chip), and IBM’s Condor has 1,121.

Devesh Tiwari, a quantum computing researcher at Northeastern University, describes Xanadu’s progress with an analogy about building a hotel. “They have built a room, and I’m sure they can build multiple rooms,” he says. “But I don’t know if they can build it floor by floor.”

Still, he says, the work is “very promising.” 

Xanadu’s 12 qubits may seem like a paltry number next to IBM’s 1,121, but Tiwari says this doesn’t mean that quantum computers based on photonics are running behind. In his opinion, the number of qubits reflects the amount of investment more than it does the technology’s promise. 

Photonic quantum computers offer several design advantages. The qubits are less sensitive to environmental noise, says Tiwari, which makes it easier to get them to retain information for longer. It is also relatively straightforward to connect photonic quantum computers via conventional fiber optics, because they already use light to encode information. Networking quantum computers together is key to the industry’s vision of a “quantum internet” where different quantum devices talk to each other. Aurora’s servers also don’t need to be kept as cool as superconducting quantum computers, says Weedbrook, so they don’t require as much cryogenic technology. The server racks operate at room temperature, although photon-counting detectors still need to be cryogenically cooled in another room. 

Xanadu is not the only company pursuing photonic quantum computers; others include PsiQuantum in the US and Quandela in France. Other groups are using materials like neutral atoms and ions to construct their quantum systems. 

From a technical standpoint, Tiwari suspects, no single qubit type will ever be the “winner,” but it’s likely that certain qubits will be better for specific applications. Photonic quantum computers, for example, are particularly well suited to Gaussian boson sampling, an algorithm that could be useful for quickly solving graph problems. “I really want more people to be looking at photonic quantum computers,” he says. He has studied quantum computers with multiple qubit types, including photons and superconducting qubits, and is not affiliated with a company. 
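For intuition on the graph connection (a general property of Gaussian boson sampling, not a description of Xanadu’s specific hardware or software): when a graph’s adjacency matrix is encoded into a GBS device, the probability of observing a given photon-detection pattern is related to the hafnian of a corresponding submatrix, and for a 0/1 adjacency matrix the hafnian counts the graph’s perfect matchings. The minimal Python sketch below computes that quantity by brute force for a small, hypothetical example graph.

```python
from typing import List

def hafnian(A: List[List[int]]) -> int:
    """Brute-force hafnian of a symmetric 2n x 2n matrix.

    For a 0/1 adjacency matrix this equals the number of perfect matchings
    in the graph, the quantity that Gaussian boson sampling probabilities
    are related to. Exponential time; for illustration on tiny graphs only.
    """
    n = len(A)
    if n == 0:
        return 1
    if n % 2:
        return 0  # odd number of vertices: no perfect matching
    rest = list(range(1, n))
    total = 0
    for j in rest:
        # Pair vertex 0 with vertex j, then recurse on the remaining vertices.
        remaining = [k for k in rest if k != j]
        sub = [[A[r][c] for c in remaining] for r in remaining]
        total += A[0][j] * hafnian(sub)
    return total

# Hypothetical example: a 4-cycle (vertices 0-1-2-3-0) has exactly two
# perfect matchings, {(0,1),(2,3)} and {(1,2),(3,0)}.
A = [
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
]
print(hafnian(A))  # prints 2
```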

Isaac Kim, a physicist at the University of California, Davis, points out that Xanadu has not demonstrated the error correction ability many experts think a quantum computer will need in order to do any useful task, given that information stored in a quantum computer is notoriously fragile. 

Weedbrook, however, says Xanadu’s next goal is to improve the quality of the photons in the computer, which will ease the error correction requirements. “When you send lasers through a medium, whether it’s free space, chips, or fiber optics, not all the information makes it from the start to the finish,” he says. “So you’re actually losing light and therefore losing information.” The company is working to reduce this loss, which means fewer errors in the first place. 

Xanadu aims to build a quantum data center, with thousands of servers containing a million qubits, in 2029.


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin, and AI solutions. Explore these articles to uncover the strategies and insights shaping the future of these industries.


AI-powered automation set for gains in 2025

AI-driven automation could help close staffing gaps. Research firm Enterprise Management Associates (EMA), too, cites the infrastructure complexity that has resulted from hybrid and multi-cloud networks and the need for more advanced automation. “EMA research finds that hybrid clouds are particularly problematic for network operations teams today. They’re struggling with

Read More »

Tamboran Starts New Well Stimulation Campaign in Australia’s Beetaloo Basin

Tamboran Resources Corp. has launched what it says is the biggest well stimulation campaign in the Beetaloo Sub-Basin onshore Australia’s Northern Territory. The campaign consists of the Shenandoah South 2H sidetrack (SS-2H ST1) and Shenandoah S2-4H (SS4H), with up to 119 stimulated stages planned across the two wells. “The SS-2H ST1 well is planned to include 43 stages over a 5,427-foot (1,654-meter) horizontal section and the SS-3H well is planned to include 76 stages over a 9,766-foot (2,977-meter) horizontal section”, Tamboran said in a regulatory filing. The wells are planned to supply the proposed Shenandoah South Pilot Project, which is expected to produce 40 million cubic feet of natural gas a day with production expected to start mid-2026. An additional four-well campaign is planned for 2025 to complete the project’s drilling phase. For the new stimulation campaign, Tamboran has deployed Liberty Energy Inc.’s modern stimulation equipment, which it said is “the first 80,000-hydraulic-horsepower frac spread imported into the Beetaloo Basin from the US”. “The increased horsepower of this equipment is expected to deliver a step change in stimulation efficiency and proppant intensity compared to previous wells completed in the Basin”, commented Tamboran managing director and chief executive Joel Riddle. “This stimulation campaign is planned to be the largest in the Beetaloo Basin to date, with up to 119 stages over a full lateral length of 15,193 feet (~4,631 meters) across the two wells, an average of ~127 feet per stimulated stage. “Importantly, these wells are planned to be among the first to provide reliable energy to the Northern Territory”. “We have successfully undertaken stimulation programs during the wet season in the Basin over the last two years, with campaigns at Amungee 2H (2022/23) and Shenandoah South 1H (2023/24)”, Riddle added. “This experience gives us confidence in our ability to safely and

Read More »

EOG Wins Two New PSCs in Trinidad

Trinidad and Tobago signed Wednesday two new production sharing contracts (PSCs) with EOG Resources Ltd., which is already producing in the Caribbean country. The licenses are part of shallow water areas offered in a 2023-24 bidding round in the twin islands, which opened October 2023 and closed May 2024, according to an online statement by the Ministry of Energy and Energy Industries (MEEI). “EOG continues to be very important to the fabric of the energy sector here in Trinidad and Tobago”, said Energy and Energy Industries Minister Stuart R Young. “We at the Ministry of Energy and Energy Industries have to continue to ensure the relationship is as we have built it to be, with open communication and then we proceed to get more done”, Young added. EOG managing director George Viera said, “The last time we signed a PSC was in 2005 so it has been a while, approximately 20 years ago and we are excited to get this going”. The new PSCs are Lower Reverse L and NCMA 4(a), out of three for which the United States oil and gas exploration and production company bid. Block Lower Reverse L is on the southeast coast of Trinidad island. It has water depths of about 80-200 meters (262.47-656.17 feet). It is near existing EOG infrastructure and producing assets, the MEEI said. NCMA 4(a) sits on the North Coast Marine Area with water depths of around 100-200 meters. “Entry into NCMA marks a significant milestone for EOG, as it ventures into an area predominantly operated by other companies”, the ministry said. NCMA hosts seven gas fields to the north of Trinidad: Bougainvillea, Chaconia, Heliconia, Hibiscus, Ixora, Poinsettia and Poinsettia Southwest. The fields, discovered between the late 1970s and the early 1980s and put into production twenty years later, lie on trend with

Read More »

North Sea firms to press ahead despite court ruling Rosebank and Jackdaw approval ‘unlawful’

North Sea operators have insisted they will press ahead with the development of two major oil and gas fields despite environmental campaigners claiming a “historic win” in a landmark court ruling. Shell (LON:SHEL) and Norwegian firm Equinor (OSL:EQNR) will continue work on the Rosebank and Jackdaw North Sea projects and resubmit their consent applications. This comes despite Scotland’s Court of Session deciding to overturn the approvals. This was in response to cases brought by two environmental groups who argued the UK government had acted unlawfully when granting consent for the fields. Outlining the judgement on Thursday, Lord Ericht agreed the decision to grant consent was “unlawful” and ruled the operators would have to resubmit environmental impact assessments (EIAs) for the projects. Campaign groups Greenpeace and Uplift had lodged the case after regulators and the previous Conservative government gave approval for the multi-billion pound Rosebank and Jackdaw projects. Supreme Court Finch case: The decision comes in the wake of a Supreme Court ruling in the Finch case last year, which determined approvals for fossil fuel projects must account for all downstream emissions. However, the new environmental assessments must await guidance from the UK Labour Government, which is due sometime in the spring. The UK government is currently undertaking a consultation on how to assess scope 3 emissions in new oil and gas projects following the Supreme Court’s Finch ruling. Norwegian firm Equinor and partner Ithaca Energy received approval to develop the Rosebank oil field off the west coast of Shetland from the North Sea Transition Authority (NSTA) in 2023. Meanwhile, oil major Shell secured consent for the Jackdaw gas field off the coast of Aberdeen in 2022. They argued the environmental impact assessments

Read More »

SC Analysts Say US Tariffs on Canada Oil Would Likely be Counterproductive

In a report sent to Rigzone by Standard Chartered Bank Commodities Research Head Paul Horsnell this week, analysts at the bank, including Horsnell, said U.S. tariffs on Canadian oil would likely be counterproductive. “The U.S. imported an average of 6.6 million barrels per day of crude oil in the first 10 months of 2024, of which 4.0 million barrels per day was heavy oil for use in upgraded refineries with cracking units,” the analysts stated in the report. “Heavy oil cannot easily be substituted with the light oil that makes up most of U.S. shale oil production; such a switch would create a significant loss of optimization in the highly expensive cracking units that require feed from vacuum distillation of the heavy residual obtained by simple distillation,” they added. “Canada provided 75 percent of U.S. heavy crude oil imports in 2024, with its market share having steadily increased since 2000 … squeezing out flows from Mexico, Venezuela, and Colombia,” they continued. “If one were devising a tariff regime for oil imports with the aim of minimizing the pass-through of the tariff to retail prices, the tariffs would ideally target the most easily substitutable flows into the most competitive retail area,” they went on to state. The analysts noted in the report that, in the U.S., that would be light sweet crude oil imports into the Gulf Coast refining system. They said the other end of that scale is the 2.2 million barrels per day of Canadian heavy imports into Midwest refineries, which they pointed out in the report is more than half of Canada’s total crude oil exports to the United States. “Canada has supplied 99.89 percent of all heavy imports into Midwest refineries over the past 10 years,” the analysts stated in the report. “The low substitutability of this

Read More »

Shell Logs Lower Q4 Profit on Weaker Prices

Shell PLC on Thursday reported $3.66 billion in adjusted earnings, its definition of bottom line, for the fourth quarter of 2024, down by about 39.27 percent compared to the prior quarter and 49.89 percent against the corresponding period 2023. Adjusted earnings per share landed at $0.60. Shell opened lower on the Amsterdam and London stock exchanges on results day. The sequential fall “reflected higher exploration well write-offs, lower margins from crude and oil products trading and optimization, lower marketing margins and volumes, lower LNG trading and optimization margins, lower realized oil prices, and unfavorable tax movements”, according to results released online by the British energy giant. Shell also recognized negative adjustments to fourth-quarter income comprising $2.2 billion in impairment charges and reversals, as well as losses related to asset sales. The third quarter had a $1.2 billion negative adjustment. Oil and gas production available for sale totaled 2.82 million barrels per day (MMbpd) in the October-December period, compared to 2.8 MMbpd in the prior quarter and 2.83 MMbpd in the comparable period 2023. In its integrated gas reporting segment, Shell logged 116,000 bpd of liquids output for sale and 4.57 million standard cubic feet a day (MMscfd) of natural gas for sale. The segment includes the conversion of gas into liquid fuels and other products and the delivery of these products to the market, as well as the sale of liquefied natural gas. Shell’s realized prices upstream averaged $71 a barrel for liquids, down from the prior quarter, and $7 per thousand scf for gas, up from the prior quarter. In the integrated gas segment the average realized liquids price stood at $63 per barrel, while that of gas landed at $8.1 per thousand scf. Shell’s adjusted EBITDA for the fourth quarter came in at $14.28 billion, down 10.77 percent sequentially.

Read More »

ITM Power losses narrow as revenue increases

British electrolyser manufacturer ITM Power (LON:ITM) has posted a 74% increase in revenues in its first half results as its losses narrowed. ITM recorded £15.5 million in revenues for the six months to 31 October, compared to £8.9m the year prior. Meanwhile, ITM posted an adjusted earnings before interest, taxes, depreciation and amortisation (EBITDA) loss of £16.8m, a slight improvement on the £18.1m loss in the corresponding period last year. ITM said it now has a record contract backlog to date of £135.3m, up from £43.7m two years ago. ITM Power contracts: It comes amid a series of deals for the firm in its first half period, including a contract for Shell’s 100 MW REFHYNE 2 project in Germany. The firm has also secured several deals following the end of October, including the sale of three of its 5 MW Neptune V electrolyser units to a company in Germany. Looking ahead, ITM said it expects to see revenue between £18m and £22m across its full-year results, with an adjusted EBITDA loss of between £32m and £36m. ITM Power chief executive officer Dennis Schulz said the “unrealistic hype” behind green hydrogen has given way to “real industrial scale-up of projects”. “Green hydrogen has begun to play its vital role in decarbonising the global energy system, whether as a feedstock in sectors such as chemicals and refining, as a fuel, or as a source of flexible power generation,” Schulz said. “Gone is the unrealistic hype that the hydrogen economy would develop overnight. “Instead, today, the hype has given way to real industrial scale-up of projects and production capacities.” Schulz said the green hydrogen industry has started gaining traction with an increasing number of project final investment decisions (FIDs) taken over the recent

Read More »

Verizon brings AI suite to enterprise infrastructure customers

Verizon Business has launched AI Connect, an integrated suite of products designed to let businesses deploy generative artificial intelligence (AI) workloads at scale. Verizon is building its AI ecosystem by repurposing its existing infrastructure assets in its intelligent and programmable network, which consists of fiber, edge networking, and data center assets, along with its metro and long-haul fiber, ILEC and Fios footprint, its metro network build-out, lit and dark fiber services, and 5G network. Verizon believes that the drive toward real-time decision-making using inferencing will be what fuels demand for additional computing power. The company cites a McKinsey report, which states that 60% to 70% of AI workloads are expected to shift to real-time inference by 2030. That will create an urgent need for low-latency connectivity, compute, and security at the edge beyond current demand.

Read More »

Trump’s 100% tariff threat on Taiwan chips raises cost, supply chain fears

“I don’t think we will see a near-term impact, as it takes years to build fabs, but by the end of the decade, the US share could rise by a few percentage points,” Gupta said. “It’s hard to give an exact number, but if I were to estimate, I’d say 14-15%. That isn’t a lot, but for the US to gain share, someone else must lose it, and while the US is making efforts, we see similar developments across Asia.” Yet, if Washington imposes smaller tariffs on imports from countries such as India, Japan, or Malaysia, Taiwanese chipmakers may shift production there rather than to the US, according to Stephen Ezell, vice president at the Information Technology and Innovation Foundation (ITIF). “Additionally, if the tariffs applied to Chinese chip exports were lower than for Taiwanese exports, Trump would be helping Chinese semiconductor manufacturers, whose exports to the US market would then be less expensive,” Ezell said in a recent note. “So, for this policy to have any real effect, Trump effectively must raise tariffs on all semiconductors, and that would likely lead to global tit-for-tat.” Enterprise IT faces tough choices. If semiconductor tariffs drive up costs, enterprises will be forced to reassess spending priorities, potentially delaying or cutting investments in critical IT infrastructure. Rising chip prices could squeeze budgets for AI, cloud computing, and data center expansions, forcing businesses to make difficult trade-offs. “On the corporate side, hyperscalers and enterprise players need to brace for impact over the next 2-3 years if high tariffs continue along with the erosion of operating margin,” Faruqui said. “In addition, the boards and CEOs have to boldly make heavy CAPEX investment on US Soil via US and Asian partners as soon as possible to realize HVM on US soil and alleviate operating margin erosion due to

Read More »

New tweak to Linux kernel could cut data center power usage by up to 30%

When network traffic is heavy, it is most efficient, and delivers the best performance, to disable interrupts and run in polling mode. But when network traffic is light, interrupt-driven processing works best, he noted. “An implementation using only polling would waste a lot of resources/energy during times of light traffic. An implementation using only interrupts becomes inefficient during times of heavy traffic. … So the biggest energy savings arise when comparing to a high-performance always-polling implementation during times of light traffic,” Karsten said. “Our mechanism automatically detects [the amount of network traffic] and switches between polling and interrupt-driven to get the best of both worlds.” In the patch cover letter, Damato described the implementation of the new parameter in more detail, noting: “this delivery mode is efficient, because it avoids softIRQ execution interfering with application processing during busy periods. It can be used with blocking epoll_wait to conserve CPU cycles during idle periods. The effect of alternating between busy and idle periods is that performance (throughput and latency) is very close to full busy polling, while CPU utilization is lower and very close to interrupt mitigation.” Added Karsten: “At the nuts and bolts level, enabling the feature requires a small tweak to applications and the setting of a system configuration variable.” And although he can’t yet quantify the energy benefits of the technique (the 30% saving cited is best case), he said, “the biggest energy savings arise when comparing to a high-performance always-polling implementation during times of light traffic.”
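To make the trade-off concrete, here is a minimal user-space sketch in Python (Linux-only, using select.epoll), assuming a hypothetical UDP service with an arbitrary port and switching threshold. It only illustrates the idea of alternating between busy polling and a blocking epoll_wait based on observed traffic; it is not the kernel-level mechanism described above, which operates inside the kernel’s packet-processing path.

```python
import select
import socket

# Assumptions for this sketch: a UDP socket on port 9999 and a threshold
# of 10,000 consecutive empty polls before falling back to a blocking wait.
IDLE_SPINS = 10_000

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9999))
sock.setblocking(False)

ep = select.epoll()
ep.register(sock.fileno(), select.EPOLLIN)

quiet_polls = 0
while True:
    # Busy period: timeout=0 returns immediately (pure polling, no wake-ups).
    # Idle period: timeout=-1 blocks in epoll_wait so the CPU can sleep.
    timeout = 0 if quiet_polls < IDLE_SPINS else -1
    events = ep.poll(timeout)
    if not events:
        quiet_polls += 1
        continue
    quiet_polls = 0
    for _fd, _mask in events:
        data, addr = sock.recvfrom(65535)  # application-specific processing would go here
```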

Read More »

Macquarie’s Big Play in AI and HPC: $17+ Billion Invested Across Two Data Center Titans

Macquarie Asset Management (MAM) is making bold moves to position itself as a dominant force in the rapidly growing sectors of AI and high-performance computing (HPC). In a single week, MAM has made two pivotal investments in Applied Digital and Aligned Data Centers, committing over $17 billion to fuel innovation, growth, and capacity expansion in critical infrastructure markets across the Americas. Both deals highlight the immense demand for AI-ready and HPC-optimized data centers, underscoring the ongoing digitization of the global economy and the insatiable need for computing power to drive artificial intelligence (AI), machine learning (ML), and other resource-intensive workloads. Applied Digital Partners with Macquarie Asset Management for $5 Billion HPC Investment: On January 14, Applied Digital Corporation announced what it billed as a transformative partnership with Macquarie to drive growth in HPC infrastructure. This agreement positions Applied Digital as a leading designer, builder, and operator of advanced data centers in the United States, catering to the growing demands of AI and HPC workloads. Under the $5 billion commitment, funds managed by MAM will invest up to $900 million in Applied Digital’s Ellendale HPC Campus in North Dakota, with an additional $4.1 billion available for future HPC projects. This could support over 2 gigawatts (GW) of HPC data center development. MAM is a global asset manager overseeing approximately $633.7 billion in assets. Part of Australia-based Macquarie Group, it specializes in diverse investment solutions across real assets, real estate, credit, and equities. With its new landmark agreement with Macquarie, Applied Digital feels it is poised to redefine the HPC data center landscape, ensuring its place as a leader in the AI and HPC revolution. In terms of ownership structure, MAM’s investment here includes perpetual preferred equity and a 15% common equity interest in Applied Digital’s HPC business segment, allowing

Read More »

Data Center Frontier Announces Editorial Advisory Board for 2025 DCF Trends Summit

Nashua, NH – Data Center Frontier is excited to announce its Editorial Advisory Board for the second annual Data Center Frontier Trends Summit (DCF Trends Summit), taking place August 26-28, 2025, at the Hyatt Regency Reston in Reston, Virginia. The 2025 DCF Trends Summit Editorial Advisory Board includes distinguished leaders from hyperscale and colocation operators, power and cooling solutions companies, IT and interconnection providers, and design/build/construction specialists. This year’s board has grown to include 15 esteemed executives, reflecting DCF’s commitment to providing comprehensive and diverse insights for the data center sector. This visionary group of leaders, representing the critical facets of the data center ecosystem, will guide the event’s content and programming to address the most pressing trends impacting the industry. The group’s unparalleled expertise ensures the Summit will deliver essential insights to help data center stakeholders make informed decisions in the industry’s rapidly evolving landscape. The Editorial Advisory Board for the 2025 DCF Trends Summit includes: Scott Bergs, CEO, Dark Fiber & Infrastructure (DF&I); Steven Carlini, VP, Innovation and Data Center Energy Management Business, Schneider Electric; Dan Crosby, CEO, Legend Energy Advisors; Rob Coyle, Director of Technical Programs, Open Compute Project (OCP) Foundation; Chris Downie, CEO, Flexential; Sean Farney, VP of Data Centers, Jones Lang LaSalle (JLL); Mark Freeman, VP of Marketing, Vantage Data Centers; Steven Lim, SVP of Marketing & GTM Strategy, NTT Global Data Centers; David McCall, VP of Innovation, QTS Data Centers; Nancy Novak, Chief Innovation Officer, Compass Datacenters; Karen Petersburg, VP of Construction & Development, PowerHouse Data Centers; Tara Risser, Chief Business Officer, Cologix; Stefan Raab, Sr. Director, Business Development – AMER, Equinix; Phill Lawson-Shanks, Chief Innovation Officer, Aligned Data Centers; Brenda Van der Steen, VP of Global Growth Marketing, Digital Realty. “The Editorial Advisory Board for the second annual Data Center Frontier Trends Summit is

Read More »

Podcast: Data Center Trends Discussion with Industry Veteran Ron Vokoun of Everus Construction Group

For this episode of the Data Center Frontier Show Podcast, DCF Editor in Chief Matt Vincent and Senior Editor David Chernicoff sat down for a far-reaching discussion with data center industry luminary Ron Vokoun, a 35-year veteran of the construction industry with a primary focus on digital infrastructure. “I got into telecom back in ’92, which led to data centers,” Vokoun said. “Probably worked on my first one around ’96 or ’97, and I’ve been involved ever since.” Currently the Director of National Market Development for Everus Construction Group, Vokoun has been involved in AFCOM, both regionally and nationally, for nearly two decades and is an emeritus content advisory board member for Data Center World. He has also written extensively for Data Center Dynamics. He added, “I’ve just always been curious—very much a learner. Being a construction guy, I often write about things I probably have no business writing about, which is always the challenge, but I’m just curious—a lifelong learner. Interestingly, [DCF founder] Rich Miller … gave me my first blogging opportunity.” Here’s a timeline of the podcast’s highlights: Introductions – Ron Vokoun shares his extensive background; he has been in the construction industry for 35 years. 1:46 – On his role at Everus Construction Group and the company’s diverse services across the nation. 2:07 – Vokoun reflects on his long-standing relationship with Rich Miller and acknowledges Rich’s influence on his blogging career. 3:05 – Nuclear Energy: A discussion of nuclear energy trends and the importance of nuclear energy in data center construction. 3:35 – Natural gas is highlighted as a key trend; its role as a gateway to hydrogen is emphasized. 3:51 – The impact of recent nuclear developments is analyzed; the reopening of Three Mile Island is noted as significant. 4:55 – Future Power Sources for Data Centers: Discussion turns to the

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd). While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation. AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends. It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »