Talking about Games

Game theory is a field of research that is quite prominent in economics but less well known in other scientific disciplines. However, the concepts used in game theory can be of interest to a much wider audience, including data scientists, statisticians, computer scientists, and psychologists, to name just a few. This article opens a four-chapter tutorial series on the fundamentals of game theory, so stay tuned for the upcoming articles.

In this article, I will explain the kinds of problems game theory deals with and introduce the main terms and concepts used to describe a game. We will see some examples of games that are typically analysed within game theory and lay the foundation for deeper insights into the capabilities of game theory in the later chapters. But before we go into the details, I want to introduce you to some applications of game theory that show the multitude of areas game-theoretic concepts can be applied to.

Applications of game theory

Even French fries can be an application of game theory. Photo by engin akyurt on Unsplash

Does it make sense to vote for a small party in an election if this party may not have a chance to win anyway? Is it worth starting a price war with a competitor who offers the same goods as you? Do you gain anything by reducing your catch in overfished waters if your competitors simply carry on as before? Should you take out insurance if you believe that the government will pay for the reconstruction after the next hurricane anyway? And how should you behave in the next auction, where you are about to bid on your favourite Picasso painting?

All these questions (and many more) live within the area of applications that can be modelled with game theory. Whenever a situation includes strategic decisions in interaction with others, game-theoretic concepts can be applied to describe this situation formally and search for decisions that are not made intuitively but that are backed by a notion of rationality. Key to all the situations above is that your decisions depend on other people’s behaviour. If everybody agrees to conserve the overfished areas, you want to play along to preserve nature, but if you think that everybody else will continue fishing, why should you be the only one to stop? Likewise, your voting behaviour in an election might heavily depend on your assumptions about other people’s votes. If nobody votes for that candidate, your vote will be wasted, but if everybody thinks so, the candidate doesn’t have a chance at all. Maybe there are many people who say “I would vote for him if others vote for him too”.

Similar dynamics can arise in very different contexts. Have you ever thought about having food delivered, and everybody said, “You don’t have to order anything because of me, but if you order anyway, I’d take some French fries”? All these examples can be applications of game theory, so let’s start understanding what game theory is all about.

Understanding the game

Before playing, you need to understand the components of the game. Photo by Laine Cooper on Unsplash

When you hear the word game, you might think of video games such as Minecraft, board games such as Monopoly, or card games such as poker. There are some common principles to all these games: We always have some players who are allowed to do certain things determined by the game’s rules. For example, in poker, you can raise, check, or give up. In Monopoly, you can choose whether or not to buy a property you land on. What we also have is some notion of how to win the game. In poker, you have to get the best hand to win, and in Monopoly, you have to be the last person standing after everybody else has gone bankrupt. That also means that some actions are better than others in some scenarios. If you have two aces in your hand, staying in the game is better than giving up.

When we look at games from the perspective of game theory, we use the same concepts, just more formally.

A game in game theory consists of n players, where each player has a strategy set and a utility function.

A game consists of a set of players I = {1, …, n}, where each player i has a strategy set Si and a utility function ui(s1, s2, …, sn). The set of strategies is determined by the rules of the game. For example, it could be S = {check, raise, give-up}, and the player would have to decide which of these actions they want to use. The utility function ui (also called the reward) describes how valuable a certain action of a player would be, given the actions of the other players. Every player wants to maximize their utility, but here comes the tricky part: the utility of an action of yours depends on the other players’ actions. And for them, the same applies: their actions’ utilities depend on the actions of the other players (including yours).
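
This formal definition can be written down as a small data structure (a minimal sketch of my own; the class and attribute names are not standard notation): each player has a strategy set, and each utility function maps a full strategy profile (one action per player) to that player's reward.

```python
from dataclasses import dataclass
from typing import Callable, Dict, FrozenSet, Tuple

@dataclass(frozen=True)
class Game:
    """A game in normal form: players, strategy sets, utility functions."""
    players: Tuple[int, ...]               # I = (1, ..., n)
    strategies: Dict[int, FrozenSet[str]]  # S_i for each player i
    # utility[i](s_1, ..., s_n) -> reward of player i for a strategy profile
    utility: Dict[int, Callable[..., float]]

    def rewards(self, *profile: str) -> Dict[int, float]:
        """Evaluate every player's utility for one strategy profile."""
        return {i: self.utility[i](*profile) for i in self.players}

# A toy two-player instance with purely illustrative placeholder utilities:
# player 1 wants matching actions, player 2 wants mismatching actions.
toy = Game(
    players=(1, 2),
    strategies={1: frozenset({"a", "b"}), 2: frozenset({"a", "b"})},
    utility={1: lambda s1, s2: 1.0 if s1 == s2 else 0.0,
             2: lambda s1, s2: 1.0 if s1 != s2 else 0.0},
)
print(toy.rewards("a", "b"))  # {1: 0.0, 2: 1.0}
```

The key point the sketch captures is the signature of ui: it takes the whole profile, not just player i's own action, which is exactly why one player's best choice depends on everyone else's.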

Let’s consider a well-known game to illustrate this point. In rock-paper-scissors, we have n=2 players and each player can choose between three actions, hence the strategy set is S={rock, paper, scissors} for each player. But the utility of an action depends on what the other player does. If your opponent chooses rock, the utility of paper is high (1), because paper beats rock. But if your opponent chooses scissors, the utility of paper is low (-1), because you would lose. Finally, if your opponent chooses paper as well, you reach a draw and the utility is 0.

Utility values for player one choosing paper, for three choices of the opponent’s strategy.

Instead of writing down the utility function for each case individually, it is common to display games in a matrix like this:

The first player selects the row of the matrix by choosing their action, and the second player selects the column. For example, if player 1 chooses paper and player 2 chooses scissors, we end up in the cell in the second row and third column. The value in this cell is the utility for both players: the first value corresponds to player 1 and the second to player 2. (-1,1) means that player 1 has a utility of -1 and player 2 has a utility of 1. Scissors beat paper.
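
The full matrix can be generated from the cyclic "beats" relation instead of listing all nine cells by hand. A minimal sketch (the helper names `BEATS`, `utility`, and `payoff_matrix` are my own):

```python
# Which action each action defeats in rock-paper-scissors.
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def utility(own: str, other: str) -> int:
    """A player's reward: 1 for a win, -1 for a loss, 0 for a draw."""
    if own == other:
        return 0
    return 1 if BEATS[own] == other else -1

# Rows are player 1's actions, columns player 2's; each cell holds
# (utility of player 1, utility of player 2).
ACTIONS = ["rock", "paper", "scissors"]
payoff_matrix = {
    (a1, a2): (utility(a1, a2), utility(a2, a1))
    for a1 in ACTIONS
    for a2 in ACTIONS
}

print(payoff_matrix[("paper", "scissors")])  # (-1, 1): scissors beat paper
```

Note that the two utilities in every cell sum to zero: whatever one player wins, the other loses, which is why games like this are called zero-sum.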

Some more details

Now we have understood the main components of a game in game theory. Let me add a few more hints on what game theory is about and what assumptions it uses to describe its scenarios. 

  • We often assume that the players select their actions at the same time (like in rock-paper-scissors). We call such games static games. There are also dynamic games in which players take turns deciding on their actions (like in chess). We will consider these cases in a later chapter of this tutorial. 
  • In game theory, it is typically assumed that the players cannot communicate with each other, so they can’t come to an agreement before deciding on their actions. In rock-paper-scissors, you wouldn’t want to do that anyway, but there are other games where communication would make it easier to choose an action. However, we will always assume that communication is not possible. 
  • Game theory is considered a normative theory, not a descriptive one. That means we will analyse games concerning the question “What would be the rational solution?” This may not always be what people do in a comparable real-world situation. Such descriptions of real human behaviour belong to behavioural economics, a research field located on the border between psychology and economics. 

The prisoner’s dilemma

The prisoner’s dilemma is all about not ending up here. Photo by De an Sun on Unsplash

Let us become more familiar with the main concepts of game theory by looking at some typical games that are often analysed. Often, such games are derived from a story or scenario that may happen in the real world and requires people to decide between some actions. One such story could be as follows:

Say we have two criminals who are suspected of having committed a crime. The police have some circumstantial evidence, but no actual proof of their guilt. Hence, they question the two criminals, who now have to decide whether they want to confess or deny the crime. If you are in the position of one of the criminals, you might think that denying is always better than confessing, but here comes the tricky part: the police propose a deal. If you confess while your partner denies, you are considered a crown witness and will not be punished. In this case, you are free to go, but your partner will go to jail for six years. Sounds like a good deal, but be aware that the outcome also depends on your partner’s action. If you both confess, there is no crown witness anymore and you both go to jail for three years. If you both deny, the police can only use circumstantial evidence against you, which will lead to one year in prison for both you and your partner. But be aware that your partner is offered the same deal. If you deny and he confesses, he is the crown witness and you go to jail for six years. How do you decide?

The prisoner’s dilemma.

The game derived from this story is called the prisoner’s dilemma and is a typical example of a game in game theory. We can visualize it as a matrix just like we did with rock-paper-scissors before and in this matrix, we easily see the dilemma the players are in. If both deny, they receive a rather low punishment. But if you assume that your partner denies, you might be tempted to confess, which would prevent you from going to jail. But your partner might think the same, and if you both confess, you both go to jail for longer. Such a game can easily make you go round in circles. We will talk about solutions to this problem in the next chapter of this tutorial. First, let’s consider some more examples. 
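
The jail terms from the story translate directly into a payoff matrix if we take utility to be negative years in jail, so that higher is better. A sketch (the variable names are my own):

```python
# Years in jail for (you, partner), following the story above.
YEARS_IN_JAIL = {
    ("confess", "confess"): (3, 3),  # no crown witness: three years each
    ("confess", "deny"):    (0, 6),  # crown witness goes free, partner gets six
    ("deny",    "confess"): (6, 0),
    ("deny",    "deny"):    (1, 1),  # only circumstantial evidence
}
# Utility = negative jail time, so every player wants to maximize it.
payoffs = {acts: (-y1, -y2) for acts, (y1, y2) in YEARS_IN_JAIL.items()}

# Mutual denial (-1, -1) is jointly better than mutual confession (-3, -3)...
print(payoffs[("deny", "deny")], payoffs[("confess", "confess")])
# ...yet if your partner denies, confessing still improves *your* utility,
# which is exactly the temptation described in the text.
print(payoffs[("confess", "deny")][0] > payoffs[("deny", "deny")][0])  # True
```

Unlike rock-paper-scissors, the utilities in a cell no longer sum to a constant: both players can do well or badly together, which is what creates the dilemma.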

Bach vs. Stravinsky

Who do you prefer, Bach or Stravinsky? Photo by Sigmund on Unsplash

You and your friend want to go to a concert together. You are a fan of Bach’s music, but your friend favours the 20th-century Russian composer Igor Stravinsky. However, you both want to avoid going to a concert alone. Although you prefer Bach over Stravinsky, you would rather go to the Stravinsky concert with your friend than to the Bach concert alone. We can create a matrix for this game:

Bach vs. Stravinsky

You select the row by going to the Bach or the Stravinsky concert, and your friend selects the column by going to one of the concerts as well. For you, it would be best if you both chose Bach. Your reward would be 2, and your friend would get a reward of 1, which is still better for him than sitting in the Stravinsky concert all by himself. However, he would be even happier if you were in the Stravinsky concert together.
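
The rewards stated in the text can be encoded the same way as before. One assumption on my part: I set both mismatched outcomes to (0, 0), consistent with being alone at a concert being worse for each player than attending either concert together.

```python
# Bach vs. Stravinsky as a coordination game. Rows: your concert,
# columns: your friend's concert; cells hold (your reward, friend's reward).
payoffs = {
    ("bach",       "bach"):       (2, 1),  # your favourite outcome
    ("stravinsky", "stravinsky"): (1, 2),  # your friend's favourite outcome
    ("bach",       "stravinsky"): (0, 0),  # assumed: being alone yields 0
    ("stravinsky", "bach"):       (0, 0),  # assumed: being alone yields 0
}

# Both coordinated outcomes beat splitting up for *both* players,
# yet the two players rank the coordinated outcomes differently.
for joint in [("bach", "bach"), ("stravinsky", "stravinsky")]:
    u_you, u_friend = payoffs[joint]
    print(joint, u_you > 0 and u_friend > 0)
```

This structure, where the players' interests are aligned on coordinating but opposed on where to coordinate, is what makes the game interesting.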

Do you remember that we said players are not allowed to communicate before making their decisions? This example illustrates why. If you could just call each other and decide where to go, this would no longer be a game worth investigating with game theory. But you can’t call each other, so you just have to go to one of the concerts and hope you will meet your friend there. What do you do?

Arm or disarm?

Make love, not war. Photo by Artem Beliaikin on Unsplash

A third example brings us to the realm of international politics. The world would be a much happier place with fewer weapons, wouldn’t it? However, if nations think about disarmament, they also have to consider the choices other nations make. If the USA disarms, the Soviet Union might want to upgrade its arsenal to be able to attack the USA; that was the thinking during the Cold War, at least. Such a scenario could be described with the following matrix:

The matrix for the disarm vs. upgrade game.

As you can see, when both nations disarm, they get the highest reward (3 each), because there are fewer weapons in the world and the risk of war is minimized. However, if you disarm while your opponent upgrades, your opponent is in the better position and gets a reward of 2, while you only get 0. Then again, it might have been better to upgrade yourself, which gives a reward of 1 for both players; that is better than being the only one who disarms, but not as good as the outcome where both nations disarm.
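
The rewards from the paragraph above can again be collected into a payoff dictionary, and we can check the two claims directly (an illustrative sketch; the names are my own):

```python
# The disarm-vs-upgrade game, with the rewards stated in the text.
payoffs = {
    ("disarm",  "disarm"):  (3, 3),  # fewest weapons, best joint outcome
    ("disarm",  "upgrade"): (0, 2),  # unilateral disarmament gets exploited
    ("upgrade", "disarm"):  (2, 0),
    ("upgrade", "upgrade"): (1, 1),  # costly arms race, but no one is exploited
}

# Mutual disarmament maximizes the combined reward...
best_joint = max(payoffs, key=lambda acts: sum(payoffs[acts]))
print(best_joint)                      # ('disarm', 'disarm')
# ...yet disarming against an upgrading opponent is your worst outcome.
print(payoffs[("disarm", "upgrade")])  # (0, 2)
```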

The solution?

All these examples have one thing in common: there is no single option that is always the best. Instead, the utility of an action for one player always depends on the other player’s action, which, in turn, depends on the first player’s action, and so on. Game theory is interested in finding the optimal solution and deciding what the rational action would be; that is, the action that maximizes the expected reward. Different ideas about what exactly such a solution looks like will be the subject of the next chapter in this series.

Summary

Learning about game theory is as much fun as playing a game, don’t you think? Photo by Christopher Paul High on Unsplash

Before continuing with finding solutions in the next chapter, let us recap what we have learned so far. 

  • A game consists of players who decide on actions, each of which has a utility or reward.
  • The utility/reward of an action depends on the other players’ actions. 
  • In static games, players choose their actions simultaneously. In dynamic games, they take turns. 
  • The prisoner’s dilemma is a very popular example of a game in game theory.
  • Games become especially interesting when there is no single action that is better than all others. 

Now that you are familiar with how games are described in game theory, you can check out the next chapter to learn how to find solutions for games in game theory. 

References

The topics introduced here are typically covered in standard textbooks on game theory. I mainly used this one, though it is written in German:

  • Bartholomae, F., & Wiens, M. (2016). Spieltheorie. Ein anwendungsorientiertes Lehrbuch. Wiesbaden: Springer Fachmedien Wiesbaden.

An English-language alternative could be this one:

  • Espinola-Arredondo, A., & Muñoz-Garcia, F. (2023). Game Theory: An Introduction with Step-by-step Examples. Springer Nature.

Game theory is a rather young field of research; its first major textbook is this one:

  • Von Neumann, J., & Morgenstern, O. (1944). Theory of games and economic behavior.

Like this article? Follow me to be notified of my future posts.

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

CompTIA training targets workplace AI use

CompTIA AI Essentials (V2) delivers training to help employees, students, and other professionals strengthen the skills they need for effective business use of AI tools such as ChatGPT, Copilot, and Gemini. In its first iteration, CompTIA’s AI Essentials focused on AI fundamentals to help professionals learn how to apply AI technology

Read More »

OPEC Receives Updated Compensation Plans

A statement posted on OPEC’s website this week announced that the OPEC Secretariat has received updated compensation plans from Iraq, the United Arab Emirates (UAE), Kazakhstan, and Oman. A table accompanying this statement showed that these compensation plans amount to a total of 221,000 barrels per day in November, 272,000

Read More »

LogicMonitor closes Catchpoint buy, targets AI observability

The acquisition combines LogicMonitor’s observability platform with Catchpoint’s internet-level intelligence, which monitors performance from thousands of global vantage points. Once integrated, Catchpoint’s synthetic monitoring, network data, and real-user monitoring will feed directly into Edwin AI, LogicMonitor’s intelligence engine. The goal is to let enterprise customers shift from reactive alerting to

Read More »

Akamai acquires Fermyon for edge computing as WebAssembly comes of age

Spin handles compilation from source to WebAssembly bytecode and manages execution on target platforms. The runtime abstracts the underlying technology while preserving WebAssembly’s performance and security characteristics. This bet on WebAssembly standards has paid off as the technology matured.  WebAssembly has evolved significantly beyond its initial browser-focused design to support

Read More »

U.S. Department of Energy Unveils Updated Hydrogen Program Plan

The U.S. Department of Energy (DOE) today announced its updated Hydrogen Program Plan, a foundational resource for advancing research, development, demonstration, and deployment (RDD&D) of hydrogen technologies. The Hydrogen Program Plan specifically identifies and articulates strategic, high-impact areas of focus across DOE’s Hydrogen Program, a cohesive and coordinated effort involving multiple hydrogen-related offices. Advancing a coordinated strategy for RDD&D is particularly important for hydrogen because it has the potential to be used across virtually all sectors of the economy and it can be produced, stored, and delivered in many ways. A successful strategy will need to integrate efforts in renewable, nuclear, and fossil energy—and coordinate across end uses in multiple sectors of the economy. This plan provides a strategic framework that incorporates RDD&D efforts of the Office of Energy Efficiency and Renewable Energy, Office of Fossil Energy and Carbon Management, Office of Nuclear Energy, Office of Electricity, Office of Science, Loan Programs Office, Office of Manufacturing and Energy Supply Chains, Office of Clean Energy Demonstrations, and the Advanced Research Projects Agency – Energy to advance the production, transport, storage, and use of hydrogen. In 2023, several federal agencies developed the U.S. National Hydrogen Strategy and Roadmap, a comprehensive, nationwide framework for accelerating the production, processing, delivery, storage, and use of hydrogen. This 2024 update to the Hydrogen Program Plan explains how DOE offices collaboratively work to efficiently implement the strategies outlined in the Strategy and Roadmap. 
This new version also includes updated supporting data and analysis, a description of the regional hydrogen hubs, information about ambitious DOE-wide goals established through the Hydrogen Shot™, and examples of DOE-wide efforts to establish a strong workforce, maximize technology efficiency, and accelerate innovation in the transition to a hydrogen economy. Learn more about the Hydrogen Program Plan.

Read More »

Crude Settles Higher Despite Saudi Price Cut

Oil climbed in choppy trading as the market weighed geopolitical tensions against a price cut on Saudi Arabia’s main grade of crude to Asia. West Texas Intermediate rose 1.2% to settle below $60. Ukrainian negotiators will have a new round of talks in Florida as Russian President Vladimir Putin said some points in a US-backed peace plan to end Moscow’s war were unacceptable to him. That means a lifting of sanctions on Russian oil is still elusive, offering support to prices. Lending further support, Putin emphasized that his country’s energy cooperation with India “remains unaffected.” He added that a Russian oil company has been “continuously expanding operations” of an Indian refinery, and Moscow’s oil flows to India are running smoothly. Meanwhile, US President Donald Trump reiterated on Wednesday that the US will start striking alleged drug cartels on land in Venezuela very soon. Military intervention could force the South American nation’s oil production and exports to drop. Still, signs of a surplus are proliferating in global oil markets. State producer Saudi Aramco will reduce the price of its flagship Arab Light crude grade to a 60-cent premium to the regional benchmark for January, according to a price list seen by Bloomberg. The cut brings prices to the lowest level since 2021. Crude prices are down about 17% this year as booming supply from the Americas, coupled with hikes from the OPEC+ group itself, has exceeded subdued demand growth. Canadian crude prices have fallen to their weakest point relative to the US benchmark since March. And the International Energy Agency has predicted a record glut in 2026, while Wall Street banks expect futures to head lower. While lower prices are a pain point for producers, they offer a boon for consumers. US gasoline prices fell below $3 a gallon for the

Read More »

UK Energy Tech Firm Seeks $500MM in Funding Round

Kraken Technologies Ltd., a software platform that helps utilities manage the transition to cleaner energy, has kicked off a fresh funding round to raise about $500 million, according to people familiar with the matter. The platform owned by Octopus Energy Group Ltd. has been in talks with prospective investors, said the people, who asked not to be identified as the information is private. Kraken is seeking a valuation of about $15 billion in the funding round, the people said. Deliberations are ongoing and details of the fundraising could still change, the people said. A representative for Octopus Energy declined to comment. In September, Octopus Energy said it plans to spin off Kraken, which has been key to its growth into the UK’s largest electricity supplier, leapfrogging industry incumbents to serve more than 7 million customers in the country. The software allows it to balance out power flows to households as energy-transition technologies like electric vehicles, home batteries, solar panels and heat pumps become more widespread. Kraken’s platform is already being licensed to other energy providers such as Electricite de France SA and Tokyo Gas Co., supporting more than 70 million utility customers globally, its website shows. It’s poised to sustain a 25% annual account growth to reach 100 million by 2027 through multi-sector global licensing, Bloomberg Intelligence analysts Joao Martins and Patricio Alvarez wrote in an Oct. 31 note.  Octopus Energy said in January that it had become the UK’s largest energy supplier just nine years after the company started, overtaking legacy providers such as Centrica Plc’s British Gas. The rapid decarbonization of the country’s electricity has transformed the business landscape for utilities, occasionally sending power prices below zero when wind power generation surges. WHAT DO YOU THINK? Generated by readers, the comments included herein do not reflect the views and opinions of Rigzone. All comments

Read More »

Saudis Cut Main Oil Price to 5-Year Low

Saudi Arabia cut the price of its flagship crude grade to the lowest level in five years, amid persistent signs of a surplus in global oil markets. State producer Saudi Aramco will reduce the price of its Arab Light grade for Asian customers to a 60-cent premium to the regional benchmark for January, according to a price list seen by Bloomberg. It’s the lowest since January 2021 and a drop that was largely in-line with a survey of refiners and traders. The Organization of the Petroleum Exporting Countries and its allies affirmed over the weekend a previous decision to pause production increases in the first quarter of next year, citing a period of weaker seasonal demand during winter months across much of Asia, Europe and North America. Crude prices are down about 16% this year as booming supply from the Americas, in tandem with hikes from the OPEC+ grouping itself, exceeded subdued demand growth. The International Energy Agency has predicted a record glut in 2026, while Wall Street banks including Goldman Sachs Group Inc. see futures heading lower.  Aramco cut all of its prices to Asian buyers, with its Arab Medium crude flipping to a discount for the first time since late-2020. Global benchmark Brent futures erased an earlier gain to trade little changed after the prices were released. WHAT DO YOU THINK? Generated by readers, the comments included herein do not reflect the views and opinions of Rigzone. All comments are subject to editorial review. Off-topic, inappropriate or insulting comments will be removed.

Read More »

$5 Psychological Threshold Still a Relevant Driver for NatGas

In an EBW Analytics Group report sent to Rigzone by the EBW team on Thursday, Eli Rubin, an energy analyst at the company, outlined that the “$5.00 [per million British thermal units (MMBtu)] psychological threshold” is “still a relevant market driver” for natural gas. “The January natural gas contract traded as high as $5.039 [per MMBtu] yesterday before closing at $4.995,” Rubin said in the report. “Technicals appear supportive of further upside, physical prices strong, and very cold forecasts intact,” he added. “DTN’s forecast for Weeks 2 and 3 added another 4 gHDDs in the past 24 hours, and the Week 4 forecast highlights enduring cold risks into Christmas. If frigid late-December forecasts roll forward, it could present further tailwinds for NYMEX upside,” he continued. Rubin highlighted in the report that consensus projections for this morning’s U.S. Energy Information Administration (EIA) weekly natural gas storage report are for a 14 to 18 billion cubic foot draw. “The subsequent four EIA weeks, however, may total 218 Bcf (7.8 Bcfpd) tighter than the five-year average,” he said. “LNG remains up 5.1 Bcfpd and November production upside seems to be stalling. Our most likely storage projection for the end of March dipped below 1,600 Bcf,” he added. “While winter 2025/26 storage appears adequate, concerns over rebuilding storage next year have led the 2026 injection season contracts to add 16.3 cents since Friday – even outpacing January contract gains (+14.5 cents) week to date,” he continued. The EBW report outlined that the January natural gas contract’s Wednesday close of $4.995 per MMBtu marked a 15.5 cent, or 3.2 percent, increase from Tuesday’s close. In the report, EBW predicted a “test higher and relent” trend for the NYMEX front-month natural gas contract price over the next 7-10 days and a “volatile path higher” trend over

Read More »

Analysts See ‘5-10 Years of Decent Growth’ for Oil Demand

In a report sent to Rigzone by the Morningstar team this week, Morningstar analysts said they believe oil demand “still has five to 10 years of decent growth before plateauing in the early 2030s”. “Demand doesn’t start declining until the late 2030s,” the analysts said in the report, which is titled The Future of Oil to 2050. “We project oil demand to grow from 104 million barrels per day in 2024 to a peak of 108 million barrels per day in 2032, then decline to 96 million barrels per day in 2050,” they added. The analysts highlighted in the report that the 2050 projection represents a cumulative eight percent drop versus 2024, “or 0.3 percent annually”. The Morningstar analysts stated in the report that, as a result of their demand outlook, they’ve recently upgraded their midcycle oil price to $65 per barrel from $60. “Over 2025 to 2034, they expect Brent oil prices to average $65 per barrel in inflation-adjusted terms,” the analysts said in the report. “This is close to the current Brent price at $63 as of November 2025, and a bit below the 2015-24 average real Brent price of $76,” they added. “We use this 10-year average through 2034 as our midcycle price. By the 2040s, we project oil prices to surge over $100 per barrel in real terms,” they continued. Morningstar analysts highlighted that The Future of Oil to 2050 report is an update to the company’s 2021 report, The Future of Oil Demand. “Our bullish thesis on oil demand remains essentially unchanged from that earlier report,” the analysts added. “New in this year’s report is a price forecast through 2050, derived by combining our in-house demand view with Rystad’s supply-side projections,” they added. The analysts pointed out in the report that Morningstar’s methodology includes a

Read More »

With AI Factories, AWS aims to help enterprises scale AI while respecting data sovereignty

“The AWS AI Factory seeks to resolve the tension between cloud-native innovation velocity and sovereign control. Historically, these objectives lived in opposition. CIOs faced an unsustainable dilemma: choose between on-premises security or public cloud cost and speed benefits,” he said. “This is arguably AWS’s most significant move in the sovereign AI landscape.” On premises GPUs are already a thing AI Factories isn’t the first attempt to put cloud-managed AI accelerators in customers’ data centers. Oracle introduced Nvidia processors to its Cloud@Customer managed on-premises offering in March, while Microsoft announced last month that it will add Nvidia processors to its Azure Local service. Google Distributed Cloud also includes a GPU offering, and even AWS offers lower-powered Nvidia processors in its AWS Outposts. AWS’ AI Factories is also likely to square off against from a range of similar products, such as Nvidia’s AI Factory, Dell’s AI Factory stack, and HPE’s Private Cloud for AI — each tightly coupled with Nvidia GPUs, networking, or software, and all vying to become the default on-premises AI platform. But, said Sopko, AWS will have an advantage over rivals due to its hardware-software integration and operational maturity: “The secret sauce is the software, not the infrastructure,” he said. Omdia principal analyst Alexander Harrowell expects AWS’s AI Factories to combine the on-premises control of Outposts with the flexibility and ability to run a wider variety of services offered by AWS Local Zones, which puts small data centers close to large population centers to reduce service latency. Sopko cautioned that enterprises are likely to face high commitment costs, drawing a parallel with Oracle’s OCI Dedicated Region, one of its Cloud@Customer offerings.

Read More »

HPE loads up AI networking portfolio, strengthens Nvidia, AMD partnerships

On the hardware front, HPE is targeting the AI data center edge with a new MX router and the scale-out networking delivery with a new QFX switch. Juniper’s MX series is its flagship routing family aimed at carriers, large-scale enterprise data center and WAN customers, while the QFX line services data center customers anchoring spine/leaf networks as well as top-of-rack systems. The new 1U, 1.6Tbps MX301 multiservice edge router, available now, is aimed at bringing AI inferencing closer to the source of data generation and can be positioned in metro, mobile backhaul, and enterprise routing applications, Rahim said. It includes high-density support for 16 x 1/1025/50GbE, 10 x 100Gb and 4 x 400Gb interfaces. “The MX301 is essentially the on-ramp to provide high speed, secure connections from distributed inference cluster users, devices and agents from the edge all the way to the AI data center,” Rami said. “The requirements here are typically around high performance, but also very high logical skills and integrated security.” In the QFX arena, the new QFX5250 switch, available in 1Q 2026, is a fully liquid-cooled box aimed at tying together Nvidia Rubin and/or AMD MI400 GPUs for AI consumption across the data center. It is built on Broadcom Tomahawk 6 silicon and supports up to 102.4Tbps Ethernet bandwidth, Rahim said.  “The QFX5250 combines HPE liquid cooling technology with Juniper networking software (Junos) and integrated AIops intelligence to deliver a high-performance, power-efficient and simplified operations for next-generation AI inference,” Rami said. Partnership expansions Also key to HPE/Juniper’s AI networking plans are its partnerships with Nvidia and AMD. The company announced its relationship with Nvidia now includes HPE Juniper edge onramp and long-haul data center interconnect (DCI) support in its Nvidia AI Computing by HPE portfolio. This extension uses the MX and Junipers PTX hyperscaler routers to support high-scale, secure

Read More »

What is co-packaged optics? A solution for surging capacity in AI data center networks

When it announced its CPO-capable switches, Nvidia said they would improve resiliency by 10 times at scale compared to previous switch generations. Several factors contribute to this claim, including the fact that the optical switches require four times fewer lasers, Shainer says. Whereas the laser source was previously part of the transceiver, the optical engine is now incorporated onto the ASIC, allowing multiple optical channels to share a single laser. Additionally, in Nvidia’s implementation, the laser source is located outside of the switch. “We want to keep the ability to replace a laser source in case it has failed and needs to be replaced,” he says. “They are completely hot-swappable, so you don’t need to shut down the switch.”

Nonetheless, you may often hear that when something fails in a CPO box, you need to replace the entire box. That may be true if it’s the photonics engine embedded in silicon inside the box. “But they shouldn’t fail that often. There are not a lot of moving parts in there,” Wilkinson says. While he understands the argument around failures, he doesn’t expect it to pan out as CPO gets deployed. “It’s a fallacy,” he says. There’s also a simple workaround to the resiliency issue, which hyperscalers are already talking about, Karavalas says: overbuild. “Have 10% more ports than you need or 5%,” he says. “If you lose a port because the optic goes bad, you just move it and plug it in somewhere else.”

Which vendors are backing co-packaged optics?

In terms of vendors that have or plan to have CPO offerings, the list is not long, unless you include various component players like TSMC. But in terms of major switch vendors, here’s a rundown: Broadcom has been making steady progress on CPO since 2021. It is now shipping “to
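The overbuild workaround Karavalas describes is just spare-capacity arithmetic. A minimal sketch (the cluster sizes below are assumed for illustration, not taken from the article):

```python
import math

def ports_to_provision(needed: int, overbuild_pct: float) -> int:
    """Total switch ports to provision so that, even after a few
    optical engines fail, `needed` ports remain usable."""
    return math.ceil(needed * (1 + overbuild_pct / 100.0))

# With the 5-10% overbuild the article mentions, for a hypothetical
# cluster needing 512 ports:
print(ports_to_provision(512, 10))  # 564 ports provisioned
print(ports_to_provision(512, 5))   # 538 ports provisioned
```

The spare ports sit idle until an optic fails, at which point the affected link is simply re-patched rather than the box being serviced.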

Read More »

Nvidia’s $2B Synopsys stake tests independence of open AI interconnect standard

But the concern for enterprise IT leaders is whether Nvidia’s financial stakes in UALink consortium members could influence the development of an open standard specifically designed to compete with Nvidia’s proprietary technology and to give enterprises more choices in the data center. Organizations planning major AI infrastructure investments view such open standards as critical to avoiding vendor lock-in and maintaining competitive pricing. “This does put more pressure on UALink since Intel is also a member and also took investment from Nvidia,” Sag said.

UALink and Synopsys’s critical role

UALink represents the industry’s most significant effort to prevent vendor lock-in for AI infrastructure. The consortium ratified its UALink 200G 1.0 Specification in April, defining an open standard for connecting up to 1,024 AI accelerators within computing pods at 200 Gbps per lane — directly competing with Nvidia’s NVLink for scale-up applications. Synopsys plays a critical role. The company joined UALink’s board in January and in December announced the industry’s first UALink design components, enabling chip designers to build UALink-compatible accelerators.

Analysts flag governance concerns

Gaurav Gupta, VP analyst at Gartner, acknowledged the tension. “The Nvidia-Synopsys deal does raise questions around the future of UALink as Synopsys is a key partner of the consortium and holds critical IP for UALink, which competes with Nvidia’s proprietary NVLink,” he said. Sanchit Vir Gogia, chief analyst at Greyhound Research, sees deeper structural concerns. “Synopsys is not a peripheral player in this standard; it is the primary supplier of UALink IP and a board member within the UALink Consortium,” he said. “Nvidia’s entry into Synopsys’ shareholder structure risks contaminating that neutrality.”

Read More »

Cooling crisis at CME: A wakeup call for modern infrastructure governance

Organizations should reassess redundancy

However, he pointed out, “the deeper concern is that CME had a secondary data center ready to take the load, yet the failover threshold was set too high, and the activation sequence remained manually gated. The decision to wait for the cooling issue to self-correct rather than trigger the backup site immediately revealed a governance model that had not evolved to keep pace with the operational tempo of modern markets.” Thermal failures, he said, “do not unfold on the timelines assumed in traditional disaster recovery playbooks. They escalate within minutes and demand automated responses that do not depend on human certainty about whether a facility will recover in time.”

Matt Kimball, VP and principal analyst at Moor Insights & Strategy, said that to some degree what happened in Aurora highlights an issue that may arise on occasion: “the communications gap that can exist between IT executives and data center operators. Think of ‘rack in versus rack out’ mindsets.” Often, he said, the operational elements of that data center environment, such as cooling, power, fire hazards, physical security, and so forth, fall outside the realm of an IT executive focused on delivering IT services to the business. “And even if they don’t fall outside the realm, these elements are certainly not a primary focus,” he noted. “This was certainly true when I was living in the IT world.”

Additionally, said Kimball, “this highlights the need for organizations to reassess redundancy and resilience in a new light. Again, in IT, we tend to focus on resilience and redundancy at the app, server, and workload layers. Maybe even cluster level. But as we continue to place more and more of a premium on data, and the terms ‘business critical’ or ‘mission critical’ have real relevance, we have to zoom out

Read More »

Microsoft loses two senior AI infrastructure leaders as data center pressures mount

Microsoft did not immediately respond to a request for comment.

Microsoft’s constraints

Analysts say the twin departures mark a significant setback for Microsoft at a critical moment in the AI data center race, with pressure mounting from both OpenAI’s model demands and Google’s infrastructure scale. “Losing some of the best professionals working on this challenge could set Microsoft back,” said Neil Shah, partner and co-founder at Counterpoint Research. “Solving the energy wall is not trivial, and there may have been friction or strategic differences that contributed to their decision to move on, especially if they saw an opportunity to make a broader impact and do so more lucratively at a company like Nvidia.” Even so, Microsoft has the depth and ecosystem strength to continue doubling down on AI data centers, said Prabhu Ram, VP for industry research at Cybermedia Research.

According to Sanchit Gogia, chief analyst at Greyhound Research, the departures come at a sensitive moment because Microsoft is trying to expand its AI infrastructure faster than physical constraints allow. “The executives who have left were central to GPU cluster design, data center engineering, energy procurement, and the experimental power and cooling approaches Microsoft has been pursuing to support dense AI workloads,” Gogia said. “Their exit coincides with pressures the company has already acknowledged publicly. GPUs are arriving faster than the company can energize the facilities that will house them, and power availability has overtaken chip availability as the real bottleneck.”

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular presence at the big tech trade show in Las Vegas as a non-tech company showing off technology, and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.)

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
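The LLM-as-judge idea mentioned above, using three or more cheap models and taking a majority vote, can be sketched as follows. Everything here is illustrative: `judge_with_panel` and the stub judge functions are hypothetical stand-ins for real model API calls.

```python
from collections import Counter

def judge_with_panel(candidate: str, judges) -> str:
    """Ask each judge for a verdict on a candidate answer and return
    the majority vote. In practice each judge would wrap an API call
    to a cheap model, as the article suggests."""
    verdicts = [judge(candidate) for judge in judges]
    winner, _ = Counter(verdicts).most_common(1)[0]
    return winner

# Stub judges standing in for real judge models (hypothetical checks):
def strict(ans: str) -> str:       # flag possibly hallucinated URLs
    return "fail" if "http" in ans else "pass"

def lenient(ans: str) -> str:      # always accepts
    return "pass"

def length_check(ans: str) -> str: # reject overly long answers
    return "pass" if len(ans) < 200 else "fail"

print(judge_with_panel("The capital of France is Paris.",
                       [strict, lenient, length_check]))  # prints "pass"
```

Because the panel is just a list of callables, swapping in additional judge models (or weighting their votes) changes nothing about the aggregation logic.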

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model, because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »