Talking about Games

Game theory is a field of research that is quite prominent in economics but less well known in other scientific disciplines. However, the concepts used in game theory can be of interest to a wider audience, including data scientists, statisticians, computer scientists and psychologists, to name just a few. This article is the opener to a four-chapter tutorial series on the fundamentals of game theory, so stay tuned for the upcoming articles. 

In this article, I will explain the kinds of problems game theory deals with and introduce the main terms and concepts used to describe a game. We will see some examples of games that are typically analysed within game theory and lay the foundation for deeper insights into the capabilities of game theory in the later chapters. But before we go into the details, I want to introduce you to some applications of game theory that show the multitude of areas game-theoretic concepts can be applied to. 

Applications of game theory

Even french fries can be an application of game theory. Photo by engin akyurt on Unsplash

Does it make sense to vote for a small party in an election if this party may not have a chance to win anyway? Is it worth starting a price war with your competitor who offers the same goods as you? Do you gain anything by reducing your catch in overfished areas if your competitors simply carry on as before? Should you take out insurance if you believe that the government will pay for the reconstruction after the next hurricane anyway? And how should you behave in the next auction where you are about to bid on your favourite Picasso painting? 

All these questions (and many more) live within the area of applications that can be modelled with game theory. Whenever a situation includes strategic decisions in interaction with others, game-theoretic concepts can be applied to describe this situation formally and search for decisions that are not made intuitively but that are backed by a notion of rationality. Key to all the situations above is that your decisions depend on other people’s behaviour. If everybody agrees to conserve the overfished areas, you want to play along to preserve nature, but if you think that everybody else will continue fishing, why should you be the only one to stop? Likewise, your voting behaviour in an election might heavily depend on your assumptions about other people’s votes. If nobody votes for that candidate, your vote will be wasted, but if everybody thinks so, the candidate doesn’t have a chance at all. Maybe there are many people who say “I would vote for him if others vote for him too”.

Similar dynamics can arise in quite different contexts. Have you ever thought about ordering food with friends, and everybody said, “You don’t have to order anything because of me, but if you order anyway, I’d take some french fries”? All these examples can be applications of game theory, so let’s start understanding what game theory is all about. 

Understanding the game

Before playing, you need to understand the components of the game. Photo by Laine Cooper on Unsplash

When you hear the word game, you might think of video games such as Minecraft, board games such as Monopoly, or card games such as poker. There are some common principles to all these games: we always have some players who are allowed to do certain things determined by the game’s rules. For example, in poker, you can raise, check or give up. In Monopoly, you can buy a property you land on or leave it. We also have some notion of how to win the game. In poker, you have to get the best hand to win, and in Monopoly, you have to be the last person standing after everybody else has gone bankrupt. That also means that some actions are better than others in some scenarios. If you have two aces in your hand, staying in the game is better than giving up. 

When we look at games from the perspective of game theory, we use the same concepts, just more formally.

A game in game theory consists of n players, where each player has a strategy set and a utility function.

A game consists of a set of players I = {1, …, n}, where each player has a strategy set S and a utility function ui(s1, s2, …, sn). The strategy set is determined by the rules of the game. For example, it could be S = {check, raise, give-up}, and the player would have to decide which of these actions they want to use. The utility function u (also called the reward) describes how valuable a certain action of a player would be, given the actions of the other players. Every player wants to maximize their utility, but now comes the tricky part: the utility of an action of yours depends on the other players’ actions. And for them, the same applies: their actions’ utilities depend on the actions of the other players (including yours). 

Let’s consider a well-known game to illustrate this point. In rock-paper-scissors, we have n=2 players, and each player can choose between three actions, hence the strategy set is S = {rock, paper, scissors} for each player. But the utility of an action depends on what the other player does. If your opponent chooses rock, the utility of paper is high (1), because paper beats rock. But if your opponent chooses scissors, the utility of paper is low (-1), because you would lose. Finally, if your opponent chooses paper as well, you reach a draw and the utility is 0. 

Utility values for player 1 choosing paper, for the three choices of the opponent’s strategy.

Instead of writing down the utility function for each case individually, it is common to display games in a matrix like this:

The first player chooses the row of the matrix by selecting their action, and the second player chooses the column. For example, if player 1 chooses paper and player 2 chooses scissors, we end up in the cell in the second row and third column. The pair of values in this cell gives the utilities for both players: the first value corresponds to player 1 and the second value to player 2. (-1, 1) means that player 1 has a utility of -1 and player 2 has a utility of 1. Scissors beat paper. 
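To make the matrix concrete, the payoff lookup can be sketched in a few lines of Python. This is an illustrative sketch, not code from the article; the names `PAYOFFS` and `utility` are my own:

```python
# Rock-paper-scissors as a normal-form game. The payoff matrix maps each
# strategy profile (player 1's action, player 2's action) to the pair of
# utilities (u1, u2). The game is zero-sum: the two utilities sum to 0.
PAYOFFS = {
    ("rock", "rock"): (0, 0),
    ("rock", "paper"): (-1, 1),
    ("rock", "scissors"): (1, -1),
    ("paper", "rock"): (1, -1),
    ("paper", "paper"): (0, 0),
    ("paper", "scissors"): (-1, 1),
    ("scissors", "rock"): (-1, 1),
    ("scissors", "paper"): (1, -1),
    ("scissors", "scissors"): (0, 0),
}

def utility(player, s1, s2):
    """u_i(s1, s2): the utility of player i (1 or 2) for a strategy profile."""
    return PAYOFFS[(s1, s2)][player - 1]

# Player 1 plays paper, player 2 plays scissors: scissors beat paper.
print(utility(1, "paper", "scissors"))  # -1
print(utility(2, "paper", "scissors"))  # 1
```

Reading off a cell of the matrix is exactly this dictionary lookup: the row is player 1’s action, the column is player 2’s.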

Some more details

Now we have understood the main components of a game in game theory. Let me add a few more hints on what game theory is about and what assumptions it uses to describe its scenarios. 

  • We often assume that the players select their actions at the same time (like in rock-paper-scissors). We call such games static games. There are also dynamic games in which players take turns deciding on their actions (like in chess). We will consider these cases in a later chapter of this tutorial. 
  • In game theory, it is typically assumed that the players cannot communicate with each other, so they can’t come to an agreement before deciding on their actions. In rock-paper-scissors, you wouldn’t want to do that anyway, but there are other games where communication would make it easier to choose an action. However, we will always assume that communication is not possible. 
  • Game theory is considered a normative theory, not a descriptive one. That means we will analyse games concerning the question “What would be the rational solution?” This may not always be what people do in a similar situation in reality. Such descriptions of real human behaviour are part of the research field of behavioural economics, which sits on the border between psychology and economics. 

The prisoner’s dilemma

The prisoner’s dilemma is all about not ending up here. Photo by De an Sun on Unsplash

Let us become more familiar with the main concepts of game theory by looking at some typical games that are often analysed. Often, such games are derived from a story or scenario that may happen in the real world and requires people to decide between some actions. One such story could be as follows: 

Say we have two criminals who are suspected of having committed a crime. The police have some circumstantial evidence, but no actual proof of their guilt. Hence, they question the two criminals, who now have to decide if they want to confess or deny the crime. If you are in the situation of one of the criminals, you might think that denying is always better than confessing, but now comes the tricky part: the police propose a deal to you. If you confess while your partner denies, you are considered a crown witness and will not be punished. In this case, you are free to go, but your partner will go to jail for six years. Sounds like a good deal, but be aware that the outcome also depends on your partner’s action. If you both confess, there is no crown witness anymore and you both go to jail for three years. If you both deny, the police can only use the circumstantial evidence against you, which will lead to one year in prison for both you and your partner. But be aware that your partner is offered the same deal. If you deny and he confesses, he is the crown witness and you go to jail for six years. How do you decide?

The prisoner’s dilemma.

The game derived from this story is called the prisoner’s dilemma and is a typical example of a game in game theory. We can visualize it as a matrix just like we did with rock-paper-scissors before and in this matrix, we easily see the dilemma the players are in. If both deny, they receive a rather low punishment. But if you assume that your partner denies, you might be tempted to confess, which would prevent you from going to jail. But your partner might think the same, and if you both confess, you both go to jail for longer. Such a game can easily make you go round in circles. We will talk about solutions to this problem in the next chapter of this tutorial. First, let’s consider some more examples. 
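The circular reasoning can be made explicit in code. The following sketch (my own illustration, with utilities written as negated years in prison so that higher is better) computes your best reply for each possible action of your partner:

```python
# Prisoner's dilemma. Utilities are negated years in prison (higher = better).
# The key is (your action, partner's action); the value is (your utility,
# partner's utility).
PAYOFFS = {
    ("deny", "deny"): (-1, -1),
    ("deny", "confess"): (-6, 0),
    ("confess", "deny"): (0, -6),
    ("confess", "confess"): (-3, -3),
}

def best_response(partner_action):
    """Your utility-maximizing action, given your partner's action."""
    return max(["deny", "confess"],
               key=lambda a: PAYOFFS[(a, partner_action)][0])

# Whatever the partner does, confessing is the better reply for you ...
print(best_response("deny"))     # confess (0 beats -1)
print(best_response("confess"))  # confess (-3 beats -6)
# ... yet mutual confession (-3 each) is worse than mutual denial (-1 each).
```

This is the dilemma in a nutshell: confessing is the best reply to either action of your partner, but if both players reason this way, they end up worse off than if both had denied.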

Bach vs. Stravinsky

Who do you prefer, Bach or Stravinsky? Photo by Sigmund on Unsplash

You and your friend want to go to a concert together. You are a fan of Bach’s music, but your friend favours the Russian 20th-century composer Igor Stravinsky. However, you both want to avoid going to a concert alone. Although you prefer Bach over Stravinsky, you would rather go to the Stravinsky concert with your friend than go to the Bach concert alone. We can create a matrix for this game: 

Bach vs. Stravinsky

You choose the row by going to the Bach or Stravinsky concert, and your friend chooses the column by going to one of the concerts as well. For you, it would be best if you both chose Bach. Your reward would be 2, and your friend would get a reward of 1, which is still better for him than being at the Stravinsky concert all by himself. However, he would be even happier if you were at the Stravinsky concert together. 

Do you remember that we said players are not allowed to communicate before making their decision? This example illustrates why. If you could just call each other and decide where to go, this would no longer be a game worth investigating with game theory. But you can’t call each other, so you just have to go to one of the concerts and hope you will meet your friend there. What do you do? 
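In contrast to the prisoner’s dilemma, here your best choice simply mirrors your friend’s. A small sketch (my own illustration; the payoff values follow the matrix described above) makes this visible:

```python
# Bach vs. Stravinsky as a coordination game. The key is (your concert,
# friend's concert); the value is (your reward, friend's reward).
# Mismatched choices mean you are both alone, which is worst for everyone.
PAYOFFS = {
    ("bach", "bach"): (2, 1),
    ("bach", "stravinsky"): (0, 0),
    ("stravinsky", "bach"): (0, 0),
    ("stravinsky", "stravinsky"): (1, 2),
}

def your_best_response(friends_concert):
    """Your reward-maximizing concert, given where your friend goes."""
    return max(["bach", "stravinsky"],
               key=lambda c: PAYOFFS[(c, friends_concert)][0])

print(your_best_response("bach"))        # bach
print(your_best_response("stravinsky"))  # stravinsky: meeting beats preference
```

Unlike in the prisoner’s dilemma, there is no action that is best regardless of the other player: you want to match whatever your friend does, which is exactly why the inability to communicate hurts here.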

Arm or disarm?

Make love, not war. Photo by Artem Beliaikin on Unsplash

A third example brings us to the realm of international politics. The world would be a much happier place with fewer weapons, wouldn’t it? However, if nations think about disarmament, they also have to consider the choices other nations make. If the USA disarms, the Soviet Union might want to rearm to be able to attack the USA — that was the thinking during the Cold War, at least. Such a scenario could be described with the following matrix: 

The matrix for the disarm vs. upgrade game.

As you can see, when both nations disarm, they get the highest reward (3 each), because there are fewer weapons in the world and the risk of war is minimized. However, if you disarm while the opponent upgrades, your opponent is in the better position and gets a reward of 2, while you only get 0. Then again, it might have been better to upgrade yourself, which gives a reward of 1 for both players. That is better than being the only one who disarms, but not as good as it would be if both nations disarmed. 

The solution?

All these examples have one thing in common: there is no single option that is always the best. Instead, the utility of an action for one player always depends on the other player’s action, which, in turn, depends on the first player’s action, and so on. Game theory is interested in finding the optimal solution and deciding what would be the rational action; that is, the action that maximizes the expected reward. Different ideas on what exactly such a solution looks like will be the subject of the next chapter in this series. 

Summary

Learning about game theory is as much fun as playing a game, don’t you think? Photo by Christopher Paul High on Unsplash

Before continuing with finding solutions in the next chapter, let us recap what we have learned so far. 

  • A game consists of players who decide on actions, which have a utility or reward.
  • The utility/reward of an action depends on the other players’ actions. 
  • In static games, players choose their actions simultaneously. In dynamic games, they take turns. 
  • The prisoner’s dilemma is a very popular example of a game in game theory.
  • Games become especially interesting when there is no single action that is better than all the others. 

Now that you are familiar with how games are described in game theory, you can check out the next chapter to learn how to find solutions for these games. 

References

The topics introduced here are typically covered in standard textbooks on game theory. I mainly used this one, which is written in German though: 

  • Bartholomae, F., & Wiens, M. (2016). Spieltheorie. Ein anwendungsorientiertes Lehrbuch. Wiesbaden: Springer Fachmedien Wiesbaden.

An alternative in English could be this one: 

  • Espinola-Arredondo, A., & Muñoz-Garcia, F. (2023). Game Theory: An Introduction with Step-by-step Examples. Springer Nature.

Game theory is a rather young field of research, with the first main textbook being this one: 

  • Von Neumann, J., & Morgenstern, O. (1944). Theory of Games and Economic Behavior. Princeton University Press.

Like this article? Follow me to be notified of my future posts.

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

Extreme plots enterprise marketplace for AI agents, tools, apps

Extreme Networks this week previewed an AI marketplace where it plans to offer a curated catalog of AI tools, agents and applications. Called Extreme Exchange, it’s designed to give enterprise customers a way to discover, deploy, and create AI agents, microapps, and workflows in minutes rather than developing such components

Read More »

Top quantum breakthroughs of 2025

The Helios quantum computing platform is available to customers through Quantinuum’s cloud service and on-premises offering. HSBC is using IBM’s Heron quantum computer to improve their bond trading predictions by 34% compared to classical computing. Caltech physicists create 6,100-qubit array. Kon H. Leung is seen working on the apparatus used

Read More »

How enterprises are rethinking online AI tools

A second path enterprises like had only about 35% buy-in, but generated the most enthusiasm. It is to use an online AI tool that offers more than a simple answer to a question, something more like an “interactive AI agent” than a chatbot. Two that got all the attention are

Read More »

Jade Secures Preliminary Funding Deal for Mongolian CBM-to-LNG Project

Zhengzhou Langrun Intelligent Equipment Co Ltd has signed a non-binding letter of intent to provide up to $46 million (AUD 70 million) in financing for a coal bed methane (CBM)-to-liquefied natural gas (LNG) project by Jade Gas Holdings Ltd in Mongolia. The agreement is for the Red Lake gas field, part of the Australian company’s flagship project with the Mongolian government’s Erdenes Methane LLC to develop the Tavantolgoi XXXIII unconventional oil basin (TTCBM Project). Red Lake has 246 billion cubic feet of 2C gross unrisked contingent resources, according to Jade. The Chinese CBM-focused gas equipment manufacturer would fund drilling and production for the next 18 wells in the field, Jade said in a stock filing. Jade has already drilled seven Red Lake wells according to the company. The “non-dilutive financing” would also cover surface facilities for gathering, processing and liquefying gas produced from the field into LNG. The deal also includes “a low upfront capital outlay option, to be funded by future Jade revenue”, Jade said. The parties agreed to consider expanding the terms to accommodate all 175 gas production wells in Red Lake’s first-phase development. Phase 1 involves 20 production wells, including two that came online June, according to Jade. “Langrun’s expertise in the gas industry in China and in particular in CBM offers a great fit for Jade as the company seeks options to fast-track development of the Red Lake gas field and to optimize gas production for faster access to customer markets and ultimately early revenue”, Jade said. “Subject to agreement of definitive documentation, and government and regulator cooperation and other approvals, the Red Lake gas field could potentially be developed to cover purification, pipeline and other transport, compression (for potential production of CNG), liquefaction (for production of LNG), refueling station construction, enabling gas sales for vehicle,

Read More »

Var Energi Confirms Oil Discovery Near Goliat

Var Energi ASA on Thursday confirmed oil in the Zagato North appraisal well, located 10 kilometers (6.21 miles) north of its operated Goliat field on Norway’s side of the Barents Sea. Zagato North, or well 7122/8-4 S, yielded estimated gross recoverable resources of up to three million barrels of oil equivalent (MMboe) in the Klappmyss and Realgrunnen formations, according to a press release by the Stavanger, Norway-based oil and gas explorer and producer. The discovery is part of Production License 229, operated by Var Energi with a 65 percent stake with Equinor as partner holding 35 percent. It is the13th well drilled in the production license, awarded under the Barents Sea Project in 1997, the Norwegian Offshore Directorate (NOD) said separately. The partners are considering tying the discovery to existing Goliat infrastructure. The discovery had been proven February. The well aimed to delineate the 7122/8-3 S (Zagato) discovery in Lower Jurassic-Upper Triassic and Middle Triassic reservoir rocks in the Realgrunnen Subgroup and the Kobbe Formation respectively. “Well 7122/8-4 S encountered an 11-meter [36.09 feet] oil column in the Tubaen Formation in the Realgrunnen Subgroup in reservoir rocks totaling 8.5 meters with good reservoir quality”, the NOD said. “The oil/water contact was encountered 1,523 meters below sea level. “Additional reservoir rocks were encountered in the Kobbe Formation totaling 48 meters with moderate reservoir quality, but the reservoirs were aquiferous. “An 80-meter oil column was also proven in the Klappmyss Fomation in sandstone layers totaling one meter with poor reservoir quality. The oil/water contact was not encountered. “The well was not formation-tested, but extensive data acquisition and sampling were carried out. “Appraisal well 7122/8-4 S was drilled to respective measured/vertical depths of 2986/2793 meters below sea level, and was terminated in the Klappmyss Formation in the Lower Triassic”. Zagato North, which has a

Read More »

A new kind of self-service: empowering utilities to shape their own tech

When utilities are empowered to meet their own tech needs, the whole system benefits. Across the energy industry, the idea of self-service has long been aimed at end-customers, helping them check their usage, change a tariff, or resolve an issue without calling a support team. But in an era of rapid digital transformation, a new kind of self-service is emerging: one aimed at utilities themselves.Today, it’s becoming increasingly critical that utilities must be able to serve their own tech needs, configuring their tech systems to build new products, refine processes, and connect systems without complicated, lengthy coding and other outside help. Independently configurable tech is about giving teams direct control of their tools so they can respond to challenges and innovate at speed. As grids decentralize and customer expectations rise, utilities can’t afford to get caught up in coding request tickets. Many utilities still struggle with cumbersome, disconnected and inflexible systems.  Now, fortunately, a new generation of integrated, configurable tech is here, laying the groundwork for utilities to champion their changing industry. Rigid tech is holding utilities back Clunky tech remains one of the biggest barriers to utility innovation. Many utilities still rely on a patchwork of disconnected legacy systems – for billing, metering, customer care, and field operations, to name a few. These systems are often siloed, with data held in a range of different forms and formats. Even relatively small updates – building a new rate, or tweaking a debt collection process – often demands weeks of specialist work across several siloed platforms, followed by more effort to stitch those updates together without breaking everything. This isn’t just slow and costly, it wears people down. Talented teams lose faith when they can’t fix what’s broken or move ideas forward. Empowering utilities to serve their own tech needs Bringing

Read More »

Crude Settles Higher

Oil eked out gains, rebounding slightly from the previous trading day’s sharp slump as traders weighed the outlook for a record surplus against supply risks from US sanctions. West Texas Intermediate rose 0.3% to settle under $59 a barrel after losing almost 4.2% on Wednesday, its biggest drop since June. Expectations for a long-awaited surplus were bolstered when the International Energy Agency flagged a deteriorating outlook for a sixth consecutive month, saying in a report on Thursday that supply will exceed demand by just over four million barrels a day next year.  Hours later, a US government report showed crude inventories rose 6.4 million barrels last week, the biggest increase since July and markedly higher than expected.  Both announcements came a day after producer group OPEC — which has been restoring idled capacity this year — said that global supply had topped demand in the third quarter, flipping its earlier estimate for the period from a shortfall. The bearish outlook for next year has weighed on oil prices afresh in recent days, with a key indicator — WTI’s prompt spread — sinking into contango. That pricing pattern, with the nearest contracts trading at discounts to further-out ones, signals ample short-term supplies, though it also clawed back into bearish territory on Thursday. At the same time, the Trump administration has moved to raise the pressure on Russia to end the war in Ukraine, including sanctions on Rosneft PJSC and Lukoil PJSC. With days to go until sanctions fully kick in, The Carlyle Group Inc. is exploring its options to buy Lukoil’s foreign assets, Reuters reported.  And bearish momentum on the news of rising US crude inventories was in part undercut by indications that product inventories fell across the board while exports picked up, a sign of resilient consumption at home and

Read More »

South Sudan Asks Crude Producers for $2.5B Pre-Payment

South Sudan is asking crude producers for a total pre-payment of $2.5 billion, which it intends to refund through future production that would have been entitled to Petroliam Nasional Bhd. In separate letters sent in October to ONGC Nile Ganga BV and a unit of China National Petroleum Corp. and seen by Bloomberg, the petroleum ministry asked for $1 billion and $1.5 billion respectively to be repaid in 54 months. The two companies were in consortia with Petronas before the Malaysian multinational abandoned its assets in South Sudan, which were subsequently confiscated by the state-owned Nile Petroleum Corp. “We hope the agreement will be worked within one month from the date of receipt of this letter,” according to both leaked documents, which the ministry confirmed as authentic.  Both companies were not readily available to comment. The ministry said the leak “constitutes a gross violation of government confidentiality, ethics, and national security protocols.” State coffers in South Sudan — which relies on crude exports for more than 90% of government revenue — have run dry as a war in neighboring Sudan froze nearly the nation’s entire export. WHAT DO YOU THINK? Generated by readers, the comments included herein do not reflect the views and opinions of Rigzone. All comments are subject to editorial review. Off-topic, inappropriate or insulting comments will be removed.

Read More »

2.4-GW New Jersey offshore wind project canceled by developer

An attorney representing Invenergy told the New Jersey Board of Public Utilities in a Friday filing that the company’s 2.4-GW offshore wind project, Leading Light Wind, is canceled. Invenergy “has determined it cannot move forward with the project under the terms and conditions set out” when the BPU awarded it offshore wind renewable energy certificates in January of last year, the filing said. Invenergy “regrets this decision … and looks to the future for possible solicitations.” “The [BPU] is well aware that the offshore wind industry has experienced economic and regulatory conditions that have made the development of new offshore wind energy projects extremely difficult,” the filing said. Invenergy says it’s North America’s largest privately-held developer, owner and operator of clean energy solutions, with 36 GW of projects under its belt. Leading Light Wind was being developed off the coast of New Jersey and was set to become operational in 2030.  The filing cites financial, supply chain and regulatory obstacles as reasons the project is no longer viable. Invenergy and the project’s co-sponsor, energyRe, sought several delays from the BPU as they failed to meet filing deadlines due to issues like an inability to find a turbine supplier, it said. Invenergy was granted a stay last September, then extended that stay three more times before filing about its intention to abandon the project. In its May filing, Invenergy said that “given ongoing market and policy uncertainty, Leading Light Wind will continue to focus on meeting its lease obligations.” The Friday filing said the company since concluded that it doesn’t see a path forward for doing so. “The Company has invested considerable time and financial resources in the development of [Leading Light Wind] and remains firmly of the view that [Leading Light Wind], and offshore wind energy development, can provide significant

Read More »

AMD outlines ambitious plan for AI-driven data centers

“There are very beefy workloads that you must have that performance for to run the enterprise,” he said. “The Fortune 500 mainstream enterprise customers are now … adopting Epyc faster than anyone. We’ve seen a 3x adoption this year. And what that does is drives back to the on-prem enterprise adoption, so that the hybrid multi-cloud is end-to-end on Epyc.” One of the key focus areas for AMD’s Epyc strategy has been our ecosystem build out. It has almost 180 platforms, from racks to blades to towers to edge devices, and 3,000 solutions in the market on top of those platforms. One of the areas where AMD pushes into the enterprise is what it calls industry or vertical workloads. “These are the workloads that drive the end business. So in semiconductors, that’s telco, it’s the network, and the goal there is to accelerate those workloads and either driving more throughput or drive faster time to market or faster time to results. And we almost double our competition in terms of faster time to results,” said McNamara. And it’s paying off. McNamara noted that over 60% of the Fortune 100 are using AMD, and that’s growing quarterly. “We track that very, very closely,” he said. The other question is are they getting new customer acquisitions, customers with Epyc for the first time? “We’ve doubled that year on year.” AMD didn’t just brag, it laid out a road map for the next two years, and 2026 is going to be a very busy year. That will be the year that new CPUs, both client and server, built on the Zen 6 architecture begin to appear. On the server side, that means the Venice generation of Epyc server processors. Zen 6 processors will be built on 2 nanometer design generated by (you guessed

Read More »

Building the Regional Edge: DartPoints CEO Scott Willis on High-Density AI Workloads in Non-Tier-One Markets

When DartPoints CEO Scott Willis took the stage on “the Distributed Edge” panel at the 2025 Data Center Frontier Trends Summit, his message resonated across a room full of developers, operators, and hyperscale strategists: the future of AI infrastructure will be built far beyond the nation’s tier-one metros. On the latest episode of the Data Center Frontier Show, Willis expands on that thesis, mapping out how DartPoints has positioned itself for a moment when digital infrastructure inevitably becomes more distributed, and why that moment has now arrived. DartPoints’ strategy centers on what Willis calls the “regional edge”—markets in the Midwest, Southeast, and South Central regions that sit outside traditional cloud hubs but are increasingly essential to the evolving AI economy. These are not tower-edge micro-nodes, nor hyperscale mega-campuses. Instead, they are regional data centers designed to serve enterprises with colocation, cloud, hybrid cloud, multi-tenant cloud, DRaaS, and backup workloads, while increasingly accommodating the AI-driven use cases shaping the next phase of digital infrastructure. As inference expands and latency-sensitive applications proliferate, Willis sees the industry’s momentum bending toward the very markets DartPoints has spent years cultivating. Interconnection as Foundation for Regional AI Growth A key part of the company’s differentiation is its interconnection strategy. Every DartPoints facility is built to operate as a deeply interconnected environment, drawing in all available carriers within a market and stitching sites together through a regional fiber fabric. Willis describes fiber as the “nervous system” of the modern data center, and for DartPoints that means creating an interconnection model robust enough to support a mix of enterprise cloud, multi-site disaster recovery, and emerging AI inference workloads. 
The company is already hosting latency-sensitive deployments in select facilities—particularly inference AI and specialized healthcare applications—and Willis expects such deployments to expand significantly as regional AI architectures become more widely

Read More »

Key takeaways from Cisco Partner Summit

Brian Ortbals, senior vice president from World Wide Technology, which is one of Cisco’s biggest and most important partners stated: “Cisco engaged partners early in the process and took our feedback along the way. We believe now is the right time for these changes as it will enable us to capitalize on the changes in the market.” The reality is, the more successful its more-than-half-a-million partners are, the more successful Cisco will be. Platform approach is coming together When Jeetu Patel took the reigns as chief product officer, one of his goals was to make the Cisco portfolio a “force multiple.” Patel has stated repeatedly that, historically, Cisco acted more as a technology holding company with good products in networking, security, collaboration, data center and other areas. In this case, product breadth was not an advantage, as everything must be sold as “best of breed,” which is a tough ask of the salesforce and partner community. Since then, there have been many examples of the coming together of the portfolio to create products that leverage the breadth of the platform. The latest is the Unified Edge appliance, an all-in-one solution that brings together compute, networking, storage and security. Cisco has been aggressive with AI products in the data center, and Cisco Unified Edge compliments that work with a device designed to bring AI to edge locations. This is ideally suited for retail, manufacturing, healthcare, factories and other industries where it’s more cost effecting and performative to run AI where the data lives.

Read More »

AI networking demand fueled Cisco’s upbeat Q1 financials

Customers are very focused on modernizing their network infrastructure in the enterprise in preparation for inferencing and AI workloads, Robbins said. “These things are always multi-year efforts,” and this is only the beginning, Robbins said.

The AI opportunity

“As we look at the AI opportunity, we see customer use cases growing across training, inferencing, and connectivity, with secure networking increasingly critical as workloads move from the data center to end users, devices, and agents at the edge,” Robbins said. “Agents are transforming network traffic from predictable bursts to persistent high-intensity loads, with agentic AI queries generating up to 25 times more network traffic than chatbots. Instead of pulling data to and from the data center, AI workloads require models and infrastructure to be closer to where data is created and decisions are made, particularly in industries such as retail, healthcare, and manufacturing.”

Robbins pointed to last week’s introduction of Cisco Unified Edge, a converged platform that integrates networking, compute and storage to help enterprise customers more efficiently handle data from AI and other workloads at the edge. “Unified Edge enables real-time inferencing for agentic and physical AI workloads, so enterprises can confidently deploy and manage AI at scale,” Robbins said.

On the hyperscaler front, “we see a lot of solid pipeline throughout the rest of the year. The use cases, we see it expanding,” Robbins said. “Obviously, we’ve been selling networking infrastructure under the training models. We’ve been selling scale-out. We launched the P200-based router that will begin to address some of the scale-across opportunities.” Cisco has also seen great success with its pluggable optics, Robbins said. “All of the hyperscalers now are officially customers of our pluggable optics, so we feel like that’s a great opportunity. They not only plug into our products, but they can be used with other companies’
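Robbins’ 25x figure lends itself to a quick back-of-the-envelope check. In the sketch below, only the 25x multiplier comes from the excerpt; the query volume and per-query payload are illustrative assumptions, not Cisco numbers:

```python
# Back-of-the-envelope: how agentic AI multiplies network traffic
# relative to chatbot queries. The 25x multiplier is Cisco's figure;
# the payload size and query volume are illustrative assumptions.

CHATBOT_KB_PER_QUERY = 40     # assumed average payload per chatbot query (KB)
AGENTIC_MULTIPLIER = 25       # Cisco's "up to 25 times" figure

def daily_traffic_gb(queries_per_day: int, kb_per_query: float) -> float:
    """Total daily traffic in GB for a given query volume."""
    return queries_per_day * kb_per_query / 1_000_000

chatbot = daily_traffic_gb(1_000_000, CHATBOT_KB_PER_QUERY)
agentic = daily_traffic_gb(1_000_000, CHATBOT_KB_PER_QUERY * AGENTIC_MULTIPLIER)
print(f"chatbot: {chatbot:.0f} GB/day, agentic: {agentic:.0f} GB/day")
# → chatbot: 40 GB/day, agentic: 1000 GB/day
```

Even under these modest assumptions, a million agentic queries a day moves a network from tens of gigabytes to a terabyte of daily traffic, which is the shift Robbins argues pushes inference toward the edge.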

Read More »

When the Cloud Leaves Earth: Google and NVIDIA Test Space Data Centers for the Orbital AI Era

On November 4, 2025, Google unveiled Project Suncatcher, a moonshot research initiative exploring the feasibility of AI data centers in space. The concept envisions constellations of solar-powered satellites in Low Earth Orbit (LEO), each equipped with Tensor Processing Units (TPUs) and interconnected via free-space optical laser links. Google’s stated objective is to launch prototype satellites by early 2027 to test the idea and evaluate scaling paths if the technology proves viable.

Rather than a commitment to move production AI workloads off-planet, Suncatcher represents a time-bound research program designed to validate whether solar-powered, laser-linked LEO constellations can augment terrestrial AI factories, particularly for power-intensive, latency-tolerant tasks. The 2025–2027 window effectively serves as a go/no-go phase to assess key technical hurdles including thermal management, radiation resilience, launch economics, and optical-link reliability. If these milestones are met, Suncatcher could signal the emergence of a new cloud tier: one that scales AI with solar energy rather than substations.

Inside Google’s Suncatcher Vision

Google has released a detailed technical paper titled “Towards a Future Space-Based, Highly Scalable AI Infrastructure Design.” The accompanying Google Research blog describes Project Suncatcher as “a moonshot exploring a new frontier” – an early-stage effort to test whether AI compute clusters in orbit can become a viable complement to terrestrial data centers. The paper outlines several foundational design concepts:

Orbit and Power

Project Suncatcher targets Low Earth Orbit (LEO), where solar irradiance is significantly higher and can remain continuous in specific orbital paths. Google emphasizes that space-based solar generation will serve as the primary power source for the TPU-equipped satellites.

Compute and Interconnect

Each satellite would host Tensor Processing Unit (TPU) accelerators, forming a constellation connected through free-space optical inter-satellite links (ISLs). Together, these would function as a disaggregated orbital AI cluster, capable of executing large-scale batch and training workloads.

Downlink

Read More »

Cloud-based GPU savings are real – for the nimble

The pattern points to an evolving GPU ecosystem: while top-tier chips like Nvidia’s new GB200 Blackwell processors remain in extremely short supply, older models such as the A100 and H100 are becoming cheaper and more available. Yet customer behavior may not match practical needs. “Many are buying the newest GPUs because of FOMO—the fear of missing out,” he added. “ChatGPT itself was built on older architecture, and no one complained about its performance.” Gil emphasized that managing cloud GPU resources now requires agility, both operationally and geographically. Spot capacity fluctuates hourly or even by the minute, and availability varies across data center regions. Enterprises willing to move workloads dynamically between regions—often with the help of AI-driven automation—can achieve cost reductions of up to 80%. “If you can move your workloads where the GPUs are cheap and available, you pay five times less than a company that can’t move,” he said. “Human operators can’t respond that fast; automation is essential.” Conveniently, Cast sells an AI automation solution. But it is not the only one, and the argument is valid: if spot pricing is cheaper at another location, you want to take it to keep the cloud bill down. Gil concluded by urging engineers and CTOs to embrace flexibility and automation rather than lock themselves into fixed regions or infrastructure providers. “If you want to win this game, you have to let your systems self-adjust and find capacity where it exists. That’s how you make AI infrastructure sustainable.”
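The “move workloads where GPUs are cheap” idea Gil describes can be sketched in a few lines. The region names and prices below are hypothetical; a real system would poll live spot-market APIs and trigger migration automatically:

```python
# Minimal sketch of spot-price-driven region selection. Region names and
# prices are hypothetical placeholders, not real market data.

def cheapest_region(spot_prices: dict) -> tuple:
    """Pick the region with the lowest current spot price per GPU-hour."""
    region = min(spot_prices, key=spot_prices.get)
    return region, spot_prices[region]

# Hypothetical hourly snapshot of spot prices ($/GPU-hour).
snapshot = {"us-east": 2.10, "eu-west": 1.35, "ap-south": 0.95}

region, price = cheapest_region(snapshot)
fixed_price = snapshot["us-east"]          # cost of staying in one fixed region
savings = 1 - price / fixed_price
print(f"run in {region} at ${price}/h -> {savings:.0%} cheaper than staying put")
# → run in ap-south at $0.95/h -> 55% cheaper than staying put
```

The real engineering work is everything around this one-liner: checkpointing the workload, replicating data, and re-scheduling fast enough to catch prices that change by the minute.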

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote, between them, $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech.

The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.)

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for clients and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
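The “LLM as a judge” pattern the excerpt mentions can be sketched simply: several cheap models each grade an agent’s answer, and a majority vote decides. The judge functions below are trivial stand-in heuristics for illustration; a real system would call actual model APIs:

```python
# Hedged sketch of multi-model LLM-as-judge voting. The three "judges"
# are placeholder heuristics standing in for real model calls.
from collections import Counter

def majority_verdict(answer: str, judges: list) -> str:
    """Ask each judge for a pass/fail verdict and return the majority."""
    votes = [judge(answer) for judge in judges]
    return Counter(votes).most_common(1)[0][0]

# Hypothetical judges with toy heuristics (real ones would be model calls).
judge_a = lambda a: "fail" if "http" in a else "pass"   # flags hallucinated URLs
judge_b = lambda a: "pass" if len(a) > 10 else "fail"   # flags empty answers
judge_c = lambda a: "pass"                              # lenient baseline

verdict = majority_verdict("The invoice total is $120.", [judge_a, judge_b, judge_c])
print(verdict)
# → pass
```

Using an odd number of judges avoids ties, and as model prices fall, running three or more judges per answer becomes cheap enough to use routinely in evals.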

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »