
Decoding OpenAI’s Super Bowl ad and Sam Altman’s grandiose blog post

If you were in one of the nearly 40 million U.S. households that tuned in to NFL Super Bowl LIX this year, in addition to watching the Philadelphia Eagles trounce the Kansas City Chiefs, you may have caught an advertisement for OpenAI.

This is the company’s first Super Bowl ad, and it reportedly cost $14 million, in keeping with the astronomical sums commanded by ads during the big game, which some viewers tune in for instead of the football. As you’ll see in the copy embedded below, the OpenAI ad depicts various advancements throughout human history leading up to ChatGPT and what OpenAI calls the “Intelligence Age.”

While reaction to the ad was mixed — I’ve seen more praise and defense for it than criticism in my feeds — it clearly indicates that OpenAI has arrived as a major force in American culture, and quite obviously seeks to connect to a long lineage of invention, discovery and technological progress that’s taken place here.

On its own, the OpenAI Super Bowl ad strikes me as a totally inoffensive and simple message designed to appeal to the widest possible audience, perfect for the Super Bowl and its large audience across demographics. In a way, it’s so smooth and uncontroversial that it is forgettable.

But coupled with a blog post OpenAI CEO Sam Altman published on his personal website earlier on Sunday, entitled “Three Observations,” OpenAI’s assessment of the current moment and the future suddenly becomes much more dramatic and stark.

Altman begins the blog post with a pronouncement about artificial general intelligence (AGI), the raison d’être of OpenAI’s founding and its ongoing efforts to release more and more powerful AI models such as the latest o3 series. This pronouncement, like OpenAI’s Super Bowl ad, also seeks to connect OpenAI’s work building these models and approaching this goal of AGI with the history of human innovation more broadly.

Systems that start to point to AGI* are coming into view, and so we think it’s important to understand the moment we are in. AGI is a weakly defined term, but generally speaking we mean it to be a system that can tackle increasingly complex problems, at human level, in many fields.

People are tool-builders with an inherent drive to understand and create, which leads to the world getting better for all of us. Each new generation builds upon the discoveries of the generations before to create even more capable tools—electricity, the transistor, the computer, the internet, and soon AGI.

A few paragraphs later, he even seems to concede that AI — as many developers and users of the tech agree — is simply another new tool. Yet he immediately flips to suggest this may be a much different tool than anyone in the world has ever experienced to date. As he writes:

In some sense, AGI is just another tool in this ever-taller scaffolding of human progress we are building together. In another sense, it is the beginning of something for which it’s hard not to say “this time it’s different”; the economic growth in front of us looks astonishing, and we can now imagine a world where we cure all diseases, have much more time to enjoy with our families, and can fully realize our creative potential.

The idea of “curing all diseases,” while certainly appealing, mirrors something rival tech boss Mark Zuckerberg of Meta also set out to do with the Chan Zuckerberg Initiative, the medical research nonprofit he co-founded with his wife, Priscilla Chan. As of two years ago, the timeline the initiative proposed for reaching this goal was 2100. Yet now, thanks to the progress of AI, Altman seems to believe it’s attainable even sooner, writing: “In a decade, perhaps everyone on earth will be capable of accomplishing more than the most impactful person can today.”

Altman and Zuckerberg are hardly the only high-profile tech billionaires interested in medicine, and longevity science in particular. Google’s co-founders, especially Sergey Brin, have put money toward analogous efforts, and at one point there were (and perhaps still are) so many leaders in the tech industry interested in prolonging human life and ending disease that back in 2017, The New Yorker magazine ran a feature article entitled “Silicon Valley’s Quest to Live Forever.”

This utopian notion of ending disease and ultimately death seems patently hubristic to me on the face of it — how many folklore stories and fairy tales are there about the perils of trying to cheat death? — but it aligns neatly with the larger techno-utopian beliefs of some in the industry, which have been helpfully grouped by AGI critics and researchers Timnit Gebru and Émile P. Torres under the umbrella term TESCREAL, an acronym for “transhumanism, Extropianism, singularitarianism, (modern) cosmism, Rationalism, Effective Altruism, and longtermism,” in their 2023 paper.

As these authors elucidate, the veneer of progress sometimes masks uglier beliefs, such as the notion that those with higher IQs or from specific demographics are inherently superior or more fully human, ultimately evoking the racial science and phrenology of more openly discriminatory and oppressive ages past.

There’s nothing to suggest in Altman’s note that he shares such beliefs, mind you…in fact, rather the opposite. He writes:

“Ensuring that the benefits of AGI are broadly distributed is critical. The historical impact of technological progress suggests that most of the metrics we care about (health outcomes, economic prosperity, etc.) get better on average and over the long-term, but increasing equality does not seem technologically determined and getting this right may require new ideas.”

In other words: he wants to ensure everyone’s life gets better with AGI, but is uncertain how to achieve that. It’s a laudable notion, and one that maybe AGI itself could help answer. But for one thing, OpenAI’s latest and greatest models remain closed and proprietary, unlike competitors such as Meta’s Llama family and DeepSeek’s R1, though the latter has apparently prompted Altman to reassess OpenAI’s approach to the open source community, as he mentioned in a recent Reddit AMA. Perhaps OpenAI could start by open sourcing more of its technology to ensure it spreads wider to more users, more equally?

Meanwhile, speaking of specific timelines, Altman seems to project that while the next few years may not be wholly remade by AI or AGI, he’s more confident of a visible impact by 2035. As he puts it:

The world will not change all at once; it never does. Life will go on mostly the same in the short run, and people in 2025 will mostly spend their time in the same way they did in 2024. We will still fall in love, create families, get in fights online, hike in nature, etc.

But the future will be coming at us in a way that is impossible to ignore, and the long-term changes to our society and economy will be huge. We will find new things to do, new ways to be useful to each other, and new ways to compete, but they may not look very much like the jobs of today.

Anyone in 2035 should be able to marshall [sic] the intellectual capacity equivalent to everyone in 2025; everyone should have access to unlimited genius to direct however they can imagine. There is a great deal of talent right now without the resources to fully express itself, and if we change that, the resulting creative output of the world will lead to tremendous benefits for us all.

Where does this leave us? Critics of OpenAI would say it’s more empty hype, designed to keep placating OpenAI’s deep-pocketed investors such as SoftBank and to put off, for a while longer, any pressure to deliver working AGI.

But having used these tools myself, watched and reported on other users, and seen what they’ve been able to accomplish (such as writing complex software within mere minutes without much background in the field), I’m inclined to believe Altman is serious in his prognostications, and hopeful in his commitment to equal distribution.

But keeping all the best models closed up under a subscription bundle is clearly not the way to attain equal access to AGI, so my biggest question remains what the company does under his leadership to ensure it moves in this direction he so clearly articulated, and that the Super Bowl ad also celebrated.
