
From silicon to sentience: The legacy guiding AI’s next frontier and human cognitive migration



Humans have always migrated, not only across physical landscapes, but through ways of working and thinking. Every major technological revolution has demanded some kind of migration: From field to factory, from muscle to machine, from analog habits to digital reflexes. These shifts did not simply change what we did for work; they reshaped how we defined ourselves and what we believed made us valuable.

One vivid example of technological displacement comes from the early 20th century. In 1890, more than 13,000 companies in the U.S. built horse-drawn carriages. By 1920, fewer than 100 remained. In the span of a single generation, an entire industry collapsed. As Microsoft’s blog post “The Day the Horse Lost Its Job” recounts, this was not just about transportation; it was about the displacement of millions of workers, the demise of trades, the reorientation of city life and the mass enablement of continental mobility. Technological progress, when it comes, does not ask for permission.

Today, as AI grows more capable, we are entering a time of cognitive migration, when humans must move again. This time, however, the displacement is less physical and more mental: Away from tasks machines are rapidly mastering, and toward domains where human creativity, ethical judgment and emotional insight remain essential.

From the Industrial Revolution to the digital office, history is full of migrations triggered by machinery. Each required new skills, new institutions and new narratives about what it means to contribute. Each created new winners and left others behind.

The framing shift: IBM’s “Cognitive Era”

In October 2015 at a Gartner industry conference, IBM CEO Ginni Rometty publicly declared the beginning of what the company called the Cognitive Era. It was more than a clever marketing campaign; it was a redefinition of strategic direction and, arguably, a signal flare to the rest of the tech industry that a new phase of computing had arrived.

Where previous decades had been shaped by programmable systems based on rules written by human software engineers, the Cognitive Era would be defined by systems that could learn, adapt and improve over time. These systems, powered by machine learning (ML) and natural language processing (NLP), would not be explicitly told what to do. They would infer, synthesize and interact.
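To make that contrast concrete, here is a minimal sketch in Python with scikit-learn: a hand-written rule that an engineer must specify in advance, next to a tiny classifier that infers the same distinction from labeled examples. The triage scenario, function names and toy data are hypothetical illustrations of the programming-model shift, not Watson’s actual pipeline.

```python
# Hypothetical illustration (not IBM Watson): rule-based vs. learned behavior.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Programmable era: a human engineer writes the rule explicitly.
def rule_based_triage(note: str) -> str:
    return "urgent" if "chest pain" in note.lower() else "routine"

# Cognitive era: the system infers the pattern from labeled examples.
notes = [
    "patient reports severe chest pain and shortness of breath",
    "routine follow-up, blood pressure stable",
    "sudden chest pain radiating to the left arm",
    "annual physical, no complaints",
]
labels = ["urgent", "routine", "urgent", "routine"]

vectorizer = CountVectorizer()
features = vectorizer.fit_transform(notes)
model = MultinomialNB().fit(features, labels)

new_note = "crushing pain in the chest"
print(rule_based_triage(new_note))                         # "routine": the hand-written rule misses the rephrasing
print(model.predict(vectorizer.transform([new_note]))[0])  # "urgent": the model generalizes from the examples
```

The point is not the toy model itself but the division of labor: in the first function the human supplies the logic, while in the second the human supplies examples and the system derives the logic.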

At the center of this vision was IBM’s Watson, which had already made headlines in 2011 for defeating human champions on Jeopardy! But the real promise of Watson was not about winning quiz shows. Instead, it was in helping doctors sort through thousands of clinical trials to suggest treatments, or in assisting lawyers analyzing vast corpora of case law. IBM pitched Watson not as a replacement for experts, but as an amplifier of human intelligence, the first cognitive co-pilot.

This framing change was significant. Unlike earlier tech eras that emphasized automation and efficiency, the Cognitive Era emphasized partnership. IBM spoke of “augmented intelligence” rather than “artificial intelligence,” positioning these new systems as collaborators, not competitors.

But implicit in this vision was something deeper: A recognition that cognitive labor, long the hallmark of the white-collar professional class, was no longer safe from automation. Just as the steam engine displaced physical labor, cognitive computing would begin to encroach on domains once thought exclusively human: language, diagnosis and judgment.

IBM’s declaration was both optimistic and sobering. It imagined a future where humans could do ever more with the help of machines. It also hinted at a future where value would need to migrate once again, this time into domains where machines still struggled — such as meaning-making, emotional resonance and ethical reasoning.

The declaration of a Cognitive Era was seen as significant at the time, yet few then realized its long-term implications. It was, in essence, the formal announcement of the next great migration, one not of bodies but of minds. It signaled a shift in terrain, and a new journey that would test not just our skills, but our identity.

The first great migration: From field to factory

To understand the great cognitive migration now underway and how it is qualitatively unique in human history, we must first briefly consider the migrations that came before it. From the rise of factories in the Industrial Revolution to the digitization of the modern workplace, every major innovation has demanded a shift in skills, institutions and our assumptions about what it means to contribute.

The Industrial Revolution, beginning in the late 18th century, marked the first great migration of human labor on a mass scale into entirely new ways of working. Steam power, mechanization and the rise of factory systems pulled millions of people from rural agrarian life into crowded, industrializing cities. What had once been local, seasonal and physical labor became regimented, specialized and disciplined, with productivity as the driving force.

This transition did not just change where people worked; it changed who they were. Village blacksmiths and cobblers moved into new roles and became cogs in a vast industrial machine. Time clocks, shift work and the logic of efficiency began to redefine human contribution. Entire generations had to learn new skills, embrace new routines and accept new hierarchies. It was not just labor that migrated; it was identity.

Just as importantly, institutions had to migrate too. Public education systems expanded to produce a literate industrial workforce. Governments adapted labor laws to new economic conditions. Unions emerged. Cities grew rapidly, often without infrastructure to match. It was messy, uneven and traumatic. It also marked the beginning of a modern world shaped by — and increasingly for — machines.

This migration established a pattern that would repeat: Technology displaces, and people and society must adapt. That adaptation came gradually, and sometimes violently, until a new equilibrium emerged. But every wave has asked more of us. The Industrial Revolution required our bodies. The next would require our minds.

If the Industrial Revolution demanded our bodies, the Digital Revolution demanded new minds. Beginning in the mid-20th century and accelerating through the 1980s and ’90s, computing technologies transformed human work once again. This time, repetitive mechanical tasks were increasingly replaced with information processing and symbolic manipulation.

In what is sometimes called the Information Age, clerks became data analysts and designers became digital architects. Administrators, engineers and even artists began working with pixels and code instead of paper and pen. Work moved from the factory floor to the office tower, and eventually to the screen in our pocket. Knowledge work became not just dominant, but aspirational. The computer and the spreadsheet became the picks and shovels of a new economic order.

I saw this first-hand early in my career, working as a software engineer at Hewlett Packard. Several newly minted MBA graduates arrived with HP-branded Vectra PCs and Lotus 1-2-3 spreadsheet software. That seemed to be the moment when data analysts began proffering cost-benefit analyses that transformed enterprise operational efficiency.

This migration was less visibly traumatic than the one from farm to factory, but no less significant. It redefined productivity in cognitive terms: memory, organization, abstraction. It also brought new forms of inequality between those who could master digital systems and those who were left behind. And, once again, institutions scrambled to keep pace. Schools retooled for “21st-century skills.” Companies reorganized information flows using techniques like “business process reengineering.” Identity shifted again too, this time from laborer to knowledge worker.

Now, midway through the third decade of the 21st century, even knowledge work is becoming automated, and white-collar workers can feel the climate shifting. The next migration has already begun.

The most profound migration yet

We have migrated our labor across fields, factories and fiber optics. Each time, we have adapted. The process has often been uneven and sometimes painful, but we have transitioned to a new normalcy, a new equilibrium. However, the cognitive migration now underway is unlike those before it. It does not just change how we work; it challenges what we have long believed makes us irreplaceable: Our rational mind.

As AI grows more capable, we must shift once more. Not toward harder skills, but toward deeper ones that remain human strengths, including creativity, ethics, empathy, meaning and even spirituality. This is the most profound migration yet because this time, it is not just about surviving the shift. It is about discovering who we are beyond what we produce and understanding the true nature of our value.

Accelerating change, compressed adaptation

The timeline for each technological migration has also accelerated dramatically. The Industrial Revolution unfolded over a century, allowing generational adaptation. The Digital Revolution compressed that timeline into a few decades. Some workers began their careers with paper files and retired managing cloud databases. Now, the next migration is occurring in mere years. For example, large language models (LLMs) went from academic projects to workplace tools in less than five years.

William Bridges noted in the 2003 revision of “Managing Transitions”: “It is the acceleration of the pace of change in the past several decades that we are having trouble assimilating and that throws us into transition.” The pace of change is far faster now than it was in 2003, which makes his observation even more urgent.

This acceleration is reflected not only in AI software but also in the underlying hardware. In the Digital Revolution, the predominant computing element was the CPU, which executed instructions serially based on rules coded explicitly by a software engineer. Now, the dominant computing element is the GPU, which executes operations in parallel and powers systems that learn from data rather than follow hand-written rules. That parallelism provides an inherent acceleration of computing. It is no coincidence that Nvidia, the leading developer of GPUs, refers to this as “accelerated computing.”
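The shift in programming model can be sketched in a few lines of Python. The NumPy array operation below still runs on the CPU, but it stands in for the data-parallel style that GPU frameworks such as CUDA or PyTorch push to much larger scale; the array size and arithmetic are arbitrary illustrations, not a benchmark.

```python
# Minimal sketch of the serial-vs-parallel programming models, not a benchmark.
import numpy as np

values = np.random.rand(1_000_000)

# Serial, instruction-by-instruction style: one element per loop iteration.
total_serial = 0.0
for v in values:
    total_serial += v * 2.0

# Data-parallel style: one expression over the whole array, which a parallel
# runtime (a GPU library in practice) can fan out across many execution units.
total_parallel = float((values * 2.0).sum())

assert np.isclose(total_serial, total_parallel)
```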

The existential migration

Transitions that once evolved across generations are now occurring within a single career, or even a single decade. This particular shift demands not just new skills, but a fundamental reassessment of what makes us human. Unlike previous technological shifts, we cannot simply learn new tools or adopt new routines. We must migrate to terrain where our uniquely human qualities of creativity, ethical judgment and meaning-making become our defining strengths. The challenge before us is not merely technological adaptation but existential redefinition.

As AI systems master what we once thought were uniquely human tasks, we find ourselves on an accelerated journey to discover what truly lies beyond automation: The essence of being human in an age where intelligence alone is no longer our exclusive domain.
