
Google Cloud partners with mLogica to offer mainframe modernization


Beyond the partnership with mLogica, Google Cloud also offers a variety of other mainframe migration tools, including Radis and G4, which can be used to modernize specific applications.

Enterprises can also use a combination of migration tools to modernize their mainframe applications.

Some of these tools include the Gemini-powered Mainframe Assessment Tool (MAT), Mainframe Code Rewrite, Mainframe Refactor for Batch, Mainframe Application Replatforming, and the Mainframe Connector.

IBM, too, offers mainframe modernization and code transformation services via its generative AI platform, watsonx.

IBM spin-off Kyndryl also offers a similar service in partnership with Microsoft. As part of the partnership, the companies combine Kyndryl’s zCloud mainframe service with Microsoft’s Power Platform. Kyndryl has also partnered with AWS to offer mainframe modernization as a service.


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin, and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Kyndryl, AWS unwrap AI-driven mainframe migration service

Kyndryl is tapping into agentic AI technology from AWS to offer enterprise customers another way to integrate with or move mainframe applications to the AWS Cloud platform. Specifically, Kyndryl will offer an implementation service based on the recently announced AWS Transform package, which analyzes mainframe code bases, decomposes them into

Read More »

IBM claims to have ‘only realistic path’ to quantum computing

Error correction is the biggest obstacle to practical quantum computing. Quantum computing companies typically address error correction with redundant qubits, but with previous approaches the number of these redundant qubits would grow much faster than the number of total usable qubits, keeping the computers from reaching any useful size. Earlier
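
To make the scaling problem concrete, here is a rough back-of-the-envelope sketch using textbook surface-code approximations (about 2d² physical qubits per logical qubit at code distance d, with logical error falling roughly as (p/p_th)^((d+1)/2)); the constants are illustrative assumptions, not figures from IBM's announcement.

```python
# Illustrative sketch of error-correction overhead, assuming surface-code-style
# scaling: roughly 2*d^2 physical qubits per logical qubit at code distance d,
# and a logical error rate of about (p / p_th) ** ((d + 1) / 2). These are
# textbook approximations, not figures from IBM's announcement.

def physical_qubits_needed(logical_qubits, p=1e-3, p_th=1e-2, target=1e-12):
    """Return (total physical qubits, code distance) for a target logical error rate."""
    d = 3
    # Grow the code distance until the per-logical-qubit error rate meets the target.
    while (p / p_th) ** ((d + 1) / 2) > target:
        d += 2  # surface-code distances are odd
    return logical_qubits * 2 * d * d, d

for n in (10, 100, 1_000):
    total, d = physical_qubits_needed(n)
    print(f"{n} logical qubits -> distance {d}, ~{total:,} physical qubits")
```

Even under these generous assumptions, each logical qubit costs on the order of a thousand physical qubits, which is why approaches with faster-growing overheads stall out.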

Read More »

El Capitan retains No. 1 supercomputer ranking

The El Capitan system at Lawrence Livermore National Laboratory in California maintained its title as the world’s fastest supercomputer, according to the latest TOP500 list released today. U.S.-based systems also won the second and third spots with long-time list leader Frontier at No. 2 and the Aurora system at No.

Read More »

Atlantic Shores Offshore Wind asks to terminate project’s renewable energy credits

Atlantic Shores Offshore Wind, the developer behind the 1.5-GW Atlantic Shores 1 wind energy project in progress offshore New Jersey, filed a petition Wednesday asking the state’s Board of Public Utilities to terminate the project’s offshore renewable energy credits and release it from all associated obligations. The petition cites President Donald Trump’s executive order pausing the development of offshore wind, along with other actions from his administration and the overall macroeconomic environment as reasons that “the project is no longer viable upon the terms and conditions set forth in the [Offshore Wind Renewable Energy Certificate] Order,” which was issued in 2021. “This is despite Petitioner’s diligent and good faith efforts to advance the Project toward completion as set forth in Section II of this Petition, including submission of a rebid of the Project in the Fourth Solicitation, which was concluded by the BPU without an award,” the petition said. However, Atlantic Shores Offshore Wind’s CEO Joris Veldhoven said in a Monday release that the project is seeking a “reset period” and that “this filing marks the closing of a chapter, but not the end for Atlantic Shores.” The project in March had its Clean Air Act permit remanded by the Environmental Protection Agency’s Environmental Appeals Board, an action which Atlantic Shores Offshore Wind argued lacked “good cause.” The board maintained that it has “broad discretion to grant a voluntary remand.” The uncertainty caused by Trump’s executive order, the project’s loss of its air permit, “and other actions taken by the current administration more generally” mean that the “Petitioner’s parent company has been forced to materially reduce its personnel, terminate contracts, and cancel planned project investments.” The project was launched as a joint venture between EDF Renewables and Shell. In January, Shell booked a $1 billion impairment associated with the project, which

Read More »

US utility-scale energy storage to double, reach 65 GW by 2027: EIA

Utility-scale battery storage in the United States is poised to more than double over the next two years and will close out 2026 at nearly 65 GW — a rapid rise from 17 GW in the first quarter of 2024. The U.S. Energy Information Administration published its Short-Term Energy Outlook on Tuesday, showing a rapid increase in anticipated energy storage resources. Counting projects larger than 1 MW in the electric power sector, EIA said domestic storage capacity will rise from about 28 GW at the end of Q1’25 to 64.9 GW at the end of 2026. Large-scale battery storage resources in the commercial and industrial sectors are expected to rise from about 100 MW to about 300 MW across the same time frame. EIA’s data shows annual U.S. electricity generation rising from 4,180 billion kWh in 2023 to 4,490 billion kWh in 2026. Monthly electricity demand is highest during the summer months in the United States, but despite a cooling degree day forecast in line with last year’s weather, EIA said it expects total U.S. electricity generation this summer will increase by 1%, or 14 billion kWh, compared with summer 2024. “We expect strong demand growth in the commercial sector because of the expansion of data centers,” EIA said. EIA anticipates natural gas-fired power plants between June and September 2025 will generate 3% less than they did last summer “because of higher natural gas prices and the continuing increase in new solar generating capacity.” Domestic solar generation this summer will grow by 33%, EIA said, and improving water supply in the western states will contribute to a 6% increase in U.S. hydroelectric generation.
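
As a quick sanity check, the arithmetic below reproduces the headline ratios using only the figures quoted in the excerpt above.

```python
# Quick arithmetic check of the EIA figures quoted above.
storage_q1_2025_gw = 28.0      # utility-scale storage, end of Q1 2025
storage_end_2026_gw = 64.9     # projected, end of 2026
print(f"Storage growth factor: {storage_end_2026_gw / storage_q1_2025_gw:.2f}x")  # ~2.3x, i.e. "more than double"

gen_2023_bkwh = 4_180          # annual US generation, 2023 (billion kWh)
gen_2026_bkwh = 4_490          # projected annual US generation, 2026
print(f"Generation growth 2023-2026: {(gen_2026_bkwh / gen_2023_bkwh - 1) * 100:.1f}%")

# A 1% summer increase equal to 14 billion kWh implies a summer 2024 baseline
# of roughly 1,400 billion kWh of generation.
print(f"Implied summer 2024 generation: ~{14 / 0.01:,.0f} billion kWh")
```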

Read More »

MISO submits revised fast-track interconnection process for FERC approval

The Midcontinent Independent System Operator on Friday asked the Federal Energy Regulatory Commission to approve a fast-track generation interconnection process after the agency rejected its initial plan last month. MISO runs the grid and wholesale power markets in 15 states and part of Canada, from Louisiana to Manitoba. In response to FERC’s May 16 decision, MISO set more limits around its proposed Expedited Resource Addition Study process, which would allow projects that meet certain criteria to avoid MISO’s generation interconnection queue and potentially receive an interconnection agreement within 90 days. Under the revised proposal, the ERAS process would be capped at 68 projects, with MISO studying only 10 projects each quarter. The proposed cap includes eight projects for the restructured states of Illinois and Michigan as well as 10 projects for independent power producers with power supply agreements with non-utility entities. The revised process requires “relevant electric retail regulatory authorities,” such as state utility commissions, to verify in writing that a proposed project will address either a load addition not already planned for or an identified resource adequacy deficiency within its jurisdictional footprint, MISO said in its filing. The new proposal also requires projects to have a commercial operation date within three years of an application being filed. MISO plans to end the ERAS process by Aug. 31, 2027, or after its 68-project cap is met, whichever happens first. The proposal addresses potential capacity shortfalls driven by load growth and power plant retirements, according to MISO, which pointed to various reports to support its claims. “These surveys and forecasts demonstrate that MISO must address resource adequacy and reliability needs in the next five years and that an accelerated interconnection study process must be established to address time-critical challenges,” the grid operator told FERC. The ERAS process will provide a bridge to 2028,

Read More »

Utility demand response messages do not need additional customer consent: FCC

Electric utilities can make calls and send demand response text messages to customers who have provided their phone number, the Federal Communications Commission said in a Monday declaratory ruling. “When a consumer gives a utility their phone number they give their prior express consent to receive non-telemarketing demand response calls and texts,” the commission wrote. “We confirm that demand response communications are closely related to utility service and thus within the scope of consent customers give when they provide their phone number to their utility,” FCC said. “We thus make clear that a utility can make critical, time-sensitive demand response communications to its customers rather than having to solicit additional consent from each customer, a burdensome and often costly undertaking.” The Edison Electric Institute, which represents investor-owned utilities, requested in March that the FCC clarify that utilities have prior express consent under the Telephone Consumer Protection Act, and utilities can make demand response communications without risking penalties or fines. EEI “appreciates the FCC allowing electric companies to deliver timely and clear communications that help customers conserve electricity — and save money — at times of extremely high demand,” Senior Vice President of Customer Solutions Phil Dion said in a statement. “As summer begins and Americans use more electricity to cool their homes and businesses, demand response programs allow grid operators to maintain reliability while helping to keep customer bills as low as possible.” Utilities see demand response as a key resource to meet growing electricity demand. EEI told the FCC in its March request that the clarification would help “ensure that utilities can continue to provide safe, reliable, and efficient service to customers.” U.S. electricity demand could grow 25% from 2023 to 2030, ICF said in a May report. And utility demand-side management programs, which include demand response, could be able to meet

Read More »

A collaborative approach for meeting data center power needs and protecting ratepayers

Chris Crosby is CEO of Compass Datacenters, a Dallas-based data center developer. We at Compass Datacenters have closely followed the discussion around the increasing power demands of data centers and read with interest the recently released Harvard Law study asserting that demand is being serviced on the backs of consumers. The study draws a compelling picture, and we acknowledge the challenges presented by existing regulatory frameworks. However, Compass Datacenters believes that there’s one key shortcoming of the paper, in that it lumps all data center developers into one category when, in reality, not all developers are created equal. There’s a gold rush or dot-com mentality in our space right now. Speculative developers are coming out of the woodwork to flip a piece of powered land for profit. These aren’t real projects, complete with solid financing and tenants waiting in the wings. I’ve been in the business of data center development for decades, and I’ve never seen anything quite like it. Because of the will-serve model that utilities currently work under, these spec developers can come to a utility and say they are building a hyperscale data center that’s going to need as much energy as a small power plant, and the utility has to put the request in the queue, which creates potentially false demand. Speculative requests for service put enormous pressure on the utilities’ resource planning and capital projections. Further, speculation increases risk to the utility investing in infrastructure to serve loads for projects that may never materialize. Put in place during a simpler time, the will-serve model is not suited for an era where the needs of certain customers are growing exponentially compared to others. Responding to requests from speculative investors is tying up power without an actual project and true demand behind it. Indeed, building out infrastructure

Read More »

Germany’s SEFE Strikes 10-Year Deal for Gas From Azerbaijan

Germany’s state-owned SEFE signed a 10-year deal to buy gas from Azerbaijan’s Socar, a move that will help diversify supplies in Europe’s tight market. The agreement, which begins this year, allows the German trading company to buy as much as 15 terawatt-hours of gas annually, according to a release that confirmed an earlier Bloomberg report. That’s equivalent to about 1.5 billion cubic meters per year, or about 0.5% of the European Union’s gas consumption in 2024. The gas is likely to reach Europe via the Trans Adriatic Pipeline, according to two people familiar with the matter, who asked not to be named because the information is private. TAP crosses northern Greece, Albania and the Adriatic Sea before coming ashore in southern Italy. The deal with state-owned Socar will support investments in production and infrastructure, boosting the volumes of pipeline gas coming to Europe, according to the statement. SEFE, which stands for Securing Energy for Europe, has been among the most active companies in negotiating new supply deals. The region increasingly depends on global flows after losing most of its Russian pipeline gas in 2022. With Norwegian production operating pretty much at full capacity, Europe still needs to secure supplies as it awaits the startup of additional liquefied natural gas export facilities in the US.  Socar was a key player in negotiations between European countries and Ukraine about potentially continuing to transit gas after a deal ended at the end of last year. An agreement wasn’t reached at the time and Azerbaijan was keen to secure commitments from Europe on building new infrastructure and longer-term contracts to buy gas.  SEFE, formerly the European trading and supply unit of Gazprom PJSC, was nationalized in 2022 as part of a €6.3 billion ($7.2 billion) rescue package. The company has lined up numerous supply deals with global producers, mainly for

Read More »

Cisco reinvigorates data center, campus, branch networking with AI demands in mind

“We have a number of … enterprise data center customers that have been using bi-directional optics for many generations, and this is the next generation of that feature,” said Bill Gartner, senior vice president and general manager of Cisco’s optical systems and optics business. “The 400G lets customers use their existing fiber infrastructure and reduces fiber count for them so they can use one fiber instead of two, for example,” Gartner said. “What’s really changed in the last year or so is that with AI buildouts, there’s much, much more optics that are part of 400G and 800G, too. For AI infrastructure, the 400G and 800G optics are really the dominant optics going forward,” Gartner said.

New AI Pods

Taking aim at next-generation interconnected compute infrastructures, Cisco expanded its AI Pod offering with the Nvidia RTX 6000 Pro and Cisco UCS C845A M8 server package. Cisco AI Pods are preconfigured, validated, and optimized infrastructure packages that customers can plug into their data center or edge environments as needed. The Pods include Nvidia AI Enterprise, which features pretrained models and development tools for production-ready AI, and are managed through Cisco Intersight. The Pods are based on Cisco Validated Design principles, which offer customers pre-tested and validated network designs that provide a blueprint for building reliable, scalable, and secure network infrastructures, according to Cisco. Building out the kind of full-scale AI infrastructure compute systems that hyperscalers and enterprises will utilize is a huge opportunity for Cisco, said Daniel Newman, CEO of The Futurum Group. “These are full-scale, full-stack systems that could land in a variety of enterprise and enterprise service application scenarios, which will be a big story for Cisco,” Newman said.

Campus networking

For the campus, Cisco has added two new programmable Silicon One-based Smart Switches: the C9350 Fixed Access Smart Switches and C9610

Read More »

Qualcomm’s $2.4B Alphawave deal signals bold data center ambitions

Qualcomm says its Oryon CPU and Hexagon NPU processors are “well positioned” to meet growing demand for high-performance, low-power compute as AI inferencing accelerates and more enterprises move to custom CPUs housed in data centers. “Qualcomm’s advanced custom processors are a natural fit for data center workloads,” Qualcomm president and CEO Cristiano Amon said in the press release. Alphawave’s connectivity and compute technologies can work well with the company’s CPU and NPU cores, he noted. The deal is expected to close in the first quarter of 2026.

Complementing the ‘great CPU architecture’ Qualcomm has been amassing

Client CPUs have been a “big play” for Qualcomm, Moor’s Kimball noted; the company acquired chip design company Nuvia in 2021 for $1.4 billion and has also announced that it will be designing data center CPUs with Saudi AI company Humain. “But there was a lot of data center IP that was equally valuable,” he said. This acquisition of Alphawave will help Qualcomm complement the “great CPU architecture” it acquired from Nuvia with the latest in connectivity tools that link a compute complex with other devices, as well as with chip-to-chip communications, and all of the “very low level architectural goodness” that allows compute cores to deliver “absolute best performance.” “When trying to move data from, say, high bandwidth memory to the CPU, Alphawave provides the IP that helps chip companies like Qualcomm,” Kimball explained. “So you can see why this is such a good complement.”

Read More »

LiquidStack launches cooling system for high density, high-powered data centers

The CDU is serviceable from the front of the unit, with no rear or end access required, allowing the system to be placed against the wall. The skid-mounted system can come with rail and overhead piping pre-installed or shipped as separate cabinets for on-site assembly. The single-phase system has high-efficiency dual pumps designed to protect critical components from leaks, and a centralized design with separate pump and control modules reduces both the number of components and complexity. “AI will keep pushing thermal output to new extremes, and data centers need cooling systems that can be easily deployed, managed, and scaled to match heat rejection demands as they rise,” said Joe Capes, CEO of LiquidStack, in a statement. “With up to 10MW of cooling capacity at N, N+1, or N+2, the GigaModular is a platform like no other—we designed it to be the only CDU our customers will ever need. It future-proofs design selections for direct-to-chip liquid cooling without traditional limits or boundaries.”
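
To illustrate what N, N+1, and N+2 mean in practice, here is a small sizing sketch; the per-module capacity is a hypothetical assumption, since LiquidStack does not break out module sizes in the excerpt above.

```python
# Sketch of N / N+1 / N+2 sizing for a CDU platform rated up to 10 MW of cooling.
# The 2.5 MW per-module figure is a hypothetical assumption for illustration only.
import math

def modules_required(load_mw: float, module_mw: float = 2.5, spares: int = 0) -> int:
    base = math.ceil(load_mw / module_mw)   # N: just enough modules to carry the load
    return base + spares                    # N+1 / N+2: add one or two spare modules

for label, spares in (("N", 0), ("N+1", 1), ("N+2", 2)):
    print(f"{label}: {modules_required(10.0, spares=spares)} modules for a 10 MW load")
```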

Read More »

Enterprises face data center power design challenges

“Now, with AI, GPUs need data to do a lot of compute and send that back to another GPU. That connection needs to be close together, and that is what’s pushing the density, the chips are more powerful and so on, but the necessity of everything being close together is what’s driving this big revolution,” he said. That architectural revolution is driving new data center designs. Cordovil said that instead of putting the power shelves within the rack, system administrators are putting a sidecar next to those racks and loading the sidecar with the power system, which serves two to four racks. This allows for more compute per rack and lower latency since the data doesn’t have to travel as far. The problem is that 1 MW racks are uncharted territory and no one knows how to manage the power, which is considerable now. “There’s no user manual that says, hey, just follow this and everything’s going to be all right. You really need to push the boundaries of understanding how to work. You need to start designing something somehow, so that is a challenge to data center designers,” he said. And this brings up another issue: many corporate data centers have power plugs that are like the ones that you have at home, more or less, so technicians didn’t need an advanced electrician certification. “We’re not playing with that power anymore. You need to be very aware of how to connect something. Some of the technicians are going to need to be certified electricians, which is a skills gap in the market that we see in most markets out there,” said Cordovil. A CompTIA A+ certification will teach you the basics of power, but not the advanced skills needed for these increasingly dense racks. Cordovil
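
The wiring point is easy to quantify with a rough, illustrative calculation; the 415 V three-phase distribution and unity power factor below are assumptions for the sketch, not figures from the article.

```python
# Back-of-the-envelope current draw for a 1 MW rack, assuming 415 V three-phase
# distribution at unity power factor. Both values are illustrative assumptions.
import math

power_w = 1_000_000        # 1 MW rack
line_voltage_v = 415       # common three-phase distribution voltage (assumption)
power_factor = 1.0         # idealized

current_a = power_w / (math.sqrt(3) * line_voltage_v * power_factor)
print(f"~{current_a:,.0f} A per 1 MW rack")   # roughly 1,400 A

# For comparison, a typical household branch circuit is 15-20 A and a whole
# US home service is commonly 200 A.
```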

Read More »

HPE Nonstop servers target data center, high-throughput applications

HPE has bumped up the size and speed of its fault-tolerant Nonstop Compute servers. There are two new servers – the 8TB, Intel Xeon-based Nonstop Compute NS9 X5 and Nonstop Compute NS5 X5 – aimed at enterprise customers looking to upgrade their transaction processing network infrastructure or support larger application workloads. Like other HPE Nonstop systems, the two new boxes include compute, software, storage, networking and database resources as well as full-system clustering and HPE’s specialized Nonstop operating system. The flagship NS9 X5 features support for dual-fabric HDR200 InfiniBand interconnect, which effectively doubles the interconnect bandwidth between it and other servers compared to the current NS8 X4, according to an HPE blog detailing the new servers. It supports up to 270 networking ports per NS9 X system, can be clustered with up to 16 other NS9 X5s, and can support 25 GbE network connectivity for modern data center integration and high-throughput applications, according to HPE.

Read More »

AI boom exposes infrastructure gaps: APAC’s data center demand to outstrip supply by 42%

“Investor confidence in data centres is expected to strengthen over the remainder of the decade,” the report said. “Strong demand and solid underlying fundamentals fuelled by AI and cloud services growth will provide a robust foundation for investors to build scale.” Enterprise strategies must evolve With supply constrained and prices rising, CBRE recommended that enterprises rethink data center procurement models. Waiting for optimal sites or price points is no longer viable in many markets. Instead, enterprises should pursue early partnerships with operators that have robust development pipelines and focus on securing power-ready land. Build-to-suit models are becoming more relevant, especially for larger capacity requirements. Smaller enterprise facilities — those under 5MW — may face sustainability challenges in the long term. The report suggested that these could become “less relevant” as companies increasingly turn to specialized colocation and hyperscale providers. Still, traditional workloads will continue to represent up to 50% of total demand through 2030, preserving value in existing facilities for non-AI use cases, the report added. The region’s projected 15 to 25 GW gap is more than a temporary shortage — it signals a structural shift, CBRE said. Enterprises that act early to secure infrastructure, invest in emerging markets, and align with power availability will be best positioned to meet digital transformation goals. “Those that wait may find themselves locked out of the digital infrastructure they need to compete,” the report added.

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it has become a regular among the non-tech companies showing off technology at the big tech trade show in Las Vegas, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
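
For readers unfamiliar with the LLM-as-a-judge pattern Witteveen alludes to, here is a minimal sketch of how several cheaper models plus a judge model can be combined; the `call_model` helper and the model names are placeholders, not a real API.

```python
# Minimal sketch of the "LLM as a judge" pattern: several cheaper models draft
# answers and a judge model picks the best one. `call_model` is a hypothetical
# helper standing in for whatever LLM client is used; it is not a real library API.

def call_model(model: str, prompt: str) -> str:
    raise NotImplementedError("wire this up to your LLM provider of choice")

def answer_with_judge(prompt: str,
                      workers=("model-a", "model-b", "model-c"),
                      judge="judge-model") -> str:
    candidates = [call_model(m, prompt) for m in workers]
    numbered = "\n\n".join(f"[{i}] {c}" for i, c in enumerate(candidates))
    verdict = call_model(
        judge,
        f"Question:\n{prompt}\n\nCandidate answers:\n{numbered}\n\n"
        "Reply with only the number of the best answer.",
    )
    return candidates[int(verdict.strip())]
```

The economics only work when the worker models are cheap enough that three drafts plus one judging call still cost less than a single frontier-model call, which is the trend the excerpt points to.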

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »