JERA, Ryohin Keikaku Form RE JV

Ryohin Keikaku Co. Ltd. and JERA Co. Inc. have launched a joint venture to develop renewable energy projects, initially solar.

MUJI ENERGY LLC, based in Tokyo, is owned 80 percent by Japanese retailer Ryohin Keikaku and 20 percent by Japanese utility JERA, according to a joint statement. MUJI is an existing brand of goods sold by Ryohin Keikaku.

The JV plans to develop about 13 megawatts of solar generation capacity within a year of its formation. “Equivalent to 20 percent of Ryohin Keikaku’s annual electricity consumption, this is projected to reduce carbon dioxide emissions by approximately 8,000 tons per year”, the statement said.
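The quoted figures hang together under typical assumptions. As a rough sanity check, here is a minimal sketch in Python; the ~15 percent capacity factor and the ~0.45 kg CO2/kWh grid emission factor are our assumptions, not figures from the statement.

```python
# Back-of-the-envelope check of the JV's quoted figures.
# Assumed (not from the article): ~15% capacity factor for utility-scale
# solar in Japan, ~0.45 kg CO2/kWh average grid emission factor.

CAPACITY_MW = 13            # planned solar capacity (from the article)
CAPACITY_FACTOR = 0.15      # assumption
GRID_KG_CO2_PER_KWH = 0.45  # assumption

annual_mwh = CAPACITY_MW * 8760 * CAPACITY_FACTOR          # ~17,000 MWh/yr generated
implied_demand_mwh = annual_mwh / 0.20                     # if that is 20% of consumption
co2_tons = annual_mwh * 1000 * GRID_KG_CO2_PER_KWH / 1000  # kg CO2 -> metric tons

print(f"Estimated generation: {annual_mwh:,.0f} MWh/yr")
print(f"Implied annual consumption: {implied_demand_mwh:,.0f} MWh/yr")
print(f"Estimated CO2 avoided: {co2_tons:,.0f} t/yr")  # ~7,700 t, near the quoted 8,000 t
```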

“Leveraging JERA’s expertise in solar power plant development and maintenance, we have formulated our own development criteria – guided by consideration for local communities, protection of aquatic environments and biodiversity – and decided to establish solar power plants only after performing on-site inspections and evaluations of all power plant candidate sites”, the statement said.

“All of the environmental value of electricity produced by MUJI ENERGY will be acquired by Ryohin Keikaku via JERA subsidiary JERA Cross Co. Inc. through a virtual power purchase agreement and utilized to reduce CO2 emissions from electricity consumption at locations such as MUJI tenant stores”.

In a separate power investment, JERA joined a consortium of 12 Japanese companies that participated in a capital raise by Commonwealth Fusion Systems LLC (CFS), a Devens, Massachusetts-based company that aims to develop next-generation tokamak fusion reactors using high-temperature superconducting magnet technology.

CFS said August 28 it had raised $863 million in a Series B2 round, which it said was “the largest amount raised among deep tech and energy companies since CFS’ $1.8 billion Series B round in 2021”.

“In parallel, CFS is moving forward with plans to build the world’s first grid-scale fusion power plant, called ARC, in Chesterfield County, Virginia”, CFS said. “This effort to put power on the grid in the early 2030s is bolstered by strategic partnerships with Dominion Energy and Google – an investor in CFS that also agreed to buy half the power produced at the plant”.

JERA said, “The Japanese consortium will acquire technical and commercial expertise in policy, regulatory and the development, construction, operation and maintenance of ARC from CFS’s commercialization projects in the United States”.

“In addition, each consortium will bring together its know-how and expertise and aspire to expedite the commercialization and industrialization of fusion energy power generation in Japan”, JERA said.

“The collaboration between the Japanese consortium and CFS is in line with the Japan-U.S. joint partnership to accelerate the development and commercialization of fusion energy power generation in Japan and the U.S., and represents a strong commitment and growing momentum toward the early realization of commercialization in both countries”.

To contact the author, email [email protected]




Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Cisco launches AI-driven data fabric powered by Splunk

At this week’s Splunk .conf25 event in Boston, Cisco unveiled a new data architecture that’s powered by the Splunk platform and designed to help enterprises glean AI-driven insights from machine-generated telemetry, such as metrics, events, logs and traces. The new Cisco Data Fabric integrates business and machine data for AI

Read More »

Broadcom CEO urges cloud-to-on-prem repatriation

“Since the acquisition two years ago, we roll up our sleeves, did the tough engineering work, and the result today is VMware Cloud Foundation 9.0, a real Software Defined Platform to run all your application workloads with complete compute, networking and storage, tightly integrated, and this is what you asked

Read More »

USA Oil Output At All-Time High But Growth Slowing

U.S. oil production is at an all-time high, but growth is slowing. That’s what analysts at Standard Chartered Bank said in a report sent to Rigzone by the Standard Chartered team last week, highlighting that “total U.S. crude oil production recorded an all time high of 13.58 million barrels per day in June 2025, according to data from the Energy Information Administration (EIA)”. This production figure exceeded the previous high of October 2024 by 50,000 barrels per day, and the pre-COVID November 2019 high by 582,000 barrels per day, the analysts pointed out in the report. “The impending peak in U.S. crude oil production has been much discussed and long forecast; indeed, production in some key states appears to have already peaked,” the Standard Chartered Bank analysts stated in the report. “The year on year trend shows a clear growth slowdown following the immediate post-COVID recovery, with year on year growth in June of 328,000 barrels per day,” they added. “Still the dominant state in terms of volumes, production in Texas fell 33,000 barrels per day year on year and is now 109,000 barrels per day lower than its October 2024 peak of 5.832 million barrels per day,” they continued. “Extrapolating the decelerating year on year growth trend implies that U.S. production will peak nine months from June, at c.14.34 million barrels per day in March 2026, before declines set in,” the Standard Chartered Bank analysts predicted. The analysts warned in the report that, “if a peak is to be avoided, then additional new production (driven by drilling and completion activity, technological advancements, geological quality and operational efficiencies) must exceed the decline rates of existing wells”. “The recent earnings season highlighted widespread strategic adjustments among shale producers and vocal concerns about operational pullbacks in the low-price environment, but also
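The extrapolation step described above is simple to reproduce: fit a line to the decelerating year-on-year growth series and find where it crosses zero. The sketch below is illustrative only; apart from the 328,000 bpd June 2025 figure, the growth series is invented, and the zero crossing of year-on-year growth is only a rough proxy for a production peak.

```python
# Illustrative sketch of the extrapolation Standard Chartered describes:
# fit a line to decelerating year-on-year growth and find its zero crossing.
# Only the final value (328,000 bpd, June 2025) is from the report; the
# earlier points are hypothetical.

import numpy as np

months = np.array([0, 3, 6, 9, 12])                   # months up to June 2025
yoy_growth_kbd = np.array([900, 750, 580, 450, 328])  # hypothetical series ending at 328

slope, intercept = np.polyfit(months, yoy_growth_kbd, 1)
zero_month = -intercept / slope  # where extrapolated growth hits zero

print(f"Fitted slope: {slope:.1f} thousand bpd per month")
print(f"Growth extrapolates to zero ~{zero_month - 12:.1f} months after June 2025")
```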

Read More »

Saudi Wealth Fund Adds to Bond Rush

Saudi Arabia’s sovereign wealth fund sold $2 billion of 10-year dollar bonds to help finance investment plans, adding to a recent wave of Saudi issuance from entities including local banks and the sovereign. The Public Investment Fund’s deal was priced 95 basis points over US Treasuries, compared with initial price talk of around 120 basis points, according to a person familiar with the matter who asked not to be identified. It sold $4 billion of debt in January and signed a $7 billion Islamic loan with 20 banks around the same time. The PIF is the main entity tasked with executing Crown Prince Mohammed bin Salman’s Vision 2030 agenda to shift away from oil and into a more diverse range of industries. The International Monetary Fund said in August that the PIF is expected to continue spending at least $40 billion a year on domestic investment, which “is going to help keep growth positive and robust compared to what’s happening elsewhere.” The fund recently introduced a commercial paper program and has been raising cash by selling stakes in portfolio firms. The PIF is also looking to list some of its companies as it seeks to drive spending in the kingdom amid low oil prices that are complicating efforts to diversify the economy. Monday’s offering follows $20 billion in sales of dollar- and euro-denominated debt from the Saudi government this year — the most among emerging markets, according to data compiled by Bloomberg. The latest move also adds to a pick-up in syndicated loan activity and a fresh wave of local bank issuance. Countries and companies in the developing world have been rushing to sell bonds at the fastest clip in at least a decade, taking advantage of high appetite for emerging-market assets to issue debt amid what investors say could be sharper swings ahead in global debt markets. Saudi Arabia

Read More »

ADNOC Gains Entry into FTSE Emerging Index

ADNOC Gas PLC said Monday it has been selected for the FTSE Emerging Index, part of its efforts to boost its investment attractiveness. The gas processing and sales arm of Abu Dhabi National Oil Co will debut on the index, which tracks the performance of large and mid-cap companies in emerging markets within the FTSE Russell Global Equity Index Series, when markets open September 22. “ADNOC Gas believes this milestone will enhance its visibility among global institutional investors, broaden its shareholder base and support liquidity on the Abu Dhabi Securities Exchange”, ADNOC Gas said in a statement on its website. “Market analysts estimate that inclusion may result in up to $250 million in incremental investment”. In June ADNOC Gas joined the MSCI Emerging Markets Index, which it expects to boost its passive cash inflows by $300-500 million. In February ADNOC Gas completed a $2.84 billion placement of 3.1 billion shares. The so-called marketed offering, the first in the UAE according to ADNOC, was priced about 43 percent higher than ADNOC Gas’ initial public offering in March 2023. The marketed offering, which attracted Gulf and international investors, represented four percent of ADNOC Gas’ issued and outstanding share capital, the parent said then. ADNOC Gas has raised its planned 2025-29 capital expenditure to $20 billion to capitalize on growing domestic demand for natural gas. During the five years, ADNOC Gas aims to complete the Big Oil-backed Ruwais LNG project. Targeted to start production in 2028, the 9.6 million tons per annum (MMtpa) facility would more than double ADNOC’s LNG output. In June ADNOC Gas said it had approved phase one of the Rich Gas Development (RGD) project, which will increase its throughput capacity and operational efficiency. It awarded $5 billion worth of contracts for RGD. Also planned to be completed in 2025-29 are the

Read More »

North America Breaks Rig Loss Streak

North America added seven rigs week on week, according to Baker Hughes’ latest North America rotary rig count, which was released on September 5. The U.S. added one rig and Canada added six rigs week on week, taking the total North America rig count up to 718, comprising 537 rigs from the U.S. and 181 rigs from Canada, the count outlined. Of the total U.S. rig count of 537, 522 rigs are categorized as land rigs, 13 are categorized as offshore rigs, and two are categorized as inland water rigs. The total U.S. rig count is made up of 414 oil rigs, 118 gas rigs, and five miscellaneous rigs, according to Baker Hughes’ count, which revealed that the U.S. total comprises 470 horizontal rigs, 54 directional rigs, and 13 vertical rigs. Week on week, the U.S. offshore and inland water rig counts remained unchanged and the country’s land rig count increased by one, Baker Hughes highlighted. The U.S. gas rig count dropped by one, the country’s oil rig count increased by two, and its miscellaneous rig count remained unchanged week on week, the count showed. The U.S. directional rig count dropped by one, week on week, while its horizontal rig count increased by two and its vertical rig count remained unchanged during the same period, the count revealed. A major state variances subcategory included in the rig count showed that, week on week, Texas added two rigs and New Mexico dropped two rigs. A major basin variances subcategory included in Baker Hughes’ rig count showed that, week on week, the Permian basin dropped one rig. Canada’s total rig count of 181 is made up of 123 oil rigs and 58 gas rigs, Baker Hughes pointed out. Week on week, the country’s oil rig count increased by three, its gas rig
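The count slices the same U.S. total three ways, which makes the figures easy to cross-check. A small sketch using only numbers quoted above:

```python
# Cross-checking the Baker Hughes figures quoted above: the U.S. total
# is sliced three ways (location, target, trajectory), and each slice
# should sum to the same 537 rigs.

us_breakdowns = {
    "location":   {"land": 522, "offshore": 13, "inland water": 2},
    "target":     {"oil": 414, "gas": 118, "miscellaneous": 5},
    "trajectory": {"horizontal": 470, "directional": 54, "vertical": 13},
}
US_TOTAL, CANADA_TOTAL, NA_TOTAL = 537, 181, 718

for name, parts in us_breakdowns.items():
    assert sum(parts.values()) == US_TOTAL, f"{name} breakdown does not add up"

assert US_TOTAL + CANADA_TOTAL == NA_TOTAL  # 537 + 181 = 718
assert 123 + 58 == CANADA_TOTAL             # Canada's oil + gas rigs
print("All published breakdowns reconcile.")
```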

Read More »

China Plan to Scale Back Refining Seen Taking 5 Years

China’s plan to scale back the oil-refining industry as part of a broader push against overcapacity and excessive competition could take half a decade to make an impact, according to a major independent processor. The initiative could require three to five years to phase out about 100 million tons of refining capacity, Li Xinhua, global head of trading at Rongsheng Petrochemical Co., said at APPEC by S&P Global Commodity Insights. “It will take quite a long time,” Li told the industry conference in Singapore. Policymakers in Asia’s largest economy have been pressing on with a so-called anti-involution campaign to ease the strain on companies from aggressive competition, tackle deflation, and support economic growth. In oil refining and petrochemicals, there’s a drive to phase out smaller facilities and upgrade outdated plants, while redirecting investment toward advanced materials. Capacity reductions involve negotiations between central and local governments, and these take time as they impact employment, which is a pertinent factor given China’s still-shaky property market, Li said on Tuesday. The impact of trade tariffs, meanwhile, is that oil prices no longer reflect market dynamics, harming the financial benefit for Chinese refiners, according to Li. “That will encourage us to lower the output of oil products, while they will encourage more petrochemical production,” he said.

Read More »

Thailand to Proceed with First CCS Project

PTT Exploration and Production Public Co Ltd announced Monday a positive FID (final investment decision) for Thailand’s first carbon capture and storage (CCS) development, targeting first injection in 2028. Earmarked THB 10 billion ($320 million) over five years, the Arthit field CCS project is planned to have a carbon dioxide storage capacity of one million metric tons a year, to be gradually achieved. The project will leverage existing infrastructure at the producing field and build new facilities. It will not impact Arthit’s natural gas production, according to the state-owned oil and gas company. “PTTEP has conducted extensive studies and assessments in preparation for the project, covering the entire process – from storage site selection with reservoirs at depths of 1,000-2,000 meters, to engineering design, and comprehensive measurement, monitoring and verification programs”, PTTEP said in a statement on its website. “The Arthit CCS Project has been endorsed as one of Flagship Projects for greenhouse gas emissions reduction under the NDC Action Plan, with an approved direction for the government to consider measures and granting of appropriate investment support such as tax incentives”, PTTEP said. The Southeast Asian country aims to cut its greenhouse gas emissions by 30 percent relative to 2025 by 2030, according to its Nationally Determined Contribution (NDC) submitted to the United Nations, updated in 2022. “The Arthit CCS Project will serve as a pilot for cultivating expertise and driving CCS adoption in Thailand, including the Eastern CCS Hub in the Northern Gulf of Thailand, which has potential to make material contribution to the country’s Net Zero target and enhance the nation’s long-term economic competitiveness”, said chief executive Montri Rawanchaikul. In December 2023 PTTEP signed an agreement to conduct a study with the Japan Organization for Metals and Energy Security (JOGMEC) and INPEX Corp on the carbon storage potential of the

Read More »

Google adds Gemini to its on-prem cloud for increased data protection

Google has announced the general availability of its Gemini artificial intelligence models on Google Distributed Cloud (GDC), making its generative AI product available in enterprise and government data centers. GDC is an on-premises implementation of Google Cloud, aimed at heavily regulated industries like medical and financial services, to bring Google Cloud services within company firewalls rather than the public cloud. The launch of Gemini on GDC allows organizations with strict data residency and compliance requirements to deploy generative AI without compromising control over sensitive information. GDC uses Nvidia Hopper- and Blackwell-era GPU accelerators with automated load balancing and zero-touch updates for high availability. Security features include audit logging and access control capabilities that provide full transparency for customers. The platform also features Confidential Computing support for both CPUs (with Intel TDX) and GPUs (with Nvidia’s confidential computing) to secure sensitive data and prevent tampering or exfiltration.

Read More »

Nvidia networking roadmap: Ethernet, InfiniBand, co-packaged optics will shape data center of the future

Nvidia is baking into its Spectrum-X Ethernet platform a suite of algorithms that implement networking protocols to allow Spectrum-X switches, ConnectX-8 SuperNICs, and systems with Blackwell GPUs to connect over wider distances without requiring hardware changes. These Spectrum-XGS algorithms use real-time telemetry—tracking traffic patterns, latency, congestion levels, and inter-site distances—to adjust controls dynamically.

Ethernet and InfiniBand

Developing and building Ethernet technology is a key part of Nvidia’s roadmap. Since it first introduced Spectrum-X in 2023, the vendor has rapidly made Ethernet a core development effort. This is in addition to InfiniBand development, which is still Nvidia’s bread-and-butter connectivity offering. “InfiniBand was designed from the ground up for synchronous, high-performance computing — with features like RDMA to bypass CPU jitter, adaptive routing, and congestion control,” said Gilad Shainer, senior vice president of networking at Nvidia. “It’s the gold standard for AI training at scale, connecting more than 270 of the world’s top supercomputers. Ethernet is catching up, but traditional Ethernet designs — built for telco, enterprise, or hyperscale cloud — aren’t optimized for AI’s unique demands,” Shainer said. Most industry analysts predict Ethernet deployment for AI networking in enterprise and hyperscale deployments will increase in the next year; that makes Ethernet advancements a core direction for Nvidia and any vendor looking to offer AI connectivity options to customers. “When we first initiated our coverage of AI back-end networks in late 2023, the market was dominated by InfiniBand, holding over 80% share,” wrote Sameh Boujelbene, vice president at Dell’Oro Group, in a recent report. “Despite its dominance, we have consistently predicted that Ethernet would ultimately prevail at scale. What is notable, however, is the rapid pace at which Ethernet gained ground in AI back-end networks. As the industry moves to 800 Gbps and beyond, we believe Ethernet is now firmly positioned to overtake InfiniBand in these high-performance deployments.”
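Nvidia has not published how the Spectrum-XGS algorithms work internally, so the following is a purely hypothetical sketch of the general pattern the article describes: scaling a sender’s congestion window from live telemetry such as round-trip time and buffer occupancy. Names and formulas here are ours, not Nvidia’s.

```python
# Hypothetical illustration only -- not Nvidia's actual algorithm.
# The idea: long inter-site paths need a larger in-flight window to fill
# the pipe (bandwidth-delay product grows with distance), while rising
# switch-buffer occupancy is a congestion signal to back off.

from dataclasses import dataclass

@dataclass
class LinkTelemetry:
    rtt_ms: float        # measured round-trip time
    queue_depth: float   # switch buffer occupancy, 0.0 (empty) to 1.0 (full)

def adjust_window(cwnd: int, t: LinkTelemetry, base_rtt_ms: float = 0.05) -> int:
    """Scale a congestion window using distance and congestion telemetry."""
    distance_scale = max(1.0, t.rtt_ms / base_rtt_ms)  # grow window on long-haul paths
    congestion_scale = 1.0 - 0.8 * t.queue_depth       # shrink it as buffers fill
    return max(1, int(cwnd * distance_scale * congestion_scale))

# Example: an inter-site hop with 10 ms RTT and 30% buffer occupancy.
print(adjust_window(8, LinkTelemetry(rtt_ms=10.0, queue_depth=0.3)))
```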

Read More »

Inside the AI-optimized data center: Why next-gen infrastructure is non-negotiable

How are AI data centers different from traditional data centers? AI data centers and traditional data centers can be physically similar, as they contain hardware, servers, networking equipment, and storage systems. The difference lies in their capabilities: Traditional data centers were built to support general computing tasks, while AI data centers are specifically designed for more sophisticated, time- and resource-intensive workloads. Conventional data centers are simply not optimized for AI’s advanced tasks and necessary high-speed data transfer. Here’s a closer look at their differences:

AI-optimized vs. traditional data centers
Traditional data centers: Handle everyday computing needs such as web browsing, cloud services, email and enterprise app hosting, data storage and retrieval, and a variety of other relatively low-resource tasks. They can also support simpler AI applications, such as chatbots, that do not require intensive processing power or speed.
AI data centers: Built to compute significant volumes of data and run complex algorithms, ML and AI tasks, including agentic AI workflows. They feature high-speed networking and low-latency interconnects for rapid scaling and data transfer to support AI apps and edge and internet of things (IoT) use cases.

Physical infrastructure
Traditional data centers: Typically composed of standard networking architectures such as CPUs suitable for handling networking, apps, and storage.
AI data centers: Feature more advanced graphics processing units (GPUs), popularized by chip manufacturer Nvidia, tensor processing units (TPUs), developed by Google, and other specialized accelerators and equipment.

Storage and data management
Traditional data centers: Generally store data in more static cloud storage systems, databases, data lakes, and data lakehouses.
AI data centers: Handle huge amounts of unstructured data including text, images, video, audio, and other files. They also incorporate high-performance tools including parallel file systems, multiple network servers, and NVMe solid state drives (SSDs).

Power consumption
Traditional data centers: Require robust cooling

Read More »

From Cloud to Concrete: How Explosive Data Center Demand is Redefining Commercial Real Estate

The world will generate 181 ZB of data in 2025, an increase of 23.13% year over year, with 2.5 quintillion bytes (a quintillion bytes is also called an exabyte, EB) created daily, according to a report from Demandsage. To put that in perspective: One exabyte is equal to 1 quintillion bytes, which is 1,000,000,000,000,000,000 bytes. That’s 29 TB every second, or 2.5 million TB per day. It’s no wonder data centers have become so crucial for creating, consuming, and storing data — and no wonder investor interest has skyrocketed. The surging demand for secure, scalable, high-performance retail and wholesale colocation and hyperscale data centers is spurred by the relentless, global expansion of cloud computing and demand for AI as data generation from businesses, governments, and consumers continues to surge. Power access, sustainable infrastructure, and land acquisition have become critical factors shaping where and how data center facilities are built. As a result, investors increasingly view these facilities not just as technology assets, but as a unique convergence of real estate, utility infrastructure, and mission-critical systems. Capitalizing on this momentum, private equity and real estate investment firms are rapidly expanding into the sector through acquisitions, joint ventures, and new funds—targeting opportunities to build and operate facilities with a focus on energy efficiency and scalability.
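The per-day and per-second figures follow directly from the daily total; a quick check in decimal (SI) units:

```python
# Verifying the conversions quoted above, in decimal (SI) units.

EB = 10**18  # one exabyte ("quintillion bytes")
TB = 10**12  # one terabyte

daily_bytes = 2.5 * EB            # "2.5 quintillion bytes created daily"
print(daily_bytes / TB)           # 2,500,000.0 -> "2.5 million TB per day"
print(daily_bytes / TB / 86_400)  # ~28.9 -> "29 TB every second"
```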

Read More »

Ai4 2025 Navigates Rapid Change in AI Policy, Education

The pace of innovation in artificial intelligence is fundamentally reshaping the landscape of education, and the changes are happening rapidly. At the forefront of this movement stand developers, policymakers, educational practitioners, and associated experts at the recent Ai4 2025 conference (Aug. 11-13) in Las Vegas, where leading voices such as Geoffrey Hinton, “the Godfather of AI,” top executives from Google and U.S. Bank, and representatives from multiple government agencies gathered to chart the future of AI development. Importantly, educators and academic institutions played a central role, ensuring that the approach to AI in schools is informed by those closest to the classroom. Key discussions at Ai4 and recent educator symposia underscored both the promise and peril of swift technological change. Generative AI, with its lightning-fast adoption since the advent of tools like ChatGPT, is opening new possibilities for personalized learning, skills development, and operational efficiency. But participants were quick to note that acceleration brings good and bad consequences. On one hand, there’s excitement about practical classroom implementations and the potential for students to engage with cutting-edge technology. On the other, concerns about governance, ethics, safety, and the depth of genuine learning remain at the forefront. This urgency to “do this right” is echoed by teachers, unions, and developers who are united by the challenges and opportunities on the ground. Their voices highlight the need for agreement on education policy and associated regulations to keep pace with technological progress, create frameworks for ethical and responsible use, and ensure that human agency remains central in shaping the future of childhood and learning. In this rapidly evolving environment, bringing all stakeholders to the table is no longer optional; it is essential for steering AI in education toward outcomes that benefit both students and society.

Global Context: America, China, and the AI Race

Read More »

Two Lenses on One Market: JLL and CBRE Show Data Centers in a Pinch

The two dominant real estate research houses, JLL and CBRE, have released midyear snapshots of the North American data center market, and both paint the same picture in broad strokes: demand remains insatiable, vacancy has plunged to record lows, and the growth of AI and hyperscale deployments is reshaping every aspect of the business. But their lenses capture different angles of the same story: one emphasizing preleasing and capital flows, the other highlighting hyperscale requirements and regional shifts.

Vacancy Falls Through the Floor

JLL sets the stage with a stark headline: colocation vacancy is nearing 0%. The JLL Midyear 2025 North America Data Center report warns that this scarcity “is constraining economic growth and undermining national security,” underscoring the role of data centers as critical infrastructure. CBRE’s North American Data Center Trends H1 2025 numbers back this up, recording an all-time low North America vacancy rate of 1.6%, the tightest in more than a decade. Both agree that market loosening is years away — JLL projecting vacancy hovering around 2% through 2027, CBRE noting 74.3% of new capacity already spoken for. The takeaway seems clear: without preleasing, operators and tenants alike are effectively shut out of core markets.

Absorption and Preleasing Drive Growth

JLL drills down into the mechanics. With virtually all absorption tied to preleasing, the firm points to Northern Virginia (647 MW) and Dallas (575 MW) as the twin engines of growth in H1, joined by Chicago, Austin/San Antonio, and Atlanta. CBRE’s absorption math is slightly different, but the conclusion aligns: Northern Virginia again leads the nation, with 538.6 MW net absorption and a remarkable 80% surge in under-construction capacity. CBRE sharpens the view by noting that the fiercest competition is at the top end: single-tenant requirements of 10 MW or more are setting pricing records as hyperscalers

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.)

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
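The LLM-as-judge pattern mentioned above is straightforward to sketch. The example below uses the OpenAI Python SDK; the model name, prompt, and scoring scheme are illustrative choices of ours, not anything prescribed in the article.

```python
# Minimal LLM-as-judge sketch (illustrative; model names and prompts are
# arbitrary choices, not from the article). Requires OPENAI_API_KEY.

from openai import OpenAI

client = OpenAI()

JUDGE_PROMPT = (
    "You are a strict evaluator. Given a task and a candidate answer, "
    "reply with only a single score from 1 (unusable) to 5 (excellent)."
)

def judge(task: str, answer: str, model: str = "gpt-4o-mini") -> int:
    """Ask a cheap model to score another model's output."""
    resp = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": JUDGE_PROMPT},
            {"role": "user", "content": f"Task: {task}\n\nAnswer: {answer}"},
        ],
    )
    return int(resp.choices[0].message.content.strip()[0])

# The "three or more models" idea: take the median of several judgments
# to damp any single judge's bias.
scores = [judge("Summarize Q3 revenue drivers.", "Revenue rose on cloud growth.")
          for _ in range(3)]
print(sorted(scores)[1])
```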

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »