
Google adds more AI tools to its Workspace productivity apps

Google continues to bring its flagship AI models to its productivity apps, expanding its Gemini features. 

The company today announced several updates to its Workspace products, including the addition of Audio Overviews and new streamlined methods for tracking meetings. 

Audio Overviews, a feature first introduced in Google’s popular NotebookLM, lets people generate podcast-style audio discussions of their chosen research topics.

Now, through Gemini, users can create audio files based on uploaded documents and slides, and can also generate audio overviews within deep research reports. These podcast-style audio files are downloadable. Audio Overviews generates the voices and grounds its discussions solely in the provided documents.

Google previously told VentureBeat that its tests showed some people prefer learning through listening, where information is presented in a conversational format. 

The company also launched Canvas, a new feature in Gemini that lets people draft and refine text or code with the model. Google said Canvas helps “generate, optimize and preview code,” and Canvas documents can be shared to Google Docs.

Updated calendars

Google also streamlined how users can add events and meetings to their calendars. Gemini will detect when an email contains event details and can prompt people to add the event to their calendar. If the user overlooks such an email, the model will surface it as a potential appointment.

Some Gmail plug-ins, such as Boomerang, offer similar features that display appointments above the subject line. The Gemini-powered calendar feature instead opens a Gemini chat window alerting the user to the event.

Pointing AI models to surface data or events from emails has become a cornerstone of enterprise AI assistants and agents. Microsoft’s new agents parse through emails for input. Startup Martin AI has an AI assistant that manages calendars, emails and to-do lists.  

Melding generative AI with productivity

Google added Gemini chat to Workspace last year to integrate the standalone chat platform with Gmail, Google Docs and Google Calendar. This brought Google closer to Microsoft’s Copilot, which added AI models to Microsoft’s productivity platforms, including Outlook.

Enterprises continue to add AI features to the workplace, and if employees regularly encounter tools like Gemini in Gmail and use AI models for research, overall AI adoption rates could climb even higher.


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Linkerd 2.18 advances cloud-native service mesh

The project’s focus has evolved significantly over the years. While early adoption centered on mutual TLS between pods, today’s enterprises are tackling much larger challenges. “For a long time, the most common pattern was simply, ‘I want to get mutual TLS between all my pods, which gives me encryption, and

Read More »

18 essential commands for new Linux users

[jdoe@fedora ~]$ ls -ld /home/jdoe
drwx------. 1 jdoe jdoe 106 Apr 3 14:39 /home/jdoe

As you may have suspected, “r” stands for read, “w” means write and “x” is for execute. Note that no permissions are available for other group members and anyone else on the system. Each user will be

Read More »
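To make the permission string in the Linux excerpt above concrete, here is a minimal follow-along sketch; the user name, file name and timestamps are illustrative assumptions rather than details from the article:

[jdoe@fedora ~]$ touch notes.txt         # create an empty file owned by jdoe
[jdoe@fedora ~]$ ls -l notes.txt         # read/write for the owner, read-only for group and others
-rw-r--r--. 1 jdoe jdoe 0 Apr 3 14:39 notes.txt
[jdoe@fedora ~]$ chmod go-r notes.txt    # drop read ("r") for group members and everyone else
[jdoe@fedora ~]$ ls -l notes.txt
-rw-------. 1 jdoe jdoe 0 Apr 3 14:39 notes.txt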

What are GPUs? Inside the processing power behind AI

AI and generative AI

Today’s increasingly sophisticated AI technologies — notably large language models (LLMs) and generative AI — require lots of speed, lots of data and lots of compute. Because they can perform simultaneous calculations and handle vast amounts of data, GPUs have become the powerhouse behind AI (e.g.,

Read More »

Cerulean Winds moving towards submitting key Aspen document

Cerulean Winds has moved toward submitting the environmental impact assessment (EIA) for its 1GW Aspen floating offshore wind farm. This follows the completion of aerial surveys of the planned site in the Central North Sea. The developer completed 72 digital aerial surveys (DAS) for both Aspen and its sister projects, Beech and Cedar. This saw ecological data collected across around 1,077 square miles (2,784 sq km) of ocean. The work was performed by HiDef Surveying using fixed-wing aircraft, taking place between April 2023 and March 2025. HiDef Surveying CEO Nicholas Tompkins said: “Completing 72 digital aerial surveys across these sites marks a significant milestone in offshore wind development. “This large-scale data collection effort supports comprehensive environmental assessments, aiding the consenting process and ensuring sustainable project planning. “By using ultra-high-resolution digital surveying technology, we have achieved precise species identification while minimising environmental impact – reinforcing our commitment to innovation in offshore monitoring.”

Environmental impact assessment

Having completed the survey stage, Cerulean Winds is now set to submit its EIA for the Aspen project. In addition, the data gathered during the surveys will be used in the assessments for Beech and Cedar’s consenting phase. The findings will also help optimise the project’s designs, such as determining turbine placement and infrastructure planning. Cerulean Winds founding director Dan Jackson said: “The completion of this best-in-class work, delivered at scale, is another milestone in the development of the UK’s floating offshore wind industry. The HiDef team have provided the accurate data we need to make our projects as sustainable as possible. “We’re now looking ahead, alongside our consortium of partners, to developing the project and capturing the benefits of a UK-based FLOW industry.” Aspen, Cedar and Beech, which have over 3GW of capacity between

Read More »

GB Energy receives £300m for offshore wind supply chains

GB Energy will receive £300 million of funding to invest in domestic supply chains for the UK’s offshore wind sector. The new investment aims to boost jobs, secure manufacturing facilities for critical clean energy supply chains like floating offshore platforms, and according to the UK government, could “directly and indirectly mobilise billions in additional private investment”. The latest funding forms part of the previously announced £8.3 billion for GB Energy over the current parliament, with individual companies able to apply for grants if they can show that they will produce long-term investments in UK supply chains. The fast-tracked funding is being brought forward by prime minister Keir Starmer ahead of the Comprehensive Spending Review. The public investment complements the £43 billion of private investment pledged for clean energy projects since July. Starmer said: “Delivering the Plan for Change means winning the race for the clean energy jobs of the future, which will drive growth and help us reach clean power by 2030. “That is why I am bringing forward much-needed investment in our domestic offshore wind supply chains, strengthening our security and creating good jobs for our welders, electricians, and engineers. “Let my message to the world go out: come and build the clean energy future in Britain.”

GB Energy

The funding forms part of the government’s modern Industrial Strategy and aims to help build the supply chains for components which are essential to delivering clean power by 2030. His comments come as Starmer, along with energy secretary Ed Miliband and GB Energy chairman Juergen Maier, will write to global clean energy developers and investors calling on them to invest in the UK’s economy. Energy secretary Ed Miliband added: “It is only by taking back control of our energy that we can protect families and businesses from the rollercoaster of

Read More »

IEA Chief Says Oil Prices May Fall Further on China Slowdown

Oil prices may decline further this year as new production swells and demand remains capped by China’s faltering growth, the head of the International Energy Agency said. While crude futures have recovered over the past two weeks to trade near $68 a barrel in London, they remain roughly 9% below levels traded before President Donald Trump announced a blizzard of tariffs on China and other nations on April 2. The IEA sees “slow demand growth in the markets, mainly driven by what is happening in China,” Executive Director Fatih Birol said in an interview with Bloomberg television. “If there are no other surprises, we may expect oil prices to see a downward pressure further.” There’s still lots of uncertainty and a “change in the trade war’s context in a positive direction may increase the global economic outlook, and we may see oil demand slightly higher than what we have now,” Birol said. It’s also hard to predict a direction for Iran’s oil exports amid the country’s talks with the Trump administration, he said. Birol is visiting London as the IEA and the UK government convene a two-day summit of more than 60 countries and numerous companies to discuss the future of energy security. The Paris-based agency was founded by the US and other oil-importing nations in the 1970s to monitor energy issues. Last year, the IEA — which has predicted that global oil demand will stop growing this decade — drew criticism from some US Republican lawmakers, who accused the agency of focusing on the transition to clean energy at the expense of its core mandate on supply security. Birol said the agency works with the Trump administration and other member nations “in a respectful way,” and that it continues to forecast that oil demand growth will “slow down considerably

Read More »

Crude Falls Despite Trade Talk Optimism

Oil fell after a Reuters report that some OPEC+ members will seek an accelerated output increase overshadowed an optimistic turn in trade talks between the US and China. West Texas Intermediate futures sank 2.2% to settle just above $62 a barrel. The report compounded pressure from an earlier item from the news service that Kazakhstan’s newly appointed energy minister said the country is unable to reduce production at its three largest projects, which was later walked back. The commodity pared some losses on a Wall Street Journal report that US tariffs on China could be cut by more than half in some cases. Treasury Secretary Scott Bessent later told reporters that President Donald Trump hasn’t offered to take down US tariffs on China on a unilateral basis and that a full trade deal with the country may take two to three years. Trump’s earlier comments that he doesn’t plan to fire Federal Reserve Chair Jerome Powell spurred a relief rally in stocks and provided a floor to crude prices. Oil has declined sharply this month, touching a four-year low at one point, driven by investors’ fears that the onslaught of tariffs and counter-levies between the US and its biggest trading partners will sap crude demand. The drop has been compounded by the OPEC+ alliance’s decision to bring back production at a faster-than-expected pace, reviving concerns about a supply glut. That move was designed to keep perennial overproducers like Kazakhstan in line with their targets, and Saudi Arabia’s energy minister said at the time the hike would be just an “aperitif” if those countries didn’t improve their performance. “It’s kind of hard to put the toothpaste back in the tube on this one,” said Rebecca Babin, a senior energy trader at CIBC Private Wealth Group, in regards to Kazakhstan’s recanting of

Read More »

Enphase to absorb bulk of China tariff hit this year: CEO

Dive Brief: Enphase Energy expects to absorb most of the impact of the Trump administration’s China tariffs this year as it works to line up non-China battery cell supplies by early 2026, CEO Badri Kothandaraman said Tuesday on the company’s first quarter earnings call.  Though Enphase could raise battery prices by 6% to 8% later this year, it plans to bear the brunt of triple-digit duties on cells and other battery materials imported from China, which Kothandaraman said accounts for 90% to 95% of global battery cell supply. Enphase reported a 13% decline in U.S. revenue from Q4 2024 due to seasonality and softening demand, it said, amid broader uncertainty around U.S. trade policy and the fate of U.S. tax credits that benefit domestic battery manufacturers and installers. Dive Insight: Enphase’s geographically diversified manufacturing base provides some tariff protection for non-battery products, such as microinverters and electric vehicle charging equipment, Kothandaraman said on the call. Its battery business does face significant cost increases due to China’s dominance of the battery supply chain, however. Though the company makes about 25% of its batteries in the United States and plans to further increase that share, it remains reliant on China-made cells for now, Kothandaraman said. U.S. battery distributors and energy storage developers were already bracing for higher import duties on Chinese inputs thanks to an expected increase in tariffs imposed during the Biden administration — but the 145% duty on a range of Chinese imports far exceeds the double-digit tariffs Trump threatened during the 2024 campaign. Administration officials suggested this week that China tariffs could decline to 50% to 65% in the near term without offering details on the timing or scope of the potential change. Looking ahead, Enphase must weigh the impacts of import duties against the higher cost of U.S.

Read More »

Aberdeen MP calls for support for North Sea oil and gas workers

A north-east Scotland MP has called on the UK government to provide transitional support for North Sea oil and gas workers. Leading a debate in Westminster, MP for Aberdeen North Kirsty Blackman urged the UK government to deliver a clear and credible plan to protect jobs and communities during the energy transition. “As of 2021, direct employment in oil and gas in Aberdeen has declined by nearly one-third since 2015,” Blackman said, citing evidence from the UK’s seventh carbon budget. “Household disposable income has fallen and poverty has increased… some estimates indicate that around 14,000 people in the region will need to have moved to other roles or sectors between 2022 and 2030.” The fate of North Sea oil and gas workers, along with those in services reliant upon extraction, is a key challenge of the UK’s energy transition. The Labour Party previously made banning future North Sea oil and gas licences a key part of its election campaign as it looks to move the country away from fossil fuels towards renewables. Addressing the chamber, the SNP MP warned that political uncertainty and a lack of investment are threatening to derail the UK’s energy ambitions, with skilled workers increasingly looking overseas for opportunities. “We are at a tipping point,” Blackman added. “The risk is that these highly mobile, highly paid oil and gas workers will go abroad. They can up sticks and move to another country, because drilling is the same there—even if the carbon cost is higher and conditions are worse.”

Support plan

Blackman previously gave her backing to Unite the Union’s ‘No Ban Without a Plan’ campaign to preserve oil worker jobs throughout the transition. The trade union launched the campaign to create 35,000 commensurate new energy transition

Read More »

Cloudbrink pushes SASE boundaries with 300 Gbps data center throughput

Those core components are functionally table stakes and don’t really serve to differentiate Cloudbrink against its myriad competitors in the SASE market. Where Cloudbrink looks to differentiate is at a technical level through a series of innovations including:

Distributed edge architecture: The company has decoupled software from hardware, allowing their platform to run across 800 data centers by leveraging public clouds, telco networks and edge computing infrastructure. This approach reduces network latency from 300 milliseconds to between 7 and 20 milliseconds, the company says. This density dramatically improves TCP performance and responsiveness.

Protocol optimization: Cloudbrink developed its own algorithms for SD-WAN optimization that bring enterprise-grade reliability to last mile links. These algorithms significantly improve efficiency on consumer broadband connections, enabling enterprise-grade performance over standard internet links.

Integrated security stack: “We’ve been able to produce secure speeds at line rate on our platform by bringing security to the networking stack itself,” Mana noted. Rather than treating security as a separate overlay that degrades performance, Cloudbrink integrates security functions directly into the networking stack.

The solution consists of three core components: client software for user devices, a cloud management plane, and optional data center connectors for accessing internal applications. The client intelligently connects to multiple edge nodes simultaneously, providing redundancy and application-specific routing optimization.

Cloudbrink expands global reach

Beyond its efforts to increase throughput, Cloudbrink is also growing its global footprint. Cloudbrink today announced a global expansion through new channel agreements and the opening of a Brazil office to serve emerging markets in Latin America, Korea and Africa. The expansion includes exclusive partnerships with WITHX in Korea, BAMM Technologies for Latin America distribution and OneTic for African markets. The company’s software-defined FAST (Flexible, Autonomous, Smart and Temporary) Edges technology enables rapid deployment of points of presence by leveraging existing infrastructure from multiple

Read More »

CIOs could improve sustainability with data center purchasing decisions — but don’t

CIOs can drive change

Even though it’s difficult to calculate an organization’s carbon footprint, CIOs and IT purchasing leaders trying to reduce their environmental impact can influence data center operators, experts say. “Customers have a very large voice,” Seagate’s Feist says. “Don’t underestimate how powerful that CIO feedback loop is. The large cloud accounts are customer-obsessed organizations, so they listen, and they react.” While DataBank began using renewable energy years ago, customer demand can push more data center operators to follow suit, Gerson says. “For sure, if there is a requirement to purchase renewable power, we are going to purchase renewable power,” she adds.

Read More »

Copper-to-optics technology eyed for next-gen AI networking gear

Broadcom’s demonstration and a follow-up session explored the benefits of further developing CPC, such as reduced signal integrity penalties and extended reach, through channel modeling and simulations, Broadcom wrote in a blog about the DesignCon event. “Experimental results showed successful implementation of CPC, demonstrating its potential to address bandwidth and signal integrity challenges in data centers, which is crucial for AI applications,” Broadcom stated. In addition to the demo, Broadcom and Samtec also authored a white paper on CPC that stated: “Co-packaged connectivity (CPC) provides the opportunity to omit loss and reflection penalties from the [printed circuit board (PCB)] and the package. When high speed I/O is cabled from the top of the package, advanced PCB materials are not necessary. Losses from package vertical paths and PCB routing can be transferred to the longer reach of cables,” the authors stated. “As highly complex systems are challenged to scale the number of I/O and their reach, co-packaged connectivity presents opportunity. As we approach 224G-PAM4 [which uses optical techniques to support 224 Gigabits per second data rates per optical lane] and above, system loss and dominating noise sources necessitate the need to re-consider that which has been restricted in the back of the system architect’s mind for years: What if we attached to the package?” At OFC, Samtec demonstrated its Si-Fly HD co-packaged cable assemblies and Samtec Flyover Octal Small Form-factor Pluggable (OSFP) over the Samtec Eye Speed Hyper Low Skew twinax copper cable. Flyover is Samtec’s proprietary way of addressing signal integrity and reach limitations of routing high-speed signals through traditional printed circuit boards (PCBs). “This evaluation platform incorporates Broadcom’s industry-leading 200G SerDes technology and Samtec’s co-packaged Flyover technology. Si-Fly HD CPC offers the industry’s highest footprint density and robust interconnect which enables 102.4T (512 lanes at 200G) in a 95 x

Read More »

The Rise of AI Factories: Transforming Intelligence at Scale

AI Factories Redefine Infrastructure

The architecture of AI factories reflects a paradigm shift that mirrors the evolution of the industrial age itself—from manual processes to automation, and now to autonomous intelligence. Nvidia’s framing of these systems as “factories” isn’t just branding; it’s a conceptual leap that positions AI infrastructure as the new production line. GPUs are the engines, data is the raw material, and the output isn’t a physical product, but predictive power at unprecedented scale. In this vision, compute capacity becomes a strategic asset, and the ability to iterate faster on AI models becomes a competitive differentiator, not just a technical milestone. This evolution also introduces a new calculus for data center investment. The cost-per-token of inference—how efficiently a system can produce usable AI output—emerges as a critical KPI, replacing traditional metrics like PUE or rack density as primary indicators of performance. That changes the game for developers, operators, and regulators alike. Just as cloud computing shifted the industry’s center of gravity over the past decade, the rise of AI factories is likely to redraw the map again—favoring locations with not only robust power and cooling, but with access to clean energy, proximity to data-rich ecosystems, and incentives that align with national digital strategies.

The Economics of AI: Scaling Laws and Compute Demand

At the heart of the AI factory model is a requirement for a deep understanding of the scaling laws that govern AI economics. Initially, the emphasis in AI revolved around pretraining large models, requiring massive amounts of compute, expert labor, and curated data. Over five years, pretraining compute needs have increased by a factor of 50 million. However, once a foundational model is trained, the downstream potential multiplies exponentially, while the compute required to utilize a fully trained model for standard inference is significantly less than

Read More »

Google’s AI-Powered Grid Revolution: How Data Centers Are Reshaping the U.S. Power Landscape

Google Unveils Groundbreaking AI Partnership with PJM and Tapestry to Reinvent the U.S. Power Grid

In a move that underscores the growing intersection between digital infrastructure and energy resilience, Google has announced a major new initiative to modernize the U.S. electric grid using artificial intelligence. The company is partnering with PJM Interconnection—the largest grid operator in North America—and Tapestry, an Alphabet moonshot backed by Google Cloud and DeepMind, to develop AI tools aimed at transforming how new power sources are brought online. The initiative, detailed in a blog post by Alphabet and Google President Ruth Porat, represents one of Google’s most ambitious energy collaborations to date. It seeks to address mounting challenges facing grid operators, particularly the explosive backlog of energy generation projects that await interconnection in a power system unprepared for 21st-century demands. “This is our biggest step yet to use AI for building a stronger, more resilient electricity system,” Porat wrote.

Tapping AI to Tackle an Interconnection Crisis

The timing is critical. The U.S. energy grid is facing a historic inflection point. According to the Lawrence Berkeley National Laboratory, more than 2,600 gigawatts (GW) of generation and storage projects were waiting in interconnection queues at the end of 2023—more than double the total installed capacity of the entire U.S. grid. Meanwhile, the Federal Energy Regulatory Commission (FERC) has revised its five-year demand forecast, now projecting U.S. peak load to rise by 128 GW before 2030—more than triple the previous estimate. Grid operators like PJM are straining to process a surge in interconnection requests, which have skyrocketed from a few dozen to thousands annually. This wave of applications has exposed the limits of legacy systems and planning tools. Enter AI. Tapestry’s role is to develop and deploy AI models that can intelligently manage and streamline the complex process of

Read More »

Podcast: Vaire Computing Bets on Reversible Logic for ‘Near Zero Energy’ AI Data Centers

The AI revolution is charging ahead—but powering it shouldn’t cost us the planet. That tension lies at the heart of Vaire Computing’s bold proposition: rethinking the very logic that underpins silicon to make chips radically more energy efficient. Speaking on the Data Center Frontier Show podcast, Vaire CEO Rodolfo Rossini laid out a compelling case for why the next era of compute won’t just be about scaling transistors—but reinventing the way they work. “Moore’s Law is coming to an end, at least for classical CMOS,” Rossini said. “There are a number of potential architectures out there—quantum and photonics are the most well known. Our bet is that the future will look a lot like existing CMOS, but the logic will look very, very, very different.” That bet is reversible computing—a largely untapped architecture that promises major gains in energy efficiency by recovering energy lost during computation.

A Forgotten Frontier

Unlike conventional chips that discard energy with each logic operation, reversible chips can theoretically recycle that energy. The concept, Rossini explained, isn’t new—but it’s long been overlooked. “The tech is really old. I mean really old,” Rossini said. “The seeds of this technology were actually at the very beginning of the industrial revolution.” Drawing on the work of 19th-century mechanical engineers like Sadi Carnot and later insights from John von Neumann, the theoretical underpinnings of reversible computing stretch back decades. A pivotal 1961 paper formally connected reversibility to energy efficiency in computing. But progress stalled—until now. “Nothing really happened until a team of MIT students built the first chip in the 1990s,” Rossini noted. “But they were trying to build a CPU, which is a world of pain. There’s a reason why I don’t think there’s been a startup trying to build CPUs for a very, very long time.” AI, the

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd).

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »