
Prompt AI’s Seemour launches visual intelligence platform for the home

Prompt AI, a smart home and visual intelligence research and technology company, has launched Seemour, a visual intelligence platform for the home.

Founded by leading scientists and pioneers of computer vision, Seemour is designed to understand, describe, and act on what it sees in real time when connected to a home camera.

Seemour can summarize moments or hours of video footage, learn the names of family, friends, and pets, and even inform you which delivery service is at your door. “Seemour is like if your home could talk, helping you feel safer and more connected to what’s happening around you,” said Tete Xiao, CEO of Seemour, in an interview with GamesBeat.

Xiao started the company with fellow doctoral candidates and a professor from the University of California at Berkeley. (Go Bears). They began thinking about combining home cameras, which proliferated during the pandemic, with the explosion of AI, and founded the company in late 2023.

“We started the company to focus on visual intelligence, with streaming cameras as our first product,” Xiao said. “Seemour works with videos from streaming cameras so users don’t actually have to upload [and edit] these videos. They just need to have cameras already installed in their houses, which a lot of them do. And then whenever these cameras have some things going on, Seemour will actually leverage the video streams and understand this information for them.”

How to use Seemour

Seemour can find patterns in your videos.

You can download Seemour today for free from the App Store or visit Seemour.ai for more information.
Seemour’s visual intelligence can reduce unnecessary camera notifications by as much as 70%. It harnesses the power of large vision-language models and machine learning to summarize multiple on-camera events into a single, easy-to-understand update.

This intelligence also allows Seemour to review hours of footage and understand what’s happening.
In an upcoming update, users will be able to ask Seemour questions, saving time and gaining insight.
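The notification-reduction idea above can be sketched in miniature. This is a minimal, hypothetical illustration (not Seemour’s actual implementation, which relies on vision-language models): raw camera detections that land close together in time are coalesced into a single, easy-to-read update.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class CameraEvent:
    timestamp: datetime
    description: str

def coalesce_events(events, window=timedelta(minutes=5)):
    """Group detections that occur within `window` of the previous one,
    so several raw events yield one human-readable notification."""
    events = sorted(events, key=lambda e: e.timestamp)
    groups, current = [], []
    for e in events:
        if current and e.timestamp - current[-1].timestamp > window:
            groups.append(current)
            current = []
        current.append(e)
    if current:
        groups.append(current)
    # One combined update per burst of activity
    return ["; ".join(e.description for e in g) for g in groups]
```

With this scheme, a doorbell ring followed two minutes later by a package drop-off becomes one update instead of two, while an unrelated event half an hour later gets its own.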

“Imagine a future where you can ask your home what happened today or inform your roommate that you’ve stepped out to go to the grocery store when they open the fridge,” said Xiao. “That future is closer than you think, and we’re excited to bring it to you.”

Seemour is an app that works with feeds from home security cameras, and the company is actively working on bringing more types of cameras into its system to leverage what is already in homes. The first adopters will be those who already use digital home security cameras.

Consumer reaction in early tests has been positive.

“When they first try Seemour, they’re very excited to see how it fundamentally changes their relationship with their cameras,” Xiao said. “You can be proactive, because Seemour delivers a contact notification within tens of seconds, so you get pretty much what’s going on in real time. You don’t even need to watch the videos. You can just get all the information.”

The cameras can range from outdoor-facing devices, like front-door cameras, to indoor petcams or baby cams.

“I can actually put the camera next to my cat’s food bowl. I don’t have the time to watch the videos, but now Seemour can watch the videos for me,” Xiao said. “We can actually tell them what the cat is doing and also piece together what the cat’s day is like.”

If you’re concerned about housemates spying on each other, that’s probably a discussion to have before putting cameras into a home in the first place. As for the privacy of the content, Seemour said the tech is built with privacy and security at its core.

“Any data we store is protected using industry-leading security protocols, ensuring that your information remains private and under your control. We also use industry-leading authentication services and Google Cloud Platform, which is recognized for its industry-leading security and reliability,” the company said. “And we don’t stop there—unlike other platforms that quietly absorb user data, Seemour keeps you in control. Your data stays yours, and we never use it to train models without your explicit permission. Our goal isn’t just to secure your information—but to ensure you own it.”

Key features include:

Seemour device discovery tab.

● Intelligent Video Summaries. Get easy-to-read summaries of significant events captured by your cameras, saving you time and hassle, and never miss a moment.

● Personalized Notifications. Seemour learns and remembers familiar people, allowing you to label them for more personalized notifications.

● Specific Pet Notifications. Teach Seemour the names of all your dogs and/or cats, and Seemour can tell you which pet got into the trash.

● 70% Fewer Alerts. Stay informed without overload. Receive notifications only for the moments that matter, from familiar people and pets to potential intrusions.

● Suspicious Behavior Alerts. Seemour can instantly alert you if it sees unusual or suspicious activity, making your home safer and more secure.

● Delivery Service Announcements. Get instant notifications for who’s at your door, whether it’s FedEx, Amazon, or the mailman.

● Wildlife Detection. Capture and enjoy the beauty of nature with alerts for wildlife like birds, deer, or bears spotted on your property—helping you stay aware and connected to the world outside.

● Seamless Integration. See your camera feeds in one platform, with more integrations coming soon.

● Ask Seemour (coming soon). No need to review hours of footage just to know when something happened. Save time and gain insights by asking Seemour questions like, ‘Where did I leave my keys?’ or, ‘When did my daughter leave home today?’ Seemour reviews your camera footage, offering answers and highlights using naturally spoken language.

● Invisibility (coming soon). Seemour can remove individuals from video footage to protect their privacy.

● Custom Visual Alerts (coming soon). Get alerts when Seemour sees specific things like Fido digging in your backyard, even if it’s never seen that before.
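The “Ask Seemour” feature described above can be sketched as a lookup over a timestamped event log. This is a hypothetical stand-in, not Seemour’s actual system: a naive keyword match plays the role of the vision-language model that would interpret the question and the footage.

```python
from datetime import datetime

def ask_footage(event_log, question):
    """Naive keyword lookup standing in for a VLM-backed footage Q&A.

    event_log: list of (datetime, description) pairs produced by a
    camera pipeline. Returns matching events as short answers, so the
    user never has to scrub through hours of video.
    """
    terms = {w.strip("?.,").lower() for w in question.split()}
    hits = [(ts, desc) for ts, desc in event_log
            if terms & set(desc.lower().split())]
    return [f"{desc} at {ts:%H:%M}" for ts, desc in hits]
```

A question like “When did my daughter leave home today?” would then resolve against the stored event descriptions rather than the raw video.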

Origins

Seemour makes use of the cameras that are already in or outside your home.

Prompt AI is a pioneering visual intelligence research and technology company based in San Francisco that aims to create human-centered, innovative, and useful products for everyone.

It is cofounded by Xiao and Seth Park, both of whom hold Ph.D.s in Computer Science from UC Berkeley, and Trevor Darrell, a founder and co-leader of the Berkeley Artificial Intelligence Research (BAIR) lab, renowned for his significant contributions to the advancement of computer vision and AI research.

Darrell has worked with computer vision for a long time. Xiao has worked in the field for nearly a decade.

“One of the things that I think is really fascinating about this technology is the bigger picture vision that it has for all AI. So now what we’re doing is to sort of understand every single video. But the technology can be much more than just that. We’re working on the feature to allow people to ask questions like, ‘How many times have my candy today?’” Xiao said.

The company is working on how to create a summary of a day’s worth of videos from a petcam or something similar. That’s not available yet. Typical videos will be around 30 seconds or longer. One of the goals is to reduce the number of notifications you get in a day from the cameras.

With the camera market getting commoditized, better-quality cameras with better sensors are now available. Many record in 1080p, some in 2K, and some offer night vision. The AI system can recognize a given person or pet, and you can label that person for future videos. Generative AI becomes useful in classifying events and describing them.

The idea is to find information on your behalf and summarize or organize it for you. If your cat is on a counter, maybe Seemour could play a pre-recorded message from you telling the cat to get off the counter. If a package is delivered by mistake, you could refuse it at the door.

There are big competitors in the space, like Amazon and Apple, which have their own smart hubs for the home. But they haven’t entered this space just yet. Xiao said his team understands computer vision well and is thinking about agentic features for its visual intelligence products in the future.

“Something we strive for is our reasoning capacity,” Xiao said. “So if three people go into a house and two come out, common sense says that there is still one person there. It’s basic reasoning. I think the technology is ready for that.”
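Xiao’s enter/exit example reduces to simple bookkeeping once a camera can classify entries and exits. A minimal sketch, assuming the hypothetical detection labels "enter" and "exit" (the hard part, which Seemour’s vision models would handle, is producing those labels from video):

```python
def people_remaining(events):
    """Count how many people are still inside, given a stream of
    'enter'/'exit' detections from a door-facing camera."""
    count = 0
    for kind in events:
        if kind == "enter":
            count += 1
        elif kind == "exit":
            count = max(0, count - 1)  # never go negative on a missed entry
    return count
```

For Xiao’s scenario, three entries followed by two exits leave `people_remaining` at 1: common sense says one person is still in the house.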

The company has raised $6 million in a seed round led by AIX Ventures and Abstract Ventures, with participation from several renowned angel investors. The team has 11 people.

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

New Relic boosts observability platform with AI intelligence

New Relic announced updates to its Intelligent Observability Platform this week, which the company says will enable a unified view of system relationships and dependencies and intelligently connect technical systems with business context. New Relic’s cloud-based observability platform monitors applications and services in real time to provide insights into software,

Read More »

Charging Forward: UK battery storage projects reach startup, grid delays and more

In this week’s Charging Forward, Gore Street, Eku and BW ESS reach energisation at UK battery energy storage system (BESS) projects, amid warnings over an oversubscribed grid connection queue. This week’s headlines: Root-Power secures planning consent for 40 MW Rotherham BESS Sungrow and BW ESS Bramley Project begins operations Warnings over UK grid connection queue Invinity and Frontier Power partner on UK long duration energy storage projects Fire at Statera BESS site in Essex brought under control Gore Street energises UK Enderby BESS project Eku energises two UK BESS projects International news: China and Saudi Arabia collaborate on 12.5 GWh of energy storage projects and Canadian firm Hydrostor secures $200 million for compressed air energy storage Root-Power consent for 40 MW Rotherham BESS UK energy storage developer Root-Power has secured planning consent for a 40 MW/80 MWh BESS project in Brinsworth, Rotherham. Root-Power said the site will power 80,000 homes for a two-hour period once fully operational, and delivering a biodiversity net gain of 32.76%. The Brinsworth BESS is the fourth planning approval for Root-Power in 2025, following consents at sites in Yorkshire, County Durham and the Scottish Highlands. © Supplied by Root-PowerThe site of Root-Power’s 40 MW/80 MWh Brinsworth BESS project in Rotherham. Root-Power managing director Neil Brooks said the company “carefully selected a near perfect location” for the Brinsworth project. “Managing competing constraints is always difficult when planning a project, so finding a suitable location only 1 mile from the point of connection in an urban area, without causing unacceptable noise or visual impact on sensitive receptors is a real achievement,” he said. “We are happy to see that the planning committee unanimously supported our application, which is a real vote of confidence in our process and team.” Sungrow and BW ESS Bramley BESS starts up Swiss energy storage developer

Read More »

Costain secures multi-million pound Sizewell C contract

UK construction and engineering firm Costain (LON:COST) has secured a multi-million pound contract to support the construction of the Sizewell C nuclear power plant. Costain said under the ten-year framework agreement, the company will provide support in areas such as delivery integration, health and safety and quality control. French state-owned energy firm EDF is developing the 3.2 GW nuclear power station, which could provide up to 7% of UK energy needs over its 60-year lifetime. The UK government holds a 76.1% stake in Sizewell C, with EDF holding the remaining 23.9%. Costain defence and nuclear energy sector director Bob Anstey said the Sizewell C project is a “vital part of creating a sustainable future”. “We have a long and successful track record in delivering for our civil nuclear customers, with a highly qualified and experienced workforce that consistently works to the highest safety and quality standards,” Anstey said. “A key part of our role will be to help ensure the project leaves a positive legacy, and we look forward to working closely with Sizewell C on a range of social value and employment initiatives that improve lives and provide long-term benefits to local communities.” Sizewell C Ltd managing director Nigel Cann said the project will “strengthen energy security and provide clean, reliable electricity for millions”. “We welcome Costain to the Sizewell C supplier family,” Cann said. “We are committed to providing thousands of great jobs and career development opportunities and we’re looking forward to working with our suppliers to boost skills, promote a diverse workforce and spread opportunities as widely as possible.” Sizewell C criticism The Sizewell C project has attracted significant criticism amid concerns over its ballooning costs. Earlier this year, campaign group Together Against Sizewell C (TASC) wrote to the National Audit Office calling for a review of

Read More »

Trump’s National Energy Dominance Council should appoint a ‘Hydropower Czar’

Charles Yang is the founder and executive director of the Center for Industrial Strategy. President Trump forming a National Energy Dominance Council in the White House is a key recognition of the importance of energy to industrial policy. Affordable, abundant electricity is the foundation of America’s economic future, powering everything from advanced manufacturing to data centers. The council offers a rare opportunity to develop a unified federal strategy for energy dominance, linking industrial policy and load growth. However, the White House might quickly find how difficult it is for the federal government to unilaterally bring online new generation to meet load growth. The electricity sector’s web of regional operators, state regulators and investor-owned utilities have far more direct control over generation investments than the federal government. While financing vehicles like the Loan Programs Office or smart grid grants can help incentivize investments in the power sector, there are few levers the U.S. government has available to directly meet load growth. One unique exception to this is hydropower: the majority of the hydropower fleet in the U.S. is federally owned and operated, providing about 80 GW of generating capacity, with pumped storage hydropower being responsible for over 90% of the nation’s energy storage capacity.  The federal government’s ownership of hydropower facilities offers a rare chance to fast-track solutions to load growth. Hydropower is also one of the lowest cost, baseload sources of energy generation, which makes it attractive to manufacturers with large loads, like polysilicon producers and data center developers. For instance, Iron Mountain Data Centers and Rye Development signed a 150-MW power purchase agreement, with Rye Development using the PPA financing to power non-powered dams across the mid-Atlantic. 
The National Energy Dominance Council can leverage the direct federal control over hydropower and dam facilities to help meet load growth through

Read More »

NERC interregional transfer capability study lacks detail to drive transmission upgrades: EIPC

A North American Electric Reliability Corp. study on interregional transfer capability is inadequate for determining how much transmission capacity should be added between regions, according to grid planners across the Eastern Interconnection. NERC’s study — ordered by Congress in 2023 — provides “helpful information” but lacks enough detail to drive actions, the Eastern Interconnection Planning Collaborative said in comments filed with the Federal Energy Regulatory Commission on Monday. In its study filed with FERC in November, NERC found that the U.S. could use an additional 35 GW of transfer capacity between regions to bolster reliability during extreme weather. However, the study fails to adequately consider the costs and benefits of building transmission lines to increase transfer capacity between regions, according to EIPC, which includes ISO New England, the Midcontinent Independent System Operator, the New York ISO, the PJM Interconnection, the Southwest Power Pool and utilities such as Southern Co. “Large nationwide studies, like the [Interregional Transfer Capability Study,] have no way of achieving sufficiently detailed results to effectively weigh the cost/benefit associated with adding transfer capability within or between different regions, or to appropriately assign costs to the true beneficiaries,” the group said.  Transmission planning entities should assess interregional transfer capability needs, according to EIPC. “Those entities have complex models of the system, and they are in the best position to evaluate resource adequacy and transmission security as well as an understanding of enhanced needs due to extreme weather conditions,” the group said. 
Determining how much transfer capability is needed should be informed by how it could improve system reliability, but also the cost of the upgrades, the ability to assign the costs to beneficiaries and the overall cost/benefit ratio compared with other options, such as generation resource additions, demand side management or operational measures, EIPC said. Further, adding transmission capacity

Read More »

Fortress backs behind-the-meter gas turbines to support AI hyperscalers

Florida-based New APR Energy plans to deploy four mobile gas turbines, together capable of providing more than 100 MW of behind-the-meter power, to an unnamed data center hyperscaler, the company announced Tuesday. It expects to make more such deployments as the electricity demands of artificial intelligence begin to accelerate, officials said. Fortress Investment Group said in January it had acquired APR Energy’s assets, comprising 30 mobile gas-powered turbines with a combined capacity of 850 MW. “We are excited to deploy New APR Energy’s first 100 MW to a U.S.-based data center. This deployment is a good proof point for our investment thesis for behind-the-meter power demand,” said Chuck Ferry, New APR Energy chairman and CEO. He also serves as CEO of Duostech. Members of the former APR Energy management team lead New APR Energy through an asset management agreement with Duostech. “We are currently in discussions with many other data center operators and hyperscalers seeking similar support and expect to announce more deployments in the coming weeks,” Ferry said. Data centers and growth in AI have driven electricity demand forecasts rapidly higher over the past year. The Electric Power Research Institute in May predicted data centers could double their consumption, to 9% of the United States’ electricity generation, by 2030. However, improved algorithm and chip efficiency could slow that demand growth, experts say. Microsoft has canceled leases for two data centers, TD Cowen analysts said last week, though the company maintains it will spend more than $80 billion on infrastructure this fiscal year. “Securing power solutions from local utilities has become a challenge for data center expansion,” New APR said in its announcement. The company’s mobile gas turbine fleet “offers a fast and flexible alternative that can accelerate a data center developer’s project timeline and scalability.”

Read More »

EDF Renewables books $980M impairment from Atlantic Shores Offshore Wind

Political uncertainty surrounding offshore wind in the U.S. and the withdrawal of project partner Shell led EDF Renewables to book a $980 million impairment associated with Atlantic Shores Offshore Wind, the company said in a Friday report. Atlantic Shores Offshore Wind was a joint venture between Shell and EDF until Shell exited last month and booked a $1 billion impairment associated with the investment. The venture was developing the 2.8-GW Atlantic Shores 1 and 2 projects offshore New Jersey. On Atlantic Shores’ website, the project responded to Shell’s exit with a Jan. 30 statement that said, “While we can’t comment on the views of shareholders, Atlantic Shores intends to continue progressing New Jersey’s first offshore wind project and our portfolio in compliance with our obligations to local, state and federal partners under existing leases and relevant permits.” However, the New Jersey Board of Public Utilities responded to Shell’s decision by canceling the state’s fourth offshore wind solicitation earlier this month. “A number of reasons led to this decision, notably Shell backing out as an equity partner in the Atlantic Shores project and backing away from the American clean energy market, as well as uncertainty driven by federal actions and permitting,” NJBPU President Christine Guhl-Sadovy said in a Feb. 3 release. “The Board concluded that an award in New Jersey’s fourth offshore wind solicitation, despite the manifold benefits the industry offers to the state, would not be a responsible decision at this time.” After EDF’s writedown of its own stake, the project’s fate is left unclear. The 1.5-GW Atlantic Shores 1 had received a construction and operations plan approval from the Bureau of Ocean Energy Management in October, and completion was slated for 2028. EDF said “unfavourable changes in the political situation in the United States, Shell’s announcement of its withdrawal from

Read More »

3 strategies for carbon-free data centers

Because of the strain that data centers (as well as other electrification sources, such as electric vehicles) are putting on the grid, “the data center industry needs to develop new power supply strategies to support growth plans,” Dietrich said. Here are the underling factors that play into the three strategies outlined by Uptime. Scale creates new opportunities: It’s not just that more data centers are being built, but the data centers under construction are fundamentally different in terms of sheer magnitude. For example, a typical enterprise data center might require between 10 and 25 megawatts of power. Today, the hyperscalers are building data centers in the 250-megawatt range and a large data center campus could require 1,000 megawatts of power. Data centers not only require a reliable source of power, they also require backup power in the form of generators. Dietrich pointed out that if a data center operator builds out enough backup capacity to support 250 megawatts of demand, they’re essentially building a new, on-site power plant. On the one hand, that new power plant requires permitting, it’s costly, and it requires highly training staffers to operate. On the other hand, it provides an opportunity. Instead of letting this asset sit around unused except in an emergency, organizations can leverage these power plants to generate energy that can be sold back to the grid. Dietrich described this arrangement as a win-win that enables the data center to generate revenue, and it helps the utility to gain a new source of power. Realistic expectations: Alternative energy sources like wind and solar, which are dependent on environmental factors, can’t technically or economically supply 100% of data center power, but they can provide a significant percentage of it. Organizations need to temper their expectations, Dietrich said.

Read More »

Questions arise about reasons why Microsoft has cancelled data center lease plans

This, the company said, “allows us to invest and allocate resources to growth areas for our future. Our plans to spend over $80 billion on infrastructure this fiscal year remains on track as we continue to grow at a record pace to meet customer demand.” When asked for his reaction to the findings, John Annand, infrastructure and operations research practice lead at Info-Tech Research Group, pointed to a blog released last month by Microsoft president Brad Smith, and said he thinks the company “is hedging its bets. It reaffirms the $80 billion AI investment guidance in 2025, $40 billion in the US. Why lease when you can build/buy your own?” Over the past four years, he said, Microsoft “has been leasing more data centers than owning. Perhaps they are using the fact that the lessors are behind schedule on providing facilities or the power upgrades required to bring that ratio back into balance. The limiting factor for data centers has always been the availability of power, and this has only become more true with power-hungry AI workloads.” The company, said Annand, “has made very public statements about owning nuclear power plants to help address this demand. If third-party data center operators are finding it tough to provide Microsoft with the power they need, it would make sense that Microsoft vertically integrate its supply chain; so, cancel leases or statements of qualification in favor of investing in the building of their own capacity.” However, Gartner analyst Tony Harvey said of the report, “so much of this is still speculation.” Microsoft, he added, “has not stated as yet that they are reducing their capex spend, and there are reports that Microsoft have strongly refuted that they are making changes to their data center strategy.” The company, he said, “like any other hyperscaler,

Read More »

Quantum Computing Advancements Leap Forward In Evolving Data Center and AI Landscape

Overcoming the Barriers to Quantum Adoption Despite the promise of quantum computing, widespread deployment faces multiple hurdles: High Capital Costs: Quantum computing infrastructure requires substantial investment, with uncertain return-on-investment models. The partnership will explore cost-sharing strategies to mitigate risk. Undefined Revenue Models: Business frameworks for quantum services, including pricing structures and access models, remain in development. Hardware Limitations: Current quantum processors still struggle with error rates and scalability, requiring advancements in error correction and hybrid computing approaches. Software Maturity: Effective algorithms for leveraging quantum computing’s advantages remain an active area of research, particularly in real-world AI and optimization problems. SoftBank’s strategy includes leveraging its extensive telecom infrastructure and AI expertise to create real-world testing environments for quantum applications. By integrating quantum into existing data center operations, SoftBank aims to position itself at the forefront of the quantum-AI revolution. A Broader Play in Advanced Computing SoftBank’s quantum initiative follows a series of high-profile moves into the next generation of computing infrastructure. The company has been investing heavily in AI data centers, aligning with its “Beyond Carrier” strategy that expands its focus beyond telecommunications. Recent efforts include the development of large-scale AI models tailored to Japan and the enhancement of radio access networks (AI-RAN) through AI-driven optimizations. Internationally, SoftBank has explored data center expansion opportunities beyond Japan, as part of its efforts to support AI, cloud computing, and now quantum applications. The company’s long-term vision suggests that quantum data centers could eventually play a role in supporting AI-driven workloads at scale, offering performance benefits that classical supercomputers cannot achieve. 
The Road Ahead SoftBank and Quantinuum’s collaboration signals growing momentum for quantum computing in enterprise settings. While quantum remains a long-term bet, integrating QPUs into data center infrastructure represents a forward-looking approach that could redefine high-performance computing in the years to come. With

Read More »

STACK Infrastructure Pushes Aggressive Data Center Expansion and Sustainability Strategy Into 2025

Global data center developer and operator STACK Infrastructure is providing a growing range of digital infrastructure solutions for hyperscalers, cloud service providers, and enterprise clients. Like almost all of the cutting-edge developers in the industry, Stack is maintaining the focus on scalability, reliability, and sustainability while delivering a full range of solutions, including build-to-suit, colocation, and powered shell facilities, with continued development in key global markets. Headquartered in the United States, the company has expanded its presence across North America, Europe, and Asia-Pacific, catering to the increasing demand for high-performance computing, artificial intelligence (AI), and cloud-based workloads. The company is known for its commitment to sustainable growth, leveraging green financing initiatives, energy-efficient designs, and renewable power sources to minimize its environmental impact. Through rapid expansion in technology hubs like Silicon Valley, Northern Virginia, Malaysia, and Loudoun County, the company continues to develop industry benchmarks for innovation and infrastructure resilience. With a customer-centric approach and a robust development pipeline, STACK Infrastructure is shaping the future of digital connectivity and data management in an era of accelerating digital transformation. Significant Developments Across 23 Major Data Center Markets Early in 2024, Stack broke ground on the expansion of their existing 100 MW campus in San Jose, servicing the power constrained Silicon Valley. Stack worked with the city of San Jose to add a 60 MW expansion to their SVY01 data center. 
While possibly the highest profile of Stack’s developments, due to its location, at that point in time the company had announced significant developments across 23 major data center markets, including:       Stack’s 48 MW Santa Clara data center, featuring immediately available shell space powered by an onsite substation with rare, contracted capacity. Stack’s 56 MW Toronto campus, spanning 19 acres, includes an existing 8 MW data center and 48 MW expansion capacity,

Read More »

Meta Update: Opens Mesa, Arizona Data Center; Unveils Major Subsea Cable Initiative; Forges Oklahoma Wind Farm PPA; More

Meta’s Project Waterworth: Building the Global Backbone for AI-Powered Digital Infrastructure

Also very recently, Meta unveiled its most ambitious subsea cable initiative yet: Project Waterworth. Aimed at revolutionizing global digital connectivity, the project will span over 50,000 kilometers—surpassing the Earth’s circumference—and connect five major continents. When completed, it will be the world’s longest subsea cable system, featuring the highest-capacity technology available today.

A Strategic Expansion to Key Global Markets

As announced on Feb. 14, Project Waterworth is designed to enhance connectivity across critical regions, including the United States, India, Brazil, and South Africa. These regions are increasingly pivotal to global digital growth, and the new subsea infrastructure will fuel economic cooperation, promote digital inclusion, and unlock opportunities for technological advancement. In India, for instance, where rapid digital infrastructure growth is already underway, the project will accelerate progress and support the country’s ambitions for an expanded digital economy. This enhanced connectivity will foster regional integration and bolster the foundation for next-generation applications, including AI-driven services.

Strengthening Global Digital Highways

Subsea cables are the unsung heroes of global digital infrastructure, facilitating over 95% of intercontinental data traffic. With a multi-billion-dollar investment, Meta aims to open three new oceanic corridors that will deliver the high-speed, high-capacity bandwidth needed to fuel innovations like artificial intelligence. Meta’s experience in subsea infrastructure is extensive. Over the past decade, the company has collaborated with various partners to develop more than 20 subsea cables, including systems boasting up to 24 fiber pairs—far exceeding the typical 8 to 16 fiber pairs found in most new deployments. This technological edge ensures scalability and reliability, essential for handling the world’s ever-increasing data demands.

Engineering Innovations for Resilience and Capacity

Project Waterworth isn’t just about scale—it’s about resilience and cutting-edge engineering. The system will be the longest 24-fiber-pair subsea cable ever built, enhancing

Read More »

Do data centers threaten the water supply?

In a new report, the Royal Academy of Engineering called upon the government to ensure tech companies accurately report how much energy and water their data centers are using, and to reduce the use of drinking water for cooling. Without such action, warns one of the report’s authors, Professor Tom Rodden, “we face a real risk that our development, deployment and use of AI could do irreparable damage to the environment.” The situation is a little different in the US, as the country has large bodies of water offering a water supply that the UK simply does not have. It’s no accident that there are many data centers around the Chicago area: they have the Great Lakes to draw upon. Likewise, the Columbia and Klamath Rivers have become magnets for data centers, offering both water supply and hydroelectric power. Other than the Thames River, the UK doesn’t have these massive bodies of water. Still, the problem is not unique to the UK, says Alan Howard, senior analyst with Omdia. He notes that Microsoft took heat last year because its new AI-oriented data center was draining the water supply of Goodyear, a small Arizona town. The city of Chandler, Arizona passed an ordinance in 2015 restricting new water-intensive businesses from setting up shop, which slowed data center development. “I believe some data center operators just bowed out,” said Howard.

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one ramping up its investments in AI-enabled data centers. Rival cloud service providers are all investing in upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are far higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas, and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.) John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More 2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as the frontier LLMs themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »