Crypto billionaire Brian Armstrong is ready to invest in CRISPR baby tech


Brian Armstrong, the billionaire CEO of the cryptocurrency exchange Coinbase, says he’s ready to fund a US startup focused on gene-editing human embryos. If he goes forward, it would be the first major commercial investment in one of medicine’s most fraught ideas.

In a post on X on June 2, Armstrong announced he was looking for gene-editing scientists and bioinformatics specialists to form a founding team for an “embryo editing” effort targeting an unmet medical need, such as a genetic disease.

“I think the time is right for the defining company in the US to be built in this area,” Armstrong posted. 

The announcement from a deep-pocketed backer is a striking shift for a field considered taboo following the 2018 birth of the world’s first genetically edited children in China—a secretive experiment that led to international outrage and prison time for the lead scientist.

According to Dieter Egli, a gene-editing scientist at Columbia University whose team has briefed Armstrong, the billionaire’s plans may be motivated in part by recent improvements in editing technology that have opened up a safer, more precise way to change the DNA of embryos.

That technique, called base editing, can deftly change a single DNA letter. Earlier methods, on the other hand, actually cut the double helix, damaging it and causing whole genes to disappear. “We know much better now what to do,” says Egli. “It doesn’t mean the work is all done, but it’s a very different game now—entirely different.”  

Shoestring budget

Embryo editing, which ultimately aims to produce humans with genes tailored by design, is an idea that has been heavily stigmatized and starved of funding. While it’s legal to study embryos in the lab, actually producing a gene-edited baby is flatly illegal in most countries.

In the US, the CRISPR baby ban operates via a law that forbids the Food and Drug Administration from considering, or even acknowledging, any application it gets to attempt a gene-edited baby. But that rule could be changed, especially if scientists can demonstrate a compelling use of the technique—or perhaps if a billionaire lobbies for it.

In his post, Armstrong included an image of a seven-year-old Pew Research Center poll showing that Americans strongly favored altering a baby’s genes if it could treat disease, although the same poll found most opposed experimentation on embryos.

Up until this point, no US company has openly pursued embryo editing, and the federal government doesn’t fund studies on embryos at all. Instead, research on gene editing in embryos has been carried forward in the US by just two academic centers: Egli’s and one at Oregon Health & Science University.

Those efforts have operated on a shoestring, held together by private grants and university funds. Researchers at those centers said they support the idea of a well-financed company that could advance the technology. “We would honestly welcome that,” says Paula Amato, a fertility doctor at Oregon Health & Science University and the past president of the American Society for Reproductive Medicine. 

“More research is needed, and that takes people and money,” she says, adding that she doesn’t mind if it comes from “tech bros.”

Editing embryos can, in theory, be used to correct genetic errors likely to cause serious childhood conditions. But since in most cases genetic testing of embryos can also be used to avoid those errors, many argue it will be hard to find a true unmet need where the DNA-altering technique is actually necessary.

Instead, it’s easy to conclude that the bigger market for the technology would be to intervene in embryos in ways that could make humans resistant to common conditions, such as heart disease or Alzheimer’s. But that is more controversial because it’s a type of enhancement, and the changes would also be passed through the generations.

Only last week, several biotech trade and academic groups demanded a 10-year moratorium on heritable human genome editing, saying the technology has few real medical uses and “introduces long-term risks with unknown consequences.”

They said the ability to “program” desired traits or eliminate bad ones risked a new form of “eugenics,” one that would have the effect of “potentially altering the course of evolution.”

No limits

Armstrong did not reply to an email from MIT Technology Review seeking comment about his plans. Nor did his company Coinbase, a cryptocurrency trading platform that went public in 2021 and is the source of his fortune, estimated at $10 billion by Forbes.

The billionaire is already part of a wave of tech entrepreneurs who’ve made a splash in science and biology by laying down outsize investments, sometimes in far-out ideas. Armstrong previously cofounded New Limit, which Bloomberg calls a “life extension venture” and which this year raised a further $130 million to explore methods to reprogram old cells into an embryonic-like state.

He started that company with Blake Byers, an investor who has said a significant portion of global GDP should be spent on “immortality” research, including biotech approaches and ways of uploading human minds to computers.

Then, starting late last year, Armstrong began publicly telegraphing his interest in exploring a new venture, this time connected to assisted reproduction. In December, he announced on X that he and Byers were ready to meet with entrepreneurs working on “artificial wombs,” “embryo editing,” and “next-gen IVF.”

The post invited people to apply to attend an off-the-record dinner—a kind of forbidden-technologies soiree. Applicants had to fill in a Google form answering a few questions, including “What is something awesome you’ve built?”

Among those who attended the dinner was a postdoctoral fellow from Egli’s lab, Stepan Jerabek, who had led the base-editing research project. Another attendee, Lucas Harrington, is a gene-editing scientist who trained at the University of California, Berkeley under Jennifer Doudna, a winner of the Nobel Prize in chemistry for development of CRISPR gene editing. Harrington says a venture group he helps run, called SciFounders, is also considering starting an embryo-editing company.

“We share an interest in there being a company to empirically evaluate whether embryo editing can be done safely, and are actively exploring incubating a company to undertake this,” Harrington said in an email. “We believe there need to be legitimate scientists and clinicians working to safely evaluate this technology.”

Because of how rapidly gene editing is advancing, Harrington has also criticized bans and moratoria on the technology. These can’t stop it from being applied but, he says, can drive it into “the shadows,” where it might be used less safely. According to Harrington, “several biohacker groups have quietly raised small amounts of capital” to pursue the technology.

By contrast, Armstrong’s public declaration on X represents a more transparent approach. “It seems pretty serious now. They want to put something together,” says Egli, who hopes the Coinbase CEO might fund some research at his lab. “I think it’s very good he posted publicly, because you can feel the temperature, see what reaction you get, and you stimulate the public conversation.”

Editing error

The first reports that researchers were testing CRISPR on human embryos in the lab emerged from China in 2015, causing shock waves as it became clear how easy, in theory, it was to change human heredity. Two years later, in 2017, a report from Oregon claimed successful correction of a dangerous DNA mutation present in lab embryos made from patients’ egg and sperm cells.

But that breakthrough was not what it seemed. More careful testing by Egli’s lab and the Oregon team showed that CRISPR technology actually caused havoc in a cell, often deleting large chunks of chromosomes. That’s in addition to mosaicism, in which edits occur differently in different cells. What looked at first like precise DNA editing was in fact a dangerous process causing unseen damage.

While the public debate turned on the ethics of CRISPR babies—especially after three edited children were born in China—researchers were discussing basic scientific problems and how to solve them.

Since then, both US labs, as well as some in China, have switched to base editing. That method, in theory, could also endow an embryo with a number of advantageous gene variants, not just one change.

Company job

Some researchers also feel certain that editing an embryo is simpler than trying to treat sick adults. The only approved gene-editing treatment, for sickle-cell disease, costs more than $2 million. By contrast, editing an embryo could be incredibly cheap, and if it’s done early, when an embryo is forming, all the body cells could carry the change.

“You fix the text before you print the book,” says Egli. “It seems like a no-brainer.”

Still, gene editing isn’t quite ready for prime time in making babies. Getting there requires more work, including careful design of the editing system (which includes a protein and short guide molecule) and systematic ways to check embryos for unwanted DNA changes. That is the type of industrial effort Armstrong’s company, if he funds one, would be suited to carry out.

“You would have to optimize something to a point where it is perfect, to where it’s a breeze,” says Egli. “This is the kind of work that companies do.”

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

Nvidia aims to bring AI to wireless

Key features of ARC-Compact include: Energy Efficiency: Utilizing the L4 GPU (72-watt power footprint) and an energy-efficient ARM CPU, ARC-Compact aims for a total system power comparable to custom baseband unit (BBU) solutions currently in use. 5G vRAN support: It fully supports 5G TDD, FDD, massive MIMO, and all O-RAN

Read More »

Netgear’s enterprise ambitions grow with SASE acquisition

Addressing the SME security gap The acquisition directly addresses a portfolio gap that Netgear (Nasdaq:NTGR) has identified through customer feedback.  According to Badjate, customers have been saying that they like the Netgear products, but they also really need more security capabilities. Netgear’s target market focuses on organizations with fewer than

Read More »

IBM’s cloud crisis deepens: 54 services disrupted in latest outage

Rawat said IBM’s incident response appears slow and ineffective, hinting at procedural or resource limitations. The situation also raises concerns about IBM Cloud’s adherence to zero trust principles, its automation in threat response, and the overall enforcement of security controls. “The recent IBM Cloud outages are part of a broader

Read More »

AMD acquires Brium to loosen Nvidia’s grip on AI software

According to Greyhound Research, nearly 67 percent of global CIOs identify software maturity, particularly in middleware and runtime optimization, as the primary barrier to adopting alternatives to Nvidia. Brium’s compiler-based approach to AI inference could ease this dependency. While Nvidia still leads among developers, AMD’s expanding open-source stack, now backed

Read More »

Oil Rises as Solid USA Jobs Data Pushes Algos to Drop Short Bets

Oil rose as stronger-than-expected US jobs data eased concerns about an economic slowdown that would crimp demand, spurring algorithmic traders to reduce short positions.  West Texas Intermediate climbed almost 2% to settle above $64 a barrel, notching the largest weekly gain since November. Crude followed equities higher after US job growth in May narrowly surpassed economist forecasts, allaying concerns of near-term demand deterioration. The figures also pushed economy-sensitive diesel futures to a two-week high.  “Trading is relatively quiet today, with macroeconomic factors continuing to drive the narrative,” said Rebecca Babin, a senior energy trader at CIBC Private Wealth Group. “The unemployment data is easing concerns that demand will sharply decline due to tariff uncertainty.” The positive economic data spurred commodity trading advisers to ease off of their bearish tilt. The funds, which can accelerate price momentum, liquidated short positions to sit at negative 9% short in WTI on Friday, compared with 64% short on June 5, according to data from Bridgeton Research Group. The rally was supported by enduring risk-on sentiment from optimistic signs on trade talks between the US and China, the world’s largest importer of crude. President Donald Trump and his Chinese counterpart, Xi Jinping, agreed to further negotiations over tariffs and supplies of rare earth minerals. The positive signals come against the backdrop of an oil market that has been increasingly rangebound in recent weeks. Prices have traded in a $5 band since the middle of May, and a gauge of volatility for US crude futures is at the lowest since early April.  Oil has been buffeted in Trump’s second term as trade tensions between the world’s two largest economies menace demand. At the same time, the OPEC+ alliance has been adding barrels back to the market at a faster-than-expected rate, further clouding an already weak outlook for the second half

Read More »

USA Targets Niche Gas That China Can’t Replace as Trade War Chip

The US is using its dominance of a niche petroleum gas as a bargaining chip in its trade war with China. America supplies China with almost all of its ethane, a product of the shale boom that’s used as a building block for making plastics. But the commerce department is now ordering shippers to apply for export licenses, and has told at least one, Enterprise Products Partners LP, that it intends to withhold permits for three China-bound cargoes. The trade war is throwing a spotlight on how the US and China rely on each other for certain commodities — dependencies that both nations are seeking to leverage as they negotiate terms to resolve their dispute. In this case, America is the world’s biggest producer of ethane, which is converted into ethylene for plastics factories, and China is its largest customer. The commerce department has cited risks that petroleum products like ethane could be diverted to the military, copying the playbook deployed by Beijing in justifying restrictions on what it calls dual-use items such as rare earths and other critical minerals. “Ethane is no longer just a byproduct of shale — it’s now a geopolitical weapon,” said Julian Renton, lead analyst covering natural gas liquids at East Daley Analytics. “China bet billions building infrastructure around US ethane, and Washington is now questioning whether that bet should continue to pay off.” America’s shale revolution and China’s rapid industrialization have coincided this century to create a market where cheap energy byproducts are parlayed into millions of tons of materials used as trash bags and shampoo bottles, car seats and computer keyboards. But companies that prospered from cooperation are now caught in the crossfire of an increasingly antagonistic trade relationship between Washington and Beijing. Chinese firms such as Satellite Chemical Co. operate giant petrochemical plants that

Read More »

Alaska LNG Attracts Potential Partners in Asia, EU

The first round of partner selection for the planned Alaska LNG project attracted over 50 companies, lead developer Glenfarne Group LLC has said. The potential partners are from the United States, the European Union, Japan, Korea, Taiwan, Thailand and India. The companies “expressed interest for over $115 billion of contract value for various partnerships with the Project, including equipment and material supply, services, investment, and customer agreements”, Glenfarne said in an online statement. Alaska LNG, approved by the Federal Energy Regulatory Commission May 2020, will deliver natural gas from the state’s North Slope to both domestic and global markets. It is the only federally permitted liquefied natural gas (LNG) project on the United States Pacific Coast, according to co-developer Alaska Gasline Development Corp. (AGDC). “Alaska LNG’s economic fundamentals allow it to deliver LNG into Asia at prices that are lower than Henry Hub pricing from the U.S. Gulf Coast”, the statement said. Alaska LNG is planned to have an LNG export terminal with a capacity of 20 million metric tons per annum (MMtpa), an 807-mile 42-inch pipeline and a carbon capture plant with a storage capacity of seven MMtpa. Phase 1 aims to deliver gas about 765 miles from the North Slope to the Anchorage region. Phase 2 would install compression equipment and around 42 miles of pipeline under Cook Inlet to the Alaska LNG Export Facility in Nikiski, which would be constructed at the same time. “Glenfarne anticipates a final investment decision on the domestic portion of the Alaska LNG pipeline in late Q4 2025”, the statement said. Brendan Duval, chief executive and founder of New York City-based energy investor Glenfarne, commented, “The many expressions of interest received reinforce that the market recognizes Alaska LNG’s advantaged economics, fully permitted status, and powerful federal, state, and local support”. The project’s

Read More »

NERC overstates MISO reliability risks: market monitor

The North American Electric Reliability Corp. is overstating the reliability risks faced by the Midcontinent Independent System Operator, according to David Patton, president of Potomac Economics, the grid operator’s market monitor. In its 2024 Long-Term Reliability Assessment released in December, NERC said MISO was at a high risk of having a shortfall in electricity supplies at the peak of an average summer or winter season in the next five years — the worst ranking of all North American regions. “I’d love to work with NERC to figure out where they got their numbers from, because I don’t think they’re accurate,” Patton said Thursday during a technical conference on resource adequacy challenges in the United States held by the Federal Energy Regulatory Commission. NERC understates MISO’s capacity for demand response, behind-the-meter generation and firm capacity imports by more than 8 GW, Patton said in written testimony to FERC. Also, NERC considered possible power plant retirements that have not occurred, according to Patton. Potomac Economics also acts as the market monitor for the Electric Reliability Council of Texas, ISO New England and the New York Independent System Operator. “MISO is the most reliable of any of them,” Patton said. “If I was concerned about the lights going out somewhere, it would not be MISO.” In recent winter storms, MISO exported power to neighboring grid operators to help meet their needs, Patton said, noting the exports reflected the value interconnections between grid operators can have. Further, MISO has been vastly overestimating the power supplies it will need in coming years to meet demand for electricity, Patton said. In its 2024 Regional Resource Assessment, MISO said its footprint may need 17 GW of new resources every year for the next 20 years. “That’s a result of a clearly flawed planning process,” Patton told FERC. In

Read More »

Transferability is transforming clean energy project finance, say dealmakers

Dive Brief: The tax credit transferability provision included in the Inflation Reduction Act has introduced new deal structures and is allowing clean energy developers to secure project financing faster, said speakers at a Thursday panel at the American Council on Renewable Energy’s Finance Forum. “The closing of transactions has become so much easier,” said Gaurav Raniwala, global renewable energy leader at GE Vernova. “You don’t have to line up two different structures simultaneously and then close everything when there’s already enough mess going on. And the type of players that are now able to enter the market has broadened significantly.” A Wednesday report from Crux, a finance technology company that connects tax credit buyers and sellers, said lenders are “increasingly” looking to finance less established technologies like carbon capture, and that “this openness is supported by the robust and progressively more liquid market for transferable tax credits.” Dive Insight: Raniwala said that financing had previously relied on the tax equity market, which “was limited in capacity. The industry wanted to be bigger.” “If you really want to have a dominant energy industry which has abundance of supply to help with electrification, to help with all the AI stuff, we need all sources of energy out there,” he said. “And I think what transferability did was it broadened the market from just traditional tax equity to a whole host of players.” Crux’s analysis said that “tax equity structures have evolved to hybrid structures, or t-flips, which explicitly contemplate the sale of a portion of tax credits in the transfer market” and found that t-flips “made up about 60% of the tax equity committed in 2024, and that share is expected to rise.” “Historically, the tax equity market was about a $20 billion a year market dominated for many years by a

Read More »

Vast array of solar power equipment left exposed online

Dive Brief: Nearly 35,000 solar power devices are remotely manageable and openly accessible to anyone from anywhere in the world, according to a new report from industrial cybersecurity firm Forescout. These exposed devices with internet-accessible management interfaces, which are made by 42 different companies, include equipment that is essential for operating solar energy infrastructure, according to the Tuesday report. Some of the management interfaces may include password protections, but Forescout said that virtually none of them needed to be online and that any exceptions should be placed behind VPNs. The 10 vendors with the greatest number of exposed devices have each disclosed vulnerabilities in the past decade, increasing the risk of their sitting exposed on the public internet.  Dive Insight: The transition to renewable energy sources and the increasing digitization of the power grid have combined to create serious cybersecurity risks. Forescout’s latest findings illustrate how the absence of secure design practices in critical infrastructure devices also can endanger people’s lives and present opportunities to destabilize entire regions. Forescout’s report — based on a scan of public IP addresses using the Shodan search engine — contains details about the distribution of solar equipment with internet-accessible management interfaces. For example, these devices are more prevalent in Europe and Asia than elsewhere, with three-quarters of the devices residing in Europe and 17% in Asia. Germany and Greece each have one-fifth of the total number of exposed devices. In addition, the 10 vendors with the most exposed devices were not the same as the 10 vendors with the biggest market shares; global titan Huawei, for example, is not on Forescout’s list. 
SMA’s Sunny WebBox, a device that collects and reports information about the performance of solar inverters, was the most commonly observed piece of equipment left remotely accessible, followed by Fronius International inverters.
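The report's headline percentages translate into rough device counts. A quick sketch, using only the figures quoted in the article (the "Rest of world" remainder is an assumption):

```python
# Rough regional device counts implied by Forescout's percentages.
# Figures from the article: ~35,000 exposed devices, three-quarters in
# Europe, 17% in Asia; Germany and Greece each hold about one-fifth globally.
TOTAL_EXPOSED = 35_000

regional_share = {
    "Europe": 0.75,
    "Asia": 0.17,
    "Rest of world": 0.08,  # assumed remainder
}

counts = {region: round(TOTAL_EXPOSED * s) for region, s in regional_share.items()}
counts_country = {c: round(TOTAL_EXPOSED * 0.20) for c in ("Germany", "Greece")}

for region, n in counts.items():
    print(f"{region}: ~{n:,} devices")
```

On these numbers, Europe alone accounts for roughly 26,000 exposed devices, with about 7,000 each in Germany and Greece.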

Read More »

LiquidStack launches cooling system for high density, high-powered data centers

The CDU is serviceable from the front of the unit, with no rear or end access required, allowing the system to be placed against the wall. The skid-mounted system can come with rail and overhead piping pre-installed or shipped as separate cabinets for on-site assembly. The single-phase system has high-efficiency dual pumps designed to protect critical components from leaks, and a centralized design with separate pump and control modules reduces both the number of components and complexity. “AI will keep pushing thermal output to new extremes, and data centers need cooling systems that can be easily deployed, managed, and scaled to match heat rejection demands as they rise,” said Joe Capes, CEO of LiquidStack, in a statement. “With up to 10MW of cooling capacity at N, N+1, or N+2, the GigaModular is a platform like no other—we designed it to be the only CDU our customers will ever need. It future-proofs design selections for direct-to-chip liquid cooling without traditional limits or boundaries.”
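The N, N+1, and N+2 levels Capes mentions describe how many spare units back up the load. A minimal sketch of the arithmetic, assuming a hypothetical per-module capacity (the article doesn't give one):

```python
import math

def modules_required(load_mw: float, module_capacity_mw: float, spares: int) -> int:
    """Modules needed under N+k redundancy: enough to carry the load (N),
    plus k spares that can take over if a unit fails."""
    n = math.ceil(load_mw / module_capacity_mw)
    return n + spares

# Hypothetical 2.5 MW modules serving a 10 MW load:
modules_required(10, 2.5, 0)  # N   -> 4 modules
modules_required(10, 2.5, 1)  # N+1 -> 5 modules
modules_required(10, 2.5, 2)  # N+2 -> 6 modules
```

The trade-off is the usual one: each added spare buys fault tolerance at the cost of idle capacity.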

Read More »

Enterprises face data center power design challenges

“Now, with AI, GPUs need data to do a lot of compute and send that back to another GPU. That connection needs to be close together, and that is what’s pushing the density; the chips are more powerful and so on, but the necessity of everything being close together is what’s driving this big revolution,” he said. That revolution is producing new data center designs. Cordovil said that instead of putting the power shelves within the rack, system administrators are putting a sidecar next to those racks and loading the sidecar with the power system, which serves two to four racks. This allows for more compute per rack and lower latency, since the data doesn’t have to travel as far. The problem is that 1 MW racks are uncharted territory, and no one knows how to manage that much power. “There’s no user manual that says, hey, just follow this and everything’s going to be all right. You really need to push the boundaries of understanding how to work. You need to start designing something somehow, so that is a challenge to data center designers,” he said. And this brings up another issue: many corporate data centers have power plugs that are, more or less, like the ones you have at home, so technicians didn’t need an advanced electrician certification. “We’re not playing with that power anymore. You need to be very aware of how to connect something. Some of the technicians are going to need to be certified electricians, which is a skills gap in the market that we see in most markets out there,” said Cordovil. A CompTIA A+ certification will teach you the basics of power, but not the advanced skills needed for these increasingly dense racks. Cordovil
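To see why household-style plugs are out of the question at this density, a quick calculation of the line current a 1 MW rack draws on a three-phase feed (the 415 V figure is an illustrative distribution voltage, not from the article):

```python
import math

def three_phase_current_amps(power_w: float, line_voltage_v: float,
                             power_factor: float = 1.0) -> float:
    """Line current for a balanced three-phase load: I = P / (sqrt(3) * V * pf)."""
    return power_w / (math.sqrt(3) * line_voltage_v * power_factor)

# A 1 MW rack on a 415 V three-phase feed draws roughly 1,400 A of line
# current - three orders of magnitude beyond a typical household circuit.
amps = three_phase_current_amps(1_000_000, 415)
```

Currents in that range mean busbars, engineered terminations, and certified electricians rather than plug-in power strips.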

Read More »

HPE Nonstop servers target data center, high-throughput applications

HPE has bumped up the size and speed of its fault-tolerant Nonstop Compute servers. There are two new servers – the 8TB, Intel Xeon-based Nonstop Compute NS9 X5 and Nonstop Compute NS5 X5 – aimed at enterprise customers looking to upgrade their transaction processing network infrastructure or support larger application workloads. Like other HPE Nonstop systems, the two new boxes include compute, software, storage, networking and database resources as well as full-system clustering and HPE’s specialized Nonstop operating system. The flagship NS9 X5 features support for dual-fabric HDR200 InfiniBand interconnect, which effectively doubles the interconnect bandwidth between it and other servers compared to the current NS8 X4, according to an HPE blog detailing the new servers. It supports up to 270 networking ports per NS9 X system, can be clustered with up to 16 other NS9 X5s, and can support 25 GbE network connectivity for modern data center integration and high-throughput applications, according to HPE.
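For context on the doubled interconnect, a back-of-the-envelope sketch: HDR InfiniBand signals at 200 Gb/s per link, so a second, independent fabric doubles the aggregate link bandwidth (per-link counts for the NS9 X5 aren't given here):

```python
# HDR InfiniBand runs at 200 Gb/s per link; a dual-fabric interconnect
# doubles the aggregate server-to-server link bandwidth.
HDR_LINK_GBPS = 200
single_fabric_gbps = HDR_LINK_GBPS        # 200 Gb/s
dual_fabric_gbps = 2 * HDR_LINK_GBPS      # 400 Gb/s
dual_fabric_gB_per_s = dual_fabric_gbps / 8  # ~50 GB/s, ignoring protocol overhead
```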

Read More »

AI boom exposes infrastructure gaps: APAC’s data center demand to outstrip supply by 42%

“Investor confidence in data centres is expected to strengthen over the remainder of the decade,” the report said. “Strong demand and solid underlying fundamentals fuelled by AI and cloud services growth will provide a robust foundation for investors to build scale.”

Enterprise strategies must evolve

With supply constrained and prices rising, CBRE recommended that enterprises rethink data center procurement models. Waiting for optimal sites or price points is no longer viable in many markets. Instead, enterprises should pursue early partnerships with operators that have robust development pipelines and focus on securing power-ready land. Build-to-suit models are becoming more relevant, especially for larger capacity requirements. Smaller enterprise facilities — those under 5MW — may face sustainability challenges in the long term. The report suggested that these could become “less relevant” as companies increasingly turn to specialized colocation and hyperscale providers. Still, traditional workloads will continue to represent up to 50% of total demand through 2030, preserving value in existing facilities for non-AI use cases, the report added. The region’s projected 15 to 25 GW gap is more than a temporary shortage — it signals a structural shift, CBRE said. Enterprises that act early to secure infrastructure, invest in emerging markets, and align with power availability will be best positioned to meet digital transformation goals. “Those that wait may find themselves locked out of the digital infrastructure they need to compete,” the report added.

Read More »

Cisco bolsters DNS security package

The software can block domains associated with phishing, malware, botnets, and other high-risk categories such as cryptomining or new domains that haven’t been reported previously. It can also create custom block and allow lists and offers the ability to pinpoint compromised systems using real-time security activity reports, Brunetto wrote. According to Cisco, many organizations leave DNS resolution to their ISP. “But the growth of direct enterprise internet connections and remote work make DNS optimization for threat defense, privacy, compliance, and performance ever more important,” Cisco stated. “Along with core security hygiene, like a patching program, strong DNS-layer security is the leading cost-effective way to improve security posture. It blocks threats before they even reach your firewall, dramatically reducing the alert pressure your security team manages.” “Unlike other Secure Service Edge (SSE) solutions that have added basic DNS security in a ‘checkbox’ attempt to meet market demand, Cisco Secure Access – DNS Defense embeds strong security into its global network of 50+ DNS data centers,” Brunetto wrote. “Among all SSE solutions, only Cisco’s features a recursive DNS architecture that ensures low-latency, fast DNS resolution, and seamless failover.”
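The custom block and allow lists Cisco describes boil down, at their simplest, to suffix matching on domain labels so that blocking a zone also covers its subdomains. A toy sketch of how such a check typically works (not Cisco's implementation; the domain names are hypothetical):

```python
def is_blocked(domain: str, blocklist: set, allowlist: set) -> bool:
    """Check the domain and each parent zone, so blocking a zone covers its
    subdomains; explicit allow entries win over block entries."""
    labels = domain.lower().rstrip(".").split(".")
    candidates = [".".join(labels[i:]) for i in range(len(labels))]
    if any(c in allowlist for c in candidates):
        return False
    return any(c in blocklist for c in candidates)

blocked = {"evil-phish.example"}      # hypothetical blocked zone
allowed = {"ok.evil-phish.example"}   # hypothetical carve-out

is_blocked("login.evil-phish.example", blocked, allowed)  # True
is_blocked("ok.evil-phish.example", blocked, allowed)     # False
```

The point of doing this at the DNS layer, as the article notes, is that the lookup fails before any connection to the malicious host is ever attempted.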

Read More »

HPE Aruba unveils raft of new switches for data center, campus modernization

And in large-scale enterprise environments embracing collapsed-core designs, the switch acts as a high-performance aggregation layer. It consolidates services, simplifies network architecture, and enforces security policies natively, reducing complexity and operational cost, Gray said. In addition, the switch offers the agility and security required at colocation facilities and edge sites. Its integrated Layer 4 stateful security and automation-ready platform enable rapid deployment while maintaining robust control and visibility over distributed infrastructure, Gray said. The CX 10040 significantly expands the capacity it can provide and the roles it can serve for enterprise customers, according to one industry analyst. “From the enterprise side, this expands on the feature set and capabilities of the original 10000, giving customers the ability to run additional services directly in the network,” said Alan Weckel, co-founder and analyst with The 650 Group. “It helps drive a lower TCO and provide a more secure network.”

Aimed as a VMware alternative

Gray noted that HPE Aruba is combining its recently announced Morpheus VM Essentials plug-in package, which offers a hypervisor-based package aimed at hybrid cloud virtualization environments, with the CX 10040 to deliver a meaningful alternative to Broadcom’s VMware package. “If customers want to get out of the business of having to buy VM cloud or Cloud Foundation stuff and all of that, they can replace the distributed firewall, microsegmentation and lots of the capabilities found in the old VMware NSX [networking software] and the CX 10k, and Morpheus can easily replace that functionality [such as VM orchestration, automation and policy management],” Gray said. The 650 Group’s Weckel weighed in on the idea of the CX 10040 as a VMware alternative:

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are far higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. Moline, Illinois-based John Deere has been in business for 187 years, yet the non-tech company has become a regular at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.)

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for enterprise clients and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
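At its simplest, the LLM-as-a-judge pattern with three or more models reduces to majority voting over independent verdicts. A toy sketch with stand-in judge callables (a real system would call three separate, cheaper models):

```python
from collections import Counter

def ensemble_judge(answer: str, judges) -> str:
    """Collect a verdict from each judge and return the majority vote."""
    verdicts = [judge(answer) for judge in judges]
    return Counter(verdicts).most_common(1)[0][0]

# Stand-in judges; each lambda plays the role of one model's evaluation.
judges = [
    lambda a: "pass" if "42" in a else "fail",
    lambda a: "pass" if len(a) > 3 else "fail",
    lambda a: "fail",  # a deliberately harsh judge
]

ensemble_judge("the answer is 42", judges)  # "pass" (2 of 3 judges agree)
```

As per-call prices drop, the marginal cost of adding a second or third judge becomes small relative to the reliability gain.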

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. 
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle
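The diversity objective behind the automated framework can be illustrated with a toy greedy selector that rewards candidate attacks for being dissimilar to those already kept. This is only an illustration of the idea; OpenAI's actual framework uses auto-generated rewards and multi-step reinforcement learning, not this heuristic:

```python
def novelty(candidate: str, kept: list) -> float:
    """1 minus the highest token-set Jaccard similarity to any kept attack."""
    c = set(candidate.lower().split())
    if not kept:
        return 1.0
    sims = [len(c & set(k.lower().split())) / len(c | set(k.lower().split()))
            for k in kept]
    return 1.0 - max(sims)

def select_diverse(candidates: list, keep: int) -> list:
    """Greedily keep the candidate most dissimilar to everything kept so far."""
    kept = []
    for _ in range(keep):
        best = max(candidates, key=lambda c: novelty(c, kept))
        kept.append(best)
        candidates = [c for c in candidates if c != best]
    return kept

# Hypothetical attack prompts: the selector prefers the unrelated third
# prompt over a near-duplicate of the first.
attacks = [
    "ignore previous instructions",
    "ignore previous instructions now",
    "tell me a secret",
]
select_diverse(attacks, keep=2)
```

Rewarding novelty this way pushes the generator away from rediscovering the same attack repeatedly, which is the failure mode the paper's diversity rewards are designed to avoid.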

Read More »