Can we repair the internet?


From addictive algorithms to exploitative apps, data mining to misinformation, the internet today can be a hazardous place. Books by three influential figures—the legal scholar behind “net neutrality,” a former Meta executive, and the web’s own inventor—propose radical approaches to fixing it. But are these luminaries the right people for the job? Though each argues with conviction, and at times even inventiveness, the solutions they present reveal blind spots.

The Age of Extraction: How Tech Platforms Conquered the Economy and Threaten Our Future Prosperity
Tim Wu
KNOPF, 2025

In The Age of Extraction: How Tech Platforms Conquered the Economy and Threaten Our Future Prosperity, Tim Wu argues that a few platform companies have amassed too much concentrated power and must be dismantled. Wu, a prominent Columbia law professor who coined the term “net neutrality” and popularized the principle behind it—that a free internet requires all online traffic to be treated equally—believes that existing legal mechanisms, especially anti-monopoly laws, offer the best way to achieve this goal.

Pairing economic theory with recent digital history, Wu shows how platforms have shifted from giving to users to extracting from them. He argues that our failure to understand their power has only encouraged them to grow, displacing competitors along the way. And he contends that convenience is what platforms most often exploit to keep users entrapped. “The human desire to avoid unnecessary pain and inconvenience,” he writes, may be “the strongest force out there.”

He cites Google’s and Apple’s “ecosystems” as examples, showing how users can become dependent on such services as a result of their all-encompassing seamlessness. To Wu, this isn’t a bad thing in itself. The ease of using Amazon to stream entertainment, make online purchases, or help organize day-to-day life delivers obvious gains. But when powerhouse companies like Amazon, Apple, and Alphabet win the battle of convenience with so many users—and never let competitors get a foothold—the result is “industry dominance” that must now be reexamined.

The measures Wu advocates—and that appear the most practical, as they draw on existing legal frameworks and economic policies—are federal anti-monopoly laws, utility caps that limit how much companies can charge consumers for service, and “line of business” restrictions that prohibit companies from operating in certain industries.


Anti-monopoly provisions and antitrust laws are effective weapons in our armory, Wu contends, pointing out that they have been successfully used against technology companies in the past. He cites two well-known cases. The first is the antitrust case the US government brought against IBM in 1969, which helped create competition in the computer software market and enabled companies like Apple and Microsoft to emerge. The second is the 1982 AT&T case, which broke the telephone conglomerate up into several smaller companies. In each, the public benefited from the decoupling of hardware, software, and other services, leading to more competition and choice in technology markets.

But will past performance predict future results? It’s not yet clear whether these laws can succeed in the platform age. The 2025 antitrust ruling against Google—in which a judge decided that the company did not have to divest itself of its Chrome browser, as the US Justice Department had proposed—reveals the limits of pursuing tech breakups through the courts. The antitrust case brought against Microsoft in 1998, which ended in a 2001 settlement, likewise failed to separate the company from its web browser and left the company mostly intact. Wu noticeably doesn’t discuss the Microsoft case when arguing for antitrust action today.

Nick Clegg, until recently Meta’s president of global affairs and a former deputy prime minister of the UK, takes a position very different from Wu’s: that trying to break up the biggest tech companies is misguided and would degrade the experience of internet users. In How to Save the Internet: The Threat to Global Connection in the Age of AI and Political Conflict, Clegg acknowledges Big Tech’s monopoly over the web. But he believes punitive legal measures like antitrust laws are unproductive and can be avoided by means of regulation, such as rules for what content social media can and can’t publish. (It’s worth noting that Meta is facing its own antitrust case, involving whether it should have been allowed to acquire Instagram and WhatsApp.)

book cover
How to Save the Internet: The Threat to Global Connection in the Age of AI and Political Conflict
Nick Clegg
BODLEY HEAD, 2025

Clegg also believes Silicon Valley should take the initiative to reform itself. He argues that encouraging social media networks to “open up the books” and share their decision-making power with users is more likely to restore some equilibrium than contemplating legal action as a first resort.

But some may be skeptical of a former Meta executive and politician who worked closely with Mark Zuckerberg yet wasn’t able to usher in such changes while working inside one of the biggest social media companies. Compounding this skepticism is the selective history found in Clegg’s book, which briefly acknowledges some scandals (like the one surrounding Cambridge Analytica’s data harvesting from Facebook users in 2016) but refuses to discuss other pertinent ones. For example, Clegg laments the “fractured” nature of the global internet today but fails to acknowledge Facebook’s own role in this splintering.

Breaking up Big Tech through antitrust laws would hinder innovation, says Clegg, arguing that the idea “completely ignores the benefits users gain from large network effects.” Users stick with these outsize channels because they can find “most of what they’re looking for,” he writes, like friends and content on social media and cheap consumer goods on Amazon and eBay.

Wu might concede this point, but he would disagree with Clegg’s claim that maintaining the status quo benefits users. “The traditional logic of antitrust law doesn’t work,” Clegg insists. Instead, he believes less sweeping regulation can make Big Tech less dangerous while ensuring a better user experience.

Clegg has seen both sides of the regulatory coin: He worked in David Cameron’s government passing national laws for technology companies to follow and then moved to Meta to help the company navigate those types of nation-specific obligations. He bemoans the hassle and complexity Silicon Valley faces in trying to comply with differing rules across the globe, some set by “American federal agencies” and others by “Indian nationalists.”

But with the resources such companies command, surely they are more than equipped to cope? Given that Meta itself has previously meddled in access to the internet (as in India, where the telecommunications regulator ultimately blocked its Free Basics internet service for violating net neutrality rules), this complaint seems suspect coming from Clegg. The real priority, he argues, should be not new nation-specific laws but a global “treaty that protects the free flow of data between signatory countries.”


Clegg believes that these nation-specific technology obligations—a recent one is Australia’s ban on social media for people under 16—usually reflect fallacies about the technology’s human impact, a subject that can be fraught with anxiety. Such laws have proved ineffective and tend to taint the public’s understanding of social networks, he says. There is some truth to his argument here, but reading a book in which a former Facebook executive dismisses techno-determinism—that is, the argument that technology makes people do or think certain things—may be cold comfort to those who have seen the harm technology can do.

In any case, Clegg’s defensiveness about social networks may not gain much favor from users themselves. He stresses the need for more personal responsibility, arguing that Meta doesn’t ever intend for users to stay on Facebook or Instagram endlessly: “How long you spend on the app in a single session is not nearly as important as getting you to come back over and over again.” Social media companies want to serve you content that is “meaningful to you,” he claims, not “simply to give you a momentary dopamine spike.” All this feels disingenuous at best.

What Clegg advocates—unsurprisingly—is not a breakup of Big Tech but a push for it to become “radically transparent,” whether on its own or, if necessary, with the help of federal legislators. He also wants platforms to bring users more into their governance processes (by using Facebook’s model of community forums to help improve their apps and products, for example). Finally, Clegg wants Big Tech to give users more meaningful control over their data and over how companies such as Meta can use it.

Here Clegg shares common ground with the inventor of the web, Tim Berners-Lee, whose own proposal for reform advances a technically specific vision for doing just that. In his memoir/manifesto This Is for Everyone: The Unfinished Story of the World Wide Web, Berners-Lee acknowledges that his initial vision—of a technology he hoped would remain open-source, collaborative, and completely decentralized—is a far cry from the web that we know today.

This Is for Everyone: The Unfinished Story of the World Wide Web
Tim Berners-Lee
FARRAR, STRAUS & GIROUX, 2025

If there’s any surviving manifestation of his original project, he says, it’s Wikipedia, which remains “probably the best single example of what I wanted the web to be.” His best idea for moving power from Silicon Valley platforms into the hands of users is to give them more control over their data. He pushes for a universal data “pod” system he helped develop, known as “Solid” (an abbreviation of “social linked data”).

The system—which was originally developed at MIT—would offer a central site where people could manage data ranging from credit card information to health records to social media comment history. “Rather than have all this stuff siloed off with different providers across the web, you’d be able to store your entire digital information trail in a single private repository,” Berners-Lee writes.
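Solid is an open specification with working open-source implementations, so the model Berners-Lee describes can be made concrete. As a rough sketch—using Inrupt’s solid-client library, with a hypothetical pod address—an app reading a user’s name from their pod, rather than from a platform’s silo, might look like this:

```typescript
// A minimal sketch of reading from a Solid pod with Inrupt's open-source
// client library (npm: @inrupt/solid-client). The WebID below is a
// hypothetical placeholder, not a real pod.
import {
  getSolidDataset,
  getThing,
  getStringNoLocale,
} from "@inrupt/solid-client";
import { VCARD } from "@inrupt/vocab-common-rdf";

async function readNameFromPod(webId: string): Promise<string | null> {
  // A WebID resolves to a profile document stored in the user's own pod;
  // the same call pattern works for any resource the owner has shared.
  const dataset = await getSolidDataset(webId);
  const profile = getThing(dataset, webId);
  // Pod data is linked data, so fields are named by shared vocabularies
  // (here, vCard's "formatted name" term).
  return profile ? getStringNoLocale(profile, VCARD.fn) : null;
}

// Hypothetical WebID, for illustration only.
readNameFromPod("https://alice.pod.example/profile/card#me")
  .then((name) => console.log("Name from pod:", name));
```

The point of the design is that the app holds only a URL and a permission, not its own copy of the data.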

The Solid product may look like a kind of silver bullet in an age when data harvesting is familiar and data breaches are rampant. Placing greater control with users and enabling them to see “what data [i]s being generated about them” does sound like a tantalizing prospect.

But some people may have concerns about, for example, merging their confidential health records with data from personal devices (like heart rate info from a smart watch). No matter how much user control and decentralization Berners-Lee may promise, recent data scandals (such as cases in which period-tracking apps misused clients’ data) may be on people’s minds.

Berners-Lee believes that centralizing user data in a product like Solid could save people time and improve daily life on the internet. “An alien coming to Earth would think it was very strange that I had to tell my phone the same things again and again,” he complains about the experience of using different airline apps today.

With Solid, everything from vaccination records to credit card transactions could be kept within the digital vault and plugged into different apps. Berners-Lee believes that AI could also help people make more use of this data—for example, by linking meal plans to grocery bills. Still, if he’s optimistic about how AI and Solid could work together to improve users’ lives, he is vague on how to make sure that chatbots manage such personal data sensitively and safely.
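That kind of arrangement would presumably rest on the pod’s permission layer: the owner grants, and can revoke, each app’s or agent’s access to specific resources. A hedged sketch of what that could look like, assuming the universalAccess API in Inrupt’s solid-client library (the resource URL and the assistant’s WebID are hypothetical placeholders):

```typescript
// A sketch of a pod owner granting an AI assistant read-only access to a
// single resource, assuming @inrupt/solid-client's universalAccess API.
// All URLs and WebIDs here are hypothetical.
import { universalAccess } from "@inrupt/solid-client";
import { fetch } from "@inrupt/solid-client-authn-browser"; // owner's authenticated session

async function shareGroceryBills(): Promise<void> {
  const resource = "https://alice.pod.example/finance/grocery-bills.ttl";
  const assistant = "https://assistant.example.com/id#agent";

  // Grant read access only; write and control stay with the owner.
  await universalAccess.setAgentAccess(
    resource,
    assistant,
    { read: true, write: false },
    { fetch }
  );

  // Calling setAgentAccess again with { read: false } revokes the grant,
  // keeping the user, not the platform, in charge of who sees the data.
}
```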

Berners-Lee generally opposes regulation of the web (except in the case of teenagers and social media algorithms, where he sees a genuine need). He believes in internet users’ individual right to control their own data; he is confident that a product like Solid could “course-correct” the web from its current “exploitative” and extractive direction.

Of the three writers’ approaches to reform, it is Wu’s that has shown some effectiveness of late. Companies like Google have been forced to give competitors some advantage through data sharing, and they have now seen limits on how their systems can be used in new products and technologies. But in the current US political climate, will antitrust laws continue to be enforced against Big Tech?

Clegg may get his way on one issue: limiting new nation-specific laws. President Donald Trump has confirmed that he will use tariffs to penalize countries that enact their own national laws targeting US tech companies. And given the posture of the Trump administration, it doesn’t seem likely that Big Tech will see more regulation in the US. Indeed, social networks have seemed emboldened (Meta, for example, removed fact-checkers and relaxed content moderation rules after Trump’s election win). In any case, the US hasn’t passed a major piece of federal internet legislation since 1996.

If pursuing anti-monopoly laws through the courts isn’t possible, Clegg’s push for a US-led omnibus deal—setting mutually agreed rules for data and acceptable standards of human rights—may be the only way to make more immediate improvements.

In the end, there is not likely to be any single fix for what ails the internet today. But the ideas the three writers agree on—greater user control, more data privacy, and increased accountability from Silicon Valley—are surely the outcomes we should all fight for.

Nathan Smith is a writer whose work has appeared in the Washington Post, the Economist, and the Los Angeles Times.
