
Is Python Set to Surpass Its Competitors?


A soufflé is a baked egg dish that originated in France in the 18th century. The process of making an elegant and delicious French soufflé is complex, and in the past, it was typically only prepared by professional French pastry chefs. However, with pre-made soufflé mixes now widely available in supermarkets, this classic French dish has found its way into the kitchens of countless households. 

Python is like the pre-made soufflé mix of programming. Many studies have consistently shown that Python is the most popular programming language among developers, and that lead is expected to keep growing in 2025. Python stands out from languages like C, C++, Java, and Julia because it is highly readable and expressive, flexible and dynamic, and beginner-friendly yet powerful. These characteristics make Python the most suitable programming language even for people without a programming background. The following features distinguish Python from other programming languages:

  • Dynamic Typing
  • List Comprehensions
  • Generators
  • Argument Passing and Mutability

These features reveal Python’s intrinsic nature as a programming language; without understanding them, you’ll never truly understand Python. In this article, I will elaborate on how Python excels over other programming languages through these features.

Dynamic Typing

Most programming languages, such as Java and C++, require explicit data type declarations. In Python, you don’t have to declare the type of a variable when you create one. This feature is called dynamic typing, and it makes Python flexible and easy to use.
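
As a quick illustration (a minimal sketch, not taken from any particular application), the same name can be rebound to objects of different types at runtime, something a statically typed language would reject at compile time:

x = 42              # x refers to an int
print(type(x))      # <class 'int'>

x = "forty-two"     # the same name now refers to a str, no declaration needed
print(type(x))      # <class 'str'>

x = [4, 2]          # and now to a list
print(type(x))      # <class 'list'>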

List Comprehensions

List comprehensions generate new lists from existing iterables by applying an expression to each element. They provide a concise way to combine a loop and optional filtering conditions in a single line.

For example, if you’d like to create a list of squares of the even numbers between 0 and 9, you can use JavaScript, a regular Python loop, or a Python list comprehension to achieve the same goal.

JavaScript

let squares = Array.from({ length: 10 }, (_, x) => x)  // Create array [0, 1, 2, ..., 9]
   .filter(x => x % 2 === 0)                          // Filter even numbers
   .map(x => x ** 2);                                 // Square each number
console.log(squares);  // Output: [0, 4, 16, 36, 64]

Regular Loop in Python

squares = []
for x in range(10):
    if x % 2 == 0:
        squares.append(x**2)
print(squares)

Python’s List Comprehension

squares = [x**2 for x in range(10) if x % 2 == 0]
print(squares)

All three code snippets above generate the same list [0, 4, 16, 36, 64], but Python’s list comprehension is the most elegant: the syntax is concise and clearly expresses the intent, while the regular Python loop is more verbose and requires explicit initialization and appending. The JavaScript version is the least readable because it chains Array.from, filter, and map. Neither the regular loop nor the JavaScript version reads as close to natural language as the list comprehension does.

Generators

Generators in Python are a special kind of iterator that lets developers step through a sequence of values without storing them all in memory at once. They are created with the yield keyword. Other programming languages such as C++ and Java offer similar functionality, but they don’t provide a built-in yield keyword in the same simple, integrated way. Here are several key advantages that make Python generators stand out:

  • Memory Efficiency: Generators yield one value at a time so that they only compute and hold one item in memory at any given moment. This is in contrast to, say, a list in Python, which stores all items in memory.
  • Lazy Evaluation: Generators enable Python to compute values only as needed. This “lazy” computation results in significant performance improvements when dealing with large or potentially infinite sequences.
  • Simple Syntax: This might be the biggest reason developers choose generators: a regular function can be turned into a generator simply by using yield, without having to manage iteration state explicitly.
def fibonacci():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

fib = fibonacci()
for _ in range(100):
    print(next(fib))

The example above shows how to use the yield keyword to produce a sequence. When generating only 100 Fibonacci numbers, the memory usage and running time with and without a generator are barely distinguishable. But at the scale of, say, 100 million numbers, a generator is the better choice: materialising a list of 100 million values can easily strain system resources.
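
To make that concrete, here is a minimal sketch comparing the size of a fully materialised list with the size of an equivalent generator object using sys.getsizeof. The exact numbers depend on the Python version and platform, but the generator stays tiny no matter how long the sequence is:

import sys

n = 1_000_000
squares_list = [x * x for x in range(n)]   # stores one million results in memory
squares_gen = (x * x for x in range(n))    # stores only the iteration state

print(sys.getsizeof(squares_list))  # typically several megabytes
print(sys.getsizeof(squares_gen))   # typically a few hundred bytes, regardless of n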

Argument Passing and Mutability

In Python, we don’t really assign values to variables; instead, we bind names to objects. What happens when an object is passed to a function depends on whether the object is mutable or immutable. If an object is mutable, changes made to it inside the function affect the original object.

def modify_list(lst):
    lst.append(4)

my_list = [1, 2, 3]
modify_list(my_list)
print(my_list)  # Output: [1, 2, 3, 4]

In the example above, we append 4 to my_list, which is [1, 2, 3]. Because lists are mutable, the append operation changes the original list my_list in place, without creating a copy.

However, immutable objects, such as integers, floats, strings, tuples and frozensets, cannot be changed after creation. Therefore, any modification results in a new object. In the example below, because integers are immutable, the function creates a new integer rather than modifying the original variable.

def modify_number(n):
    n += 10
    return n

a = 5
new_a = modify_number(a)
print(a)      # Output: 5
print(new_a)  # Output: 15

Python’s argument passing is sometimes described as “pass-by-object-reference” or “pass-by-assignment.” This makes Python distinctive: it passes object references uniformly, while many other languages require developers to distinguish explicitly between pass-by-value and pass-by-reference. Python’s uniform approach is simple yet powerful. It avoids the need for explicit pointers or reference parameters, but it requires developers to be mindful of mutable objects.
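
The following sketch (purely illustrative) makes the distinction visible: rebinding a parameter inside a function never affects the caller’s variable, while mutating the object it refers to does, because both names point to the same object:

def rebind(lst):
    lst = [0, 0, 0]      # rebinds the local name only; the caller is unaffected

def mutate(lst):
    lst.append(99)       # mutates the shared object; the caller sees the change

data = [1, 2, 3]
rebind(data)
print(data)              # [1, 2, 3]
mutate(data)
print(data)              # [1, 2, 3, 99]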

With Python’s argument passing and mutability, we can enjoy the following benefits in coding:

  • Memory Efficiency: It saves memory by passing references instead of making full copies of objects. This especially benefits code development with large data structures.
  • Performance: Avoiding unnecessary copies improves overall performance.
  • Flexibility: Updating data structures is convenient because developers never have to choose explicitly between pass-by-value and pass-by-reference.

However, this characteristic also forces developers to choose carefully between mutable and immutable data types, and it can make debugging more complex.
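
A classic example of that extra debugging burden is Python’s mutable default argument pitfall (shown here only as an illustration): the default list is created once, when the function is defined, and is then shared across calls:

def add_item(item, bucket=[]):      # the default list is created only once
    bucket.append(item)
    return bucket

print(add_item("a"))   # ['a']
print(add_item("b"))   # ['a', 'b']  <- state unexpectedly carries over between calls

def add_item_safe(item, bucket=None):
    if bucket is None:              # idiomatic fix: create a fresh list on every call
        bucket = []
    bucket.append(item)
    return bucket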

So is Python Really Simple?

Python’s popularity results from its simplicity, memory efficiency, high performance, and beginner-friendliness. It is also the programming language that reads most like natural language, so even people who haven’t received systematic programming training can understand it. These characteristics make Python a top choice among enterprises, academic institutions, and government organisations.

For example, suppose an e-commerce company wants to filter the “completed” orders with amounts greater than 200 and update a mutable summary report (a dictionary) with the total count and sum of amounts. We can use a generator to stream the orders, a list comprehension to collect the orders meeting our criteria, skip variable type declarations entirely, and modify the original dictionary in place through pass-by-assignment.

import random
import time

def order_stream(num_orders):
    """
    A generator that yields a stream of orders.
    Each order is a dictionary with dynamic types:
      - 'order_id': str
      - 'amount': float
      - 'status': str (randomly chosen among 'completed', 'pending', 'cancelled')
    """
    for i in range(num_orders):
        order = {
            "order_id": f"ORD{i+1}",
            "amount": round(random.uniform(10.0, 500.0), 2),
            "status": random.choice(["completed", "pending", "cancelled"])
        }
        yield order
        time.sleep(0.001)  # simulate delay

def update_summary(report, orders):
    """
    Updates the mutable summary report dictionary in-place.
    For each order in the list, it increments the count and adds the order's amount.
    """
    for order in orders:
        report["count"] += 1
        report["total_amount"] += order["amount"]

# Create a mutable summary report dictionary.
summary_report = {"count": 0, "total_amount": 0.0}

# Use a generator to stream 10,000 orders.
orders_gen = order_stream(10000)

# Use a list comprehension to filter orders that are 'completed' and have amount > 200.
high_value_completed_orders = [order for order in orders_gen
                              if order["status"] == "completed" and order["amount"] > 200]

# Update the summary report using our mutable dictionary.
update_summary(summary_report, high_value_completed_orders)

print("Summary Report for High-Value Completed Orders:")
print(summary_report)

To achieve the same goal in Java, which lacks built-in generators and list comprehensions, we have to generate a full list of orders and then filter it and update the summary with explicit loops, which makes the code more complex, less readable, and harder to maintain.

import java.util.*;
import java.util.concurrent.ThreadLocalRandom;

class Order {
    public String orderId;
    public double amount;
    public String status;

    public Order(String orderId, double amount, String status) {
        this.orderId = orderId;
        this.amount = amount;
        this.status = status;
    }

    @Override
    public String toString() {
        return String.format("{orderId:%s, amount:%.2f, status:%s}", orderId, amount, status);
    }
}

public class OrderProcessor {
    // Generates a list of orders.
    public static List<Order> generateOrders(int numOrders) {
        List<Order> orders = new ArrayList<>();
        String[] statuses = {"completed", "pending", "cancelled"};
        Random rand = new Random();
        for (int i = 0; i < numOrders; i++) {
            String orderId = "ORD" + (i + 1);
            double amount = Math.round(ThreadLocalRandom.current().nextDouble(10.0, 500.0) * 100.0) / 100.0;
            String status = statuses[rand.nextInt(statuses.length)];
            orders.add(new Order(orderId, amount, status));
        }
        return orders;
    }

    // Filters orders based on criteria.
    public static List<Order> filterHighValueCompletedOrders(List<Order> orders) {
        List<Order> filtered = new ArrayList<>();
        for (Order order : orders) {
            if ("completed".equals(order.status) && order.amount > 200) {
                filtered.add(order);
            }
        }
        return filtered;
    }

    // Updates a mutable summary Map with the count and total amount.
    public static void updateSummary(Map<String, Object> summary, List<Order> orders) {
        int count = 0;
        double totalAmount = 0.0;
        for (Order order : orders) {
            count++;
            totalAmount += order.amount;
        }
        summary.put("count", count);
        summary.put("total_amount", totalAmount);
    }

    public static void main(String[] args) {
        // Generate orders.
        List<Order> orders = generateOrders(10000);

        // Filter orders.
        List<Order> highValueCompletedOrders = filterHighValueCompletedOrders(orders);

        // Create a mutable summary map.
        Map<String, Object> summaryReport = new HashMap<>();
        summaryReport.put("count", 0);
        summaryReport.put("total_amount", 0.0);

        // Update the summary report.
        updateSummary(summaryReport, highValueCompletedOrders);

        System.out.println("Summary Report for High-Value Completed Orders:");
        System.out.println(summaryReport);
    }
}

Conclusion

Equipped with dynamic typing, list comprehensions, generators, and its approach to argument passing and mutability, Python simplifies coding while improving memory efficiency and performance. As a result, Python has become an ideal programming language for self-learners.

Thank you for reading!
