Your Gateway to Power, Energy, Datacenters, Bitcoin and AI

Dive into the latest industry updates, our exclusive Paperboy Newsletter, and curated insights designed to keep you informed. Stay ahead with minimal time spent.

Discover What Matters Most to You

Explore ONMINE’s curated content, from our Paperboy Newsletter to industry-specific insights tailored for energy, Bitcoin mining, and AI professionals.

AI

Lorem Ipsum is simply dummy text of the printing and typesetting industry.

Bitcoin

Lorem Ipsum is simply dummy text of the printing and typesetting industry.

Datacenter

Lorem Ipsum is simply dummy text of the printing and typesetting industry.

Energy

Lorem Ipsum is simply dummy text of the printing and typesetting industry.


Featured Articles

ML Feature Management: A Practical Evolution Guide

In the world of machine learning, we obsess over model architectures, training pipelines, and hyper-parameter tuning, yet often overlook a fundamental aspect: how our features live and breathe throughout their lifecycle. From in-memory calculations that vanish after each prediction to the challenge of reproducing exact feature values months later, the way we handle features can make or break our ML systems’ reliability and scalability.

Who Should Read This

Starting Point: The invisible approach

Many ML teams, especially those in their early stages or without dedicated ML engineers, start with what I call “the invisible approach” to feature engineering. It’s deceptively simple: fetch raw data, transform it in-memory, and create features on the fly. The resulting dataset, while functional, is essentially a black box of short-lived calculations — features that exist only for a moment before vanishing after each prediction or training run.

While this approach might seem to get the job done, it’s built on shaky ground. As teams scale their ML operations, models that performed brilliantly in testing suddenly behave unpredictably in production. Features that worked perfectly during training mysteriously produce different values in live inference. When stakeholders ask why a specific prediction was made last month, teams find themselves unable to reconstruct the exact feature values that led to that decision.

Core Challenges in Feature Engineering

These pain points aren’t unique to any single team; they represent fundamental challenges that every growing ML team eventually faces.

Evolution of Solutions

Approach 1: On-Demand Feature Generation

The simplest solution starts where many ML teams begin: creating features on demand for immediate use in prediction. Raw data flows through transformations to generate features, which are used for inference, and only then — after predictions are already made — are these features typically saved to parquet files. While this method is straightforward, with teams often choosing parquet files because
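A minimal sketch of the on-demand pattern described above, using pandas with hypothetical column names (`date`, `daily_spend`) and simulated raw data; the point is the ordering: features are computed in memory, used for inference, and only persisted afterwards.

```python
import os
import tempfile

import pandas as pd


def build_features(raw: pd.DataFrame) -> pd.DataFrame:
    """Compute features in-memory, on demand (column names are hypothetical)."""
    features = pd.DataFrame(index=raw.index)
    features["spend_7d_avg"] = raw["daily_spend"].rolling(7, min_periods=1).mean()
    features["is_weekend"] = raw["date"].dt.dayofweek >= 5
    return features


# Simulated raw data standing in for a real fetch
raw = pd.DataFrame({
    "date": pd.date_range("2024-01-01", periods=10),  # Mon Jan 1 .. Wed Jan 10
    "daily_spend": [float(x) for x in range(10)],
})

features = build_features(raw)
# predictions = model.predict(features)  # inference uses the features first...

# ...and only afterwards are they persisted, typically to parquet
path = os.path.join(tempfile.gettempdir(), "features_2024-01.parquet")
try:
    features.to_parquet(path)  # needs a parquet engine (pyarrow or fastparquet)
except ImportError:
    pass  # no parquet engine installed; the features remain in-memory only
```

Note that nothing here guarantees the same feature values can be reproduced later: the rolling window and raw data are transient, which is exactly the fragility the article goes on to describe.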

Read More »

UK gas policies could lead to increased emissions from imports, report finds

Reduced North Sea investment as a result of UK government policy could lead to millions of tonnes of additional CO2 emissions from imports, according to a report. Edinburgh-based financial analyst firm Gneiss Energy modelled the potential impact of UK government policies relating to the country’s offshore oil and gas sector. Since taking office last year, the UK Labour government has increased the windfall tax on North Sea oil and gas firms and pledged not to issue new exploration licences. These policies, combined with a recent court ruling focused on downstream emissions, have raised concerns about the impact on future investment in the North Sea. Without new projects and further investment, domestic UK gas production in the maturing North Sea basin is likely to significantly decrease over the next decade, leading to a greater reliance on imports. According to the North Sea Transition Authority (NSTA), the carbon intensity of UK gas production is almost four times lower than that of liquefied natural gas (LNG) imports.

UK gas and LNG emissions

The Gneiss Energy report estimated the emissions impact of reduced UK production combined with increased imports of liquefied natural gas (LNG) into the 2030s. It modelled scenarios ranging from total reliance on UK domestic production to a ‘sunset’ scenario which assumes a 50% reduction in gas supplies from the UK continental shelf (UKCS). In the sunset scenario, UK gas demand is met with increased supply balanced between Norway and LNG imports from countries like the United States, Qatar and Algeria. Gneiss estimates this would lead to 13% higher pre-consumption CO2 emissions compared to the current UK energy mix.

The Grain LNG terminal on the Isle of Grain. Image: National Grid

By contrast, if the UK were to achieve 100% domestic gas supply it would lead to
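The scenario comparison comes down to supply-share-weighted carbon intensities. A toy sketch of that arithmetic, where every number is hypothetical (the shares and intensities are illustrative, not the report's figures; only the relative "LNG is roughly four times more carbon-intensive than UKCS gas" claim is taken from the article):

```python
# Normalised pre-consumption CO2 intensities (hypothetical, illustrative only)
uk_intensity = 1.0    # UKCS gas as the baseline
nor_intensity = 1.2   # assumed intensity of Norwegian pipeline imports
lng_intensity = 4.0   # LNG assumed ~4x UKCS, per the NSTA comparison

# Assumed supply mixes: current vs. a 'sunset' case with UKCS supply halved
current_mix = {"uk": 0.50, "norway": 0.35, "lng": 0.15}
sunset_mix = {"uk": 0.25, "norway": 0.45, "lng": 0.30}


def blended(mix):
    """Supply-share-weighted carbon intensity of a gas mix."""
    return (mix["uk"] * uk_intensity
            + mix["norway"] * nor_intensity
            + mix["lng"] * lng_intensity)


increase = blended(sunset_mix) / blended(current_mix) - 1
print(f"Sunset-scenario intensity is {increase:.0%} higher than the current mix")
```

With these made-up shares the increase comes out larger than the report's 13%, which simply reflects different assumed mixes; the structure of the calculation, not the numbers, is the point.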

Read More »

Oil Prices Slip Amid USA-China Trade War Fears

Oil edged lower as concerns that a trade war between the US and China would hurt global growth outweighed the announcement of reinforced sanctions on Iran. West Texas Intermediate fell 0.6% to settle below $73 a barrel, recovering from an earlier decline of as much as 3.4% that was driven by the prospect that dueling tariffs between the US and China would reduce energy demand. Oil pared losses after Trump signed a directive that aims to ramp up economic pressure on Iran. Over the past four years, sanctions evasion and more relaxed US enforcement have allowed Iran to boost oil exports by about 1 million barrels a day. Enforcing sanctions may slash the country’s exports by about two-thirds and could cost Iran roughly $30 billion a year. Earlier on Tuesday, China announced retaliatory measures against Trump’s tariffs, temporarily sending futures below where they ended 2024 for the first time this year. Speaking to reporters Tuesday afternoon, Trump said it was “fine” that China responded in kind with tariffs on some American products. China will place levies on a range of US goods, including oil and liquefied natural gas, in response to Washington’s “unilateral imposition of tariffs,” the country’s finance ministry said. The US shipped about 250,000 barrels of crude a day to China on average last year, a relatively small volume. But an escalation of trade disputes between the world’s two largest economies could have a broader impact and hurt global consumption. The trade confrontation with China stands in contrast to Trump’s agreement to push back planned levies on Canada and Mexico by a month after the nations agreed to take tougher measures to combat migration and drug trafficking. The flare-up came as China’s markets were shut for the Lunar New Year holidays. Oil futures have faced a bumpy few

Read More »

Adnoc to Shift USA Assets to New Global Unit XRG

Abu Dhabi’s main energy company will transfer stakes in some US assets to its new international investment firm, XRG PJSC, part of a push to create an $80 billion global venture. Abu Dhabi National Oil Co.’s holdings in a hydrogen venture with Exxon Mobil Corp. and NextDecade Corp.’s liquefied natural gas export project will move to XRG, according to Chief Executive Officer Sultan Al Jaber. XRG aims to invest in technologies that power the energy transition and the shift to advanced technologies, Al Jaber wrote in a column on the news website Semafor. Adnoc in September agreed to take a 35% stake in Exxon’s proposed hydrogen project in Baytown, Texas, which is likely to be the world’s biggest once built. That was its second acquisition in the US after Adnoc in May agreed to buy a stake in NextDecade’s LNG project, also in Texas. Adnoc announced its plan for XRG last year, saying the company will have an enterprise value of over $80 billion and will aim to more than double the value of its assets over the next decade. Adnoc plans to start the unit in the first quarter. The UAE company has already said XRG will control Covestro AG following a roughly $13 billion deal to buy the German chemical producer. XRG will also hold Abu Dhabi’s share of a joint venture with BP Plc for gas projects in the Mediterranean region, including in Egypt. The UAE is pursuing expansion in chemicals and gas to feed future energy demand and supply plastics for consumer products amid the energy transition. Through Adnoc, it is expanding its energy business beyond oil to focus on gas trading and chemical derivatives like plastics, on the expectation that demand for these products will benefit from the energy transition. Over the weekend, Adnoc said it was in

Read More »

AMD’s Q4 revenue hits $7.66B, up 24% but stock falls

Advanced Micro Devices reported revenue of $7.658 billion for the fourth quarter, up 24% from a year ago. But AMD’s stock price fell in after-hours trading to $113.66, down 4.89%. Net income was $482 million for the fourth quarter ended December 31, down 28% from $667 million a year earlier. AMD saw data center segment revenue of $3.9 billion in the quarter, up 69% from a year ago. It was driven by record Epyc CPU and Instinct GPU revenues — and the demand is being driven by the rapid adoption of AI. For 2024, data center segment revenue was a record $12.6 billion, an increase of 94% compared to the prior year, driven by growth in both AMD Instinct and Epyc processors. Exiting 2024, there are more than 450 Epyc platforms available from the leading server OEMs and ODMs, including more than 120 “Turin” platforms that went into production in the fourth quarter from Cisco, Dell, HPE, Lenovo, Supermicro and others. “2024 was an outstanding year as we accelerated our AI hardware roadmap to deliver an annual cadence of new Instinct accelerators, expanded our ROCm software suite with significant uplifts in inferencing and training performance, built strong customer relationships with key industry leaders and delivered greater than $5 billion of data center AI revenue for the year,” said AMD CEO Lisa Su. In cloud, AMD exited 2024 with well over 50% share at the majority of the largest hyperscale customers. “Hyperscaler demand for EPYC CPUs was very strong, driven by expanded deployments powering both their internal compute infrastructure and online services,” Su said. “Public cloud demand was also very strong with the number of Epyc instances increasing 27%

Read More »

OmniHuman: ByteDance’s new AI creates realistic videos from a single photo

ByteDance researchers have developed an artificial intelligence system that transforms single photographs into realistic videos of people speaking, singing and moving naturally — a breakthrough that could reshape digital entertainment and communications. The new system, called OmniHuman, generates full-body videos showing people gesturing and moving in ways that match their speech, surpassing previous AI models that could only animate faces or upper bodies.

How OmniHuman Uses 18,700 Hours of Training Data to Create Realistic Motion

“End-to-end human animation has undergone notable advancements in recent years. However, existing methods still struggle to scale up as large general video generation models, limiting their potential in real applications,” the researchers wrote in a paper published on arXiv. The team trained OmniHuman on more than 18,700 hours of human video data using a novel approach that combines multiple types of inputs — text, audio, and body movements. This “omni-conditions” training strategy allows the AI to learn from much larger and more diverse datasets than previous methods.

Credit: ByteDance

AI video generation breakthrough shows full-body movement and natural gestures

“Our key insight is that incorporating multiple conditioning signals, such as text, audio, and pose, during training can significantly reduce data wastage,” the research team explained. The technology marks a significant advance in AI-generated media, demonstrating capabilities from creating videos of people delivering speeches to showing subjects playing musical instruments. In testing, OmniHuman outperformed existing systems across multiple quality benchmarks.

Credit: ByteDance

Tech giants race to develop next-generation video AI systems

The development emerges amid intensifying competition in AI video generation, with companies like Google, Meta, and Microsoft pursuing similar technology. ByteDance’s breakthrough could give the TikTok parent company an advantage in this rapidly evolving field. Industry

Read More »

US ‘needs more storage’ to ensure grid reliability, resilience: SEIA

Dive Brief: The United States should deploy 10 million distributed energy storage installations and reach 700 GWh in total deployed storage capacity to meet its energy needs by 2030, the Solar Energy Industries Association said on Jan. 28. SEIA also released a national state energy storage policy guide that includes current statutory procurements and deployment targets, state and utility incentives, demonstration programs and consumer protections for energy storage. Amid rising load growth projections and an increasing share of renewable generation, a rapid increase in energy storage capacity is essential to “[ensure] our grid can adapt to challenges, support critical infrastructure and deliver reliable power to every community,” SEIA President and CEO Abigail Ross Hopper said in a statement. Dive Insight: The U.S. has nearly 500,000 distributed energy storage installations and about 83 GWh of total energy storage capacity, SEIA said, citing data provided by Wood Mackenzie.  The 2030 targets announced last month would represent a more than 20-fold increase in the number of distributed storage installations and a more than eight-fold increase in total deployed capacity. The 700-GWh target is also well above Wood Mackenzie’s business-as-usual forecast of 450 GWh in energy storage capacity by 2030, SEIA said. Wood Mackenzie’s most recent Energy Storage Monitor showed total U.S. storage deployments reaching a record 9.9 GWh in the fourth quarter of 2024, up 58% from the year-ago period. For full-year 2024, deployments increased 30%, marking the industry’s “strongest year yet,” Wood Mackenzie Senior Research Analyst Nina Rangel told Utility Dive in December. Early-stage development constraints mean “it will be difficult to keep this pace,” leading Wood Mackenzie to forecast average annual growth of 10% between 2025 and 2028, Rangel said.  Utility-scale storage installations account for the lion’s share of U.S. 
storage deployments, reaching about 9 GW in Q4 2024, Wood Mackenzie
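The growth multiples in the brief can be checked with quick arithmetic from the figures quoted (roughly 500,000 installations and 83 GWh today, versus 2030 targets of 10 million installations and 700 GWh):

```python
# Quick check of the growth multiples implied by SEIA's 2030 targets,
# using the figures quoted in the brief above.
current_installations = 500_000      # "nearly 500,000" today
target_installations = 10_000_000    # 2030 target
current_gwh = 83                     # deployed storage capacity today
target_gwh = 700                     # 2030 target

install_multiple = target_installations / current_installations
capacity_multiple = target_gwh / current_gwh

print(f"installations: {install_multiple:.0f}x")  # the "more than 20-fold" claim
print(f"capacity: {capacity_multiple:.1f}x")      # the "more than eight-fold" claim
```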

Read More »

Macquarie Strategists Forecast USA Crude Inventory Build

In an oil and gas report sent to Rigzone by the Macquarie team late Monday, Macquarie strategists revealed that they are forecasting that U.S. crude inventories will be up by 10.8 million barrels for the week ending January 31. “This compares to our early look for the week which anticipated an 11.7 million barrel build, and a 3.5 million barrel build realized for the week ending January 24,” the strategists said in the report. “On the product side of the ledger, in aggregate, our expectations are slightly looser than our early view,” they added. In the report, the Macquarie strategists noted that, “for this week’s crude balance, from refineries”, they “model crude runs down modestly (-0.3 million barrels per day)”. “Among net imports, we model a slight decrease, with exports slightly higher (+0.1 million barrels per day) and imports effectively unchanged on a nominal basis,” they added. The strategists stated in the report that timing of cargoes remains a source of potential volatility in this week’s crude balance. “From implied domestic supply (prod. +adj. +transfers), we look for a rebound (+0.8 million barrels per day) following a soft print last week amidst potential weather impacts,” the strategists said in the report. “Rounding out the picture, we anticipate another small increase in SPR [Strategic Petroleum Reserve] inventory (+0.3 million barrels) on the week,” they added. The Macquarie strategists also noted in the report that, “among products”, they “look for builds in gasoline (+0.5 million barrels) and jet (+0.5 million barrels), with a draw in distillate (-3.8 million barrels)”. “We model implied demand for these three products at ~13.8 million barrels per day for the week ending January 31,” the Macquarie strategists went on to state. In an oil and gas report sent to Rigzone by the Macquarie team late Thursday, Macquarie

Read More »

California PUC streamlines transmission permitting process

The California Public Utilities Commission on Thursday streamlined its transmission permitting requirements as part of a broader effort to meet the state’s clean energy goals. Among the changes, the PUC now requires project developers to meet with commission staff before filing a permit application to help the review process run more smoothly, according to the agency. The revised rules also give transmission developers a chance to submit a draft version of California Environmental Quality Act documents instead of an environmental assessment to help speed permit reviews, according to a proposed decision approved by the commission on Thursday. Also, the PUC will assume a project is needed if the California Independent System Operator has already determined that it is needed, a change that will avoid duplicative analysis, according to the commission. The decision adopts a pilot to study ways to streamline the PUC’s environmental review process. “These changes will accelerate permitting timelines by reducing redundancy and shifting environmental analysis earlier in the application process,” PUC Commissioner Karen Douglas, who led the proceeding, said in a press release. The revised permitting rules will help the PUC address the scale of grid upgrades that will be needed to maintain a reliable grid and meet California’s climate goals, Douglas said. California will need about $46 billion to $63 billion in new transmission over the next 20 years, according to a “transmission outlook” report CAISO released July 31. The investments include up to $12 billion for upgrades to its system, up to $15.2 billion for transmission to import wind from other states and up to $37 billion for offshore wind. California law requires that all the state’s electricity comes from renewable energy and zero-carbon sources by 2045. The PUC’s decision completes a two-phase process to implement SB 529, a bill signed into law in 2022 that

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are far higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. Moline, Illinois-based John Deere has been in business for 187 years, yet the non-tech company has become a regular at the big tech trade show in Las Vegas, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
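The “LLM as a judge” pattern mentioned above can be sketched in a few lines. Everything here is illustrative: `call_model` and `judge_score` are stand-ins for real chat-completion API calls, and the canned answers exist only to make the sketch runnable.

```python
# Sketch of the "LLM as a judge" pattern: several cheap candidate models
# draft answers, and a judge scores them so the best one is returned.

def call_model(name: str, prompt: str) -> str:
    """Stand-in for an LLM API call; returns a canned answer per model."""
    canned = {
        "model-a": "Paris is the capital of France.",
        "model-b": "The capital of France is Paris.",
        "model-c": "France's capital city is Lyon.",  # deliberately wrong
    }
    return canned[name]

def judge_score(question: str, answer: str) -> int:
    """Stand-in judge: in practice another LLM returns a numeric rating.
    Here a trivial keyword check plays that role."""
    return 10 if "Paris" in answer else 1

def best_answer(question: str, models: list[str]) -> str:
    # Generate one candidate per model, then keep the judge's favorite.
    candidates = [(call_model(m, question), m) for m in models]
    return max(candidates, key=lambda c: judge_score(question, c[0]))[0]

print(best_answer("What is the capital of France?",
                  ["model-a", "model-b", "model-c"]))
```

The same loop works with real APIs swapped in; the economics only make sense once the candidate models are cheap relative to the value of a better answer, which is the point the article is making.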

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model, because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »

Three Aberdeen oil company headquarters sell for £45m

Three Aberdeen oil company headquarters have been sold in a deal worth £45 million. The CNOOC, Apache and Taqa buildings at the Prime Four business park in Kingswells have been acquired by EEH Ventures. The trio of buildings, totalling 275,000 sq ft, were previously owned by Canadian firm BMO. The financial services powerhouse first bought the buildings in 2014 but decided to sell them as part of a “long-standing strategy to reduce their office exposure across the UK”. The deal was the largest to take place throughout Scotland during the last quarter of 2024.

Trio of buildings snapped up

London-headquartered EEH Ventures was founded in 2013 and owns a number of residential properties, offices, shopping centres and hotels throughout the UK. All three Kingswells-based buildings were pre-let, designed and constructed by Aberdeen property developer Drum in 2012 on a 15-year lease. The North Sea headquarters of Middle East oil firm Taqa has previously been described as “an amazing success story in the Granite City”. Taqa announced in 2023 that it intends to cease production from all of its UK North Sea platforms by the end of 2027. Meanwhile, Apache revealed at the end of last year it is planning to exit the North Sea by the end of 2029, blaming the windfall tax. The US firm first entered the North Sea in 2003 but will wrap up all of its UK operations by 2030.

Aberdeen big deals

The Prime Four acquisition wasn’t the biggest Granite City commercial property sale of 2024. American private equity firm Lone Star bought Union Square shopping centre from Hammerson for £111m. Hammerson, which also built the property, had originally been seeking £150m. BP’s North Sea headquarters in Stoneywood, Aberdeen, was also sold. Manchester-based

Read More »

2025 ransomware predictions, trends, and how to prepare

The Zscaler ThreatLabz research team has revealed critical insights and predictions on ransomware trends for 2025. The latest Ransomware Report uncovered a surge in sophisticated tactics and extortion attacks. As ransomware remains a key concern for CISOs and CIOs, the report sheds light on actionable strategies to mitigate risks.

Top Ransomware Predictions for 2025:

● AI-Powered Social Engineering: In 2025, GenAI will fuel voice phishing (vishing) attacks. With the proliferation of GenAI-based tooling, initial access broker groups will increasingly leverage AI-generated voices, which sound increasingly realistic by adopting local accents and dialects, to enhance credibility and success rates.
● The Trifecta of Social Engineering Attacks: Vishing, Ransomware and Data Exfiltration. Sophisticated ransomware groups, like the Dark Angels, will continue the trend of low-volume, high-impact attacks, preferring to focus on an individual company, stealing vast amounts of data without encrypting files, and evading media and law enforcement scrutiny.
● Targeted Industries Under Siege: Manufacturing, healthcare, education, and energy will remain primary targets, with no slowdown in attacks expected.
● New SEC Regulations Drive Increased Transparency: 2025 will see an uptick in reported ransomware attacks and payouts due to new, tighter SEC requirements mandating that public companies report material incidents within four business days.
● Ransomware Payouts Are on the Rise: In 2025, ransom demands will most likely increase due to an evolving ecosystem of cybercrime groups specializing in designated attack tactics, and collaboration by these groups that have entered a sophisticated profit-sharing model using Ransomware-as-a-Service.

To combat damaging ransomware attacks, Zscaler ThreatLabz recommends the following strategies.
● Fighting AI with AI: As threat actors use AI to identify vulnerabilities, organizations must counter with AI-powered zero trust security systems that detect and mitigate new threats.
● Advantages of adopting a Zero Trust architecture: A Zero Trust cloud security platform stops

Read More »

ML Feature Management: A Practical Evolution Guide

In the world of machine learning, we obsess over model architectures, training pipelines, and hyper-parameter tuning, yet often overlook a fundamental aspect: how our features live and breathe throughout their lifecycle. From in-memory calculations that vanish after each prediction to the challenge of reproducing exact feature values months later, the way we handle features can make or break our ML systems’ reliability and scalability.

Who Should Read This

Starting Point: The invisible approach

Many ML teams, especially those in their early stages or without dedicated ML engineers, start with what I call “the invisible approach” to feature engineering. It’s deceptively simple: fetch raw data, transform it in-memory, and create features on the fly. The resulting dataset, while functional, is essentially a black box of short-lived calculations — features that exist only for a moment before vanishing after each prediction or training run. While this approach might seem to get the job done, it’s built on shaky ground. As teams scale their ML operations, models that performed brilliantly in testing suddenly behave unpredictably in production. Features that worked perfectly during training mysteriously produce different values in live inference. When stakeholders ask why a specific prediction was made last month, teams find themselves unable to reconstruct the exact feature values that led to that decision.

Core Challenges in Feature Engineering

These pain points aren’t unique to any single team; they represent fundamental challenges that every growing ML team eventually faces.

Evolution of Solutions

Approach 1: On-Demand Feature Generation

The simplest solution starts where many ML teams begin: creating features on demand for immediate use in prediction. Raw data flows through transformations to generate features, which are used for inference, and only then — after predictions are already made — are these features typically saved to parquet files.
While this method is straightforward, with teams often choosing parquet files because
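The on-demand pattern described above can be sketched in a few lines of pandas. This is a minimal illustration, not the author's implementation: the column names, threshold, and model stub are invented, and only the shape of the workflow (transform in memory, predict, persist afterwards) comes from the article.

```python
import pandas as pd

# Sketch of "on-demand" feature generation: raw data is transformed
# in memory, used for a prediction, and only afterwards persisted.

def fetch_raw_data() -> pd.DataFrame:
    # Stand-in for a query against the raw data source.
    return pd.DataFrame({
        "user_id": [1, 2],
        "purchases_30d": [5, 0],
        "total_spend_30d": [200.0, 0.0],
    })

def build_features(raw: pd.DataFrame) -> pd.DataFrame:
    feats = raw.copy()
    # In-memory transformation: exists only for this run unless saved.
    feats["avg_order_value"] = (
        feats["total_spend_30d"] / feats["purchases_30d"].clip(lower=1)
    )
    return feats

def predict(feats: pd.DataFrame) -> pd.Series:
    # Stand-in model: flag users with high average order value.
    return (feats["avg_order_value"] > 30).astype(int)

raw = fetch_raw_data()
features = build_features(raw)
scores = predict(features)
# Only after predictions are made are the features persisted for later audit.
try:
    features.to_parquet("features_snapshot.parquet")  # needs pyarrow/fastparquet
except ImportError:
    pass  # no parquet engine installed; the ordering is the point
```

Note the weakness the article goes on to describe: the feature values exist only in this process until the final `to_parquet` call, so reproducing them later depends entirely on that after-the-fact snapshot.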

Read More »

AMD’s Q4 revenue hits $7.66B, up 24% but stock falls

Advanced Micro Devices reported revenue of $7.658 billion for the fourth quarter, up 24% from a year ago. But AMD’s stock price fell in after-hours trading to $113.66, down 4.89%. Net income was $482 million for the fourth quarter ended December 31, down 28% from $667 million a year earlier. AMD saw data center segment revenue of $3.9 billion in the quarter, up 69% from a year ago. It was driven by record Epyc CPU and Instinct GPU revenues — and the demand is being driven by the rapid adoption of AI. For 2024, data center segment revenue was a record $12.6 billion, an increase of 94% compared to the prior year, driven by growth in both AMD Instinct and Epyc processors. Exiting 2024, there are more than 450 Epyc platforms available from the leading server OEMs and ODMs, including more than 120 “Turin” platforms that went into production in the fourth quarter from Cisco, Dell, HPE, Lenovo, Supermicro and others. “2024 was an outstanding year as we accelerated our AI hardware roadmap to deliver an annual cadence of new Instinct accelerators, expanded our ROCm software suite with significant uplifts in inferencing and training performance, built strong customer relationships with key industry leaders and delivered greater than $5 billion of data center AI revenue for the year,” said Lisa Su, CEO of AMD. In cloud, AMD exited 2024 with well over 50% share at the majority of the largest hyperscale customers. “Hyperscaler demand for EPYC CPUs was very strong, driven by expanded deployments powering both their internal compute infrastructure and online services,” Su said. “Public cloud demand was also very strong with the number of Epyc instances increasing 27%

Read More »

OmniHuman: ByteDance’s new AI creates realistic videos from a single photo

ByteDance researchers have developed an artificial intelligence system that transforms single photographs into realistic videos of people speaking, singing and moving naturally — a breakthrough that could reshape digital entertainment and communications. The new system, called OmniHuman, generates full-body videos showing people gesturing and moving in ways that match their speech, surpassing previous AI models that could only animate faces or upper bodies.

How OmniHuman Uses 18,700 Hours of Training Data to Create Realistic Motion

“End-to-end human animation has undergone notable advancements in recent years. However, existing methods still struggle to scale up as large general video generation models, limiting their potential in real applications,” the researchers wrote in a paper published on arXiv. The team trained OmniHuman on more than 18,700 hours of human video data using a novel approach that combines multiple types of inputs — text, audio, and body movements. This “omni-conditions” training strategy allows the AI to learn from much larger and more diverse datasets than previous methods.

AI video generation breakthrough shows full-body movement and natural gestures

“Our key insight is that incorporating multiple conditioning signals, such as text, audio, and pose, during training can significantly reduce data wastage,” the research team explained. The technology marks a significant advance in AI-generated media, demonstrating capabilities from creating videos of people delivering speeches to showing subjects playing musical instruments. In testing, OmniHuman outperformed existing systems across multiple quality benchmarks.

Tech giants race to develop next-generation video AI systems

The development emerges amid intensifying competition in AI video generation, with companies like Google, Meta, and Microsoft pursuing similar technology. ByteDance’s breakthrough could give the TikTok parent company an advantage in this rapidly evolving field. Industry

Read More »

Updating the Frontier Safety Framework

Our next iteration of the FSF sets out stronger security protocols on the path to AGI.

AI is a powerful tool that is helping to unlock new breakthroughs and make significant progress on some of the biggest challenges of our time, from climate change to drug discovery. But as its development progresses, advanced capabilities may present new risks. That’s why we introduced the first iteration of our Frontier Safety Framework last year – a set of protocols to help us stay ahead of possible severe risks from powerful frontier AI models. Since then, we’ve collaborated with experts in industry, academia, and government to deepen our understanding of the risks, the empirical evaluations to test for them, and the mitigations we can apply. We have also implemented the Framework in our safety and governance processes for evaluating frontier models such as Gemini 2.0. As a result of this work, today we are publishing an updated Frontier Safety Framework.

Key updates to the framework include:

● Security Level recommendations for our Critical Capability Levels (CCLs), helping to identify where the strongest efforts to curb exfiltration risk are needed
● Implementing a more consistent procedure for how we apply deployment mitigations
● Outlining an industry-leading approach to deceptive alignment risk

Recommendations for Heightened Security

Security mitigations help prevent unauthorized actors from exfiltrating model weights. This is especially important because access to model weights allows removal of most safeguards. Given the stakes involved as we look ahead to increasingly powerful AI, getting this wrong could have serious implications for safety and security. Our initial Framework recognised the need for a tiered approach to security, allowing for the implementation of mitigations with varying strengths to be tailored to the risk. This proportionate approach also ensures we get the balance right between mitigating risks and fostering access and innovation.

Since then, we have drawn on wider research to evolve these security mitigation levels and recommend a level for each of our CCLs.* These recommendations reflect our assessment of the minimum appropriate level of security the field of frontier AI should apply to such models at a CCL. This mapping process helps us isolate where the strongest mitigations are needed to curtail the greatest risk. In practice, some aspects of our security practices may exceed the baseline levels recommended here due to our strong overall security posture.

This second version of the Framework recommends particularly high security levels for CCLs within the domain of machine learning research and development (R&D). We believe it will be important for frontier AI developers to have strong security for future scenarios when their models can significantly accelerate and/or automate AI development itself. This is because the uncontrolled proliferation of such capabilities could significantly challenge society’s ability to carefully manage and adapt to the rapid pace of AI development.

Ensuring the continued security of cutting-edge AI systems is a shared global challenge – and a shared responsibility of all leading developers.
Importantly, getting this right is a collective-action problem: the social value of any single actor’s security mitigations will be significantly reduced if not broadly applied across the field. Building the kind of security capabilities we believe may be needed will take time – so it’s vital that all frontier AI developers work collectively towards heightened security measures and accelerate efforts towards common industry standards.

Deployment Mitigations Procedure

We also outline deployment mitigations in the Framework that focus on preventing the misuse of critical capabilities in systems we deploy. We’ve updated our deployment mitigation approach to apply a more rigorous safety mitigation process to models reaching a CCL in a misuse risk domain.

The updated approach involves the following steps: first, we prepare a set of mitigations by iterating on a set of safeguards. As we do so, we will also develop a safety case, which is an assessable argument showing how severe risks associated with a model’s CCLs have been minimised to an acceptable level. The appropriate corporate governance body then reviews the safety case, with general availability deployment occurring only if it is approved. Finally, we continue to review and update the safeguards and safety case after deployment. We’ve made this change because we believe that all critical capabilities warrant this thorough mitigation process.

Approach to Deceptive Alignment Risk

The first iteration of the Framework primarily focused on misuse risk (i.e., the risks of threat actors using critical capabilities of deployed or exfiltrated models to cause harm). Building on this, we’ve taken an industry-leading approach to proactively addressing the risks of deceptive alignment, i.e. the risk of an autonomous system deliberately undermining human control.

An initial approach to this question focuses on detecting when models might develop a baseline instrumental reasoning ability letting them undermine human control unless safeguards are in place. To mitigate this, we explore automated monitoring to detect illicit use of instrumental reasoning capabilities. We don’t expect automated monitoring to remain sufficient in the long term if models reach even stronger levels of instrumental reasoning, so we’re actively undertaking – and strongly encouraging – further research developing mitigation approaches for these scenarios.
While we don’t yet know how likely such capabilities are to arise, we think it is important that the field prepares for the possibility.

Conclusion

We will continue to review and develop the Framework over time, guided by our AI Principles, which further outline our commitment to responsible development. As a part of our efforts, we’ll continue to work collaboratively with partners across society. For instance, if we assess that a model has reached a CCL that poses an unmitigated and material risk to overall public safety, we aim to share information with appropriate government authorities where it will facilitate the development of safe AI. Additionally, the latest Framework outlines a number of potential areas for further research – areas where we look forward to collaborating with the research community, other companies, and government.

We believe an open, iterative, and collaborative approach will help to establish common standards and best practices for evaluating the safety of future AI models while securing their benefits for humanity. The Seoul Frontier AI Safety Commitments marked an important step towards this collective effort – and we hope our updated Frontier Safety Framework contributes further to that progress. As we look ahead to AGI, getting this right will mean tackling very consequential questions – such as the right capability thresholds and mitigations – ones that will require the input of broader society, including governments.

Read More »

Four Chinese AI startups to watch beyond DeepSeek

The meteoric rise of DeepSeek—the Chinese AI startup now challenging global giants—has stunned observers and put the spotlight on China’s AI sector. Since ChatGPT’s debut in 2022, the country’s tech ecosystem has been in relentless pursuit of homegrown alternatives, giving rise to a wave of startups and billion-dollar bets. Today, the race is dominated by tech titans like Alibaba and ByteDance, alongside well-funded rivals backed by heavyweight investors. But two years into China’s generative AI boom, we are seeing a shift: Smaller innovators have to carve out their own niches or risk missing out. What began as a sprint has become a high-stakes marathon—China’s AI ambitions have never been higher. An elite group of companies known as the “Six Tigers”—Stepfun, Zhipu, Minimax, Moonshot, 01.AI, and Baichuan—are generally considered to be at the forefront of China’s AI sector. But alongside them, research-focused firms like DeepSeek and ModelBest continue to grow in influence. Some, such as Minimax and Moonshot, are giving up on costly foundational model training to home in on building consumer-facing applications on top of others’ models. Others, like Stepfun and Infinigence AI, are doubling down on research, driven in part by US semiconductor restrictions. We have identified these four Chinese AI companies as the ones to watch.
Stepfun

Founded in April 2023 by former Microsoft senior vice president Jiang Daxin, Stepfun emerged relatively late onto the AI startup scene, but it has quickly become a contender thanks to its portfolio of foundational models. It is also committed to building artificial general intelligence (AGI), a mission a lot of Chinese startups have given up on. With backing from investors like Tencent and funding from Shanghai’s government, the firm released 11 foundational AI models last year—spanning language, visual, video, audio, and multimodal systems. Its biggest language model so far, Step-2, has over 1 trillion parameters (GPT-4 has about 1.8 trillion). It is currently ranked behind only ChatGPT, DeepSeek, Claude, and Gemini’s models on LiveBench, a third-party benchmark site that evaluates the capabilities of large language models.
Stepfun’s multimodal model, Step-1V, is also highly ranked for its ability to understand visual inputs on Chatbot Arena, a crowdsourced platform where users can compare and rank AI models’ performance. The company is now working with AI application developers, who are building on top of its models. According to Chinese media outlet 36Kr, demand from external developers to use Stepfun’s multimodal API surged over 45-fold in the second half of 2024.

ModelBest

Researchers at the prestigious Tsinghua University founded ModelBest in 2022 in Beijing’s Haidian district. Since then, the company has distinguished itself by leaning into efficiency and embracing the trend of small language models. Its MiniCPM series—often dubbed “Little Powerhouses” in Chinese—is engineered for on-device, real-time processing on smartphones, PCs, automotive systems, smart home devices, and even robots. Its pitch to customers is that this combination of smaller models and local data processing cuts costs and enhances privacy. ModelBest’s newest model, MiniCPM 3.0, has only 4 billion parameters but matches the performance of GPT-3.5 on various benchmarks. On GitHub and Hugging Face, the company’s models can be found under the profile of OpenBMB (Open Lab for Big Model Base), its open-source research lab. Investors have taken note: In December 2024, the company announced a new, third round of funding worth tens of millions of dollars.

Zhipu

Also originating at Tsinghua University, Zhipu AI has grown into a company with strong ties to government and academia. The firm is developing foundational models as well as AI products based on them, including ChatGLM, a conversational model, and a video generator called Ying, which is akin to OpenAI’s Sora system. GLM-4-Plus, the company’s most advanced large language model to date, is trained on high-quality synthetic data, which reduces training costs, but has still matched the performance of GPT-4.
The company has also developed GLM-4V-Plus, a vision model capable of interpreting web pages and videos, which represents a step toward AI with more “agentic” capabilities. Among the cohort of new Chinese AI startups, Zhipu is the first to get on the US government’s radar. On January 15, the Biden administration revised its export control regulations, adding over 20 Chinese entities—including 10 subsidiaries of Zhipu AI—to its restricted trade list, barring them from receiving US goods or technology on national-interest grounds. The US claims Zhipu’s technology is helping China’s military, which the company denies.

Valued at over $2 billion, Zhipu is currently one of the biggest AI startups in China and is reportedly planning an IPO soon. The company’s investors include Beijing city government-affiliated funds and various prestigious VCs. Infinigence AI Founded in 2023, Infinigence AI is smaller than other companies on this list, though it has still attracted $140 million in funding so far. The company focuses on infrastructure instead of model development. Its main selling point is its ability to successfully combine chips from many different vendors to execute AI tasks, forming what’s dubbed a “heterogeneous computing cluster.” This is a unique challenge Chinese AI companies face due to US chip sanctions. Infinigence AI claims its system could increase the effectiveness of AI training by streamlining how different chip architectures—including various models from AMD, Huawei, and Nvidia—work in concert. In addition, Infinigence AI has launched its Infini-AI cloud platform, which combines multiple vendors’ products to develop and deploy models. The company says it wants to build an effective compute utilization solution “with Chinese characteristics,” native to AI training. It claims that its training system HetHub could reduce AI model training time by 30% by optimizing the heterogeneous computing clusters Chinese companies often have. Honorable mentions Baichuan While many of its competitors chase scale and expansive application ranges, Baichuan AI, founded by industry veteran Wang Xiaochuan (the founder of Sogou) in April 2023, is focused on the domestic Chinese market, targeting sectors like medical assistance and health care. With a valuation of over $2 billion after its newest round of fundraising, Baichuan is currently among the biggest AI startups in China. Minimax Founded by AI veteran Yan Junjie, Minimax is best known for its product Talkie, a companion chatbot available around the world.
The platform provides various characters users can chat with for emotional support or entertainment, and it had even more downloads last year than leading competitor chatbot platform Character.ai.  Chinese media outlet 36Kr reported that Minimax’s revenue in 2024 was around $70 million, making it one of the most successful consumer-facing Chinese AI startups in the global market. 
Moonshot Moonshot is best known for building Kimi, the second-most-popular AI chatbot in China, just after ByteDance’s Doubao, with over 13 million users. Released in 2023, Kimi supports input lengths of over 200,000 characters, making it a popular choice among students, white-collar workers, and others who routinely have to work with long chunks of text. Founded by Yang Zhilin, a renowned AI researcher who studied at Tsinghua University and Carnegie Mellon University, Moonshot is backed by big tech companies, including Alibaba, and top venture capital firms. The company is valued at around $3 billion but is reportedly scaling back on its foundational model research as well as overseas product development plans, as key people leave the company.

Read More »

Adobe Acrobat AI now reads and explains your contracts in minutes — here’s how it works

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More Adobe is expanding its push into artificial intelligence with new features that aim to demystify complex contracts and legal documents for both businesses and consumers, as the company seeks to maintain its dominant position in the document management market. The software giant announced today that its Acrobat AI Assistant can now automatically detect contracts, summarize key terms, and compare differences across multiple versions — capabilities that Adobe says could help address a widespread problem: Most people don’t fully read agreements before signing them. According to Adobe’s research, nearly 70% of consumers have signed contracts without understanding all the terms. The problem extends into the business world, where 64% of small business owners have avoided signing contracts due to uncertainty about the content. “Control F is dead,” said Lori DeFurio, a product executive at Adobe, referring to the traditional way people search documents. “Why would I ever search again when I can just ask?” The shift from keyword searching to conversational AI reflects Adobe’s broader vision for making complex documents more accessible to everyone. Credit: Adobe How Adobe’s AI actually reads your contracts The new features represent a significant enhancement to Adobe’s AI capabilities, but notably stop short of providing legal advice. Instead, the system acts more like an intelligent research assistant, helping users locate and understand important contract terms while providing clear citations back to source material. “This is not a replacement for legal advice,” emphasized Michi Alexander, vice president of product marketing at Adobe, in an exclusive interview with VentureBeat. 
“This is just to help you understand as a starting point your contracts and then where you potentially might want to ask questions.” The technology works by analyzing contract text

Read More »

ML Feature Management: A Practical Evolution Guide

In the world of machine learning, we obsess over model architectures, training pipelines, and hyper-parameter tuning, yet often overlook a fundamental aspect: how our features live and breathe throughout their lifecycle. From in-memory calculations that vanish after each prediction to the challenge of reproducing exact feature values months later, the way we handle features can make or break our ML systems’ reliability and scalability.

Who Should Read This

Starting Point: The invisible approach

Many ML teams, especially those in their early stages or without dedicated ML engineers, start with what I call “the invisible approach” to feature engineering. It’s deceptively simple: fetch raw data, transform it in-memory, and create features on the fly. The resulting dataset, while functional, is essentially a black box of short-lived calculations — features that exist only for a moment before vanishing after each prediction or training run. While this approach might seem to get the job done, it’s built on shaky ground. As teams scale their ML operations, models that performed brilliantly in testing suddenly behave unpredictably in production. Features that worked perfectly during training mysteriously produce different values in live inference. When stakeholders ask why a specific prediction was made last month, teams find themselves unable to reconstruct the exact feature values that led to that decision.

Core Challenges in Feature Engineering

These pain points aren’t unique to any single team; they represent fundamental challenges that every growing ML team eventually faces.

Evolution of Solutions

Approach 1: On-Demand Feature Generation

The simplest solution starts where many ML teams begin: creating features on demand for immediate use in prediction. Raw data flows through transformations to generate features, which are used for inference, and only then — after predictions are already made — are these features typically saved to parquet files.
While this method is straightforward, with teams often choosing parquet files because
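The on-demand flow the teaser describes can be sketched in a few lines. All function, column, and file names below are illustrative (they do not come from the article); the point is only the ordering: features are computed in-memory, predictions are made, and the features are persisted to parquet afterwards.

```python
import pandas as pd


def build_features(raw: pd.DataFrame) -> pd.DataFrame:
    """Transform raw data into model features in-memory, on the fly.

    Nothing is persisted here -- the features exist only for this call.
    """
    features = pd.DataFrame(index=raw.index)
    # Example transformations; real pipelines would have many more.
    features["spend_7d_avg"] = raw["daily_spend"].rolling(7, min_periods=1).mean()
    features["is_weekend"] = raw["day_of_week"].isin([5, 6]).astype(int)
    return features


def predict_and_snapshot(raw: pd.DataFrame, model, path: str = "features.parquet"):
    """Generate features on demand, predict, and only then save a snapshot."""
    features = build_features(raw)
    preds = model.predict(features)
    # Features are written only AFTER predictions are already made,
    # which is exactly the weakness of this approach: the saved snapshot
    # is a by-product, not a source of truth for reproducing predictions.
    # (to_parquet requires pyarrow or fastparquet to be installed.)
    features.to_parquet(path)
    return preds
```

A consequence of this ordering is that if `build_features` changes between training and a later audit, the parquet snapshot is the only record of what the model actually saw.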

Read More »

UK gas policies could lead to increased emissions from imports, report finds

Reduced North Sea investment as a result of UK government policy could lead to millions of tonnes of additional CO2 emissions from imports, according to a report. Edinburgh-based financial analyst firm Gneiss Energy modelled the potential impact of UK government policies relating to the country’s offshore oil and gas sector. Since taking office last year, the UK Labour government has increased the windfall tax on North Sea oil and gas firms and pledged not to issue new exploration licences. These policies, combined with a recent court ruling focused on downstream emissions, have raised concerns about the impact on future investment in the North Sea. Without new projects and further investment, domestic UK gas production in the maturing North Sea basin is likely to decrease significantly over the next decade, leading to a greater reliance on imports. According to the North Sea Transition Authority (NSTA), the carbon intensity of UK gas production is almost four times lower than that of liquefied natural gas (LNG) imports. UK gas and LNG emissions The Gneiss Energy report estimated the emissions impact of reduced UK production combined with increased LNG imports into the 2030s. It modelled scenarios ranging from total reliance on UK domestic production to a ‘sunset’ scenario which assumes a 50% reduction in gas supplies from the UK continental shelf (UKCS). In the sunset scenario, UK gas demand is met with increased supply balanced between Norway and LNG imports from countries like the United States, Qatar and Algeria. Gneiss estimates this would lead to 13% higher pre-consumption CO2 emissions compared to the current UK energy mix. The Grain LNG terminal on the Isle of Grain. Credit: National Grid. By contrast, if the UK were to achieve 100% domestic gas supply it would lead to

Read More »

Oil Prices Slip Amid USA-China Trade War Fears

Oil edged lower as concerns that a trade war between the US and China would hurt global growth outweighed the announcement of reinforced sanctions on Iran. West Texas Intermediate fell 0.6% to settle below $73 a barrel, recovering from an earlier decline of as much as 3.4% that was driven by the prospect that dueling tariffs between the US and China would reduce energy demand. Oil pared losses after Trump signed a directive that aims to ramp up economic pressure on Iran. Over the past four years, sanctions evasion and more relaxed US enforcement have allowed Iran to boost oil exports by about 1 million barrels a day. Enforcing sanctions may slash the country’s exports by about two-thirds and could cost Iran roughly $30 billion a year. Earlier on Tuesday, China announced retaliatory measures against Trump’s tariffs, temporarily sending futures below where they ended 2024 for the first time this year. Speaking to reporters Tuesday afternoon, Trump said it was “fine” that China responded in kind with tariffs on some American products. China will place levies on a range of US goods, including oil and liquefied natural gas, in response to Washington’s “unilateral imposition of tariffs,” the country’s finance ministry said. The US shipped about 250,000 barrels of crude a day to China on average last year, a relatively small volume. But an escalation of trade disputes between the world’s two largest economies could have a broader impact and hurt global consumption. The trade confrontation with China stands in contrast to Trump’s agreement to push back planned levies on Canada and Mexico by a month after the nations agreed to take tougher measures to combat migration and drug trafficking. The flare-up came as China’s markets were shut for the Lunar New Year holidays. Oil futures have faced a bumpy few

Read More »

Adnoc to Shift USA Assets to New Global Unit XRG

Abu Dhabi’s main energy company will transfer stakes in some US assets to its new international investment firm, XRG PJSC, part of a push to create an $80 billion global venture. Abu Dhabi National Oil Co.’s holdings in a hydrogen venture with Exxon Mobil Corp. and NextDecade Corp.’s liquefied natural gas export project will move to XRG, according to Chief Executive Officer Sultan Al Jaber. XRG aims to invest in technologies that power the energy transition and the shift to advanced technologies, Al Jaber wrote in a column on the news website Semafor. Adnoc in September agreed to take a 35% stake in Exxon’s proposed hydrogen project in Baytown, Texas, which is likely to be the world’s biggest once built. That was its second acquisition in the US after Adnoc in May agreed to buy a stake in NextDecade’s LNG project, also in Texas. Adnoc announced its plan for XRG last year, saying the company will have an enterprise value of over $80 billion and will aim to more than double the value of its assets over the next decade. Adnoc plans to start the unit in the first quarter. The UAE company has already said XRG will control Covestro AG following a roughly $13 billion deal to buy the German chemical producer. XRG will also hold Abu Dhabi’s share of a joint venture with BP Plc for gas projects in the Mediterranean region, including in Egypt. The UAE is pursuing expansion in chemicals and gas to feed future energy demand and supply plastics for consumer products amid the energy transition. Through Adnoc, it is expanding its energy business beyond oil to focus on gas trading and chemical derivatives like plastics, on the expectation that demand for these products will benefit from the energy transition. Over the weekend, Adnoc said it was in

Read More »

AMD’s Q4 revenue hits $7.66B, up 24% but stock falls

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More Advanced Micro Devices reported revenue of $7.658 billion for the fourth quarter, up 24% from a year ago. But AMD’s stock price fell in after-hours trading to $113.66, down 4.89%. Net income was $482 million for the fourth quarter ended December 31, down 28% from $667 million a year earlier. AMD saw data center segment revenue of $3.9 billion in the quarter, up 69% from a year ago, driven by record Epyc CPU and Instinct GPU revenues, with demand fueled by the rapid adoption of AI. For 2024, data center segment revenue was a record $12.6 billion, an increase of 94% compared to the prior year, driven by growth in both AMD Instinct and Epyc processors. Exiting 2024, there are more than 450 Epyc platforms available from the leading server OEMs and ODMs, including more than 120 “Turin” platforms that went into production in the fourth quarter from Cisco, Dell, HPE, Lenovo, Supermicro and others. “2024 was an outstanding year as we accelerated our AI hardware roadmap to deliver an annual cadence of new Instinct accelerators, expanded our ROCm software suite with significant uplifts in inferencing and training performance, built strong customer relationships with key industry leaders and delivered greater than $5 billion of data center AI revenue for the year,” said Lisa Su, CEO of AMD. In cloud, AMD exited 2024 with well over 50% share at the majority of the largest hyperscale customers. “Hyperscaler demand for Epyc CPUs was very strong, driven by expanded deployments powering both their internal compute infrastructure and online services,” Su said. “Public cloud demand was also very strong with the number of Epyc instances increasing 27%

Read More »

OmniHuman: ByteDance’s new AI creates realistic videos from a single photo

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More ByteDance researchers have developed an artificial intelligence system that transforms single photographs into realistic videos of people speaking, singing and moving naturally — a breakthrough that could reshape digital entertainment and communications. The new system, called OmniHuman, generates full-body videos showing people gesturing and moving in ways that match their speech, surpassing previous AI models that could only animate faces or upper bodies. How OmniHuman Uses 18,700 Hours of Training Data to Create Realistic Motion “End-to-end human animation has undergone notable advancements in recent years. However, existing methods still struggle to scale up as large general video generation models, limiting their potential in real applications,” the researchers wrote in a paper published on arXiv. The team trained OmniHuman on more than 18,700 hours of human video data using a novel approach that combines multiple types of inputs — text, audio, and body movements. This “omni-conditions” training strategy allows the AI to learn from much larger and more diverse datasets than previous methods. Credit: ByteDance AI video generation breakthrough shows full-body movement and natural gestures “Our key insight is that incorporating multiple conditioning signals, such as text, audio, and pose, during training can significantly reduce data wastage,” the research team explained. The technology marks a significant advance in AI-generated media, demonstrating capabilities from creating videos of people delivering speeches to showing subjects playing musical instruments. In testing, OmniHuman outperformed existing systems across multiple quality benchmarks. 
Credit: ByteDance Tech giants race to develop next-generation video AI systems The development emerges amid intensifying competition in AI video generation, with companies like Google, Meta, and Microsoft pursuing similar technology. ByteDance’s breakthrough could give the TikTok parent company an advantage in this rapidly evolving field. Industry

Read More »

Stay Ahead with the Paperboy Newsletter

Your weekly dose of insights into AI, Bitcoin mining, Datacenters and Energy industry news. Spend 3-5 minutes and catch up on a week of news.

Smarter with ONMINE

Streamline Your Growth with ONMINE