Your Gateway to Power, Energy, Datacenters, Bitcoin and AI

Dive into the latest industry updates, our exclusive Paperboy Newsletter, and curated insights designed to keep you informed. Stay ahead with minimal time spent.

Discover What Matters Most to You

Explore ONMINE’s curated content, from our Paperboy Newsletter to industry-specific insights tailored for energy, Bitcoin mining, and AI professionals.

AI

Bitcoin

Datacenter

Energy

Featured Articles

The key to renewable energy development is building local trust

Arun Muthukrishnan is senior manager of development at Arevon Energy.

After developing over 1 GW of utility-scale solar and storage projects across diverse terrains and jurisdictions in the U.S., I’ve come to realize that the real complexity of this industry rarely lies in photovoltaic modules or grid studies. It lies in people. Coordinating with city officials, utility engineers, fire marshals and local communities isn’t just part of the job. It is the job. One of my earliest lessons came from a mid-sized city in Southern California where we thought we had a slam-dunk project. The land was zoned appropriately, the interconnection looked promising and we had a clean environmental review. What we didn’t anticipate was the skepticism from the city council. They had been burned in the past by developers who promised jobs and community benefits but later disappeared. So before we even reached the planning commission hearing, we met one-on-one with local leaders, showed up at community events and clearly outlined how the project aligned with the city’s climate goals. That work paid off. Not only did the council approve the project unanimously, but they became advocates, mentioning it in newsletters and media interviews. The biggest lesson? Cities aren’t obstacles. They’re potential champions if you engage them early and honestly. Utility coordination is also often misunderstood by developers as a check-box process. But utilities aren’t just black box institutions processing queue positions. They’re risk-averse entities tasked with maintaining grid reliability. On one Electric Reliability Council of Texas project, the hosting capacity maps painted a rosy picture. But thanks to a weekly call we initiated with the utility’s transmission planning team, we learned about a transformer nearing overload that wouldn’t show up in the system impact study for months. That visibility allowed us to move our point of interconnection slightly

Read More »

Governors seek more influence over PJM amid ‘crisis of confidence’

Facing an “unprecedented crisis of confidence,” the PJM Interconnection needs fundamental change and new leadership, according to nine governors representing the majority of electric customers in the grid operator’s footprint. “At a time of rapidly rising load growth, PJM’s multi-year inability to efficiently connect new resources to its grid and to engage in effective long-term transmission planning has deprived our states of thousands of jobs and billions of dollars in investment that may flow to other regions,” the governors said to PJM’s board in a letter released Thursday. The letter from the governors comes about a year after total capacity costs in PJM’s last capacity auction jumped to $14.7 billion from $2.2 billion in the previous auction. The increase led to potential electric bill hikes in some states in the 10% to 20% range. Some states took steps to ease those bill increases. PJM is set to release the results of its most recent capacity auction on Tuesday. Increasingly, states are considering leaving PJM, the governors said. The letter was signed by governors from Delaware, Illinois, Kentucky, Maryland, Michigan, New Jersey, Pennsylvania, Tennessee and Virginia. “We are deeply concerned that PJM’s response has been typified by halting, inconsistent steps and rising internal conflicts within the stakeholder community that have recently culminated in the abrupt termination of two long-standing members of the Board of Managers and the imminent departure of the CEO,” the governors said. Manu Asthana, PJM president and CEO, on April 14 said he plans to step down from his job at the end of this year. He joined PJM in January 2020. PJM is seeking a replacement. Also, at a May 12 Members Committee meeting, two incumbent board nominees — Chairman Mark Takahashi and Terry Blackwell — failed to receive enough votes to be reelected to three-year terms, according

Read More »

Bureau of Land Management Approves Geothermal Facility

In a release posted on its website this week, the U.S. Bureau of Land Management announced that it had approved the 30-megawatt Crescent Valley geothermal energy production facility and associated transmission line. The project includes construction and operation of one power plant, a photovoltaic solar field, 17 additional geothermal fluid production and injection wells and well pads, new and improved access roads, an aggregate pit, geothermal fluid pipelines, an electrical gen-tie line, substation, switching station, and ancillary support facilities, the Bureau of Land Management noted in the release. “Geothermal projects support domestic energy production and American energy independence, while contributing to the nation’s economy and security,” the Bureau stated in the release. “Consistent with Executive Order 14154, ‘Unleashing American Energy’, the geothermal projects on public lands help meet the energy needs of U.S. citizens, will solidify the nation as a global energy leader long into the future, and achieve American Energy Dominance,” it added. In the release, the Bureau of Land Management highlighted that, “according to the U.S. Environmental Protection Agency, one megawatt produced by a geothermal project can power about 1,104 average American homes’ electricity use per year”. The Bureau stated in the release that geothermal “is an abundant resource, especially in the West, where the BLM has authority to manage geothermal resource leasing, exploration, and development on approximately 245 million surface acres of public lands and the 700 million acres where the United States owns the subsurface mineral estate”. In a separate release posted on its site on June 27, the Bureau of Land Management announced that it had approved three geothermal energy projects under an expedited timeline in Nevada, which it said support the administration’s goals for energy development on public lands. These comprise the Diamond Flat Geothermal Project, the McGinness Hills Geothermal Optimization Project, and the Pinto Geothermal Project, the release highlighted.
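For a rough sense of the arithmetic behind that homes-powered figure, here is a minimal Python sketch. The 90% capacity factor is an assumption typical of geothermal plants, not a number from the BLM release, so the implied per-home usage is illustrative only:

```python
# Back-of-envelope look at the EPA-derived "1 MW powers ~1,104 homes" figure.
# The 90% capacity factor is an assumption, not a number from the release.

CAPACITY_MW = 1.0
HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 0.90   # assumed; geothermal plants typically run near-continuously
HOMES_POWERED = 1104     # EPA figure cited in the release

annual_mwh = CAPACITY_MW * HOURS_PER_YEAR * CAPACITY_FACTOR
implied_mwh_per_home = annual_mwh / HOMES_POWERED

print(f"Annual output at 90% capacity factor: {annual_mwh:,.0f} MWh")
print(f"Implied consumption: {implied_mwh_per_home:.1f} MWh per home per year")
```

Under these assumptions the figure implies roughly 7 MWh per household per year, somewhat below the commonly cited U.S. average of about 10.5 MWh, so the EPA estimate presumably rests on a different capacity factor or household baseline.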

Read More »

US DOE Launches Nuclear Fuel Program to Support Advanced Reactor Testing

The United States Department of Energy (DOE) has opened applications for producing nuclear fuel to support the development of advanced reactors. The Fuel Line Pilot Program seeks U.S. companies to build and operate production facilities outside of national laboratories under the DOE authorization process. It supports the Reactor Pilot Program announced last month. “This initiative will help end America’s reliance on foreign sources of enriched uranium and critical materials, while opening the door for private sector investment in America’s nuclear renaissance”, the DOE said in a statement online. “The program leverages the DOE authorization process to build and operate nuclear fuel production lines to serve for research, development, and demonstration purposes and to provide a fast-tracked approach to commercial licensing”, the program webpage says. The first applications are due August 15. The DOE expects to announce initial selections a month thereafter. The DOE may open subsequent applications. “Applicants will be responsible for all costs associated with the construction, operation, and decommissioning of an advanced nuclear fuel line, as well as the procurement of all nuclear material feedstock”, the DOE statement said. “The selections will be based on a set of criteria, including technological readiness, established fuel fabrication plans, and financial viability. “While the advanced nuclear fuel lines will serve for research, development, and demonstration purposes, seeking DOE authorization of the facilities can help unlock private funding and provide a fast-tracked approach to enable future commercial licensing activities for potential applicants”. “America has the resources and the expertise to lead the world in nuclear energy development, but we need secure domestic supply chains to fuel this rapidly growing energy source and achieve a true nuclear energy renaissance”, commented Energy Secretary Chris Wright. “The Trump Administration is accelerating innovation, not regulation, and leveraging partnerships with the private sector to safely fuel and

Read More »

Six Companies Join New York Offshore Wind Innovation Hub Accelerator

The New York University (NYU) Tandon School of Engineering, in partnership with Equinor ASA, National Offshore Wind R&D Consortium (NOWRDC), and New York City Economic Development Corp. (NYCEDC), has selected six companies to receive support for the development of ideas and advancement of offshore wind’s potential in New York under the Offshore Wind Innovation Hub. The Offshore Wind Innovation Hub, led by Equinor, reviewed a pool of 53 applicants. In a media release, the Hub partners said the six companies have been selected based on the novelty and potential of their solutions to join the 2025 Accelerator Cohort. The six companies are Anemo Robotics, Kalypso Offshore Energy, MESPAC, Orpheus Ocean, Reblade, and Werover. Among the focus areas was identifying innovations that can contribute to efficiencies in turbine maintenance and improved marine life monitoring, the Hub partners said. “We’re excited to join the Offshore Wind Innovation Hub program as it represents a significant opportunity for us to explore the U.S. offshore wind market. We look forward to gaining valuable insights and mentorship from the distinguished companies and experts involved in the program”, Balca Yılmaz, CEO and Co-Founder of Werover, said. The winners will participate in a six-month mentoring and business development program that aims to prepare them for strategic partnerships with offshore wind developers, suppliers, and the wider industry. The program helps innovators overcome barriers and commercialize solutions in New York and beyond, the Hub partners said. Building on two years of success, the 2025 Accelerator extends the progress made by the 2024 cohort in business, product development, fundraising, hiring, and piloting. Notable achievements include Triton Anchor’s $5.7M fundraise, Claviate’s contract with Siemens to manage turbines, and Pliant Energy’s two pilot projects in New York City waters with NYCEDC, according to the Hub partners. The Offshore Wind Innovation Hub is based

Read More »

A major AI training data set contains millions of examples of personal data

Millions of images of passports, credit cards, birth certificates, and other documents containing personally identifiable information are likely included in one of the biggest open-source AI training sets, new research has found. Thousands of images—including identifiable faces—were found in a small subset of DataComp CommonPool, a major AI training set for image generation scraped from the web. Because the researchers audited just 0.1% of CommonPool’s data, they estimate that the real number of images containing personally identifiable information, including faces and identity documents, is in the hundreds of millions. The study that details the breach was published on arXiv earlier this month. The bottom line, says William Agnew, a postdoctoral fellow in AI ethics at Carnegie Mellon University and one of the coauthors, is that “anything you put online can [be] and probably has been scraped.” The researchers found thousands of instances of validated identity documents—including images of credit cards, driver’s licenses, passports, and birth certificates—as well as over 800 validated job application documents (including résumés and cover letters), which were confirmed through LinkedIn and other web searches as being associated with real people. (In many more cases, the researchers did not have time to validate the documents or were unable to because of issues like image clarity.) 
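To see why a 0.1% audit can imply hundreds of millions of affected images, a naive proportional scale-up is enough. The sketch below is illustrative only: the 12.8 billion total and the audit fraction come from the article, but the per-sample counts are placeholders, and the paper’s own estimates (such as the missed-faces figure cited later) use their own methodology.

```python
# Naive proportional scale-up from an audited sample to the full data set.
# The 12.8 billion total and the ~0.1% audit fraction come from the article;
# the per-sample counts below are placeholders, not the paper's numbers.

TOTAL_SAMPLES = 12_800_000_000   # CommonPool image-text pairs
SAMPLE_FRACTION = 0.001          # the researchers audited about 0.1%

def scale_up(count_in_sample: int, sample_fraction: float = SAMPLE_FRACTION) -> int:
    """Extrapolate a count seen in a uniform random sample to the whole set."""
    return round(count_in_sample / sample_fraction)

print(f"Audited subset: ~{TOTAL_SAMPLES * SAMPLE_FRACTION:,.0f} of {TOTAL_SAMPLES:,} samples")
for found_in_sample in (1_000, 10_000, 100_000):
    print(f"{found_in_sample:>7,} PII hits in the sample -> ~{scale_up(found_in_sample):,} overall")
```

At those rates, even a few thousand confirmed hits in the audited slice imply millions of instances across the full data set, which is the intuition behind the “hundreds of millions” estimate.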
A number of the résumés disclosed sensitive information including disability status, the results of background checks, birth dates and birthplaces of dependents, and race. When résumés were linked to people with online presences, researchers also found contact information, government identifiers, sociodemographic information, face photographs, home addresses, and the contact information of other people (like references).

[Image: Examples of identity-related documents found in CommonPool’s small-scale data set, showing a credit card, a Social Security number, and a driver’s license. All personal information has been replaced, text has been paraphrased, and faces have been redacted. Courtesy of the researchers.]

When it was released in 2023, DataComp CommonPool, with its 12.8 billion data samples, was the largest existing data set of publicly available image-text pairs, which are often used to train generative text-to-image models. While its curators said that CommonPool was intended for academic research, its license does not prohibit commercial use either.

CommonPool was created as a follow-up to the LAION-5B data set, which was used to train models including Stable Diffusion and Midjourney. It draws on the same data source: web scraping done by the nonprofit Common Crawl between 2014 and 2022. While commercial models often do not disclose what data sets they are trained on, the shared data sources of DataComp CommonPool and LAION-5B mean that the data sets are similar, and that the same personally identifiable information likely appears in LAION-5B, as well as in other downstream models trained on CommonPool data. CommonPool researchers did not respond to emailed questions. And since DataComp CommonPool has been downloaded more than 2 million times over the past two years, it is likely that “there [are] many downstream models that are all trained on this exact data set,” says Rachel Hong, a PhD student in computer science at the University of Washington and the paper’s lead author. Those models would carry similar privacy risks.

Good intentions are not enough

“You can assume that any large scale web-scraped data always contains content that shouldn’t be there,” says Abeba Birhane, a cognitive scientist and tech ethicist who leads Trinity College Dublin’s AI Accountability Lab—whether it’s personally identifiable information (PII), child sexual abuse imagery, or hate speech (which Birhane’s own research into LAION-5B has found). Indeed, the curators of DataComp CommonPool were themselves aware it was likely that PII would appear in the data set and did take some measures to preserve privacy, including automatically detecting and blurring faces. But in their limited data set, Hong’s team found and validated over 800 faces that the algorithm had missed, and they estimated that overall, the algorithm had missed 102 million faces in the entire data set. On the other hand, the curators did not apply filters that could have recognized known PII strings, like emails or Social Security numbers. “Filtering is extremely hard to do well,” says Agnew. “They would have had to make very significant advancements in PII detection and removal that they haven’t made public to be able to effectively filter this.”

[Image: Examples of résumé documents and personal disclosures found in CommonPool’s small-scale data set. All personal information has been replaced, text has been paraphrased, and faces have been redacted. Courtesy of the researchers.]

There are other privacy issues that the face blurring doesn’t address. While the face blurring filter is automatically applied, it is optional and can be removed. Additionally, the captions that often accompany the photos, as well as the photos’ metadata, often contain even more personal information, such as names and exact locations. Another privacy mitigation measure comes from Hugging Face, a platform that distributes training data sets and hosts CommonPool, which integrates with a tool that theoretically allows people to search for and remove their own information from a data set. But as the researchers note in their paper, this would require people to know that their data is there to start with.
When asked for comment, Florent Daudens of Hugging Face said that “maximizing the privacy of data subjects across the AI ecosystem takes a multilayered approach, which includes but is not limited to the widget mentioned,” and that the platform is “working with our community of users to move the needle in a more privacy-grounded direction.” 

In any case, just getting your data removed from one data set probably isn’t enough. “Even if someone finds out their data was used in a training data set and … exercises their right to deletion, technically the law is unclear about what that means,” says Tiffany Li, an assistant professor of law at the University of New Hampshire School of Law. “If the organization only deletes data from the training data sets—but does not delete or retrain the already trained model—then the harm will nonetheless be done.” The bottom line, says Agnew, is that “if you web-scrape, you’re going to have private data in there. Even if you filter, you’re still going to have private data in there, just because of the scale of this. And that’s something that we [machine-learning researchers], as a field, really need to grapple with.”

Reconsidering consent

CommonPool was built on web data scraped between 2014 and 2022, meaning that many of the images likely date to before 2020, when ChatGPT was released. So even if it’s theoretically possible that some people consented to having their information publicly available to anyone on the web, they could not have consented to having their data used to train large AI models that did not yet exist. And with web scrapers often scraping data from each other, an image that was originally uploaded by the owner to one specific location would often find its way into other image repositories. “I might upload something onto the internet, and then … a year or so later, [I] want to take it down, but then that [removal] doesn’t necessarily do anything anymore,” says Agnew. The researchers also found numerous examples of children’s personal information, including depictions of birth certificates, passports, and health status, but in contexts suggesting that they had been shared for limited purposes. “It really illuminates the original sin of AI systems built off public data—it’s extractive, misleading, and dangerous to people who have been using the internet with one framework of risk, never assuming it would all be hoovered up by a group trying to create an image generator,” says Ben Winters, the director of AI and privacy at the Consumer Federation of America.

Finding a policy that fits

Ultimately, the paper calls for the machine-learning community to rethink the common practice of indiscriminate web scraping and also lays out the possible violations of current privacy laws represented by the existence of PII in massive machine-learning data sets, as well as the limitations of those laws’ ability to protect privacy. “We have the GDPR in Europe, we have the CCPA in California, but there’s still no federal data protection law in America, which also means that different Americans have different rights protections,” says Marietje Schaake, a Dutch lawmaker turned tech policy expert who currently serves as a fellow at Stanford’s Cyber Policy Center.
Besides, these privacy laws apply only to companies that meet certain criteria for size and other characteristics. They do not necessarily apply to researchers like those who were responsible for creating and curating DataComp CommonPool. And even state laws that do address privacy, like California’s Consumer Privacy Act, have carve-outs for “publicly available” information. Machine-learning researchers have long operated on the principle that if it’s available on the internet, then it is public and no longer private information, but Hong, Agnew, and their colleagues hope that their research challenges this assumption. “What we found is that ‘publicly available’ includes a lot of stuff that a lot of people might consider private—résumés, photos, credit card numbers, various IDs, news stories from when you were a child, your family blog. These are probably not things people want to just be used anywhere, for anything,” says Hong. Hopefully, Schaake says, this research “will raise alarm bells and create change.”

Read More »

BP Sells US Onshore Wind Assets to LS Power

BP PLC has agreed to divest its onshore wind business in the United States to LS Power Development LLC, toward a goal of $3-4 billion in asset sales this year. The sale of BP Wind Energy North America Inc. to New York City-based LS Power consists of 1.3 gigawatts (GW) net capacity from 10 projects in operation. Five of the projects are wholly owned by BP: the 44-megawatt (MW) Flat Ridge I and 470-MW Flat Ridge II in Kansas, the 288-MW Fowler Ridge I and 99-MW Fowler Ridge III in Indiana, and the 25-MW Titan in South Dakota. In each of the other five, BP owns 50 percent: the 21-MW Auwahi in Hawaii, the 248-MW Cedar Creek II in Colorado, the 200-MW Fowler Ridge II in Indiana, the 125-MW Goshen II in Idaho and the 141-MW Mehoopany in Pennsylvania. All 10 projects, which can generate up to 1.7 GW gross, are grid-connected and signed to 15 offtakers, according to a joint statement Friday. To be managed under LS Power’s portfolio company Clearlight Energy, the projects would grow the purchaser’s operating fleet to about 4.3 GW, the statement said. “LS Power will add bp’s US onshore wind business to an existing fleet of renewable, energy storage, flexible gas and renewable fuels assets, which comprise a 21GW operating portfolio and more than 780 miles of high-voltage transmission lines in operation as well as another 350+ miles currently under construction or development”, it said. The parties expect to complete the transaction by year-end, subject to regulatory approvals. The price was not disclosed. Employees will transfer to the new owner. LS Power chief executive Paul Segal said, “We are focused on a holistic approach to advancing American energy infrastructure that includes improving existing energy assets while investing in transformative strategies that make energy more efficient, affordable
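As a quick cross-check of the net and gross capacity figures, the sketch below sums only the per-project MW numbers quoted above (wholly owned projects counted in full, 50%-owned projects at half for the net figure):

```python
# Cross-check of the article's net (~1.3 GW) and gross (~1.7 GW) figures
# for the 10 BP onshore wind projects, using only the MW numbers quoted.

wholly_owned_mw = {
    "Flat Ridge I (KS)": 44, "Flat Ridge II (KS)": 470,
    "Fowler Ridge I (IN)": 288, "Fowler Ridge III (IN)": 99,
    "Titan (SD)": 25,
}
half_owned_mw = {
    "Auwahi (HI)": 21, "Cedar Creek II (CO)": 248,
    "Fowler Ridge II (IN)": 200, "Goshen II (ID)": 125,
    "Mehoopany (PA)": 141,
}

gross_mw = sum(wholly_owned_mw.values()) + sum(half_owned_mw.values())
net_mw = sum(wholly_owned_mw.values()) + 0.5 * sum(half_owned_mw.values())
print(f"Gross: {gross_mw:,} MW (~{gross_mw / 1000:.1f} GW)")   # ~1.7 GW
print(f"Net:   {net_mw:,.0f} MW (~{net_mw / 1000:.1f} GW)")    # ~1.3 GW
```

The totals land on roughly 1,661 MW gross and 1,294 MW net, matching the 1.7 GW gross and 1.3 GW net figures in the announcement.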

Read More »

What Is The Biggest Oil Discovery of All Time?

What is the biggest oil discovery of all time? That’s the question Rigzone asked David Moseley, the Head of Europe Research at Welligence, in a recent interview. Responding to the question, Moseley told Rigzone that “Ghawar in Saudi Arabia is often considered the largest conventional oil discovery globally”. A story published in Saudi Aramco’s Elements Magazine – which was posted on the company’s website in February this year and penned by Saudi Aramco’s Global Communications Specialist at the time, Daniel Bird – stated that “the ANDR-1 wildcat well, which later led to the discovery of the giant Ghawar field, is currently both the longest production run-life and the highest cumulative production well in Saudi Arabia”. The story highlighted that cumulative production at ANDR-1 stood at 160.2 million stock tank barrels. “Drilling of ‘Ain Dar began in 1948, with production starting in 1951 at an extraordinary rate of 15,600 barrels per day (bpd) of ‘dry oil’ – which contains only a small amount of basic sediments,” the story noted. “Although conventional wells are known to start producing a higher volume of water a number of years into commercial production, the dry oil at the ‘Ain Dar well continued to flow for a staggering 49 years, before it produced its first water volumes in 1999,” it added. “Today, despite it being one of our earliest wildcat wells, it continues to deliver 2,800 bpd – some 73 years after production first started at the site – which is possible thanks to the continuous adoption of new, improved extraction technologies,” it continued. “Remarkably, the original well casings are still in place, showcasing the workmanship and quality of materials used by our engineers in the 1940s,” it went on to state. When Rigzone asked Moseley if we are likely to see another discovery of Ghawar’s magnitude
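For a rough consistency check on those ANDR-1 figures, the average daily rate implied by 160.2 million barrels over roughly 73 years of production can be computed directly; this is a back-of-envelope sketch using only numbers quoted above:

```python
# Rough consistency check on the ANDR-1 figures: cumulative production
# vs. the average daily rate implied by ~73 years on production.

CUMULATIVE_STB = 160_200_000     # 160.2 million stock tank barrels
YEARS_ON_PRODUCTION = 73         # production started in 1951

avg_bpd = CUMULATIVE_STB / (YEARS_ON_PRODUCTION * 365.25)
print(f"Implied average rate: ~{avg_bpd:,.0f} bpd")   # ~6,000 bpd
# Plausibly between the initial 15,600 bpd and today's 2,800 bpd.
```

An implied lifetime average of about 6,000 bpd sits comfortably between the well’s initial 15,600 bpd and its current 2,800 bpd, so the quoted figures hang together.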

Read More »

ADNOC to Transfer OMV Stake to Its Global Investment Arm

Abu Dhabi National Oil Co. PJSC (ADNOC) said it intends to transfer its 24.9 percent interest in Austria’s state-backed OMV AG to its global investment unit, XRG PJSC. “This transfer, which is subject to regulatory approvals, is aligned with ADNOC’s strategy to consolidate its international growth investments under XRG”, ADNOC said in a statement online. “ADNOC remains committed to its longstanding partnership with OMV through XRG and reaffirms its support for the company’s continued growth and success”, the statement added. OMV is an integrated oil and gas company focused on Europe. It explores for and develops oil and gas, as well as produces fuels and chemicals. In the United Arab Emirates, it owns a 15 percent share in ADNOC Refining and ADNOC Global Trading, according to OMV. Austrian state holding company Oesterreichische Beteiligungs AG owns 31.5 percent of OMV, while 43.4 percent is in free float. Austria’s Treasury and OMV employees hold 0.2 percent, OMV says on its website. XRG meanwhile was launched late last year to drive the UAE’s expansion in the chemical, low-carbon energy and natural gas markets. ADNOC added, “ADNOC is also progressing with preparation for the proposed establishment of Borouge Group International, which is set to be a top-four global polyolefins producer. ADNOC’s proposed 46.94 percent shareholding in the new entity is expected to be held by XRG upon completion of the transaction, subject to regulatory approvals”. Last March, OMV and ADNOC signed an agreement to consolidate their polyolefin businesses, with ADNOC also agreeing to acquire NOVA Chemicals Corp. to be transferred to the new joint venture (JV). Under the agreement Borealis AG and Borouge PLC will merge to form Borouge Group International. OMV owns 75 percent of Vienna-based Borealis while ADNOC holds the remaining 25 percent. In Abu Dhabi-based Borouge, ADNOC owns 54 percent and

Read More »

EU Envoys Back Revised Oil Price Cap, New Sanctions on Russia

European Union states have approved a fresh sanctions package on Russia over its war against Ukraine, which includes a revised oil price cap and new banking restrictions, after Slovakia lifted its veto. The package, the bloc’s 18th since Moscow’s full-scale invasion, will see about 20 more Russian banks cut off from the international payments system SWIFT and face a full transaction ban, sanctions on the Nord Stream gas pipelines to ensure they aren’t brought back into operation in the future, and restrictions imposed on Russian petroleum refined in third countries. The price cap on Russian oil, which is currently set at $60 per barrel, will be set dynamically at $15 below market rates moving forward. The new mechanism will see the threshold start between $45 and $50 and be automatically revised at least twice a year based on market prices, Bloomberg previously reported. The bloc’s envoys backed the sanctions on Friday morning. The package is set to be approved later Friday at a meeting of EU ministers in Brussels. Other measures include sanctions on dozens more vessels in Russia’s shadow fleet of oil tankers, bringing the total above 400, as well as on several entities and traders that work with the covert fleet; the addition of more goods to existing export lists of restricted items used by Moscow’s war machine; and sanctions on several entities, including in China and elsewhere, that are seen as helping Russia skirt the bloc’s trade and energy restrictions. The package had been held up for weeks by Slovakia as it was seeking relief from an EU plan to phase out Russian fossil fuels. Prime Minister Robert Fico announced on Thursday that he was lifting his country’s veto after accepting guarantees provided by the European Commission. What do you think? We’d love to hear from you, join the
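As a sketch of how the revised mechanism differs from the fixed $60 cap, the snippet below applies the $15-below-market rule to a few hypothetical market prices. The exact reference price and revision procedure are not spelled out in the article, so this is an illustration, not the legal formula:

```python
# Sketch of the revised price-cap mechanism described above: the cap floats
# at $15/bbl below market rates instead of sitting at a fixed $60/bbl.
# The reference-price definition is an assumption; market prices are examples.

DISCOUNT_USD = 15.0

def dynamic_cap(market_price_usd: float) -> float:
    """Cap set dynamically at $15 below the prevailing market price."""
    return market_price_usd - DISCOUNT_USD

for market in (60.0, 62.5, 65.0):
    print(f"Market ${market:.2f}/bbl -> cap ${dynamic_cap(market):.2f}/bbl")
```

At market prices around $60 to $65 per barrel, the rule yields an initial cap in the $45 to $50 band the article mentions, to be revisited at least twice a year.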

Read More »

National Grid, Con Edison urge FERC to adopt gas pipeline reliability requirements

The Federal Energy Regulatory Commission should adopt reliability-related requirements for gas pipeline operators to ensure fuel supplies during cold weather, according to National Grid USA and affiliated utilities Consolidated Edison Co. of New York and Orange and Rockland Utilities. In the wake of power outages in the Southeast and the near collapse of New York City’s gas system during Winter Storm Elliott in December 2022, voluntary efforts to bolster gas pipeline reliability are inadequate, the utilities said in two separate filings on Friday at FERC. The filings were in response to a gas-electric coordination meeting held in November by the Federal-State Current Issues Collaborative between FERC and the National Association of Regulatory Utility Commissioners. National Grid called for FERC to use its authority under the Natural Gas Act to require pipeline reliability reporting, coupled with enforcement mechanisms, and pipeline tariff reforms. “Such data reporting would enable the commission to gain a clearer picture into pipeline reliability and identify any problematic trends in the quality of pipeline service,” National Grid said. “At that point, the commission could consider using its ratemaking, audit, and civil penalty authority preemptively to address such identified concerns before they result in service curtailments.” On pipeline tariff reforms, FERC should develop tougher provisions for force majeure events — an unforeseen occurrence that prevents a contract from being fulfilled — reservation charge crediting, operational flow orders, scheduling and confirmation enhancements, improved real-time coordination, and limits on changes to nomination rankings, National Grid said. FERC should support efforts in New England and New York to create financial incentives for gas-fired generators to enter into winter contracts for imported liquefied natural gas supplies, or other long-term firm contracts with suppliers and pipelines, National Grid said. Con Edison and O&R said they were encouraged by recent efforts such as North American Energy Standard

Read More »

US BOEM Seeks Feedback on Potential Wind Leasing Offshore Guam

The United States Bureau of Ocean Energy Management (BOEM) on Monday issued a Call for Information and Nominations to help it decide on potential leasing areas for wind energy development offshore Guam. The call concerns a contiguous area around the island that comprises about 2.1 million acres. The area’s water depths range from 350 meters (1,148.29 feet) to 2,200 meters (7,217.85 feet), according to a statement on BOEM’s website. Closing April 7, the comment period seeks “relevant information on site conditions, marine resources, and ocean uses near or within the call area”, the BOEM said. “Concurrently, wind energy companies can nominate specific areas they would like to see offered for leasing. “During the call comment period, BOEM will engage with Indigenous Peoples, stakeholder organizations, ocean users, federal agencies, the government of Guam, and other parties to identify conflicts early in the process as BOEM seeks to identify areas where offshore wind development would have the least impact”. The next step would be the identification of specific WEAs, or wind energy areas, in the larger call area. BOEM would then conduct environmental reviews of the WEAs in consultation with different stakeholders. “After completing its environmental reviews and consultations, BOEM may propose one or more competitive lease sales for areas within the WEAs”, the Department of the Interior (DOI) sub-agency said. BOEM Director Elizabeth Klein said, “Responsible offshore wind development off Guam’s coast offers a vital opportunity to expand clean energy, cut carbon emissions, and reduce energy costs for Guam residents”. Late last year the DOI announced the approval of the 2.4-gigawatt (GW) SouthCoast Wind Project, raising the total capacity of federally approved offshore wind power projects to over 19 GW. The project owned by a joint venture between EDP Renewables and ENGIE received a positive Record of Decision, the DOI said in

Read More »

Biden Bars Offshore Oil Drilling in USA Atlantic and Pacific

President Joe Biden is indefinitely blocking offshore oil and gas development in more than 625 million acres of US coastal waters, warning that drilling there is simply “not worth the risks” and “unnecessary” to meet the nation’s energy needs.  Biden’s move is enshrined in a pair of presidential memoranda being issued Monday, burnishing his legacy on conservation and fighting climate change just two weeks before President-elect Donald Trump takes office. Yet unlike other actions Biden has taken to constrain fossil fuel development, this one could be harder for Trump to unwind, since it’s rooted in a 72-year-old provision of federal law that empowers presidents to withdraw US waters from oil and gas leasing without explicitly authorizing revocations.  Biden is ruling out future oil and gas leasing along the US East and West Coasts, the eastern Gulf of Mexico and a sliver of the Northern Bering Sea, an area teeming with seabirds, marine mammals, fish and other wildlife that indigenous people have depended on for millennia. The action doesn’t affect energy development under existing offshore leases, and it won’t prevent the sale of more drilling rights in Alaska’s gas-rich Cook Inlet or the central and western Gulf of Mexico, which together provide about 14% of US oil and gas production.  The president cast the move as achieving a careful balance between conservation and energy security. “It is clear to me that the relatively minimal fossil fuel potential in the areas I am withdrawing do not justify the environmental, public health and economic risks that would come from new leasing and drilling,” Biden said. “We do not need to choose between protecting the environment and growing our economy, or between keeping our ocean healthy, our coastlines resilient and the food they produce secure — and keeping energy prices low.” Some of the areas Biden is protecting

Read More »

Biden Admin Finalizes Hydrogen Tax Credit Favoring Cleaner Production

The Biden administration has finalized rules for a tax incentive promoting hydrogen production using renewable power, with lower credits for processes using abated natural gas. The Clean Hydrogen Production Credit is based on carbon intensity, which must not exceed four kilograms of carbon dioxide equivalent per kilogram of hydrogen produced. Qualified facilities are those whose start of construction falls before 2033. These facilities can claim credits for 10 years of production, starting on the date they are placed in service, according to the draft text on the Federal Register’s portal. The final text is scheduled for publication Friday. Established by the 2022 Inflation Reduction Act, the four-tier scheme gives producers that meet wage and apprenticeship requirements a credit of up to $3 per kilogram of “qualified clean hydrogen”, to be adjusted for inflation. Hydrogen produced with higher lifecycle emissions receives a smaller credit. The scheme will use the Energy Department’s Greenhouse Gases, Regulated Emissions and Energy Use in Transportation (GREET) model to assign production processes to credit tiers. “In the coming weeks, the Department of Energy will release an updated version of the 45VH2-GREET model that producers will use to calculate the section 45V tax credit”, the Treasury Department said in a statement announcing the finalization of rules, a process that it said had considered roughly 30,000 public comments. However, producers may use the GREET model that was the most recent when their facility began construction. “This is in consideration of comments that the prospect of potential changes to the model over time reduces investment certainty”, explained the statement on the Treasury’s website. “Calculation of the lifecycle GHG analysis for the tax credit requires consideration of direct and significant indirect emissions”, the statement said. For electrolytic hydrogen, electrolyzers covered by the scheme include not only those using renewables-derived electricity (green hydrogen) but
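To make the tiered structure concrete, here is a minimal sketch of how an intensity-based, four-tier credit can be computed. Only the 4 kg CO2e/kg ceiling and the $3/kg maximum come from the article; the intermediate tier boundaries and credit shares below are illustrative assumptions, not figures from the final rule.

```python
# Illustrative four-tier clean hydrogen credit calculator.
# Only the 4 kg CO2e/kg ceiling and $3/kg maximum are from the article;
# tier boundaries and credit shares are assumed for illustration.
MAX_CREDIT_USD_PER_KG = 3.00  # for producers meeting wage/apprenticeship rules

ASSUMED_TIERS = [
    # (carbon-intensity ceiling in kg CO2e per kg H2, share of max credit)
    (0.45, 1.00),
    (1.50, 0.334),
    (2.50, 0.25),
    (4.00, 0.20),
]

def hydrogen_credit(carbon_intensity: float) -> float:
    """Return the $/kg credit for a given lifecycle carbon intensity."""
    for ceiling, share in ASSUMED_TIERS:
        if carbon_intensity <= ceiling:
            return MAX_CREDIT_USD_PER_KG * share
    return 0.0  # above 4 kg CO2e/kg, the hydrogen does not qualify

print(hydrogen_credit(0.3))            # 3.0: the cleanest tier earns the full credit
print(round(hydrogen_credit(3.2), 2))  # 0.6: dirtier production earns far less
```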

Read More »

Xthings unveils Ulticam home security cameras powered by edge AI

Xthings announced that its Ulticam security camera brand has a new model out today: the Ulticam IQ Floodlight, an edge AI-powered home security camera. The company also plans to showcase two additional cameras, Ulticam IQ, an outdoor spotlight camera, and Ulticam Dot, a portable, wireless security camera. All three cameras offer free cloud storage (seven days rolling) and subscription-free edge AI-powered person detection and alerts. The AI at the edge means the camera doesn’t have to send footage to an internet-connected data center to figure out what is in front of it. Rather, the processing for the AI is built into the camera itself, which the company says sets a new standard for value and performance in home security cameras. It can identify people, faces and vehicles. CES 2025 attendees can experience Ulticam’s entire lineup at Pepcom’s Digital Experience event on January 6, 2025, and at the Venetian Expo, Halls A-D, booth #51732, from January 7 to January 10, 2025. These new security cameras will be available for purchase online in the U.S. in Q1 and Q2 2025 at U-tec.com, Amazon, and Best Buy.

The Ulticam IQ Series: smart edge AI-powered home security cameras

[Image: Ulticam IQ home security camera.]

The Ulticam IQ Series, which includes IQ and IQ Floodlight, takes home security to the next level with advanced AI-powered recognition. Among the very first consumer cameras to use edge AI, the IQ Series can quickly and accurately identify people, faces and vehicles, without uploading video for server-side processing, which improves speed, accuracy, security and privacy. Additionally, the Ulticam IQ Series is designed to improve over time with over-the-air updates that enable new AI features. Both cameras
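The edge-versus-cloud distinction is the core technical point here: inference runs on the camera, so only small event notifications ever leave the device. A minimal, generic sketch of that pattern follows; the function names and frame format are hypothetical placeholders, not Xthings' actual stack.

```python
import time

def detect_person(frame) -> bool:
    """Placeholder for the on-device model; in a real camera this runs on
    local AI hardware rather than in a cloud data center."""
    return frame.get("has_person", False)

def send_alert(event: dict) -> None:
    """Stand-in for a push notification; only event metadata leaves the device."""
    print("alert:", event)

def camera_loop(frames) -> None:
    for frame in frames:
        if detect_person(frame):  # inference happens locally, frame by frame
            send_alert({"event": "person_detected", "ts": time.time()})

# Two fake frames: no alert for the first, one alert for the second.
camera_loop([{"has_person": False}, {"has_person": True}])
```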

Read More »

Intel unveils new Core Ultra processors with 2X to 3X performance on AI apps

Intel unveiled new Intel Core Ultra 9 processors today at CES 2025 with up to two to three times the previous generation’s edge performance on AI apps. The chips under the Intel Core Ultra 9 and Core i9 labels were previously codenamed Arrow Lake H, Meteor Lake H, Arrow Lake S and Raptor Lake S Refresh. Intel said it is pushing the boundaries of AI performance and power efficiency for businesses and consumers, ushering in the next era of AI computing. In other performance metrics, Intel said the Core Ultra 9 processors are up to 5.8 times faster in media performance, 3.4 times faster in video analytics end-to-end workloads with media and AI, and 8.2 times better in terms of performance per watt than prior chips. Intel hopes to kick off the year better than in 2024. CEO Pat Gelsinger resigned last month without a permanent successor after a variety of struggles, including mass layoffs, manufacturing delays and poor execution on chips, including gaming bugs in chips launched during the summer.

Intel Core Ultra Series 2

Michael Masci, vice president of product management at the Edge Computing Group at Intel, said in a briefing that AI, once the domain of research labs, is integrating into every aspect of our lives, including AI PCs where the AI processing is done in the computer itself, not the cloud. AI is also being processed in data centers in big enterprises, from retail stores to hospital rooms. “As CES kicks off, it’s clear we are witnessing a transformative moment,” he said. “Artificial intelligence is moving at an unprecedented pace.” The new processors include the Intel Core 9 Ultra 200 H/U/S models, with up to

Read More »

The Download: three-person babies, and tracking “AI readiness” in the US

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

Researchers announce babies born from a trial of three-person IVF

Eight babies have been born in the UK thanks to a technology that uses DNA from three people: the two biological parents plus a third person who supplies healthy mitochondrial DNA. The babies were born to mothers who carry genes for mitochondrial diseases and risked passing on severe disorders. In the team’s approach, patients’ eggs are fertilized with sperm, and the DNA-containing nuclei of those cells are transferred into donated fertilized eggs that have had their own nuclei removed. The new embryos contain the DNA of the intended parents along with a tiny fraction of mitochondrial DNA from the donor, floating in the embryos’ cytoplasm. The study, which makes use of a technology called mitochondrial donation, has been described as a “tour de force” and “a remarkable accomplishment” by others in the field. But not everyone sees the trial as a resounding success. Read the full story.
—Jessica Hamzelou
These four charts show where AI companies could go next in the US

No one knows exactly how AI will transform our communities, workplaces, and society as a whole. Because it’s hard to predict the impact AI will have on jobs, many workers and local governments are left trying to read the tea leaves to understand how to prepare and adapt. A new interactive report released by the Brookings Institution attempts to map how embedded AI companies and jobs are in different regions of the United States in order to prescribe policy treatments to those struggling to keep up. Here are four charts to help understand the issues.

—Peter Hall

In defense of air-conditioning

—Casey Crownhart

I’ll admit that I’ve rarely hesitated to point an accusing finger at air-conditioning. I’ve outlined in many stories and newsletters that AC is a significant contributor to global electricity demand, and it’s only going to suck up more power as temperatures rise.

But I’ll also be the first to admit that it can be a life-saving technology, one that may become even more necessary as climate change intensifies. And in the wake of Europe’s recent deadly heat wave, it’s been oddly villainized. Read our story to learn more. This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Donald Trump is cracking down on “dangerous science”
But the scientists affected argue their work is essential to developing new treatments. (WP $)
+ How MAHA is infiltrating states across the US. (The Atlantic $)

2 The US Senate has approved Trump’s request to cancel foreign aid
The White House is determined to reclaim around $8 billion worth of overseas aid. (NYT $)
+ The bill also allocates around $1.1 billion to public broadcasting. (WP $)
+ HIV could infect 1,400 infants every day because of US aid disruptions. (MIT Technology Review)

3 American air strikes only destroyed one Iranian nuclear site
The remaining two sites weren’t damaged that badly, and could resume operation within months. (NBC News)

4 The US is poised to ban Chinese technology in submarine cables
The cables are critical to internet connectivity across the world. (FT $)
+ The cables are at increasing risk of sabotage. (Bloomberg $)

5 The US measles outbreak is worsening
Health officials’ tactics for attempting to contain it aren’t working. (Wired $)
+ Vaccine hesitancy is growing, too. (The Atlantic $)
+ Why childhood vaccines are a public health success story. (MIT Technology Review)

6 A new supercomputer is coming
The Nexus machine will search for new cures for diseases. (Semafor)

7 Elon Musk has teased a Grok AI companion inspired by Twilight
No really, you shouldn’t have… (The Verge)
+ Inside the Wild West of AI companionship. (MIT Technology Review)

8 Future farms could be fully autonomous 🐄
Featuring AI-powered tractors and drone surveillance. (WSJ $)
+ African farmers are using private satellite data to improve crop yields. (MIT Technology Review)

9 Granola is Silicon Valley’s favorite new tool
No, not the tasty breakfast treat. (The Information $)

10 WeTransfer isn’t going to train its AI on our files after all
After customers reacted angrily on social media. (BBC)

Quote of the day

“He’s doing the exact opposite of everything I voted for.”

—Andrew Schulz, a comedian and podcaster who interviewed Donald Trump last year, explains why he’s starting to lose faith in the President to Wired.

One more thing

The open-source AI boom is built on Big Tech’s handouts. How long will it last?

In May 2023 a leaked memo reported to have been written by Luke Sernau, a senior engineer at Google, said out loud what many in Silicon Valley must have been whispering for weeks: an open-source free-for-all is threatening Big Tech’s grip on AI. In many ways, that’s a good thing. AI won’t thrive if just a few mega-rich companies get to gatekeep this technology or decide how it is used. But this open-source boom is precarious, and if Big Tech decides to shut up shop, a boomtown could become a backwater. Read the full story.

Read More »

Slack gets smarter: New AI tools summarize chats, explain jargon, and automate work

Slack is rolling out an extensive array of artificial intelligence features that promise to eliminate routine tasks and turn the messaging platform into a central hub for enterprise productivity, marking owner Salesforce’s direct challenge to Microsoft’s workplace AI dominance. The announcements, set to roll out over the coming months, include AI-powered writing assistance embedded directly into Slack’s canvas feature, contextual message explanations, automated action item identification, and enterprise search capabilities that span multiple connected business applications. The moves come as Salesforce simultaneously restricts external AI companies from accessing Slack data, creating a walled garden approach that mirrors broader industry trends toward platform consolidation.

“Unlike some AI tools that sit outside the flow of work, Slack’s AI shows up where work happens – across conversations, decisions, and documentation,” said Shalini Agarwal, Vice President of Slack Product at Salesforce, in an exclusive interview with VentureBeat. “The key differentiator is context, which comes in the form of structured and unstructured data in Slack.”

The timing underscores intensifying competition in the $45 billion enterprise collaboration market, where Microsoft’s Teams platform and its Copilot AI assistant have gained significant traction against Slack since Salesforce’s $27.7 billion acquisition of the messaging service in 2021. Google is also pushing its Duet AI across Workspace applications, creating a three-way battle for corporate customers increasingly focused on AI-driven productivity gains.

Read More »

In defense of air-conditioning

I’ll admit that I’ve rarely hesitated to point an accusing finger at air-conditioning. I’ve outlined in many stories and newsletters that AC is a significant contributor to global electricity demand, and it’s only going to suck up more power as temperatures rise. But I’ll also be the first to admit that it can be a life-saving technology, one that may become even more necessary as climate change intensifies. And in the wake of Europe’s recent deadly heat wave, it’s been oddly villainized.  We should all be aware of the growing electricity toll of air-conditioning, but the AC hate is misplaced. Yes, AC is energy intensive, but so is heating our homes, something that’s rarely decried in the same way that cooling is. Both are tools for comfort and, more important, for safety.  So why is air-conditioning cast as such a villain? In the last days of June and the first few days of July, temperatures hit record highs across Europe. Over 2,300 deaths during that period were attributed to the heat wave, according to early research from World Weather Attribution, an academic collaboration that studies extreme weather. And human-caused climate change accounted for 1,500 of the deaths, the researchers found. (That is, the number of fatalities would have been under 800 if not for higher temperatures because of climate change.)
We won’t have the official death toll for months, but these early figures show just how deadly heat waves can be. Europe is especially vulnerable, because in many countries, particularly in the northern part of the continent, air-conditioning is not common. Popping on a fan, drawing the shades, or opening the windows on the hottest days used to cut it in many European countries. Not anymore. The UK was 1.24 °C (2.23 °F) warmer over the past decade than it was between 1961 and 1990, according to the Met Office, the UK’s national climate and weather service. One recent study found that homes across the country are uncomfortably or dangerously warm much more frequently than they used to be.
The reality is, some parts of the world are seeing an upward shift in temperatures that’s not just uncomfortable but dangerous. As a result, air-conditioning usage is going up all over the world, including in countries with historically low rates. The reaction to this long-term trend, especially in the face of the recent heat wave, has been apoplectic. People are decrying AC across social media and opinion pages, arguing that we need to suck it up and deal with being a little bit uncomfortable.

Now, let me preface this by saying that I do live in the US, where roughly 90% of homes are cooled with air-conditioning today. So perhaps I am a little biased in favor of AC. But it baffles me when people talk about air-conditioning this way. I spent a good amount of my childhood in the southeastern US, where it’s very obvious that heat can be dangerous. I was used to many days where temperatures were well above 90 °F (32 °C), and the humidity was so high your clothes would stick to you as soon as you stepped outdoors.

For some people, being active or working in those conditions can lead to heatstroke. Prolonged exposure, even if it’s not immediately harmful, can lead to heart and kidney problems. Older people, children, and those with chronic conditions can be more vulnerable.

In other words, air-conditioning is more than a convenience; in certain conditions, it’s a safety measure. That should be an easy enough concept to grasp. After all, in many parts of the world we expect access to heating in the name of safety. Nobody wants to freeze to death.

And it’s important to clarify here that while air-conditioning does use a lot of electricity in the US, heating actually has a higher energy footprint. In the US, about 19% of residential electricity use goes to air-conditioning. That sounds like a lot, and it’s significantly more than the 12% of electricity that goes to space heating. However, we need to zoom out to get the full picture, because electricity makes up only part of a home’s total energy demand. A lot of homes in the US use natural gas for heating—that’s not counted in the electricity being used, but it’s certainly part of the home’s total energy use.

When we look at the total, space heating accounts for a full 42% of residential energy consumption in the US, while air-conditioning accounts for only 9%. I’m not letting AC off the hook entirely here. There’s obviously a difference between running air-conditioning (or other, less energy-intensive technologies) when needed to stay safe and blasting systems at max capacity because you prefer it chilly. And there’s a lot of grid planning we’ll need to do to make sure we can handle the expected influx of air-conditioning around the globe. But the world is changing, and temperatures are rising. If you’re looking for a villain, look beyond the air conditioner and into the atmosphere. This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.
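The arithmetic connecting the electricity shares to the total-energy shares is worth making explicit. A minimal sketch using the article's rounded figures; the 50% electricity share of total home energy is an assumed, illustrative number, not one from the article.

```python
# Rough energy accounting from the article's figures: air-conditioning looms
# large in *electricity* use but small in *total* residential energy use,
# because most heating energy is fuel (e.g., natural gas), not electricity.
ELECTRICITY_SHARE_OF_TOTAL = 0.50   # assumption for illustration only

ac_share_of_electricity = 0.19      # article: ~19% of residential electricity
heat_share_of_electricity = 0.12    # article: ~12% of residential electricity

ac_share_of_total = ac_share_of_electricity * ELECTRICITY_SHARE_OF_TOTAL
heat_electric_share_of_total = heat_share_of_electricity * ELECTRICITY_SHARE_OF_TOTAL

# Heating's total share (article: 42%) minus its electric slice is met by fuel.
heat_fuel_share_of_total = 0.42 - heat_electric_share_of_total

print(f"AC share of total home energy: {ac_share_of_total:.0%}")        # ~10%, near the article's 9%
print(f"Heating met by fuel, not electricity: {heat_fuel_share_of_total:.0%}")  # ~36%
```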

Read More »

Researchers announce babies born from a trial of three-person IVF

Eight babies have been born in the UK thanks to a technology that uses DNA from three people: the two biological parents plus a third person who supplies healthy mitochondrial DNA. The babies were born to mothers who carry genes for mitochondrial diseases and risked passing on severe disorders. The eight babies are healthy, say the researchers behind the trial. “Mitochondrial disease can have a devastating impact on families,” Doug Turnbull of Newcastle University, one of the researchers behind the study, said in a statement. “Today’s news offers fresh hope to many more women at risk of passing on this condition, who now have the chance to have children growing up without this terrible disease.” The study, which makes use of a technology called mitochondrial donation, has been described as a “tour de force” and “a remarkable accomplishment” by others in the field. In the team’s approach, patients’ eggs are fertilized with sperm, and the DNA-containing nuclei of those cells are transferred into donated fertilized eggs that have had their own nuclei removed. The new embryos contain the DNA of the intended parents along with a tiny fraction of mitochondrial DNA from the donor, floating in the embryos’ cytoplasm.  “The concept of [mitochondrial donation] has attracted much commentary and occasionally concern and anxiety,” Stuart Lavery, a consultant in reproductive medicine at University College Hospitals NHS Foundation Trust, said in a statement. “The Newcastle team have demonstrated that it can be used in a clinically effective and ethically acceptable way to prevent disease and suffering.”
Not everyone sees the trial as a resounding success. While five of the children were born “with no health problems,” one developed a fever and a urinary tract infection, and another had muscle jerks. A third was treated for an abnormal heart rhythm. Three of the babies were born with a low level of the very mitochondrial-DNA mutations the treatment was designed to prevent. Heidi Mertes, a medical ethicist at Ghent University, says she is “moderately optimistic.” “I’m happy that it worked,” she says. “But at the same time, it’s concerning … it’s a call for caution and treading carefully.”
Pavlo Mazur, a former embryologist who has used a similar approach in the conception of 15 babies in Ukraine, believes that trials like this one should be paused until researchers figure out what’s going on. Others believe that researchers should study the technique in people who don’t have mitochondrial mutations, to lower the risk of passing any disease-causing mutations to children.

Long time coming

The news of the births has been long awaited by researchers in the field. Mitochondrial donation was first made legal in the UK in 2015. Two years later, the Human Fertilisation and Embryology Authority (HFEA), which regulates fertility treatment and research in the UK, granted a fertility clinic in Newcastle the sole license to perform the procedure. Newcastle Fertility Centre at Life launched a trial of mitochondrial donation in 2017 with the aim of treating 25 women a year. That was eight years ago. Since then, the Newcastle team have been extremely tight-lipped about the trial. That’s despite the fact that other teams elsewhere have used mitochondrial donation to help people achieve pregnancy. A New York–based doctor used a type of mitochondrial donation to help a Jordanian couple conceive in Mexico in 2016. Mitochondrial donation has also been trialed by teams in Ukraine and Greece. But as the only trial overseen by the HFEA, the Newcastle team’s study was viewed by many as the most “official.” Researchers have been itching to hear how the work has been going, given the potential implications for researchers elsewhere (mitochondrial donation was officially made legal in Australia in 2022). “I’m very glad to see [the results] come out at last,” says Dagan Wells, a reproductive biologist at the University of Oxford who worked on the Greece trial. “It would have been nice to have some information out along the way.”

At the Newcastle clinic, each patient must receive approval from the HFEA to be eligible for mitochondrial donation. Since the trial launched in 2017, 39 patients have won this approval. Twenty-five of them underwent hormonal stimulation to release multiple eggs that could be frozen in storage. Nineteen of those women went on to have mitochondrial donation. So far, seven of the women have given birth (one had twins), and an eighth is still pregnant. The oldest baby is two years old. The results were published today in the New England Journal of Medicine. “As parents, all we ever wanted was to give our child a healthy start in life,” one of the mothers, who is remaining anonymous, said in a statement. “Mitochondrial donation IVF made that possible. After years of uncertainty this treatment gave us hope—and then it gave us our baby … Science gave us a chance.”

When each baby was born, the team collected a blood and urine sample to look at the child’s mitochondrial DNA. They found that the levels of mutated DNA were far lower than they would have expected without mitochondrial donation. Three of the mothers were “homoplasmic”—100% of their mitochondrial DNA carried the mutation. But blood tests showed that in the women’s four babies (including the twins), 5% or less of the mitochondrial DNA had the mutation, suggesting they won’t develop disease.

A mixed result

The researchers see this as a positive result. “Children who would otherwise have inherited very high levels are now inheriting levels that are reduced by 77% to 100%,” coauthor Mary Herbert, a professor of reproductive biology at Newcastle University and Monash University, told me during a press briefing. But three of the eight babies had health symptoms. At seven months, one was diagnosed with a rare form of epilepsy, which seemed to resolve within the following three months. Another baby developed a urinary tract infection. A third baby developed “prolonged” jaundice, high levels of fat in the blood, and a disturbed heart rhythm that required treatment. The baby seemed to have recovered by 18 months, and doctors believe that the symptoms were not related to the mitochondrial mutations, but the team members admit that they can’t be sure. Given the small sample size, it’s hard to make comparisons with babies conceived in other ways.

And they acknowledge that a phenomenon called “reversal” is happening in some of the babies. In theory, the children shouldn’t inherit any “bad” mitochondrial DNA from their mothers. But three of them did. The levels of “bad” mitochondrial DNA in the babies’ blood ranged between 5% and 16%. And they were higher in the babies’ urine—the highest figure being 20%. The researchers don’t know why this is happening. When an embryologist pulls out the nucleus of a fertilized egg, a bit of mitochondria-containing cytoplasm will inevitably be dragged along with it. But the team didn’t see any link between the amount of carried-over cytoplasm and the level of “bad” mitochondria. “We continue to investigate this issue,” Herbert said. “As long as they don’t understand what’s happening, I would still be worried,” says Mertes.

Such low levels aren’t likely to cause mitochondrial diseases, according to experts contacted by MIT Technology Review. But some are concerned that the percentage of mutated DNA could be higher in different tissues, such as the brain or muscle, or that the levels might change with age. “You never know which tissues [reversal] will show up in,” says Mazur, who has seen the phenomenon in babies born through mitochondrial donation to parents who didn’t have mitochondrial mutations. “It’s chaotic.” The Newcastle team says it hasn’t looked at other tissues, because it designed the study to be noninvasive.
There has been at least one case in which similar levels of “bad” mitochondria have caused symptoms, says Joanna Poulton, a mitochondrial geneticist at the University of Oxford. She thinks it’s unlikely that the children in the trial will develop any symptoms but adds that “it’s a bit of a worry.”

The age of reversal

No one knows exactly when this reversal happens. But Wells and his colleagues have some idea. In their study in Greece, they looked at the mitochondrial DNA of embryos and checked them again during pregnancy and after birth. The trial was designed to study the impact of mitochondrial donation for infertility—none of the parents involved had genes for mitochondrial disease.
The team has seen mitochondrial reversal in two of the seven babies born in the trial, says Wells. If you put the two sets of results together (five of the 15 babies across the two trials), mitochondrial donation “seems to have this possibility of reversal occurring in maybe about a third of children,” he says. In his study, the reversal seemed to occur early on in the embryos’ development, Wells says. Five-day-old embryos “look perfect,” but mitochondrial mutations start showing up in tests taken at around 15 weeks of pregnancy, he says. After that point, the levels appear to be relatively stable. The Newcastle researchers say they will monitor the children until they are five years old.

People enrolling in future trials might opt for amniocentesis, which involves sampling fluid from the fetus’s amniotic sac at around 15 to 18 weeks, suggests Mertes. That test might reveal the likely level of mitochondrial mutations in the resulting child. “Then the parents could decide what to do,” says Mertes. “If you could see there was a 90% mutation load [for a] very serious mitochondrial disease, they would still have an option to cancel the pregnancy,” she says.

Wells thinks the Newcastle team’s results are “generally reassuring.” He doesn’t think the trials should be paused. But he wants people to understand that mitochondrial donation is not without risk. “This can only be viewed as a risk reduction strategy, and not a guarantee of having an unaffected child,” he says. And, as Mertes points out, there’s another option for women who carry mitochondrial DNA mutations: egg donation. Donor eggs fertilized with a partner’s sperm and transferred to a woman’s uterus won’t have her disease-causing mitochondria. That option won’t appeal to people who feel strongly about having a genetic link to their children. But Poulton asks: “If you know whose uterus you came out of, does it matter that the [egg] came from somewhere else?”

Read More »

Claude Code revenue jumps 5.5x as Anthropic launches analytics dashboard

Anthropic announced today it is rolling out a comprehensive analytics dashboard for its Claude Code AI programming assistant, addressing one of the most pressing concerns for enterprise technology leaders: understanding whether their investments in AI coding tools are actually paying off. The new dashboard will provide engineering managers with detailed metrics on how their teams use Claude Code, including lines of code generated by AI, tool acceptance rates, user activity breakdowns, and cost tracking per developer. The feature comes as companies increasingly demand concrete data to justify their AI spending amid a broader enterprise push to measure artificial intelligence’s return on investment. “When you’re overseeing a big engineering team, you want to know what everyone’s doing, and that can be very difficult,” said Adam Wolff, who manages Anthropic’s Claude Code team and previously served as head of engineering at Robinhood. “It’s hard to measure, and we’ve seen some startups in this space trying to address this, but it’s valuable to gain insights into how people are using the tools that you give them.” The dashboard addresses a fundamental challenge facing technology executives: As AI-powered development tools become standard in software engineering, managers lack visibility into which teams and individuals are benefiting most from these expensive premium tools. Claude Code pricing starts at $17 per month for individual developers, with enterprise plans reaching significantly higher price points.
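As a rough illustration of the kind of aggregation such a dashboard performs, here is a minimal sketch. The event fields and metric definitions below are illustrative assumptions, not Anthropic's actual schema.

```python
# Hypothetical per-developer rollup of AI coding-assistant usage events.
# Field names and numbers are invented for illustration.
from collections import defaultdict

events = [
    {"user": "dana", "suggested_lines": 120, "accepted_lines": 90, "cost_usd": 1.40},
    {"user": "lee",  "suggested_lines": 60,  "accepted_lines": 15, "cost_usd": 0.55},
    {"user": "dana", "suggested_lines": 40,  "accepted_lines": 32, "cost_usd": 0.38},
]

totals = defaultdict(lambda: {"suggested": 0, "accepted": 0, "cost": 0.0})
for e in events:
    t = totals[e["user"]]
    t["suggested"] += e["suggested_lines"]
    t["accepted"] += e["accepted_lines"]
    t["cost"] += e["cost_usd"]

for user, t in totals.items():
    rate = t["accepted"] / t["suggested"]  # acceptance rate per developer
    print(f"{user}: acceptance {rate:.0%}, cost ${t['cost']:.2f}")
```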

Read More »

AWS unveils Bedrock AgentCore, a new platform for building enterprise AI agents with open source frameworks and tools

Cloud giant Amazon Web Services (AWS) believes AI agents will change how we all work and interact with information, and that enterprises need a platform that allows them to build and deploy agents at scale — all in one place. Today at its New York Summit, AWS unveiled Amazon Bedrock AgentCore, a new enterprise-grade platform designed to build, deploy, and operate AI agents securely and at scale. Swami Sivasubramanian, AWS Vice President of Agentic AI, said during the keynote that AgentCore “helps organizations move beyond experiments to production-ready agent systems that can be trusted with your most critical business processes.” AgentCore is a modular stack of services—available in preview—that gives developers the core infrastructure needed to move AI agents from prototype to production, including runtime, memory, identity, observability, API integration, and tools for web browsing and code execution. “We believe that agents are going to fundamentally change how we use tools and the internet,” said Deepak Singh, AWS Vice President of Databases and AI. “The line between an agent and an application is getting blurrier.” AgentCore builds on the existing Bedrock Agents framework, launched in late 2024, but dramatically expands capabilities by supporting any agent framework or foundation model—not just those hosted within Bedrock. That includes compatibility with open-source toolkits like CrewAI, LangChain, LlamaIndex, LangGraph, and AWS’s

Read More »

Bureau of Land Management Approves Geothermal Facility

In a release posted on its website this week, the U.S. Bureau of Land Management announced that it had approved the 30-megawatt Crescent Valley geothermal energy production facility and associated transmission line. The project includes construction and operation of one power plant, a photovoltaic solar field, 17 additional geothermal fluid production and injection wells and well pads, new and improved access roads, an aggregate pit, geothermal fluid pipelines, an electrical gen-tie line, substation, switching station, and ancillary support facilities, the Bureau of Land Management noted in the release. “Geothermal projects support domestic energy production and American energy independence, while contributing to the nation’s economy and security,” the Bureau stated in the release. “Consistent with Executive Order 14154, ‘Unleashing American Energy’, the geothermal projects on public lands help meet the energy needs of U.S. citizens, will solidify the nation as a global energy leader long into the future, and achieve American Energy Dominance,” it added. In the release, the Bureau of Land Management highlighted that, “according to the U.S. Environmental Protection Agency, one megawatt produced by a geothermal project can power about 1,104 average American homes’ electricity use per year”. The Bureau stated in the release that geothermal “is an abundant resource, especially in the West, where the BLM has authority to manage geothermal resource leasing, exploration, and development on approximately 245 million surface acres of public lands and the 700 million acres where the United States owns the subsurface mineral estate”. In a separate release posted on its site on June 27, the Bureau of Land Management announced that it had approved three geothermal energy projects under an expedited timeline in Nevada, which it said support the administration’s goals for energy development on public lands. These comprise the Diamond Flat Geothermal Project, the McGinness Hills Geothermal Optimization Project, and the Pinto Geothermal Project, the release highlighted.
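For context on how such homes-powered equivalences are typically derived, here is a back-of-envelope sketch. The capacity factor and per-home consumption figure are illustrative assumptions chosen to land near the cited number, not values from the release.

```python
# Back-of-envelope version of the "1 MW of geothermal powers ~1,104 homes" claim.
HOURS_PER_YEAR = 8760
capacity_mw = 1.0
capacity_factor = 0.90        # assumed: geothermal plants run near-continuously
home_use_mwh_per_year = 7.2   # assumed per-home annual electricity use

annual_output_mwh = capacity_mw * HOURS_PER_YEAR * capacity_factor
homes_powered = annual_output_mwh / home_use_mwh_per_year
print(round(homes_powered))   # ~1095, in the neighborhood of the cited ~1,104
```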

Read More »

US DOE Launches Nuclear Fuel Program to Support Advanced Reactor Testing

The United States Department of Energy (DOE) has opened applications for producing nuclear fuel to support the development of advanced reactors. The Fuel Line Pilot Program seeks U.S. companies to build and operate production facilities outside of national laboratories under the DOE authorization process. It supports the Reactor Pilot Program announced last month. “This initiative will help end America’s reliance on foreign sources of enriched uranium and critical materials, while opening the door for private sector investment in America’s nuclear renaissance”, the DOE said in a statement online. “The program leverages the DOE authorization process to build and operate nuclear fuel production lines to serve for research, development, and demonstration purposes and to provide a fast-tracked approach to commercial licensing”, the program webpage says. The first applications are due August 15. The DOE expects to announce initial selections a month thereafter. The DOE may open subsequent applications. “Applicants will be responsible for all costs associated with the construction, operation, and decommissioning of an advanced nuclear fuel line, as well as the procurement of all nuclear material feedstock”, the DOE statement said. “The selections will be based on a set of criteria, including technological readiness, established fuel fabrication plans, and financial viability. “While the advanced nuclear fuel lines will serve for research, development, and demonstration purposes, seeking DOE authorization of the facilities can help unlock private funding and provide a fast-tracked approach to enable future commercial licensing activities for potential applicants”. “America has the resources and the expertise to lead the world in nuclear energy development, but we need secure domestic supply chains to fuel this rapidly growing energy source and achieve a true nuclear energy renaissance”, commented Energy Secretary Chris Wright. “The Trump Administration is accelerating innovation, not regulation, and leveraging partnerships with the private sector to safely fuel and

Read More »

Six Companies Join New York Offshore Wind Innovation Hub Accelerator

The New York University (NYU) Tandon School of Engineering, in partnership with Equinor ASA, National Offshore Wind R&D Consortium (NOWRDC), and New York City Economic Development Corp. (NYCEDC), has selected six companies to receive support for the development of ideas and advancement of offshore wind’s potential in New York under the Offshore Wind Innovation Hub. The Offshore Wind Innovation Hub, led by Equinor, reviewed a pool of 53 applicants. In a media release, the Hub partners said the six companies have been selected based on the novelty and potential of their solutions to join the 2025 Accelerator Cohort. The six companies are Anemo Robotics, Kalypso Offshore Energy, MESPAC, Orpheus Ocean, Reblade, and Werover. Among the focus areas was identifying innovations that can contribute to efficiencies in turbine maintenance and improved marine life monitoring, the Hub partners said. “We’re excited to join the Offshore Wind Innovation Hub program as it represents a significant opportunity for us to explore the U.S. offshore wind market. We look forward to gaining valuable insights and mentorship from the distinguished companies and experts involved in the program”, Balca Yılmaz, CEO and Co-Founder of Werover, said. The winners will participate in a six-month mentoring and business development program that aims to prepare them for strategic partnerships with offshore wind developers, suppliers, and the wider industry. The program helps innovators overcome barriers and commercialize solutions in New York and beyond, the Hub partners said. Now in its second year, the 2025 Accelerator builds on progress made by the 2024 cohort in business, product development, fundraising, hiring, and piloting. Notable achievements include Triton Anchor’s $5.7M fundraise, Claviate’s contract with Siemens to manage turbines, and Pliant Energy’s two pilot projects in New York City waters with NYC EDC, according to the Hub partners. The Offshore Wind Innovation Hub is based

Read More »

A major AI training data set contains millions of examples of personal data

Millions of images of passports, credit cards, birth certificates, and other documents containing personally identifiable information are likely included in one of the biggest open-source AI training sets, new research has found. Thousands of images—including identifiable faces—were found in a small subset of DataComp CommonPool, a major AI training set for image generation scraped from the web. Because the researchers audited just 0.1% of CommonPool’s data, they estimate that the real number of images containing personally identifiable information, including faces and identity documents, is in the hundreds of millions. The study that details the breach was published on arXiv earlier this month. The bottom line, says William Agnew, a postdoctoral fellow in AI ethics at Carnegie Mellon University and one of the coauthors, is that “anything you put online can [be] and probably has been scraped.” The researchers found thousands of instances of validated identity documents—including images of credit cards, driver’s licenses, passports, and birth certificates—as well as over 800 validated job application documents (including résumés and cover letters), which were confirmed through LinkedIn and other web searches as being associated with real people. (In many more cases, the researchers did not have time to validate the documents or were unable to because of issues like image clarity.) 
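The headline estimate comes from simple scaling: audit a small uniform sample, count what you find, and divide by the sample fraction. A minimal sketch of that extrapolation follows, with an illustrative hit count rather than the paper's exact tallies.

```python
# Naive extrapolation behind auditing a small random sample:
# if a uniform sample of fraction f of the data contains k instances
# of some content type, the full set holds roughly k / f.
sample_fraction = 0.001      # the researchers audited ~0.1% of CommonPool
hits_in_sample = 3_500       # illustrative count of PII instances in the sample

estimated_total = hits_in_sample / sample_fraction
print(f"estimated instances in full set: {estimated_total:,.0f}")  # ~3,500,000

# The estimate assumes the sample is representative; rarer content types
# carry much wider uncertainty at this sampling rate.
```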
A number of the résumés disclosed sensitive information including disability status, the results of background checks, birth dates and birthplaces of dependents, and race. When résumés were linked to people with online presences, researchers also found contact information, government identifiers, sociodemographic information, face photographs, home addresses, and the contact information of other people (like references).

[Image caption: Examples of identity-related documents found in CommonPool’s small-scale dataset, showing a credit card, social security number, and a driver’s license. For each sample, the type of URL site is shown at the top, the image in the middle, and the caption in quotes below. All personal information has been replaced, and text has been paraphrased to avoid direct quotations. Images have been redacted to show the presence of faces without identifying the individuals. Courtesy of the researchers.]

When it was released in 2023, DataComp CommonPool, with its 12.8 billion data samples, was the largest existing data set of publicly available image-text pairs, which are often used to train generative text-to-image models. While its curators said that CommonPool was intended for academic research, its license does not prohibit commercial use.
CommonPool was created as a follow-up to the LAION-5B data set, which was used to train models including Stable Diffusion and Midjourney. It draws on the same data source: web scraping done by the nonprofit Common Crawl between 2014 and 2022. While commercial models often do not disclose what data sets they are trained on, the shared data sources of DataComp CommonPool and LAION-5B mean that the datasets are similar, and that the same personally identifiable information likely appears in LAION-5B, as well as in other downstream models trained on CommonPool data. CommonPool researchers did not respond to emailed questions. And since DataComp CommonPool has been downloaded more than 2 million times over the past two years, it is likely that “there [are] many downstream models that are all trained on this exact data set,” says Rachel Hong, a PhD student in computer science at the University of Washington and the paper’s lead author. Those downstream models would carry the same privacy risks.

Good intentions are not enough

“You can assume that any large scale web-scraped data always contains content that shouldn’t be there,” says Abeba Birhane, a cognitive scientist and tech ethicist who leads Trinity College Dublin’s AI Accountability Lab—whether it’s personally identifiable information (PII), child sexual abuse imagery, or hate speech (which Birhane’s own research into LAION-5B has found). Indeed, the curators of DataComp CommonPool were themselves aware it was likely that PII would appear in the data set and did take some measures to preserve privacy, including automatically detecting and blurring faces. But in their limited data set, Hong’s team found and validated over 800 faces that the algorithm had missed, and they estimated that overall, the algorithm had missed 102 million faces in the entire data set. On the other hand, they did not apply filters that could have recognized known PII strings, like emails or social security numbers. “Filtering is extremely hard to do well,” says Agnew. “They would have had to make very significant advancements in PII detection and removal that they haven’t made public to be able to effectively filter this.”

[Image caption: Examples of résumé documents and personal disclosures found in CommonPool’s small-scale dataset. For each sample, the type of URL site is shown at the top, the image in the middle, and the caption in quotes below. All personal information has been replaced, and text has been paraphrased to avoid direct quotations. Images have been redacted to show the presence of faces without identifying the individuals. Courtesy of the researchers.]

There are other privacy issues that the face blurring doesn’t address. While the face blurring filter is automatically applied, it is optional and can be removed. Additionally, the captions that often accompany the photos, as well as the photos’ metadata, often contain even more personal information, such as names and exact locations. Another privacy mitigation measure comes from Hugging Face, a platform that distributes training data sets and hosts CommonPool, which integrates with a tool that theoretically allows people to search for and remove their own information from a data set. But as the researchers note in their paper, this would require people to know that their data is there to start with.
When asked for comment, Florent Daudens of Hugging Face said that “maximizing the privacy of data subjects across the AI ecosystem takes a multilayered approach, which includes but is not limited to the widget mentioned,” and that the platform is “working with our community of users to move the needle in a more privacy-grounded direction.” 

In any case, just getting your data removed from one data set probably isn’t enough. “Even if someone finds out their data was used in a training data set and … exercises their right to deletion, technically the law is unclear about what that means,” says Tiffany Li, an assistant professor of law at the University of New Hampshire School of Law. “If the organization only deletes data from the training data sets—but does not delete or retrain the already trained model—then the harm will nonetheless be done.” The bottom line, says Agnew, is that “if you web-scrape, you’re going to have private data in there. Even if you filter, you’re still going to have private data in there, just because of the scale of this. And that’s something that we [machine-learning researchers], as a field, really need to grapple with.”

Reconsidering consent

CommonPool was built on web data scraped between 2014 and 2022, meaning that many of the images likely date to before 2020, when ChatGPT was released. So even if it’s theoretically possible that some people consented to having their information publicly available to anyone on the web, they could not have consented to having their data used to train large AI models that did not yet exist. And with web scrapers often scraping data from each other, an image that was originally uploaded by the owner to one specific location would often find its way into other image repositories. “I might upload something onto the internet, and then … a year or so later, [I] want to take it down, but then that [removal] doesn’t necessarily do anything anymore,” says Agnew. The researchers also found numerous examples of children’s personal information, including depictions of birth certificates, passports, and health status, but in contexts suggesting that they had been shared for limited purposes. “It really illuminates the original sin of AI systems built off public data—it’s extractive, misleading, and dangerous to people who have been using the internet with one framework of risk, never assuming it would all be hoovered up by a group trying to create an image generator,” says Ben Winters, the director of AI and privacy at the Consumer Federation of America.

Finding a policy that fits

Ultimately, the paper calls for the machine-learning community to rethink the common practice of indiscriminate web scraping and also lays out the possible violations of current privacy laws represented by the existence of PII in massive machine-learning data sets, as well as the limitations of those laws’ ability to protect privacy. “We have the GDPR in Europe, we have the CCPA in California, but there’s still no federal data protection law in America, which also means that different Americans have different rights protections,” says Marietje Schaake, a Dutch lawmaker turned tech policy expert who currently serves as a fellow at Stanford’s Cyber Policy Center.
Besides, these privacy laws apply to companies that meet certain criteria for size and other characteristics. They do not necessarily apply to researchers like those who were responsible for creating and curating DataComp CommonPool. And even state laws that do address privacy, like California’s Consumer Privacy Act, have carve-outs for “publicly available” information. Machine-learning researchers have long operated on the principle that if it’s available on the internet, then it is public and no longer private information, but Hong, Agnew, and their colleagues hope that their research challenges this assumption. “What we found is that ‘publicly available’ includes a lot of stuff that a lot of people might consider private—résumés, photos, credit card numbers, various IDs, news stories from when you were a child, your family blog. These are probably not things people want to just be used anywhere, for anything,” says Hong. Hopefully, Schaake says, this research “will raise alarm bells and create change.”

Read More »

Stay Ahead with the Paperboy Newsletter

Your weekly dose of insights into AI, Bitcoin mining, Datacenters and Energy industry news. Spend 3-5 minutes and catch up on 1 week of news.

Smarter with ONMINE

Streamline Your Growth with ONMINE