Eos

Science News by AGU

The Past 3 Years Have Been the Three Hottest on Record

Wed, 01/14/2026 - 17:00

Global average temperatures in 2025 were the third hottest on record, surpassed only by 2024 and 2023, according to an analysis published by Berkeley Earth, a nonprofit climate research organization.

According to the analysis, last year’s global average temperature was about 1.35°C–1.53°C (2.43°F–2.75°F) greater than the 1850–1900 average. The previous year, 2024, was 1.46°C–1.62°C (2.63°F–2.92°F) above the preindustrial baseline, while 2023 was 1.48°C–1.60°C (2.66°F–2.88°F) above the baseline.
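The Fahrenheit anomalies above follow from the Celsius figures by a factor of 9/5 with no +32 offset, since these are temperature differences rather than absolute temperatures. A quick check (a minimal sketch, not part of the published analyses):

```python
def anomaly_c_to_f(delta_c):
    """Convert a temperature *difference* from Celsius to Fahrenheit.

    Anomalies scale by 9/5 only; the +32 offset applies to absolute
    temperatures, not to differences.
    """
    return delta_c * 9 / 5

# Bounds reported for 2025, 2024, and 2023 relative to the 1850-1900 baseline.
for delta_c in (1.35, 1.53, 1.46, 1.62, 1.48, 1.60):
    print(f"{delta_c}°C -> {anomaly_c_to_f(delta_c):.2f}°F")
```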

The report’s authors called the exceptional heat of the past 3 years a “warming spike” that may indicate an acceleration in the rate of climate change. “The warming observed from 2023 through 2025 stands out clearly from the long-term trend,” said Robert Rohde, chief scientist at Berkeley Earth, in a statement. 

Such a spike may also indicate that the past warming rate is no longer a reliable predictor of future warming, the authors wrote.

“2023, 2024, and 2025 collectively cause us to rethink” Earth’s warming rate, Rohde said in a press briefing. Whether warming is accelerating or not, Earth’s temperature is rapidly exceeding key thresholds, such as the Paris Agreement limit of 1.5°C (2.7°F), he said.

Scientists say the exceptional warming observed in the past 3 years could be evidence of accelerating warming. Credit: Berkeley Earth, CC BY-NC 4.0

The report aligns with an analysis from NOAA’s National Centers for Environmental Information (NCEI) that also concluded that 2025 was the third-hottest year in the global temperature record. NOAA-NCEI calculated that the year was 1.17°C (2.11°F) above the 20th-century global average.

“There are different methodologies for how the global temperature [reports] are created, but the science behind it, the data behind it, by and large, are all shared,” said Karin Gleason, a climate scientist and chief of the monitoring section at NOAA-NCEI.

“The overall trends in temperature are very consistent” among international agencies that track global temperature, she said.

What’s Causing the Spike?

While global average temperatures have been increasing for more than a century, the past 3 years’ warming spike is notably extreme relative to the mostly linear trend of the past 50 years. 

“The magnitude of this recent spike suggests additional factors have amplified recent warming beyond what we would expect from greenhouse gases and natural variability alone,” Rohde said.

The report suggested that reductions in cloud cover and changes to atmospheric aerosols, particularly as a result of new regulations on sulfur pollution from ships in 2020, may be partly to blame for the spike. The Hunga Tonga volcanic eruption in 2022 may have also contributed to warming, though further research is needed to fully understand the eruption’s effects, the report stated.

The El Niño-Southern Oscillation (ENSO), a climate phenomenon that affects heat storage in the ocean, contributed to extreme heat in 2023 and 2024 during the El Niño phase, but remained in a weak La Niña condition for much of 2025. Such a condition would typically be expected to slightly cool global temperatures. Without the effect of La Niña, it’s possible 2025 would have been the hottest year ever recorded, Gleason said.

Gleason pointed out that a similar “warming spike” occurred in 2015 and 2016 as a result of a strong El Niño.

Humanity Faces the Heat

According to Berkeley Earth’s report, about 770 million people across the world experienced their local hottest year ever in 2025. The majority of the large population centers affected by this record-breaking heat were in Asia.

No place on Earth recorded the locally coldest year ever.

An estimated 770 million people experienced the locally hottest year ever recorded in 2025. Credit: Berkeley Earth, CC BY-NC 4.0

The report came as estimates from the Rhodium Group, a think tank, showed that the United States’ greenhouse gas emissions increased by 2.4% in 2025 after 2 years of decline. The United States experienced its fourth-hottest year ever recorded in 2025, according to an analysis from Climate Central, a nonprofit climate change research group, and another analysis by NOAA-NCEI. 

The exceptional warming underscores “how essential sustained monitoring is to understanding [climate] changes in real time,” Kristen Sissener, executive director of Berkeley Earth, said in a statement. “Continued investment in high-quality, resilient, and robust open climate data is critical to ensuring that governments, industry, and local communities can respond based on evidence, not assumptions.”

The Berkeley Earth report predicted that global temperature trends in 2026 will be similar to those of 2025, with 2026 expected to be roughly the fourth-warmest year since records began. 

—Grace van Deelen (@gvd.bsky.social), Staff Writer

14 January: This story has been updated to include information from a Berkeley Earth press briefing.

Citation: van Deelen, G. (2026), The past 3 years have been the three hottest on record, Eos, 107, https://doi.org/10.1029/2026EO260031. Published on 14 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

AI Sheds Light on Hard-to-Study Ocean Currents

Wed, 01/14/2026 - 14:12
Source: Journal of Geophysical Research: Machine Learning and Computation

The Indonesian Throughflow carries both warm water and fresh water from the Pacific into the Indian Ocean. As the only low-latitude current that connects the two bodies of water, it plays a key role in ocean circulation and sea surface temperature worldwide.

The current is as complex as it is important: The seas surrounding Indonesia are home to deep basins and sills and a hodgepodge of ocean processes that make the Indonesian Throughflow difficult to measure. On-the-ground—or, rather, on-the-sea—observations are scarce as well because such observational systems are expensive and difficult to design and maintain.

Wang et al. combined artificial intelligence (AI) modeling techniques with observing system simulation experiment design concepts. Their method used sea surface height measurements to predict the behavior of this influential current and its individual passages and estimate which strait has the greatest effect on the current’s behavior.

The researchers developed a deep learning model that uses two types of networks to conduct observing system simulation experiments. The first, called a convolutional neural network (CNN), is often used for image classification and, in this case, was used to extract trends from data about the Indonesian Throughflow. The second, called a recurrent neural network (RNN), is most commonly used to sort through sequential data. In this work, the RNN processed the trends identified by the CNN and analyzed their changes over time. The approach proved to be much less computationally costly than running a traditional observing system simulation experiment.
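The pipeline described above, spatial feature extraction with a CNN followed by temporal analysis with an RNN, can be illustrated in miniature. This is a toy sketch, not the authors' model: the single convolution filter, the plain recurrent cell, the layer sizes, and the random synthetic sea surface height maps are all assumptions made for illustration, and no training is performed.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(field, kernel):
    """Valid 2D cross-correlation: extract spatial features from one snapshot."""
    kh, kw = kernel.shape
    h = field.shape[0] - kh + 1
    w = field.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(field[i:i + kh, j:j + kw] * kernel)
    return out

def rnn_step(h, x, W_h, W_x):
    """One step of a plain (Elman-style) recurrent cell."""
    return np.tanh(W_h @ h + W_x @ x)

# Synthetic stand-in for a time series of sea surface height maps (T snapshots).
T, H, W = 12, 8, 8
ssh_maps = rng.normal(size=(T, H, W))

kernel = rng.normal(size=(3, 3))      # one CNN filter (toy)
feat_dim = (H - 2) * (W - 2)          # flattened conv output per snapshot
hidden_dim = 4
W_h = rng.normal(size=(hidden_dim, hidden_dim)) * 0.1
W_x = rng.normal(size=(hidden_dim, feat_dim)) * 0.1

h = np.zeros(hidden_dim)
for t in range(T):
    features = conv2d(ssh_maps[t], kernel).ravel()  # CNN: spatial trends
    h = rnn_step(h, features, W_h, W_x)             # RNN: evolution over time

# A final linear readout could map h to a transport estimate (untrained here).
print(h.shape)  # (4,)
```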

The results recapitulated observed water transport trends and showed that sea surface height is a key predictor of conditions in some of the shallower straits between Indonesian islands. The Maluku Strait emerged as a passage where water conditions have a strong influence on the entire system and thus as a strong candidate for future monitoring efforts, the researchers found. Combining information about the Maluku and Halmahera Straits was even more effective at predicting system-wide conditions. (Journal of Geophysical Research: Machine Learning and Computation, https://doi.org/10.1029/2025JH000808, 2025)

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2026), AI sheds light on hard-to-study ocean currents, Eos, 107, https://doi.org/10.1029/2026EO260027. Published on 14 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0

Microplastics Have Widely Varying Effects on Soil

Wed, 01/14/2026 - 14:12

This is an authorized translation of an Eos article.

As global plastic production has skyrocketed, tiny plastic fragments have infiltrated rivers, sea ice, and even our brains. According to a new study, when these minuscule fibers and fragments seep into soil, they change the way the soil interacts with water.

The study, published in Vadose Zone Journal, measured water retention and conductivity in soils from three regions of Germany with and without four different microplastics. The researchers found that a plastic concentration of just 0.4% by mass can change how quickly water flows through soil, depending on both the type of plastic and the type of soil. According to the authors, the altered hydraulic properties are likely due to the hydrophobic nature of plastic and to microplastic particles changing the arrangement of individual soil granules.

Small soil particles stick together to form clumps. The spaces between these clumps form channels through which water, nutrients, and plant roots travel. The size and distribution of these spaces affect soil drainage and water retention capacity, with implications for plant growth.

“A soil’s water characteristics indicate how quickly water drains through the soil, which affects crops and aquifers,” said the study’s lead author, Katharina Neubert, a soil scientist at Forschungszentrum Jülich in Germany.

Previous research has shown that microplastics can alter soil structure and hydraulic properties, but each of those studies examined only one type of soil or one type of plastic. The new study is the first to evaluate how multiple types of microplastics affect multiple types of soil.

The researchers collected soil from three distinct agricultural regions of Germany with different textures, carbon levels, and pH levels. They then obtained four widely used microplastics ranging in size from 300 micrometers to 5 millimeters: polyethylene, polypropylene, polystyrene, and polyester. They broke down the larger particles in a blender and mixed each plastic with each soil type at a concentration of 0.4% by weight. Together with a plastic-free control for each soil type, this yielded 15 unique soil and microplastic combinations.
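The experimental matrix, four plastics plus a plastic-free control crossed with three soils, can be enumerated directly to confirm the count of 15. The soil labels below are placeholders, not the study's names for its three regions:

```python
from itertools import product

# Placeholder labels for the three German agricultural soils.
soils = ["soil_A", "soil_B", "soil_C"]
# Plastic-free control plus the four microplastics used in the study.
treatments = ["control", "polyethylene", "polypropylene",
              "polystyrene", "polyester"]

combinations = list(product(soils, treatments))
print(len(combinations))  # 15 unique soil-treatment combinations
```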

The authors poured each mixture into a metal cylinder connected to a suction device to see how quickly suction drew water out of the soil. They ran the test on both wet and dry soil, because moisture level also influences how quickly water drains through soil.

Unearthing a Nuanced Relationship

All four microplastics altered water flow rates in at least one of the soils, but the magnitude and direction of the effect varied considerably. For example, polyester fibers, commonly shed from some types of clothing, increased the rate at which water flowed through one soil by more than 50% when it was wet but reduced the flow rate by more than 50% under dry conditions.

“All the results are context dependent,” said Rosolino Ingraffia, a soil scientist at the Università degli Studi di Palermo in Italy who was not involved in the research. “It’s very difficult to make a general statement about how soil changes with microplastics.”

Another recent study that Neubert coauthored showed how the differences in flow rates could play out in agriculture. She grew wheat plants in the same three soil types with and without two microplastics: polyethylene and polyester. The results were similarly complicated, with the added plastic increasing, decreasing, or not affecting root growth, depending on the combination.

The 0.4% plastic concentration used in both studies is much higher than what most agricultural fields harbor today, according to Neubert and Ingraffia. For example, arable land that has been treated with biosolids for a decade has concentrations closer to 0.002%. However, calculations based on the current rate of microplastic accumulation suggest that many areas could reach the 0.4% concentration in 50 to 60 years, Ingraffia added.

Neubert hopes her research will lead to regulations that keep microplastics from reaching those levels. Germany plans to phase out the use of nutrient-rich sewage sludge as fertilizer on most agricultural fields, in part because of concerns about plastic pollution, she said. One study identified the practice as one of the main sources of microplastics in German soil.

Keeping plastic out of soil is important because “we don’t yet know what consequences it has for our soils,” Neubert said.

—Mark DeGraff (@markr4nger.bsky.social), Science Writer

This translation by Saúl A. Villafañe-Barajas (@villafanne) was made possible by a partnership with Planeteando and Geolatinas.

Text © 2026. The authors. CC BY-NC-ND 3.0

Where the Tianshan Will Break Next: Strain, Slip, and Seismic Hazard

Wed, 01/14/2026 - 14:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Geophysical Research Letters 

The Tianshan Mountains in Central Asia have produced more than 100 large earthquakes in the past three centuries, showing that many faults in the region are still active. Chang et al. [2025] use the complete set of available GNSS (satellite-based positioning) measurement data, from 936 stations, to map how the crust is currently deforming. From these measurements, surface strain rates are calculated and, using novel inversion methods, an estimate of the seismic potential can be provided.
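The step from GNSS velocities to surface strain rates amounts to taking spatial gradients of the velocity field. The sketch below shows that computation on a toy regular grid; the velocity field, grid spacing, and magnitudes are invented for illustration, and this is not the authors' inversion method (real GNSS stations are scattered and require interpolation or formal inversion first).

```python
import numpy as np

# Toy east (u) and north (v) surface velocity fields on a regular grid (m/yr),
# with 10 km spacing. Invented values, chosen only to make the gradients simple.
ny, nx = 5, 5
dx = dy = 10_000.0  # grid spacing in meters
x = np.arange(nx) * dx
y = np.arange(ny) * dy
X, Y = np.meshgrid(x, y)
u = 2e-10 * X    # eastward velocity grows to the east -> extension (exx > 0)
v = -1e-10 * Y   # northward velocity shrinks to the north -> contraction (eyy < 0)

# Horizontal strain rate tensor components from velocity gradients (1/yr).
du_dy, du_dx = np.gradient(u, dy, dx)
dv_dy, dv_dx = np.gradient(v, dy, dx)
exx = du_dx
eyy = dv_dy
exy = 0.5 * (du_dy + dv_dx)

dilatation = exx + eyy                                  # areal strain rate
max_shear = np.sqrt(((exx - eyy) / 2) ** 2 + exy ** 2)  # maximum shear rate

print(float(dilatation.mean()), float(max_shear.mean()))
```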

The authors find that most deformation (about 70%) is concentrated in the western Tianshan, where mapped faults accommodate roughly 60% of this strain. By comparing these results with the history of past earthquakes, the study identifies 20 fault segments with a slip “deficit,” that is, segments capable of producing future earthquakes of magnitude 7 or larger.

This work provides the first region-wide model of slip deficit and seismic potential for Tianshan and offers information that can directly improve seismic hazard assessments in Central Asia. The findings are especially timely following the 2024 Mw 7.0 Wushi earthquake.

Citation: Chang, F., Fang, J., Dong, S., Yin, H., Rollins, C., Elliott, J. R., & Hooper, A. J. (2025). Geodetic strain rates, slip deficit rates, and seismic potential in the Tianshan, Central Asia. Geophysical Research Letters, 52, e2025GL118470. https://doi.org/10.1029/2025GL118470   

—Fabio A. Capitanio, Editor, Geophysical Research Letters

Text © 2026. The authors. CC BY-NC-ND 3.0

Melting Glaciers Mix Up Waters More Than We Thought

Tue, 01/13/2026 - 14:12
Source: Journal of Geophysical Research: Oceans

As marine-terminating glaciers melt, freshwater released at the seafloor mixes with salty seawater and influences circulation patterns. As the oceans warm, it’s growing increasingly important to study this process. Researchers do so using the framework of buoyant plume theory, which describes how rising freshwater interacts with denser salt water. Falling chunks of ice, which can easily crush boats, make working near glaciers dangerous. Thus, empirical data that can verify buoyant plume theory have rarely been collected.

Ovall et al. helped fill this gap by using remotely operated kayaks equipped with instruments to monitor the features of water flowing out from Xeitl Sít’ (also called LeConte Glacier) in southeastern Alaska. Their work marked the first time researchers took measurements of a plume’s size, shape, and velocity from directly above the upwelling plume.

The robotic kayaks allowed the researchers to observe the plume of rising freshwater without risking their own safety. Instruments aboard the kayaks sent acoustic signals downward, which bounced off particles within the rising plume to measure its velocity.

The volume and characteristics of the rising plume of water are substantially different from those predicted by buoyant plume theory, they found. The study’s measurements found that upwelling water moves at rates of more than a meter per second. Buoyant plume theory doesn’t capture the extent to which freshwater pulls salt water into the rising plume, leading researchers to underestimate the volume of the plume by as much as 50%. That mismatch likely arose in part because scientists underestimated how the shape of a glacier’s submarine portion affects the interaction between freshwater and ocean water. However, the authors note, there are likely other factors at play that have not yet been identified. (Journal of Geophysical Research: Oceans, https://doi.org/10.1029/2025JC022902, 2025)

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2025), Melting glaciers mix up waters more than we thought, Eos, 106, https://doi.org/10.1029/2025EO250474. Published on 13 January 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Microbial Genes Could Improve Our Understanding of Water Pollution

Tue, 01/13/2026 - 14:12
Source: Journal of Geophysical Research: Biogeosciences

Underground environments like soil and aquifers teem with microbial life. These tiny microbes play a big role in cycling nutrients and breaking down or transforming pollutants. However, scientists still struggle to reliably model how microbes grow and decay.

Most studies of groundwater microbe communities focus on free-floating planktonic microbes, which make up less than 10% of an aquifer’s microbial population. The majority of microbes in groundwater are attached to sediment, making examination more difficult. Many studies are also done in labs, rather than on site.

Strobel et al. set out to study whether tracking biomarkers, such as specific genes produced by microbes during their life cycles, can improve models aimed at predicting how well microbes degrade pollutants in aquifers. They conducted research in southwestern Germany’s Ammer River floodplain, where groundwater sources with low oxygen levels and sediment with a high organic carbon content were ideal for microbial denitrification (the reduction of nitrate to nitrogen gas) to occur. The team constructed two 8.4-meter-deep wells surrounded by PVC casings and inserted seven microbial trapping devices (MTDs)—containers of sterilized sediment packed into a filter that served as a proxy for the microbial community in the aquifer matrix—into one of the wells. The MTDs remained submerged for 4.5 months prior to any experiments to allow the microbial community time to adapt to the environment and proliferate.

During a roughly 10-day period, while the MTDs were in the outflow well, the researchers injected nitrate-rich groundwater at the inflow well and extracted groundwater from the outflow well. The presence of nitrate, a pollutant that comes from sources such as fertilizer and sewage waste, spurred the microbial community into the process of denitrification. The team monitored the concentration of nitrate at the outflow and periodically withdrew an MTD to be transported to a lab for DNA analysis.

The growing abundance of key denitrification genes (napA and narG) in the earlier samples, followed by a decline in the later samples, indicated a dynamic microbial response to the added nitrate. When the researchers used mathematical models to match their observations, they found that microbial growth during denitrification controls the extent of nitrate removal. The researchers note that though MTDs are not a perfect proxy for real aquifers, the findings provide insight into the use of biomarkers to track biogeochemical processes, such as denitrification, in nature. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1029/2025JG009181, 2025)
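The role of microbial growth in setting nitrate removal can be illustrated with a minimal growth-coupled model. The Monod form and every parameter value below are assumptions chosen for illustration, not the study's calibrated model:

```python
# Minimal sketch of growth-coupled nitrate removal (Monod kinetics integrated
# with forward Euler). All parameter values are invented for illustration.
def simulate(n0=10.0, b0=0.1, mu_max=0.5, K=2.0, Y=0.4, decay=0.05,
             dt=0.01, steps=2000):
    """Track nitrate n (mg/L) consumed by growing biomass b (mg/L)."""
    n, b = n0, b0
    for _ in range(steps):
        mu = mu_max * n / (K + n)   # Monod growth rate, limited by nitrate
        dn = -(mu / Y) * b          # nitrate consumed per unit of growth
        db = (mu - decay) * b       # growth minus first-order decay
        n = max(n + dn * dt, 0.0)
        b = max(b + db * dt, 0.0)
    return n, b

n_final, b_final = simulate()
# Biomass rises while nitrate lasts, then declines once nitrate is exhausted,
# mirroring the rise-then-fall of denitrification gene abundance.
print(n_final < 1.0, b_final > 0.1)
```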

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen R. (2026), Microbial genes could improve our understanding of water pollution, Eos, 107, https://doi.org/10.1029/2026EO260015. Published on 13 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0

Are We Really Seeing More Foreshocks with Enhanced Catalogs?

Tue, 01/13/2026 - 14:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Journal of Geophysical Research: Solid Earth

Foreshocks are smaller earthquakes that sometimes occur before bigger ones, and studying them could help scientists issue early warnings and understand how large earthquakes begin. But because scientists use different ways to define and find foreshocks, estimates of how often they happen before big earthquakes in Southern California vary widely, from 19% to 72%.

Khan et al. [2025] examined both standard earthquake catalogs and special “enhanced” catalogs containing more small events to figure out why these estimates differ so much. They found that a simple method, checking for small quakes near big ones in space and time, can yield high foreshock rates, but those rates are comparable between standard and enhanced catalogs. Defining foreshocks against the statistics of past seismicity is better, but the choice of statistical representation matters. Assuming a constant average rate of past earthquakes (a Poisson distribution) produces the highest foreshock rates and makes the results most sensitive to magnitude cutoffs and catalog choice. The authors’ preferred method uses statistical distributions that account for variations in past earthquake rates, yielding foreshock rates that are more reliable and less sensitive to the magnitude cutoff or the type of catalog used.
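The Poisson baseline can be made concrete: given a long-term background rate of small earthquakes near a fault, how unlikely is the count observed just before a mainshock? The rates and counts below are invented for illustration, and this simple check is not the authors' preferred method, which accounts for variations in past earthquake rates:

```python
from math import exp, factorial

def poisson_sf(k, lam):
    """P(N >= k) for a Poisson(lam) count: the survival function."""
    return 1.0 - sum(lam**i * exp(-lam) / factorial(i) for i in range(k))

# Invented numbers: a background expectation of 0.2 small quakes per window
# near this fault patch, and 3 small quakes observed before the mainshock.
background_rate = 0.2   # expected events per pre-mainshock window
observed = 3

p = poisson_sf(observed, background_rate)
anomalous = p < 0.01    # flag as foreshock activity if very unlikely by chance
print(round(p, 5), anomalous)
```

Because a constant-rate Poisson baseline ignores natural swings in seismicity, bursts of ordinary background activity can be misread as foreshocks, which is one way the choice of statistical representation inflates foreshock rates.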

This study clears up confusion about the wide range of foreshock rates reported by previous studies in the same region and is the most thorough review of foreshock studies in Southern California so far. The authors also provide clear definitions, guidelines, and computer code for other researchers to use. They emphasize the need to carefully consider biases in data and statistical methods when searching for precursory signals before large earthquakes and offer useful tips for improving short-term earthquake forecasts in the future.

Citation: Khan, R. A., Werner, M. J., Biggs, J., & Fagereng, Å. (2025). Effect of mainshock selection, earthquake catalog and definition on foreshock rate estimates in Southern California. Journal of Geophysical Research: Solid Earth, 130, e2024JB030733. https://doi.org/10.1029/2024JB030733

—Xiaowei Chen, Associate Editor, JGR: Solid Earth

Text © 2026. The authors. CC BY-NC-ND 3.0

Hundreds of Scientists “Vehemently Oppose” U.S. Effort to Purchase Greenland

Mon, 01/12/2026 - 20:49
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

More than 200 scientists have signed a letter condemning U.S. President Donald Trump’s efforts to acquire Greenland.

“Greenland’s scientists and citizens have made enormous contributions to the world’s understanding of the Arctic and how rapid Arctic changes are affecting people around the world,” the letter reads. “To Greenlanders: Qujanaq, and we stand with you.”

It follows another letter issued in February 2025, which called the effort “a dangerous distraction from the urgent work of addressing environmental change impacts to U.S. citizens.”

The president first expressed interest in buying Greenland, an autonomous territory of Denmark, in 2019, during his first term in office, and has mentioned it throughout his second term. The campaign for the acquisition has intensified in the wake of the United States’ seizure of Venezuelan President Nicolás Maduro.

 

Greenland is rich in oil and in minerals such as lithium, copper, and rare earths. However, Malte Humpert, founder and senior fellow at The Arctic Institute, told CNN that the idea of extensive rare earth mining on the island is “completely bonkers.”

“You might as well mine on the Moon,” he said. “In some respects, it’s worse than the Moon.”

Greenland is also strategically located between the North American and Eurasian Arctic. Its northwest coast is also home to the U.S. Pituffik Space Base.

“If we don’t take Greenland, Russia or China will take Greenland, and I am not going to let that happen,” Trump told reporters on 11 January from Air Force One. “One way or the other, we’re going to have Greenland … They need us much more than we need them.”

“Times have changed since Inuit lands were mere commodities that could be bought and sold,” wrote Sara Olsvig, chair of the Inuit Circumpolar Council, in a January 2025 statement. “In today’s world, we are active participants in decision-making about our lands and resources. We are beyond the times of typical colonial attitudes of superiority.”

In a LinkedIn post last week, Greenland’s prime minister, Jens-Frederik Nielsen, called the rhetoric “totally unacceptable” and “disrespectful.” A statement issued by the leaders of several European countries affirmed that “Greenland belongs to its people.”

Greenland is a critical location for climate science research, and many researchers have expressed concerns about how a U.S. takeover could affect this international scientific enterprise.

“Anything that injures our long-standing friendly relationship with Greenland is also an injury to science,” Yarrow Axford, a paleoclimatologist and one of the creators of the letter, wrote in an email to Eos. “There’s so much climate science and other important work that can only be done in Greenland, and only in partnership with Greenland’s people. I hope we can all weather this latest storm together.”

Mia Tuccillo, a paleolimnologist and Arctic scientist who is advised by Axford and also helped author the letter, wrote in an email to Eos that the research collaborations between the two nations are relatively new and are delicate because of the history of U.S. intervention in Greenland.

“The statements by our government and by Trump that challenge Greenland’s sovereignty directly threaten these new priorities and collaborations—things that have greatly revolutionized and improved the ethos of geosciences—and things that are still very new and very, very valuable,” Tuccillo wrote.

“A unilateral US takeover threatens to disrupt the open scientific collaboration that is helping us understand the threat of global sea-level rise,” wrote glaciologist Martin Siegert in The Conversation.

The U.S. scientists behind the letter also issued a statement expressing solidarity with Greenland. Many shared (unattributed) personal messages at the end of the letter.

“Greenland is a unique culture and a critical part of the earth’s climate system, not a pawn in a real estate deal,” wrote one scientist.

“Without the help, knowledge, and skills of people in Greenland, we would have never been able to even reach our field site, let alone conduct our research. When Greenlanders lead the way, our science improves and becomes more useful and relevant to both local and international communities,” wrote another.

—Emily Gardner (@emfurd.bsky.social), Associate Editor

Editor’s note: This article has been updated to correctly differentiate between the letters issued in February 2025 and January 2026.

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2026. AGU. CC BY-NC-ND 3.0

New Insights into the Foggy Role of Contrails Within Clouds

Mon, 01/12/2026 - 14:04

When airplanes create trails of soot and moisture, water in the atmosphere condenses on the particles and freezes, leaving behind the familiar streaks known as condensation trails—or contrails. Contrails are so frequently the target of conspiracy theories that it might seem as though the word is a portmanteau of “conspiracy trails.” And although contrails do not contain harmful chemicals, these bands of condensation can, in fact, affect the atmosphere, with some reports suggesting that they account for more than half of aviation’s warming effect on the climate.

Most of these traces of air travel vanish within minutes of a plane’s passage. To have any effect on even local climate conditions, the air must be cold and humid enough for the contrails to last on the order of hours so that they can spread into a thin blanket of high-altitude ice crystals that captures some of Earth’s outgoing heat.

Though contrails are most recognizable when they pull a fresh veil across a clear sky, it’s within preexisting cirrus clouds that relevant climate conditions are most common. Exactly what percentage of condensation deposits form within clouds and what that means for their effects on the climate, though, have long been uncertain.

“We didn’t expect that.”

Now, new research in Nature Communications aims to sharpen scientists’ understanding of how contrails embedded within high-altitude cirrus clouds affect the climate.

Along with his team, Andreas Petzold, an atmospheric scientist at the research institution Forschungszentrum Jülich, examined 7 years of temperature and humidity data collected by sensors aboard passenger aircraft that together covered a combined 17 million kilometers (10.6 million miles) of flights. They combined these data with satellite-based weather observations to determine how often the conditions for long-lived contrails are met both inside and outside of extant clouds.

Though Petzold expected that the majority of contrails would form in regions preseeded with clouds, he didn’t anticipate the scale. “The fraction was so huge,” he said. “We didn’t expect that.”

In the flight corridors of the Northern Hemisphere over eastern North America, the North Atlantic, and western Europe, where the bulk of data were collected, roughly 90% of long-lived contrails formed within preexisting clouds. Many climate models, however, assume that the atmospheric imprints of aircraft are stamped on clear skies.

The net climate effect of a contrail changes depending on the thickness of the cloud in which it forms. Thicker cirrus clouds can buffer the warming that contrails might contribute and can even lead to local cooling. But when contrails appear in thin clouds (many so thin that the eye can’t see them), their warming effect can be even stronger than if they had formed in clear skies.

The new findings mean that the relationship between contrails and the climate is more complex than previously realized. “We need to get a quantification of the effects from model studies,” Petzold said, “because we’ve shown that this is such a big fraction, but we do not know how they impact the whole picture.”

Cirrus Streaks

During the day, the Sun’s heat can make a cirrus cloud thickened by a contrail more reflective, creating a local cooling effect. But at night, this contrail thickening traps heat and increases local warming.

In another study published just a few weeks after Petzold’s, a Leipzig University research group studied contrails’ climate effects by examining more than 40,000 contrails that planes streaked through cirrus clouds over a 6-year span.

They found that on average, embedded contrails contributed just 5 milliwatts per square meter of warming across the planet—a measure of the change in radiative forcing occurring at any given moment in time. That’s a paltry sum compared to the 3,320 milliwatts per square meter of warming caused by greenhouse gases emitted over the industrial era, as estimated by the Intergovernmental Panel on Climate Change.
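For a sense of scale, the two forcing values can be put side by side. The quick calculation below uses only the numbers as reported in the text; the variable names are ours:

```python
# Comparing the two radiative forcing values quoted above,
# both in milliwatts per square meter (mW/m^2).
contrail_forcing = 5     # global-average warming from embedded contrails
ghg_forcing = 3320       # industrial-era greenhouse gas warming (IPCC estimate)

share = contrail_forcing / ghg_forcing
print(f"Embedded contrails contribute about {share:.2%} of the GHG forcing")
```

That works out to roughly 0.15%, consistent with the article’s characterization of a “paltry sum.”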

“Cirrus are quite important for climate in general.”

Though neither study overturns scientists’ understanding of the way contrails form, said Michael Diamond, a cloud physicist at Florida State University, “one of the really big advances here is just how much high-quality data they’re bringing to bear.”

The data collected by Petzold’s team could help inform future studies of the internal microphysics of cirrus clouds, which is important not only for a better understanding of the climate consequences of aviation but also because “cirrus are quite important for climate in general,” Diamond said. Cirrus is the only cloud type that traps more heat than it reflects, so understanding whether cirrus clouds will become more or less frequent as climate change progresses is a key question to answer.

Results like Petzold’s can also help inform the work that many in and around the aviation industry are doing to improve forecasting so that aviators can follow flight paths that limit the potential for long-lived contrails to form. And though finding ways to eliminate aviation emissions entirely through sustainable fuel sources or battery-powered planes is essential, decreasing the formation of the most impactful forms of long-lived contrails would make a meaningful difference in reducing the near-term warming caused by air travel.

—Syris Valentine (@shapersyris.bsky.social), Science Writer

Citation: Valentine, S. (2026), New insights into the foggy role of contrails within clouds, Eos, 107, https://doi.org/10.1029/2026EO260024. Published on 12 January 2026. Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Rethinking How to Measure Roots

Mon, 01/12/2026 - 14:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Roots are essential plant organs responsible for the uptake of water and nutrients from soil. However, they are largely hidden from view and notoriously hard to quantify. Roots are often characterized by their mass distribution with depth, which involves separating and weighing roots with diameters below a cutoff (often 2 millimeters). However, this approach emphasizes the largest roots, which contain most of the mass, while the very fine roots with little mass are responsible for most of the biogeochemical functioning.

Billings et al. [2025] have developed a relatively simple method for estimating the volume of soil interacting with fine and coarser roots by quantifying root abundance instead of mass. They show that the abundance of fine roots does not decline as fast as overall root mass with increasing soil depth. Their results upend the standard paradigm of exponential decline in root functions set by root mass measurements and indicate that a new paradigm is needed that links fine-root depth distributions with their hydrological, geochemical, and ecological functions.

Citation: Billings, S. A., Sullivan, P. L., Li, L., Hirmas, D. R., Nippert, J. B., Ajami, H., et al. (2025). Contrasting depth dependencies of plant root presence and mass across biomes underscore prolific root-regolith interactions. AGU Advances, 6, e2025AV002072.  https://doi.org/10.1029/2025AV002072

—Susan Trumbore, Editor, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0

Binaliw: the massive garbage landslide in Cebu City, the Philippines

Mon, 01/12/2026 - 08:02

Recovery operations continue for the 36 victims of the 8 January 2026 garbage landslide in the Philippines.

Recovery operations are continuing at the site of the 8 January 2026 landslide at Binaliw in Cebu, the Philippines. At the time of writing, it is reported that the bodies of eight victims have been recovered, whilst 28 more remain missing. Whilst there were some reports yesterday of signs of life in the debris, the reality is that this is unlikely to be a rescue operation. A further 18 people were injured in the failure.

The location of the landslide is [10.41609, 123.92159].

Recovery operations have been hindered by heavy rainfall and the potential for a further failure at the site. Garbage also generates methane, which represents an additional risk.

There is some footage of the landslide as it occurred posted to YouTube:-

There is also a really good set of drone footage of the aftermath:-

This image, from the drone footage, captures the situation well:-

The aftermath of the 8 January 2026 garbage landslide at Binaliw in the Philippines. Image from a drone video posted to YouTube by The Daily Guardian, courtesy of Reuters.

The victims are believed to be located in the destroyed building at the foot of the Binaliw landslide.

Note the very steep rear scarp of the landslide. It appears that the failure mechanism at the crown was rotational – the remains of a rotated block can be seen forming a bench across the site – with the lower portion transitioning into a flow.

Rotational landslides typically occur in relatively homogeneous materials (which, at the scale of the landslide, will often be the case for garbage). At the most simple level, it is likely that the garbage pile was over-steepened, perhaps compounded by poor management of water. Work will be needed to understand how that can have occurred, but the processes through which the tipping of waste at the top of the pile steepened the slope will be a focus. I would also consider carefully the road that appears to have crossed the waste upslope of the building (now buried). Did that cause local oversteepening?

I have written about garbage landslides repeatedly over the years. In 2011, I highlighted an event at Baguio in the Philippines. In every case, the losses were preventable.

Text © 2026. The authors. CC BY-NC-ND 3.0

Central China Water Towers Provide Stable Water Resources Under Change

Fri, 01/09/2026 - 15:24
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

The mountains ringing the Pacific Rim—stretching from the Andes to the Rockies, the Himalayas, and beyond—act as natural “water towers.” They host huge reserves of water that are stored in snowpack, glaciers, lakes, and soils, and then feed rivers and supply freshwater to billions of people downstream.

Yue et al. [2026] examine how climate change affects freshwater supply from water towers by analyzing a new dendrochronological network of 100 tree-ring sampling sites. They first reconstruct Central China Water Tower (CCWT) runoff back to 1595. Then, by considering projections from climate models, the authors reveal increasing runoff across most Pacific Rim water towers, whereas water resources from the Northern Rocky Mountains are projected to decline substantially. These differences are attributed to distinct geographies and synoptic climatic conditions. The findings provide insights for adaptive management strategies in China.

Citation: Yue, W., Torbenson, M. C. A., Chen, F., Reinig, F., Esper, J., Martinez del Castillo, E., et al. (2026). Runoff reconstructions and future projections indicate highly variable water supply from Pacific Rim water towers. AGU Advances, 7, e2025AV002053.  https://doi.org/10.1029/2025AV002053

—Alberto Montanari, Editor-in-Chief, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0

In 2025, the Ocean Stored a Record-Breaking Amount of Heat, Again

Fri, 01/09/2026 - 14:23
Research & Developments is a blog for brief updates that provide context for the flurry of news that impacts science and scientists today.

The ocean soaked up more heat last year than in any year since modern measurements began around 1960, according to a new analysis published in Advances in Atmospheric Science.

The world’s oceans absorb more than 90% of excess heat trapped in Earth’s atmosphere by greenhouse gas emissions. As heat in the atmosphere accumulates, heat stored in the ocean increases, too, making ocean heat a reliable indicator of long-term climate change. 

Ocean temperatures influence the frequency and intensity of marine heatwaves, change atmospheric circulation, and govern global precipitation patterns. 

Scientists measure the ocean’s heat in different ways. One common metric is global annual mean sea surface temperature, the average temperature in the top few meters of ocean waters. Global sea surface temperature in 2025 was the third warmest ever recorded, at about 0.5°C (0.9°F) above the 1981–2010 average.

“Last year was a bonkers, crazy warming year.”

Another metric is ocean heat content, which measures the total heat energy stored in the world’s oceans. It’s measured in zettajoules: One zettajoule is equivalent to 1,000,000,000,000,000,000,000 joules. To measure heat content in 2025, the study’s authors assessed ocean observational data from the upper 2,000 meters of the ocean, where most of the heat is absorbed, from NOAA’s National Centers for Environmental Information, the European Union’s Copernicus Climate Change Service, and the Chinese Academy of Sciences. 

They found that in total, the ocean absorbed an additional 23 zettajoules of heat energy in 2025, breaking the ocean heat content record for the ninth consecutive year and marking the longest sequence of consecutive ocean heat content records ever recorded.

“Last year was a bonkers, crazy warming year,” John Abraham, a mechanical engineer at the University of St. Thomas and a coauthor of the new study, told Wired.

Twenty-three zettajoules in one year is equivalent to the energy of 12 Hiroshima bombs exploding in the ocean every second. It’s also a large increase over the 16 zettajoules of heat the ocean absorbed in 2024. The hottest areas of the ocean observed in 2025 were the tropical and South Atlantic, Mediterranean Sea, North Indian Ocean, and Southern Ocean. 
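The bomb comparison can be sanity-checked with a back-of-envelope calculation. The sketch below assumes the commonly used figure of about 15 kilotons of TNT (roughly 6.3 × 10¹³ joules) per Hiroshima-sized explosion, a value not stated in the article itself:

```python
# Back-of-envelope check of the "12 Hiroshima bombs per second" figure.
# Assumes ~15 kilotons of TNT per bomb (a common approximation).
OCEAN_HEAT_GAIN_J = 23e21               # 23 zettajoules absorbed in 2025
SECONDS_PER_YEAR = 365.25 * 24 * 3600
HIROSHIMA_J = 15_000 * 4.184e9          # 15 kt TNT at 4.184e9 J per ton

heating_rate_w = OCEAN_HEAT_GAIN_J / SECONDS_PER_YEAR   # average watts
bombs_per_second = heating_rate_w / HIROSHIMA_J

print(f"{heating_rate_w:.2e} W, or about {bombs_per_second:.0f} bombs per second")
```

The average heating rate comes out near 7 × 10¹⁴ watts, or a little under 12 bombs’ worth of energy every second.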

 

The results provide “direct evidence that the climate system is out of thermal equilibrium and accumulating heat,” the authors write.

A hotter ocean favors increased global precipitation and fuels more extreme tropical storms. In the past year, warmer global temperatures were likely partly responsible for the damaging effects of Hurricane Melissa in Jamaica and Cuba, heavy monsoon rains in Pakistan, severe flooding in the Central Mississippi Valley, and more.

“Ocean warming continues to exert profound impacts on the Earth system,” the authors wrote.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

9 January: This article was updated to correct the conversion of 23 zettajoules to Hiroshima bomb explosions.

These updates are made possible through information from the scientific community. Do you have a story about science or scientists? Send us a tip at eos@agu.org. Text © 2026. AGU. CC BY-NC-ND 3.0

Managing Carbon Stocks Requires an Integrated View of the Carbon Cycle

Fri, 01/09/2026 - 14:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Managing carbon stocks in the land, ocean, and atmosphere under changing climate requires a globally integrated view of carbon cycle processes at local and regional scales. The growing Earth Observation (EO) record is the backbone of this multi-scale system, providing local information with discrete coverage from surface measurements and regional information at global scale from satellites.

Carbon flux information, anchored by inverse estimates from spaceborne Greenhouse Gas (GHG) concentrations, provides an important top-down view of carbon emissions and sinks, but currently lacks global continuity at assessment and management scales (less than 100 kilometers). Partial-column data can help separate signals in the boundary layer from the overlying atmosphere, providing an opportunity to enhance surface sensitivity and bring flux resolution down from that of column-integrated data (100–500 kilometers).

As described in Parazoo et al. [2025], the carbon cycle community envisions a carbon observation system leveraging GHG partial columns in the lower and upper troposphere to weave together information across scales from surface and satellite EO data, and integration of top-down / bottom-up analyses to link process understanding to global assessment. Such an actionable system that integrates existing and new EO data and inventories using advanced top-down and bottom-up analyses can help address the diverse and shifting needs of carbon management stakeholders.

Diverse carbon cycle science needs span multiple time (x-axis) and space (y-axis) scales across land (green shading), ocean (blue shading), and fossil (orange shading) sectors. Science needs addressed by the current and planned carbon flux and biomass Earth Observation (EO) program of record (PoR; purple and green, respectively) are depicted by the solid circle. Key EO science gaps exist at 1–100 kilometer spatial scale spanning sub-seasonal impacts of climate extremes and wildfires, interannual change and biomass, long term changes in growth, storage, and emissions, and carbon-climate feedbacks and tipping points (grey shading). Future GHG and biomass observing systems (e.g., dashed circles) will provide important benefits to carbon management efforts. Credit: Parazoo et al. [2025], Figure 1

Citation: Parazoo, N., Carroll, D., Abshire, J. B., Bar-On, Y. M., Birdsey, R. A., Bloom, A. A., et al. (2025). A U.S. scientific community vision for sustained earth observations of greenhouse gases to support local to global action. AGU Advances, 6, e2025AV001914.  https://doi.org/10.1029/2025AV001914

—Don Wuebbles, Editor, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0

New River Chemistry Insights May Boost Coastal Ocean Modeling

Fri, 01/09/2026 - 13:46
Source: Global Biogeochemical Cycles

Rivers deliver freshwater, nutrients, and carbon to Earth’s oceans, influencing the chemistry of coastal seawater worldwide. Notably, a river’s alkalinity and the levels of dissolved inorganic carbon it brings to the sea help to shape regional conditions for marine life, including shellfish and corals. These factors also affect the ability of coastal seawater to absorb carbon dioxide from Earth’s atmosphere—which can have major implications for climate change.

However, the factors influencing river chemistry are complex. Consequently, models for predicting worldwide carbon dynamics typically simplify or only partially account for key effects of river chemistry on coastal seawater. That could now change with new river chemistry insights from Da et al. By more realistically accounting for river inputs, the researchers show that earlier models significantly overestimated the amount of carbon dioxide absorbed by the coastal ocean.

The researchers used real-world data on rivers around the world to analyze how factors such as forest cover, carbonate-containing rock, rainfall, permafrost, and glaciers in a watershed influence river chemistry. In particular, they examined how these factors affect a river’s levels of dissolved inorganic carbon as well as its total alkalinity—the ability of the water to resist changes in pH.

The researchers found that variations in total alkalinity between the different rivers were primarily caused by differences in watershed forest cover, carbonate rock coverage, and annual rainfall patterns. Between-river variations in the ratio of dissolved inorganic carbon to total alkalinity were significantly shaped by carbonate rock coverage and the amount of atmospheric carbon dioxide taken up by photosynthesizing plants in the watershed, they found.

The analysis enabled the researchers to develop new statistical models for using watershed features to realistically estimate dissolved inorganic carbon and total alkalinity levels at the mouths of rivers, where they flow into the ocean.
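The paper’s actual statistical models are not reproduced here, but the general approach—regressing river chemistry on watershed features—can be sketched with ordinary least squares. Everything below (the feature set, the synthetic data, and the coefficients) is a made-up illustration, not output from Da et al.:

```python
# Illustrative sketch: predicting a river's total alkalinity from watershed
# features via ordinary least squares. All numbers here are hypothetical.
import random

random.seed(0)

# Hypothetical watershed features: fraction forest cover, fraction carbonate
# rock, annual rainfall (m). A synthetic "true" relationship for illustration.
def make_sample():
    forest = random.random()
    carbonate = random.random()
    rain = random.uniform(0.3, 2.0)
    alkalinity = 200 + 900 * carbonate - 150 * forest - 80 * rain \
                 + random.gauss(0, 10)  # made-up scale
    return [1.0, forest, carbonate, rain], alkalinity

X, y = zip(*(make_sample() for _ in range(200)))

# Solve the normal equations (X^T X) b = X^T y by Gaussian elimination.
def lstsq(X, y):
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                      # forward elimination
        for j in range(i + 1, k):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            b[j] -= f * b[i]
    coef = [0.0] * k
    for i in reversed(range(k)):            # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, k))) / A[i][i]
    return coef

coef = lstsq(list(X), list(y))
print("intercept, forest, carbonate, rain:", [round(c, 1) for c in coef])
```

In a real workflow, each row would be an observed river with measured chemistry and mapped watershed properties, and the fitted coefficients would then estimate alkalinity and dissolved inorganic carbon at unmonitored river mouths.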

When incorporated into a global ocean model, the improved river chemistry estimates significantly reduced the overestimation of carbon dioxide taken up by coastal seawater. In other words, compared with prior ocean modeling results, the new results were more in line with real-world, data-based calculations of carbon dioxide absorption.

This study demonstrates the importance of accurately accounting for river chemistry when making model-based predictions of carbon cycling and climate change. More research is needed to further refine river chemistry estimates to enable even more accurate coastal ocean modeling. (Global Biogeochemical Cycles, https://doi.org/10.1029/2025GB008528, 2025)

—Sarah Stanley, Science Writer

Citation: Stanley, S. (2026), New river chemistry insights may boost coastal ocean modeling, Eos, 107, https://doi.org/10.1029/2026EO260022. Published on 9 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0

The Looming Data Loss That Threatens Public Safety and Prosperity

Fri, 01/09/2026 - 13:45

From farming and engineering to emergency management and insurance, many industries critical to daily life rely on Earth system and related socioeconomic datasets. NOAA has linked its data, information, and services to trillions of dollars in economic activity each year, and roughly three quarters of U.S. Fortune 100 companies use NASA Earth data, according to the space agency.

Such data are collected in droves every day by an array of satellites, aircraft, and surface and subsurface instruments. But for many applications, not just any data will do.

Leaving reference quality datasets (RQDs) to languish, or losing them altogether, would represent a dramatic shift in the country’s approach to managing environmental risk.

Trusted, long-standing datasets known as reference quality datasets (RQDs) form the foundation of hazard prediction and planning and are used in designing safety standards, planning agricultural operations, and performing insurance and financial risk assessments, among many other applications. They are also used to validate weather and climate models, calibrate data from other observations that are of less than reference quality, and ground-truth hazard projections. Without RQDs, risk assessments grow more uncertain, emergency planning and design standards can falter, and potential harm to people, property, and economies becomes harder to avoid.

Yet some well-established, federally supported RQDs in the United States are now slated to be, or already have been, decommissioned, or they are no longer being updated or maintained because of cuts to funding and expert staff. Leaving these datasets to languish, or losing them altogether, would represent a dramatic—and potentially very costly—shift in the country’s approach to managing environmental risk.

What Is an RQD?

No single definition exists for what makes a dataset an RQD, although they share common characteristics, including that they are widely used within their respective user communities as records of important environmental variables and indicators. RQDs are best collected using observing systems designed to produce highly accurate, stable, and long-term records, although only a few long-term observing systems can achieve these goals.

As technological advances and operating constraints are introduced, specialized efforts are needed to integrate new and past observations from multiple observing systems seamlessly. This integration requires minimizing biases in new observations and ensuring that these observations have the broad spatial and temporal coverage required of RQDs (Figure 1). The nature of these efforts varies by the user community, which sets standards so that the datasets meet the specific needs of end users.

Fig. 1. Various satellite sensors provide total precipitable water (TPW) data products characterizing the integrated amount of water vapor available throughout the atmospheric column. However, each of these products has biases and sampling errors because of differences in the algorithms, sensors, and spatial and temporal sampling resolutions on which they are based. NOAA’s Cooperative Institute for Research in the Atmosphere produces a unified, or blended, TPW—an example of which is shown here—that merges all available TPW products. Click image for larger version. Credit: NOAA

The weather and climate community—which includes U.S.- and internationally based organizations such as NOAA, NASA, the National Research Council, and the cosponsors of the Global Climate Observing System (GCOS)—has agreed upon principles to guide the development of RQDs [Bojinski et al., 2014; National Research Council, 1999]. For example, data must account for changes in observing times, frequency of observations, instruments, calibration, and undesirable local effects (e.g., obstructions affecting the instruments’ sensors). These RQDs are referred to as either thematic or fundamental climate data records depending on the postprocessing involved (e.g., sensor-detected satellite radiances (fundamental) versus a postprocessing data product such as integrated atmospheric water vapor (thematic)).

Another important attribute of RQDs is that their data are curated to include detailed provenance tracking, metadata, and information on validation, standardization, version control, archiving, and accessibility. The result of all this careful collection, community input, and curation is data that have been rigorously evaluated for scientific integrity, temporal and spatial consistency, and long-term availability.

An Anchor to Real-World Conditions

RQDs are crucial in many ways across sectors. They are vital, for example, in realistically calibrating and validating projections and predictions of environmental hazards by weather, climate, and Earth system models. They can also validate parameterizations used to represent physical processes in models and ground global reanalysis and gridded model products in true ambient conditions [Thorne and Vose, 2010].

RQDs have become even more important with the rapid emergence of artificial intelligence weather forecasting approaches.

Without these reference data to anchor them, the outputs of large-scale high-resolution gridded climate datasets (e.g., PRISM (Parameter-elevation Regressions on Independent Slopes Model), E-OBS, IMERG (Integrated Multi-satellite Retrievals for GPM), CHELSA-W5E5) can drift systematically. Over multidecadal horizons, this drift degrades our ability to separate genuine Earth system changes and variations from artifacts. RQDs have become even more important with the rapid emergence of artificial intelligence (AI) weather forecasting approaches, which must be trained on observations and model outputs and thus can inherit their spatial and temporal biases.

Indeed, RQDs are fundamental to correcting biases and minimizing the propagation of uncertainties in high-resolution models, both conventional and AI. Researchers consistently find that the choice and quality of reference datasets are critical in evaluating, bias-correcting, and interpreting climate and weather model outputs [Gampe et al., 2019; Gibson et al., 2019; Jahn et al., 2025; Gómez-Navarro et al., 2012; Tarek et al., 2021]. If the reference data used are of lower quality, greater uncertainty can be introduced into projections of precipitation and temperature, for example, especially with respect to extreme conditions and downstream impacts such as streamflows or disease risk. This potential underscores the importance of RQDs for climate and weather modeling.
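One common form such bias correction takes—though not one attributed to any specific study cited here—is empirical quantile mapping against a reference record. The sketch below uses synthetic data to show the idea:

```python
# Illustrative quantile mapping: adjust a biased model value so its
# distribution matches a reference-quality record. Data are synthetic;
# real workflows use far longer records and dedicated libraries.
import bisect
import random

random.seed(1)
reference = sorted(random.gauss(15.0, 3.0) for _ in range(1000))  # "RQD" temps
model = sorted(random.gauss(17.0, 4.0) for _ in range(1000))      # biased model

def quantile_map(x, model_sorted, ref_sorted):
    # Find x's quantile in the model distribution, then read off the value
    # at the same quantile in the reference distribution.
    q = bisect.bisect_left(model_sorted, x) / (len(model_sorted) - 1)
    q = min(max(q, 0.0), 1.0)
    return ref_sorted[round(q * (len(ref_sorted) - 1))]

raw = 21.0                                  # a biased model value
corrected = quantile_map(raw, model, reference)
print(f"{raw} -> {corrected:.1f}")
```

The corrected value inherits the reference distribution’s statistics, which is exactly why the quality of the reference dataset bounds the quality of the correction.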

Each community has its own requirements for RQDs. To develop and implement statistical risk models to assess local vulnerability to environmental hazards, the finance and insurance sectors prioritize high spatial and temporal resolution, data completeness, adequate metadata to dissect specific events, certification that data are from a trusted source, open-source accessibility, and effective user data formats. These sectors directly or indirectly (i.e., downstream datasets) rely on many federally supported datasets. Examples include NOAA’s Storm Events Database, Billion-Dollar Weather and Climate Disasters dataset, and Global Historical Climatology Network hourly dataset; NASA’s family of sea surface altimetry RQDs and its Soil Moisture Active Passive and Gravity Recovery and Climate Experiment terrestrial water storage datasets; and the interagency Monitoring Trends in Burn Severity dataset, which tracks burned wildfire areas.

Meanwhile, the engineering design community requires regularly updated reference data that can help distinguish truly extreme from implausible outlier conditions. This community uses scores of federally supported RQDs to establish safety and design standards, including NOAA’s Atlas 14 and Atlas 15 precipitation frequency datasets, U.S. Geological Survey’s (USGS) National Earthquake Hazards Reduction Program dataset, and NASA’s sea level data and tools (which are instrumental in applications related to ocean transport and ocean structures).

Because RQDs are a cornerstone for assessing environmental hazards across virtually all sectors of society, their loss or degradation is an Achilles heel for reliably predicting and projecting all manner of environmental hazards.

Linking Reference Observing and Data Systems

U.S. agencies have long recognized the importance of reference observing systems and the RQDs they supply. Since the early 2000s, for example, NOAA’s U.S. Climate Reference Network (USCRN) has operated a network of highly accurate stations (now numbering 137) across the country that measure a variety of meteorological variables and soil conditions (Figure 2) [Diamond et al., 2013]. The USCRN plans redundancy into its system, such as triplicate measurements of the same quantity to detect and correct sensor biases, allowing data users to trust the numbers they see.

Fig. 2. A typical U.S. Climate Reference Network station includes instruments to collect a variety of data on environmental variables such as air temperature, precipitation, wind, soil moisture and temperature, humidity, and solar radiation. Credit: NOAA

The World Meteorological Organization has helped to coordinate similar networks with reference quality standards internationally. One such network is the GCOS Reference Upper-Air Network, which tracks climate variables through the troposphere and stratosphere (and to which NOAA contributes). The resulting RQDs from this network are used to calibrate and bias-correct data from other (e.g., satellite) observing systems.

Only the federal government carries a statutory, sovereign, and enduring mandate to provide universally accessible environmental data as a public good.

In the absence of such reference quality observing systems, RQDs must be derived by expert teams using novel data analyses, special field-observing experiments, statistical methods, and physical models. Recognizing their importance, Thorne et al. [2018] developed frameworks for new reference observing networks. Expert teams have been assembled in the past to develop RQDs from observing systems that are of less than reference quality [Hausfather et al., 2016]. However, these teams require years of sustained work and funding, and only the federal government carries a statutory, sovereign, and enduring mandate to provide universally accessible environmental data as a public good; other sectors contribute valuable but nonmandated and nonsovereign efforts.

Datasets at Risk

Recent abrupt actions to reduce support for RQDs are out of step with the long-standing recognition of these datasets’ value and of the substantial efforts required to develop them.

Federal funding and staffing to maintain RQDs are being cut through reduced budgets, agency reorganizations, and reductions in force. The president’s proposed fiscal year 2026 budget would, for example, cut NOAA’s budget by more than 25% and abolish NOAA’s Office of Oceanic and Atmospheric Research, although the newest appropriations package diminishes cuts to science. The National Science Foundation–supported National Center for Atmospheric Research (NCAR), which archives field experiment datasets and community model outputs, is at risk of being dismantled.

Major cuts have also been proposed to NASA’s Earth Sciences Division, as well as to Earth sciences programs in the National Science Foundation, Department of Energy (DOE), Department of the Interior, and elsewhere. Changes enacted so far have already affected some long-running datasets that are no longer being processed and are increasingly at risk of disappearing entirely.

The degradation of RQDs that we’re now seeing comes at a time of growing risk from climate and weather hazards. In the past decade alone, the United States has faced over $1.4 trillion in damages from climate-related disasters—and over $2.9 trillion since 1980. Inflation-adjusted per-person costs of U.S. disasters have jumped nearly tenfold since the 1980s and now amount to nearly $500 annually for each resident (Figure 3). The flooding disasters from Hurricane Helene in September 2024 and in central Texas in July 2025 offer recent reminders of both the risks from environmental hazards and the continued need to predict, project, and prepare for future events.

Fig. 3. The average inflation-adjusted cost per person in the United States from billion-dollar disasters—indicated here in pentad years—rose from about $50 in 1980 to roughly $450 as of 2020. Costs are derived using the National Centers for Environmental Information’s Billion-Dollar Weather and Climate Disasters reference quality dataset, which is no longer being updated.

Threatened datasets include many RQDs whose benefits are compounded because they are used in building other downstream RQDs. Examples include USGS’s National Land Cover Database, which is instrumental to downstream RQDs like Federal Emergency Management Agency flood maps, U.S. Department of Agriculture (USDA) crop models, and EPA land use products. Another example is USDA’s National Agriculture Imagery Program, which delivers high-resolution aerial imagery during the growing season and supports floodplain mapping, wetland delineation, and transportation infrastructure planning.

Many other federally supported projects that produce derivative and downstream RQDs are at risk, primarily through reductions in calibration, reprocessing, observing-network density, and expert stewardship and, in some cases, through abrupt termination of observations. Earth system examples include NOAA’s bathymetry and blended coastal relief products (e.g., National Bathymetric Source, BlueTopo, and Coastal Relief Models), USGS’s 3DEP Digital Elevation Model, and the jointly supported EarthScope Consortium geodetic products.

Several global satellite-derived RQDs face end-of-life and longer-term degradation issues, such as those related to NASA’s algorithm development and testing for the Global Precipitation Climatology Project, the National Snow and Ice Data Center’s sea ice concentration and extent data, and the family of MODIS (Moderate Resolution Imaging Spectroradiometer) RQDs. In addition, USGS’s streamflow records and NOAA’s World Ocean Atlas are at-risk foundational RQDs whose downstream products span sectors including engineering, hazards management, energy, insurance, defense, and ecosystem services.

More Than a Science Issue

The degradation of weather, climate, environmental, and Earth system RQDs propagates risk well beyond the agencies that produce them and isn’t a problem of just science and technology, because the products they power don’t serve just scientists.

Apart from fueling modeling of climate and weather risks and opportunities, they underpin earthquake and landslide vulnerability maps, energy grid management, safe infrastructure design, compound risk mitigation and adaptation strategies, and many other applications that governments, public utilities, and various industries use to assess hazards and serve public safety.

A sustained capability to produce high-resolution, decision-ready hazard predictions and projections relies on a chain of dependencies that begins with RQDs.

A sustained capability to produce high-resolution, decision-ready hazard predictions and projections relies on a chain of dependencies that begins with RQDs. If high-quality reference data vanish or aren’t updated, every subsequent link in that chain is adversely affected: Downstream products become harder to calibrate, and the information they provide is less certain.

RQDs are often used in ways that are not immediately transparent. A case in point involves weather model reanalyses, such as ERA5 (ECMWF Reanalysis v5) and MERRA-2 (Modern-Era Retrospective Analysis for Research and Applications, Version 2), which are increasingly used in many weather and climate hazards products. A critical step in updating these reanalyses is replacing the real-time operational data they assimilate with data from up-to-date RQDs wherever possible. Real-time operational data are rarely screened effectively for absolute calibration errors and subtle but important systematic biases, so this step helps ensure the model simulations are free of time- and space-dependent biases. Using outputs from reanalysis models not validated or powered by RQDs can thus be problematic: Biases can propagate into hazard predictions, projections, and assessments, increasing uncertainty and undermining the ability to validate extremes.

A Vital Investment

With rapid advances in new observing system technologies and a diverse and ever-changing mix of observing methods, demand is growing for scientific expertise to blend old and new data seamlessly. The needed expertise involves specialized knowledge of how to process the data, integrate new observing system technologies, and more.

The costs for maintaining and updating RQDs are far less than recovering from a single billion-dollar disaster.

Creating RQDs isn’t easy, and sustained support is necessary. This support isn’t just a scientific priority—it’s also a vital national investment. Whereas the costs of restoring lost or hibernated datasets and rebuilding expert teams—if those tasks would even be possible—would be enormous, the costs for maintaining and updating RQDs are far less than recovering from a single billion-dollar disaster.

Heeding recurring recommendations to continue collecting precise and uninterrupted observations of the global climate system—as well as to continue research, development, and updates necessary to produce RQDs—in federal budgets for fiscal year 2026 and beyond thus seems the most sensible approach. If this doesn’t happen, then the United States will need to transition to relying on the interest, capacities, and capabilities of various other organizations both domestic and international to sustain the research, development, and operations required to produce RQDs and make them available.

Given the vast extent of observing system infrastructures, the expertise required to produce RQDs from numerous observing systems, and the long-term stability needed to sustain them, such a transition could be extremely challenging and largely inadequate for many users. Thus, by abandoning federally supported RQDs, we risk being penny-wise and climate foolish.


Author Information

Thomas R. Karl (Karl51tom@gmail.com), Climate and Weather LLC, Mills River, N.C.; Stephen C. Diggs, University of California Office of the President, Oakland; Franklin Nutter, Reinsurance Association of America, Washington, D.C.; Kevin Reed, New York Climate Exchange, New York; also at Stony Brook University, Stony Brook, N.Y.; and Terence Thompson, S&P Global, New York

Citation: Karl, T. R., S. C. Diggs, F. Nutter, K. Reed, and T. Thompson (2026), The looming data loss that threatens public safety and prosperity, Eos, 107, https://doi.org/10.1029/2026EO260021. Published on 9 January 2026. Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Plan to End NEPA’s “Regulatory Reign of Terror” Is Finalized

Thu, 01/08/2026 - 18:37
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

The Trump administration has finalized a plan to roll back regulations outlined by one of the nation’s bedrock environmental laws.

Signed into law in 1970, the National Environmental Policy Act (NEPA) requires federal agencies to assess how proposed major projects—such as the purchase of parklands, the establishment of military complexes, or the construction of buildings and highways—will impact the environment.

NEPA opponents, which include both Republicans and Democrats, claim the processes outlined in the legislation unnecessarily delay approvals for infrastructure and energy projects. Last February, the Council on Environmental Quality (CEQ) published an interim final rule removing NEPA regulations. The new action adopts the rule as final.

 

“In this Administration, NEPA’s regulatory reign of terror has ended,” said CEQ Chairman Katherine Scarlett in a statement. “Thanks to President Trump’s leadership, CEQ acted early to slash needless layering of bureaucratic burden and restore common sense to the environmental review and permitting process.”

In response to the interim final rule, the CEQ received more than 108,000 public comments, according to a document outlining the rule published today in the Federal Register. One such comment, a letter from a coalition of environmental groups, expressed strong opposition to the rule last March.

NEPA “promotes sound and environmentally-informed decisionmaking by federal agencies, and it provides the primary way for the public to learn about and provide input regarding the impacts of federal actions on their lives,” the letter read. “The only certainty provided by the Interim Final Rule is less government transparency, more project delay, more litigation, less resilient infrastructure, and poor environmental and health outcomes for communities.”

—Emily Gardner (@emfurd.bsky.social), Associate Editor

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2026. AGU. CC BY-NC-ND 3.0

Trump Pulls United States Out of International Climate Efforts “Contrary” to National Interests

Thu, 01/08/2026 - 16:11

In an executive order issued on 7 January, the White House ordered the country’s withdrawal from 66 international agreements determined to be “contrary to the interests of the United States,” including two global efforts to combat climate change: the United Nations Framework Convention on Climate Change (UNFCCC) and the Intergovernmental Panel on Climate Change (IPCC).

The UNFCCC is a 1992 treaty that sets the legal framework for international cooperation to limit climate change. The IPCC is the United Nations organization that assesses and communicates climate science to global governments. 

The order will make the United States the only country in the world that does not participate in the UNFCCC.

 “As the only country in the world not a part of the UNFCCC treaty, the Trump administration is throwing away decades of U.S. climate change leadership and global collaboration.” 

“This is a shortsighted, embarrassing, and foolish decision,” Gina McCarthy, former EPA administrator under President Barack Obama, told E&E News. “As the only country in the world not a part of the UNFCCC treaty, the Trump administration is throwing away decades of U.S. climate change leadership and global collaboration.” 

McCarthy added that the U.S. withdrawal would limit the country’s ability to influence important decisions that impact the global economy, especially as other countries invest heavily in clean energy.

KD Chavez, executive director of the Climate Justice Alliance, an advocacy organization, said in a statement that the withdrawal “protects polluters while abandoning all of us, our livelihoods, and Mother Earth.”

“This move undermines treaty obligations, tribal sovereignty, and the global cooperation needed to survive the climate crisis,” Chavez said.

Others say the UNFCCC is ineffective and that leaving it could open new opportunities to cooperate with other countries to combat or mitigate climate change: “The framework convention is a joke,” George David Banks, Trump’s international climate adviser during his first term, told E&E News.

The UNFCCC has been criticized in the past for the ineffectiveness of its annual “conferences of the parties,” or COPs, as well as the influence of fossil fuel lobbyists at these meetings. 

 

Because the Senate originally, and unanimously, advised President George H.W. Bush to join the UNFCCC in 1992, legal experts question whether the order to withdraw is constitutional, or whether the United States could rejoin in the future. 

The withdrawal from the IPCC also cuts the United States out of global climate science assessments. “Walking away doesn’t make the science disappear, it only leaves people across the United States, policymakers, and businesses flying in the dark at the very moment when credible climate information is most urgently needed,” Delta Merner, associate accountability campaign director for the Climate and Energy Program at the Union of Concerned Scientists, said in a statement.

On his first day in office last year, Trump pulled the United States out of the Paris Agreement, a legally binding treaty setting long-term emissions goals, for a second time—an action that one United Nations report estimated would eliminate 0.1°C (0.18°F) of global progress on climate change by 2100. Withdrawing from the IPCC and UNFCCC leaves the United States further isolated from international cooperative efforts to limit climate change.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Text © 2026. AGU. CC BY-NC-ND 3.0

Successful Liquid Lake Conditions in a Cold Martian Paleoclimate

Thu, 01/08/2026 - 15:20
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Understanding the paleoclimate of Mars is essential for gaining insights into Mars’ early history and atmospheric conditions. Such information is key to learning why Mars shifted from a potentially warm, wet planet to the cold, dry desert we see now and whether that climate change was gradual or catastrophic, thus informing how terrestrial planets evolve over billions of years.

Moreland et al. [2025] use an adapted lake energy balance model to investigate the connections between Martian geology and climate. By combining climate input from the Mars Weather Research & Forecasting general circulation model with geologic constraints from Curiosity rover observations, the study helps resolve the historic disconnect between modeling results that suggest a cold climate and geologic evidence that Martian lakes retained liquid water. By concluding that relatively small lakes with limited water input and seasonal ice cover could retain seasonal liquid water for extended periods under Mars’ paleoclimate, the authors provide groundbreaking findings that inform climate models and enhance our understanding of conditions on early Mars.

Citation: Moreland, E. L., Dee, S. G., Jiang, Y., Bischof, G., Mischna, M. A., Hartigan, N., et al. (2026). Seasonal ice cover could allow liquid lakes to persist in a cold Mars paleoclimate. AGU Advances, 7, e2025AV001891. https://doi.org/10.1029/2025AV001891

—Alberto Montanari, Editor-in-Chief, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0

The Northern Sargasso Sea Has Lost Much of Its Namesake Algae

Thu, 01/08/2026 - 14:37

Sargassum has a bad reputation for washing up on shorelines, rotting on the beach, and creating a stinky mess. But this marine algae also functions as a habitat for many marine species, and new research published in Nature Geoscience indicates that its biomass has significantly declined where it once flourished: Since 2015, the amount of Sargassum in the northern Sargasso Sea has decreased by more than 90%. That change is likely caused by a reduced supply of healthy algae from the Gulf of Mexico, where water temperatures are rising, the researchers suggest.

“This is the only sea on Earth that has no physical boundaries.”

The floating brown algae known as Sargassum is found throughout the Atlantic Ocean, the Caribbean Sea, and the Gulf of Mexico. (Other species exist in the Pacific.) A region of the subtropical North Atlantic Ocean is even named in its honor: the Sargasso Sea. Rafts of Sargassum measuring tens of meters wide and several kilometers long frequently form in the Sargasso Sea, and marine life ranging from crabs to shrimp to sea turtles takes refuge in the nooks and crannies afforded by its leaves and air-filled bladders.

The Sargasso Sea is a geographical anomaly when it comes to bodies of water—it’s bounded by ocean currents, not land. “This is the only sea on Earth that has no physical boundaries,” said Chuanmin Hu, an optical oceanographer at the University of South Florida in Tampa and the senior author of the new study.

Spotting Algae from Space

To better understand how Sargassum populations have shifted over time in the Sargasso Sea and beyond, Hu and his colleagues mined archival satellite data. The team focused on observations made from 2000 to 2023 with the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument, which collects data in the near- and midinfrared ranges of the electromagnetic spectrum. That spectral coverage is important because Sargassum, like all other vegetation, strongly reflects near-infrared light; ocean water, on the other hand, does not.

“Sargassum has a different signal than the background ocean water,” said Hu.

The team, co-led by Yingjun Zhang, Brian Barnes, and Deborah Goodwin, exploited that telltale sign to estimate the amount of algae present in various swaths of water. The researchers focused on six geographic regions that cumulatively spanned more than 40° in latitude and 90° in longitude. The team was able to detect Sargassum where the fractional areal coverage of the algae was as low as 1 part in 500. There can be up to about 5 times that much Sargassum in a pixel, said Barnes, a satellite oceanographer at the University of South Florida in St. Petersburg.

The Northern Sargasso Sea, with Less Sargassum

The researchers found that Sargassum populations in the northern part of the Sargasso Sea have decreased dramatically since 2015—the satellite data revealed a roughly twelvefold drop in average biomass between 2000–2014 datasets and 2015–2023 datasets. (Measurements from the team’s shipboard surveys showed that Sargassum density declined by only about 50% over the same time period, but the team noted that those in situ data are sparse and potentially suffer from sampling bias.) If the satellite data are reflecting reality—and it’s likely that they are—that’s a substantial decrease in Sargassum, said Barnes. “There’s so much less now.”

At the same time, there’s been a proliferation of Sargassum in the so-called Great Atlantic Sargassum Belt. This 9,000-kilometer-long swath of the ocean stretching from western Africa to the Gulf of Mexico saw an uptick in Sargassum beginning in 2011 that hasn’t abated. But it’s not as though the Great Atlantic Sargassum Belt is simply robbing the northern Sargasso Sea of its algae. Though the belt plays a role in the northern sea’s Sargassum decline, the largest changes are likely caused by shifting conditions in the Gulf of Mexico, the team surmised.

The agent that facilitates all of these connections? That’s ocean currents, said Zhang, an oceanographer at the Scripps Institution of Oceanography at the University of California, San Diego. The Sargasso Sea and the Gulf of Mexico may be thousands of kilometers apart, but they’re nonetheless linked by waters on the move.

Algae on a Journey

Satellite data have shown that the Gulf of Mexico is one of the key sources of Sargassum that ultimately ends up in the northern Sargasso Sea. The algae makes a journey that lasts several months: From the Gulf of Mexico, Sargassum hitches a ride on ocean currents—namely, the Loop Current and the Florida Current—before getting swept up in the Gulf Stream. It then makes its way along the East Coast of the United States before finally reaching the northern Sargasso Sea.

But sea surface temperatures have been rising in the Gulf of Mexico in recent years, often reaching more than 30°C in the summertime. Sargassum prefers temperatures ranging from 23°C to 28°C, and heat-stressed algae are less likely to survive the monthslong journey to the northern Sargasso Sea, said Hu. “During the long-distance transport, most of it will die.”

“You have a one-two punch.”

That makes sense, said William Hernandez, an oceanographer at the University of Puerto Rico–Mayaguez who was not involved in the research. Sargassum stressed by high temperature is less likely to take up nutrients and grow adequately, he said. “It’s the same thing that you see in terrestrial vegetation.”

In addition to heat stress, Sargassum in the Gulf of Mexico is also likely suffering from a lack of nutrients. That’s because the plentiful Sargassum in the Great Atlantic Sargassum Belt is gobbling up necessary compounds like phosphorus and sulfates, said Hernandez. So when currents off the coast of South America and in the Caribbean sweep water into the Gulf of Mexico, they’re transporting something that’s essentially already been picked over, he said. “By the time those waters reach that area, they’ve already been depleted of their nutrients.”

The combined effects of heat stress and limited nutrients really wallop Sargassum populations, said Hernandez. “You have a one-two punch.” There might well be ecological repercussions to having less Sargassum in the northern Sargasso Sea, the team suggests. Fish and other creatures rely on Sargassum for habitat, so less algae could translate into measurable impacts on other animals. Collecting in situ animal data in the Sargasso Sea will help answer that question, said Hu. “There should be impacts on other animals. Is that the case?”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2026), The northern Sargasso Sea has lost much of its namesake algae, Eos, 107, https://doi.org/10.1029/2026EO260014. Published on 8 January 2026. Text © 2026. The authors. CC BY-NC-ND 3.0
