Eos

Science News by AGU

Pollution Is Rampant. We Might As Well Make Use of It.

Fri, 01/30/2026 - 14:21

When representatives of 197 countries ratified the Montreal Protocol to phase out ozone-depleting substances in 1987, they probably didn’t anticipate creating a new method for estimating the age of groundwater.

But the Montreal Protocol paved the way for a chemical called trifluoroacetic acid, or TFA, to become widespread in the atmosphere, and therefore in rainwater. Because the concentration of TFA has increased steadily since 1987, it’s a helpful tool for gaining a rough idea of how recently an aquifer has been recharged—which is what is meant by “groundwater age.”
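
In sketch form, the dating method amounts to reading a groundwater sample’s TFA concentration against the historical rainwater record. The toy Python below illustrates the idea; the calibration numbers are hypothetical placeholders invented for illustration, not measured values, and real applications must correct for local factors such as nearby agriculture:

```python
import numpy as np

# Hypothetical rainwater TFA calibration curve (placeholder values, NOT data):
# TFA in rain has risen roughly steadily since the Montreal Protocol era, so
# higher TFA in a groundwater sample implies more recent recharge.
years = np.array([1987, 1995, 2005, 2015, 2025])
rain_tfa = np.array([5.0, 40.0, 120.0, 250.0, 400.0])  # ng/L, illustrative only

def estimate_recharge_year(sample_tfa_ng_per_l: float) -> float:
    """Interpolate the year whose rainwater TFA matches the sample's level."""
    return float(np.interp(sample_tfa_ng_per_l, rain_tfa, years))

print(estimate_recharge_year(180.0))  # -> roughly 2010 under these toy numbers
```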

Using TFA as a quick and easy tracer is one of several research techniques that rely on the mass amounts of anthropogenic material that enter the environment every moment of every day. Scientists are using pollution to study processes both small-scale and worldwide, from the history of a single bird’s nest to the history of humans on this planet.

Novel Tracers

TFA is one of thousands of per- and polyfluoroalkyl substances (PFAS), which are also known as “forever chemicals” because they take thousands of years to degrade. Fortunately, TFA seems to be much less toxic than the long-chain PFAS, such as perfluorooctanesulfonic acid (PFOS) and perfluorooctanoic acid (PFOA), that have been associated with human health problems.

TFA’s omnipresence is a side effect of the move away from ozone-depleting chlorofluorocarbons (CFCs) as refrigerants. The alternative refrigerants, originally thought to be less harmful than CFCs, have consequences of their own, however, making this a case of what scientists have called “a regrettable substitute.”

Cyclists ride in front of a bus on a rainy evening in Copenhagen, Denmark, where scientists have used the concentration of trifluoroacetic acid (TFA) in rain to estimate the age of groundwater. Credit: Kristoffer Trolle/Wikimedia Commons, CC BY 2.0

When modern refrigerants evaporate into the atmosphere, they break down into TFA, which then falls to the ground in the rain, explained environmental geochemist Christian Nyrop Albers from the Geological Survey of Denmark and Greenland.

Groundwater becomes drinking water, so part of Albers’s job is to screen groundwater for pollutants. But to convince politicians they need to regulate a pollutant, he and his colleagues need to show that the substance is entering groundwater because of how it’s used today, not in decades past. So they need to know how old the groundwater is.

“There are many sophisticated methods for that, but they are not always very easy to use, or they are very expensive or time-consuming,” Albers said. The gold standard is to measure the decay of a substance called tritium into helium, but only a few labs in the world have the capacity to do the test, and the water sample must be stored for 6 months to see the decay.

Measuring TFA is not as precise as measuring tritium decay, and those using the technique have to be cognizant of any farms in the area, because agricultural chemicals can also release TFA into the groundwater and affect results. But measuring TFA is fast and easy, so “we use it on a regular basis now,” Albers said. He and his colleagues recently published the method, and a research group in Germany has begun using it, too.

In general, PFAS in the environment are the “subject of huge amounts of discussion,” said environmental radiochemist Andy Cundy from the University of Southampton, who was not involved in developing the method. “As the measurement of PFAS becomes more routine, I think we will see more and more people using PFAS as tracers,” he added.

Plastic Cuts Both Ways

Among the nests Auke-Florian Hiemstra analyzed was a common coot’s nest containing plastic dating back to the 1990s. Credit: Auke-Florian Hiemstra

More than 460 million metric tons of plastic are produced each year, with that number growing all the time. When it’s used as food packaging, plastic often comes with an expiration date stamped on it. Auke-Florian Hiemstra of the Naturalis Biodiversity Center in Leiden, Netherlands, is a nidologist, or a scientist who studies birds’ nests. He used those expiration dates to trace the history of birds’ nests found along the canals in Amsterdam. In the past, carbon-14 dating has been applied to some very old nests, but using plastic proved to be a far easier process.

Scientists used trash to date the construction of birds’ nests in Amsterdam. Credit: Auke-Florian Hiemstra

“This one bird nest that we found turned out to be like a history book,” Hiemstra said. The trash within it ranged from face masks from the COVID-19 pandemic to a candy bar wrapper advertising the 1994 FIFA World Cup. Of course, a piece of plastic’s expiration date doesn’t correspond exactly to the date when a bird incorporated it into its nest, but finding several pieces from the same time frame is suggestive. To increase confidence in the method, the researchers integrated their findings with the archives of Google Street View, which showed the presence of the nest at various points in time.

But even as plastic opens opportunities to estimate the ages of some natural materials, it may make it harder to tell the ages of others. That’s because plastic is derived from long-dead plants and animals that have negligible amounts of the carbon-14 isotope that’s used for carbon dating. Plastic carbon may dilute natural carbon and make materials appear older than they are.

This could be problematic for the study of ocean processes. One way of measuring how long it’s been since water was at the surface relies on carbon-14 dating. If 1% of the carbon in a sample of water is from microplastics—a conservative estimate given that up to 5% of ocean carbon is from plastic in some samples—then that would make the sample appear 64 years older than it actually is, calculated environmental oceanographer Shiye Zhao from the Japan Agency for Marine-Earth Science and Technology.
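
The dilution arithmetic behind that estimate can be sketched generically (this is not necessarily Zhao’s exact calculation). Mixing a fraction \(f\) of radiocarbon-dead carbon, such as fossil-derived plastic, into a sample multiplies its carbon-14 ratio by \(1 - f\), shifting the conventional radiocarbon age by

\[ \Delta t = -8033 \ln(1 - f) \approx 8033 f \ \text{years for small } f, \]

where 8,033 years is the conventional (Libby) mean life of carbon-14. For \(f = 0.01\), this crude relation gives an offset of several tens of years, on the order of Zhao’s figure; the exact value depends on which carbon pool is diluted and on the measurement conventions used.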

Ocean circulation proceeds over thousands of years, so adding 64 years doesn’t change the overall picture by very much. But the amount of plastic is always increasing, so “think about this in a future scenario,” Zhao said. Especially in plastic hot spots, the material could obscure the study of ocean circulation substantially.

“That could be an issue as more microplastics enter the ocean,” said Cundy.

The Anthropocene

Anthropogenic pollution can help scientists understand how nature is responding to other aspects of human influence.

We’re living in a period that’s colloquially called the Anthropocene because markers of human activity are obvious in environmental records worldwide. Although no formal date has been agreed upon, scientists have proposed a range of dates for when the Anthropocene began. One definition suggests that the period began in the mid-20th century and is marked by many human-made substances, such as plastic, that are evident in geological strata, including ice and sediment cores. But one of the most ubiquitous and reliable candidate markers for the start of the Anthropocene is plutonium-239. Atomic bomb tests conducted in the 1940s and 1950s were the main sources of plutonium-239, which was lofted into the atmosphere and carried around the globe, depositing a layer across Earth and “labeling the entire planet,” said Cundy.

Having a marker for when anthropogenic activities began to affect the geological record is a powerful research tool because it provides a benchmark against which scientists can measure how nature has responded since, said environmental geochemist Agnieszka Gałuszka from Jan Kochanowski University of Kielce, in Poland.

In a study of pollen in paleoecological records from across North America, for example, scientists looked at how the diversity of plant species has changed since the mid-20th century and compared that with previous time periods. They found that rates of species appearing and disappearing have been higher than at any other time since the end of the last ice age, about 13,000 years ago. That’s probably because of land use changes, as well as the introduction of pests and invasive species to the continent, all driven by humans.

Likewise, in a study of peatlands in the Izery Mountains of Europe, researchers investigated how coal burning has affected microorganisms since the mid-1960s. By analyzing microbial communities, scientists discovered that amoebae picked up titanium, aluminum, and chromium from inorganic coal residue and incorporated these elements into their shells. “It was quite shocking news to all of us,” Gałuszka said.

Identifying pollutants as markers of the plausible start of the Anthropocene has led scientists to ask, “What has been the change over time?” said Cundy. “And, importantly, what have been the causes of that change over time? Is it human induced, or is it natural?”

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2026), Pollution is rampant. We might as well make use of it., Eos, 107, https://doi.org/10.1029/2026EO260039. Published on 30 January 2026. Text © 2026. The authors. CC BY-NC-ND 3.0

Our Ocean’s “Natural Antacids” Act Faster Than We Thought

Fri, 01/30/2026 - 14:21
Source: AGU Advances

Earth’s ocean absorbs carbon dioxide from the atmosphere, helping to temper the impact of climate change but increasing ocean acidity. However, calcium carbonate minerals found in the seabed act as a natural antacid: Higher acidity causes calcium carbonate to dissolve, generating carbonate ions that can neutralize the acid.
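
The “antacid” effect follows from the textbook carbonate dissolution reaction,

\[ \mathrm{CaCO_3 + CO_2 + H_2O \longrightarrow Ca^{2+} + 2\,HCO_3^-}, \]

in which dissolving carbonate consumes dissolved carbon dioxide (carbonic acid) and raises the water’s alkalinity.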

For many years, researchers have thought of this carbonate dissolution buffer mainly as a very slow process because most ocean carbonates lie in deep-ocean sediments. There, the effects of their dissolution won’t reach the atmosphere for hundreds or thousands of years—long after many effects of acidification are already felt by ecosystems.

However, calcium carbonate also exists in more than 60% of the seabed of the shallower waters of continental shelves. New research by van de Velde et al. suggests that shelf carbonate dissolution may play a previously underappreciated climate feedback role on much faster timescales.

To explore the potential importance of shallow carbonate dissolution, the researchers analyzed high-precision ocean carbonate chemistry observations collected over 25 years in continental shelf waters off the southeastern coast of New Zealand.

They found that in the study area, calcium carbonate buffering has occurred in shallow shelf waters for at least the past 25 years and that this climate feedback process operates on annual to decadal timescales—orders of magnitude faster than in the deep ocean. Additional biogeochemical modeling suggested that this continental shelf carbonate dissolution is driven by an increase in dissolved carbon dioxide resulting from anthropogenic carbon dioxide emissions.

Similar dissolution feedback may occur in continental shelf waters around the world, in which case, shelf carbonate dissolution may have been accelerating globally since the 1800s. Furthermore, the researchers calculated that this process could account for up to 10% of the current discrepancy between state-of-the-art model predictions of ocean carbon dioxide uptake and real-world measurements.

Further research will be needed to explore the global role of shelf carbonate dissolution and how it should be incorporated into climate models. Such knowledge could have key implications for proposed efforts to combat climate change by deliberately boosting ocean alkalinity, the authors say. (AGU Advances, https://doi.org/10.1029/2025AV001865, 2025)

—Sarah Stanley, Science Writer

Citation: Stanley, S. (2026), Our ocean’s “natural antacids” act faster than we thought, Eos, 107, https://doi.org/10.1029/2026EO260013. Published on 30 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0

Alligators May Boost Carbon Storage in Coastal Wetlands

Thu, 01/29/2026 - 14:17

The vital role apex predators play in maintaining healthy ecosystems is well-documented, but research published in Scientific Reports suggests predators might also influence the global carbon cycle. The study found that across coastal wetlands in the southeastern United States, soils store more carbon where American alligators are present, linking predator recovery to enhanced carbon retention in some of the planet’s most efficient natural carbon sinks.

Wetland carbon storage (so-called “blue carbon”) is facilitated by wetlands’ waterlogged, oxygen-poor soils, which slow decomposition and allow organic material to accumulate over time. Scientists know that when wetlands are drained or degraded, stored carbon can be released into the atmosphere as carbon dioxide. Less well understood is how biological interactions within these habitats shape carbon dynamics. The new study adds to a growing body of evidence showing that animals—particularly apex predators—can influence vegetation, soils, sediment flows, and nutrient cycles at scales large enough to affect the planet’s carbon budget.

“What we found was a positive correlation between alligator abundance and carbon sequestration in specific habitats,” said Christopher Murray, an ecologist at Southeastern Louisiana University and lead author of the study. “Where we have more alligators, from small populations to much larger populations, we actually see higher carbon sequestration.”

Murray and his colleagues at Southeastern and the Louisiana Universities Marine Consortium analyzed soil carbon data from the Smithsonian’s Coastal Carbon Network. From that database, the team selected 649 continuous soil cores from tidally influenced wetlands in 13 states. They compared those carbon measurements with data on alligator presence, density, and nesting patterns assembled from state wildlife agencies and long-running monitoring programs.

Across the alligator’s native range, wetlands stored an average of 0.16 gram more carbon per square centimeter in the top 10 centimeters of soil when alligators were present. That surface layer reflects relatively recent carbon accumulation over roughly the past 6 decades. This period overlaps with the recovery of alligator populations following the Endangered Species Preservation Act of 1966.
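
For a sense of scale, that surface-soil difference converts directly (a plain unit conversion, not an additional result from the study):

\[ 0.16\ \mathrm{g\ C/cm^2} = 1.6\ \mathrm{kg\ C/m^2} \approx 16\ \mathrm{t\ C/ha} \approx 59\ \mathrm{t\ CO_2\text{-}eq/ha}, \]

using the 44/12 mass ratio of carbon dioxide to carbon for the final step.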

The researchers attribute the observed patterns to a combination of physical ecosystem engineering and trophic cascades, or actions by predators that reverberate through multiple layers of a food web. As apex predators, alligators may suppress herbivore populations that otherwise damage vegetation and disturb soils, potentially allowing denser plant growth and greater carbon burial. Alligators also modify wetland landscapes directly. By digging dens, carving channels, and creating small ponds, they reshape hydrology, redistribute sediments and nutrients, and create localized microhabitats where organic carbon can accumulate and persist.

Trophic Effects

At a continental scale—spanning a wide range of coastal wetland types across multiple states—the study found no statistically significant difference in carbon storage between sites with and without alligators. The authors suggest that this reflects substantial ecological variability across regions, including differences in vegetation, geomorphology, hydrology, and food web structure, which can mask the influence of any single predator species when ecosystems are analyzed collectively.

An American alligator rests on a fallen tree. Research suggests that wetlands within the alligator’s native range store more carbon in surface soils when alligators are present. Credit: Emil Siekkinen

“Originally, I was surprised by that finding,” said Murray. The team’s original hypothesis predicted higher carbon sequestration wherever alligators were present, consistent with trophic cascade theory. The absence of a clear continental-scale signal, Murray said, made it obvious to him, “later on, that there’s a different apex predator that is working in those habitats.”

When the analysis was narrowed to the alligator’s native range, thereby reducing ecological variability, the pattern became clearer. At these regional scales, wetlands with alligators consistently stored more carbon, suggesting that in ecosystems where they occupy the top trophic position, alligators may exert a detectable influence on wetland carbon dynamics.

“This study is important because it links an apex predator directly to wetland soil carbon stocks, moving beyond theory to show that food web structure can shape carbon outcomes at ecosystem scales,” marine ecologist and Blue Carbon Lab director Peter Macreadie, who was not involved in the study, wrote in an email. “It also challenges prevailing blue carbon approaches by showing that long-term carbon storage depends not only on vegetation and sediments, but on maintaining intact trophic interactions.”

Such trophic effects help explain how sea otters maintain kelp forests by controlling sea urchins and why wolves have been linked to forest regeneration through changes in large herbivore behavior. The alligator study suggests that similar processes may operate in coastal wetlands, where predator presence supports vegetation growth, soil stability, and carbon retention.

The study does not establish causation, and Murray emphasized that long-term exclusion experiments would be needed to directly test how changes in alligator populations affect carbon accumulation over time. Even so, the findings suggest that predator recovery may have consequences for the climate that are rarely considered in conservation planning. Murray said that the implications of this work extend beyond carbon accounting, however. “Apex predators like crocodilians have a critical role in the function of our world,” he said. “And they should be respected rather than feared.”

—Emil Siekkinen, Science Writer

Citation: Siekkinen, E. (2026), Alligators may boost carbon storage in coastal wetlands, Eos, 107, https://doi.org/10.1029/2026EO260038. Published on 29 January 2026. Text © 2026. The authors. CC BY-NC-ND 3.0

Insights for Making Quick Clay Landslides Less Quick

Thu, 01/29/2026 - 14:17

In countries of the far north, a particular kind of natural disaster can strike almost without warning. Quick clay landslides, in which previously solid soil suddenly liquefies, can carry away houses and farms and bury towns and roads. The slides occur when salts leach out from clay soils that were previously beneath sea level, eventually bringing the soils’ stability below a critical threshold and making them vulnerable to potential triggering events.

Striking examples in Norway include buildings sliding sideways into the sea near the northern town of Alta in 2020 and the Verdal landslide in 1893, in which 3 square kilometers of land broke loose in the central Norwegian municipality, killing 116 people and burying 105 farms. Quick clay (often called sensitive clay in North America) is also found in Alaska, as well as Canada, Finland, Russia, and Sweden, where governments often attempt to stabilize at-risk soils. Doing so can be expensive and environmentally harmful, leading researchers to seek better ways of making quick clay safe again.

In new research published in the Journal of Colloid and Interface Science, Norwegian researchers dove down to the microscopic scale to provide new insights into how different kinds of salts contribute to the mechanical strength of quick clay. The findings could reveal novel ways to make at-risk soils safe from slides, said study coauthor Astrid de Wijn, a materials scientist at the Norwegian University of Science and Technology (NTNU).

“If we can understand how these salts are doing it, maybe we can find something else that does the same thing,” she said.

For Want of Salt

The key indicator of quick clay risk is the marine limit—the line dividing soils that were previously below sea level from those that remained above it. In high-latitude countries like Norway, melting glaciers at the end of the last ice age, around 10,000 years ago, caused a process of unburdening and uplift called isostatic rebound that brought some previously submerged areas above water. The marine limit varies from place to place but can be more than 200 meters above current sea levels in the south of Norway and includes significant portions of the country.

Soils beneath the marine limit were infused with salts from the sea, which they’ve gradually lost over time from groundwater leaching. Those salt ions act as electrochemical binders between clay molecules, helping strengthen them, said Jean-Sébastien L’Heureux, a geotechnical engineer and technical expert on quick clay at the Norwegian Geotechnical Institute who was not involved with the research.

Without the salts to hold them, the microscopic particles of clay look more like a house of cards, stacked haphazardly with nothing binding them together. It is in this state that regular clay becomes quick clay, where even small perturbations like minor earthquakes or construction projects can cause devastating landslides. Previously solid soil “will behave like sort of a sour cream,” L’Heureux said. “It just pours out of the landslide crater.”

The main way to prevent such catastrophes is to stabilize the soil, a process that to date has typically involved injecting lime and cement to act as a binder. The technique is effective but environmentally unfriendly because of the large amounts of carbon dioxide (CO2) it creates. Coming up with an equally effective, more sustainable method is the goal of the Sustainable Stable Ground (SSG) project run by NTNU, which de Wijn and her coauthor, NTNU chemist Ge Li, are part of.

Using molecular dynamics simulations that re-create how clay molecules act at the nanoscale, the two researchers were able to compare how different salt cations affected the clay’s strength. The key difference was between divalent cations, such as the magnesium (Mg2+) and calcium (Ca2+) ions supplied by salts like magnesium chloride (MgCl2) and calcium chloride (CaCl2), and monovalent ones, such as the sodium (Na+) and potassium (K+) ions from sodium chloride (NaCl) and potassium chloride (KCl), Li said. Divalent cations enhance interactions between clay particles to a greater extent and stick out more, increasing friction. That means they enhance clay strength more than monovalent cations do and could offer a blueprint for future chemical stabilizers in quick clay.

In Search of Better Solutions

Finding a truly effective, affordable, and sustainable means of stabilizing quick clay will likely take some time, however. Priscilla Paniagua, a geotechnical engineer at the Norwegian Geotechnical Institute not affiliated with the paper, noted that simply adding more salt, as some projects have attempted to do, is unlikely to be effective, as current technology makes it difficult to scale. What’s more, the salt will simply leach out from the soils again, Li noted.

Some teams have proposed using materials like biochar or ash to stabilize soils, approaches that work well in the lab but have yet to be scaled up, Paniagua said. Another issue is that some proposed stabilization methods would increase only the remolded strength of quick clay, or its strength after it has liquefied and begun moving.

“It means that it won’t be quick [clay], but…you’re not increasing the full stability of the slope,” L’Heureux said. Such approaches would mitigate the impact of a quick clay landslide but wouldn’t prevent it from occurring.

Though challenges remain, Li and de Wijn remain hopeful that a better solution for quick clay is possible. Li said their modeling work is informing small-scale lab experiments testing how various materials affect soil strength. New proposals for stabilizers include polymers that enhance clay binding and even CO2 injected into the soil to help lime solidify, de Wijn said.

Today, better maps of quick clay landslide risk give local governments and developers more information about where it’s safe to build and where it isn’t. But with many soils destabilized, scientists note, the risk of landslides remains.

—Nathaniel Scharping (@nathanielscharp), Science Writer

Citation: Scharping, N. (2026), Insights for making quick clay landslides less quick, Eos, 107, https://doi.org/10.1029/2026EO260040. Published on 29 January 2026. Text © 2026. The authors. CC BY-NC-ND 3.0

As Some Soils Warm, Microbes Stockpile Essential Nutrients

Wed, 01/28/2026 - 14:07

As high-latitude soils warm, microbes in the soil change how they handle nutrients like nitrogen. Normally, these microbes are nitrogen recyclers, pulling it from the soil and turning it into inorganic forms—like ammonium and nitrates—that plants can absorb. But a new study published in Global Change Biology suggests that with rising temperatures, microbes are changing their strategy. They take up more nitrogen for themselves while reducing the amount they release back into the environment. This change alters the flow of nitrogen through the ecosystem, potentially slowing vegetation growth and affecting the rate at which our planet warms.

These findings come from experiments carried out in subarctic grasslands near Hveragerði, Iceland. In 2008, earthquakes rerouted groundwater in an area that had been warmed by geothermal gradients, creating patches of soil heated between 0.5°C and 40°C above normal temperatures. The event turned the region into a natural laboratory where researchers could study how ecosystems respond to long-term warming under natural conditions.

Earlier research in this location had already shown that in warming soils, microbes become highly active while plants are dormant. As a result, nitrogen-containing compounds released into the soil by the microbes were lost, either by leaching into groundwater or by escaping into the atmosphere as the potent greenhouse gas nitrous oxide.

An abandoned greenhouse near the experimental sites in Iceland serves as a reminder that climate change is having an especially strong effect on high-latitude soils. Credit: Sara Marañón Jiménez

In this work, scientists added nitrogen-15 to the soil, which they could track to determine how much the plants had used up and what they did with it. Researchers found that after the initial nutrient loss, microbes became more conservative in their handling of nitrogen, recycling nitrogen internally rather than absorbing more from the ground. At the same time, microbes stopped releasing ammonium, a nitrogen-rich by-product of their normal metabolism that is usable by plants—the microbial equivalent of urine, said study coauthor Sara Marañón Jiménez, a soil scientist at the Centre for Ecological Research and Forestry Applications in Spain.

Nitrogen Heist

This change in nitrogen cycling has important consequences for the whole ecosystem. On the one hand, it has a positive effect because it prevents further nitrogen loss.

“The study shows that nitrogen is not released as inorganic nitrogen, but it seems to go directly in an organic loop,” said Sara Hallin, a soil microbiologist at the Swedish University of Agricultural Sciences in Uppsala who was not involved in the study. “You could say that it’s a positive aspect, and so it’s more beneficial for the ecosystem if that nitrogen is sort of retained.”

On the other hand, microbes’ nutrient-hoarding behavior might reduce nitrogen availability for plants. “There’s a delicate feedback between plants that take nitrogen, make photosynthesis, and put carbon in the soil as organic matter and microorganisms that take this organic matter, recycle it, and release nitrogen in forms the plants can use,” Marañón Jiménez said. “If microorganisms start immobilizing nitrogen, it could lead to competition between microbes and plants.”

The team is now working on a study to determine what exactly happens to soil at the very early stage of warming, before nutrients have been lost. “This way we hope to recover the first chapters, to see what we’ve been missing,” Marañón Jiménez said.

To this end, they transplanted bits of normal soils into heated areas to study the process in detail from the very beginning. “Soils exposed to [soil] temperature increases showed the same nutrient loss after 5 years [as] after 10 years,” Marañón Jiménez said, suggesting that most of the nutrient loss occurs early on.

A Greenhouse Time Bomb

Climate models may be underestimating how the loss of nitrogen and carbon from cold soils is contributing to global warming, researchers said. Disruptions to nutrient cycling at these latitudes could represent a previously overlooked source of greenhouse gas emissions.

Arctic soils store massive amounts of carbon, built up over thousands of years from plant material that microbes cannot fully break down. This partially decomposed organic matter accumulates, forming one of the largest carbon reservoirs on Earth. As temperatures rise, scientists expect microbes to become more active, accelerating decomposition and releasing much of this stored carbon into the atmosphere as carbon dioxide.

Researchers had hoped warmer temperatures would allow plants to grow more vigorously, absorbing some of the extra carbon released by Arctic soils.

The new findings call this idea into question. “It’s a chain reaction,” Marañón Jiménez explained. “As biomass is lost from the microbial mass, that means there’s less storage capacity for carbon and nitrogen in the soil, leading to poorer soils where plants can’t grow as well, and plants cannot compensate emissions by absorbing more carbon.”

Studying these geothermally heated soils could yield confusing results, though. “It’s not really the way global warming works,” Hallin said. Global warming includes increases in air temperature, she explained, whereas the plants in the current study had only their root system in a warmer climate, not their aboveground shoot system. “That could potentially cause some effects [the researchers] are not accounting for,” she said.

Finally, the authors of the new study also warn that not all soils have the same response to warming. The Icelandic soils in this study are volcanic and rich in minerals, unlike the organic peat soils that dominate many Arctic regions. Deep peatlands in Scandinavia and northern Russia store vast amounts of carbon and may behave differently, highlighting the need for similar long-term studies across a wider range of Arctic landscapes.

—Javier Barbuzano (@javibar.bsky.social), Science Writer

Citation: Barbuzano, J. (2026), As some soils warm, microbes stockpile essential nutrients, Eos, 107, https://doi.org/10.1029/2026EO260043. Published on 28 January 2026. Text © 2026. The authors. CC BY-NC-ND 3.0

Which Countries Are Paying the Highest Price for Particulate Air Pollution?

Wed, 01/28/2026 - 14:06
Source: GeoHealth

Polluted air causes an estimated 7 million deaths worldwide each year, according to the World Health Organization. Much of the mortality comes from PM2.5, particulate pollution smaller than 2.5 micrometers in diameter that can enter the lungs and bloodstream and cause respiratory and cardiovascular problems. In addition to particles emitted directly into the atmosphere, ammonia (NH3), nitrogen oxides (NOx), and sulfur dioxide (SO2), which are emitted by factories, ships, cars, and power plants, are all precursors that can contribute to the formation of PM2.5. The effects of particulate pollution are not evenly distributed, however.

Oztaner et al. model the consequences of air pollution across the Northern Hemisphere by region, offering a more granular look at where targeted mitigation policies could be the most beneficial. Using the multiphase adjoint model of EPA’s Community Multiscale Air Quality (CMAQ) modeling platform, the authors assessed the benefits of mitigating various pollutants from the perspective of both lives and money saved. Monetary values of air pollution impacts were calculated using a well-established method used by international agencies, although the method introduces ethical concerns because it assigns values to lives partly based on different countries’ per capita gross domestic products (GDP).

Overall, they found that a 10% reduction in all modeled emissions could save 513,700 lives and $1.2 trillion each year in the Northern Hemisphere.

The largest mortality reductions came from China and India, where cutting emissions would save 184,000 and 124,000 lives, respectively, each year. The largest cost savings were found in China, followed by Europe and North America. Health benefits also varied by type of emissions and sector. NH3 causes more issues in China, whereas NOx is relatively more harmful in Europe than in other places. Across the Northern Hemisphere, the agricultural sector contributes most to particulate and precursor pollution, with a 10% reduction in agriculture-related emissions projected to save 95,000 lives and an estimated $290 billion. It is followed by the residential and industrial sectors.

The authors note that caution is warranted when comparing results across similar studies, in part because the link between pollutant concentrations and health outcomes is not always linear and in part because different regions may have different methodologies when accounting for emissions by sector. Also, their study focuses only on PM2.5-related mortality and does not consider other pollutants, such as ozone. Overall, they suggest their work offers a meaningful reference for comparing the effects of different pollutant mitigation strategies in the Northern Hemisphere. (GeoHealth, https://doi.org/10.1029/2025GH001533, 2026)

—Nathaniel Scharping (@nathanielscharp), Science Writer

Citation: Scharping, N. (2026), Which countries are paying the highest price for particulate air pollution?, Eos, 107, https://doi.org/10.1029/2026EO260026. Published on 28 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0

Wildfire Smoke Linked to 17,000 Strokes Annually in the United States

Tue, 01/27/2026 - 15:25

Smoke from wildfires may be responsible for 17,000 strokes each year in the United States, new research suggests.

The study, published in European Heart Journal, examined various sources of particulate matter smaller than 2.5 micrometers in diameter (about 30 times smaller than the width of a human hair). Also known as PM2.5, such particles are so small that they can be inhaled and enter the bloodstream, where they have been linked to an array of health effects, including decreased lung function, cardiovascular diseases, and even neurological disorders. But the new study seems to indicate that PM2.5 from wildfires is particularly harmful.

Scientists examined a cohort of about 25 million people over the age of 65 who were covered by Medicare, a federal health insurance program. Between 2007 and 2018, about 2.9 million of those people experienced a stroke. The researchers calculated the average amount of wildfire smoke, as well as nonsmoke PM2.5, that each study participant was exposed to over the course of each year on the basis of participants’ zip codes.

After 1, 2, or 3 years of exposure to nonsmoke PM2.5, the participants’ risk of stroke didn’t change much.

“But for smoke, this picture is very different,” said Yang Liu, a health and environmental scientist at Emory University and corresponding author of the paper. “It’s like you are seeing some kind of a dose-response effect: The longer you’re exposed to smoke, the greater your stroke risk.”

More specifically, the study found that an increase of 1 microgram per cubic meter in the average concentration of wildfire smoke was associated with a 1.3% increase in stroke risk. Researchers found that Medicaid-eligible individuals (those with limited income and resources) were especially vulnerable to the effects of wildfire smoke.
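
To make that scaling concrete (an illustrative extrapolation that assumes the reported association compounds log-linearly, which the study itself does not claim), the relative risk for a sustained increase of \(\Delta c\) micrograms per cubic meter would follow

\[ \mathrm{RR}(\Delta c) = 1.013^{\Delta c}, \]

so an increase of 5 micrograms per cubic meter in average smoke concentration would correspond to roughly \(1.013^5 - 1 \approx 6.7\%\) higher stroke risk under this assumption.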

Unique Harms of Wildfire Smoke

The researchers input air quality data from several sources, including satellites, ground-based air monitors, and low-cost sensors such as PurpleAir devices, into a machine learning framework. The framework was used to estimate the daily wildfire smoke PM2.5 and nonsmoke PM2.5 concentrations across the contiguous United States at a 1-kilometer resolution. The team then used this information to calculate the average exposure rates within each zip code over 1, 2, and 3 years.

Their model and subsequent analyses of the findings were also designed to control for other factors that could affect stroke risk, including meteorology (extreme heat can increase stroke risk), access to care, Medicaid eligibility, and substance abuse disorders.

Jennifer Stowell, a geohealth scientist at the University of Maryland, said this was an “important” study.

“I really like where this paper has gone because they’ve characterized exposure slightly differently,” she said. “Rather than looking at more acute exposure, they looked at up to 3 years of exposure prior to a stroke. Also, other studies, for the most part, rely on emergency department data. So the fact that this is data in addition to that, from doctors’ offices and all sorts of things, is a big plus.”

The study did not establish the reason for the link between wildfire smoke exposure and stroke risk, but previous studies have suggested that inhaling pollutants can cause oxidative stress that affects the function of the endothelial cells (those lining the blood and lymphatic vessels) and of the cardiovascular system as a whole.

The study’s findings are also in line with previous research: A 2021 study suggested that PM2.5 from wildfires is up to 10 times more harmful than PM2.5 from other sources, such as ambient pollution.

“It all comes down to what [materials] wildfires are burning,” Stowell said. “There is a lot of organic matter, chemicals, and particles that we don’t normally see in air pollution from traffic or from industry that can be emitted during a fire. This is especially true if that fire burns any sort of man-made structures. Then, you start getting some highly toxic, synthetic emissions that we don’t normally breathe.”

Only a Small Part of the Picture

In a world where wildfires are growing both more frequent and more severe, Liu said he hopes a study like this will help guide future research, noting the importance of a large-scale epidemiological study to complement lab-based research.

“Policymakers can look at the disease burden numbers and say, ‘Wow, it may be worthwhile to spend more money on firefighting, or forest management, because it’s a huge disease burden.’”

Liu said he wasn’t at all surprised by his team’s findings because stroke is only one part of the overall picture of how smoke affects overall health.

“I think its real burden is going to be much, much larger than what we show in this paper,” Liu said. In fact, he noted that the study focuses only on the fee-for-service Medicare population and doesn’t account for the more than 40% of the Medicare population enrolled in private insurance.

“So even for the overall Medicare population, or just the elderly population in the U.S., we are underreporting the burden, maybe by half,” he said.

—Emily Gardner (@emfurd.bsky.social), Associate Editor

Citation: Gardner, E. (2026), Wildfire smoke linked to 17,000 strokes annually in the United States, Eos, 107, https://doi.org/10.1029/2026EO260042. Published on 27 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0

What Americans Lose If Their National Center for Atmospheric Research Is Dismantled

Tue, 01/27/2026 - 14:15

Americans hold few everyday expectations of science, but the ones we do hold are fundamental: We expect the weather forecast to be right, we expect science and technology that allow weather hazards to be anticipated within reason, and we expect public services to protect our lives and livelihoods from such hazards—floods, fires, tornadoes, and hurricanes.

Well, the fulfillment of those expectations is in real doubt now that the Trump administration plans to dismantle the National Science Foundation’s (NSF) National Center for Atmospheric Research (NCAR), a federally funded institution that underpins critical science that Americans rely on. Administration officials have argued that NCAR’s work can simply be redistributed to other institutions without loss. But NCAR is not just another research center. It is purpose-built critical infrastructure designed to integrate observations, modeling, supercomputing, and applied research in ways that no single university, agency, or contractor can replicate on its own.

Although Congress rejected the administration’s proposed funding cuts to NSF, the most recent spending bill did not include explicit language protecting NCAR as a unified entity.

As a result, the center remains vulnerable—not through outright defunding, but through fragmentation. The administration could try to cut interagency contracts that NCAR relies on to fund its staff, lay off staff, and relocate critical capabilities. NSF has already outlined plans to restructure NCAR, including moving its supercomputer to another site and transferring or divesting research aircraft it operates. Such moves would hollow out the institution itself, breaking apart integrated teams, disrupting continuity in projects, and weakening the unique collaborative model at NCAR that accelerates scientific progress in weather, water, climate, and space weather.

This distinction matters. NCAR’s value does not lie solely in the science it produces, but in how that science is organized, sustained, and shared across the nation.

The following are five of the many ways Americans will lose the benefits of scientific research if plans to dismantle NCAR unfold, and two ways we can work to prevent it.

1. Air Travelers Will Lose Protection

Every day, millions of Americans board airplanes expecting to arrive safely at their destinations. What most passengers never see is the science working behind the scenes to keep flights safe through better understanding of atmospheric conditions such as turbulence and microburst winds.

Turbulence alone is the leading cause of injuries on U.S. commercial flights and cargo operations, and NCAR research has played a central role in reducing that risk by improving how turbulence is detected, predicted, and avoided. NCAR scientists helped develop advanced forecasting techniques that allow pilots and dispatchers to reroute aircraft away from dangerous air currents before passengers are ever put at risk.

In addition to improving safety, NCAR research has reduced the roughly $100 million financial strain that severe turbulence puts on the U.S. aviation system every year through aircraft damage, inspections, medical costs, and delays.

NCAR’s contributions to aviation safety extend well beyond turbulence. In the 1970s and 1980s, NCAR scientists led research that identified and explained microbursts, a poorly understood weather phenomenon consisting of powerful downdraft winds produced by thunderstorms. Microbursts had caused multiple fatal airline crashes during takeoff and landing, and NCAR findings convinced the Federal Aviation Administration (FAA) and international aviation authorities to develop radar warning systems to detect these threats. Since these tools were deployed, fatal U.S. airline crashes caused by microbursts have effectively been eliminated.

Dismantling NCAR and moving this work elsewhere would break the integrated system that makes aviation safety research effective in the first place. NCAR uniquely brings together long-term observational data, advanced modeling, specialized instrumentation, and direct operational partnerships with agencies like the FAA under one roof. Fragmenting that capacity across multiple institutions would disrupt decades of trusted, public service relationships with the aviation community, making it harder and slower to translate research into real-world protections for pilots and passengers. With millions of people in the sky every day, this is not a risk we should take.

2. Food Security and the U.S. Agricultural Economy Will Be Put at Risk

Agriculture contributes hundreds of billions of dollars annually to the U.S. economy, and food security remains a national priority, making NCAR’s research crucial to this weather-sensitive sector. Drought, heat waves, and floods are recurring stresses that affect what crops farmers can grow, as well as food prices for consumers.

NCAR research is directly relevant to food security. For example, NCAR scientists are working in conjunction with universities in Kansas and Nebraska and the U.S. Department of Agriculture to develop CropSmart, a next-generation system that aggregates weather forecasts, crop data, soil conditions, and other inputs into actionable, decision-ready information for farmers, agribusinesses, and agricultural officials. Early projections from CropSmart suggest that if advanced decision support systems like this were adopted on even half of irrigated farms in a state like Nebraska, farmers could save up to 1 billion cubic meters of water and $100 million in irrigation energy costs annually while also cutting about a million tons of greenhouse gas emissions per year.

If NCAR is broken up, we lose this economic opportunity and the myriad ways it supports U.S. agriculture. NCAR’s long-standing collaborations, integrated modeling and computing capacity, and role as a trusted public service institution are what allow farmers to rely on consistent, decision-ready information year after year.

All the agricultural tools housed, supported, or innovated by NCAR would be put at risk, leaving farmers with fewer early warnings, less reliable guidance, and greater exposure to weather extremes. These losses would translate into higher prices for the food on our tables, inevitably increasing food insecurity, already a significant problem in the United States.

3. U.S. National Security and Military Readiness Will Be Weakened

The U.S. military depends on weather and climate intelligence to operate safely, effectively, and strategically. From flight operations and naval deployments to training exercises and base infrastructure, weather conditions shape nearly every aspect of defense readiness. When forecasts are wrong or incomplete, missions can be delayed, equipment can be damaged, and personnel and our national defense are put at risk.

NCAR’s research and operational tools provide the environmental intelligence that defense planners, operators, and test authorities rely on to keep us safe. Accurate, NCAR-enhanced forecasts have saved the U.S. Army millions of dollars by reducing weather-related test cancellations and avoiding needless mobilization costs. NCAR weather forecasting tools have been used for defense-related purposes, including anti-terrorism support at the Olympic games, protection of the Pentagon, support for firefighters, and analysis of exposure of our military personnel to toxins.

The strategic value of this work is reflected in the breadth of defense agencies that rely on NCAR today. NCAR maintains active partnerships and contracts with the Air Force, the Army Corps of Engineers, the National Ground Intelligence Center, the Defense Threat Reduction Agency, and the Army Test and Evaluation Command. These relationships exist for a simple reason: Accurate environmental intelligence reduces risk, lowers costs, and strengthens national security.

Dismantling NCAR is a national security threat. Defense agencies rely on specialized, mission-critical environmental products and expertise that are developed, maintained, and refined through streamlined, long-standing relationships with NCAR scientists. These capabilities cannot be replaced quickly without disruption, and even short gaps in trusted weather and environmental intelligence would increase operational risk for current and future missions. Protecting NCAR is an investment in military readiness, operational efficiency, and the safety of those who serve.

4. Americans in Disaster-Prone Areas Will Have Less Time to Prepare for, and Evacuate from, Extreme Weather

Since 1980, weather hazards have cost the United States thousands of lives and more than $3.1 trillion. In 2025 alone, disasters cost nearly 300 lives and $115 billion in damages to homes and businesses. And these weather hazards are expected to worsen because of our changing climate.

A 2010 study from the National Academies of Sciences, Engineering, and Medicine found that public weather forecasts and warnings deliver roughly $31.5 billion in annual economic benefits in the United States. These gains in preparedness and economic benefit would not have been possible without sustained scientific research from NCAR.

Hurricane forecasting provides a clear example of how NCAR research has protected the safety of residents and businesses and mitigated their economic losses. Since 1980, hurricanes have caused nearly $3 trillion in damages in the United States.

For decades, NCAR scientists have worked to develop and refine instruments and methods to collect real-time hurricane observations and improve our understanding of storm behavior. By the 1980s, data and modeling advances emerging from NCAR research were being used operationally by NOAA, contributing to a roughly 20%–30% improvement in the accuracy of hurricane track forecasts compared to earlier decades.

NCAR continues to enhance forecasting capabilities for hurricanes, as well as their associated flood risks, through the center’s sophisticated flood risk model. Today, the model is used operationally by the National Weather Service in more than 3,800 locations serving 3 million people.

If NCAR’s role in advancing forecast science is weakened by dismantling it, these gains in disaster preparedness will be put in jeopardy. Forecast improvements do not happen automatically; they require sustained research, coordination, and testing. If NCAR’s research capabilities to develop and improve weather forecasting disappear, the United States will face a major public safety risk.

5. Americans Lose a Unique Source of National Pride

NCAR was never designed to serve a select few. It was built with public investment to serve the nation as a whole. From its founding, NCAR embraced the idea that understanding the Earth system—its atmosphere, oceans, land, and ice—requires collaboration across institutions, disciplines, and generations, not isolated efforts working in parallel.

That collaborative model is embedded in how NCAR operates. It is stewarded by a consortium of more than 120 colleges and universities across the United States, representing a wide range of regions, institutional types, and scientific strengths. This structure allows knowledge, tools, and expertise to flow across the country, connecting large research universities with smaller institutions, federal agencies with academic scientists, and fundamental research with real-world applications for the public and private sectors. The result is a shared national capability that no single institution could sustain on its own.

There is something deeply American in that collaborative vision, a belief that publicly funded science should be openly shared, collectively advanced, and used to strengthen the common good. NCAR represents what is possible when a nation chooses to invest in science as a public good.

For more than 6 decades, NCAR has shown that open, collaborative science can save lives, support economic resilience and national defense, and expand opportunity across generations. Preserving and celebrating NCAR means choosing a future in which shared knowledge, innovation, and public-serving science continue to thrive.

What We Must Do Now

This moment demands more than concern—it requires action.

First, NSF is requesting feedback regarding its intent to restructure NCAR. Feedback “will be used to inform NSF’s future actions with respect to the components of NCAR and to ensure the products, services, and tools provided in the future align with the needs and expectations of stakeholders to the extent practicable.”

Respond, and inform NSF about the value and benefits of all of NCAR, not only its constituent parts. Readers can submit comments through 13 March.

Second, Congress ultimately holds the authority to fund and protect NCAR, and lawmakers need to hear clearly that dismantling it would put the health, safety, and financial stability of Americans at risk. By October 2026, Congress will address the funding of NSF for next year; we must actively and consistently reach out to our congressional representatives now and throughout the year.

Readers can contact their members of Congress through easy-to-use resources provided by AGU and the Union of Concerned Scientists.

Author Information

Carlos Martinez (cmartinez@ucs.org) is a senior climate scientist with the Climate & Energy Program at the Union of Concerned Scientists.

Citation: Martinez, C. (2026), What Americans lose if their National Center for Atmospheric Research is dismantled, Eos, 107, https://doi.org/10.1029/2026EO260041. Published on 27 January 2026. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2026. The authors. CC BY-NC-ND 3.0

Rocks Formed by Microbes Absorb Carbon Day and Night

Tue, 01/27/2026 - 14:14

On every continent, unassuming rocks covered in a thin, slimy layer of microbes pull carbon from the air and deposit it as solid calcium carbonate rock. These are microbialites, rocks formed by communities of microorganisms that absorb nutrients from the environment and precipitate solid minerals. 

A new study of South African coastal microbialites, published in Nature Communications, shows these microbial communities are taking up carbon at surprisingly high rates—even at night, when scientists hypothesized that uptake rates would fall. 

The rates discovered by the research team are “astonishing,” said Francesco Ricci, a microbiologist at Monash University in Australia who studies microbialites but was not involved in the new study. Ricci said the carbon-precipitating rates of the South African microbialites show that the systems are “extremely efficient” at creating geologically stable forms of carbon.

The study also related those rates to the genetic makeup of the microbial communities, shedding light on how the microbes there work together to pull carbon from the air.

Microbes that rely on photosynthesis live primarily in the top layer of a microbialite, while microbes with metabolisms that don’t require sunlight or oxygen reside deeper within. Credit: Thomas Bornman

“We’re going to learn some critical information through this work that can add to our understanding of carbon cycling and carbon capture,” said Rachel Sipler, a marine biogeochemist at the Bigelow Laboratory for Ocean Sciences in Maine. Sipler and her collaborator, Rosemary Dorrington, a marine biologist at Rhodes University in South Africa, led the new study.

Measuring Microbialites

Over several years and many visits to microbialite systems in coastal South Africa, Sipler and the research team measured different isotopes of carbon and nitrogen to study the microbial communities’ metabolisms and growth rates. They found that the structures grew almost 5 centimeters (2 inches) vertically each year, which translates to about 9–16 kilograms (20–35 pounds) of carbon dioxide sequestered every year per square meter (10.7 square feet) of microbialite. 
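
Converting those carbon dioxide figures to elemental carbon is a standard back-of-the-envelope step using molar masses (12 grams per mole for carbon, 44 for CO2). The minimal Python sketch below is illustrative and not part of the study:

    # Convert the reported CO2 uptake to elemental carbon via molar masses.
    # The 9-16 kg/m2/yr figures are from the article; the conversion is standard.
    C_PER_CO2 = 12.0 / 44.0

    for co2_kg in (9, 16):
        print(f"{co2_kg} kg CO2 ~ {co2_kg * C_PER_CO2:.1f} kg C per square meter per year")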

Results showed the microbialites absorbed carbon at nearly the same rates at night as they did during the day. Both the nighttime rates and the total amount of carbon precipitated by the system were surprisingly high, Ricci said.

 “Different organisms with different metabolic capacities work together, and they build something amazing.”

The traditional understanding of microbialite systems is that their carbon capture relies mostly on photosynthesis, which requires sunshine, making the high nighttime rate so surprising that Sipler and the team initially thought it was a mistake. “Oh, no, how did we mess up all these experiments,” she remembers thinking. But further analysis confirmed the results.

It makes sense that a community of microbes could work together in this way, Ricci said. During the day, photosynthesis produces organic material that fuels other microbial processes, some of which allow organisms in the community to absorb carbon without light. As a result, carbon precipitation can continue when the Sun isn’t shining.

 “Different organisms with different metabolic capacities work together, and they build something amazing,” Sipler said.

Future Carbon Precipitation

The genetic diversity of the microbial community is key to creating the metabolisms that, together, build up microbialites. In their experiments, the research team also found that they were able to grow “baby microbialites” by taking a representative sample of the microbial community back to the lab. “We can form them in the lab and keep them growing,” Sipler said.

The findings could inform future carbon sequestration efforts: Because carbon is so concentrated in microbialites, microbialite growth is a more efficient way to capture carbon than other natural carbon sequestration processes, such as planting trees. And the carbon in a microbialite exists in a stable mineral form that can be more durable across time, Sipler said.

Additional microbialite research could uncover new metabolic pathways that may, for example, process hydrogen or capture carbon in new ways, said Ricci, who owns a pet microbialite (“very low maintenance”). “They are definitely a system to explore more for biotechnological applications.”

Sipler said the next steps for her team will be to continue testing the microbial communities in the lab to determine how the microbialite growth rate may vary under different environmental conditions and to explore how that growth can be optimized. 

“This is an amazing observation that we and others will be building on for a very long time,” she said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Citation: van Deelen, G. (2026), Rocks formed by microbes absorb carbon day and night, Eos, 107, https://doi.org/10.1029/2026EO260037. Published on 27 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Cows, Coal, and Chemistry: The Role of Photochemistry in the Methane Budget

Tue, 01/27/2026 - 14:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Methane is the second-most important greenhouse gas and is increasing in the atmosphere. Unlike CO2, which is taken up by the land and oceans, CH4 (methane) is destroyed in the atmosphere, mostly by reaction with OH (the hydroxyl radical). As methane is one of the largest sinks for the OH radical, it also controls atmospheric OH concentration, which in turn controls the lifetime of CH4 in the atmosphere, creating a feedback.

He et al. [2026] show how the recent increases can best be explained by enforcing consistency among three terms: the CH4 concentration itself, the isotopic composition of CH4, which reflects sources with different signatures, and the abundance of OH simulated with a state-of-the-art chemistry model. The results show that changes to atmospheric CH4 are best explained by a mix of increasing (tropical agriculture) and decreasing (biomass burning) sources, modulated by the global OH trend. The authors also find that the fate of emitted CH4 in the atmosphere is sensitive to chemical feedbacks, which, if ignored, could lead to incorrect assumptions about sources and hence diminish the effectiveness of mitigation.

Citation: He, J., Naik, V., & Horowitz, L. W. (2026). Interpreting changes in global methane budget in a chemistry-climate model constrained with methane and isotopic observations. AGU Advances, 7, e2025AV001822. https://doi.org/10.1029/2025AV001822

—David Schimel, Editor, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Report: 13 Great Lakes’ Worth of Water Underlies the Contiguous United States

Mon, 01/26/2026 - 14:14

It’s not easy to determine how much water there is across a landscape. A measly 1% of Earth’s freshwater is on the surface, where it can be seen and measured with relative ease. Below the surface, estimates vary massively because water table depth and ground porosity can’t be observed directly.

“We’re operating in a situation where we don’t know how much is going into the savings account every month, and we don’t know how much is in our savings account.”

Reed Maxwell, a hydrologist at Princeton University, likes to think of rainfall, snow, and surface water as a checking account used for short-term water management needs and groundwater as a savings account, where a larger sum should, ideally, be building up over time.

“We’re operating in a situation where we don’t know how much is going into the savings account every month, and we don’t know how much is in our savings account,” he said.

But a new groundwater map by Maxwell and colleagues offers the highest-resolution estimate so far of the amount of groundwater in the contiguous United States: about 306,500 cubic kilometers. That’s 13 times the volume of all the Great Lakes combined, almost 7 times the amount of water discharged by all rivers on Earth in a year. This estimate, made at 30-meter resolution, includes all groundwater to a depth of 392 meters, the deepest for which reliable porosity data exist. Previous estimates using similar constraints have ranged from 159,000 to 570,000 cubic kilometers.
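
For readers who want to check those comparisons, here is a minimal back-of-the-envelope sketch in Python. The Great Lakes volume and annual global river discharge used below are approximate reference figures supplied for illustration, not values from the study:

    # Rough check of the volume comparisons quoted above.
    groundwater_km3 = 306_500            # new estimate, contiguous United States
    great_lakes_km3 = 22_700             # approximate combined Great Lakes volume
    river_discharge_km3_per_yr = 45_000  # approximate global river discharge per year

    print(groundwater_km3 / great_lakes_km3)             # ~13.5, i.e., "13 times"
    print(groundwater_km3 / river_discharge_km3_per_yr)  # ~6.8, i.e., "almost 7 times"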

“It’s definitely a move forward from some of the previous [mapping] efforts,” said Grant Ferguson, a hydrogeologist at the University of Saskatchewan who was not involved in the research. “They’re looking at much better resolution than we have in the past and using some interesting techniques.”

Well, Well, Well

Past estimations of groundwater quantity have been based largely on well observations.

“That’s the really crazy thing about groundwater in general,” said Laura Condon, a hydrologist at the University of Arizona and a coauthor of the paper. “We have these pinpricks into the subsurface where there’s a well, they take a measurement of how deep down the water table depth is, and that’s what we have to work with.”

But not all wells are measured regularly. For obvious reasons, there tend to be more wells in places where more groundwater is present, making data on areas with less groundwater scarcer. And a well represents just one point, whereas water table depth can vary greatly over short distances.

Researchers have used these data points, as well as knowledge of the physics of how water flows underground, to model water table depth at a resolution of about 1 kilometer. They’ve also used satellite data to capture large-scale trends in water movement. But those data are of lower resolution: Data from NASA’s GRACE (Gravity Recovery and Climate Experiment) Tellus mission, for instance, have a resolution of about 300 kilometers, about 10,000 times coarser than the new map.

To demonstrate the value of high-resolution data, the team showed what happened when they decreased the resolution of their entire map from 30 meters to 100 kilometers—the spatial resolution of many global hydrologic models. The resulting more pixelated map estimated just above 252,000 cubic kilometers of water, an underestimation of 18% compared to the new map.
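
The 18% figure follows directly from the two volumes reported; this minimal Python sketch shows the arithmetic (numbers from the article, calculation illustrative only):

    # Underestimation caused by coarsening the map from 30 m to 100 km.
    high_res_km3 = 306_500  # 30-meter-resolution estimate
    low_res_km3 = 252_000   # 100-kilometer-resolution estimate

    underestimate = (high_res_km3 - low_res_km3) / high_res_km3
    print(f"{underestimate:.0%}")  # prints 18%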

In addition to identifying groundwater quantities at high resolution, the new map reveals more nuanced information about known groundwater sources.

For instance, it shows that about 40% of the land in the contiguous United States has a water table depth shallower than 10 meters. “That 10-meter range is that range where you can have groundwater–plant–land surface interactions,” Condon said. “And so that’s just really pointing to how connected those systems are.”

Bias for Good

The new work used direct well measurements as well as satellite data—about a million measurements, made between 1895 and 2023—along with maps of precipitation, temperature, hydraulic conductivity, soil texture, elevation, and distance to streams. Then, the scientists used the data to train a machine learning model.

Beyond the model’s ability to quickly sort through so many data points, Maxwell noted another benefit of the machine learning approach that might sound unexpected: its bias. Early groundwater estimates were relatively simplistic, accounting neither for hydrogeology nor for the fact that humans pump water out of the ground. The team’s machine learning approach was able to incorporate that information because evidence of groundwater pumping was present in the data used to train it.

“When you hear about bias in machine learning all the time, it’s usually in a negative connotation, right?” Maxwell said. “As it turns out, when you can’t disentangle the signal of groundwater pumping and groundwater depletion from the almost 1 million observations that we used to train this machine learning approach, it implicitly learned that bias.… It’s learned the pumping signals, it’s learned the human depletion signal.”

“Wherever you’re standing, dig down, and there’s water down there somewhere.”

Maxwell and the other researchers hope the map can be a resource for regional water management decisionmakers, as well as for farmers making decisions about irrigation. Condon added that she hopes it raises awareness of groundwater in general.

“Groundwater is literally everywhere all the time,” she said. The map is “filled in everywhere, wherever you are. Some places it’s 300 meters deep, some places it’s 1 meter deep. But wherever you’re standing, dig down, and there’s water down there somewhere.”

—Emily Gardner (@emfurd.bsky.social), Associate Editor

Citation: Gardner, E. (2026), Report: 13 Great Lakes’ worth of water underlies the contiguous United States, Eos, 107, https://doi.org/10.1029/2026EO260036. Published on 26 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Calibrating the Clocks: Reconciling Groundwater Age from Two Isotopes

Mon, 01/26/2026 - 14:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Water Resources Research

A crucial source of freshwater, groundwater is vulnerable to contamination and overuse. Knowing how long groundwater has been underground is critical for sustainable management of this resource. The carbon-14 (14C) and argon-39 (39Ar) isotopes are environmental tracers especially suited for dating groundwater aged between 50 and 30,000 years. However, ages obtained from previous analyses of these two tracers disagreed with each other.

Musy et al. [2025] use a quantitative framework to understand the effect of groundwater flow within Earth’s subsurface on the ages calculated from 14C and 39Ar measured in aquifers in Denmark. Reactions that affect 14C, the production of 39Ar in the subsurface, and the existence of slow and fast paths for groundwater flow, such as in fractured aquifers, explain the differences observed between age estimates. Accounting for these processes leads to more accurate estimates of groundwater residence times and supports better water resource management.

Citation: Musy, S. L., Hinsby, K., Wachs, D., Sültenfuss, J., Troldborg, L., Aeschbach, W., et al. (2025). Bridging the 39Ar–14C groundwater dating gap: A dual-permeability transport perspective based on numerical modeling and field data. Water Resources Research, 61, e2025WR040370. https://doi.org/10.1029/2025WR040370

—Sergi Molins, Associate Editor, Water Resources Research

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Kyanite Exsolution Reveals Ultra-Deep Subduction of Continents

Fri, 01/23/2026 - 14:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Journal of Geophysical Research: Solid Earth

Understanding how deep continental rocks can be subducted into Earth’s mantle is essential for understanding lithospheric recycling and for reconstructing deep subduction and exhumation processes. Minerals formed at great depths often preserve microscopic “exsolution” features, where one mineral separates out from another during cooling or decompression, but their interpretation has remained debated.

Li et al. [2025] performed the first systematic laboratory experiments on kyanite exsolution from aluminiferous stishovite, a high-pressure polymorph of SiO2 (silicon dioxide) stable at depths exceeding 300 kilometers. The experiments show that aluminum almost completely separates from stishovite to form kyanite during decompression, producing distinctive microscopic textures. These findings address a long-standing debate about whether a specific crystallographic relationship between exsolved phases and their host mineral is required to identify exsolution microstructures.

Importantly, the study demonstrates that a strict crystallographic alignment between the host mineral and exsolved phases is helpful but not always required to identify true exsolution. These results provide a robust experimental framework for interpreting similar microstructures observed in natural rocks. Overall, the findings offer compelling new evidence that continental rocks can undergo ultra-deep subduction into the mantle depths of at least about 300 kilometers and later be exhumed back to the Earth’s surface.

Citation: Li, X., Wang, C., Liu, L., Kang, L., Xu, H. J., Zhang, J., et al. (2025). Kyanite exsolution from aluminiferous stishovite in laboratory experiments: New insights into continental ultra-deep subduction. Journal of Geophysical Research: Solid Earth, 130, e2025JB031612. https://doi.org/10.1029/2025JB031612

—Jun Tsuchiya, Editor; and Sujoy Ghosh, Associate Editor, JGR: Solid Earth

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Discovering Venus on Iceland

Fri, 01/23/2026 - 13:54

In August 2023, 18 scientists and engineers spent 15 days in barren regions of Iceland to test how well instruments on the VERITAS (Venus Emissivity, Radio Science, InSAR, Topography, and Spectroscopy) spacecraft will perform when investigating the surface of Venus from orbit. This testing was a critical step in developing procedures to enhance the science output of the mission, which will provide the first new data about the planet’s surface in more than 3 decades.

Iceland might not seem an obvious choice as an analogue for Venus.

Among other tasks, the team—including us—traversed rugged Icelandic terrain to sample lava flows for analysis on-site and, later, in the lab. We also used on-the-ground observations of the flows to calibrate and verify corresponding airborne-detected radar signatures, information that will help us interpret the radar data VERITAS collects at Venus.

At first glance, Iceland might not seem an obvious choice as an analogue for Venus. After all, Iceland is cool, wet, and near the Arctic Circle. Venus is famous for its extremely hot and dry surface, where average temperatures are roughly 870°F (465°C). Yet the two share key commonalities as well.

Iceland is covered in basalt, the same rock that’s thought to make up the low-lying plains on Venus. Also, Iceland is underlain by an active mantle plume that feeds its volcanic vents, and Venus too shows evidence of having mantle plumes below its surface. With these similarities, scientists can make direct comparisons between the diverse morphologies, compositions, and signatures of Icelandic lava flows and those of flows on Venus. Such comparisons are needed to ensure that we can accurately interpret the data VERITAS sends back to answer long-standing questions about our most Earth-like neighbor.

Reading the Radar Signals

Magellan, the last mission to observe Venus’s surface, ended more than 31 years ago after spending 4 years in orbit. Data from Magellan offered unprecedented glimpses of the planet and opened new lines of inquiry for scientists. However, these data are relatively low resolution by today’s standards, complicating efforts to resolve surface features and understand how they relate to Venus’s geologic past.

In June 2021, NASA selected the Discovery-class VERITAS mission as a long-awaited follow-up to Magellan because its suite of instruments has the potential to reveal the processes that caused the evolution of Venus and Earth to diverge. While Venus likely once had surface water and a planetary dynamo, these essential elements of habitability are long gone. However, tectonism and volcanism, driven by robust internal heat production with associated outgassing, probably persist today, as suggested by evidence in a recent study. If Venus has active volcanism and tectonism, then VERITAS should be able to confirm that activity and to detect surface changes that have occurred since Magellan’s visit, such as new volcanic flows and fault scarps.

The VERITAS payload includes a synthetic aperture radar (SAR) platform to view the planet’s surface and to make topographic measurements using a technique called single-pass radar interferometry. Venus’s thick atmosphere precludes the use of visible light imaging for these purposes, leaving SAR as the only current way to observe its surface over wide areas at high resolution.

Like other radar systems, space-based SAR works by transmitting radio waves to a planet and then detecting the signals reflected back to a receiver, which carry information about the surface. Radar data are fundamentally different from visible imagery: Whereas the brightness of an optical image reflects surface material properties such as albedo and color, the brightness of radar returns depends on surface roughness and electrical permittivity, as well as on other effects such as the polarization of the radar signals and their penetration into a planet’s surface.

Topography is a key metric for unlocking the geologic processes that have shaped the evolutionary history of a planet.

This complexity makes it difficult to determine the geological properties of structures on Venus’s surface directly from their radar signatures. It is impossible to tell from orbital data alone whether any particular radar signature is caused by a rock’s roughness or its composition, because we do not have samples of Venus to test.

Topography is a key metric for unlocking the geologic processes that have shaped the evolutionary history of a planet. Existing topographic data from Venus were obtained by radar altimetry during Magellan at a spatial resolution of 15–20 kilometers and a vertical accuracy of 80–100 meters, each over an order of magnitude coarser than what’s available for other terrestrial bodies.

VERITAS will measure topography using single-pass radar interferometry with a spatial resolution of 240 meters and a vertical accuracy of 5 meters, which is in line with data from the Moon, Mars, and Mercury. This sharper view will dramatically improve scientists’ ability to compare Venus with these bodies and help decipher why it evolved so differently from Earth.
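
A quick comparison using only the figures above shows the size of that jump. This minimal Python sketch is illustrative arithmetic, not mission documentation:

    # Resolution improvement from Magellan to VERITAS topography.
    magellan_spatial_m = 15_000   # lower bound of the 15-20 km footprint
    veritas_spatial_m = 240
    magellan_vertical_m = 80      # lower bound of the 80-100 m accuracy
    veritas_vertical_m = 5

    print(magellan_spatial_m / veritas_spatial_m)    # ~62x finer spatial resolution
    print(magellan_vertical_m / veritas_vertical_m)  # 16x finer vertical accuracy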

In addition to its radar capabilities, VERITAS’s Venus Emissivity Mapper (VEM) spectrometer will provide the first global-scale view of surface rock types, allowing discrimination of felsic from mafic rocks based on their iron content. These data will help scientists answer key questions about Venus’s history of volcanism and how it shaped the planet’s young surface, as well as whether large plateaus called tesserae have a composition and origin similar to those of Earth’s continents (and whether they formed in the presence of water).

What Are Venus’s Rocks Made Of?

Geologic maps of Earth represent the composition and age of rocks in defining geologic units. On Venus, geologic units have been defined based on radar imagery alone, so scientists have had to make assumptions about the composition and formation of features by comparing their morphologies to those of well-known terrestrial features. However, without accurate knowledge of what the Venus rocks are made of, it is difficult to confirm hypotheses of the planet’s geologic history.

  • The VERITAS field campaign explored remote regions of Iceland and encountered rugged conditions. Specialized off-road vehicles were required to access the field areas studied. Credit: Gaetano Di Achille
  • Team members sometimes had to drive through running streams, as seen here on the drive from Mývatn to Askja. Credit: Debra Buczkowski
  • The field team cooked, ate, slept, socialized, and analyzed data at campsites comprising two large tents surrounded by smaller personal tents for sleeping. Credit: Debra Buczkowski
  • Terrain in several areas, including here at Askja, was extremely rough, which made for difficult hiking. Credit: Debra Buczkowski

Unlike with Mars or the Moon, ground truth for Venus orbital data is severely limited. The thick atmosphere obstructs remote sensing of rock composition from orbit. And past landers have not survived long enough on the surface to perform extensive testing, primarily because of the extremely hostile temperatures and atmospheric pressure (90 times Earth’s) at the surface. The VERITAS field campaign was therefore intended as a reality check, to test our geologic interpretations of radar observations.

Iceland’s extensive lava fields host a variety of volcanic and tectonic features similar to those observed on Venus.

The first goal of the campaign was to improve our ability to process, analyze, and interpret VERITAS-like data for the purpose of understanding Venus’s geology. The expedition in Iceland was an opportunity to create a library of radar signatures associated with specific surface features in volcanic landscapes, with direct measurements of both roughness and composition. The second goal was to test the methodologies and the approach that VERITAS will use to detect surface changes when it arrives at Venus.

Iceland’s extensive lava fields host a variety of volcanic and tectonic features similar to those observed on Venus, making them excellent choices as Venus analogues. The comparative lack of both vegetation and erosion at these sites makes them more comparable to those on Venus than basaltic lava fields elsewhere on Earth. In addition, the relative ages of different Icelandic lava flows are known and well documented, which allows us to determine whether radar data can be reliably used to tease out the ages of flows on Venus.

Three Sites, Three Environments

We focused the field campaign on three main basaltic lava flow fields: Askja, Holuhraun, and Fagradalsfjall. The diversity of geologic landforms within these sites enabled study of a range of features analogous to those that VERITAS will target on Venus. These features include plains volcanism, lobate flows, lava morphologies such as pāhoehoe and a’a, compositions ranging from basaltic to rhyolitic, pyroclastic airfall and wind-driven sedimentary deposits covering volcanic bedrock, tectonic rifts, and small-scale graben. The study areas also allowed us to investigate landforms created by interactions between sediment, tectonic structures, and lava flows and how these features appear in SAR data collected from orbit at different signal frequencies and incidence angles.

  • The lava fields studied during the field campaign included Askja and Holuhraun, in Iceland’s central highlands, and Fagradalsfjall, on the Reykjanes Peninsula in the country’s southwest (top left). White lines within the red boxes represent the flight lines flown by the German Aerospace Center (DLR) to collect synthetic aperture radar (SAR) data. Credit: Dan Nunes (map imagery and data: Google, IBCAO, Landsat, Copernicus, SIO, NOAA, U.S. Navy, NGA, GEBCO)
  • A tripod-mounted lidar instrument was used to take topographic measurements at all field sites, including here at Holuhraun. Credit: Sue Smrekar
  • Researchers collect lidar measurements at the Fagradalsfjall field site. Credit: Dan Nunes
  • Field campaign team members work with lidar instrumentation at the Askja site. Orange flags served as tie points for georeferencing the lidar images to airborne radar data. Credit: Debra Buczkowski

The Askja lava field, located in Iceland’s central highlands, is sourced from a central volcano and includes multiple areas with differing textures due to variations in the extent of sedimentation and erosion. Some Askja flows are covered with rhyolitic tephra and basaltic sand, offering additional textural and compositional diversity for study. Volcanism has occurred in the area for thousands of years, with the youngest flow (Vikrahraun) erupting in 1961.

Although geographically close to Askja, the Holuhraun lava field is sourced from a different magmatic reservoir and erupted from fissures. Sand sheets interact with the edges of the Holuhraun flows, especially along their northern boundary. The field also includes an extremely rough flow that was emplaced only about 10 years ago (2014–2015).

Located in southwestern Iceland on the Reykjanes Peninsula, the flows at Fagradalsfjall are even more recent, erupting from fissures starting in 2021 and continuing through 2025. (In fact, Fagradalsfjall was actively erupting at the beginning of the field campaign.) These recent flows, including pāhoehoe flows, are significantly smoother than those at Holuhraun and, because of their young age, have relatively little sediment coverage. Lava ponds and channels are also common here.

Air and Ground Campaigns

The F-SAR sensor was installed on DLR’s Dornier 228 aircraft and flown out of Keflavik International Airport (left). The radar antenna mount is on the side of the fuselage just aft of the rear wheels. One of three trihedral radar reflectors deployed for use as reference targets during the campaign is seen at right. Credit: Marc Jaeger

The field campaign comprised both airborne and ground components. The German Aerospace Center (DLR), one of several agencies partnering with NASA on VERITAS, ran the airborne component, flying their F-SAR sensor aboard a twin-propeller plane to collect SAR data at three wavelengths (X-, S-, and L-band) at the same time the ground campaign team members visited each site.

The extensive multifrequency SAR dataset that DLR acquired covers the diverse geological features of the three lava fields and includes imagery that represents the differing spatial and vertical resolution capabilities of the Magellan and VERITAS missions, as well as those of the upcoming European Space Agency EnVision mission to Venus. Figure 1 shows an example of derived F-SAR topographic data for a lava flow at Holuhraun at simulated Magellan and VERITAS resolutions. Whereas the Magellan-like data only allow determination of the general slope of the landscape over a spatial scale of tens of kilometers, the VERITAS-like data enable spatial and vertical discrimination of distinct geologic units.

Fig. 1. A radar backscatter image above the Askja and Holuhraun lava fields (left) is seen here beside digital elevation models (DEM) produced with SAR topographic data at resolutions simulating those of Magellan radar altimeter data (center) and VERITAS radar altimeter data (right). White arrows point to the Holuhraun flow boundaries in all three images. Whereas at Magellan resolution, only a general regional slope can be discerned, at VERITAS resolution, it’s possible to pick out individual lava flows as well as Vaðalda Mountain. Credit: Scott Hensley

F-SAR operations also included deploying radar reflectors that were used as reference targets and regularly imaged to monitor sensor calibration and instrument stability throughout the campaign. In addition, a subset of the raw SAR data acquired was processed on-site within hours of each flight, providing imagery to inform the field teams’ site selection and prioritization within each lava field.

Team members braved river crossings and trekked across often-jagged rocks to take samples and collect information on the surface roughness and composition of the rocks being scanned from the air.

Concurrent with the radar data collection, VERITAS team members braved river crossings and trekked across often-jagged rocks to take samples and collect information on the surface roughness and composition of the rocks being scanned from the air. We simultaneously used lidar scanners to take topographic measurements at all field sites to compare with radar detections and a probe to determine electrical permittivity in sedimented areas. These measurements allowed us to determine how much of the radar backscatter signature at each site was due to the permittivity, rather than to roughness.

In addition, we used a field prototype of VERITAS’s VEM instrument, called the Vemulator, on-site to identify different rock types and compositions. Rock and sediment samples from all field sites were later tested in the lab to confirm field measurements of composition and permittivity, including those from the Vemulator.

Details Come into Focus

Following the field campaign, team members produced maps of the lava flows at all three sites based solely on the radar data, exactly as Venus researchers have made geologic maps of Venus using Magellan data. The new maps were made at three different resolutions: the resolution of the old Magellan data, the resolution of the VERITAS SAR, and the highest resolution available with the data collected during the campaign.

The improvement in SAR resolution from Magellan to VERITAS will permit observations of previously unidentified features on Venus (Figures 2 and 3). Views of the Holuhraun flow at Magellan resolution, for example, are too coarse to discern distinct lava flow units, or facies, whereas at the VERITAS resolution, separate facies, a small vent, and several lava ponds can be distinguished. Being able to identify similar features on Venus will allow us to detect changes on the surface since Magellan’s visit that would indicate recent volcanism, helping to better understand the planet’s volcanic history.

Fig. 2. SAR imagery of the Holuhraun flow is seen here at Magellan’s lower resolution and VERITAS’s higher resolution. In the latter case, lava ponds and a volcanic vent are observed (white arrows). Black boxes indicate the southwestern part of the flow that’s magnified in Figure 3. Credit: Debra Buczkowski

Fig. 3. Even when magnified, no features within the Holuhraun flow can be discerned at Magellan resolution, whereas at VERITAS resolution, the volcanic vent (white arrow) and distinct flows coming from it are visible, as are other flow facies. Credit: Debra Buczkowski

Comparing our new aerial-radar-derived maps of the Iceland field sites with published maps based on ground observations enabled us to assess how well our flow boundaries matched what’s seen from the ground. In addition, we were able to determine how similar the observed radar properties were to actual flow composition and roughness. Once the flow boundaries were defined in the SAR datasets, we could also determine how overlying sediment influenced the radar appearances of different lava flows. This information could provide insight into how ashfalls or pyroclastic materials on Venus might obscure or alter the radar signature of underlying rocks.

When it arrives at its destination, VERITAS will create foundational datasets of high-resolution imaging, topography, and spectroscopy of Venus.

When it arrives at its destination, VERITAS will create foundational datasets of high-resolution imaging, topography, and spectroscopy of Venus. These datasets will be on par with those that have revolutionized our understanding of Mercury, Mars, and the Moon.

The 2023 field campaign served as both a test of the VERITAS instruments and a demonstration of what their improved capabilities will offer at Venus. Indeed, the campaign’s success demonstrated how VERITAS will make new discoveries and improve our knowledge of the planet’s past and present, and that it could lay the groundwork to optimize the science return of future Venus missions.

Author Information

Debra L. Buczkowski (debra.buczkowski@jhuapl.edu), Johns Hopkins Applied Physics Laboratory, Laurel, Md.; Jennifer L. Whitten, National Air and Space Museum, Smithsonian Institution, Washington, D.C.; Scott Hensley and Daniel C. Nunes, Jet Propulsion Laboratory, California Institute of Technology, Pasadena; and Marc Jaeger, Microwaves and Radar Institute, German Aerospace Center, Oberpfaffenhofen, Germany

Citation: Buczkowski, D. L., J. L. Whitten, S. Hensley, D. C. Nunes, and M. Jaeger (2026), Discovering Venus on Iceland, Eos, 107, https://doi.org/10.1029/2026EO260032. Published on 23 January 2026. Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Planet Labs image of the 14 January 2026 landslide at Burutsi village, in the Democratic Republic of Congo

Fri, 01/23/2026 - 11:23

A new satellite image confirms that over 15 houses were buried in a landslide that took the lives of almost 30 people.

Back on 15 January, I wrote about the 14 January 2026 landslide at Burutsi village, in the Democratic Republic of Congo. This landslide killed 28 people and injured 20 more.

This is a remote area, so getting detailed information about the location is very challenging. It is also very cloudy, limiting satellite imagery. However, on 21 January 2026, Planet Labs captured an image of the area using one of their Super Dove instruments. This is the image, draped onto the Google Earth DEM:-

Planet Labs image of the 14 January 2026 landslide at Burutsi in the DRC. Image copyright Planet Labs, captured on 21 January 2026, used with permission.

This is a Google Earth image from 2024 of the same area:-

Google Earth image of the site of the 14 January 2026 landslide at Burutsi in the DRC. Image captured on 8 January 2024.


This is a Google Earth image of the affected area in more detail:-

Google Earth image of the site of the 14 January 2026 landslide at Burutsi in the DRC. Image captured on 8 January 2024.

There is nothing obvious in the imagery to suggest that this slope was dangerous, noting of course the masking effect of the dense forest. As reported in the media, the landslide buried about 17 houses and closed the road.

The imagery clarifies the location of the landslide – it’s at [-1.30050, 28.66080].

Acknowledgement

Thanks as ever to the kind people at Planet Labs for providing access to their amazing imagery.

Return to The Landslide Blog homepage Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Coastal Coralline Algae Naturally Survive Persistent, Extreme Low pH

Thu, 01/22/2026 - 19:11
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Journal of Geophysical Research: Biogeosciences

Ocean acidification is known to have major impacts on marine habitats under projected climate change. How vulnerable marine organisms in these habitats are to acidification largely depends on the variability of the environmental conditions, such as pH, that they experience naturally.

Burdett et al. [2025] provide valuable time-series evidence that, unlike the open ocean, coastal ecosystems experience high natural environmental variability. For about two thirds of the year, the monitored coastal coralline algae reef was exposed to pH levels as low as those expected for the year 2100 under IPCC projections. The pH levels varied considerably throughout the day and between seasons, associated with biological activity, tidal cycling, and water temperature. Long‐term exposure to such low pH conditions and high variability may help coralline algal communities adapt to future acidification, providing a level of optimism for the survival of this globally distributed, biodiverse habitat.

Citation: Burdett, H. L., Mao, J., Foster, G. L., & Kamenos, N. A. (2025). Persistence of extreme low pH in a coralline algae habitat. Journal of Geophysical Research: Biogeosciences, 130, e2025JG009062. https://doi.org/10.1029/2025JG009062

—Xiaojuan Feng, Associate Editor, JGR: Biogeosciences

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Early news about the 22 January 2026 landslide at Mount Maunganui in New Zealand

Thu, 01/22/2026 - 08:10

Eight people have been killed or are missing in two landslides triggered by heavy rainfall in New Zealand.

Substantial parts of New Zealand have been suffering extreme rainfall – yet again – causing floods and landslides. The most serious event to date occurred at Mount Maunganui on the Bay of Plenty in the North Island, where a landslide devastated a campsite close to the coast. Unfortunately, January is the main summer holiday period in New Zealand.

Stuff has a video of the landslide as it occurred. Meanwhile, The Guardian has a YouTube video with imagery of the aftermath:-

This still shows the basic components of the failure:-

The aftermath of the 22 January 2026 landslide at Mount Maunganui. Still from a video posted to YouTube.

The location is reported to be the Mount Maunganui Beachside Holiday Park. This makes the location [-37.63234, 176.17507]. This is Google Earth image of the site:-

Google Earth image of the site of the 22 January 2026 landslide at Mount Maunganui.

The image suggests a complex geology, with maybe a hint of previous landslides (this is very speculative). The geology of this area is primarily volcanic rocks, which may indicate a high landslide susceptibility. The images of the aftermath appear to suggest deeply weathered soils, and note the amount of water flowing through the debris.

News reports indicate that at least six people are missing, some of whom are children. The authorities are continuing to describe the operation at the site as a rescue.

Meanwhile, two other people were killed by an early morning landslide at Welcome Bay Road in Papamoa, also on the Bay of Plenty. This appears to have occurred at about [-37.7231, 176.20896]. One News has an image of the aftermath of the event that appears to show multiple shallow landslides on the same hillside.

Return to The Landslide Blog homepage Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Trump Administration to Speed Up Permitting for Deep Sea Mining, Even Beyond U.S. Boundaries

Wed, 01/21/2026 - 18:07
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

NOAA has finalized a rule that will expedite the permit and license application process for deep seabed mining and allow companies to mine beyond U.S. jurisdictional boundaries.

The changes were published in a 113-page regulation on 21 January.

The changes revise NOAA’s regulations under the Deep Seabed Hard Mineral Resources Act (DSHMRA) of 1980, which required individuals or corporations that wanted to explore and mine mineral-rich nodules in the deep sea to apply for an exploration license and a commercial recovery (large-scale extraction) permit separately. Now, applicants may apply for both the exploration license and the commercial recovery permit at the same time.

“By issuing the permit simultaneously, they’re committing to exploitation without the information that you would need to evaluate its impacts.”

“Deep seabed mining is key to unlocking a domestic source of critical minerals for the United States,” Neil Jacobs, NOAA administrator, said in a statement. “This consolidation modernizes the law and supports the America First agenda by enabling U.S. companies to access these resources more quickly, strengthening our nation’s economic resilience and advancing the discovery and use of critical seafloor minerals.” 

Critics are concerned that the move will loosen environmental oversight. “By issuing the permit simultaneously, they’re committing to exploitation without the information that you would need to evaluate its impacts,” Emily Jeffers, senior attorney at the Center for Biological Diversity, told Agence France-Presse.

Beyond U.S. Boundaries

The updated rule also states that DSHMRA gives NOAA the ability to issue exploration licenses and permits for the seabed beyond national jurisdiction. The International Seabed Authority (ISA), an autonomous international governing body, regulates deep sea mining in international waters for countries that are part of the 1982 Law of the Sea Convention. The United States has never been a party to that treaty but has mostly followed its guidelines.

Now, NOAA’s insistence that the United States can regulate U.S. companies’ deep sea mining beyond U.S. waters is expected to cause controversy among members of the ISA, which has for years been negotiating rules to govern mining in international waters. In December, the Trump administration announced it had received an application for mining exploration in international waters from The Metals Company.

The final rule follows an executive order issued last year calling for the rapid development of deep sea mining capabilities both domestically and beyond U.S. jurisdictional boundaries.


In a response to that order, the ISA called it “surprising because for over 30 years the US has been a reliable observer and significant contributor to the negotiations of the International Seabed Authority.” In the statement, the ISA also said any unilateral action to mine the deep sea “sets a dangerous precedent that could destabilize the entire system of global ocean governance.”

The deep sea has never been commercially mined. Compared to other ecosystems, little is known about the ecology of the ocean floor or how these ecosystems support marine life. Disturbing these ecosystems could have wide-ranging consequences.

“Once nodules are removed by mining, all biodiversity and functions directly dependent on the minerals will be lost for millions of years at the mined location, as nodules need millions of years to re-form,” Sabine Gollner, a deep-sea marine biologist at the Royal Netherlands Institute for Sea Research, told Eos in 2024.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Why Are River Deltas Disappearing? They’re Sinking Faster Than Many People Realize

Wed, 01/21/2026 - 13:54

This story was originally published in the Louisiana Illuminator.

A new study says river deltas around the world aren’t just disappearing because of rising seas, but also because the land itself is sinking down into the waters, either as fast or faster than the rising oceans.

Researchers found some of the most rapid sinking is happening along the Mississippi River Delta in Louisiana. The study aims to better guide coastal restoration in disappearing river deltas around the globe, helping leaders, scientists, and people living in coastal communities with hard decisions on what can—and should—be saved.

“Coastal areas account for less than 1% of the entire land area we have,” said Leonard Ohenhen, a professor at the University of California, Irvine, and the lead author of the study. “But a whole significant population, more than 600 million people, live in those areas.”

“You have a sort of a hodgepodge of different reasons why deltas are sinking.”

The study, published this week in the academic journal Nature, found that the contribution of subsidence, or slowly sinking land, to disappearing coasts is often overlooked.

The fight to preserve rapidly sinking land has been a decades-long battle in the Mississippi River Delta, as well as a source of contention between scientific and political figures in the state. But deltas across the world are sinking, too, and fast.

“You have a sort of a hodgepodge of different reasons why deltas are sinking,” Ohenhen said.

He said river deltas naturally sink to some degree, with sediment carried downstream by rivers piling up and pushing down on the spongy, soft land already there. Humans can accelerate this natural process by engineering rivers such as the Mississippi and by extracting groundwater or oil.

“Relative sea level rise in the area is also really important. That’s the sea level rise plus subsidence,” said Alisha Renfro, a coastal scientist with the National Wildlife Federation. “It really helps us understand where we can make investments in restoration long-term that we might actually be able to hold on to.”

This map from the report shows which areas of the Mississippi River Delta are sinking. Areas in red and yellow are areas of land sinking more rapidly, while spots in blue and purple are building land upwards. Credit: University of California, Irvine

Lack of sediment is the main driver of subsidence in the Mississippi River Delta, Ohenhen said, creating hotspots of rapidly sinking land amid slightly more stable areas. In most of the deltas studied in the paper, around 70%, subsidence is primarily the result of groundwater withdrawal. But in some, like the Amazon and Mississippi deltas, subsidence is driven by the loss of the river-carried sediment that would otherwise replenish the delta’s land.

Putting hard numbers to subsidence and pinpointing its causes—like human activity—is invaluable for restoring coastal land.

“I would say that really validates what, not just my organization, but what a lot of people have recognized for a long time—that this was a significant contributing factor in subsidence,” said James Karst with the nonprofit advocacy group Coalition to Restore Coastal Louisiana, referring specifically to the lack of sediment sent to the Mississippi River Delta.

Decisions about what pieces of the coast can be saved are even more urgent with the cancellation of two large-scale restoration projects in Louisiana.

Known as sediment diversions, the Mid-Barataria and Mid-Breton plans would have diverted freshwater from the Mississippi River into surrounding wetlands. They were scrapped by the state because of the prospective impact on fisheries for oysters, crabs and other marine species. Fish and oyster harvesters celebrated the projects’ demise, while scientists and coastal restoration advocates warned that time is running out to save the coast.

Boaters fish in the canals and wetlands just outside of New Orleans, Louisiana. Coastal restoration projects spearheaded by the state hope to preserve areas of subsiding land that are at risk of disappearing. Credit: Elise Plunk/Louisiana Illuminator

“In light of the cancellation of Mid-Barataria, I think what we, everybody, should be thinking of is, ‘What is the next best thing?’” Karst said. “Clearly it is not going to move forward, but we can’t do nothing.”

“People should be aware that we are in a part of the world that is changing and that is changing rapidly.”

“People should be aware that we are in a part of the world that is changing and that is changing rapidly,” he added. “If we want to position ourselves as individuals and as communities, we should be anticipating these changes and anticipating how they will affect us.”

While the average rate of subsidence for the Mississippi River Delta is around 3.3 millimeters per year, Ohenhen said, some areas of Louisiana are sinking at a rate of 3 centimeters per year, one of the fastest rates of all the deltas studied. That is paired with sea level rising by at least 7 millimeters per year along the Gulf Coast, he said, also one of the highest rates in the world. This puts some areas of Louisiana’s land at higher risk of loss than anywhere else.
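
Renfro’s definition of relative sea level rise (sea level rise plus subsidence) can be applied directly to the rates quoted above. The following minimal Python sketch is illustrative arithmetic only, not a calculation from the study:

    # Relative sea level rise = subsidence + sea level rise (mm per year),
    # using only the rates quoted in the article.
    subsidence_avg = 3.3       # Mississippi River Delta average
    subsidence_hotspot = 30.0  # ~3 cm/yr in the fastest-sinking areas
    sea_level_rise = 7.0       # Gulf Coast rate

    print(subsidence_avg + sea_level_rise)      # ~10 mm/yr relative rise on average
    print(subsidence_hotspot + sea_level_rise)  # ~37 mm/yr in hotspots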

“In the Mississippi River Delta, for example, that is one of the only deltas in the world where you have active relocation of people from the delta due to land loss,” Ohenhen said. “The time that we need to respond to these changes is now before the situation gets significantly worse.”

—Elise Plunk (@plunk.bsky.social), Louisiana Illuminator

This story is a product of the Mississippi River Basin Ag & Water Desk, an independent reporting network based at the University of Missouri in partnership with Report for America, with major funding from the Walton Family Foundation.

The underlying causes of the 8 February 2025 Junlian rock avalanche in Sichuan Province, China

Wed, 01/21/2026 - 07:27

A new paper (Jia et al. 2026) has found that the 8 February 2025 Junlian rock avalanche was caused by progressive weakening of the rock mass through wetting and drying cycles.

On 8 February 2025, the major Junlian rock avalanche occurred at Jinping Village in Sichuan Province, China. A paper (Jia et al. 2026) has now been published in the journal Landslides that provides more details about the possible causes of this event. This link should provide access to the paper.

An earlier paper (Zhao et al. 2025), which I noted in June, has already described this landslide. This is a photograph of the aftermath of this event:

The aftermath of the 8 February 2025 Junlian rock avalanche in Sichuan, China. Image by Xinhua.

Unfortunately, the paper does not give a lat / long for this landslide, but I have previously noted that it is at [27.99885, 104.60801].

As a reminder, Zhao et al. (2025) determined that the initial failure was 370,000 m3, increasing to 600,000 m3 through entrainment. The landslide had a runout distance of 1,180 metres and a vertical elevation change of 440 metres. In total, 29 people were killed.
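
These figures also allow a quick mobility check using the H/L ratio (fall height divided by runout length), a standard rule of thumb for rock avalanches. The little Python calculation below is mine, not from either paper:

    import math

    # H/L ratio (angle of reach) for the Junlian rock avalanche,
    # using the figures reported by Zhao et al. (2025).
    H = 440.0   # vertical elevation change, metres
    L = 1180.0  # runout distance, metres

    print(round(H / L, 2))                           # ~0.37
    print(round(math.degrees(math.atan(H / L)), 1))  # angle of reach, ~20.4 degrees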

The slightly odd thing about this failure is that the rainfall event that appears to have triggered it was unexceptional (c. 85 mm over the previous 30 days). I hypothesised that a progressive failure mechanism could have been in play.

Jia et al. (2026) have made some really interesting observations. First, this site was subject to previous landslides, most notably in February 2013. The paper notes that:

“all 173 people from 29 households under threat [from this earlier event] were included in the geohazard risk avoidance relocation subsidy program. Some farmers self-demolished their houses, but as some occasionally returned during the farming season, the Mu’ai Town Government, with support from the county government, organized mandatory demolition of unremoved houses in the area in 2018.”

Further failures occurred in 2021 and 2022, whereupon all the households immediately below the unstable slope were relocated. However, homes located at a greater distance from the cliff were left in place – it was their occupants who were affected by the 2025 event.

Jia et al. (2026) suggest that initial movement of the landslide in the years before 2025 weakened the rock mass and opened pathways for the movement of water into the shear zone. Critically, their work suggests that successive wetting and drying cycles led to degradation of the sandstones and mudstones forming the slope, moving the mass towards failure.

This weakening was sufficient to render the slope vulnerable to the effects of the rainfall in February 2025, triggering the Junlian rock avalanche.

We might take away two key messages from this work. The first is the need to understand the likely runout characteristics of a slope when determining the safety of the population. This is devilishly difficult. That there was an ongoing programme to relocate the most vulnerable people is (on the face of it) good, but its success depends on this calculation.

Second is the need to understand the complexities of the processes occurring in a slope. In the case of the Junlian rock avalanche, it was the progressive weakening of the rock mass through wetting and drying cycles that meant that the slope could fail under the influence of unexceptional rainfall. As we drive climate change, similar processes will be occurring in many more slopes in China and elsewhere. That is going to pose a major challenge in terms of keeping people safe.

References

Jia, W., Wen, T., Chen, N. et al. 2026. Dry–wet cycle may trigger the catastrophic landslide in Junlian on February 8, 2025. Landslides. https://doi.org/10.1007/s10346-026-02692-2

Zhao, B., Zhang, Q., Wang, L. et al. 2025. Preliminary analysis of failure characteristics of the 2025 Junlian rock avalanche, China. Landslides. https://doi.org/10.1007/s10346-025-02556-1

Return to The Landslide Blog homepage Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
