Eos
Science News by AGU

Understanding Aerosol-Cloud Interactions is Pivotal for Improving Climate Predictions

Thu, 02/26/2026 - 13:58
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

The dynamics of aerosol-cloud interactions are far from completely understood and are therefore a source of uncertainty in climate modeling. Im et al. [2026] call for integrating into climate models, through data assimilation, the rich and rapidly growing information provided by satellite remote sensing and by ground-based and airborne observations. The authors propose machine learning as a valuable resource for combining these diverse sources of information and for exploring new retrieval algorithms; machine learning also provides the means to build climate model emulators that speed up climate simulations. They call for a global effort, built on renewed international cooperation, to advance our understanding of aerosol-cloud interactions, with the goal of reducing the uncertainty of climate projections.

Contributions to global surface air temperature (GSAT) change (1750–2019) from individual forcing components, including uncertainties as assessed by the IPCC AR6. Credit: Im et al. [2026], Figure 1 (left panel)

Citation: Im, U., Samset, B. H., Nenes, A., Thomas, J. L., Kokkola, H., Dubovik, O., et al. (2026). Aerosol-cloud interactions: Overcoming a barrier to projecting near-term climate evolution and risk. AGU Advances, 7, e2025AV001872. https://doi.org/10.1029/2025AV001872   

—Alberto Montanari, Editor-in-Chief, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Slow and Fast Madden-Julian Oscillation Modes

Wed, 02/25/2026 - 21:30
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Geophysical Research Letters

Subseasonal forecasts have skill in part because of the Madden-Julian Oscillation (MJO), which modulates convection in the tropics as it moves eastward along the equator. In a new study, Marsico et al. [2026] use a data-driven model to identify two modes of the MJO—a fast-MJO mode, with a 45-day period, and a slow-MJO mode, with a 70-day period. These two modes interfere constructively and destructively, and when combined they reproduce the well-known characteristics of the MJO. The authors find that identifying these modes and their combination in subseasonal forecasts could extend skillful MJO prediction by approximately one week.
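
As a rough illustration of how two such modes interfere (a toy superposition using the quoted periods, not the authors’ data-driven model):

```python
import numpy as np

# Two idealized MJO modes with the periods reported in the study.
t = np.arange(0, 400)                    # time in days
fast = np.cos(2 * np.pi * t / 45.0)      # fast-MJO mode, 45-day period
slow = np.cos(2 * np.pi * t / 70.0)      # slow-MJO mode, 70-day period
combined = fast + slow                   # constructive/destructive interference

# The combined signal waxes and wanes with the beat period
# 1 / (1/45 - 1/70), roughly 126 days.
beat_period = 1.0 / (1.0 / 45.0 - 1.0 / 70.0)
print(f"beat period ~ {beat_period:.0f} days")
```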

Citation: Marsico, D. H., Albers, J. R., Newman, M., Gehne, M., Dias, J., Kiladis, G. N., et al. (2026). Modal interference drives Madden-Julian Oscillation evolution and predictability. Geophysical Research Letters, 53, e2025GL118062. https://doi.org/10.1029/2025GL118062  

—Suzana Camargo, Editor, Geophysical Research Letters

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

With the Ocean Included, the Social Cost of Carbon Doubles

Wed, 02/25/2026 - 14:05

How much money is climate change costing humanity? The social cost of carbon, a term for the monetary damages caused by excess carbon emissions, provides one answer. 

That calculation has traditionally disregarded the impacts that climate change has on the ocean—until now.

“Once you see it, you cannot unsee it,” said Bernie Bastien-Olvera, a climate change scientist at the National Autonomous University of Mexico.

Bastien-Olvera is the lead author of a new paper published in Nature Climate Change that reevaluates the social cost of carbon, taking into account climate change’s effects on marine ecosystems. The study found the social cost of carbon nearly doubled when impacts on the ocean were considered.

“We usually think of the ocean economy as much, much smaller than the land-based economy. So the idea that climate change’s impacts on it could be as big as [they are] on land, where all of our infrastructure and people live, is surprising,” said James Rising, a climate economist at the University of Delaware who was not involved in the new study. 

Involving the Ocean

Calculations of the social cost of carbon typically consider the economic impacts of climate change on property, agricultural productivity, and human health. Ocean elements, if they’re included at all, are usually limited to the ability of the ocean to absorb carbon.

Both the National Academies of Sciences, Engineering, and Medicine and the U.S. EPA have published reports emphasizing the need for marine ecosystem representation in calculations of the social cost of carbon. 

“It’s so huge a gap,” Bastien-Olvera said. “It’s very well-documented that the oceans are a big missing piece in the social cost of carbon.”

Climate change is having a clear effect on ocean elements, regardless of whether those elements are included in calculating the social cost of carbon. As greenhouse gas emissions rise, marine chemistry is changing, and oceans are heating up, leading to interrelated phenomena, including ocean acidification, a loss of coral reefs, extreme weather events, and ecological imbalances. 

Bastien-Olvera and a team of scientists from seven countries integrated recent literature about the impacts of climate change on marine ecosystems and economies into the Regional Integrated Climate-Economy model (RICE50+) traditionally used to calculate the social cost of carbon. The researchers incorporated impacts on fisheries and mariculture, corals, mangroves, and seaports into the model, projecting costs under multiple climate change scenarios.

When impacts on marine ecosystems and infrastructure were considered, the social cost of carbon jumped to $97 per ton of carbon dioxide, nearly double the $51 per ton calculated when only terrestrial impacts were included. “It’s a very significant component to the total social cost of carbon,” Rising said.

Bastien-Olvera considers this number conservative because his team accounted for only a few of the ways that climate change is affecting oceans. “There’s a very, very long list of things that are not yet represented,” such as the existence value of deep-sea animals, the benefits of coastal protection from kelp forests, and the habitat offered by seagrasses, he said. 

Modeling Marine Ecosystem Services

Economists calculating the monetary value of nature use so-called substitution parameters to evaluate how a natural system—like a coral reef—could be replaced with capital inputs from humans. Some benefits of natural systems, such as corals’ ability to protect shorelines from flooding, require fairly straightforward substitutions. Other benefits, like simply knowing that coral reefs exist and are beautiful, are much more difficult to put a price on.
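
For illustration, one common functional form for this idea (an assumption here; the paper’s exact specification may differ) is a constant elasticity of substitution (CES) aggregator, in which market consumption C and an ecosystem service E combine as

\[ U = \left[\theta\, C^{(\sigma-1)/\sigma} + (1-\theta)\, E^{(\sigma-1)/\sigma}\right]^{\sigma/(\sigma-1)}, \]

where the elasticity σ is the substitution parameter: a large σ means human-made capital can stand in for a lost service, while σ near zero means almost no amount of spending compensates for the loss.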

Rising said that the current study used simple substitution parameters to evaluate intangible elements such as enjoyment. He said more research on climate change’s impacts on the ocean is needed to help models better reflect the different economic values of different marine ecosystems.

Overall, the paper’s authors “do a really convincing job” and use “very reasonable economic steps” to estimate the social cost of carbon, Rising said. 

Rising said the new paper could have an immediate policy impact for governments and organizations that use estimates of the social cost of carbon and is a great first step for other scientists attempting to estimate the measurement. 

“What these authors have done is give us a framework for thinking about further improvements, and it’s going to be exciting,” he said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Citation: van Deelen, G. (2026), With the ocean included, the social cost of carbon doubles, Eos, 107, https://doi.org/10.1029/2026EO260067. Published on 25 February 2026. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Drought Drove the Amazon’s 2023 Switch to a Carbon Source

Wed, 02/25/2026 - 14:04
Source: AGU Advances

The Amazon is the world’s largest tropical rainforest, typically storing more carbon than it releases into the atmosphere each year. But in 2023, global high-temperature records accompanied droughts and heat waves across South America, disrupting that stable pattern.

Botía et al. combined carbon dioxide measurements and global atmospheric data to calculate the Amazon rainforest’s 2023 carbon balance using several data sources, including vegetation and atmospheric models, remote sensing data of fire emissions, vegetation indices, and proxies for gross primary productivity (a measure of how much carbon an ecosystem takes up through photosynthesis). The researchers compared the Amazon Basin–scale patterns to local flux measurements of carbon dioxide from the Amazon Tall Tower Observatory, located in the central Amazon in northern Brazil.

They found that the forest released between 10 billion and 170 billion kilograms of carbon into the atmosphere in 2023 (including fire-related emissions), turning the ecosystem into a small net carbon emitter. The change was most pronounced in the second half of the year, likely driven by climate warming and high sea surface temperatures in both the Atlantic and Pacific oceans. The warming atmosphere and seas, along with an extended dry season, were likely compounded by the transition from La Niña to El Niño conditions.

However, despite an increase in drought-driven fires in the southern Amazon and an extended fire season, fire-related emissions from the rainforest were within the long-term (2003–2023) average in 2023. This level of fire-related emissions indicated that the rainforest’s change from a carbon sink to a carbon source was caused by the rainforest’s vegetation absorbing less carbon during drought conditions, rather than by fire-induced carbon release.

The rainforest’s record-breaking switch from a carbon absorber to a carbon emitter accounted for up to 30% of worldwide tropical carbon emissions in 2023, the researchers say. The findings suggest that the Amazon could become an overall carbon source faster than previously predicted. However, the authors note that the research so far is not conclusive, and the possibility of the ecosystem recovering exists as well. (AGU Advances, https://doi.org/10.1029/2025AV001658, 2026)

—Madeline Reinsel, Science Writer

Citation: Reinsel, M. (2026), Drought drove the Amazon’s 2023 switch to a carbon source, Eos, 107, https://doi.org/10.1029/2026EO260059. Published on 25 February 2026. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The 8 December 2024 fatal landslide on the Güngören hillslope in Artvin, northeastern Türkiye

Wed, 02/25/2026 - 08:22

A landslide that killed four people in Türkiye was associated with progressive failure of a slope with known stability issues. The final failure was triggered by heavy, but not exceptional, rainfall.

On 8 December 2024, a fatal landslide occurred on the Güngören hillslope in Artvin, northeastern Türkiye. The landslide, which occurred at 3:05 am local time, flowed across the D010 (E70) Black Sea coastal road, killing four people. I blogged about this landslide at the time, but now a detailed analysis (Görüm et al. 2026) has been published in the journal Landslides. The paper is both Open Access and published under a Creative Commons Licence, which is very helpful for those of us who write blogs.

The Güngören hillslope is located at [41.337634, 41.26327]. This image, from Görüm et al. (2026), shows the aftermath of the landslide:-

The aftermath of the fatal landslide on the Güngören hillslope in Artvin, northeastern Türkiye. Image from Görüm et al. (2026).

Görüm et al. (2026) describe this landslide as a debris avalanche with a length of 522 m, a width of 250 m, and an elevation difference of 287 m. It has a volume of about 100,000 m³. There have been previous landslides on this slope, one of which (in 2006) was fatal.

The landslide was associated with heavy rainfall (80 mm/day), but this was not exceptional, which means that the history of the slope is important in terms of the development of progressive failure. Görüm et al. (2026) used InSAR to show that the slope was deforming in the two years leading up to the failure, with rates in the range of 60 mm per year. Just 23 days before the Güngören hillslope failed, the 15 November 2024 Mw=4.7 Pazar (Rize) earthquake occurred about 45 km from the site. The calculated peak ground accelerations on the Güngören hillslope were low, but the shaking may still have played a role in the development of the final failure.

Görüm et al. (2026) also highlight two potentially important human factors in the occurrence of the landslide. First, the slope was quarried in the period leading up to 2006 for construction materials for the Black Sea Coastal Road. Notably, the fatal 3 April 2006 landslide was triggered by quarry blasting. One person died.

Second, the construction of the Black Sea Coastal Road may have destabilised the slope, perhaps through excavation at the toe.

Of course further instability on this slope seems likely, so Görüm et al. (2026) recommend ongoing monitoring of the site.

Reference

Görüm, T., Tanyaş, H., Yılmaz, A. et al. 2026. Fatal debris avalanche on an anthropogenically disturbed, earthquake-perturbed slope during antecedent rainfall. Landslides. https://doi.org/10.1007/s10346-026-02713-0.

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

How to Accelerate Advances in Ecological Forecasting

Tue, 02/24/2026 - 13:59

Just as meteorologists routinely predict temperature changes, storm trajectories, and other weather patterns, ecologists also forecast how ecosystems and environmental conditions can change in the near future. These ecological forecasts are rooted in scientific understanding of how natural systems behave and react, providing predictions of the future of ecosystems along with information about associated uncertainties.

Ecological forecasts offer tangible, practical insights. For example, they can estimate grass availability and quality for livestock and predict red tides along a coastline. They can support decisionmaking across society, guiding strategies for managing farms, forests, and fisheries, as well as for monitoring invasive or endangered species, assessing water quality, and implementing nature-based climate solutions. These forecasts can also influence everyday choices, such as when to take allergy medication during pollen season, whether to avoid the beach because of harmful algal blooms, and whether to reconsider a move to an area at risk of wildfires.

Demand for ecological forecasts is growing as more decisionmakers and natural resource managers recognize the importance of ecosystem services such as carbon storage, pollination, natural hazard mitigation, cultural benefits, and the provisioning of water, food, and other natural resources. Critically, these forecasts—produced by a community of researchers and practitioners across academia, government agencies, and industry—are increasingly vital today as we face rapid environmental changes and catastrophic biodiversity losses.

Iteratively developing forecasting models improves their predictive capabilities and scientific understanding of the systems they’re modeling. Weather forecasting models, for example, have seen tremendous improvements in accuracy and reliability over the past few decades, largely because meteorologists use them to generate and test hypotheses about atmospheric dynamics multiple times a day across millions of locations.

By comparison, ecological forecasting capabilities remain underdeveloped, partly because it is a much younger field that has received less sustained focus. Ecological forecasts also encompass a greater variety of processes and timescales. For example, some researchers model coupled physical, biogeochemical, and ecological processes across large regions to forecast forest productivity decades into the future, while others must incorporate highly localized weather conditions to predict stream dissolved oxygen levels just a day ahead.

U.S. Geological Survey scientist Jenny Briggs measures the trunk of a tree killed by mountain pine beetles. Such measurements inform ecological forecasting, which can help foresters to predict and respond to future insect outbreaks. Credit: U.S. Geological Survey, Public Domain

These complexities have contributed to the lack of a unified or standardized system for ecological forecasting. As a result, various organizations, such as federal and state agencies, industry groups, and academic institutions, have independently developed their own boutique forecasting systems.

Some diversity in approaches is essential for innovation, especially in an evolving and multidisciplinary field. But the absence of a unified system, shared infrastructure, and scalable practices often creates unnecessary duplication and inefficiencies that can hamper the scientific community’s ability to generate critical ecological predictions reliably. It may also limit our ability to deepen understanding of the environment. In brief, the current state of ecological forecasting often falls short of meeting societal needs.

Plenty of Data, but Barriers to Forecasting Remain

During a series of meetings held from 2020 to 2022 and organized by the Ecological Forecasting Initiative (EFI), representatives from U.S. federal agencies concluded that the primary bottlenecks to providing actionable ecological forecasts do not stem from technical or scientific shortcomings of current ecological models or from data availability. Instead, the challenges lie in generating routine forecasts efficiently and in effectively communicating them to end users.

A primary barrier to efficient ecological forecast generation is the limited interoperability among forecasting systems [Geller et al., 2022]. Different systems use different data and metadata formats, modeling approaches, and workflow structures. Such diversity is not unique to forecasting, but the requirements of operationalizing a model, such as real-time data access, fault-tolerant workflows, and translating output to decision-relevant metrics, amplify the difficulties posed by noninteroperable systems.

The lack of standardization slows—and in many cases prevents—the development of robust, scalable forecasts. It also limits their reuse across platforms, reducing their overall effectiveness. Adopting shared tools and standards across the ecological forecasting community would signal that the field of ecological forecasting is maturing, helping to build trust and encourage adoption by decisionmakers.

A second major barrier to efficiency is redundancy among different ecological forecasting efforts. Many agencies and institutions tackle similar forecasting problems using different tools and workflows, often without coordination. This duplication of effort wastes valuable time, labor, and computational power, and the absence of shared infrastructure and protocols leads teams to re-create processes and datasets instead of building on existing efforts. For example, organizations and research groups often maintain their own in-house workflows for downloading gridded weather forecasts, converting these data to more user-friendly formats, and ingesting them into their forecasting models and tools.
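
A shared, reusable version of that ingestion step might look like the following minimal sketch, assuming a local netCDF file of gridded forecasts (the file name, coordinates, and variable names are hypothetical):

```python
import xarray as xr

# Open a gridded weather forecast file (hypothetical name and contents).
ds = xr.open_dataset("gridded_forecast.nc")

# Subset to the grid cell nearest a study site, then convert to a tidy
# table that downstream ecological forecasting models can ingest directly.
site = ds.sel(lat=42.5, lon=-76.5, method="nearest")
df = site[["air_temperature", "precipitation"]].to_dataframe().reset_index()
df.to_csv("site_drivers.csv", index=False)
```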

Shifting away from boutique approaches to reusable, community-developed workflows could substantially improve interoperability and reduce redundancy in ecological forecasting. Using shared tools, developed and improved by many contributors, can also lower the time, effort, and cost needed to launch new forecasts. Maintaining workflows based on these tools is often more affordable, easier to manage, and less prone to errors than sustaining separate, individually built systems [Fer et al., 2021]. This collaborative approach also fosters innovation as improved tools and techniques are adopted by a community of users, rather than only for specialized individual projects that may not justify the investment to develop the tools.

Inefficiencies and the lack of interoperability in ecological forecasting often arise because many researchers work in isolation, limited by technological and institutional siloing. These silos restrict the exchange of knowledge, data, and tools. Without effective collaboration, the ecological forecasting community may miss valuable opportunities to combine the diverse expertise and resources found in academia, government, and industry.

This disconnection leads to fragmented knowledge bases and isolated advancements, making it difficult to develop cohesive and integrated approaches to ecological forecasting. By working together to improve the technical foundations, or cyberinfrastructure, of ecological forecasting, we could substantially enhance our ability to anticipate changes in ecosystems and support improved decisionmaking.

Learning from Success Stories

Examples of how shared cyberinfrastructure can enhance predictions about ecosystems come from both within and outside the ecological forecasting community. For instance, decades of sustained funding and incremental improvements for weather forecasting infrastructure, led by agencies such as NOAA’s National Weather Service, have enabled scalable, robust systems that transform vast amounts of data into reliable and actionable forecasts. These forecasts support decisionmaking across government, industry, and the public, informing choices related to safety, planning, resource management, and more.

A notable example of shared cyberinfrastructure advancing ecological science is the National Ecological Observatory Network’s (NEON) Ecological Forecasting Challenge [Thomas et al., 2023; Thomas and Boettiger, 2025]. This initiative welcomed forecasting experts and students to use large-scale environmental data from NEON and forecasting models to predict ecological changes at 81 sites across the United States.

Since the challenge launched in 2021, more than 82 million forecasts have been processed by the shared cyberinfrastructure, enabling synthesis of forecast skill across dozens of models and ecosystems. For example, air temperature emerged as a crucial predictor in lake water temperature and dissolved oxygen forecasts [Olsson et al., 2025], and the ability to forecast spring leaf out accurately in deciduous forests varied with how fast green-up occurred (leaf out predictions are harder to make where green-up is faster) [Wheeler et al., 2024].

A migratory barn swallow (Hirundo rustica) rests on a branch in Seedskadee National Wildlife Refuge, in Wyoming. By combining traditional bird banding surveys with radar technology and machine learning, researchers can now forecast bird migrations more accurately (e.g., with BirdCast). These forecasts benefit bird conservation efforts and help enhance public safety during migration seasons. Credit: Tom Koerner/U.S. Fish and Wildlife Service, Public Domain

Numerous other examples demonstrate the value of cyberinfrastructure for ecological forecasting, as well as related services and decisionmaking [e.g., White et al., 2019; Zwart et al., 2023]. However, many of these initiatives have been one-off projects that lack sustainability or broad applicability. To reduce the community’s reliance on specialized cyberinfrastructure and methods and to ensure interoperability across systems, it is crucial that the ecological forecasting community develop and adopt standards and protocols for data management, model inputs and outputs, and workflows [Dietze et al., 2023; Geller et al., 2022]. Establishing these conventions will enhance data consistency and efficient data analysis, facilitate dissemination of forecasted data, and support creation of shared, reusable tools.

Overcoming Obstacles to Build Forecasting Infrastructure

During a 2024 EFI workshop focused on synthesizing best practices for cyberinfrastructure, participants agreed on key design principles that should be adopted, such as common metadata standards, the use of open-source technologies, and modular and scalable architecture. However, they also recognized that establishing infrastructure that adheres to these best practices faces obstacles and institutional challenges, including technical complexity, organizational silos and resource constraints, and a lack of centralized leadership.

The technical skills required to develop ecological forecasts, such as in software development, cloud architecture, and data management, can present a steep learning curve for ecologists. To bridge this skills gap, the ecological forecasting community could adopt mentoring programs in which ecologists collaborate with cyberinfrastructure and open-source technology experts to build skills needed for automated forecast systems. Integrating software development and cloud technologies into higher education curricula would introduce these concepts early in ecological training. And embedding dedicated software engineers within forecasting teams—rather than expecting domain scientists to develop technical expertise alongside their core responsibilities—would distribute the technical workload needed for creating forecast systems.

Institutional culture and siloed structures often incentivize short-term, competitive research focused on novel science, rather than development of stable, iterative, and reusable forecasting approaches. In addition, differing missions and policies among agencies and between agencies, industry, and academic institutions can unintentionally hinder collaboration.

Overcoming these barriers could involve building broad, transdisciplinary communities of practice that bring together ecologists, modelers, information technology professionals, and decisionmakers. Such communities can foster collaboration, align incentives, and promote the adoption of best practices for ecological forecasting. Grassroots efforts like the EFI and more formal structures such as the Interagency Council for Advancing Meteorological Services offer complementary models for this kind of engagement.

By connecting individuals with complementary expertise, these communities can facilitate knowledge exchange, establish shared standards, advocate for cyberinfrastructure investment, and codevelop robust forecasting tools that address real-world ecological challenges. In addition, the success of shared cyberinfrastructure ultimately relies on leaders within agencies, industry, and academia championing these efforts—leaders whom grassroots communities can help identify and support. Such leaders can emerge at any level of an organization, from graduate students to professors and from technicians to directors.

A strong community and clear leadership are especially important now, as the systems supporting ecological forecasting are rapidly transitioning to cloud computing, which offers both opportunities and challenges. Cloud platforms offer unprecedented scalability, enabling high-resolution models, real-time data assimilation, and automated forecast pipelines. Cyberinfrastructure design principles, such as modularity, align well with cloud-based architecture because modular designs allow components to scale independently based on demand, isolate failures to prevent system-wide crashes, and promote reusability across different cloud-based projects.

However, as organizations deepen their reliance on commercial cloud services, they may face higher costs and increased dependence on vendors. To mitigate these risks, institutions could collaborate on shared strategies that balance the benefits of cloud-native tools with the stability and autonomy of maintaining selected on-premises resources, particularly for predictable, long-running workloads that are more cost-efficient to host locally.

The progress seen in weather forecasting demonstrates what becomes possible when scientific communities invest in shared infrastructure, open standards, and sustained collaboration. For example, the average 3-day hurricane track error decreased from about 220 miles (354 kilometers) in 2000 to roughly 70 miles (113 kilometers) today, a testament to the power of improved models, data systems, and coordinated expertise [Ritchie, 2024].

Ecological forecasting could similarly see transformative gains, but success hinges on establishing a unified, community-driven framework of best practices to overcome barriers and develop a robust shared cyberinfrastructure. Ultimately, this collective effort will enhance the reliability and impact of ecological forecasts, empowering decisionmakers to better manage natural resources, anticipate environmental change, and safeguard public well-being.

Acknowledgments

We thank David Watkins for a helpful review of an earlier version of the manuscript. Any use of trade, firm, or product names is for descriptive purposes only and does not imply endorsement by the U.S. government.

References

Dietze, M. C., et al. (2023), A community convention for ecological forecasting: Output files and metadata version 1.0, Ecosphere, 14(11), e4686, https://doi.org/10.1002/ecs2.4686.

Fer, I., et al. (2021), Beyond ecosystem modeling: A roadmap to community cyberinfrastructure for ecological data‐model integration, Global Change Biol., 27(1), 13–26, https://doi.org/10.1111/gcb.15409.

Geller, G. N., et al. (2022), NASA Biological Diversity and Ecological Forecasting: Current state of knowledge and considerations for the next decade, p. 201, NASA, Washington, D.C., cce-signin.gsfc.nasa.gov/files/announcements/announcement_271.pdf.

Olsson, F., et al. (2025), What can we learn from 100,000 freshwater forecasts? A synthesis from the NEON Ecological Forecasting Challenge, Ecol. Appl., 35(1), e70004, https://doi.org/10.1002/eap.70004.

Ritchie, H. (2024), Weather forecasts have become much more accurate; we now need to make them available to everyone, Our World in Data, archive.ourworldindata.org/20251125-173858/weather-forecasts.html.

Thomas, R. Q., and C. Boettiger (2025), Cyberinfrastructure to support ecological forecasting challenges, ESS Open Arch., https://doi.org/10.22541/essoar.175917344.44115142/v1.

Thomas, R. Q., et al. (2023), The NEON Ecological Forecasting Challenge, Front. Ecol. Environ., 21(3), 112–113, https://doi.org/10.1002/fee.2616.

Wheeler, K. I., et al. (2024), Predicting spring phenology in deciduous broadleaf forests: NEON phenology forecasting community challenge, Agric. For. Meteorol., 345, 109810, https://doi.org/10.1016/j.agrformet.2023.109810.

White, E. P., et al. (2019), Developing an automated iterative near‐term forecasting system for an ecological study, Methods Ecol. Evol., 10(3), 332–344, https://doi.org/10.1111/2041-210X.13104.

Zwart, J. A., et al. (2023), Near‐term forecasts of stream temperature using deep learning and data assimilation in support of management decisions, J. Am. Water Resour. Assoc., 59(2), 317–337, https://doi.org/10.1111/1752-1688.13093.

Author Information

Jacob A. Zwart (jzwart@usgs.gov), U.S. Geological Survey, San Francisco, Calif.; Cameron Thompson, Northeastern Regional Association of Coastal Ocean Observing Systems, Portsmouth, N.H.; Hassan Moustahfid, U.S. Integrated Ocean Observing System, NOAA, Silver Spring, Md.; Jessica Burnett, NASA, Washington, D.C.; and Michael Dietze, Boston University, Boston, Mass.

Citation: Zwart, J. A., C. Thompson, H. Moustahfid, J. Burnett, and M. Dietze (2026), How to accelerate advances in ecological forecasting, Eos, 107, https://doi.org/10.1029/2026EO260066. Published on 24 February 2026. Text not subject to copyright.
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

These South Pole Seismometers Will Detect Vibrations 1.5 Miles Under the Ice

Mon, 02/23/2026 - 14:18

Right now, more than 1.5 miles (2.46 kilometers) below the surface at the South Pole, lie two seismometers—the deepest of their kind—built to withstand the extreme pressure, cold, and magnetic interference in one of Earth’s harshest environments.

Deploying the instruments, which will be part of the U.S. Geological Survey’s (USGS) Global Seismographic Network, was a “hail Mary” expedition because of the challenges faced, said Robert Anthony, a geophysicist in the Earthquake Hazards Program at the USGS who led the National Science Foundation (NSF)–funded project.

“That they’re functioning a mile and a half deep in the ice is just incredible,” he added.

Now that the instruments have been successfully deployed, they’ll start collecting high-quality seismic information that scientists can use to measure earthquakes, detect tsunamis, and even monitor nuclear testing.

The new seismometers help “fill an enormous, continent-scale gap in our high-quality coverage of the Earth,” said Rick Aster, a seismologist at Colorado State University who was part of the technical review process for the seismometers. “Having a good distribution of stations around the world is a great thing for seismology and Earth science.”

Engineering Under Pressure

Creating seismometers that can withstand being buried in an ice sheet took years of planning, dozens of experts across many organizations, and cold, difficult work at the bottom of the world.

Each seismometer sits at the bottom of a borehole drilled as part of an NSF partnership with the USGS Albuquerque Seismological Laboratory, University of Wisconsin–Madison, and IceCube Neutrino Observatory, which had already been installing subsurface instruments to detect subatomic particles. The holes were drilled with hot water, meaning each is still filled with water that is slowly expanding as it freezes. This “violent, chaotic process,” said Anthony, is exerting extreme pressure on the seismometers, which must be capable of withstanding up to 8,500 pounds per square inch (58,605 kilopascals)—roughly 580 times the pressure of Earth’s atmosphere at sea level.
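
The unit arithmetic behind that figure is straightforward; a quick check with standard conversion factors:

```python
PSI_TO_KPA = 6.89476   # kilopascals per pound per square inch
ATM_KPA = 101.325      # one standard atmosphere, in kilopascals

design_psi = 8_500
design_kpa = design_psi * PSI_TO_KPA   # ~58,605 kPa
atmospheres = design_kpa / ATM_KPA     # ~578 standard atmospheres
print(f"{design_kpa:,.0f} kPa, about {atmospheres:.0f} atm")
```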

To protect them, each seismometer is held by a pressure vessel, first created for IceCube’s dark matter experiments, that can withstand about 10,000 pounds per square inch (68,948 kilopascals). The seismometers are also protected from magnetic storms, which can be particularly intense at the poles, with a metal covering that redirects the magnetic field around the instruments. 

USGS geophysicist Robert Anthony explains why the South Pole is the perfect place for these two new instruments. Credit: USGS, Public Domain

A scientific instrument company called Nanometrics helped the team determine how to mount the seismometers within the pressure vessels, while IceCube adapted their existing methods to create a system to allow the instruments to receive GPS signals far below the ice sheet’s surface.

The team finally had a fully operational product in July 2025, just 2 months before the shipping deadline to get the equipment to Antarctica. If their engineering solutions had taken just a month longer, the project may not have gone forward, Anthony said. In the 2 months before shipping, the instruments underwent extensive testing at the Albuquerque Seismological Laboratory, Michigan State University, and the University of Wisconsin. 

Anthony said he expects the seismometers, deployed during the Antarctic summer on 30 December and 9 January, to freeze fully into the ice within the next few months. Having them deployed is a “huge relief,” said David Wilson, director of the USGS Global Seismographic Network and a geophysicist involved in the project. “There’s such a high chance of failure, so many things that can go wrong, that it’s amazing that they both were installed and that they’re both functional.” 

Seismological Knowledge

The two seismometers will be able to record the movement of the planet after large earthquakes and pick up fainter signals with greater fidelity than any previously deployed instruments. The South Pole is the only place on Earth where seismometers can make such observations without distortion from Earth’s rotation. 

Also, the depth and location of the instruments mean they’re far from any surface noise, such as human activity, ocean waves, or wind. Even changes to atmospheric pressure, such as when storms roll in, can affect seismic data. The deeper seismometers are placed, the less those changes affect the instruments. Firn—dense snow in the process of compressing to glacial ice—also dampens surface noise.

Aster likens the installation of the instruments to astronomers trying to find the darkest sky to observe. “This is a vibrational sensor looking for the vibrationally quietest part of the world,” he said.

And because both seismometers will be frozen into the ice sheet, they will be extremely still and will remain so for a very long time. With such stable seismometers, “you can record minute ground motions, on the order of almost the size of an atom—very, very tiny ground motions,” Anthony said. 

The data from the seismometers could answer long-held questions about seismic activity in Antarctica, such as how its ice sheet is moving over bedrock. In places, the ice sheet could be sticking and slipping “in a way that we can observe at a new level of fidelity” using the new seismometers, Aster said. The instruments will also capture unique measurements of the seismic activity of icebergs off Antarctica’s coast and volcanoes in West Antarctica, he said.

The installation of these instruments showcases the value of having a U.S. science presence in Antarctica, Aster added. The South Pole station provides “an absolutely unique and world-class capability” for the U.S. scientific enterprise, he said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Citation: van Deelen, G. (2026), These South Pole seismometers will detect vibrations 1.5 miles under the ice, Eos, 107, https://doi.org/10.1029/2026EO260064. Published on 23 February 2026. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The 20 February 2026 garbage landslide at Rodriguez, Rizal in the Philippines

Mon, 02/23/2026 - 07:42

Three people were killed in a major failure at a privately owned garbage dump on Friday. Earlier reports of 50 deaths are now believed to have been erroneous.

On 20 February 2026, the Philippines suffered another major garbage landslide, following the tragic events that occurred at Binaliw in Cebu on 8 January 2026, which killed 35 people. This most recent event occurred at Rodriguez in Rizal.

The location of the 20 February 2026 landslide is reported to be Sitio 1B Harangan, Barangay San Isidro in Rodriguez. I believe that the landfill is at [14.77036, 121.15283], although this is unconfirmed. This is a Google Earth image of the site from April 2025:-

Google Earth image of the likely site of the 20 February 2026 garbage landslide at Rodriguez in the Philippines.

PTV has a news article about this event, which includes mobile phone footage, apparently of the aftermath of the landslide. This is a still from that footage:-

The aftermath of the 20 February 2026 garbage landslide at Rodriguez in the Philippines. Still from a video posted to Facebook by PTV.

One person has been confirmed to have been killed in this landslide, and another two are missing. Early reports of up to 50 people being buried have now been dismissed.

The provincial Governor, Nina Ricci Ynares, has written to the Department of Environment and Natural Resources to request a probe into the event. The landfill was reportedly owned and operated by International Solid Waste Integrated Management Specialist, Inc. (ISWIMS), a private company.

There is a lack of high-quality research on garbage landslides, despite their substantial impacts. However, Zhang et al. (2020) provided an interesting review of 62 examples from 22 different countries. They concluded that the following were the most common causes of garbage landslides:-

  • High landfill leachate level (40% of recorded cases);
  • Inadequate compaction (23%);
  • Insufficient bearing capacity of the foundation (19%);
  • Low shear strength of the interface between the liner and the garbage (11%);
  • Rapid release of landfill gas (6%).

It will be interesting to determine the cause of the garbage landslide at Rodriguez, but I would start with an examination of the compaction of the garbage and the management of water / leachate at the site.

Reference

Zhang, Z., et al. 2020. Global study on slope instability modes based on 62 municipal solid waste landfills. Waste Management & Research: The Journal for a Sustainable Circular Economy, 38(12). https://doi.org/10.1177/0734242X209534.

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Power Plants Will Be Allowed to Release More Than Twice As Much Mercury Into the Air

Fri, 02/20/2026 - 14:57
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

At a 20 February event in Kentucky, the Trump administration announced a final action to loosen pollution restrictions for coal-burning power plants, including limits on emissions of mercury, a hazardous neurotoxin.

The move was originally put forward in June, alongside a proposal to repeal federal limits on power plant carbon emissions.

The new rollback eliminates parts of the Mercury and Air Toxics Standards (MATS) finalized under the Biden administration. The 2024 updates strengthened limits on mercury and other hazardous air pollutant emissions from coal-burning power plants. 

As a result of the repeal, coal-burning power plants will be allowed to emit more than twice as much mercury as they currently do. Specifically, they will no longer need to adhere to the limit of 1.2 pounds of mercury per trillion British thermal units of heat input (lb/TBtu) and instead must comply with the previous mercury release limit (set during the Obama administration in 2012) of 4.0 lb/TBtu.

The repeal also relaxes limits on emissions of arsenic, cadmium, chromium, lead, and nickel from coal-burning power plants.

The announced rollback shows that the “EPA is letting the dirtiest, least efficient coal plants in the country off the hook,” Joseph Goffman, who worked as an administrator in the EPA’s Office of Air and Radiation during the Biden administration, told The New York Times.

In the final rule, the Trump EPA argued that the move will reduce “unwarranted compliance costs” for utilities operating coal-burning power plants. The agency estimated the change would save companies up to $670 million between 2028 and 2037 but did not explain how it arrived at that estimate.

“The Trump E.P.A. is committed to fulfilling President Trump’s promise to unleash American energy, lowering costs for families, ensuring clean air for ALL Americans and fulfilling the agency’s core mission of protecting human health and the environment,” wrote Brigit Hirsch, an EPA spokesperson, in an email to The New York Times.


High levels of mercury exposure cause human health harms, including impairment of the nervous system, brain damage, and developmental delays in children. Coal plants are responsible for nearly half of the United States’ mercury emissions, according to the EPA. The Biden administration’s EPA had predicted that its amendments to MATS would create health benefits worth $300 million over 10 years.

The repeal adds to a list of actions by the current EPA deregulating the coal industry.

The EPA’s action “will contribute to thousands of additional deaths, asthma attacks, and learning disabilities,” Matthew Davis, a former EPA scientist and policy expert at the League of Conservation Voters, said in a statement. “Weakening critical clean air safeguards will harm public health.”

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Correction, 20 February 2026: This article was updated to reflect information in the EPA’s final repeal.

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Why More Rain Doesn’t Mean More Erosion in Mountains

Fri, 02/20/2026 - 14:55
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Journal of Geophysical Research: Earth Surface

Climate change reshapes landscapes by altering rainfall, the primary driver of erosion in coupled mountain–basin systems. Yet more rainfall does not necessarily translate into more erosion. Using a two-dimensional numerical model that integrates hillslope processes, river incision, and sedimentation, Luo et al. [2025] reveal a previously underappreciated phenomenon: erosion saturation. When the duration of climate variability exceeds the intrinsic response time of the landscape, the system reaches a state in which additional rainfall fails to amplify erosion. Instead, sedimentation increasingly regulates the system, dampening sediment flux despite continued climatic forcing.

By explicitly comparing the period of climate forcing (P) with the landscape response time (τ), the study introduces a simple and transferable framework for understanding how climatic signals are filtered before being archived in sedimentary records. This mechanism helps explain why some long-period climate oscillations, including those linked to Milankovitch cycles, may leave muted or phase-shifted signatures in downstream deposits. Importantly, erosion saturation is not limited to strictly periodic forcing and may also emerge under prolonged or stepwise climate changes.
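
In schematic terms (a paraphrase of the framework; only P and τ come from the study, the rest is illustrative notation), the controlling quantity is the dimensionless ratio

\[ \Pi = P / \tau , \]

and erosion saturation emerges when \( \Pi \gtrsim 1 \), that is, when the climate forcing persists for longer than the landscape needs to adjust to it.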

These findings bridge a longstanding gap in source–sink research by emphasizing that mountains and basins function as a dynamically coupled system rather than independent sediment producers and receivers. The work also highlights the need to incorporate additional controls—such as spatially variable uplift and vegetation dynamics—into future models of landscape evolution under climate change.

Citation: Luo, T., Yuan, X., Guerit, L., & Shen, X. (2025). Erosion saturation of mountain-basin system in response to rainfall variation. Journal of Geophysical Research: Earth Surface, 130, e2025JF008649. https://doi.org/10.1029/2025JF008649

—Dongfeng Li, Associate Editor, JGR: Earth Surface

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

New Method Could Improve U.S. Forecasting of West Nile Virus

Fri, 02/20/2026 - 13:57
Source: GeoHealth

West Nile virus is the most common mosquito-borne illness in the continental United States and can in rare cases lead to a much more serious disease with an approximately 10% fatality rate. West Nile virus neuroinvasive disease (WNND) has resulted in around 3,000 deaths since its introduction to the country in 1999, but to date no national forecast for the disease exists.

Harp et al. developed a climate-informed, regionally determined forecast method for WNND cases across the United States that outperforms current benchmarks. Key to their success was aggregating historically low county-level caseloads to the regional level, the authors say. Their work highlights key climatic factors and how their regional variation affects WNND rates.

Mosquitoes are the vectors for West Nile virus, and passerine birds (a group that includes more than half of all bird species) serve as its main reservoir hosts, meaning caseloads are contingent on the environmental factors affecting these species. The authors picked the most relevant climatic factors as model inputs for each region. They found that drought and temperature are most strongly linked to WNND cases overall, and precipitation is linked in some regions. The central United States saw the most consistent correlation between drought and WNND cases, whereas the northern parts of the country saw the strongest link between WNND and warmer winter and spring temperatures.
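
To make the aggregate-then-regress idea concrete, here is a toy sketch (the column names, region mapping, and values are hypothetical, and this is a generic Poisson regression, not the study’s model):

```python
import pandas as pd
from sklearn.linear_model import PoissonRegressor

# Toy data: sparse county-level case counts with climate covariates.
counties = pd.DataFrame({
    "region":           ["central", "central", "north", "north"] * 2,
    "year":             [2020] * 4 + [2021] * 4,
    "wnnd_cases":       [0, 3, 1, 0, 2, 5, 0, 1],
    "drought_index":    [-1.2, -0.8, 0.3, 0.5, -1.5, -1.1, 0.2, 0.4],
    "winter_temp_anom": [0.4, 0.6, 1.1, 0.9, 0.2, 0.3, 1.4, 1.2],
})

# Aggregating sparse county counts to regions stabilizes the target
# before fitting climate predictors, as the study describes.
regional = counties.groupby(["region", "year"], as_index=False).agg(
    wnnd_cases=("wnnd_cases", "sum"),
    drought_index=("drought_index", "mean"),
    winter_temp_anom=("winter_temp_anom", "mean"),
)

features = regional[["drought_index", "winter_temp_anom"]]
model = PoissonRegressor().fit(features, regional["wnnd_cases"])
print(model.predict(features))   # regional expected caseloads
```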

The authors compared their climate-driven model with previous benchmark models, including a simple historical caseload model and an ensemble model from a 2022 competition. They found their model consistently outperformed others across regions. Nationally, a version of their model that included both primary and secondary climate factors (such as temperature and soil moisture) offered a prediction improvement of 21.8% over the historical model.

While the advancement represents a building block toward operational West Nile virus forecasts, the authors recommend that future work focus on enhancing county-level forecasting, which would provide authorities with more actionable information to prepare for fluctuations in WNND caseloads. Future WNND forecast models may also need to overcome the issue of climate data latency to offer real-time predictions, the authors say. One option could be to incorporate weather and climate forecasts into modeling, allowing disease forecasts to look further ahead. (GeoHealth, https://doi.org/10.1029/2025GH001657, 2026)

—Nathaniel Scharping (@nathanielscharp), Science Writer

Citation: Scharping, N. (2026), New method could improve U.S. forecasting of West Nile virus, Eos, 107, https://doi.org/10.1029/2026EO260065. Published on 20 February 2026. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

This Potential Exoplanet Is Earth Sized but May Be Colder Than Mars

Thu, 02/19/2026 - 13:45

One way scientists search for Earth-like planets is the transit method, which involves observing a slight dimming in starlight when a planet passes in front of its star. Transits cause very small decreases in flux: A planet the size of Jupiter might block 1% of the light from a Sun-sized star, and an Earth-sized planet might block only 0.01%.
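
Those percentages follow from the transit depth relation, depth ≈ (Rp/Rs)²; a quick check with standard solar system radii (values assumed here, not taken from the study):

```python
# Transit depth scales as the squared radius ratio: depth = (Rp / Rstar) ** 2.
R_SUN, R_JUPITER, R_EARTH = 6.957e8, 7.149e7, 6.371e6   # radii in meters

jupiter_depth = (R_JUPITER / R_SUN) ** 2   # ~0.011, i.e., about 1%
earth_depth = (R_EARTH / R_SUN) ** 2       # ~8.4e-5, i.e., about 0.01% (84 ppm)
print(f"Jupiter: {jupiter_depth:.2%}, Earth: {earth_depth:.3%}")
```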

A study recently published in The Astrophysical Journal Letters suggests that an intriguing signal from the star HD 137010 comes from the transit of a planet about the size of Earth with a similar orbit. Astronomers detected the faint signal using data from NASA’s K2 mission.

HD 137010 is dimmer than the Sun, and the new planet candidate, HD 137010 b, likely lies near the outer edge of the star’s habitable zone. As a result of these factors, HD 137010 b receives far less energy from its star than the Earth receives from the Sun.

Detecting Single-Transit Events

HD 137010 b is the smallest potential planet to be detected from a single transit around a Sun-like star.

“Detecting single transit events is computationally difficult, so it’s sometimes actually easier for a human to pick out these events from the data—as was the case here,” Alexander Venner, an astrophysicist at the Max Planck Institute for Astronomy and lead author of the study, wrote in an email to Eos.

Data came from the K2 mission, the successor to NASA’s Kepler mission, the agency’s primary effort to find Earth-like planets orbiting Sun-like stars. After the Kepler spacecraft lost some of its pointing ability, the K2 mission reused Kepler’s telescope to study brighter stars with high precision. Though each of K2’s observation campaigns lasted only about 80 days, too short to catch repeated transits of planets with longer orbital periods, the mission still managed to discover planets from single-transit events.

The team noticed a 10-hour transit across the bright star HD 137010 in 2017. The telescope’s photometry was precise enough to register the event even though the star’s light dimmed only slightly, by 225 parts per million. Venner said some planetary scientists compare the effect to a moth passing in front of a lighthouse.

Still, Venner said the transit signal was significant enough that “I knew there was something to it as soon as I saw it.”

Even though Venner and the team were confident that the signal was significant, they still had to make sure the signal wasn’t a false alarm caused by background stars or quirks in the data.

To rule this out, the team carefully checked for any stars close to HD 137010. Radial velocity data, Hipparcos and Gaia astrometry, archival images, and high-resolution imaging showed no signs of additional stars falling within the K2 photometric aperture. Because only one transit was seen, astronomers can’t yet be certain it was caused by a planet, but the candidate was designated HD 137010 b.

Planetary Properties and Habitability

The new analysis suggests the radius of HD 137010 b is about the same as Earth’s, and its orbital period is about 365 days. Using the planet’s orbit and the star’s brightness, the team estimated that HD 137010 b receives only about 0.3 times as much starlight as Earth receives from the Sun.

HD 137010 b is one of the coldest Earth-sized planets seen crossing a Sun-like star. Its surface may be as cold as −68°C (−90°F), even colder than Mars, which averages about −65°C (−85°F).
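
That figure is consistent with a simple zero-albedo equilibrium-temperature estimate (an illustrative back-of-the-envelope calculation, not the study’s method); the actual surface temperature depends on the unconstrained albedo and atmosphere:

```python
# Zero-albedo equilibrium temperature: T = (S * (1 - A) / (4 * sigma)) ** 0.25.
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S_EARTH = 1361.0    # solar constant at Earth, W m^-2

S = 0.3 * S_EARTH   # insolation quoted above for HD 137010 b
albedo = 0.0        # assumed; the true albedo is unknown
T_eq = (S * (1 - albedo) / (4 * SIGMA)) ** 0.25
print(f"T_eq ~ {T_eq:.0f} K ({T_eq - 273.15:.0f} C)")   # ~206 K, about -67 C
```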

“Whether its surface is at all ‘Earth-like’ depends on the properties of its atmosphere, which we just can’t constrain from the current data,” Venner said. “A thick warming atmosphere might allow for a warm wet surface, but a thin atmosphere might result in a completely frozen surface colder than Mars.”

Future Prospects

“This is, indeed, an exciting result. It represents a milestone in the search for worlds that might one day be considered truly Earth-like,” Jon Jenkins, who served as the coinvestigator for data analysis on the original K2 mission but was not part of the research, wrote in an email to Eos.

“It will be extremely interesting if future observations give us information on the atmosphere or surface properties of HD 137010 b,” Venner said. “These scenarios could be distinguished if we’re able to observe the spectrum of HD 137010 b.”

—Pranjal Malewar (@PranjalMalewar), Science Writer

Citation: Malewar, P. (2026), This potential exoplanet is Earth sized but may be colder than Mars, Eos, 107, https://doi.org/10.1029/2026EO260062. Published on 19 February 2026. Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Landslides on Mauao in New Zealand following the 22 January 2026 rainfall event

Thu, 02/19/2026 - 08:14

To date, 42 landslides have been identified on Mauao (Mount Maunganui) in New Zealand following the 22 January 2026 rainfall event.

The extreme rainfall event that affected parts of the North Island of New Zealand triggered two fatal landslides, the more severe being the major failure at the Mount Maunganui Beachside Holiday Park on the flanks of Mauao. Six people were killed in this failure, an unusually high toll for a landslide in New Zealand.

As the cleanup continues, work is underway to understand the scale of the problem on Mauao (Mount Maunganui), the 232 m high lava dome that sits on the edge of the Bay of Plenty. Tauranga City Council has a webpage providing updates on its ongoing work at Mauao, which includes an update published today. This highlights that 42 landslides have been identified on the walking tracks of Mauao, twelve of which are considered to be “severe”, meaning the impacts “generally involve high complexity, higher cost, longer timeframes, and often require staged or multi-disciplinary interventions.”

The Council has released this image showing some of the impacts:-

Landslides on Mauao following the 22 January 2026 rainfall event. Image from Tauranga City Council.

This Planet Labs image, captured with their standard PlanetScope instrument on 15 January 2026, shows Mauao before the landslides:-

Satellite image of Mauao before the 22 January 2026 rainfall event. Image copyright Planet Labs, used with permission, captured on 15 January 2026.

And here is an image from five days after the 22 January 2026 event:-

Satellite image of Mauao after the 22 January 2026 rainfall event. Image copyright Planet Labs, used with permission, captured on 27 January 2026.

And here is a slider to allow the two images to be compared:-

Images by Planet Labs:- https://www.planet.com/

The fatal landslide occurred on the eastern side of Mauao just below the 3 o’clock position – this is clearly visible. But other landslides can be seen on the eastern side at the end of the beach and further to the north, and on the southwestern side too. In some cases, the impact of the landslides on the walking tracks is clear.

Resolving these landslides will be time consuming and expensive, yet another burden on a large country with a comparatively small population. Tom Robinson of the University of Canterbury has written a very nice article about the impact of landslides on New Zealand, noting that they have claimed 1,800 lives over the last two centuries, twice the number killed by volcanoes and earthquakes combined. As extreme rainfall events increase in frequency and severity, the challenges for New Zealand are intensifying.

Acknowledgement

Many thanks to the wonderful people at Planet Labs for providing access to the satellite imagery.

Return to The Landslide Blog homepage Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Inclusion, Diversity, Equity, and Accessibility: Excellent IDEA! 

Wed, 02/18/2026 - 16:07
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Inclusion, diversity, equity, and accessibility (IDEA) are recognized as central ethical commitments that strengthen science and expand its impact. However, their contribution to continued innovation, and the actual barriers and enablers at play, remain under-documented.

A new study from Naji and Reyes et al. [2026] addresses this gap. The authors conducted semi-structured interviews with underrepresented and underserved Earth observation professionals about the challenges and support they encountered during their careers. Through these conversations, they identify barriers and enablers and discuss solutions. Quotes from the interviews vividly convey both the discouragement caused by the barriers and the enthusiasm and scientific benefit stimulated by successful enablers. The article provides an illuminating perspective on the real value of IDEA for the benefit of science and humanity.

Citation: Naji, N., Reyes, S. R., Crowley, M. A., Schenkein, S. F., González, M., Siwe, R., et al. (2026). Global perspectives on barriers and enablers to inclusion, diversity, equity, and accessibility (IDEA) in the field of Earth observation. AGU Advances, 7, e2025AV001858. https://doi.org/10.1029/2025AV001858

—Alberto Montanari, Editor-in-Chief, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The Olympics Just Saw Its First “Forever Chemical” Disqualifications

Wed, 02/18/2026 - 13:57

This story was originally published by Grist. Sign up for Grist’s weekly newsletter here.

Heading into the Milan-Cortina 2026 Olympics, skiers and snowboarders were already adjusting to a ban on fluorinated waxes long prized for making their equipment faster. Last week, the Winter Games saw their first enforcement of that rule, which is aimed at protecting public health and the environment.

South Korean cross-country skiers Han Dasom and Lee Eui-jin were disqualified from the women’s sprint event on 10 February. That came one day after Japanese snowboarder Shiba Masaki was disqualified from the men’s parallel giant slalom. In all three cases, routine testing found banned compounds on their equipment.

For decades, elite snow sports athletes have relied on waxes with fluorocarbons that are exceptional at repelling water and dirt. Former U.S. cross-country racer Nathan Schultz told Grist the so-called “fluoro” waxes provide a “really ridiculous speed advantage,” especially in warmer conditions like those experienced at these Games.

But these waxes also contained PFAS, short for per- and polyfluoroalkyl substances. This class of roughly 15,000 so-called “forever chemicals” is notorious for never breaking down. Studies have linked exposure to PFAS to thyroid disease, developmental problems, and cancer, and research has found elevated levels in ski technicians who regularly handled the waxes. PFAS have also been detected in soil and water near ski venues, including wells drawing from aquifers in Park City, Utah, suggesting broader environmental contamination.

Amid growing concern over the environmental impacts and the risks to skiers, their technicians, and others, the International Ski and Snowboard Federation, or FIS, called for a ban in 2019. The prohibition took effect in 2023 and applies to all events governed by the federation, including Nordic, alpine, and freestyle skiing, ski jumping, and snowboarding.

Officials test multiple points on each competitor’s equipment, using a technique known as Fourier transform infrared spectroscopy to detect fluoros. If a given spot on a ski or snowboard turns green, it passes. A red result indicates the presence of the banned substance. Three or more red spots lead to disqualification.
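The pass/fail logic described above reduces to a simple count. A minimal sketch; the function name and data structure are illustrative, not FIS software:

```python
def is_disqualified(spot_results: list[str]) -> bool:
    """Green means a test spot passes; red means fluoros were detected.
    Per the rule above, three or more red spots mean disqualification."""
    return spot_results.count("red") >= 3

print(is_disqualified(["green", "red", "green", "red"]))         # False: only two reds
print(is_disqualified(["red", "red", "green", "red", "green"]))  # True: disqualified
```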

Representatives for the Japan team did not respond to requests for comment. A spokesperson for the Korea Ski Association initially told the South Korean news agency Newsis that the organization was “perplexed” by the results. “They tested negative in all previous international competitions with no prior issues,” they said. “We will consult experts from wax and ski manufacturers to investigate whether the issue lies with the wax or skis.”

In an emailed statement, the Korean Olympic Committee told Grist that fluorine was detected in what it believed to be fluorine-free waxes. “The Ski Association has purchased [fluorine]-free wax products, so it will protest,” wrote the spokesperson. The team will also replace the wax and check the skis again after cleaning to “prevent recurrence.”

It is unclear if a protest was ever officially filed or what the outcome was. The Korean team declined to elaborate and FIS did not immediately respond to Grist’s questions. But unlike some infractions, like those related to doping, discipline for unintentional fluoro use generally applies only to the event in question. The Korean athletes competed again Thursday in the 10-km freestyle event, finishing 73rd and 80th.

This time the results stood.

Correction 24 February 2026: An earlier version of this story accidentally referred to fluoride instead of fluorine in one paragraph.

—Tik Root, Grist

This article originally appeared in Grist at https://grist.org/accountability/the-olympics-just-saw-its-first-forever-chemical-disqualifications/.

Grist is a nonprofit, independent media organization dedicated to telling stories of climate solutions and a just future. Learn more at Grist.org.

Liquefaction induced by the 29 March 2025 Mw=7.7 Mandalay earthquake

Wed, 02/18/2026 - 08:26

The Landslide Blog is written by Dave Petley, who is widely recognized as a world leader in the study and management of landslides.

Of all the ground impacts induced by large earthquakes, liquefaction often feels like the most neglected. The costs can be savage, and the long-term implications wide-ranging.

In this context, a very interesting paper (Valkaniotis et al. 2026) has been published in the journal Engineering Geology, documenting the liquefaction induced by the 29 March 2025 Mw=7.7 Mandalay earthquake in Myanmar. Given the challenges of fieldwork in this highly contested area, the work has been conducted using medium-resolution remote sensing.

It is an excellent study that demonstrates that liquefaction was extremely wide-ranging. The authors have documented 18,000 locations in which liquefaction occurred, with the distribution controlled both by proximity to the rupture (not the epicentre) and by the geology. The presence of thick deposits of Holocene fluvial materials, which occur widely in this area, allowed extensive liquefaction to occur.

One aspect that I found particularly interesting, and highly informative, is the comparison of the utility of satellite images with different resolutions for mapping liquefaction features. In particular, they show that 10-metre-resolution Sentinel-2 images are useful for mapping liquefaction. So, I thought I’d take a look at the utility of Planet Labs imagery in this context.

One example that Valkaniotis et al. (2026) provide lies at [22.311, 96.012]. The Planet Labs image below shows this area as of 16 March 2025, a few days before the Mandalay earthquake:-

Satellite image of an area of Myanmar prior to the 2025 Mandalay earthquake. Image copyright Planet Labs, used with permission, collected on 16 March 2025.

And this is the same area on 31 March 2025, two days after the earthquake:-

Satellite image of an area of Myanmar after the 2025 Mandalay earthquake. Image copyright Planet Labs, used with permission, collected on 31 March 2025.

And here is a slider to compare the two images:-

Images by Planet Labs.

In the second image, there are hundreds of areas of exposed fluvial deposits (the light coloured patches) that are not present in the first image. These are the areas of liquefaction mapped by Valkaniotis et al. (2026). I think there may also be some locations in which lateral spreads are visible too, but this is less clear.
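To illustrate the kind of change detection this implies (a sketch of the general technique, not the method of Valkaniotis et al. 2026): newly exposed bright deposits can be flagged by differencing co-registered pre- and post-event images. File names and the threshold below are placeholders.

```python
import rasterio

# Placeholder file names; a real workflow also needs co-registration,
# radiometric normalisation, and cloud masking.
with rasterio.open("pre_event.tif") as pre, rasterio.open("post_event.tif") as post:
    before = pre.read(1).astype("float32")  # one visible band, pre-earthquake
    after = post.read(1).astype("float32")  # the same band, post-earthquake

# Liquefaction ejecta shows up as new light-coloured patches, so flag
# pixels that brightened markedly; 15% is an invented starting threshold.
candidate_mask = (after - before) > 0.15 * before

print(f"Flagged {int(candidate_mask.sum())} pixels as possible liquefaction ejecta")
```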

This is a fascinating finding, which will be very helpful in assessing post-seismic impacts in the future.

The extent of the liquefaction after the 2025 Mandalay earthquake is very interesting. At the end of the day, studies like this provide insight into the response of the ground to large earthquakes, which in turn should allow us to build resilience to these events. Valkaniotis et al. (2026) conclude their article as follows:-

“The 2025 Mandalay event serves as a reminder that liquefaction remains one of the most devastating secondary hazards associated with strong earthquakes, especially in densely populated floodplains with complex dynamic fluvial histories. The insights gained from this inventory can not only enhance national seismic resilience efforts in Myanmar but also contribute to the better understanding of liquefaction behavior in large strike-slip earthquakes worldwide.”

Quite.

Reference and acknowledgement

Valkaniotis, S. et al. 2026. Regional-scale inventory and initial analysis of liquefaction triggered by the 2025 Mw 7.7 Mandalay earthquake, Myanmar. Engineering Geology, 363. https://doi.org/10.1016/j.enggeo.2026.108543.

Many thanks to the wonderful people at Planet Labs for providing access to the satellite imagery.

Return to The Landslide Blog homepage Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Models Reveal Imprint of Tectonics and Climate on Alluvial Terraces

Tue, 02/17/2026 - 17:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

River terraces are archives of past environmental and climate change: they form when rivers erode into alluvial plains, leaving behind an elevated flat surface. A sequence of terraces can take tens to hundreds of thousands of years to develop, so terraces potentially hold important information spanning their whole period of formation. This is the case for the extensive terraces in southern Patagonia.

Through mechanistic models of terrace formation, Ruby et al. [2026] both isolate and combine the key drivers of terrace formation and connect them with the observed terrace shapes. Some terrace shapes were shown to form only under a specific combination of model parameters. This opens a new quantitative way to reveal past tectonic, climatic, and environmental conditions and how these have changed using terraces.  

Citation: Ruby, A., McNab, F., Schildgen, T. F., Wickert, A. D., & Fernandes, V. M. (2026). How sediment supply, sea-level, and glacial isostatic oscillations drive alluvial river long-profile evolution and terrace formation. AGU Advances, 7, e2025AV002035. https://doi.org/10.1029/2025AV002035

—M. Bayani Cardenas, Editor, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Restored Peatlands Could Become Carbon Sinks Within Decades

Tue, 02/17/2026 - 14:04

Drained peatlands in Finland can become carbon sinks within just 15 years of restoration, suggests a study published in Restoration Ecology. The findings stand in stark contrast to another recent publication suggesting that the switch from source to sink can take hundreds of years.

Finland will submit a biodiversity restoration plan to the European Commission this September, and what to do about the country’s 5 million hectares of drained peatland will likely be a hot topic. Teemu Tahvanainen, the author of the new study and a plant ecologist at the University of Eastern Finland (Itä-Suomen Yliopisto), said the upcoming deadline motivated him to add to the conversation.

Moreover, if the country is to one day achieve carbon neutrality, it “cannot neglect those areas,” said peatland ecologist Anke Günther from Universität Rostock, in Germany, who was not involved in the new paper.

Like a Forest with No Air

To understand why pristine peatlands are powerful carbon sinks, imagine a forest without any air between the trees, said Günther. That’s how densely the mosses that make up peat are packed together. In some places, peatlands can cover millions of hectares and be meters deep. All told, they contain massive amounts of plant matter and therefore massive amounts of carbon, about a third of the carbon stored in the world’s soils.

Peatlands are waterlogged, which largely prevents the peat from decomposing, but also limits how well trees and other plants can grow. Forestry and agricultural companies, governments, and private landowners often dig trenches to drain off some of the water, making the land available for other uses. But draining peat exposes it to oxygen, which then allows microbes to break it down, releasing carbon dioxide.

Rewetting stops these carbon emissions, but it can also cause others, explained soil scientist Jens Leifeld from the Swiss federal research institute Agroscope, who was not involved in the new study. For example, any trees growing in a drained peatland will die upon rewetting, and their deaths will release carbon dioxide if the trees aren’t harvested. Moreover, rewetting shifts the peatland’s microbial population from aerobic to anaerobic microbes, increasing methane emissions. Studies have produced conflicting answers when asking how restoring peatlands affects carbon emissions. “There was no agreed opinion,” Leifeld said.

Increasing the Resolution

Tahvanainen modeled peatland restoration with greater temporal resolution than previous studies did. Rather than assume that parameters such as methane emissions and decomposition of forest litter will remain the same after rewetting, he predicted how these parameters will vary in the years and decades that follow.
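To make that idea concrete, here is a toy sketch: let the emission factors evolve after rewetting instead of holding them fixed, then track the cumulative balance. Every number and trajectory below is invented for illustration and is not taken from Tahvanainen’s model.

```python
import numpy as np

years = np.arange(50)

# Invented trajectories, in tonnes of CO2-equivalent per hectare per year:
# a methane pulse that decays as vegetation re-establishes, and CO2 uptake
# (negative values) that strengthens as the peat re-saturates.
ch4 = 10.0 * np.exp(-years / 8.0)
co2 = -1.0 - 5.0 * (1.0 - np.exp(-years / 10.0))

cumulative = np.cumsum(ch4 + co2)

# The first year the cumulative balance turns negative is when this
# hypothetical rewetted peatland becomes a net carbon sink.
crossed = cumulative < 0
if crossed.any():
    print(f"Net sink (cumulative) from year {int(np.argmax(crossed))}")
else:
    print("No net sink within 50 years")
```

Under these made-up numbers the crossover lands at roughly two decades, the flavor of result the study reports; changing the trajectories shifts that date, which is exactly why Tahvanainen hedges his conclusion.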

His take-home message: Restoration can cool the climate in as little as a couple of decades. “I’m saying that it can, which sounds a little bit ambiguous on purpose,” he added. There are many variables his approach can’t account for, he said, such as how climate change will progress and the state of a peatland prior to restoration.

“The results make sense to me in a way that other studies didn’t always,” said Günther. It seemed implausible to her that the carbon sequestered through a bit of tree growth would compensate for the vast amount of carbon released from draining a peatland.

But rewetting also has consequences the model doesn’t consider, Leifeld pointed out. For example, rewetting changes the color of the landscape in the winter, taking it from the dark color of a forest to the white color of open snow. Snow reflects more sunlight than trees, which cools Earth.

Only field studies can truly answer the question of how rewetting peatlands will affect their greenhouse gas emissions, said forest ecologist Paavo Ojanen from Natural Resources Institute Finland. These studies are ongoing, but they require following peatlands for years. Until they’re complete, “we don’t have the real measurements,” he said.

For now, Tahvanainen said his work adds nuance to studies reporting that peatland restoration won’t bring climate mitigation in the next hundred years. That’s “just way too strongly put,” he said.

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2026), Restored peatlands could become carbon sinks within decades, Eos, 107, https://doi.org/10.1029/2026EO260060. Published on 17 February 2026. Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The massive, developing gully at Pondok Balik in Indonesia

Tue, 02/17/2026 - 08:15

A massive gully has been developing over the last two decades at Pondok Balik. It now covers an area of over 3 hectares.

In Indonesia, a massive and rapidly developing gully is causing considerable concern. Located at Pondok Balik in Central Aceh Regency, Aceh province, this feature has been developing since 2004. Reuters has an excellent gallery of images that is worth a look. There is a really good summary of the history of this gully on The Watchers website too.

There is some nice drone footage of this feature in this SindoNews report on YouTube:-

The location of this very large gully is [4.72374, 96.73117]. This is a Google Earth image of it, captured in June 2025:-

Google Earth image from June 2025 of the massive gully at Pondok Balik in Indonesia.

By comparison, here is an image from February 2015:-

Google Earth image from February 2015 of the massive gully at Pondok Balik in Indonesia.

And here is a slider to compare the two, showing the rapid development of the gully:-

Google Earth images

The gully is reportedly developing in loose volcanic materials, which are prone to rapid erosion when disturbed and saturated. In Indonesia, rainfall totals are high.

There are concerns about potential damage to the road seen in the image and to the high-voltage electricity pylons running through the area. The proposal is to manage the hazard by reinforcing the soil and by controlling surface and subsurface water. This will not be straightforward or cheap.

Return to The Landslide Blog homepage Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Rocky Shore Erosion Shaped by Multi-Scale Tectonics

Mon, 02/16/2026 - 14:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Coastal landscapes evolve under the combined influence of wave action, climatic variations, sea‑level change, and tectonic processes. Shoreline evolution is especially important along rocky coasts such as those of the western United States, where it shapes hazards to people and infrastructure and affects exposure to events like tsunamis. In this context, tectonically driven uplift plays a key role over both individual earthquake cycles and longer timescales associated with fault-system and topographic development.

Using a compilation of coastal change metrics and statistical analyses, Lopez and Masteller [2026] identify a tentative link between tectonics and shoreline change. On decadal timescales, uplift can slow coastline retreat, as might be expected. Over many earthquake cycles, however, higher long-term uplift associated with cumulative subduction-zone deformation appears to enhance shoreline retreat. These findings highlight some of the interactions between coastal and solid earth hazards. They also point toward future models that integrate similar constraints to improve our understanding of how earthquakes build topography and how sea level, coastal processes, and tectonics together modulate short‑ and long‑term coastal risk.

Citation: Lopez, C. G., & Masteller, C. C. (2026). Tectonics as a regulator of shoreline retreat and rocky coast evolution across timescales. AGU Advances, 7, e2025AV002065. https://doi.org/10.1029/2025AV002065

—Thorsten Becker, Editor, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
