Eos

Science News by AGU

Central China Water Towers Provide Stable Water Resources Under Change

Fri, 01/09/2026 - 15:24
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

The mountains ringing the Pacific Rim—stretching from the Andes to the Rockies, the Himalayas, and beyond—act as natural “water towers”: they hold vast reserves of water in snowpack, glaciers, lakes, and soils that feed rivers and supply freshwater to billions of people downstream.

Yue et al. [2026] assess how climate change affects freshwater supply from water towers using a new dendrochronological network of 100 tree-ring sampling sites. They first reconstruct Central China Water Tower (CCWT) runoff back to 1595. Then, drawing on climate model projections, the authors reveal increasing runoff across most Pacific Rim water towers, whereas water resources from the Northern Rocky Mountains are projected to decline substantially. These differences are attributed to distinct geographies and synoptic climatic conditions. The findings provide insights for adaptive management strategies in China.

Citation: Yue, W., Torbenson, M. C. A., Chen, F., Reinig, F., Esper, J., Martinez del Castillo, E., et al. (2026). Runoff reconstructions and future projections indicate highly variable water supply from Pacific Rim water towers. AGU Advances, 7, e2025AV002053.  https://doi.org/10.1029/2025AV002053

—Alberto Montanari, Editor-in-Chief, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0

In 2025, the Ocean Stored a Record-Breaking Amount of Heat, Again

Fri, 01/09/2026 - 14:23
Research & Developments is a blog for brief updates that provide context for the flurry of news that impacts science and scientists today.

The ocean soaked up more heat last year than in any year since modern measurements began around 1960, according to a new analysis published in Advances in Atmospheric Science.

The world’s oceans absorb more than 90% of excess heat trapped in Earth’s atmosphere by greenhouse gas emissions. As heat in the atmosphere accumulates, heat stored in the ocean increases, too, making ocean heat a reliable indicator of long-term climate change. 

Ocean temperatures influence the frequency and intensity of marine heatwaves, change atmospheric circulation, and govern global precipitation patterns. 

Scientists measure the ocean’s heat in different ways. One common metric is global annual mean sea surface temperature, the average temperature in the top few meters of ocean waters. Global sea surface temperature in 2025 was the third warmest ever recorded, at about 0.5°C (0.9°F) above the 1981–2010 average.


Another metric is ocean heat content, which measures the total heat energy stored in the world’s oceans. It’s measured in zettajoules: One zettajoule is 10²¹ joules, a 1 followed by 21 zeros. To measure heat content in 2025, the study’s authors assessed observational data from the upper 2,000 meters of the ocean, where most of the heat is absorbed, using records from NOAA’s National Centers for Environmental Information, the European Union’s Copernicus Climate Change Service, and the Chinese Academy of Sciences.

They found that in total, the ocean absorbed an additional 23 zettajoules of heat energy in 2025, breaking the ocean heat content record for the ninth consecutive year and marking the longest sequence of consecutive ocean heat content records ever recorded.

“Last year was a bonkers, crazy warming year,” John Abraham, a mechanical engineer at the University of St. Thomas and a coauthor of the new study, told Wired.

Twenty-three zettajoules in one year is equivalent to the energy of 12 Hiroshima bombs exploding in the ocean every second. It’s also a large increase over the 16 zettajoules of heat the ocean absorbed in 2024. The hottest areas of the ocean observed in 2025 were the tropical and South Atlantic, Mediterranean Sea, North Indian Ocean, and Southern Ocean. 
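Readers who want to sanity-check that comparison can do so with a few lines of arithmetic. The sketch below assumes a 15-kiloton yield for the Hiroshima bomb, a standard figure that does not appear in the study itself:

```python
# Rough check of "12 Hiroshima bombs per second" for 23 zettajoules per year.
# Assumptions (not from the study): one Hiroshima-scale bomb ~ 15 kilotons
# of TNT, and 1 kiloton of TNT ~ 4.184e12 joules.

SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.16e7 s
JOULES_PER_BOMB = 15 * 4.184e12         # ~6.3e13 J

heat_gain_2025 = 23e21                  # 23 zettajoules, expressed in joules

bombs_per_second = heat_gain_2025 / SECONDS_PER_YEAR / JOULES_PER_BOMB
print(f"~{bombs_per_second:.0f} Hiroshima-bomb equivalents per second")  # ~12
```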

 

The results provide “direct evidence that the climate system is out of thermal equilibrium and accumulating heat,” the authors write.

A hotter ocean favors increased global precipitation and fuels more extreme tropical storms. In the past year, warmer global temperatures were likely partly responsible for the damaging effects of Hurricane Melissa in Jamaica and Cuba, heavy monsoon rains in Pakistan, severe flooding in the Central Mississippi Valley, and more.

“Ocean warming continues to exert profound impacts on the Earth system,” the authors wrote.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

9 January: This article was updated to correct the conversion of 23 zettajoules to Hiroshima bomb explosions.

These updates are made possible through information from the scientific community. Do you have a story about science or scientists? Send us a tip at eos@agu.org. Text © 2026. AGU. CC BY-NC-ND 3.0

Managing Carbon Stocks Requires an Integrated View of the Carbon Cycle

Fri, 01/09/2026 - 14:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Managing carbon stocks in the land, ocean, and atmosphere under a changing climate requires a globally integrated view of carbon cycle processes at local and regional scales. The growing Earth Observation (EO) record is the backbone of this multi-scale system, providing local information with discrete coverage from surface measurements and regional information with global coverage from satellites.

Carbon flux information, anchored by inverse estimates from spaceborne greenhouse gas (GHG) concentrations, provides an important top-down view of carbon emissions and sinks, but it currently lacks global continuity at assessment and management scales (less than 100 kilometers). Partial-column data can help separate signals in the boundary layer from those in the overlying atmosphere, providing an opportunity to enhance surface sensitivity and bring flux resolution down from that of column-integrated data (100–500 kilometers).

As described in Parazoo et al. [2025], the carbon cycle community envisions a carbon observation system that leverages GHG partial columns in the lower and upper troposphere to weave together information across scales from surface and satellite EO data, and that integrates top-down and bottom-up analyses to link process understanding to global assessment. Such an actionable system, combining existing and new EO data and inventories with advanced top-down and bottom-up analyses, can help address the diverse and shifting needs of carbon management stakeholders.

Diverse carbon cycle science needs span multiple time (x-axis) and space (y-axis) scales across land (green shading), ocean (blue shading), and fossil (orange shading) sectors. Science needs addressed by the current and planned carbon flux and biomass Earth Observation (EO) program of record (PoR; purple and green, respectively) are depicted by the solid circle. Key EO science gaps exist at 1–100 kilometer spatial scale spanning sub-seasonal impacts of climate extremes and wildfires, interannual change and biomass, long term changes in growth, storage, and emissions, and carbon-climate feedbacks and tipping points (grey shading). Future GHG and biomass observing systems (e.g., dashed circles) will provide important benefits to carbon management efforts. Credit: Parazoo et al. [2025], Figure 1

Citation: Parazoo, N., Carroll, D., Abshire, J. B., Bar-On, Y. M., Birdsey, R. A., Bloom, A. A., et al. (2025). A U.S. scientific community vision for sustained earth observations of greenhouse gases to support local to global action. AGU Advances, 6, e2025AV001914.  https://doi.org/10.1029/2025AV001914

—Don Wuebbles, Editor, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0

New River Chemistry Insights May Boost Coastal Ocean Modeling

Fri, 01/09/2026 - 13:46
Source: Global Biogeochemical Cycles

Rivers deliver freshwater, nutrients, and carbon to Earth’s oceans, influencing the chemistry of coastal seawater worldwide. Notably, a river’s alkalinity and the levels of dissolved inorganic carbon it brings to the sea help to shape regional conditions for marine life, including shellfish and corals. These factors also affect the ability of coastal seawater to absorb carbon dioxide from Earth’s atmosphere—which can have major implications for climate change.

However, the factors influencing river chemistry are complex. Consequently, models for predicting worldwide carbon dynamics typically simplify or only partially account for key effects of river chemistry on coastal seawater. That could now change with new river chemistry insights from Da et al. By accounting for river inputs more realistically, the researchers show that earlier models significantly overestimated the amount of carbon dioxide absorbed by the coastal ocean.

The researchers used real-world data on rivers around the world to analyze how factors such as forest cover, carbonate-containing rock, rainfall, permafrost, and glaciers in a watershed influence river chemistry. In particular, they examined how these factors affect a river’s levels of dissolved inorganic carbon as well as its total alkalinity—the ability of the water to resist changes in pH.

The researchers found that variations in total alkalinity between the different rivers were primarily caused by differences in watershed forest cover, carbonate rock coverage, and annual rainfall patterns. Between-river variations in the ratio of dissolved inorganic carbon to total alkalinity were significantly shaped by carbonate rock coverage and the amount of atmospheric carbon dioxide taken up by photosynthesizing plants in the watershed, they found.

The analysis enabled the researchers to develop new statistical models for using watershed features to realistically estimate dissolved inorganic carbon and total alkalinity levels at the mouths of rivers, where they flow into the ocean.
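The published models are more sophisticated than this, but the general approach, regressing river-mouth chemistry on watershed features, can be sketched with ordinary least squares. All numbers and variable names below are hypothetical:

```python
import numpy as np

# Hypothetical training data: one row per river. Columns hold watershed
# features the study identifies as key predictors:
# [forest cover fraction, carbonate rock fraction, annual rainfall (m)]
X = np.array([
    [0.60, 0.10, 1.2],
    [0.20, 0.45, 0.6],
    [0.75, 0.05, 2.0],
    [0.35, 0.30, 0.9],
])
# Hypothetical observed total alkalinity at each river mouth (umol/kg).
y = np.array([900.0, 2400.0, 600.0, 1700.0])

# Fit an intercept plus one coefficient per feature by least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Estimate alkalinity for a new, ungauged river from its watershed features.
new_river = np.array([1.0, 0.50, 0.20, 1.0])  # leading 1.0 is the intercept term
print(f"predicted alkalinity: {new_river @ coef:.0f} umol/kg")
```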

When incorporated into a global ocean model, the improved river chemistry estimates significantly reduced the overestimation of carbon dioxide taken up by coastal seawater. In other words, compared with prior ocean modeling results, the new results were more in line with real-world, data-based calculations of carbon dioxide absorption.

This study demonstrates the importance of accurately accounting for river chemistry when making model-based predictions of carbon cycling and climate change. More research is needed to further refine river chemistry estimates to enable even more accurate coastal ocean modeling. (Global Biogeochemical Cycles, https://doi.org/10.1029/2025GB008528, 2025)

—Sarah Stanley, Science Writer

Citation: Stanley, S. (2026), New river chemistry insights may boost coastal ocean modeling, Eos, 107, https://doi.org/10.1029/2026EO260022. Published on 9 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0

The Looming Data Loss That Threatens Public Safety and Prosperity

Fri, 01/09/2026 - 13:45

From farming and engineering to emergency management and insurance, many industries critical to daily life rely on Earth system and related socioeconomic datasets. NOAA has linked its data, information, and services to trillions of dollars in economic activity each year, and roughly three quarters of U.S. Fortune 100 companies use NASA Earth data, according to the space agency.

Such data are collected in droves every day by an array of satellites, aircraft, and surface and subsurface instruments. But for many applications, not just any data will do.


Trusted, long-standing datasets known as reference quality datasets (RQDs) form the foundation of hazard prediction and planning and are used in designing safety standards, planning agricultural operations, and performing insurance and financial risk assessments, among many other applications. They are also used to validate weather and climate models, calibrate data from other observations that are of less than reference quality, and ground-truth hazard projections. Without RQDs, risk assessments grow more uncertain, emergency planning and design standards can falter, and potential harm to people, property, and economies becomes harder to avoid.

Yet some well-established, federally supported RQDs in the United States are now slated to be, or already have been, decommissioned, or they are no longer being updated or maintained because of cuts to funding and expert staff. Leaving these datasets to languish, or losing them altogether, would represent a dramatic—and potentially very costly—shift in the country’s approach to managing environmental risk.

What Is an RQD?

No single definition exists for what makes a dataset an RQD, although they share common characteristics, including that they are widely used within their respective user communities as records of important environmental variables and indicators. RQDs are best collected using observing systems designed to produce highly accurate, stable, and long-term records, although only a few long-term observing systems can achieve these goals.

As technological advances and operating constraints are introduced, specialized efforts are needed to integrate new and past observations from multiple observing systems seamlessly. This integration requires minimizing biases in new observations and ensuring that these observations have the broad spatial and temporal coverage required of RQDs (Figure 1). The nature of these efforts varies by the user community, which sets standards so that the datasets meet the specific needs of end users.

Fig. 1. Various satellite sensors provide total precipitable water (TPW) data products characterizing the integrated amount of water vapor available throughout the atmospheric column. However, each of these products has biases and sampling errors because of differences in the algorithms, sensors, and spatial and temporal sampling resolutions on which they are based. NOAA’s Cooperative Institute for Research in the Atmosphere produces a unified, or blended, TPW—an example of which is shown here—that merges all available TPW products. Credit: NOAA

The weather and climate community—which includes U.S.- and internationally based organizations such as NOAA, NASA, the National Research Council, and the cosponsors of the Global Climate Observing System (GCOS)—has agreed upon principles to guide the development of RQDs [Bojinski et al., 2014; National Research Council, 1999]. For example, data must account for changes in observing times, frequency of observations, instruments, calibration, and undesirable local effects (e.g., obstructions affecting the instruments’ sensors). These RQDs are referred to as either fundamental or thematic climate data records depending on the postprocessing involved (e.g., sensor-detected satellite radiances (fundamental) versus a postprocessed data product such as integrated atmospheric water vapor (thematic)).

Another important attribute of RQDs is that their data are curated to include detailed provenance tracking, metadata, and information on validation, standardization, version control, archiving, and accessibility. The result of all this careful collection, community input, and curation is data that have been rigorously evaluated for scientific integrity, temporal and spatial consistency, and long-term availability.

An Anchor to Real-World Conditions

RQDs are crucial in many ways across sectors. They are vital, for example, in realistically calibrating and validating projections and predictions of environmental hazards by weather, climate, and Earth system models. They can also validate parameterizations used to represent physical processes in models and ground global reanalysis and gridded model products in true ambient conditions [Thorne and Vose, 2010].


Without these reference data to anchor them, the outputs of large-scale, high-resolution gridded climate datasets (e.g., PRISM (Parameter-elevation Regressions on Independent Slopes Model), E-OBS, IMERG (Integrated Multi-satellite Retrievals for GPM), CHELSA-W5E5) can drift systematically. Over multidecadal horizons, this drift degrades our ability to separate genuine Earth system changes and variations from artifacts. RQDs have become even more important with the rapid emergence of artificial intelligence (AI) weather forecasting approaches, which must be trained on observations and model outputs and thus can inherit their spatial and temporal biases.

Indeed, RQDs are fundamental to correcting biases and minimizing the propagation of uncertainties in high-resolution models, both conventional and AI. Researchers consistently find that the choice and quality of reference datasets are critical in evaluating, bias-correcting, and interpreting climate and weather model outputs [Gampe et al., 2019; Gibson et al., 2019; Jahn et al., 2025; Gómez-Navarro et al., 2012; Tarek et al., 2021]. If the reference data used are of lower quality, greater uncertainty can be introduced into projections of precipitation and temperature, for example, especially with respect to extreme conditions and downstream impacts such as streamflows or disease risk. This potential underscores the importance of RQDs for climate and weather modeling.

Each community has its own requirements for RQDs. To develop and implement statistical risk models to assess local vulnerability to environmental hazards, the finance and insurance sectors prioritize high spatial and temporal resolution, data completeness, adequate metadata to dissect specific events, certification that data are from a trusted source, open-source accessibility, and effective user data formats. These sectors directly or indirectly (i.e., downstream datasets) rely on many federally supported datasets. Examples include NOAA’s Storm Events Database, Billion-Dollar Weather and Climate Disasters dataset, and Global Historical Climatology Network hourly dataset; NASA’s family of sea surface altimetry RQDs and its Soil Moisture Active Passive and Gravity Recovery and Climate Experiment terrestrial water storage datasets; and the interagency Monitoring Trends in Burn Severity dataset, which tracks burned wildfire areas.

Meanwhile, the engineering design community requires regularly updated reference data that can help distinguish truly extreme from implausible outlier conditions. This community uses scores of federally supported RQDs to establish safety and design standards, including NOAA’s Atlas 14 and Atlas 15 precipitation frequency datasets, U.S. Geological Survey’s (USGS) National Earthquake Hazards Reduction Program dataset, and NASA’s sea level data and tools (which are instrumental in applications related to ocean transport and ocean structures).

Because RQDs are a cornerstone for assessing environmental hazards across virtually all sectors of society, their loss or degradation is an Achilles’ heel for reliably predicting and projecting all manner of environmental hazards.

Linking Reference Observing and Data Systems

U.S. agencies have long recognized the importance of reference observing systems and the RQDs they supply. Since the early 2000s, for example, NOAA’s U.S. Climate Reference Network (USCRN) has operated a network of highly accurate stations (now numbering 137) across the country that measure a variety of meteorological variables and soil conditions (Figure 2) [Diamond et al., 2013]. The USCRN plans redundancy into its system, such as triplicate measurements of the same quantity to detect and correct sensor biases, allowing data users to trust the numbers they see.
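NOAA documents the USCRN’s actual processing separately; the sketch below only illustrates the principle behind triplicate measurements, using a median-of-three vote to flag a drifting sensor that a single-instrument station could not detect on its own:

```python
import statistics

def triplicate_check(readings, tolerance=0.3):
    """Return the median of three redundant readings plus the indices of
    any sensors disagreeing with that median by more than `tolerance`
    (same units as the readings). A simplified illustration of the idea,
    not NOAA's operational algorithm."""
    best = statistics.median(readings)
    flagged = [i for i, r in enumerate(readings) if abs(r - best) > tolerance]
    return best, flagged

# The second sensor has drifted ~1 degree warm; the median is unaffected.
value, suspects = triplicate_check([12.1, 13.2, 12.2])
print(value, suspects)  # 12.2 [1]
```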

Fig. 2. A typical U.S. Climate Reference Network station includes instruments to collect a variety of data on environmental variables such as air temperature, precipitation, wind, soil moisture and temperature, humidity, and solar radiation. Credit: NOAA

The World Meteorological Organization has helped to coordinate similar networks with reference quality standards internationally. One such network is the GCOS Reference Upper-Air Network, which tracks climate variables through the troposphere and stratosphere (and to which NOAA contributes). The resulting RQDs from this network are used to calibrate and bias-correct data from other (e.g., satellite) observing systems.


In the absence of such reference quality observing systems, RQDs must be derived by expert teams using novel data analyses, special field-observing experiments, statistical methods, and physical models. Recognizing their importance, Thorne et al. [2018] developed frameworks for new reference observing networks. Expert teams have been assembled in the past to develop RQDs from observing systems that are of less than reference quality [Hausfather et al., 2016]. However, these teams require years of sustained work and funding, and only the federal government carries a statutory, sovereign, and enduring mandate to provide universally accessible environmental data as a public good; other sectors contribute valuable but nonmandated and nonsovereign efforts.

Datasets at Risk

Recent abrupt actions to reduce support for RQDs are out of step with the long-standing recognition of these datasets’ value and of the substantial efforts required to develop them.

Federal funding and staffing to maintain RQDs are being cut through reduced budgets, agency reorganizations, and reductions in force. The president’s proposed fiscal year 2026 budget would, for example, cut NOAA’s budget by more than 25% and abolish NOAA’s Office of Oceanic and Atmospheric Research, although the newest appropriations package diminishes cuts to science. The National Science Foundation–supported National Center for Atmospheric Research (NCAR), which archives field experiment datasets and community model outputs, is at risk of being dismantled.

Major cuts have also been proposed to NASA’s Earth Sciences Division, as well as to Earth sciences programs in the National Science Foundation, Department of Energy (DOE), Department of the Interior, and elsewhere. Changes enacted so far have already affected some long-running datasets that are no longer being processed and are increasingly at risk of disappearing entirely.

The degradation of RQDs that we’re now seeing comes at a time of growing risk from climate and weather hazards. In the past decade alone, the United States has faced over $1.4 trillion in damages from climate-related disasters—and over $2.9 trillion since 1980. Inflation-adjusted per-person costs of U.S. disasters have jumped nearly tenfold since the 1980s and now cost each resident nearly $500 annually (Figure 3). The flooding disasters from Hurricane Helene in September 2024 and in central Texas in July 2025 offer recent reminders of both the risks from environmental hazards and the continued need to predict, project, and prepare for future events.

Fig. 3. The average inflation-adjusted cost per person in the United States from billion-dollar disasters—indicated here in pentad years—rose from about $50 in 1980 to roughly $450 as of 2020. Costs are derived using the National Centers for Environmental Information’s Billion-Dollar Weather and Climate Disasters reference quality dataset, which is no longer being updated.
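The per-person figure is straightforward to reproduce. A rough check, assuming a present-day U.S. population of about 335 million (an approximation not given in the article):

```python
# Rough reproduction of the per-person disaster-cost figure.
damages_past_decade = 1.4e12   # dollars over roughly 10 years (from the article)
us_population = 335e6          # assumed present-day U.S. population

per_person_per_year = damages_past_decade / 10 / us_population
print(f"~${per_person_per_year:.0f} per resident per year")  # ~$420/yr
```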

Threatened datasets include many RQDs whose benefits are compounded because they are used in building other downstream RQDs. One example is USGS’s National Land Cover Database, which is instrumental to downstream RQDs like Federal Emergency Management Agency flood maps, U.S. Department of Agriculture (USDA) crop models, and EPA land use products. Another is USDA’s National Agriculture Imagery Program, which delivers high-resolution aerial imagery during the growing season and supports floodplain mapping, wetland delineation, and transportation infrastructure planning.

Many other federally supported projects that produce derivative and downstream RQDs are at risk, primarily through reductions in calibration, reprocessing, observing-network density, expert stewardship, and in some cases abrupt termination of observations. Earth system examples include NOAA’s bathymetry and blended coastal relief products (e.g., National Bathymetric Source, BlueTopo, and Coastal Relief Models), USGS’s 3DEP Digital Elevation Model, and the jointly supported EarthScope Consortium geodetic products.

Several global satellite-derived RQDs face end-of-life and longer-term degradation issues, such as those related to NASA’s algorithm development and testing for the Global Precipitation Climatology Project, the National Snow and Ice Data Center’s sea ice concentration and extent data, and the family of MODIS (Moderate Resolution Imaging Spectroradiometer) RQDs. In addition, USGS’s streamflow records and NOAA’s World Ocean Atlas are at-risk foundational RQDs whose downstream products span sectors including engineering, hazards management, energy, insurance, defense, and ecosystem services.

More Than a Science Issue

The degradation of weather, climate, environmental, and Earth system RQDs propagates risk well beyond the agencies that produce them and isn’t a problem of just science and technology, because the products they power don’t serve just scientists.

Apart from fueling modeling of climate and weather risks and opportunities, they underpin earthquake and landslide vulnerability maps, energy grid management, safe infrastructure design, compound risk mitigation and adaptation strategies, and many other applications that governments, public utilities, and various industries use to assess hazards and serve public safety.


A sustained capability to produce high-resolution, decision-ready hazard predictions and projections relies on a chain of dependencies that begins with RQDs. If high-quality reference data vanish or aren’t updated, every subsequent link in that chain is adversely affected: products become harder to calibrate, and the information they provide grows less certain.

RQDs are often used in ways that are not immediately transparent. A case in point is the updating of weather model reanalyses (e.g., ERA5 (ECMWF Reanalysis v5) and MERRA-2 (Modern-Era Retrospective Analysis for Research and Applications, Version 2)), which feed a growing number of weather and climate hazards products. A critical step in these updates is replacing the real-time operational data the reanalyses assimilate with data from up-to-date RQDs wherever possible. Because real-time operational data are rarely screened effectively for absolute calibration errors and subtle but important systematic biases, this step helps to ensure the model simulations are free of time- and space-dependent biases. Using outputs from reanalysis models not validated or anchored by RQDs can thus be problematic: biases can propagate into hazard predictions, projections, and assessments, increasing uncertainty and undermining the validation of extremes.
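The bias-removal step described above can be illustrated schematically. The sketch below uses hypothetical numbers and a constant offset; real pipelines model biases that vary in time and space:

```python
import numpy as np

def remove_mean_bias(operational, reference):
    """Shift an operational time series so its mean matches a
    reference-quality series over their common period. A deliberately
    simple stand-in for the calibration step described in the text."""
    bias = np.nanmean(operational - reference)
    return operational - bias, bias

op = np.array([14.9, 15.3, 15.1, 15.6])   # hypothetical operational temps (deg C)
ref = np.array([14.6, 15.0, 14.8, 15.3])  # hypothetical RQD temps (deg C)
corrected, bias = remove_mean_bias(op, ref)
print(round(bias, 2))  # ~0.3: a constant warm bias in the operational feed
print(corrected)       # matches the reference on average
```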

A Vital Investment

With rapid advances in new observing system technologies and a diverse and ever-changing mix of observing methods, demand is growing for scientific expertise to blend old and new data seamlessly. The needed expertise involves specialized knowledge of how to process the data, integrate new observing system technologies, and more.


Creating RQDs isn’t easy, and sustained support is necessary. This support isn’t just a scientific priority—it’s also a vital national investment. Whereas the costs of restoring lost or hibernated datasets and rebuilding expert teams—if those tasks would even be possible—would be enormous, the costs for maintaining and updating RQDs are far less than recovering from a single billion-dollar disaster.

Heeding recurring recommendations to continue collecting precise and uninterrupted observations of the global climate system—as well as to continue research, development, and updates necessary to produce RQDs—in federal budgets for fiscal year 2026 and beyond thus seems the most sensible approach. If this doesn’t happen, then the United States will need to transition to relying on the interest, capacities, and capabilities of various other organizations both domestic and international to sustain the research, development, and operations required to produce RQDs and make them available.

Given the vast extent of observing system infrastructures, the expertise required to produce RQDs from numerous observing systems, and the long-term stability needed to sustain them, such a transition could be extremely challenging and largely inadequate for many users. Thus, by abandoning federally supported RQDs, we risk being penny-wise and climate foolish.

References

Bojinski, S., et al. (2014), The concept of essential climate variables in support of climate research, applications, and policy, Bull. Am. Meteorol. Soc., 95(9), 1,431–1,443, https://doi.org/10.1175/BAMS-D-13-00047.1.

Diamond, H. J., et al. (2013), U.S. Climate Reference Network after one decade of operations: Status and assessment, Bull. Am. Meteorol. Soc., 94(4), 485–498, https://doi.org/10.1175/BAMS-D-12-00170.1.

Gampe, D., J. Schmid, and R. Ludwig (2019), Impact of reference dataset selection on RCM evaluation, bias correction, and resulting climate change signals of precipitation, J. Hydrometeorol., 20(9), 1,813–1,828, https://doi.org/10.1175/JHM-D-18-0108.1.

Gibson, P. B., et al. (2019), Climate model evaluation in the presence of observational uncertainty: Precipitation indices over the contiguous United States, J. Hydrometeorol., 20(7), 1,339–1,357, https://doi.org/10.1175/JHM-D-18-0230.1.

Gómez-Navarro, J. J., et al. (2012), What is the role of the observational dataset in the evaluation and scoring of climate models?, Geophys. Res. Lett., 39(24), L24701, https://doi.org/10.1029/2012GL054206.

Hausfather, Z., et al. (2016), Evaluating the impact of U.S. Historical Climatology Network homogenization using the U.S. Climate Reference Network, Geophys. Res. Lett., 43(4), 1,695–1,701, https://doi.org/10.1002/2015GL067640.

Jahn, M., et al. (2025), Evaluating the role of observational uncertainty in climate impact assessments: Temperature-driven yellow fever risk in South America, PLOS Clim., 4(1), e0000601, https://doi.org/10.1371/journal.pclm.0000601.

National Research Council (1999), Adequacy of Climate Observing Systems, 66 pp., Natl. Acad. Press, Washington, D.C., https://doi.org/10.17226/6424.

Tarek, M., F. Brissette, and R. Arsenault (2021), Uncertainty of gridded precipitation and temperature reference datasets in climate change impact studies, Hydrol. Earth Syst. Sci., 25(6), 3,331–3,350, https://doi.org/10.5194/hess-25-3331-2021.

Thorne, P. W., and R. S. Vose (2010), Reanalyses suitable for characterizing long-term trends, Bull. Am. Meteorol. Soc., 91(3), 353–362, https://doi.org/10.1175/2009BAMS2858.1.

Thorne, P. W., et al. (2018), Towards a global land surface climate fiducial reference measurements network, Int. J. Climatol., 38(6), 2,760–2,774, https://doi.org/10.1002/joc.5458.

Author Information

Thomas R. Karl (Karl51tom@gmail.com), Climate and Weather LLC, Mills River, N.C.; Stephen C. Diggs, University of California Office of the President, Oakland; Franklin Nutter, Reinsurance Association of America, Washington, D.C.; Kevin Reed, New York Climate Exchange, New York; also at Stony Brook University, Stony Brook, N.Y.; and Terence Thompson, S&P Global, New York

Citation: Karl, T. R., S. C. Diggs, F. Nutter, K. Reed, and T. Thompson (2026), The looming data loss that threatens public safety and prosperity, Eos, 107, https://doi.org/10.1029/2026EO260021. Published on 9 January 2026. Text © 2026. The authors. CC BY-NC-ND 3.0

Plan to End NEPA’s “Regulatory Reign of Terror” Is Finalized

Thu, 01/08/2026 - 18:37
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

The Trump administration has finalized a plan to roll back regulations outlined by one of the nation’s bedrock environmental laws.

Signed into law in 1970, the National Environmental Policy Act (NEPA) requires federal agencies to assess how proposed major projects—such as the purchase of parklands, the establishment of military complexes, or the construction of buildings and highways—will impact the environment.

NEPA opponents, which include both Republicans and Democrats, claim the processes outlined in the legislation unnecessarily delay approvals for infrastructure and energy projects. Last February, the Council on Environmental Quality (CEQ) published an interim final rule removing NEPA regulations. The new action adopts the rule as final.

 

“In this Administration, NEPA’s regulatory reign of terror has ended,” said CEQ Chairman Katherine Scarlett in a statement. “Thanks to President Trump’s leadership, CEQ acted early to slash needless layering of bureaucratic burden and restore common sense to the environmental review and permitting process.”

In response to the interim final rule, the CEQ received more than 108,000 public comments, according to a document outlining the rule published today on the Federal Register. One such comment came from a coalition of environmental groups, expressing strong opposition to the rule last March.

NEPA “promotes sound and environmentally-informed decisionmaking by federal agencies, and it provides the primary way for the public to learn about and provide input regarding the impacts of federal actions on their lives,” the letter read. “The only certainty provided by the Interim Final Rule is less government transparency, more project delay, more litigation, less resilient infrastructure, and poor environmental and health outcomes for communities.”

—Emily Gardner (@emfurd.bsky.social), Associate Editor

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2026. AGU. CC BY-NC-ND 3.0

Trump Pulls United States Out of International Climate Efforts “Contrary” to National Interests

Thu, 01/08/2026 - 16:11
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

In an executive order issued on 7 January, the White House ordered the country’s withdrawal from 66 international agreements determined to be “contrary to the interests of the United States,” including two global efforts to combat climate change: the United Nations Framework Convention on Climate Change (UNFCCC) and the Intergovernmental Panel on Climate Change (IPCC).

The UNFCCC is a 1992 treaty that sets the legal framework for international cooperation to limit climate change. The IPCC is the United Nations organization that assesses and communicates climate science to global governments. 

The order will make the United States the only country in the world that does not participate in the UNFCCC.


“This is a shortsighted, embarrassing, and foolish decision,” Gina McCarthy, former EPA administrator under President Barack Obama, told E&E News. “As the only country in the world not a part of the UNFCCC treaty, the Trump administration is throwing away decades of U.S. climate change leadership and global collaboration.” 

McCarthy added that the U.S. withdrawal would limit the country’s ability to influence important decisions that impact the global economy, especially as other countries invest heavily in clean energy.

KD Chavez, executive director of the Climate Justice Alliance, an advocacy organization, said in a statement that the withdrawal “protects polluters while abandoning all of us, our livelihoods, and Mother Earth.”

“This move undermines treaty obligations, tribal sovereignty, and the global cooperation needed to survive the climate crisis,” Chavez said.

Others say the UNFCCC is ineffective and that leaving it could open new opportunities to cooperate with other countries to combat or mitigate climate change: “The framework convention is a joke,” George David Banks, Trump’s international climate adviser during his first term, told E&E News.

The UNFCCC has been criticized in the past for the ineffectiveness of its annual “conferences of the parties,” or COPs, as well as the influence of fossil fuel lobbyists at these meetings. 

 

Because the Senate originally, and unanimously, advised President George H.W. Bush to join the UNFCCC in 1992, legal experts question whether the order to withdraw is constitutional, or whether the United States could rejoin in the future. 

The withdrawal from the IPCC also cuts the United States out of global climate science assessments. “Walking away doesn’t make the science disappear, it only leaves people across the United States, policymakers, and businesses flying in the dark at the very moment when credible climate information is most urgently needed,” Delta Merner, associate accountability campaign director for the Climate and Energy Program at the Union of Concerned Scientists, said in a statement.

On his first day in office last year, Trump pulled the United States out of the Paris Agreement, a legally binding treaty setting long-term emissions goals, for a second time—an action that one United Nations report estimated would eliminate 0.1°C (0.18°F) of global progress on climate change by 2100. Withdrawing from the IPCC and UNFCCC leaves the United States further isolated from international cooperative efforts to limit climate change.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2026. AGU. CC BY-NC-ND 3.0

Successful Liquid Lake Conditions in a Cold Martian Paleoclimate

Thu, 01/08/2026 - 15:20
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Understanding the paleoclimate of Mars is essential for gaining insights into Mars’ early history and atmospheric conditions. Such information is key to learning why Mars shifted from a potentially warm, wet planet to the cold, dry desert we see now and whether that climate change was gradual or catastrophic, which in turn informs how terrestrial planets evolve over billions of years.

Moreland et al. [2025] use an adapted lake energy balance model to investigate the connections between Martian geology and climate. By combining climate input from the Mars Weather Research & Forecasting general circulation model with geologic constraints from Curiosity rover observations, the study helps resolve the historic disconnect between modeling results that suggest a cold climate and geologic evidence that Mars’ lakes retained liquid water. By concluding that relatively small lakes with limited water input and seasonal ice cover could retain liquid water seasonally over long periods under Mars’ paleoclimate, the authors provide groundbreaking findings that inform climate models and enhance our understanding of conditions on early Mars.

Citation: Moreland, E. L., Dee, S. G., Jiang, Y., Bischof, G., Mischna, M. A., Hartigan, N., et al. (2026). Seasonal ice cover could allow liquid lakes to persist in a cold Mars paleoclimate. AGU Advances, 7, e2025AV001891. https://doi.org/10.1029/2025AV001891

—Alberto Montanari, Editor-in-Chief, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0

The Northern Sargasso Sea Has Lost Much of Its Namesake Algae

Thu, 01/08/2026 - 14:37

Sargassum has a bad reputation for washing up on shorelines, rotting on the beach, and creating a stinky mess. But this marine algae also functions as a habitat for many marine species, and new research published in Nature Geoscience indicates that its biomass has significantly declined where it once flourished: Since 2015, the amount of Sargassum in the northern Sargasso Sea has decreased by more than 90%. That change is likely caused by a reduced supply of healthy algae from the Gulf of Mexico, where water temperatures are rising, the researchers suggest.


The floating brown algae known as Sargassum is found throughout the Atlantic Ocean, the Caribbean Sea, and the Gulf of Mexico. (Other species exist in the Pacific.) A region of the subtropical North Atlantic Ocean is even named in its honor: the Sargasso Sea. Rafts of Sargassum measuring tens of meters wide and several kilometers long frequently form in the Sargasso Sea, and marine life ranging from crabs to shrimp to sea turtles takes refuge in the nooks and crannies afforded by its leaves and air-filled bladders.

The Sargasso Sea is a geographical anomaly when it comes to bodies of water—it’s bounded by ocean currents, not land. “This is the only sea on Earth that has no physical boundaries,” said Chuanmin Hu, an optical oceanographer at the University of South Florida in Tampa and the senior author of the new study.

Spotting Algae from Space

To better understand how Sargassum populations have shifted over time in the Sargasso Sea and beyond, Hu and his colleagues mined archival satellite data. The team focused on observations made from 2000 to 2023 with the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument, which collects data in the near- and midinfrared ranges of the electromagnetic spectrum. That spectral coverage is important because Sargassum, like all other vegetation, strongly reflects near-infrared light; ocean water, on the other hand, does not.

“Sargassum has a different signal than the background ocean water,” said Hu.

The team, co-led by Yingjun Zhang, Brian Barnes, and Deborah Goodwin, exploited that telltale sign to estimate the amount of algae present in various swaths of water. The researchers focused on six geographic regions that cumulatively spanned more than 40° of latitude and 90° of longitude. The team was able to detect Sargassum where the fractional areal coverage of the algae was as low as 1 part in 500. Typically, when Sargassum is present, there’s about 5 times that much of it in an average pixel, said Barnes, a satellite oceanographer at the University of South Florida in St. Petersburg.
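The study’s processing chain is more involved, but the core of satellite Sargassum detection can be sketched as a floating-algae-style index: Measure how far the near-infrared reflectance rises above a baseline interpolated between the red and shortwave-infrared bands. The band wavelengths below approximate MODIS channels, and the reflectance values are hypothetical:

```python
def floating_algae_index(r_red, r_nir, r_swir,
                         lam_red=645.0, lam_nir=859.0, lam_swir=1240.0):
    """Floating-algae-style index from atmospherically corrected
    reflectances. Vegetation's strong near-infrared reflectance lifts the
    index above the linear red-to-shortwave baseline; open water stays
    near or below zero. A sketch in the spirit of MODIS floating algae
    indices, not the paper's exact processing chain."""
    baseline = r_red + (r_swir - r_red) * (lam_nir - lam_red) / (lam_swir - lam_red)
    return r_nir - baseline

print(floating_algae_index(0.02, 0.030, 0.01))  # Sargassum-like pixel: > 0
print(floating_algae_index(0.02, 0.015, 0.01))  # open water: ~0 or < 0
```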

The Northern Sargasso Sea, with Less Sargassum

The researchers found that Sargassum populations in the northern part of the Sargasso Sea have decreased dramatically since 2015—the satellite data revealed a roughly twelvefold drop in average biomass between the 2000–2014 and 2015–2023 periods. (Measurements from the team’s shipboard surveys showed that Sargassum density declined by only about 50% over the same time period, but the team noted that those in situ data are sparse and potentially suffer from sampling bias.) If the satellite data are reflecting reality—and it’s likely that they are—that’s a substantial decrease in Sargassum, said Barnes. “There’s so much less now.”

At the same time, there’s been a proliferation of Sargassum in the so-called Great Atlantic Sargassum Belt, a swath of ocean stretching some 9,000 kilometers from western Africa to the Gulf of Mexico, where an uptick in Sargassum that began in 2011 hasn’t abated. But it’s not as though the belt is simply robbing the northern Sargasso Sea of its algae: Although it plays a role in the decline, the largest changes are likely driven by shifting conditions in the Gulf of Mexico, the team surmised.

The agent that facilitates all of these connections? That’s ocean currents, said Zhang, an oceanographer at the Scripps Institution of Oceanography at the University of California, San Diego. The Sargasso Sea and the Gulf of Mexico may be thousands of kilometers apart, but they’re nonetheless linked by waters on the move.

Algae on a Journey

Satellite data have shown that the Gulf of Mexico is one of the key sources of Sargassum that ultimately ends up in the northern Sargasso Sea. The algae makes a journey that lasts several months: From the Gulf of Mexico, Sargassum hitches a ride on ocean currents—namely, the Loop Current and the Florida Current—before getting swept up in the Gulf Stream. It then makes its way along the East Coast of the United States before finally reaching the northern Sargasso Sea.

But sea surface temperatures have been rising in the Gulf of Mexico in recent years, often exceeding 30°C in the summertime. Sargassum prefers temperatures ranging from 23°C to 28°C, and heat-stressed algae are less likely to survive the monthslong journey to the northern Sargasso Sea, said Hu. “During the long-distance transport, most of it will die.”


That makes sense, said William Hernandez, an oceanographer at the University of Puerto Rico–Mayaguez who was not involved in the research. Sargassum stressed by high temperature is less likely to take up nutrients and grow adequately, he said. “It’s the same thing that you see in terrestrial vegetation.”

In addition to heat stress, Sargassum in the Gulf of Mexico is also likely suffering from a lack of nutrients. That’s because the plentiful Sargassum in the Great Atlantic Sargassum Belt is gobbling up necessary compounds like phosphorus and sulfates, said Hernandez. So when currents off the coast of South America and in the Caribbean sweep water into the Gulf of Mexico, they’re transporting something that’s essentially already been picked over, he said. “By the time those waters reach that area, they’ve already been depleted of their nutrients.”

The combined effects of heat stress and limited nutrients really wallop Sargassum populations, said Hernandez. “You have a one-two punch.” There might well be ecological repercussions to having less Sargassum in the northern Sargasso Sea, the team suggests. Fish and other creatures rely on Sargassum for habitat, so less algae could translate into measurable impacts on other animals. Collecting in situ animal data in the Sargasso Sea will help answer that question, said Hu. “There should be impacts on other animals. Is that the case?”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2026), The northern Sargasso Sea has lost much of its namesake algae, Eos, 107, https://doi.org/10.1029/2026EO260014. Published on 8 January 2026. Text © 2026. The authors. CC BY-NC-ND 3.0

Temperatures Are Rising, but What About Humidity?

Thu, 01/08/2026 - 14:35
Source: AGU Advances

Heat waves are becoming commonplace, and so too is high humidity, which can strain the electrical grid, hurt the economy, and endanger human health. But the global prevalence of record-breaking humidity events, some of which approach the physiological limit of what humans can safely handle—and all of which go beyond local expectations and adaptations—has not been widely studied.

To remedy that oversight, Raymond et al. used data from the European Centre for Medium-Range Weather Forecasts Reanalysis 5 (ERA5) and several other sources to establish the most intense humid heat that has occurred in recent years across the globe. They then used several climate models to estimate where instances of even more severe humid heat are most likely to occur in the future.

Relative to the local climate, humid heat can be most extreme in the Middle East and North Africa, with tropical regions coming in a close second, the researchers found. In these locales, the wet-bulb temperature (a measure of humid heat) is capable of reaching 4–5 standard deviations above the average for the warm season. The Middle East and North Africa are also among the regions that experience the longest stretches of humid heat, sometimes lasting 20 or more days.
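The “4–5 standard deviations” figure refers to standardized anomalies of wet-bulb temperature relative to a warm-season climatology. A minimal sketch of that calculation, using synthetic data in place of the ERA5 record the study relies on:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic warm-season daily wet-bulb temperatures (deg C) for one place:
# 30 seasons of 90 days each, standing in for a reanalysis climatology.
climatology = rng.normal(loc=24.0, scale=1.5, size=30 * 90)
mean, std = climatology.mean(), climatology.std()

def standardized_anomaly(wet_bulb_c):
    """Warm-season standard deviations above the climatological mean."""
    return (wet_bulb_c - mean) / std

print(f"{standardized_anomaly(31.0):.1f} sigma")  # ~4.7 for an extreme humid-heat day
```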

Estimates of overall humid heat likelihood are very sensitive to a few extremely hot, humid days, the researchers found. In many locations, removing a single outlier led statistical models to predict a fifth as many hot, humid days in the future. The finding highlights the need for accurate observational data, the researchers write.

Humid heat is particularly dangerous when it comes in concentrated spells that offer areas little relief. In the tropics, three quarters of the days when the wet-bulb temperature was in the top 5% occurred in only a quarter of the years included in the study. That is largely because El Niño heightens both atmospheric temperature and moisture, so record-setting days in the tropics tend to cluster in years when this weather pattern is active.

The researchers note that 2023 was a banner year for humid heat, with 23 different regions setting records. That’s entirely because of climate change, the researchers’ work suggests: Otherwise, no records would have been broken. (AGU Advances, https://doi.org/10.1029/2025AV001963, 2025)

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2026), Temperatures are rising, but what about humidity?, Eos, 107, https://doi.org/10.1029/2026EO260020. Published on 8 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0

A “Lava World” Unexpectedly Hosts an Atmosphere

Wed, 01/07/2026 - 13:28

The universe never fails to surprise. Take TOI-561 b, an Earth-sized exoplanet that circles its star on an orbit less than one-thirtieth the size of Mercury’s.

Despite being blasted by radiation to the point that its rocky surface is likely molten, TOI-561 b still seems to retain a thick atmosphere. This discovery, reported in The Astrophysical Journal Letters, shows that even highly irradiated planets—whose atmospheres should have been eroded long ago—can remain enshrouded in gas for billions of years.

Lava World

When it comes to constellations, Sextans (the Sextant) is largely unremarkable; its brightest stars can’t even be seen with the naked eye from a large city. But there’s a star in Sextans that is home to a miniature solar system: TOI-561, roughly twice as old as the Sun, has four planets orbiting it. And the innermost of those planets, known as TOI-561 b, holds the special honor of being what’s called an ultrashort-period exoplanet. That’s a world no larger than twice the radius of Earth that whips around its host star in 1 day or less.


Ultrashort-period exoplanets are rare—only several dozen are known to exist—and they’re extreme: They orbit so close to their host stars that they typically have dayside temperatures above the melting point of rock, leading researchers to dub them “lava worlds.” Ultrashort-period exoplanets are also planets on a journey—it’s thought that they formed farther away from their stars and migrated inward over time.

Many ultrashort-period exoplanets observed to date also don’t have atmospheres. That makes sense, said Rafael Luque, an astrophysicist at the Institute of Astrophysics of Andalusia in Granada, Spain, not involved in the new research. These extreme worlds are literally being irradiated by their host stars, he said. “We do not expect that an atmosphere can survive.”

A Puffed-Up World?

Earlier observations revealed both the size and mass of TOI-561 b. Taken together, those data suggest an anomalously low density for the planet, roughly 4.3 grams per cubic centimeter. (Earth’s average density, for comparison, is about 5.5 grams per cubic centimeter.)
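Bulk density follows directly from a planet’s measured mass and radius. In this sketch, the TOI-561 b inputs are illustrative values approximating published estimates, not numbers reported in this article:

```python
import math

M_EARTH_KG = 5.972e24
R_EARTH_M = 6.371e6

def bulk_density(mass_earths, radius_earths):
    """Bulk density in g/cm^3 from mass and radius in Earth units."""
    mass_g = mass_earths * M_EARTH_KG * 1e3
    radius_cm = radius_earths * R_EARTH_M * 1e2
    volume_cm3 = (4.0 / 3.0) * math.pi * radius_cm**3
    return mass_g / volume_cm3

# Illustrative values approximating published TOI-561 b estimates.
print(f"{bulk_density(2.2, 1.42):.1f} g/cm^3")  # ~4.2, near the reported 4.3
print(f"{bulk_density(1.0, 1.0):.1f} g/cm^3")   # Earth, for reference: ~5.5
```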

There are several explanations for that finding, said Nicole Wallack, an astronomer at Carnegie Science in Washington, D.C., and a member of the research team. For instance, TOI-561 b might lack an iron core. But a more likely scenario is that it’s a puffed-up planet that appears larger and therefore less dense than it actually is, said Wallack.

And a thick atmosphere is the most logical culprit for a puffed-up exoplanet, she explained. “It could have an atmosphere that’s making the planet appear larger in radius but isn’t influencing its mass as much.”

To test that idea, Wallack and her colleagues, led by Johanna Teske, an astronomer at Carnegie Science, recently observed TOI-561 b and its host star using the James Webb Space Telescope. The researchers collected near-infrared observations of four orbits of the planet, each of which lasted only about 11 hours.


For this new study, the team focused on data collected around the time of so-called secondary eclipse. That’s when a planet passes behind its star, as seen from a telescope’s perspective. By comparing observations recorded when the star and planet are both visible to those recorded when just the star is visible, it’s possible to home in on just the signal from the planet, said Wallack. For TOI-561 b, the team divided that planet signal into seven near-infrared wavelength bins and looked at how the light was distributed as a function of wavelength.

That investigation allowed the team to estimate the approximate temperature of TOI-561 b: about 1,700–2,200 K. That’s significantly cooler than the roughly 3,000 K expected on the basis of the temperature of the star and TOI-561 b’s distance from it. “The planet appears to be colder than we would have expected,” said Wallack.
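That expectation comes from a standard irradiation estimate: the dayside temperature of a tidally locked, bare-rock planet that reradiates heat instantly. The stellar parameters below are illustrative values approximating TOI-561, not numbers from the article:

```python
import math

R_SUN_M = 6.957e8
AU_M = 1.496e11

def dayside_temperature(t_star_k, r_star_rsun, a_au, albedo=0.0):
    """Maximum dayside temperature of a tidally locked planet with no
    heat redistribution: T = T* sqrt(R*/a) * (2/3)**0.25 * (1-A)**0.25.
    A standard textbook estimate, not the study's model."""
    ratio = (r_star_rsun * R_SUN_M) / (a_au * AU_M)
    return t_star_k * math.sqrt(ratio) * (2.0 / 3.0) ** 0.25 * (1.0 - albedo) ** 0.25

# Illustrative parameters approximating TOI-561 and planet b's tight orbit.
print(f"{dayside_temperature(5326, 0.85, 0.0106):.0f} K")  # ~2,900 K, i.e. "roughly 3,000 K"
```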

An atmosphere is the best explanation for that discrepancy, Teske and her colleagues proposed. The presence of an atmosphere would allow heat to be redistributed away from a planet’s warmer dayside and toward its cooler nightside. That process of heat distribution is much more efficient than relying on rocks to do the same thing, said Wallack. “Atmospheres are much better than solid rocks are at transporting heat.”

TOI-561 b might not be a complete outlier when it comes to having an atmosphere. After all, a handful of other ultrashort-period exoplanets, such as 55 Cancri e, are believed to be enshrouded in gas.

Hunting for Molecules

After analyzing the Webb observations, the researchers modeled signals that would be expected from an atmosphere containing varying proportions of molecules such as water, carbon dioxide, and carbon monoxide. They found that their data were no more consistent with one model than another. The relatively wide spectral binning that the team adopted—just seven data points over a range of roughly 2.7–5.1 micrometers—may have precluded detecting any molecule-specific features, the team concluded.

Even though the composition of TOI-561 b’s atmosphere remains inconclusive, there’s good evidence that it exists, said Michael Zhang, an astronomer at the University of Chicago not involved in the research. “I believe that there is an atmosphere.”

And that atmosphere is most likely composed of material outgassed from TOI-561 b’s molten surface. That inference can guide logical follow-on work modeling the planet’s atmosphere, said Zhang. “You can test compositions that you expect would be outgassed from the magma ocean.”

Analyzing TOI-561 b’s nightside signal—something that’s possible with the researchers’ current dataset—will also be important, said Zhang. It’s a tough measurement to make, but because atmospheres are good at redistributing heat, he explained, even the side of TOI-561 b facing away from its star should be detectable. “The nightside should be warm.”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2026), A “lava world” unexpectedly hosts an atmosphere, Eos, 107, https://doi.org/10.1029/2026EO260019. Published on 7 January 2026. Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The shifting pattern of landslide risk in cities – an interesting case study from Medellín

Wed, 01/07/2026 - 07:38

A fascinating case study of the 24 June 2025 Granizal landslide in Medellín, Colombia, which killed 27 people and destroyed 50 homes, demonstrates that it is not just the urban poor who are exposed to landslides.

That urban areas can be subject to high levels of landslide risk is well established – commonly cited examples are Hong Kong (which has a huge programme to manage the risk), São Paulo and Medellín, amongst other places. The well-established pattern is that it is the urban poor who face the highest levels of risk, being forced to live on slopes on the margins of the conurbation, often with poor planning and low levels of maintenance of, for example, drainage systems.

A fascinating open access paper (Ozturk et al. 2025) has just been published in the journal Landslides that suggests that this pattern might be beginning to change under the impacts of climate change. The paper examines the 24 June 2025 Granizal landslide in Medellín, Colombia, which killed 27 people and destroyed 50 houses. I wrote about this landslide at the time, including this image of the upper part of the landslide:-

The main body of the 24 June 2025 landslide at Granizal in Colombia. Still from a video posted to YouTube by Cubrinet.

The location of the headscarp of the Granizal landslide is [6.29587, -75.52722].

The analysis of Ozturk et al. (2025) shows that this was a 75,000 cubic metre failure with a source area length of 143 m and a width of 50 m. The landslide was triggered by rainfall over a 36-hour period.

The authors’ analysis suggests that the landslide occurred on terrain that is steep even by the standards of Medellín, and at a comparatively high elevation for the city. They then looked at the distribution of utility tax bands (known as Strata) across the city according to both elevation and slope angle:-

Hillslope angle (a) and elevation (b) of the built-up area in Medellín, categorized by utility tax band, known as Strata, which reflects the socio-economic status of different neighbourhoods; the utility tax decreases in the lower categories. Hillslope angle generally increases towards the poorer categories. Figure from Ozturk et al. (2025), with the caption lightly edited.

The diagram shows that in Medellín, the poorest people live on the steepest slopes, and thus (to first order) are most at risk of landslides. People with higher incomes tend to live in areas with lower slope angles – the more affluent you are, the lower your landslide risk. However, this pattern reverses in the highest tax band (i.e. the richest), whose members live on steeper slopes (although not as steep as those occupied by the poorest people).

A similar pattern emerges for elevation, although it is weaker. Compare utility tax categories 5 and 6, for example – the richest people migrate to higher elevations.

This probably represents a desire by the most affluent to live in locations with the best views and in which they can have larger plots of land. A similar pattern is seen elsewhere – for example, property prices in The Peak in Hong Kong are very high.

It has been possible to live in these higher-risk locations because of good hazard identification for those who can afford it, the use of engineering approaches to mitigate the hazard, and good maintenance of drains. These options are available to those with money, who live in “formal neighbourhoods” rather than unplanned communities. Of course, as Ozturk et al. (2025) remind us, the vulnerability of these communities is still much lower than that of the poor.

But Ozturk et al. (2025) make a really important point:-

“…we should not forget that climate change is gradually intensifying and may soon render the design criteria used for planning formal neighbourhoods obsolete. Hence, our concluding message is that future rainfall changes may also lead to catastrophic landslide impacts in formally planned urban neighbourhoods, challenging the assumption that only informal settlements are at high risk.”

The vulnerability of the poorest communities means that this is where the highest risk will continue to be located, and this is where the greatest levels of loss will occur. But our rapidly changing environment means that even more affluent communities are facing increasing levels of risk.

Reference

Ozturk, U., Braun, A., Gómez-Zapata, J.C. et al. 2025. Urban poor are the most endangered by socio-natural hazards, but not exclusively: the 2025 Granizal Landslide case. Landslides. https://doi.org/10.1007/s10346-025-02680-y

Return to The Landslide Blog homepage Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

What Could Happen to the Ocean’s Carbon If AMOC Collapses

Tue, 01/06/2026 - 14:13
Source: Global Biogeochemical Cycles

The Atlantic Meridional Overturning Circulation (AMOC) is the system of currents responsible for shuttling warm water northward and colder, denser water to the south. This “conveyor belt” process helps redistribute heat, nutrients, and carbon around the planet.

During the last ice age, which lasted from about 120,000 to 11,500 years ago, millennial-scale disruptions to AMOC correlated with shifts in temperature, atmospheric carbon dioxide (CO2), and carbon cycling in the ocean—as well as changes in the signatures of carbon isotopes in both the atmosphere and the ocean. At the end of the last ice age, a mass melting of glaciers caused an influx of cold meltwater to flood the northern Atlantic, which may have caused AMOC to weaken or collapse entirely.

Today, as the climate warms, AMOC may be weakening again. However, the links between AMOC, carbon levels, and isotopic variations are not yet well understood. New modeling efforts in a pair of studies, one by Schmittner and the other by Schmittner and Boling, simulate an AMOC collapse to learn how ocean carbon storage, isotopic signatures, and carbon cycling could change during this process.

Both studies used the Oregon State University version of the University of Victoria climate model (OSU-UVic) to simulate carbon sources and transformations in the ocean and atmosphere under glacial and preindustrial states. Then, the researchers applied a new method to the simulation that breaks down the results more precisely. It separates dissolved inorganic carbon isotopes into preformed versus regenerated components. In addition, it distinguishes isotopic changes that come from physical sources, such as ocean circulation and temperature, from those stemming from biological sources, such as plankton photosynthesis.
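
In schematic form, that decomposition is a mass balance; the symbols below are generic labels chosen for illustration, not the studies’ exact notation:

\mathrm{DIC} = C_{\mathrm{pre}} + C_{\mathrm{reg}}, \qquad \delta^{13}\mathrm{C}_{\mathrm{DIC}} = \frac{C_{\mathrm{pre}}\,\delta^{13}\mathrm{C}_{\mathrm{pre}} + C_{\mathrm{reg}}\,\delta^{13}\mathrm{C}_{\mathrm{reg}}}{\mathrm{DIC}}

Here the preformed component (C_pre) is set at the sea surface and carried into the ocean interior by circulation, reflecting physical controls, while the regenerated component (C_reg) is added at depth by the remineralization of sinking organic matter, reflecting biological controls.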

Results from both model simulations suggest that an AMOC collapse would redistribute carbon throughout the oceans, as well as in the atmosphere and on land.

In the first study, for the first several hundred years of the model simulation, atmospheric carbon isotopes increased. Around year 500, they dropped sharply, with ocean processes driving the initial rise and land carbon controlling the decline. The decline is especially prominent in the North Atlantic in both glacial and preindustrial scenarios and is driven by remineralized organic matter and preformed carbon isotopes. In the Pacific, Indian, and Southern Oceans, there was a small increase in carbon isotopes.

In the second study, model output showed dissolved inorganic carbon increasing and then decreasing, driving inverse changes in atmospheric CO2. In the first thousand years of the model simulation, the increase in dissolved inorganic carbon can be partially explained by the accumulation of respired carbon in the Atlantic. The subsequent drop, lasting until year 4,000, is primarily driven by a decrease in preformed carbon in other ocean basins. (Global Biogeochemical Cycles, https://doi.org/10.1029/2025GB008527 and https://doi.org/10.1029/2025GB008526, 2025).

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2026), What could happen to the ocean’s carbon if AMOC collapses, Eos, 107, https://doi.org/10.1029/2026EO260016. Published on 6 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Science Escapes Largest Cuts in Latest Budget Bills

Mon, 01/05/2026 - 22:52
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

Today, top appropriators in the U.S. Senate and House of Representatives released a three-bill appropriations package for fiscal year 2026 (FY26) that largely rejects drastic cuts to federal science budgets that President Trump proposed last year. The “minibus” package, negotiated and agreed upon by both political parties, outlines a budget that preserves most, but not all, funding for key science programs related to space, weather, climate, energy, and the environment across multiple agencies.

“This is a fiscally responsible package that restrains spending while providing essential federal investments that will improve water infrastructure in our country, enhance our nation’s energy and national security, and spur scientific research necessary to maintain U.S. competitiveness,” Susan Collins (R–ME), chair of the Senate Appropriations Committee, said in a statement.

In May 2025, President Trump released a budget request to Congress that proposed slashing billions of dollars in federal science funding. However, during the many rounds of meetings throughout the year, appropriators in both chambers and on both sides of the aisle seemed disinclined to follow the proposed budget, including when it came to funding for climate research, clean energy initiatives, environmental protections, and other topics that run counter to administration priorities.

 

This new three-bill package follows suit in rejecting many of the president’s more drastic cuts to science programs.

“This package rejects President Trump’s push to let our competitors do laps around us by slashing federal funding for scientific research by upwards of 50% and killing thousands of good jobs in the process,” Vice Chair Senator Patty Murray (D–WA) said in a statement. “It protects essential funding for our public lands, rejects steep proposed cuts to public safety grants that keep our communities safe, and boosts funding for key flood mitigation projects.”

Here’s how some Earth and space science agencies fare in this package:

  • Department of Energy (DOE) Non-Defense: $16.78 billion, including $8.4 billion for its Office of Science, $3.1 billion for energy efficiency and renewable energy programs, and $190 million for protecting the nation’s energy grids.
  • Environmental Protection Agency (EPA): $8.82 billion, preserving funding to state-level programs that protect access to clean water, drinking water, and air. The bill also retains funding for the Energy Star energy efficiency labelling program and increases funding to state and Tribal assistance grant programs.
  • NASA: $24.44 billion, including $7.25 billion for its Science Mission Directorate, which would have seen a 47% decrease under the president’s budget request. The bills maintain funding for 55 missions that would have been cut, as well as for STEM engagement efforts and Earth science research that similarly would have been cut. The package also increases spending for human exploration.
  • National Institute of Standards and Technology (NIST): $1.847 billion, including funds to advance research into carbon dioxide removal.
  • National Oceanic and Atmospheric Administration (NOAA): $6.171 billion, including $1.46 billion to the National Weather Service to improve forecasting abilities and boost staffing. The budget also earmarks funds to preserve weather and climate satellites, and maintain climate and coastal research.
  • National Park Service (NPS): $3.27 billion, with enough money to sustain FY24 staffing levels at national parks.
  • National Science Foundation (NSF): $8.75 billion, including $7.18 billion for research-related activities. That would support nearly 10,000 new awards and more than 250,000 scientists, technicians, teachers, and students.
  • U.S. Forest Service (USFS): $6.13 billion, with just under half of that put toward wildfire prevention and management. Funded programs not related to wildfire prevention include forest restoration, forest health management, hazardous fuels reduction, and repurposing unnecessary roads as trails.
  • U.S. Geological Survey (USGS): $1.42 billion, including money to maintain active satellites and topographical mapping programs.

This is the latest, but not the last, step in finalizing science funding for FY26. The bills now head out of committee to be voted upon by the full chambers of the Senate and House, reconciled between chambers, and then signed by the president.

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

After Sackett, a Wisconsin-Sized Wetland Area Is Vulnerable 

Mon, 01/05/2026 - 15:18

Three hundred years ago, the central United States was a land of wetlands—more than 150 million hectares of them. All that water made the region highly attractive to farmers, who, over time, converted most of it into agricultural land.

For the wetlands that remain, protections secured by the Clean Water Act are often the only thing preventing wetland conversion or development, especially when state protections are weak, said Kylie Wadkowski, a landscape ecohydrologist and doctoral candidate at Stanford University. 

With wetlands, “you actually can’t do whatever you want,” said Elliott White Jr., a coastal socioecosystem scientist at Stanford University. “That’s how this Sackett case came about.”

The Supreme Court’s 2023 Sackett v. EPA decision ruled in favor of two landowners backfilling a lot containing wetlands. The decision narrowed the definition of the term “waters of the United States”—which is used in the Clean Water Act—to exclude wetlands without continuous surface connections to larger, navigable bodies of water. In November 2025, the Trump administration’s EPA proposed new rules for water regulation that may be even looser than the updated Sackett definition.

According to research by Wadkowski and White presented on 15 December 2025 at AGU’s Annual Meeting in New Orleans, the changing definition will leave millions of hectares of wetlands unprotected and more vulnerable to development. 

Wadkowski and White are the first to analyze wetland protections in detail on a nationwide scale, said Adam Ward, a hydrologist at Oregon State University who was not involved in the research. “This represents a huge advance in understanding what is being protected and what is losing protections,” he said.

What Will Happen to Wetlands?

Wadkowski and White found that under Sackett, 16.4 million hectares of wetlands, an area about the size of Wisconsin, are either unprotected or have undetermined status. Under the EPA’s newest proposed rule, that number could increase; the proposed rule contains many subjective and ill-defined terms that could be interpreted by regulators to mean even more wetlands lose protections, Ward said.

The approach the researchers took—using the available wetland, stream, and land conversion data with spatial modeling—was “incredibly logical,” Ward said. “They’re using machine learning tools, using the information we have to try and gap-fill and create the most comprehensive analysis that they can, and that’s a huge step in the right direction.”
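
As a purely hypothetical illustration of that kind of machine learning gap-filling (not the researchers’ actual model, features, or data), the sketch below trains a classifier on wetlands whose protection status is known and uses it to predict the status of wetlands where the status is undetermined. The feature names and all values are placeholders.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic placeholder attributes for wetlands with known status:
# distance to the nearest navigable water (scaled) and fraction of the
# year with a surface connection. These stand in for real covariates.
X_known = rng.random((500, 2))
y_known = (X_known[:, 0] < 0.3).astype(int)  # 1 = protected, 0 = not

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_known, y_known)

# Predict status for wetlands whose protection is undetermined.
X_undetermined = rng.random((100, 2))
predicted_status = model.predict(X_undetermined)
print(predicted_status[:10])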

Rates of vulnerability were not consistent across the United States. A breakdown of protections based on land management categories showed that 43.5% of wetlands on lands managed by tribes were protected under Sackett, compared with the national average of 66%.

Wetlands in the Great Plains states of North Dakota, South Dakota, Nebraska, Montana, and Minnesota were the least protected. This area of the country is often called the “prairie pothole” region because many of its wetlands are depressions in the landscape fed by groundwater and disconnected from larger surface water bodies. Under Sackett, these geographically isolated wetlands rely entirely on state-level protections, which are also often weaker in agricultural regions, Wadkowski said.

“The economic pressure and agricultural [conversion] happens a lot more in the Plains states,” she said. “And those are also the states that have less state level protections.”

With a rule that “emphasizes overland [surface] flow and connection to streams and rivers, it shouldn’t surprise us at all that it excludes wetlands that aren’t wet because of overland [surface] flow,” Ward said.

Wadkowski plans to continue evaluating how various legal frameworks might affect wetland conversion rates in the future by comparing the team’s estimates of protected wetlands under Sackett and the new EPA proposal with past data on wetland conversion rates under previous definitions of “waters of the United States.”

Informing Policy

To best protect wetlands, policymakers should ensure their policies line up with the available science, White said. 

Part of that strategy includes acknowledging that wetlands that are not connected to larger bodies of water year-round via surface water and therefore may not be protected under the Sackett decision may still be connected to broader water systems through groundwater, Wadkowski said. “Think about water bodies as not just on the surface.”

“Policymakers need to more thoughtfully engage with the scientific community for a more clear understanding of what a wetland is and what wetlands actually need.”

The Sackett decision and the new EPA proposal do not reflect the scientific consensus, White said. “Policymakers need to more thoughtfully engage with the scientific community for a more clear understanding of what a wetland is and what wetlands actually need.” Scientists, too, need to better engage with policymakers, he added.

For example, said Ward, part of the reason that wetland rule frameworks are so contentious is that none have yet been informed by enough clear, comprehensive science to make enforcement efficient or practical. “We have heaps of scientific understanding, but the scientific community writ large has not been invited to formally weigh in on how to design a rule that reflects our understanding,” he wrote in an email.

In a presentation on 15 December at AGU’s Annual Meeting, Ward made the case for a new, large-scale U.S. headwater stream monitoring network, which would help reduce some of the uncertainty inherent in wetland regulations. “If you don’t understand the stream network, you can’t possibly understand the wetland protections,” he said.

Scientific engagement, however, has been made more difficult by the courts, according to Ward: Within the past 5 years, the Supreme Court has begun to invoke the major questions doctrine, which reserves major rulemaking on matters of environmental regulation to Congress, giving agencies like the EPA less incentive to seek input from scientists.

“In parallel with our advances in understanding [wetland science] is a court system that is essentially cutting scientists out of the loop,” Ward said.

The public comment period for the EPA’s newest proposed rule closes on 5 January.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Citation: van Deelen, G. (2026), After Sackett, a Wisconsin-sized wetland area is vulnerable, Eos, 107, https://doi.org/10.1029/2026EO260018. Published on 5 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

How a Move to the Shallows 300,000 Years Ago Drove a Phytoplankton Bloom

Mon, 01/05/2026 - 14:08
Source: AGU Advances

Single-celled algae in the ocean known as coccolithophores play an important role in the marine carbon cycle when they take up bicarbonate from seawater to build their shells. Coccolithophore numbers have been increasing globally in recent years, meaning their influence is growing, even as scientists don’t fully understand the factors driving their explosive growth. One explanation could be changes to the alkalinity of ocean water, specifically greater amounts of bicarbonate available for the tiny creatures to use.

To learn more about how coccolithophores grow and flourish, Zhang et al. looked to the last time the phytoplankton surged in number, between 300,000 and 500,000 years ago. Using fossilized coccolithophore morphology and carbon isotope ratios, the authors constructed models that allowed them to pick apart the ingredients for coccolithophore success.

Comparisons of inorganic to organic carbon ratios in the shells, as well as comparisons of photosynthesis and calcification rates revealed by carbon isotope ratios, showed a large increase in calcification linked to greater bicarbonate uptake. Though increasing alkalinity was likely a factor in the coccolithophores’ increased growth, it doesn’t explain all of it, the authors say. Instead, greater nutrient availability allowed coccolithophore populations to swell, both by giving them more food to use and by allowing them to move to shallower depths where there was more sunlight for photosynthesis.

The findings have implications for the present day, as we see marine phytoplankton numbers shifting alongside changes in ocean chemistry. Previous work focused on changes in seawater alkalinity and pH. But more information on how nutrient availability influences coccolithophore growth is needed, the authors conclude, especially in light of proposed geoengineering schemes that could shift the types of nutrients available. (AGU Advances, https://doi.org/10.1029/2024AV001609, 2025)

—Nathaniel Scharping (@nathanielscharp), Science Writer

Citation: Scharping, N. (2026), How a move to the shallows 300,000 years ago drove a phytoplankton bloom, Eos, 107, https://doi.org/10.1029/2026EO260010. Published on 5 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Marine Heat Waves Slow the Flow of Ocean Carbon

Mon, 01/05/2026 - 14:07

This is an authorized translation of an Eos article.

Marine heat waves are instances of unusually warm water that can sit at the ocean surface for months. Like the heat waves we experience on land, marine heat waves can alter environmental chemistry and stall biological processes. While catastrophic losses of megafauna are obvious indicators of a stressed system, researchers have only begun to gather enough data to understand how the microbial organisms at the base of ocean food webs are responding to heat waves.

A new study published in Nature Communications presents a decade of measurements documenting two successive heat waves in the northeastern Pacific Ocean. The paper’s interdisciplinary team of authors used a combination of an autonomous robotic float, an oceanographic cruise, and satellite data to understand how the region’s microbial communities reorganized in response to these extreme events.

The researchers found that the production of organic matter at the ocean surface increased during the heat waves, but the carbon-rich particles neither sank nor floated away; instead, they stayed put.

The Biological Carbon Pump

Phytoplankton, tiny photosynthesizing microbes, power the biological carbon pump. By using sunlight and carbon dioxide (CO2) to grow, phytoplankton pull carbon out of the atmosphere and feed it into the ocean’s carbon cycle. Zooplankton graze on vast fields of these plantlike organisms, carrying carbon deeper into the water column in the form of fecal pellets and half-eaten scraps of plankton. Eventually, some of these particles sink deep enough to feed deep-sea ecosystems.

“The ocean’s capacity to capture carbon depends on the microbes at the base of the food web.”

This carbon pump provides a globally significant buffer against the impacts of climate change, as the ocean absorbs roughly a quarter of the CO2 emitted by human activity. Some estimates suggest that current atmospheric CO2 concentrations could be as much as 50% higher if the biological carbon pump stopped transporting carbon to the deep ocean.

“The ocean’s capacity to capture carbon depends on the microbes at the base of the food web, so it is really important that we start to understand the impacts of marine heat waves on microbial communities,” explained Mariana Bif, lead author of the new study. Bif is an assistant professor at the University of Miami and was previously a researcher at the Monterey Bay Aquarium Research Institute (MBARI).

When the Food Web Gets Tangled

During both marine heat waves tracked in the study, the researchers found that the biological carbon pump showed signs of overheating. Carbon-rich particles stalled roughly 200 meters (660 feet) below the surface, but different mechanisms caused this buildup during the two heat waves.

The first heat wave included in the study began in 2013, when unusually weak winds over the Pacific failed to push warm summer air back toward the continental United States. The heat wave, nicknamed “the Blob,” made headlines as its warm, stagnant, oxygen-poor waters triggered mass die-offs of wildlife across the Pacific before dissipating in 2015.

In 2019, patchy cloud cover over the ocean set the stage for another heat wave to sweep across the northeastern Pacific. This second heat wave drove temperatures up once again and came to be known as “the Blob 2.0.”

Bif and her coauthors found that during both heat waves, the marine microbial community underwent a shakeup in its “middle management.”

Within the first years of the Blob, physical and chemical conditions favored smaller phytoplankton species, which in turn favored a new group of zooplankton grazers. This scaled-down food web eventually created an ocean layer full of organic particles that were too light to sink into the denser waters below.

During the Blob 2.0, concentrations of particulate organic matter were even higher, but the increase did not come entirely from primary production. This time, conditions favored frugal species. Opportunists able to feed on detritus and lower-quality organic matter became more dominant, a sign that the system was cycling and recycling carbon to keep it in the upper water column. Within this community, parasites thrived, and organisms (including a group of radiolarians) never before observed in the northeastern Pacific began appearing regularly.

Measuring in the Middle of Nowhere

The range of technology used in the study sets it apart from previous efforts to catalog the effects of marine heat waves.

“We are now entering a ‘big data’ era in ocean biogeochemistry, whereas before we were limited to what we could collect from ships.”

“We are now entering a ‘big data’ era in ocean biogeochemistry, whereas before we were limited to what we could collect from ships,” said Stephanie Henson, a principal scientist at the National Oceanography Centre in Southampton, United Kingdom. Henson was not involved in the study.

Henson explained that autonomous floats and other advanced monitoring systems are allowing researchers to work with datasets that extend well beyond the duration of a single oceanographic cruise.

“People have been studying responses to marine heat waves in systems such as coral reefs, etcetera,” Henson said, explaining that researchers have observed that biological responses are not the same from one marine heat wave to the next. She noted, however, that this study was the first to show that ocean carbon fluxes also respond to marine heat waves in complex ways.

To check the Pacific’s vital signs before, during, and after each heat wave, the researchers turned to the Global Ocean Biogeochemistry Array (GO-BGC). GO-BGC instruments are a subset of Argo, a global network of thousands of autonomous robotic floats. Each float drifts freely with ocean currents, monitoring pH, salinity, temperature, and other parameters.

Mariana Bif prepares to deploy a GO-BGC float in the Bay of Bengal. The float will drift freely with ocean currents at a depth of roughly 1,000 to 2,000 meters, returning to the surface every 10 days to send data on ocean temperature, salinity, and chemistry to researchers on land via satellite. (The Indian Ocean was not part of the new study, but Bif used GO-BGC floats in the Pacific to conduct the research.) Credit: Sudheesh Keloth, July 2025

For all they can do, however, the floats cannot collect microbial samples. For that, rather than Bif going looking for the data, the data came to Bif.

Steven Hallam, a microbiologist at the University of British Columbia and a coauthor of the new study, contacted Bif after reading an interview about her work on marine heat waves. He had a hunch that the plankton DNA samples stored in his laboratory’s refrigerator could help Bif’s research on the ocean carbon cycle. Scientists in Hallam’s lab group had previously published research on bacterial communities in the same region, using samples collected during oceanographic cruises along the Line P transect off the coast of British Columbia.

After an exchange of emails, Hallam’s lab group analyzed the samples, expanding the analysis from bacteria to the composition of the entire community, which became a significant contribution to Bif’s study.

While the story of how the plankton DNA came to Bif is a testament to the power of communication and collaboration in science, Henson noted that the Line P transects do not necessarily overlap spatially with the regions hit hardest by marine heat waves, and that combining datasets of different scales (such as ship-based data and data from autonomous floats) should be done cautiously.

Even so, Henson added, “It’s the best we can do for the moment.”

Lingering Uncertainties

As for future research, Bif is involved in some new projects exploring deoxygenated ocean regions, but she said, “My focus is always on the BGC-Argo floats.”

Bif said it will be interesting to watch the BGC-Argo data from floats sitting in the middle of the marine heat wave currently affecting the North Pacific. That heat wave is already showing signs of slowing, although scientists say it will likely persist through the winter.

“I’m not sure whether this is going to have the reach that some of the region’s earlier marine heat waves did,” said Nick Bond, who was not involved in the research but studied marine heat waves in his previous role as Washington’s state climatologist. He is now a senior researcher at the University of Washington.

“What we don’t measure, we can’t understand. We need more investment in monitoring the ocean.”

Bond added that while there is “tentative evidence” that climate warming may be increasing the frequency of marine heat waves in the Pacific, there is still much more to learn before scientists can accurately forecast how these events will behave in the future.

Meanwhile, another looming unknown for this field of research is unfolding back on dry land.

“There’s a bit of uncertainty in the community because at the moment, the United States contributes roughly half of the floats deployed for the global Argo program,” said Henson, attributing her concern to recent budget cuts in the United States. She explained, however, that other countries are stepping up their contributions to keep the Argo program afloat.

“What we don’t measure, we can’t understand. We need more investment in monitoring the ocean,” Bif said.

—Mack Baysinger (@mack-baysinger.bsky.social), Science Writer

This translation by Cecilia Ormaza was made possible by a partnership with Planeteando and GeoLatinas.

Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Guest post: Photos and Preliminary Observations from an Overview Flight of the 6 December 2025 Hubbard Glacier Earthquake, Yukon Territory, Canada

Mon, 01/05/2026 - 07:34

Yukon Geological Survey

Contributors: Derek Cronmiller, Theron Finley, Panya Lipovsky, Jan Dettmer

A guest post featuring images of, and commentary on, landslides in Yukon Territory, Canada, triggered by the 6 December 2025 Mw=7.0 Hubbard Glacier Earthquake.

The 6 December 2025 Mw=7.0 Hubbard Glacier Earthquake in the St. Elias Mountains of Yukon caused widespread mass wasting in an area near Mt. Logan, Canada. On 12 December 2025, the Yukon Geological Survey completed an overview flight of the area to collect photographs and document seismically induced activity in the region. Based on the preliminary USGS finite fault model, the earthquake rupture appears to have been shallow, with approximately 2 m of slip occurring at ~6 km depth, but no evidence of surface rupture was identified. However, we documented extensive surface effects, including more than 200 landslides, many snow and ice avalanches, and widespread damage to glaciers throughout the area.

View southwest towards Mt King George. Snow avalanche and serac collapse scars are visible on the ridge in the foreground; large rock avalanche scars triggered by the 6 December 2025 Hubbard Glacier Earthquake are visible on the NE face of Mt King George in the background.

Landslide activity was concentrated on the Mt King George massif, where rock–ice avalanches and rockfall were the most common failure types. Based on preliminary earthquake relocations, the King George massif directly overlies a portion of the fault rupture and rises to a height of 3,741 m, approximately 1,900 m above the surrounding Hubbard Glacier.

Landslide scars and debris on the NW end of the King George massif, looking toward Mt Logan (5,959 m). Note the lack of snow cover here due to concentrated avalanche and landslide activity, as compared to distant peaks.

Landslide activity continued for several days after the main earthquake, likely due to a combination of aftershocks and progressive failure of slopes that were damaged by the earthquake or destabilized by earlier landsliding. At least one rock avalanche occurred between 11 and 12 December as constrained by Landsat imagery.  At the time of the overview flight, slide scars on the east and northeast sides of Mt King George remained active, with ongoing rockslides and rockfall producing large dust clouds. The large slide scar on the east face of Mt King George appeared to have liquid water flowing down its centre, suggesting either discontinuous permafrost within the massif or significant heating associated with slope failure.

A large landslide scar on the east face of Mt King George appeared to have liquid water running out from approximately halfway down the scar (arrow). This scar was producing active rockfall at the time of the overview flight and filling adjacent valleys with dust.

The largest observed landslide was a rock and ice avalanche produced by a partial collapse of the southwest ridge of Mt King George. The basal failure surface followed a southwest-dipping planar discontinuity oriented subparallel to the pre-existing slope of the south flank of the ridge. The crown of the slide originated at approximately 3,000 m above sea level, and the debris descended roughly 1,300 m along a tributary glacier before coming to rest on the Hubbard Glacier, approximately 7.4 km from the source area. This corresponds to an overall travel angle of approximately 10 degrees. Such high mobility is typical of rock avalanches on glaciers (cf. Evans and Clague, 1988), where movement is enhanced by the low-friction surface of the glacier, entrainment of snow and ice, and water inputs generated through frictional melting (Sosio et al., 2012).
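
For readers who want to check the arithmetic, the travel angle follows directly from the fall height and runout length quoted above. A minimal sketch, using only the numbers in the text:

import math

# Fall height and runout length from the observations described above.
fall_height_m = 1300.0     # descent from the crown to the Hubbard Glacier
runout_length_m = 7400.0   # distance from the source area to the deposit

# The travel angle (fahrboeschung) is the angle of the line connecting
# the crown of the source to the toe of the deposit.
travel_angle_deg = math.degrees(math.atan(fall_height_m / runout_length_m))
print(round(travel_angle_deg, 1))  # ~10.0 degrees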

Source area and runout of the largest (9 square km) landslide triggered by the Mw=7.0 6 December 2025 Hubbard Glacier Earthquake, on the southwest ridge of Mt King George. The planar failure surface of the largest rock and ice avalanche is on the southern flank of the SW ridge of Mt King George.

Snow avalanches were common on all aspects and were triggered across a more extensive region than landslides. Some of the largest snow avalanches occurred on the north and east aspects of Mt King George and McArthur Peak (a sub-peak of Mt Logan) and produced plumes that in some cases extended 2-3 km across the glacier at the base of the slopes where they initiated. Damage to glaciers was also extensive; the collapse of snow bridges and seracs on icefalls was widespread on the Hubbard Glacier and adjacent tributaries. A partial collapse of a glacier occurred between Mt King George and Mt Queen Mary, an area immediately above many of the most intense aftershocks. This portion of the glacier is covered by debris from a rock avalanche off the NE face of Mt King George, which may have contributed to its failure. The most intensely affected area is a popular recreation zone of Kluane National Park, and the changes may significantly increase the objective hazards that skiers and mountaineers face in the region over the coming years.

Partial collapse of a glacier between Mt King George and Mt Queen Mary. The length of the failure is approximately 2.4 km. Widespread collapse of seracs and snow bridges on the south side of Mt Queen Mary. The field of view mid-photo is approximately 5 km across. Rock avalanche below Mt King George triggered by the 6 December 2025 Hubbard Glacier Earthquake. The runout distance is 1.4 km.

All photos provided courtesy of the Government of Yukon.

References:

Evans, S. G., & Clague, J. J. 1988. Catastrophic rock avalanches in glacial environments. In Proceedings of the Fifth International Symposium on Landslides, 2, pp. 1153–1158. Lausanne, Switzerland.

Sosio, R., Crosta, G.B., Chen, J.H. and Hungr, O. 2012. Modelling rock avalanche propagation onto glaciers. Quaternary Science Reviews, 47, 23–40.

Return to The Landslide Blog homepage Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

A landslide inventory that extends over a century in Alaska demonstrates that climate change is having a major impact

Fri, 01/02/2026 - 16:28

The Landslide Blog is written by Dave Petley, who is widely recognized as a world leader in the study and management of landslides.

Of course, allow me to start by wishing all my readers a Happy 2026. I suspect that we are in for quite a landslide journey again this year.

In late November, a very interesting open access paper (Darrow and Jacobs 2025) was published in the journal Landslides. This piece of work sought to understand the patterns of landslides in Alaska over more than a century through the creation of a database compiled from “a combination of 24 digital newspapers and online media sources, including historic digitised Alaskan newspapers”. Such a study involves an epic amount of work, but it yields fantastic data. This study is no exception.

What is of particular interest here is that Alaska is exposed to a range of landslide hazards, and suffers significant losses from them, in an environment in which climate change is clearly occurring, with warming at a rate that is higher than the global average. Previous studies have shown that this is having a measurable impact on landslides in the mountains of Alaska.

In total, Darrow and Jacobs (2025) identified 281 landslides in Alaska since 1883, with occurrence showing a strong seasonal pattern associated primarily with rainfall. The headline finding is summarised in this graphic from the paper:-

The recorded incidence of landslides in Alaska by decade, from Darrow and Jacobs (2025).

The data shows a dramatic increase in landslides in recent decades, and in particular in the last two decades or so. Of course, care is needed to ensure that this is not an artefact of the reporting of landslides, but Darrow and Jacobs (2025) explored this issue in detail, concluding that the signal is real. Fortunately, the number of fatalities caused by landslides in Alaska is small, and there is no significant trend in terms of fatal landslides.

So what lies behind this change? Darrow and Jacobs (2025) show that the increase in the occurrence of landslides in Alaska is associated with a marked increase in average annual air temperature, ranging between 1.2°C and 3.4°C, and an associated increase in precipitation, ranging from 3% to 27%, over the last 50 years.

Of course, warming is not going to stop in Alaska in the next few decades, so the likely direction of travel in terms of landslides there is clear. There is recognition in Alaska that greater attention will be needed on landslides.

But more widely, this is further quantitative evidence that climate change is having a big impact on landslide hazard. It is remarkable how the evidence just keeps accumulating.

Reference

Darrow, M.M. and Jacobs, A. 2025. Read all about it! A review of more than a century of Alaskan landslides as recorded in periodicals. Landslides. https://doi.org/10.1007/s10346-025-02663-z.

Return to The Landslide Blog homepage Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Marine Heat Waves Can Exacerbate Heat and Humidity over Land

Fri, 01/02/2026 - 14:52
Source: AGU Advances

In 2023, Earth experienced its warmest year since 1850, with heat waves stretching across oceans and land alike. East Asia, for example, experienced scorching temperatures and high humidity throughout the summer months. Humid-heat extremes like those seen that year can trigger heat-related illnesses and mortality at higher-than-average rates.

As on land, the ocean around East Asia also experienced unprecedented warming in 2023. Sea surface temperatures (SST) in the Kuroshio-Oyashio Extension region reached record highs, persisting through much of the year. Researchers know that marine heat waves can influence land heat waves, but the details of these connections remain unclear.

Okajima et al. modeled regional land-sea interactions to better understand the effects of the unprecedented 2023 marine heat wave on conditions on land in East Asia. The team focused on the peak hot and humid months of July, August, and September, using hourly data on atmospheric conditions, including temperature, humidity, wind velocity, and atmospheric pressure, as well as SST data from satellites and in situ sensors.

The modeling suggested that the 2023 marine heat wave greatly exacerbated the East Asian heat wave, particularly in Japan, by affecting atmospheric circulation and altering the usual radiative effects of clouds and water vapor. The team said the influence of the marine heat wave explains roughly 20% to 50% of the increase in the intensity and duration of hot and humid conditions observed on land in East Asia in summer 2023.

The scientists note that this research provides valuable insights that could help improve long-range weather predictions. Such predictions may help communities prepare for health risks, particularly in Asia, which the World Meteorological Organization reported earlier this year is warming twice as fast as the global average. (AGU Advances, https://doi.org/10.1029/2025AV001673, 2025)

—Sarah Derouin (@sarahderouin.com), Science Writer

Citation: Derouin, S. (2026), Marine heat waves can exacerbate heat and humidity over land, Eos, 107, https://doi.org/10.1029/2026EO260009. Published on 2 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
