Feed aggregator

Clues from the past reveal the West Antarctic Ice Sheet's vulnerability to warming

Phys.org: Earth science - Sat, 01/10/2026 - 17:20
The Thwaites and Pine Island glaciers, located in the Amundsen Sea sector of the West Antarctic Ice Sheet (WAIS), are among the fastest-melting glaciers on Earth. Together, they are losing ice more rapidly than any other part of Antarctica, raising serious concerns about the long-term stability of the ice sheet and its contribution to future sea-level rise.

Loss functions and constraints improve sea surface height prediction

Phys.org: Earth science - Sat, 01/10/2026 - 01:10
To understand currents, tides, and other ocean dynamics, scientists need to accurately capture sea surface height: a snapshot of the ocean's surface at any given moment, including the peaks and valleys caused by changes in wind, currents, and temperature. And to better forecast ocean circulation, climate variability, air-sea interactions, and extreme weather events, researchers need to accurately predict sea surface height into the future.

High-Resolution Spatiotemporal Monitoring of Secondary Microseisms via Multi-Array Analysis

Geophysical Journal International - Sat, 01/10/2026 - 00:00
Summary: This study presents a workflow for monitoring spatiotemporal variations of secondary microseisms using multi-array analysis. We employ ambient-noise cross-correlation beamforming (CC beamforming) across three dense seismic networks with different instrument responses: ANTICS in Albania (nodal geophone and broadband), Hi-net in Japan (short period), and SCSN in Southern California (broadband). Independent of their instrumentation, these networks enable us to track the spatial and temporal evolution of secondary microseism sources in the Northern Hemisphere from autumn 2022 to spring 2023. The workflow involves continuous data preprocessing for the different sensor types, ambient-noise cross-correlation, beamforming, and back-projection of beam power onto a global map. We also propose sliding-window raw-data beamforming (RA beamforming) of the continuous broadband data to record the absolute amplitudes of secondary microseisms registered by ANTICS. Joint CC beamforming analysis across the three networks improves the resolution of ambient-noise source localization and shows high consistency with the equivalent vertical force at the ocean floor. The results indicate that secondary microseism sources in the Northern Hemisphere are predominantly driven by winter storms in the northern Atlantic and northern Pacific. The relative and absolute amplitudes of the beam power for the northern Atlantic are extracted from CC beamforming based on ANTICS geophone sensors and from RA beamforming based on ANTICS broadband instruments, respectively. Both approaches provide robust estimates of microseism strength in the northern Atlantic, with CC beamforming showing a higher correlation with the modeled ocean-floor equivalent forces. This study confirms the feasibility of using cost-effective nodal seismic arrays for detailed monitoring of secondary microseisms and highlights the potential of integrating multi-array seismic data with oceanographic models for an improved understanding of seismic noise generation and propagation.
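The delay-and-sum idea at the heart of beamforming can be illustrated with a toy example. The sketch below is entirely synthetic and simplified (a single 1 Hz plane wave on a small linear array, not the study's CC or RA beamforming on real multi-network noise): for each trial slowness, the traces are time-shifted to undo the assumed propagation delay and stacked; beam power peaks when the trial slowness matches the true one.

```python
import math

def delay_and_sum_power(signals, positions, dt, slowness):
    """Mean beam power of a linear array for one trial slowness (s/km).
    signals: equal-length sample lists; positions: sensor offsets (km)."""
    shifts = [int(round(slowness * x / dt)) for x in positions]
    usable = len(signals[0]) - max(shifts)      # keep every sensor in range
    total = 0.0
    for t in range(usable):
        stack = sum(sig[t + sh] for sig, sh in zip(signals, shifts))
        total += (stack / len(signals)) ** 2
    return total / usable                       # normalize by samples used

# Synthetic 1 Hz plane wave crossing a five-sensor line at 0.25 s/km.
dt, true_slowness = 0.01, 0.25
positions = [0.0, 1.0, 2.0, 3.0, 4.0]
signals = [[math.sin(2 * math.pi * 1.0 * (i * dt - true_slowness * x))
            for i in range(400)] for x in positions]

# Grid search over trial slownesses: beam power peaks at the true slowness.
grid = [s / 100.0 for s in range(51)]           # 0.00 to 0.50 s/km
best = max(grid, key=lambda s: delay_and_sum_power(signals, positions, dt, s))
```

Back-projecting the winning slowness (and, on a 2-D array, back-azimuth) onto a map is what turns beam power into a source location.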

Moho topography beneath the northern Manila subduction using differential evolution algorithm

Geophysical Journal International - Sat, 01/10/2026 - 00:00
Summary: A Moho topography model of a subduction zone is an important component of deep tectonic studies and a key basis for testing geodynamic processes. The selection of inversion parameters is one of the main factors affecting the accuracy of Moho topography models inverted from gravity data, and it suffers from the effect of nonlinear terms, which must be reduced through constraints. We therefore applied a differential evolution algorithm to compute the inversion parameters and, on this basis, obtained a refined Moho topographic model of the northern Manila subduction zone. Synthetic tests show that the differential evolution algorithm effectively mitigates the impact of nonlinear terms. With or without noise, it finds better inversion parameters than the linear regression method. In particular, for the Moho density contrast, the average value obtained from multiple runs of the differential evolution algorithm still achieved a 54.4% improvement in accuracy. In practical application, comparisons show that the RMS difference between our model and all seismic control points is 2.37 km, an improvement of at least 35.1%, demonstrating that the method is reliable. Furthermore, we examined the impact of various parameters on the method to validate its robustness.
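Differential evolution itself is a simple population-based optimizer. The pure-Python sketch below shows the generic DE/rand/1/bin loop (mutation, crossover, greedy selection) on a made-up two-parameter misfit; the quadratic objective, bounds, and parameter names are illustrative stand-ins, not the study's gravity-inversion misfit.

```python
import random

def differential_evolution(objective, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=42):
    """Minimal DE/rand/1/bin minimizer. bounds: list of (lo, hi) per parameter."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize the population uniformly within the bounds.
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    costs = [objective(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct random members (rand/1).
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # Binomial crossover with one guaranteed mutant gene.
            jrand = rng.randrange(dim)
            trial = [mutant[d] if (rng.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            # Clip to bounds and select greedily.
            trial = [min(max(v, lo), hi) for v, (lo, hi) in zip(trial, bounds)]
            cost = objective(trial)
            if cost <= costs[i]:
                pop[i], costs[i] = trial, cost
    best = min(range(pop_size), key=costs.__getitem__)
    return pop[best], costs[best]

# Toy stand-in for an inversion misfit: recover (depth km, density contrast) = (30.0, 0.4).
misfit = lambda x: (x[0] - 30.0) ** 2 + 100.0 * (x[1] - 0.4) ** 2
params, cost = differential_evolution(misfit, bounds=[(10.0, 50.0), (0.1, 1.0)])
```

Because DE needs only function evaluations, not gradients, it is well suited to misfits dominated by nonlinear terms, which is why it is attractive for selecting gravity-inversion parameters.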

High-Resolution Lithospheric Vs Structure of the Ordos Block from Dense-Array Ambient Noise Tomography: Implications for Reactivation

Geophysical Journal International - Sat, 01/10/2026 - 00:00
Summary: The far-field impact of Tibetan Plateau (TP) expansion on cratonic blocks remains enigmatic. We address this for the Ordos Block (OB) by constructing a high-resolution 3-D shear-wave velocity model using ambient noise tomography from an unprecedented dense seismic array (461 stations). Our model reveals: (1) NE-trending high-velocity anomalies at 10–25 km depth correlating with crustal magnetic signatures, providing seismic evidence for late Archean amalgamation of micro-blocks (Jining, Ordos, Xuchang, Xuhuai); (2) TP-induced reactivation manifesting as southwestern OB crustal thickening (50 km) with a high-velocity lower-crustal layer (≥4.0 km/s; 100 km wide, 10 km thick), attributed to TP lower-crustal underthrusting beyond the plateau margin (35.5°–37.5°N), facilitating >200 km strain transfer into the OB interior; (3) Incipient rifting dynamics in the Daihai Rift, where upper-crustal high-Vs (preserved rigidity) overlies mid-lower crust/uppermost mantle low-Vs anomalies (mantle-sourced thermal modification), indicating early-stage rifting driven by combined Pacific plate retreat and TP far-field stresses; (4) Craton-wide segmentation across a fundamental 37.5°N lithospheric boundary demarcating mantle upwelling/crust-mantle interaction (north) from passive TP push-dominated deformation (south). These findings redefine the OB as a strain-partitioned system, where lithospheric heritage controls differential response to plate-boundary forces.

Machine-learning based earthquake detection and location around the Tanlu fault zone in eastern China

Geophysical Journal International - Sat, 01/10/2026 - 00:00
Summary: The Tanlu fault zone is a large, deep, NNE-SSW-oriented strike-slip fault system running through eastern China. To investigate seismotectonics in and around the Tanlu fault zone, we adopt the LOC-FLOW approach to build a high-precision earthquake catalog. Our seismic data were recorded from July 2019 to March 2023 at 120 broadband TanluArray temporary stations with a sampling rate of 40 Hz and at 76 broadband permanent stations with a sampling rate of 10 Hz. We first conduct a series of experiments around the Luxi uplift and find that a higher sampling rate and a denser array of stations significantly enhance earthquake detection ability. We then use both the temporary and permanent stations to detect and locate earthquakes across the entire study area. As a result, 9648 earthquakes are detected and located in the REAL catalog, 6543 in the HypoInverse catalog, and 5619 in the HypoDD catalog, increases of 20%, 22%, and 22%, respectively, over the cases in which only the TanluArray stations are used. Our location results show that earthquakes are mainly distributed along the Tanlu fault zone and along active faults in related tectonic units. We collect 322 focal mechanism solutions (M > 2.0) from previous studies covering 2000 to 2020 to invert for the stress field of the whole study region. The results show that the maximum principal stress axis across the study area is oriented NEE-SWW, except in the Huoshan region, where it is oriented E-W. Along the Tanlu fault zone, seismicity is highest in the Suqian-Weifang segment, and the Suqian seismic gap may be due to aseismic slip along the fault planes of the Tanlu fault zone.

Reciprocity-aware PINN-based Seismic Traveltime Tomography and Uncertainty Quantification for Models with Irregular Topography

Geophysical Journal International - Sat, 01/10/2026 - 00:00
Summary: In recent years, physics-informed neural networks (PINNs) have emerged as a powerful tool for seismic traveltime modeling and tomography. However, conventional PINNs neither incorporate applicable physical priors nor quantify the uncertainty of the inverse problem, both of which are critical for reliable geological interpretation where topography is complex. We therefore propose a comprehensive PINN-based framework designed to tackle the challenge of inverting for velocity in models with irregular topography while also quantifying the inherent uncertainty. Leveraging automatic differentiation, our mesh-free approach directly accommodates complex surfaces without the need for specialized grids. To enhance inversion accuracy and physical consistency, we incorporate additional physical priors, namely a well-log velocity profile and the principle of reciprocity. Furthermore, to address the non-uniqueness of the inverse problem, we integrate Monte Carlo (MC) dropout to efficiently quantify model uncertainty without architectural modifications. Through 2D and 3D experiments on synthetic and real-world geological models, we demonstrate that our method accurately inverts for velocity structures beneath highly irregular topography. Results show that the inclusion of physical priors significantly improves model performance, while uncertainty quantification via MC dropout successfully highlights regions of higher uncertainty in the inverted velocity field, aligning with geologically complex areas. This work establishes a robust and practical methodology for accurate and reliable seismic tomography in challenging geological settings.
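The MC dropout idea is easy to illustrate outside a PINN. In the toy sketch below (the "network" is just a weighted sum with invented weights, not the paper's model), dropout stays active at prediction time; the mean over many stochastic passes is the prediction and the spread serves as an uncertainty proxy.

```python
import random
import statistics

def mc_dropout_predict(forward, n_passes, rng):
    """Run a dropout-enabled forward pass many times; return the mean
    prediction and the standard deviation as an uncertainty proxy."""
    samples = [forward(rng) for _ in range(n_passes)]
    return statistics.mean(samples), statistics.stdev(samples)

# Toy 'network': a weighted sum in which each weight is dropped with
# probability p and survivors are rescaled by 1/(1-p) (inverted dropout).
weights = [0.2, -0.5, 0.9, 0.4]
inputs = [1.0, 2.0, 0.5, 1.5]
p = 0.2

def forward(rng):
    return sum(w * xi / (1 - p)
               for w, xi in zip(weights, inputs) if rng.random() >= p)

mean, sigma = mc_dropout_predict(forward, n_passes=500, rng=random.Random(1))
# mean stays close to the deterministic output sum(w*x) = 0.25,
# while sigma > 0 flags how sensitive the prediction is to dropped weights.
```

The appeal noted in the abstract is visible even here: uncertainty comes from repeated inference alone, with no change to the network architecture.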

Important new source of oxidation in the atmosphere found

Phys.org: Earth science - Fri, 01/09/2026 - 19:00
Hydroperoxides are strong oxidants that have a significant influence on chemical processes in the atmosphere. Now, an international research team involving the Leibniz Institute for Tropospheric Research (TROPOS) has shown that these substances also form from α‑keto acids such as pyruvic acid in clouds, rain and aerosol water when exposed to sunlight.

Central China Water Towers Provide Stable Water Resources Under Change

EOS - Fri, 01/09/2026 - 15:24
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

The mountains ringing the Pacific Rim, stretching from the Andes to the Rockies, the Himalayas, and beyond, act as natural “water towers.” They host huge reserves of water stored in snowpack, glaciers, lakes, and soils, which feed rivers and supply freshwater to billions of people downstream.

Yue et al. [2026] assess how climate change affects freshwater supply from water towers by analyzing a new dendrochronological network of 100 tree-ring sampling sites. They first reconstruct Central China Water Tower (CCWT) runoff back to 1595. Then, drawing on projections from climate models, the authors reveal increasing runoff across most Pacific Rim water towers, whereas water resources from the Northern Rocky Mountains are projected to decline substantially. These differences are attributed to distinct geographies and synoptic climatic conditions. The findings provide insights for adaptive management strategies in China.

Citation: Yue, W., Torbenson, M. C. A., Chen, F., Reinig, F., Esper, J., Martinez del Castillo, E., et al. (2026). Runoff reconstructions and future projections indicate highly variable water supply from Pacific Rim water towers. AGU Advances, 7, e2025AV002053.  https://doi.org/10.1029/2025AV002053

—Alberto Montanari, Editor-in-Chief, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

In 2025, the Ocean Stored a Record-Breaking Amount of Heat, Again

EOS - Fri, 01/09/2026 - 14:23
Research & Developments is a blog for brief updates that provide context for the flurry of news that impacts science and scientists today.

The ocean soaked up more heat last year than in any year since modern measurements began around 1960, according to a new analysis published in Advances in Atmospheric Science.

The world’s oceans absorb more than 90% of excess heat trapped in Earth’s atmosphere by greenhouse gas emissions. As heat in the atmosphere accumulates, heat stored in the ocean increases, too, making ocean heat a reliable indicator of long-term climate change. 

Ocean temperatures influence the frequency and intensity of marine heatwaves, change atmospheric circulation, and govern global precipitation patterns. 

Scientists measure the ocean’s heat in different ways. One common metric is global annual mean sea surface temperature, the average temperature in the top few meters of ocean waters. Global sea surface temperature in 2025 was the third warmest ever recorded, at about 0.5°C (0.9°F) above the 1981-2010 average.

Another metric is ocean heat content, which measures the total heat energy stored in the world’s oceans. It’s measured in zettajoules: One zettajoule is equivalent to 1,000,000,000,000,000,000,000 joules. To measure heat content in 2025, the study’s authors assessed ocean observational data from the upper 2,000 meters of the ocean, where most of the heat is absorbed, from NOAA’s National Centers for Environmental Information, the European Union’s Copernicus Climate Change Service, and the Chinese Academy of Sciences. 

They found that in total, the ocean absorbed an additional 23 zettajoules of heat energy in 2025, breaking the ocean heat content record for the ninth consecutive year and marking the longest sequence of consecutive ocean heat content records ever recorded.

“Last year was a bonkers, crazy warming year,” John Abraham, a mechanical engineer at the University of St. Thomas and a coauthor of the new study, told Wired.

Twenty-three zettajoules in one year is equivalent to the energy of 12 Hiroshima bombs exploding in the ocean every second. It’s also a large increase over the 16 zettajoules of heat the ocean absorbed in 2024. The hottest areas of the ocean observed in 2025 were the tropical and South Atlantic, Mediterranean Sea, North Indian Ocean, and Southern Ocean. 
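The Hiroshima comparison can be checked with back-of-envelope arithmetic, assuming the commonly cited yield of roughly 15 kilotons of TNT for the Hiroshima bomb:

```python
# Back-of-envelope check of the zettajoule-to-Hiroshima-bomb conversion.
ocean_heat_gain_J = 23e21            # 23 zettajoules absorbed in 2025
hiroshima_J = 15e3 * 4.184e9         # ~15 kt TNT; 4.184e9 J per ton of TNT
seconds_per_year = 365.25 * 24 * 3600

bombs_per_second = ocean_heat_gain_J / (hiroshima_J * seconds_per_year)
print(round(bombs_per_second))       # about 12
```

Dividing the year's heat gain by the bomb's energy and by the seconds in a year reproduces the article's figure of roughly a dozen detonations per second.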

 
The results provide “direct evidence that the climate system is out of thermal equilibrium and accumulating heat,” the authors write.

A hotter ocean favors increased global precipitation and fuels more extreme tropical storms. In the past year, warmer global temperatures were likely partly responsible for the damaging effects of Hurricane Melissa in Jamaica and Cuba, heavy monsoon rains in Pakistan, severe flooding in the Central Mississippi Valley, and more.

“Ocean warming continues to exert profound impacts on the Earth system,” the authors wrote.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

9 January: This article was updated to correct the conversion of 23 zettajoules to Hiroshima bomb explosions.

These updates are made possible through information from the scientific community. Do you have a story about science or scientists? Send us a tip at eos@agu.org. Text © 2026. AGU. CC BY-NC-ND 3.0

When bushfires make their own weather

Phys.org: Earth science - Fri, 01/09/2026 - 14:15
Bushfires are strongly driven by weather: hot, dry and windy conditions can combine to create the perfect environment for flames to spread across the landscape.

Ganges Delta under a winter shroud of fog

Phys.org: Earth science - Fri, 01/09/2026 - 14:11
Winter weather took hold across the Indo-Gangetic Plain in early January 2026, bringing dense fog and cold temperatures to much of the flat, fertile lands that span from Pakistan and northern India to Bangladesh.

What past global warming reveals about future rainfall

Phys.org: Earth science - Fri, 01/09/2026 - 14:02
To understand how global warming could influence future climate, scientists look to the Paleogene Period that began 66 million years ago, covering a time when Earth's atmospheric carbon dioxide levels were two to four times higher than they are today.

Managing Carbon Stocks Requires an Integrated View of the Carbon Cycle

EOS - Fri, 01/09/2026 - 14:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Managing carbon stocks in the land, ocean, and atmosphere under changing climate requires a globally integrated view of carbon cycle processes at local and regional scales. The growing Earth Observation (EO) record is the backbone of this multi-scale system, providing local information with discrete coverage from surface measurements and regional information at global scale from satellites.

Carbon flux information, anchored by inverse estimates from spaceborne Greenhouse Gas (GHG) concentrations, provides an important top-down view of carbon emissions and sinks, but currently lacks global continuity at assessment and management scales (less than 100 kilometers). Partial-column data can help separate signals in the boundary layer from the overlying atmosphere, providing an opportunity to enhance surface sensitivity and bring flux resolution down from that of column-integrated data (100–500 kilometers).

As described in Parazoo et al. [2025], the carbon cycle community envisions a carbon observation system leveraging GHG partial columns in the lower and upper troposphere to weave together information across scales from surface and satellite EO data, and integration of top-down / bottom-up analyses to link process understanding to global assessment. Such an actionable system that integrates existing and new EO data and inventories using advanced top-down and bottom-up analyses can help address the diverse and shifting needs of carbon management stakeholders.

Diverse carbon cycle science needs span multiple time (x-axis) and space (y-axis) scales across land (green shading), ocean (blue shading), and fossil (orange shading) sectors. Science needs addressed by the current and planned carbon flux and biomass Earth Observation (EO) program of record (PoR; purple and green, respectively) are depicted by the solid circle. Key EO science gaps exist at 1–100 kilometer spatial scale spanning sub-seasonal impacts of climate extremes and wildfires, interannual change and biomass, long term changes in growth, storage, and emissions, and carbon-climate feedbacks and tipping points (grey shading). Future GHG and biomass observing systems (e.g., dashed circles) will provide important benefits to carbon management efforts. Credit: Parazoo et al. [2025], Figure 1

Citation: Parazoo, N., Carroll, D., Abshire, J. B., Bar-On, Y. M., Birdsey, R. A., Bloom, A. A., et al. (2025). A U.S. scientific community vision for sustained earth observations of greenhouse gases to support local to global action. AGU Advances, 6, e2025AV001914.  https://doi.org/10.1029/2025AV001914

—Don Wuebbles, Editor, AGU Advances

Text © 2026. The authors. CC BY-NC-ND 3.0

New River Chemistry Insights May Boost Coastal Ocean Modeling

EOS - Fri, 01/09/2026 - 13:46
Source: Global Biogeochemical Cycles

Rivers deliver freshwater, nutrients, and carbon to Earth’s oceans, influencing the chemistry of coastal seawater worldwide. Notably, a river’s alkalinity and the levels of dissolved inorganic carbon it brings to the sea help to shape regional conditions for marine life, including shellfish and corals. These factors also affect the ability of coastal seawater to absorb carbon dioxide from Earth’s atmosphere—which can have major implications for climate change.

However, the factors influencing river chemistry are complex. Consequently, models for predicting worldwide carbon dynamics typically simplify or only partially account for key effects of river chemistry on coastal seawater. That could now change with new river chemistry insights from Da et al. By accounting for river inputs more realistically, the researchers demonstrate that prior models significantly overestimated the amount of carbon dioxide absorbed by the coastal ocean.

The researchers used real-world data on rivers around the world to analyze how factors such as forest cover, carbonate-containing rock, rainfall, permafrost, and glaciers in a watershed influence river chemistry. In particular, they examined how these factors affect a river’s levels of dissolved inorganic carbon as well as its total alkalinity—the ability of the water to resist changes in pH.

The researchers found that variations in total alkalinity between the different rivers were primarily caused by differences in watershed forest cover, carbonate rock coverage, and annual rainfall patterns. Between-river variations in the ratio of dissolved inorganic carbon to total alkalinity were significantly shaped by carbonate rock coverage and the amount of atmospheric carbon dioxide taken up by photosynthesizing plants in the watershed, they found.

The analysis enabled the researchers to develop new statistical models that use watershed features to realistically estimate dissolved inorganic carbon and total alkalinity levels at the mouths of rivers, where they flow into the ocean.
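The general shape of such a model can be sketched with ordinary least squares. Everything below is invented for illustration (the predictor names, coefficients, and synthetic "watershed" data are stand-ins; the paper's actual models and data are not reproduced): a linear fit maps watershed features to a river-mouth alkalinity value.

```python
import random

def fit_linear(X, y):
    """Ordinary least squares via the normal equations (Gaussian elimination)."""
    n, p = len(X), len(X[0])
    # Build X^T X and X^T y.
    xtx = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)]
           for i in range(p)]
    xty = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    # Solve the p x p system by Gaussian elimination with partial pivoting.
    for col in range(p):
        pivot = max(range(col, p), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[pivot] = xtx[pivot], xtx[col]
        xty[col], xty[pivot] = xty[pivot], xty[col]
        for r in range(col + 1, p):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, p):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, p))) / xtx[r][r]
    return beta

# Synthetic illustration: alkalinity ~ intercept + forest cover + carbonate fraction.
rng = random.Random(0)
rows, target = [], []
for _ in range(200):
    forest, carbonate = rng.random(), rng.random()
    rows.append([1.0, forest, carbonate])          # leading 1.0 = intercept term
    target.append(500 - 300 * forest + 2000 * carbonate + rng.gauss(0, 20))

coefs = fit_linear(rows, target)   # recovers roughly [500, -300, 2000]
```

The fitted coefficients can then be applied to ungauged watersheds, which is what makes statistical models of this kind useful as boundary conditions for ocean models.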

When incorporated into a global ocean model, the improved river chemistry estimates significantly reduced the overestimation of carbon dioxide taken up by coastal seawater. In other words, compared with prior ocean modeling results, the new results were more in line with real-world, data-based calculations of carbon dioxide absorption.

This study demonstrates the importance of accurately accounting for river chemistry when making model-based predictions of carbon cycling and climate change. More research is needed to further refine river chemistry estimates to enable even more accurate coastal ocean modeling. (Global Biogeochemical Cycles, https://doi.org/10.1029/2025GB008528, 2025)

—Sarah Stanley, Science Writer

Citation: Stanley, S. (2026), New river chemistry insights may boost coastal ocean modeling, Eos, 107, https://doi.org/10.1029/2026EO260022. Published on 9 January 2026. Text © 2026. AGU. CC BY-NC-ND 3.0

The Looming Data Loss That Threatens Public Safety and Prosperity

EOS - Fri, 01/09/2026 - 13:45

From farming and engineering to emergency management and insurance, many industries critical to daily life rely on Earth system and related socioeconomic datasets. NOAA has linked its data, information, and services to trillions of dollars in economic activity each year, and roughly three quarters of U.S. Fortune 100 companies use NASA Earth data, according to the space agency.

Such data are collected in droves every day by an array of satellites, aircraft, and surface and subsurface instruments. But for many applications, not just any data will do.

Trusted, long-standing datasets known as reference quality datasets (RQDs) form the foundation of hazard prediction and planning and are used in designing safety standards, planning agricultural operations, and performing insurance and financial risk assessments, among many other applications. They are also used to validate weather and climate models, calibrate data from other observations that are of less than reference quality, and ground-truth hazard projections. Without RQDs, risk assessments grow more uncertain, emergency planning and design standards can falter, and potential harm to people, property, and economies becomes harder to avoid.

Yet some well-established, federally supported RQDs in the United States are now slated to be, or already have been, decommissioned, or they are no longer being updated or maintained because of cuts to funding and expert staff. Leaving these datasets to languish, or losing them altogether, would represent a dramatic—and potentially very costly—shift in the country’s approach to managing environmental risk.

What Is an RQD?

No single definition exists for what makes a dataset an RQD, although they share common characteristics, including that they are widely used within their respective user communities as records of important environmental variables and indicators. RQDs are best collected using observing systems designed to produce highly accurate, stable, and long-term records, although only a few long-term observing systems can achieve these goals.

As technological advances and operating constraints are introduced, specialized efforts are needed to integrate new and past observations from multiple observing systems seamlessly. This integration requires minimizing biases in new observations and ensuring that these observations have the broad spatial and temporal coverage required of RQDs (Figure 1). The nature of these efforts varies by the user community, which sets standards so that the datasets meet the specific needs of end users.

Fig. 1. Various satellite sensors provide total precipitable water (TPW) data products characterizing the integrated amount of water vapor available throughout the atmospheric column. However, each of these products has biases and sampling errors because of differences in the algorithms, sensors, and spatial and temporal sampling resolutions on which they are based. NOAA’s Cooperative Institute for Research in the Atmosphere produces a unified, or blended, TPW—an example of which is shown here—that merges all available TPW products. Credit: NOAA

The weather and climate community, which includes U.S. and international organizations such as NOAA, NASA, the National Research Council, and the cosponsors of the Global Climate Observing System (GCOS), has agreed upon principles to guide the development of RQDs [Bojinski et al., 2014; National Research Council, 1999]. For example, data must account for changes in observing times, frequency of observations, instruments, calibration, and undesirable local effects (e.g., obstructions affecting the instruments’ sensors). Depending on the postprocessing involved, these RQDs are referred to as either fundamental or thematic climate data records: sensor-detected satellite radiances are a fundamental record, whereas a postprocessed product such as integrated atmospheric water vapor is thematic.

Another important attribute of RQDs is that their data are curated to include detailed provenance tracking, metadata, and information on validation, standardization, version control, archiving, and accessibility. The result of all this careful collection, community input, and curation is data that have been rigorously evaluated for scientific integrity, temporal and spatial consistency, and long-term availability.

An Anchor to Real-World Conditions

RQDs are crucial in many ways across sectors. They are vital, for example, in realistically calibrating and validating projections and predictions of environmental hazards by weather, climate, and Earth system models. They can also validate parameterizations used to represent physical processes in models and ground global reanalysis and gridded model products in true ambient conditions [Thorne and Vose, 2010].

Without these reference data to anchor them, the outputs of large-scale, high-resolution gridded climate datasets (e.g., PRISM (Parameter-elevation Regressions on Independent Slopes Model), E-OBS, IMERG (Integrated Multi-satellite Retrievals for GPM), CHELSA-W5E5) can drift systematically. Over multidecadal horizons, this drift degrades our ability to separate genuine Earth system changes and variations from artifacts. RQDs have become even more important with the rapid emergence of artificial intelligence (AI) weather forecasting approaches, which must be trained on observations and model outputs and thus can inherit their spatial and temporal biases.

Indeed, RQDs are fundamental to correcting biases and minimizing the propagation of uncertainties in high-resolution models, both conventional and AI based. Researchers consistently find that the choice and quality of reference datasets are critical in evaluating, bias-correcting, and interpreting climate and weather model outputs [Gampe et al., 2019; Gibson et al., 2019; Jahn et al., 2025; Gómez-Navarro et al., 2012; Tarek et al., 2021]. If the reference data used are of lower quality, greater uncertainty can be introduced into projections of precipitation and temperature, for example, especially with respect to extreme conditions and downstream impacts such as streamflows or disease risk. This underscores the importance of RQDs for climate and weather modeling.
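The role of a reference record in bias correction can be shown with the simplest technique in this family, additive mean-bias ("delta") correction. The sketch below uses hypothetical numbers (a model running about 1.5 degrees warm over a shared historical period) and is only one of many methods compared in the cited studies:

```python
def mean_bias_correct(model_hist, reference, model_future):
    """Shift model output by its mean bias relative to a reference record
    over the overlapping historical period (additive 'delta' correction)."""
    bias = sum(model_hist) / len(model_hist) - sum(reference) / len(reference)
    return [v - bias for v in model_future]

# Hypothetical numbers: the model runs ~1.5 degrees warm vs. the reference.
reference    = [10.0, 11.0, 12.0, 11.5]   # reference-quality observations
model_hist   = [11.5, 12.5, 13.5, 13.0]   # model over the same period
model_future = [14.0, 15.0]

corrected = mean_bias_correct(model_hist, reference, model_future)
# corrected == [12.5, 13.5]
```

The method stands or falls with the reference series: if the "observations" themselves drift, the correction bakes that drift into every downstream projection, which is precisely the risk the article describes.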

Each community has its own requirements for RQDs. To develop and implement statistical risk models to assess local vulnerability to environmental hazards, the finance and insurance sectors prioritize high spatial and temporal resolution, data completeness, adequate metadata to dissect specific events, certification that data are from a trusted source, open-source accessibility, and effective user data formats. These sectors directly or indirectly (i.e., downstream datasets) rely on many federally supported datasets. Examples include NOAA’s Storm Events Database, Billion-Dollar Weather and Climate Disasters dataset, and Global Historical Climatology Network hourly dataset; NASA’s family of sea surface altimetry RQDs and its Soil Moisture Active Passive and Gravity Recovery and Climate Experiment terrestrial water storage datasets; and the interagency Monitoring Trends in Burn Severity dataset, which tracks burned wildfire areas.

Meanwhile, the engineering design community requires regularly updated reference data that can help distinguish truly extreme from implausible outlier conditions. This community uses scores of federally supported RQDs to establish safety and design standards, including NOAA’s Atlas 14 and Atlas 15 precipitation frequency datasets, U.S. Geological Survey’s (USGS) National Earthquake Hazards Reduction Program dataset, and NASA’s sea level data and tools (which are instrumental in applications related to ocean transport and ocean structures).

Because RQDs are a cornerstone for assessing environmental hazards across virtually all sectors of society, their loss or degradation undermines reliable prediction and projection of all manner of environmental hazards.

Linking Reference Observing and Data Systems

U.S. agencies have long recognized the importance of reference observing systems and the RQDs they supply. Since the early 2000s, for example, NOAA’s U.S. Climate Reference Network (USCRN) has operated a network of highly accurate stations (now numbering 137) across the country that measure a variety of meteorological variables and soil conditions (Figure 2) [Diamond et al., 2013]. The USCRN builds redundancy into its system, such as triplicate measurements of the same quantity to detect and correct sensor biases, allowing data users to trust the numbers they see.

Fig. 2. A typical U.S. Climate Reference Network station includes instruments to collect a variety of data on environmental variables such as air temperature, precipitation, wind, soil moisture and temperature, humidity, and solar radiation. Credit: NOAA
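The triplicate-measurement idea behind such redundancy can be sketched in a few lines of code. This is a hypothetical illustration of the general principle, not USCRN's actual quality-control algorithm: take the median of three co-located readings and flag any sensor that strays too far from it.

```python
# Sketch of triplicate-sensor redundancy (hypothetical values and
# tolerance; not USCRN's actual quality-control procedure): use the
# median of three co-located readings and flag disagreeing sensors.

def check_triplicate(readings, tolerance=0.3):
    """Return the median of three readings and the indices of any
    sensors whose reading differs from the median by more than
    the tolerance."""
    median = sorted(readings)[1]
    suspects = [i for i, r in enumerate(readings)
                if abs(r - median) > tolerance]
    return median, suspects

# Sensor 2 reads noticeably warmer than the other two instruments.
value, suspects = check_triplicate([21.4, 21.5, 22.9])
print(value, suspects)
```

With three independent instruments, a single drifting sensor can be both detected and corrected for, which is what lets users trust the resulting record.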

The World Meteorological Organization has helped to coordinate similar networks with reference quality standards internationally. One such network is the GCOS Reference Upper-Air Network, which tracks climate variables through the troposphere and stratosphere (and to which NOAA contributes). The resulting RQDs from this network are used to calibrate and bias-correct data from other (e.g., satellite) observing systems.

In the absence of such reference quality observing systems, RQDs must be derived by expert teams using novel data analyses, special field-observing experiments, statistical methods, and physical models. Recognizing their importance, Thorne et al. [2018] developed frameworks for new reference observing networks. Expert teams have been assembled in the past to develop RQDs from observing systems that are of less than reference quality [Hausfather et al., 2016]. However, these teams require years of sustained work and funding, and only the federal government carries a statutory, sovereign, and enduring mandate to provide universally accessible environmental data as a public good; other sectors contribute valuable but nonmandated and nonsovereign efforts.

Datasets at Risk

Recent abrupt actions to reduce support for RQDs are out of step with the long-standing recognition of these datasets’ value and of the substantial efforts required to develop them.

Federal funding and staffing to maintain RQDs are being cut through reduced budgets, agency reorganizations, and reductions in force. The president’s proposed fiscal year 2026 budget would, for example, cut NOAA’s budget by more than 25% and abolish NOAA’s Office of Oceanic and Atmospheric Research, although the most recent appropriations package reduces the proposed cuts to science. The National Science Foundation–supported National Center for Atmospheric Research (NCAR), which archives field experiment datasets and community model outputs, is at risk of being dismantled.

Major cuts have also been proposed to NASA’s Earth Sciences Division, as well as to Earth sciences programs in the National Science Foundation, Department of Energy (DOE), Department of the Interior, and elsewhere. Changes enacted so far have already affected some long-running datasets that are no longer being processed and are increasingly at risk of disappearing entirely.

The degradation of RQDs that we’re now seeing comes at a time of growing risk from climate and weather hazards. In the past decade alone, the United States has faced over $1.4 trillion in damages from climate-related disasters—and over $2.9 trillion since 1980. Inflation-adjusted per-person costs of U.S. disasters have jumped nearly tenfold since the 1980s and now cost each resident nearly $500 annually (Figure 3). The flooding disasters from Hurricane Helene in September 2024 and in central Texas in July 2025 offer recent reminders of both the risks from environmental hazards and the continued need to predict, project, and prepare for future events.

Fig. 3. The average inflation-adjusted cost per person in the United States from billion-dollar disasters—averaged here over 5-year periods (pentads)—rose from about $50 in 1980 to roughly $450 as of 2020. Costs are derived using the National Centers for Environmental Information’s Billion-Dollar Weather and Climate Disasters reference quality dataset, which is no longer being updated.
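The normalization behind a per-person figure like that in Figure 3 is straightforward: convert nominal damages to constant dollars, then divide by population. The sketch below uses made-up inflation factors and damage totals chosen only to land near the figure's endpoints; they are not NCEI's actual inputs.

```python
# Illustrative per-capita normalization (hypothetical CPI factors and
# damage totals, not NCEI values): convert nominal disaster damages to
# constant dollars, then divide by population.

def cost_per_person(nominal_damages_usd, cpi_factor, population):
    """Inflation-adjust nominal damages and normalize by population
    to get an annual per-resident cost in constant dollars."""
    real_damages = nominal_damages_usd * cpi_factor
    return real_damages / population

# Hypothetical pentad-average inputs bracketing Figure 3's endpoints.
cost_1980 = cost_per_person(3.5e9, 3.3, 227e6)   # roughly $51/person
cost_2020 = cost_per_person(140e9, 1.07, 331e6)  # roughly $453/person
print(round(cost_1980), round(cost_2020))
```

The tenfold rise in the text reflects both growing nominal losses and the inflation adjustment; population growth partially offsets it.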

Threatened datasets include many RQDs whose benefits are compounded because they are used in building other downstream RQDs. Examples include USGS’s National Land Cover Database, which is instrumental to downstream RQDs like Federal Emergency Management Agency flood maps, U.S. Department of Agriculture (USDA) crop models, and EPA land use products, and USDA’s National Agriculture Imagery Program, which delivers high-resolution aerial imagery during the growing season and supports floodplain mapping, wetland delineation, and transportation infrastructure planning.

Many other federally supported projects that produce derivative and downstream RQDs are at risk, primarily through reductions in calibration, reprocessing, observing-network density, and expert stewardship, and in some cases through abrupt termination of observations. Earth system examples include NOAA’s bathymetry and blended coastal relief products (e.g., National Bathymetric Source, BlueTopo, and Coastal Relief Models), USGS’s 3DEP Digital Elevation Model, and the jointly supported EarthScope Consortium geodetic products.

Several global satellite-derived RQDs face end-of-life and longer-term degradation issues, such as those related to NASA’s algorithm development and testing for the Global Precipitation Climatology Project, the National Snow and Ice Data Center’s sea ice concentration and extent data, and the family of MODIS (Moderate Resolution Imaging Spectroradiometer) RQDs. In addition, USGS’s streamflow records and NOAA’s World Ocean Atlas are at-risk foundational RQDs whose downstream products span sectors including engineering, hazards management, energy, insurance, defense, and ecosystem services.

More Than a Science Issue

The degradation of weather, climate, environmental, and Earth system RQDs propagates risk well beyond the agencies that produce them. Nor is it only a science and technology problem, because the products these datasets power don’t serve only scientists.

Apart from fueling modeling of climate and weather risks and opportunities, RQDs underpin earthquake and landslide vulnerability maps, energy grid management, safe infrastructure design, compound risk mitigation and adaptation strategies, and many other applications that governments, public utilities, and various industries use to assess hazards and serve public safety.

A sustained capability to produce high-resolution, decision-ready hazard predictions and projections relies on a chain of dependencies that begins with RQDs. If high-quality reference data vanish or aren’t updated, every subsequent link in that chain suffers: downstream products become harder to calibrate, and the information they provide becomes less certain.

RQDs are often used in ways that are not immediately visible. A case in point is the critical step of updating weather model reanalyses (e.g., ERA5 (ECMWF Reanalysis v5) or MERRA-2 (Modern-Era Retrospective Analysis for Research and Applications, Version 2)), which increasingly feed weather and climate hazards products. Wherever possible, the real-time operational data these reanalyses assimilate are replaced with data from up-to-date RQDs. Because real-time operational data are rarely screened effectively for absolute calibration errors and subtle but important systematic biases, this replacement step helps ensure that the model simulations are free of time- and space-dependent biases. Using outputs from reanalysis models not validated or powered by RQDs can thus be problematic: biases can propagate into other hazard predictions, projections, and assessments, increasing uncertainty and making extremes impossible to validate.
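The replacement step described above amounts to bias correction against a reference series. The sketch below illustrates the simplest possible version of that idea, a constant-offset correction, using invented numbers; real reanalysis bias correction is far more sophisticated and varies in time and space.

```python
# Hypothetical sketch of bias-correcting an "operational" series
# against overlapping reference-quality observations (invented values;
# real reanalysis bias correction is far more elaborate).

def bias_correct(operational, reference):
    """Shift the operational series by its mean offset from the
    reference series, computed where the two overlap (a simple
    constant-bias model). Reference gaps are marked with None."""
    overlap = [(op, ref) for op, ref in zip(operational, reference)
               if ref is not None]
    bias = sum(op - ref for op, ref in overlap) / len(overlap)
    return [op - bias for op in operational]

# The operational sensor reads ~0.5 degrees warm; the reference has a gap.
operational = [15.5, 16.6, 17.4, 18.5]
reference = [15.0, 16.1, None, 18.0]

corrected = bias_correct(operational, reference)
print(corrected)
```

Note that the correction also adjusts the value where the reference has a gap: once the systematic offset is estimated from the overlap, it can be removed everywhere, which is precisely what makes reference data so valuable to downstream products.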

A Vital Investment

With rapid advances in new observing system technologies and a diverse and ever-changing mix of observing methods, demand is growing for scientific expertise to blend old and new data seamlessly. The needed expertise involves specialized knowledge of how to process the data, integrate new observing system technologies, and more.

Creating RQDs isn’t easy, and sustained support is necessary. This support isn’t just a scientific priority—it’s also a vital national investment. The costs of restoring lost or hibernated datasets and rebuilding expert teams—if those tasks are even possible—would be enormous, whereas the costs of maintaining and updating RQDs are far less than those of recovering from a single billion-dollar disaster.

The most sensible approach, then, is for federal budgets for fiscal year 2026 and beyond to heed recurring recommendations to continue collecting precise and uninterrupted observations of the global climate system, and to continue the research, development, and updates necessary to produce RQDs. If this doesn’t happen, the United States will need to rely on the interest, capacities, and capabilities of various other organizations, both domestic and international, to sustain the research, development, and operations required to produce RQDs and make them available.

Given the vast extent of observing system infrastructures, the expertise required to produce RQDs from numerous observing systems, and the long-term stability needed to sustain them, such a transition could be extremely challenging and largely inadequate for many users. Thus, by abandoning federally supported RQDs, we risk being penny-wise and climate foolish.

References

Bojinski, S., et al. (2014), The concept of essential climate variables in support of climate research, applications, and policy, Bull. Am. Meteorol. Soc., 95(9), 1,431–1,443, https://doi.org/10.1175/BAMS-D-13-00047.1.

Diamond, H. J., et al. (2013), U.S. Climate Reference Network after one decade of operations: Status and assessment, Bull. Am. Meteorol. Soc., 94(4), 485–498, https://doi.org/10.1175/BAMS-D-12-00170.1.

Gampe, D., J. Schmid, and R. Ludwig (2019), Impact of reference dataset selection on RCM evaluation, bias correction, and resulting climate change signals of precipitation, J. Hydrometeorol., 20(9), 1,813–1,828, https://doi.org/10.1175/JHM-D-18-0108.1.

Gibson, P. B., et al. (2019), Climate model evaluation in the presence of observational uncertainty: Precipitation indices over the contiguous United States, J. Hydrometeorol., 20(7), 1,339–1,357, https://doi.org/10.1175/JHM-D-18-0230.1.

Gómez-Navarro, J. J., et al. (2012), What is the role of the observational dataset in the evaluation and scoring of climate models?, Geophys. Res. Lett., 39(24), L24701, https://doi.org/10.1029/2012GL054206.

Hausfather, Z., et al. (2016), Evaluating the impact of U.S. Historical Climatology Network homogenization using the U.S. Climate Reference Network, Geophys. Res. Lett., 43(4), 1,695–1,701, https://doi.org/10.1002/2015GL067640.

Jahn, M., et al. (2025), Evaluating the role of observational uncertainty in climate impact assessments: Temperature-driven yellow fever risk in South America, PLOS Clim., 4(1), e0000601, https://doi.org/10.1371/journal.pclm.0000601.

National Research Council (1999), Adequacy of Climate Observing Systems, 66 pp., Natl. Acad. Press, Washington, D.C., https://doi.org/10.17226/6424.

Tarek, M., F. Brissette, and R. Arsenault (2021), Uncertainty of gridded precipitation and temperature reference datasets in climate change impact studies, Hydrol. Earth Syst. Sci., 25(6), 3,331–3,350, https://doi.org/10.5194/hess-25-3331-2021.

Thorne, P. W., and R. S. Vose (2010), Reanalyses suitable for characterizing long-term trends, Bull. Am. Meteorol. Soc., 91(3), 353–362, https://doi.org/10.1175/2009BAMS2858.1.

Thorne, P. W., et al. (2018), Towards a global land surface climate fiducial reference measurements network, Int. J. Climatol., 38(6), 2,760–2,774, https://doi.org/10.1002/joc.5458.

Author Information

Thomas R. Karl (Karl51tom@gmail.com), Climate and Weather LLC, Mills River, N.C.; Stephen C. Diggs, University of California Office of the President, Oakland; Franklin Nutter, Reinsurance Association of America, Washington, D.C.; Kevin Reed, New York Climate Exchange, New York; also at Stony Brook University, Stony Brook, N.Y.; and Terence Thompson, S&P Global, New York

Citation: Karl, T. R., S. C. Diggs, F. Nutter, K. Reed, and T. Thompson (2026), The looming data loss that threatens public safety and prosperity, Eos, 107, https://doi.org/10.1029/2026EO260021. Published on 9 January 2026. Text © 2026. The authors. CC BY-NC-ND 3.0

Experts say oceans soaked up record heat levels in 2025

Phys.org: Earth science - Fri, 01/09/2026 - 08:00
The world's oceans absorbed a record amount of heat in 2025, an international team of scientists said Friday, further priming conditions for sea level rise, violent storms, and coral death.

Systematic Bias in Shear-Wave Splitting Measurement

Geophysical Journal International - Fri, 01/09/2026 - 00:00
Summary: Shear-wave splitting measurement returns two parameters related to the fabric of the upper mantle: the orientation of the fast polarisation (fast direction), and a measure of the intensity and thickness of the fabric known as the split time. Spatial statistics of compiled splitting measurements indicate that the fast direction is spatially coherent, while the split time is not. We show, through modelling large numbers of noisy measurements, that single-earthquake splitting measurements exhibit a prominent upward bias in split time, the degree of which depends on specifics of the measurement process. Averaging single-event splitting parameters over many measurements does not mitigate this bias; however, stacking of error surfaces from individual measurements does, given sufficient back-azimuthal coverage, while also greatly reducing scatter. Published splitting results use a mix of these two averaging techniques, and this inconsistent bias between studies is likely responsible for the lack of spatial coherence in compiled split-time measurements. We demonstrate this in real data by examining a data set from Alberta, Canada and surrounding areas, for which a recent study published parameter-averaged results. By examining a comparable data set using error-surface stacking, we are able to greatly increase the coherence of the split times while obtaining highly similar fast directions. Our coherent split times are mapped to reveal a zone of strong splitting beneath the active Cordillera, and three zones of moderate to low split time within the cratonic lithosphere.

Spectral induced polarization measurements at different mountain permafrost landforms with varying ice contents

Geophysical Journal International - Fri, 01/09/2026 - 00:00
Summary: Understanding the spatial variability of ice content in frozen ground is key to designing adequate measures to manage different ecosystems in the context of climate change. To date, investigations of frozen ground have required the analysis of borehole data or the collection of multiple geophysical data sets. Here we propose the use of spectral induced polarization (SIP) as a technique that provides quasi-real-time information about changes in subsurface ice content. We demonstrate that exploring the frequency dependence of electrical conductivity and polarization (capacitive) properties in the frequency range between 0.1 and 75 Hz provides direct information about relative variations in ice content. Our study is based on measurements conducted at nine representative permafrost sites in the European Alps with varying landforms and ice contents, including a pure-ice and an unfrozen reference. We use the phase frequency effect (ϕFE), a parameter describing both the amplitude of the polarization and its frequency dependence, to compare the responses of the different sites. Our results show the lowest ϕFE at sites with low ice content, while increases in this parameter are associated with higher ice content. We evaluate the correlation between SIP parameters and validation ground ice contents for all sites and observe a clear correlation between ϕFE and volumetric ice content. The ϕFE results exhibit distinct landform-specific patterns, with the highest values found in rock glaciers, intermediate values in frozen talus slopes, lower values in bedrock permafrost, and the lowest in unfrozen talus slopes, reducing interpretation ambiguities in electrical resistivity results for assessing ice content.

Temperatures are rising, but what about humidity?

Phys.org: Earth science - Thu, 01/08/2026 - 23:10
Heat waves are becoming commonplace, and so too is high humidity, which can strain the electrical grid, hurt the economy, and endanger human health. But the global prevalence of record-breaking humidity events, some of which approach the physiological limit of what humans can safely handle—and all of which go beyond local expectations and adaptations—has not been widely studied.
