Eos: Science News by AGU

When the Earth Moves: 25 Years of Probabilistic Fault Displacement Hazards

Fri, 10/17/2025 - 16:08
Editors’ Vox is a blog from AGU’s Publications Department.

Earthquake surface ruptures can cause severe damage to infrastructure. While structures can be engineered to accommodate fault movement during an earthquake, one of the best protective measures is to avoid unnecessary risk in the first place.

A new article in Reviews of Geophysics explores the history of Probabilistic Fault Displacement Hazard Assessments (PFDHA) and recent efforts to improve them with modern methods. Here, we asked the authors to give an overview of PFDHAs, how scientists’ methods have evolved over time, and future research directions.

What is fault displacement and what kinds of risks are associated with it?

Fault displacement occurs when an earthquake breaks the ground surface along a fault. This displacement can shift the ground horizontally and/or vertically, by several meters for the largest earthquakes. Such ruptures pose serious risks to infrastructure located across faults—such as pipelines, transportation systems, dams, and power generation facilities—because structures may be torn apart or severely damaged. While some facilities can be engineered to tolerate limited movement, many critical systems are highly vulnerable, making it essential to evaluate this hazard.

This figure shows the Trans-Alaska Pipeline crossing the Denali Fault, which ruptured during the 2002 earthquake. Photos and diagrams illustrate how the pipeline was designed to bend and slide, allowing it to survive several meters of fault movement without breaking. Credit: Valentini et al. [2025], Figure 5

In simple terms, what are Probabilistic Fault Displacement Hazard Assessments (PFDHA)?

A Probabilistic Fault Displacement Hazard Assessment (PFDHA) is a quantitative method for estimating the likelihood that an earthquake will rupture the surface at a specific site and the magnitude of displacement to expect. Instead of giving a single answer, PFDHA provides probabilities that different displacement levels will be exceeded over different reference periods of interest. This allows engineers and planners to evaluate risks in a structured way and make informed decisions about building designs or land use near faults.
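
To make the probabilistic logic concrete, here is a minimal Python sketch of an exceedance calculation. It is an illustration only, not the method of Valentini et al.: the rupture rate, surface rupture probability, and displacement probability below are invented placeholder values, and a real PFDHA would integrate over many magnitudes and rupture scenarios.

```python
import math

def exceedance_rate(rupture_rate, p_surface_rupture, p_exceed_given_rupture):
    # Annual rate of exceeding a displacement threshold at the site:
    # (rate of earthquakes on the fault)
    # x P(the rupture reaches the surface at the site)
    # x P(displacement exceeds the threshold | surface rupture).
    return rupture_rate * p_surface_rupture * p_exceed_given_rupture

def prob_over_period(rate, years):
    # Probability of at least one exceedance during a reference period,
    # treating earthquake occurrence as a Poisson process.
    return 1.0 - math.exp(-rate * years)

# Hypothetical inputs: one surface-rupturing earthquake per 1,000 years on
# average, a 40% chance the rupture reaches this site, and a 25% chance the
# displacement there exceeds 0.5 m when it does.
rate = exceedance_rate(rupture_rate=1e-3,
                       p_surface_rupture=0.40,
                       p_exceed_given_rupture=0.25)
print(f"Probability of exceeding 0.5 m in 50 years: {prob_over_period(rate, 50):.3%}")
```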

This diagram explains how scientists estimate the expected amount of displacement due to an earthquake at a specific site. It shows the main steps and data used in a Probabilistic Fault Displacement Hazard Assessment (PFDHA). Credit: Valentini et al. [2025], Figure 8

How have Fault Displacement Hazard Assessments evolved over time?

The first systematic PFDHA was developed in the early 2000s for the Yucca Mountain nuclear waste repository in the USA. Since then, the methodology has expanded from normal faults to include strike-slip and reverse faults worldwide. Over the past 25 years, new global databases of surface ruptures supporting statistical analysis, advances in statistical modeling, and international benchmark exercises have significantly improved the reliability and comparability of PFDHA approaches. In the future, the field should integrate remote sensing data, artificial intelligence, and physics-based modeling to better capture the complexity of earthquake ruptures.

What are the societal benefits of developing PFDHAs?

By quantifying the hazard of surface fault rupture, PFDHAs provide critical input for the safe design of infrastructure. This helps to avoid catastrophic failures such as pipeline leaks, dam collapses and resulting flooding, or road and railway disruption. Beyond engineering, PFDHAs also support land-use planning by identifying areas where construction should be avoided. Ultimately, these assessments reduce economic losses, improve resilience, and protect human lives in earthquake-prone regions.

What are some real-life examples of PFDHAs being developed and implemented?

One of the earliest and most influential applications was at Yucca Mountain, Nevada, where PFDHA helped assess the safety of a proposed nuclear waste repository. More recently, PFDHA approaches have been adopted internationally, including in Japan and Italy, for assessing risks to dams, tunnels, and other critical infrastructure.

What are some of the most exciting recent developments in this field?

These photos show how earthquakes can damage critical infrastructure such as bridges, dams, railways, and pipelines. The images highlight both principal and distributed fault ruptures, underscoring why engineers and planners must consider both when assessing earthquake hazards. Credit: Valentini et al. [2025], Figure 4

Recent years have seen major advances thanks to new global databases, such as the worldwide and unified database of surface ruptures (SURE) and the Fault Displacement Hazard Initiative (FDHI) database, which collect tens of thousands of observations of past surface ruptures. Remote sensing techniques now allow scientists to map fault ruptures with unprecedented detail. Importantly, these techniques have also awakened the geological and seismological community to the relevance of moderate earthquakes. Over the 2000s and 2010s, it became clear that earthquakes smaller than magnitude 6.5 can also produce significant surface ruptures, a threat that was often overlooked before these technological advances. Additionally, international collaborations, such as the International Atomic Energy Agency benchmark project, are helping to unify approaches and ensure that PFDHAs are robust and reproducible across different regions.

What are the major unsolved or unresolved questions and where are additional research, data, or modeling efforts needed?

Several challenges remain. A key issue is the limited number of well-documented earthquakes outside North America and Japan, leaving other regions underrepresented in global databases. Another challenge is how to model complex, multi-fault ruptures, which are increasingly observed in large earthquakes. Understanding the controls on off-fault deformation, as revealed by modern geodetic techniques during large to moderate events, is another critical open question. This knowledge could improve our ability to predict rupture patterns and displacement amounts.

Similarly, the role of near-surface geology in controlling the location, size, and distribution of surface ruptures for a given earthquake magnitude remains poorly constrained and deserves further study. Standardizing terminology and methods is also essential for consistent hazard assessments. Looking forward, more high-quality data, integration of physics-based models, and improved computational frameworks will be crucial to advance the field.


—A. Valentini (alessandro.valentini@univie.ac.at, 0000-0001-5149-2090), University of Vienna, Austria; Francesco Visini (0000-0001-9582-6443), Istituto Nazionale di Geofisica e Vulcanologia, Italy; Paolo Boncio (0000-0002-4129-5779), Università degli Studi “G. d’Annunzio,” Italy; Oona Scotti (0000-0002-6640-9090), Autorité de Sûreté Nucléaire et de Radioprotection, France; and Stéphane Baize (0000-0002-7656-1790), Autorité de Sûreté Nucléaire et de Radioprotection, France

Editor’s Note: It is the policy of AGU Publications to invite the authors of articles published in Reviews of Geophysics to write a summary for Eos Editors’ Vox.

Citation: Valentini, A., F. Visini, P. Boncio, O. Scotti, and S. Baize (2025), When the earth moves: 25 years of probabilistic fault displacement hazards, Eos, 106, https://doi.org/10.1029/2025EO255033. Published on 17 October 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0

Scientists Must Join Forces to Solve Forecasting’s Predictability Desert

Fri, 10/17/2025 - 11:55

Should I wear a jacket to work today, or will I be too warm? Will that hurricane miss my town, or should I prepare to evacuate? We rely on accurate short-term weather forecasts both to make mundane daily decisions and to warn us of extreme events on the horizon. At the same time, Earth system scientists focus on understanding what drives variations in temperature, precipitation, and extreme conditions over periods spanning months, decades, and longer.

Between those two ends of the forecasting spectrum are subseasonal-to-seasonal (S2S) predictions on timescales of 2 weeks to 2 months. S2S forecasts bridge the gap between short-term weather forecasts and long-range outlooks and hold enormous potential for supporting effective advance decisionmaking across sectors ranging from water and agriculture to energy, disaster preparedness, and more. Yet these timescales represent an underdeveloped scientific frontier where our predictive capabilities are weakest. Indeed, the S2S range is often referred to as the predictability desert.

Forecasts at 3- to 4-week lead times, for example, remain inconsistent. Sometimes, so-called windows of opportunity arise when models provide strikingly accurate, or skillful, guidance at this timescale. But these windows of skillful S2S forecasting are themselves unpredictable. Why do they occur when they do? Do they have recognizable precursors? And how does predictability depend on the quantity (e.g., temperature versus precipitation) being predicted?

Three interlocking puzzle pieces represent the integration of weather prediction (left) and long-term outlooks (right) with the “missing middle” of S2S predictability (center). The center piece highlights key applications—agriculture, water availability, and disaster preparedness—and the tools needed to advance S2S skill, including modeling, data assimilation (DA), artificial intelligence (AI), and multiscale process understanding. Credit: Simmi Readle/NSF NCAR

These questions are more than academic curiosities. Answering them would transform our ability to gauge the value of S2S forecasts in real time and to anticipate and respond to high-impact events such as heat waves, flooding rains, drought onset, and wildfires.

Tackling this challenge requires traditionally siloed communities—scientists focused on predicting near-term weather and those focused on projecting long-term changes in the Earth system—to coordinate efforts. Together, these communities can advance scientific understanding and predictive capabilities across scales.

Discovering Windows of Opportunity

The challenges of S2S prediction reflect the complex and interconnected dynamics of the Earth system. At these lead times, forecast skill relies not only on the accuracy of initial input atmospheric conditions—always a vital element for weather forecasts—but also on model treatments of slowly evolving components of the Earth system. These components—including the ocean state, land surface conditions, snow cover, atmospheric composition, and large-scale patterns of variability such as the Madden-Julian Oscillation (MJO), El Niño–Southern Oscillation, stratospheric quasi-biennial oscillation, and sudden stratospheric warmings—interact in ways that enhance or degrade forecast performance. Volcanic eruptions can further influence these interactions, altering circulation patterns and modulating surface climate on S2S timescales.

Researchers have made substantial progress in understanding these individual Earth system components. But we still cannot reliably anticipate when models will yield skillful forecasts because their accuracy at S2S timescales is episodic and state dependent, meaning it comes and goes and depends on various interacting conditions at any given time. A model might perform well for a given region in one season—yielding a window of opportunity—but struggle in another region or season.

So how might we get better at anticipating such windows? For starters, rather than viewing the predictive capability of models as fixed, we can treat it as a dynamic property that changes depending on evolving system conditions. This paradigm shift could help scientists focus on developing tools and metrics that help them anticipate when forecasts will be most reliable. It could also suggest a need to rethink strategies for collecting environmental observations.

Just as predictability is episodic, so too might be the value of strategically enhanced observations. For example, targeted observations of sea surface temperatures, soil moisture, or atmospheric circulation during periods when these conditions strongly influence forecast skill could be far more valuable than the same measurements made at other times. Such adaptive, or state-aware, observing strategies (say, intensifying atmospheric sampling ahead of a developing MJO event) would mean concentrating resources where and when they will matter most. By feeding these strategically enhanced observations into forecast models, scientists could improve both the forecasts themselves and the ability to evaluate their reliability.

Aligning Goals Across Disciplines

To drive needed technical advances supporting improved S2S predictability, we also need a cultural shift to remove barriers between scientific disciplines. S2S timescales fall at the intersection of weather forecasts and seasonal to decadal outlooks, and the communities working on those different types of predictions have different focuses and research strengths. Weather prediction emphasizes initial condition accuracy, data assimilation, and high-resolution modeling of fast atmospheric processes. Research on Earth system behavior and variability over longer timescales, by contrast, emphasizes modeling of slowly evolving boundary conditions (e.g., the ocean) and coupled component interactions (e.g., between the land and the atmosphere).

Historically, these communities have operated along parallel tracks, each with its own institutions, funding structures, and research priorities. The challenge of identifying windows of opportunity at S2S timescales offers a unifying scientific problem.

Earth system features that offer potentially promising signals of S2S predictability, such as the MJO, are already shared terrain, studied through the lenses of both weather and longer-term change. Extreme events are another area of convergence: Weather models focus on forecasting specific short-lived, high-impact events, whereas Earth system models explore the conditions and teleconnections that influence the likelihood and persistence of extremes. Together, these complementary perspectives can illuminate not only what might happen but why and when skillful forecasts are possible.

The path to unlocking S2S predictability involves more than simply blending models, though. It requires aligning the communities’ scientific goals, model performance evaluation strategies, and approaches for dealing with uncertainty. These approaches include the design of model ensembles, data assimilation strategies that quantify uncertainty in initial conditions, probabilistic evaluation methods, and ways of communicating forecast confidence to users.

The path forward also entails building modeling systems that capitalize on the weather community’s expertise in initialization and the Earth system modeling community’s insights into boundary forcing and component coupling. Accurate initialization must capture all Earth system components—from soil moisture, ocean heat content, and snow cover, for example, to the state of the atmosphere, including the stratosphere. However, observations and data assimilation for several key variables, especially in the ocean, stratosphere, and other data-sparse regions, remain limited, constraining our ability to represent their influences in prediction systems.

A near-term opportunity for aligning goals and developing models lies in improving prediction of MJO-related extreme rainfall events, which arise from tropical ocean–atmosphere interactions and influence regional circulation and precipitation. This improvement will require that atmospheric convection be better represented in models, a long-standing challenge in both communities.

Emerging kilometer-scale models and machine learning offer shared innovation and collaboration spaces. Kilometer-scale models can explicitly resolve convection, validate and refine model parameterizations, and elucidate interactions between large-scale circulation and small-scale processes. Machine learning provides new avenues to emulate convection-permitting simulations, represent unresolved processes, and reduce systematic model errors.

Success with this challenge could yield immediate value for science and decisionmaking by, for example, enabling earlier warnings for flood-prone areas and supporting more informed planting and irrigation decisions in agriculture.

From Forecast Skill to Societal Resilience

The societal need for more skillful S2S prediction is urgent and growing. Communities worldwide are increasingly vulnerable to extreme conditions whose impacts unfold on weekly to monthly timescales. In scenarios such as a prolonged dry spell that turns into drought, a sudden warming trend that amplifies wildfire risk, or a stalled precipitation pattern that leads to flooding, insights from S2S forecasting could provide foresight and opportunities to prepare in affected areas.

Officials overseeing water management, energy planning, public health, agriculture, and emergency response are all seeking more reliable guidance for S2S time frames. In many cases, forecasts providing a few additional weeks of lead time could enable more efficient resource allocation, preparedness actions, and adaptation strategies. Imagine if forecasts could reliably indicate prolonged heat waves 3–4 weeks in advance. Energy providers could prepare for surges in cooling demand, public health officials could implement heat safety campaigns, and farmers could adjust planting or irrigation schedules to reduce losses.

The resilience of infrastructure, ecosystems, and economies hinges on knowing not only what might happen but also when we can trust our forecasts. By focusing on understanding when and where we have windows of opportunity with S2S modeling, we open the door to developing new, intermediate-term forecasting systems that are both skillful and useful—forecast systems that communicate confidence dynamically and inform real-world decisions with nuance.

Realizing this vision will require alignment of research priorities and investments. S2S forecasting and modeling efforts have often fallen between the traditional mandates of agencies concerned with either weather or longer-term outlooks. As a result, the research and operational efforts of these communities have not always been coordinated or sustained at the scale required to drive progress.

Coordination and Collaboration

With growing public attention on maintaining economic competitiveness internationally and building disaster resilience, S2S prediction represents an untapped opportunity space. And as machine learning and artificial intelligence offer new ways to explore predictability with models and to extract meaningful patterns from model outputs, now is the time to advance the needed coordination.

The many use cases for S2S prediction underscore that it isn’t just a scientific challenge, it’s a strategy for achieving resilience and prosperity. We call on a variety of communities and enterprises to collaborate and rally around the challenge of illuminating windows of opportunity in S2S modeling.

Scientists from traditionally distinct disciplines should codesign research strategies to jointly investigate when, where, and why S2S skill emerges. For example, they could examine weather regimes (e.g., the Pacific or Alaska ridges) and their links to modes of variability (e.g., the North Atlantic Oscillation) and leverage data assimilation to better understand how these phenomena evolve across timescales.

The scientific community could also identify and evaluate critical observational gaps that limit progress in modeling and data assimilation. And they could develop strategies to implement adaptive observing approaches that, for example, target soil moisture, surface energy fluxes, and boundary layer profiles to better capture land-atmosphere interactions at S2S timescales. Such approaches would help to fill gaps and advance understanding of key Earth system processes.

Modeling centers could build flexible prediction systems that allow for advanced data assimilation and incorporate robust coupling of Earth system components—drawing from the weather and Earth system modeling communities, respectively—to explore how initial conditions and boundary forcing jointly influence S2S skill. Using modular components—self-contained pieces of code that represent individual Earth system processes, such as atmospheric aerosols and dynamic vegetation—within these systems could help isolate sources of predictability and improve process-level understanding.

To sustain progress initiated by scientists and modeling centers, agencies and funders must recognize S2S prediction as a distinct priority and commit to investing in the needed modeling, observations, and institutional coordination.

Furthermore, it’s essential that scientists, decisionmakers, and end users codevelop forecast tools and information. Close integration among these groups would focus scientific innovation on users’ own definitions of what is useful and actionable, allowing scientists to build tools that meet those needs.

S2S forecasting may never deliver consistent skill across all timescales and regions, but knowing when and where it is skillful could make it profoundly powerful for anticipating high-impact hazards. Can we reliably predict windows of opportunity to help solve the predictability desert? Let’s do the work together to find out.

Author Information

Jadwiga H. Richter (jrichter@ucar.edu) and Everette Joseph, National Science Foundation National Center for Atmospheric Research, Boulder, Colo.

Citation: Richter, J. H., and E. Joseph (2025), Scientists must join forces to solve forecasting’s predictability desert, Eos, 106, https://doi.org/10.1029/2025EO250389. Published on 17 October 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0

A Flash, a Boom, a New Microbe Habitat

Fri, 10/17/2025 - 11:54

A sizable asteroid impact generally obliterates anything alive nearby. But the aftermath of such a cataclysm can actually function like an incubator for life. Researchers studying a Finnish impact structure found minerals whose chemistry implies that microbes were present roughly 4 million years after the impact. These findings, which were published in Nature Communications last month, shed light on how rapidly microscopic life colonizes a site after an asteroid impact.

A Special Lake

Finland is known for its myriad lakes used by boaters, fishers, swimmers, and other outdoor aficionados. Lake Lappajärvi is a particularly special Finnish lake with a storied past: Its basin was created roughly 78 million years ago when an asteroid slammed into the planet. In 2024, the United Nations Educational, Scientific and Cultural Organization (UNESCO) established a geopark in South Ostrobothnia, Finland, dedicated to preserving and sharing the history of the 23-kilometer-diameter lake and the surrounding region.

Jacob Gustafsson, a geoscientist at Linnaeus University in Kalmar, Sweden, and his colleagues recently analyzed a collection of rocks unearthed from deep beneath Lake Lappajärvi. The team’s goal was to better understand how rapidly microbial life colonized the site after the sterilizing impact, which heated the surrounding rock to around 2,000°C (3,632°F).

There’s an analogy between this type of work and studies of the origin of life, said Henrik Drake, a geochemist at Linnaeus University and a member of the team. That’s because a fresh impact site contains a slew of temperature and chemical gradients and no shortage of shattered rocks with nooks and crannies for tiny life-forms. A similar environment beyond Earth would be a logical place for life to arise, Drake said. “It’s one of the places where you think that life could have started.”

Microbe-Sculpted Minerals

In 2022, Gustafsson and his collaborators traveled to Finland to visit the National Drill Core Archive of the Geological Survey of Finland.

There, in the rural municipality of Loppi, the team pored over sections of cores drilled from beneath Lake Lappajärvi in the 1980s and 1990s. The researchers selected 33 intervals of core that were fractured or shot through with holes. The goal was to find calcite or pyrite crystals that had formed in those interstices as they were washed with mineral-rich fluids.

The team used tweezers to pick out individual calcite and pyrite crystals from the cores. Gustafsson and his collaborators then estimated the ages of those crystals using uranium-lead dating and used a technique known as secondary ion mass spectrometry to measure the ratios of various carbon, oxygen, and sulfur isotopes within them. Because microbes preferentially take up certain isotopes, measuring the isotopic ratios preserved in minerals can reveal the presence of long-ago microbial activity and even identify types of microbes. “We see the products of the microbial process,” Drake said.

“It’s amazing what we can find out in tiny crystals,” Gustafsson added.
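
The isotopic measurements Drake and Gustafsson describe are conventionally reported in delta notation, the per mil deviation of a sample’s isotope ratio from a reference standard. Below is a minimal Python sketch of that bookkeeping; the sample ratio is invented for illustration, and only the VPDB standard ratio is a real constant.

```python
# Delta notation: the deviation of a sample's isotope ratio from a reference
# standard, in parts per thousand (per mil). Strongly negative delta-13C in
# calcite, for instance, can record carbon processed by methane-cycling microbes.
VPDB_13C_12C = 0.011180  # 13C/12C ratio of the VPDB carbon standard

def delta_per_mil(sample_ratio, standard_ratio):
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

sample_ratio = 0.010620  # hypothetical measured 13C/12C ratio in a calcite crystal
print(f"delta-13C = {delta_per_mil(sample_ratio, VPDB_13C_12C):+.1f} per mil")
```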

The researchers also used isotopic ratios of carbon, oxygen, and sulfur to estimate local groundwater temperatures in the distant past. By combining their age and temperature estimates, the team could trace how the Lake Lappajärvi impact site cooled over time.

A Slow Cool

Groundwater temperatures at Lake Lappajärvi had cooled to around 50°C (122°F) roughly 4 million years after the impact, the team found. That’s a far slower cooling rate than has been inferred for other similarly sized impact craters, such as Ries Crater in Germany, in which hydrothermal activity ceased after about 250,000 years, and Haughton Crater in Canada, where such activity lasted only about 50,000 years.

“Four million years is a very long time,” said Teemu Öhman, an impact geologist at the Impact Crater Lake–Lappajärvi UNESCO Global Geopark in South Ostrobothnia, Finland, not involved in the research. “If you compare Lappajärvi with Ries or Haughton, which are the same size, they cooled way, way, way faster.”

That difference is likely due to the type of rocks that predominate at the Lappajärvi impact site, Gustafsson and his collaborators proposed. For starters, there’s only a relatively thin layer of sedimentary rock at the surface. “Sedimentary rocks often don’t fully melt during impact because of their inherent water and carbon dioxide content,” Drake explained. And Lappajärvi has a thick layer of bedrock (including granites and gneisses), which would have melted in the impact, sending temperatures surging to around 2,000°C, earlier research estimated.

About 4 million years after the impact is also when microbial activity in the crater began, according to Gustafsson and his collaborators. Those ancient microbes were likely converting sulfate into sulfide, the team proposed. And roughly 10 million years later, when temperatures had fallen to around 30°C (86°F), methane-producing microbes appeared, the researchers surmised on the basis of their isotopic analysis of calcite.

In the future, Gustafsson and his colleagues plan to study other Finnish impact craters and look for similar microbial features in smaller and older impact structures. In the meantime, the team is carefully packaging up their material from the Lappajärvi site. It’s time to return the core samples to the Geological Survey of Finland, Drake said. “Now we need to ship them back.”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), A flash, a boom, a new microbe habitat, Eos, 106, https://doi.org/10.1029/2025EO250388. Published on 17 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Tectonics and Climate Are Shaping an Alaskan Ecosystem

Thu, 10/16/2025 - 13:24
Source: AGU Advances

Increased warming in high-latitude wetlands seems poised to increase the activity of methanogens, or methane-producing microbes. These ecosystems are complex places, however, making outcomes hard to predict.

In new biogeochemical research taking into account tectonic, climatic, and ecological factors affecting the Copper River Delta in Alaska, Buser-Young et al. found that seismic uplift and glacial meltwater have each contributed to changes in microbial metabolism, with the surprising effect of potentially decreasing methane production.

The Copper River Delta in south central Alaska has a history of large seismic events, most recently a 1964 earthquake that lifted portions of the delta by up to 3.4 meters, turning much of it from a marine environment into a freshwater one. In more recent decades, increasing amounts of iron-rich glacial runoff have also begun flowing through the delta as a result of climate change.

Combining geochemical studies of sediment cores from six wetland locations in the delta with metagenomic analyses of the microbes in the cores, the authors documented a distinct shift in microbial metabolism. Though genes for methanogenesis are still prevalent, and organic matter is available, they found that in an increasingly freshwater, iron-rich environment, the dominant means of energy production among the microbes shifted to involve iron cycling. Their findings are a demonstration of the ways large-scale geological and climatic shifts can affect small-scale processes such as the dynamics of microbial communities.

Looking ahead, the researchers say analyzing deeper sediment core samples could provide more information about how microbial dynamics have changed over time. In addition, they say, further culture-based experiments could improve understanding of the relationships between iron and organic matter within the carbon cycle. (AGU Advances, https://doi.org/10.1029/2025AV001821, 2025)

—Nathaniel Scharping (@nathanielscharp), Science Writer

Citation: Scharping, N. (2025), Tectonics and climate are shaping an Alaskan ecosystem, Eos, 106, https://doi.org/10.1029/2025EO250387. Published on 16 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Scientists Tune In to the Ocean’s Sound Waves

Thu, 10/16/2025 - 13:23

The steady thrumming of crashing waves is the ocean’s soundtrack. But behind that calming rhythm is a host of hidden chaotic sound waves, most of which are too low in frequency for humans to hear. This acoustic energy travels as infrasound through the air and as seismic waves through the ground. “It’s a good thing we can’t hear it with our ears,” said Stephen Arrowsmith, a geoscientist at Southern Methodist University in Texas. “Otherwise, we’d just have this constant din from the oceans.”

Recently, scientists developed a new method of monitoring the surf’s acoustic and seismic signatures that can identify individual breaking waves within the noise. The findings could enable new ways of monitoring sea conditions from land and even provide insights into conditions in the upper atmosphere.

A Signal in the Noise

Scientists first discovered surf-generated infrasound more than 20 years ago. One study, led by Arrowsmith, even detected infrasound more than 124 miles (200 kilometers) inland. Though such studies have become less frequent over the past decade, researchers at the University of California, Santa Barbara (UC Santa Barbara), who typically study volcano seismology, realized they were well positioned to contribute to surf infrasound research. “We have the proximity to the coastline here on campus, so it seemed an interesting thing to explore,” said Robin Matoza, an Earth scientist and senior author on the paper.

While past studies had detected surf infrasound only as a continuous wall of noise, the researchers suspected that with new advances in computation as well as in acoustic and seismic detection, they could identify the acoustic signatures of individual waves.

The team, led by geologist Jeremy Francoeur, who conducted the work for his master’s thesis at UC Santa Barbara, installed a single infrasound sensor that collected near-continuous data for 10 months, from September 2022 to July 2023. Then, in October 2023, they conducted an intensive 6-day field experiment, deploying a network of 12 infrasound sensors and one seismometer across a roughly 500-foot (150-meter) area near the Santa Barbara coast.

The researchers also took GoPro videos to correlate specific ocean waves with the infrasound and seismic profiles they generated. They then selected the signatures of five waves as templates to match against the 10 months of single-sensor acoustic data, picking out individual crashing waves among all the infrasound recorded. “One of the biggest surprises was that the same infrasound signals are being generated by surf nearly every day,” said Francoeur in an email. The approach revealed up to tens of thousands of individual surf events per day.
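
Template matching of this kind is typically implemented as a normalized cross-correlation between a short waveform template and the continuous record, with detections declared wherever the correlation exceeds a threshold. The Python sketch below illustrates the idea on synthetic data; the threshold and waveforms are placeholders, not the study’s actual parameters.

```python
import numpy as np

def match_template(record, template, threshold=0.8):
    """Indices where the normalized cross-correlation between a short
    waveform template and a long continuous record exceeds a threshold."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    hits = []
    for i in range(len(record) - n + 1):
        window = record[i:i + n]
        sd = window.std()
        if sd == 0:
            continue  # skip flat (e.g., gap-filled) stretches of record
        cc = np.dot((window - window.mean()) / sd, t) / n  # correlation in [-1, 1]
        if cc >= threshold:
            hits.append(i)
    return hits

# Synthetic example: bury two copies of a "breaking wave" pulse in noise.
rng = np.random.default_rng(seed=0)
pulse = np.sin(np.linspace(0, 3 * np.pi, 100)) * np.hanning(100)
record = rng.normal(0.0, 0.2, size=5000)
record[1000:1100] += pulse
record[3500:3600] += pulse
print(match_template(record, pulse))  # detections cluster near 1000 and 3500
```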

“I liked how they were able to identify discrete surf events using this local array,” said Arrowsmith, who wasn’t involved in the new study. “Previous studies on this, including mine, were not able to do that.”

The researchers found they could detect discrete infrasound signals only when breaking waves were over approximately 6.5 feet (2 meters) high, suggesting that a minimum amount of energy is required to generate detectable infrasound. When waves were detectable, however, their height correlated with acoustic signal strength. This relationship was particularly noticeable in the winter months, when larger storm swells reach the California coast.

By timing when infrasound signals hit each sensor in the network, the scientists triangulated the positions of the waves, pinpointing a hot spot of acoustic activity to a specific rocky reef area just offshore. This suggests that certain bathymetric features might be more effective than others at generating detectable infrasound. The findings were published in Geophysical Journal International.

From the Surf to the Sky

Monitoring and locating the infrasound signature of surf could offer a new method for monitoring sea conditions using land-based sensors, which is critical for maritime safety and coastal management and research. Sea conditions are most often studied using ocean-based buoys or video monitoring, which is obscured at night and in foggy conditions.

The new method could also have applications far beyond the coast. If the signals from individual waves can be detected at greater distances from shore, they could offer information about conditions in the upper atmosphere. This is possible because infrasound enters the upper atmosphere, and features like temperature and wind speed modulate the waves before they refract in the stratosphere and return to Earth.

By comparing the signatures of individual surf events detected at sensors positioned at different distances, scientists say it could be possible to correlate specific acoustic signals with atmospheric conditions, providing a new tool for studying weather patterns and atmospheric dynamics.

“If you have repetitive signals, you can monitor small changes in those signals,” Matoza said. “You could use that to infer changes in the atmosphere.”

—Andrew Chapman (@andrewchapman.bsky.social), Science Writer

Citation: Chapman, A. (2025), Scientists tune in to the ocean’s sound waves, Eos, 106, https://doi.org/10.1029/2025EO250384. Published on 16 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Panama’s Coastal Waters Missed Their Annual Cooldown This Year

Wed, 10/15/2025 - 12:18

From January to April, strong winds blowing south from the Atlantic side of Panama through gaps in the Cordillera mountain range typically travel over the country and push warm water away from Panama’s Pacific coast. This displacement allows cold, nutrient-rich water to flow up from the depths, a process called upwelling. The Panama Pacific upwelling keeps corals cool and nourishes the complex marine food webs that support Panama’s fishing industry and economy.

In 2025, for the first time on record, this upwelling didn’t occur, according to research published in the Proceedings of the National Academy of Sciences of the United States of America.

During the upwelling period early in the year, ocean temperatures near the coast typically fall to a low of about 19°C, said Andrew Sellers, a marine ecologist at the Smithsonian Tropical Research Institute in Panama. This year, the coastal waters reached just 23.3°C at their coolest.

Waning Winds

Sellers said the Panama Pacific upwelling has likely been happening since the isthmus formed millions of years ago. The phenomenon has been recorded at low resolution for 80 years, and scientists have 40 years’ worth of more detailed records.

Scripps Institution of Oceanography climate scientist Shang-Ping Xie, who has studied the weather patterns that usually cause the Panama Pacific upwelling but was not involved with this research, said the team had identified “a shocking extreme event.”

Annual upwelling moderates water temperature along the coast and triggers plankton blooms that nourish marine food webs and Panama’s economy. About 95% of the fish the country catches comes from the Pacific side, and most of that marine life is supported by upwelling, said Sellers.

Sellers said that though tropical upwelling plays a critical role in supporting marine food webs and fisheries, it’s understudied. Indeed, it was a happy accident that the research team was able to obtain measurements in 2025. Sellers says the Smithsonian Tropical Research Institute maintains a network of temperature sensors near the coast but does not regularly monitor the temperature of deeper waters. Early this year, the Max Planck Institute research vessel S/Y Eugen Seibold was in the region as part of its mission to study the relationship between the atmosphere and the ocean, and it provided high-resolution temperature measurements, including in deeper waters, during the upwelling failure.

The Panama Pacific upwelling typically causes a rise in chlorophyll concentrations (blue = low concentrations and red = high concentrations) and a phytoplankton bloom, nourishing the area’s rich marine life, as seen here in February 2024. Credit: Aaron O’Dea

These measurements allowed the research team to see that deeper waters offshore were cold as usual but that those waters didn’t make their way to the coast. The cause seems to be a dramatic change in wind patterns in early 2025: Winds hailing from the north were both shorter in duration and 74% less frequent during the study period than in typical years.

Rippling Consequences

“Given how important upwelling is to that region, it’s hard to imagine there wouldn’t be a loss of primary productivity,” the growth of phytoplankton that sustains the ocean’s food chains, said Michael Fox, a coral reef ecologist at the King Abdullah University of Science and Technology. “Upwelling sets the stage for the base of the food web.”

Some models have predicted that climate change will cause upwelling in temperate zones such as California to strengthen, but the dynamics in the tropics are more of a mystery. The Panama Pacific upwelling is strongly influenced by the El Niño–Southern Oscillation (ENSO). Sellers says changes in ENSO might be affecting local dynamics in Panama.

“Studies like this one should motivate people to pay more attention to ocean-atmosphere dynamics in the tropics,” Fox said.

Sellers said this year’s unprecedented upwelling failure is likely to have adverse effects on the country’s vibrant Pacific marine life, but Panama does not collect extensive data on its fisheries. The team is now examining the exception—a dataset related to small fish such as sardines and anchovies—to see whether the lack of upwelling affected those fish.

Xie said the Smithsonian team hasn’t yet provided enough data to evaluate what caused this year’s unusual wind patterns and whether climate change made the upwelling failure more likely. Early this year, La Niña would likely have raised the pressure on the Pacific side of the country, which would have weakened the winds. But Xie said that La Niña is a frequent phenomenon and it alone can’t explain the unprecedented weather seen in Panama this year. He said something likely happened that changed pressure levels on the country’s northern Atlantic side as well. But more research is needed to say for sure.

Sellers’s team is preparing to gather more detailed measurements of marine life effects in early 2026, in case upwelling fails again. They are planning to assess the population of barnacles and other sessile invertebrates, which rely on plankton whose populations burgeon during upwelling.

Though the Eugen Seibold’s mission is set to end in 2026, Sellers said he’s determined to perform extensive water temperature measurements early next year, with or without a research vessel. “Sensors are cheap, and we can get more of them,” he said.

“In coming years, we’ll know if this is going to be a recurring issue,” Sellers said. “If it is, it’s going to be a hard hit to the economy.”

—Katherine Bourzac (@bourzac.bsky.social), Science Writer

Citation: Bourzac, K. (2025), Panama’s coastal waters missed their annual cooldown this year, Eos, 106, https://doi.org/10.1029/2025EO250382. Published on 15 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Chicago Soil Maps Childhood Lead Exposure Risk

Wed, 10/15/2025 - 12:11
Source: GeoHealth

Lead is a neurotoxin that can damage multiple body systems and lead to learning and developmental problems. The element has been phased out of use in paint, gasoline, and other industrial applications for decades, but it can persist for years in the soil. Children, who can be particularly vulnerable to lead poisoning, can accidentally ingest and inhale lead particles when they play in contaminated areas.

Even though one in four U.S. homes likely has soil lead levels over the recommended safety limits, no major U.S. city includes systematic soil monitoring as part of its lead prevention services, and blood testing often happens only after exposure.

Chicago is one city with many homes built before 1978—the year the U.S. government banned the use of lead-based paint—and its industrial history means that many residents could be living with elevated blood lead levels (EBLL) because of the prevalence of lead in the surrounding soil. Testing soil for lead is one way to predict which communities are most at risk for childhood lead exposure.

Thorstenson et al. analyzed 1,750 soil samples from Chicago’s 77 community areas. The researchers then used these data with the EPA’s Integrated Exposure Uptake Biokinetic model (IEUBK) to estimate how much lead children are likely to have in their blood. Comparing these data to actual EBLL findings from the Chicago Department of Public Health and accounting for factors such as household income, the age of housing, and the housing’s proximity to industrial land, the researchers built a comprehensive map that identifies the Chicago communities most at risk for soil lead exposure.
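
The IEUBK model itself is a multicompartment biokinetic model, but the mapping step can be caricatured in a few lines: translate a soil lead concentration into a predicted blood lead level, then flag areas where the prediction crosses the CDC’s blood lead reference value of 3.5 micrograms per deciliter. In the deliberately simplified Python sketch below, the slope, baseline, and community values are invented placeholders, not EPA parameters or study data.

```python
# Deliberately simplified stand-in for the IEUBK step: a linear dose-response
# from soil lead (ppm) to a child's predicted blood lead level (ug/dL). The
# slope and baseline are invented placeholders, not EPA IEUBK parameters.
BASELINE_UG_DL = 1.5          # hypothetical non-soil contribution to blood lead
SLOPE_UG_DL_PER_PPM = 0.005   # hypothetical contribution per ppm of soil lead
CDC_REFERENCE_UG_DL = 3.5     # CDC blood lead reference value for children

def predicted_blood_lead(soil_ppm):
    return BASELINE_UG_DL + SLOPE_UG_DL_PER_PPM * soil_ppm

# Hypothetical community-area soil lead concentrations in parts per million.
community_soil_ppm = {"Area A": 120.0, "Area B": 250.0, "Area C": 480.0}
for area, ppm in community_soil_ppm.items():
    bll = predicted_blood_lead(ppm)
    flag = "flag for follow-up" if bll >= CDC_REFERENCE_UG_DL else "below reference"
    print(f"{area}: soil {ppm:.0f} ppm -> predicted {bll:.1f} ug/dL ({flag})")
```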

More than half of the citywide soil samples showed lead levels above the EPA’s recommended threshold of 200 parts per million—with some hot spots rising above 300 parts per million. When these measurements were combined with the IEUBK modeling, an estimated 27% of children across the city were found to be at risk of EBLL. In the hot spot areas, that risk rises to 57%.

These findings suggest that though median household income is the strongest predictor of EBLL prevalence, soil lead levels are also a significant predictor. Systematic soil testing could become a crucial way to reduce children’s risk of lead exposure in contaminated areas, the authors say. (GeoHealth, https://doi.org/10.1029/2025GH001572, 2025)

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2025), Chicago soil maps childhood lead exposure risk, Eos, 106, https://doi.org/10.1029/2025EO250377. Published on 15 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

JPL Workforce Decimated

Tue, 10/14/2025 - 16:26
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

Today, NASA’s Jet Propulsion Laboratory (JPL) in Pasadena, Calif., laid off 550 people, a roughly 11% reduction of its workforce.

“This week’s action, while not easy, is essential to securing JPL’s future by creating a leaner infrastructure, focusing on our core technical capabilities, maintaining fiscal discipline and positioning us to compete in the evolving space ecosystem,” JPL director Dave Gallagher wrote in a brief statement released on 13 October. Layoffs were spread across the technical, business, and support areas.

Gallagher said that this workforce reduction is part of a reorganization that began in July and is not related to the current government shutdown that began on 1 October. A 10 October court filing by the White House Office of Management and Budget did not include NASA among the agencies targeted for layoffs by the Trump administration during the ongoing shutdown, reported Space News.

 
JPL is a research and development laboratory federally funded by NASA. While the current government shutdown continues, NASA has been directed to operate and plan as if the appropriations bill passed by the House of Representatives is in effect, which would fund NASA (and most JPL projects) at nearly the same level as the current fiscal year.

Federal whistleblowers, however, have come forward with evidence that NASA leadership has been operating as if the President’s Budget Request (PBR)—not the appropriations bill—is in effect, directing mission wind-down operations and staff reductions under the assumption of a 20% overall budget cut. Some of that lost spending would affect JPL’s ability to plan, build, and operate Earth science missions and space exploration spacecraft.

Despite vocal support from the Trump administration and NASA leadership about putting humans on the Moon again and eventually on Mars, the PBR would also cancel the Mars Sample Return program, which would pick up and return to Earth sample capsules collected and deposited by the Perseverance rover. Analysis of those samples would provide critical support to any future human exploration mission to Mars.

Kevin Hicks, a systems engineer who formerly operated rovers at JPL, said that Perseverance’s budget is being reduced by two-thirds, “just enough to technically keep it going and not get the full PR backlash of canceling a working rover,” he wrote.

Credit: Kevin Hicks (@astro-cowboy.bsky.social) via Bluesky

This is the fourth round of layoffs at JPL since the beginning of 2024, including an 8% reduction in staff that affected mostly engineering-related positions. The mood among current and former JPL employees is grim. Several people commented on a JPL Reddit forum that they expect more layoffs in the future.

“Today was very somber on lab. It felt like everyone [was] grieving,” one Redditor wrote on 13 October. Several other posters echoed that sentiment. “We tried to keep a positive, but realistic attitude and we even took a final group photo in front of the JPL concrete logo. However, there’s no whitewashing the ‘doomsday-eve’ feeling that’s looming over all our heads.”

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0

As Seas Rise, Corals Can’t Keep Up

Tue, 10/14/2025 - 12:14

Coral reefs face myriad challenges, from ocean acidification to warming seas to destructive fishing activities. Sometimes, reefs can rebound from these ecological harms—but only if the coral species assembled on a reef can maintain the required growth rates.

A revised estimate of coral growth rates, published in Nature, suggests that tropical western Atlantic reefs are losing their capacity to build upward. Without upward reef growth, rising seas threaten to drown these reefs and cancel out the benefits they offer to coastal communities, such as minimizing flood damage. The researchers found that growth rates at essentially all of the 400 sites analyzed won’t be enough to keep up with sea level rise by 2100.

“It’s very critical that we get a handle on what these rates are to be able to adequately gauge the scale of the problem,” said Cody Clements, a coral reef ecologist at the Georgia Institute of Technology who was not involved in the new study. “We have a lot of work ahead of us.”

“Unfortunately, the estimates are worse than before,” said Rich Aronson, a coral reef ecologist at the Florida Institute of Technology who was not involved in the new paper but works closely with its authors. 

Eroding Reefs

Coral reefs grow when corals secrete calcium carbonate, a hard material that forms their exoskeletons.

Scientists can use knowledge of the species that make up a coral reef to estimate its vertical stacking porosity—how much vertical space a reef can build with a given amount of calcium carbonate. 

The skeletons of branching corals, for example, tend to accumulate in an arrangement with more empty space, leading to more upward growth than other corals, such as flat corals, might achieve with the same amount of calcium carbonate.
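
A rough way to see why framework architecture matters is to convert a carbonate budget into potential vertical growth using the density of coral aragonite and the porosity of the accumulating framework. In the Python sketch below, the production and porosity values are illustrative, not figures from the study:

```python
# Converting a carbonate budget into potential vertical reef growth. The
# production and porosity values are illustrative, not figures from the study.
ARAGONITE_DENSITY_KG_M3 = 2930.0  # density of solid coral aragonite

def accretion_mm_per_yr(production_kg_m2_yr, framework_porosity):
    # The same carbonate mass fills more vertical space when the framework
    # it builds contains more void space.
    solid_thickness_m = production_kg_m2_yr / ARAGONITE_DENSITY_KG_M3
    return solid_thickness_m / (1.0 - framework_porosity) * 1000.0

for label, porosity in [("flat/massive framework", 0.2), ("branching framework", 0.7)]:
    print(f"{label}: {accretion_mm_per_yr(2.0, porosity):.2f} mm/yr")
```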

However, the relationship between coral assemblage and vertical growth ability has so far been poorly defined, said Chris Perry, a coastal geoscientist at the University of Exeter and lead author of the new study. 

Perry and his research group wanted a better estimate. They gathered 66 images of fossilized coral reefs from the tropical western Atlantic and analyzed how those reefs grew over time on the basis of the species of corals within. Then, they applied their revised estimates of growth to previously collected data on the ecology and carbonate production of 400 sites at three reef systems in the tropical western Atlantic: the Mexican Mesoamerican Reef, the Florida Keys, and Bonaire. 

The adjusted estimate of growth revealed a bleaker picture of reef health than the scientists anticipated: Researchers found that on average, reefs at all sites were growing at a sluggish pace—less than 1 millimeter per year—with an average growth rate decline of 12.4% when compared to previous estimates. On average, global sea levels are rising by about 4.5 millimeters per year.
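
The scale of that mismatch is easy to put in round numbers. The back-of-the-envelope Python sketch below assumes both rates stay constant through 2100, which is generous to the reefs because sea level rise is accelerating:

```python
# Water depth gained over a reef crest by 2100 if the article's round numbers
# hold steady. Sea level rise is accelerating, so this is a conservative floor.
reef_growth_mm_yr = 1.0       # upper bound: "less than 1 millimeter per year"
sea_level_rise_mm_yr = 4.5    # current average global rate
years = 2100 - 2025

gap_cm = (sea_level_rise_mm_yr - reef_growth_mm_yr) * years / 10.0
print(f"Reef crests end up roughly {gap_cm:.0f} cm further below the surface by 2100.")
```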

The new calculations are particularly stark for reefs dominated by branching coral species, Didier De Bakker, a coral reef ecologist at the University of Exeter and a coauthor of the new study, wrote in an email. 

If corals can’t grow, they shrink, falling victim to erosion by other marine creatures such as fish and sea urchins. Eventually, corals unable to keep up with sea level rise are drowned, unable to access sufficient light to continue growing at all.

The studied reefs “are going to have zero capacity, really, to be able to track future sea level rise,” Perry said. 

Corals at Limones Reef in the Mexican Caribbean suffered a bleaching event in 2023. Credit: Lorenzo Álvarez-Filip

In general, the new estimates of the link between assemblage type and vertical growth “revise our estimate downward” of how well corals will be able to keep up with sea level rise, Aronson said. The results also align with a 2023 study by Aronson and others that found reef growth in Panama’s Gulf of Chiriquí, part of the Pacific Ocean, is likely already unable to keep up with sea level rise. 

Perry and De Bakker hope the data in the new study will feed into future studies modeling coastal wave exposure. “These new estimates provide a more realistic basis for projecting the vulnerability of adjacent habitats and reef-fronted urban areas,” De Bakker wrote. 

Aronson said one next step for the research would be to apply the research team’s new estimates of vertical growth to reefs elsewhere, such as those in tropical Indo-Pacific waters. There, more species of branching coral still survive, giving Indo-Pacific reefs a slightly better chance of keeping up with sea level rise, said Clements, who studies Indo-Pacific reefs.

Climate Change and Corals

As a last step in their study, the researchers used what they’d learned about reef growth at the 400-plus reef sites, along with various future climate warming scenarios, called Shared Socioeconomic Pathways, or SSPs, to project how reef growth rates may change as the climate warms and sea levels continue to rise.

Results predicted that more than 70% of tropical western Atlantic reefs will transition into net erosional states by 2040 under an optimistic scenario (SSP1-2.6). But if warming exceeds SSP2-4.5 (a middle-of-the-road scenario in line with current development patterns), nearly all reefs will be eroding by 2100.

“Even if you go by some of the conservative estimates that they’re using, we still have a major problem in terms of coral reef accretion rates,” Clements said. 

Reef Benefits Wash Away

Slower vertical growth means corals will have a tougher time maintaining their crest, or high point. These crests serve as wave breakers that dissipate wave energy and reduce flood damages to coastal communities. One estimate suggests that coral reefs near the U.S. coastline prevent more than $1.8 billion in damage each year.

This coral reef crest in the Mexican Caribbean dissipates wave energy and reduces beach erosion and possible flood damage. Credit: Lorenzo Álvarez-Filip

As coral growth fails to track with sea level rise, these crests fall below the water’s surface. In turn, rising seas and waves from storms face less resistance, and reefs’ protective abilities get washed away.

Reef restoration is an active area of research, with engineers and ecologists working together to create various solutions, from LEGO-like scaffolding for corals to robots that sprinkle warming reefs with cool water. Previous research by Aronson and others indicated that successful restoration could help reefs keep pace with future sea level rise.

However, restoration will be effective only if it is done in tandem with efforts to rein in climate warming, which could slow sea level rise and reduce the frequency of marine heat waves, Perry said. “It’s quite difficult to see how we turn this around without really, really aggressive action on greenhouse gas emissions.”

“We have to do something about these global-scale stressors, like climate change, or it’s not going to matter,” Clements said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

This news article is included in our ENGAGE resource for educators seeking science news for their classroom lessons. Browse all ENGAGE articles, and share with your fellow educators how you integrated the article into an activity in the comments section below.

Citation: van Deelen, G. (2025), As seas rise, corals can’t keep up, Eos, 106, https://doi.org/10.1029/2025EO250380. Published on 14 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Space Radiation Can Produce Some Organic Molecules Detected on Icy Moons

Tue, 10/14/2025 - 12:10

New laboratory research suggests that some organic molecules previously detected in plumes erupting from Saturn’s moon Enceladus may be products of natural radiation, rather than originating from the moon’s subsurface ocean. This discovery complicates the assessment of the astrobiological relevance of these compounds.

Enceladus hides a global ocean buried beneath its frozen crust. Material from this liquid reservoir is ejected into space from cracks in the ice near the south pole, forming plumes of dust-sized ice particles that extend for hundreds of kilometers. While most of this material falls back onto the surface, some remains in orbit, becoming part of Saturn’s E ring, the planet’s outermost and widest ring.

Between 2005 and 2015, NASA’s Cassini spacecraft flew repeatedly through these plumes and detected a variety of organic molecules. The detection was viewed as evidence of a chemically rich and potentially habitable environment under the ice, where molecules essential to life could be available. However, the new study offers an explanation in which radiation, not biology, is behind the presence of at least some of these organic molecules.

To test the role of space radiation, a team of researchers led by planetary scientist Grace Richards, a postdoc at the National Institute for Astrophysics in Rome, simulated conditions near Enceladus’s surface by creating a mixture of water, carbon dioxide, methane, and ammonia, the main expected components of surface ice on Enceladus. They cooled the concoction to −200°C inside a vacuum chamber and then bombarded it with water ions, which are an important component of the radiation environment that surrounds the moon.

The radiation induced a series of chemical reactions that produced a cocktail of molecules, including carbon monoxide, cyanate, ammonium, and various alcohols, as well as molecular precursors to amino acids such as formamide, acetylene, and acetaldehyde. The presence of these simple molecules indicates that radiation could induce similar reactions on Enceladus.

Richards presented these findings at the Europlanet Science Congress–Division for Planetary Sciences Joint Meeting (EPSC-DPS 2025) in Helsinki, Finland. She and her coauthors also published a detailed report in Planetary and Space Science.

Enceladus and Beyond

The new research raises the question of whether the organic molecules detected in Enceladus’s plumes truly come from the moon’s buried ocean, whether they are formed in space, or whether they form close to the surface after the plumes leave the Enceladean interior.

While the finding doesn’t exclude the possibility of a habitable ocean on Enceladus, Richards urges caution in assuming a direct link between the presence of these molecules in the plumes, their origin, and their possible role as precursors to biochemistry.

“I don’t necessarily think that my experiments discredit anything to do with Enceladus’s habitability.”

“I don’t necessarily think that my experiments discredit anything to do with Enceladus’s habitability,” Richards said.

However, she added, “when you’re trying to infer this ocean composition from what you’re seeing in space, it’s important to understand all the processes that go into modifying this material.” Apart from radiation, these processes include phase changes, interactions with the moon’s ice walls, and interactions with the space environment.

“We need a lot of experiments of that type,” said planetary scientist Alexis Bouquet, a French National Centre for Scientific Research (CNRS) researcher at L’Université d’Aix-Marseille who wasn’t involved in the study. “They demonstrated that you can produce a certain variety of species in conditions that are relevant to the south pole of Enceladus.”

Bouquet highlighted the importance of simulating these environments in a lab for planning future missions to Enceladus and for interpreting the much-anticipated data from current missions to Jupiter’s icy moons. These missions are NASA’s Europa Clipper, which will explore Europa, and the European Space Agency’s (ESA) JUICE (Jupiter Icy Moons Explorer), which will visit all three of the giant planet’s moons with subsurface oceans: Ganymede, Callisto, and Europa.

The intense radiation around Jupiter makes these experiments especially relevant. “Radiation chemistry for Europa or the Jovian moons in general [is] a big deal, a bigger deal than in Enceladus,” Bouquet said.

Another Story Completely

As Richards’s work questions the origin of organic compounds around Enceladus, researchers keep adding more molecules to the puzzle.

After a new analysis of data gathered during one of Cassini’s close approaches to Enceladus in 2008, researchers led by planetary scientist Nozair Khawaja at the Freie Universität Berlin and the University of Stuttgart reported the discovery of new types of organic molecules, seemingly emanating from the icy vents. They include molecules bearing ester and ether groups, as well as chain and cyclic species containing double bonds with oxygen and nitrogen.

On Earth, these molecules are essential links in a series of chemical reactions that ultimately produce complex compounds needed for life. And while these molecules could have an inorganic origin, “they increase the habitability potential of Enceladus,” Khawaja said. The findings appeared in Nature Astronomy.

Khawaja’s team’s analysis suggests that complex organic molecules are present in fresh ice grains just expelled from the vents. During its last flyby, Cassini got as close as 28 kilometers to the moon’s surface.

After modeling the plumes and the icy grains’ residence times in space, they think that the ice grains sampled by Cassini did not spend a lot of time in space, likely just “a few minutes,” Khawaja said. “It is fresh.”

This short duration in space calls into question whether space radiation had enough time to produce the organic molecules Khawaja detected. Just a few minutes would not be long enough for such complex chemistry to take place, even in a high-radiation environment.

“Big grains coming from the surface full of organics? That is much harder to explain through radiation chemistry,” Bouquet said.

While the types of experiments performed by Richards “are valuable and take the science to the next level,” Khawaja said, “our results tell the other story completely.”

Back to Enceladus

Both studies reinforce the complexity of Enceladus’s chemistry, upholding it as a prime target in the search for extraterrestrial life, or at least life’s building blocks. Enceladus has all three prerequisites for life: liquid water, an energy source, and a rich cocktail of chemical elements and molecules. Even if the subsurface ocean is out of reach—it lies at least a few kilometers beneath the ice close to the poles—the plumes offer the only known opportunity to sample an extraterrestrial liquid ocean.

Studies for a potential ESA mission dedicated to Enceladus are already underway, with plans that include high-speed flybys through the plumes and, potentially, a lander on the south pole. The insights from both recent studies will help researchers design the instrumentation and guide the interpretation of future results.

“There is no better place to look for [life] than Enceladus,” Khawaja said.

—Javier Barbuzano (@javibar.bsky.social), Science Writer

Citation: Barbuzano, J. (2025), Space radiation can produce some organic molecules detected on icy moons, Eos, 106, https://doi.org/10.1029/2025EO250383. Published on 14 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The 12 July 2024 landslide cluster in Pengshui County, Chongqing, China

Tue, 10/14/2025 - 07:45

About 140 mm of rainfall triggered 143 landslides in an area of about 10 km2, killing two people.

Loyal readers will have noticed that I’m fascinated by dense clusters of landslides triggered by intense rainfall (or earthquakes). Over the years, I have written about these on multiple occasions, but increasing numbers are being described in the literature.

Another very interesting example has just been published in the journal Landslides (Xie et al. 2025). This example occurred on 12 July 2024 close to Puzi in Pengshui County, Chongqing, China. The centre of the cluster is at [29.56790, 108.28781] – this is the marker on the images that follow.

The Planet image below shows the area on 24 May 2024, before the rainfall:-

The site of the 12 July 2024 landslides in Pengshui County, Chongqing, China. Image copyright Planet, used with permission. Image dated 24 May 2024.

And this is the same site after the event on 12 July 2024:-

The aftermath of the 12 July 2024 landslides in Pengshui County, Chongqing, China. Image copyright Planet, used with permission. Image dated 1 August 2024.

And here is an image compare:-

Images copyright Planet, used with permission.

Xie et al. (2025) show that this cluster of landslides was triggered by a rainstorm that deposited about 140 mm of rainfall in a few hours. In total, 143 landslides were triggered in an area of about 10 km2. The failures were mostly disrupted avalanches, some of which formed channelised debris flows. However, Xie et al. (2025) also show that there are a number of interesting aspects of this cluster of landslides.

Note the geographical isolation of these landslides. The slopes to the east and west suffered far fewer failures. Perhaps surprisingly, this cluster of landslides did not occur in the area of highest rainfall – a short distance to the west, more than 200 mm was recorded, but few landslides occurred.

The analysis of Xie et al. (2025) shows that this cluster occurred because of a weak geological unit (sandstone) that was highly fractured, a geological structure that promoted instability, and steep slope gradients (which may be associated with erosion by the river). Thus, it is the combination of the meteorological, geological and geomorphological factors that led to the cluster of landslides.

Fortunately, the area had been mostly evacuated ahead of the rainfall, so there were just two fatalities. There was extensive damage to properties though.

This event illustrates well the ways in which extreme rainfall events are combining with local factors to create clusters of landslides that have the potential to generate high levels of damage.

Many thanks to Xie et al. (2025) for such an interesting example.

References

Xie, X., Liu, S., Macciotta, R. et al. 2025. Spatial heterogeneity in landslide response to a short-duration intense rainfall event on 12 July 2024 in Pengshui County, Chongqing, China. Landslides. https://doi.org/10.1007/s10346-025-02624-6.

Planet Team 2025. Planet Application Program Interface: In Space for Life on Earth. San Francisco, CA. https://www.planet.com/.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The 22 May 1960 earthquake-induced landslides and tsunami at Lake Rupanco in Chile

Mon, 10/13/2025 - 06:41

Reconstruction of landslides on the banks of Lake Rupanco in Chile, triggered by the 22 May 1960 Mw 9.5 earthquake, suggests that a slope failure with a volume of 161 million cubic metres triggered a tsunami with a maximum amplitude of 33.3 metres. About 120 people were killed.

A very interesting paper (Quiroga et al. 2025) has just been published in the journal Landslides that examines combined landslide-tsunami threats at Lake Rupanco [40.82, -72.50] in Chile. The context is a series of landslides, and a resultant tsunami, triggered by the 22 May 1960 Mw 9.5 Great Chilean earthquake. The paper reconstructs the landslides and models the tsunami that they generated.

This event is particularly interesting as the loss of life was significant. Quiroga et al. (2025) document about 120 fatalities:-

“The most severely impacted area was Las Gaviotas, a settlement situated on the southeast shore…, where tsunami run-up heights reportedly exceeded 10 m, according to eyewitness accounts… One of the most significant losses was the destruction of the popular Termas de Rupanco hotel located near geothermal springs …, which was swept away by the landslides, resulting in 11 confirmed fatalities … At that time, a road was also under construction along the southern shoreline to connect Osorno with Las Gaviotas; both the road and several worker camps were destroyed…”

The Chilean Enterreno site has a photograph of the Termas de Rupanco hotel prior to the tsunami:-

Hotel Termas de Rupanco, which was destroyed by the landslide-induced tsunami in 1960. Image from Enterreno. Posted by Francisco Vidal Guzmán under a by-nc licence.

Quiroga et al. (2025) have tracked the source of the tsunami to a series of landslides that occurred on the north side of Lake Rupanco. The scars of these failures are still very visible on Google Earth:-

Google Earth image of the site of the landslides on the banks of Lake Rupanco triggered by the 22 May 1960 earthquake in Chile.

Quiroga et al. (2025) have identified eight landslide scars in this area, of which the most significant is the bowl-shaped scar in the centre of the image above. This is the most likely source of the tsunami. It is a rotational failure with a lower runout zone, with a volume of 161 million m3. Of this volume, 12.1 million m3 became submerged to generate the wave.

Reconstruction of the wave suggests that it had a maximum amplitude of 33.3 metres close to the landslide itself. At Las Gaviotas, where the hotel was located, the wave had a maximum amplitude of 8.6 metres, arriving 261 seconds after initiation.

This elegant and useful paper illustrates well the threat posed by large landslides into lakes. For those located in the hotel, the events would have been terrifying, starting with a major earthquake for which the shaking would have been intense and long-lasting, followed by the noise and dust generated by the collapsing slopes, and finally the impact of this enormous tsunami. Keeping people safe in such circumstances is a very major challenge.

Reference

Quiroga, J.P., Aránguiz, R., Hernández-Madrigal, V.M. et al. 2025. Reconstruction and numerical modeling of historical and paleo-tsunamigenic landslides in Lake Rupanco, Chile. Landslides. https://doi.org/10.1007/s10346-025-02629-1.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Zircon Crystals Could Reveal Earth’s Path Among the Stars

Fri, 10/10/2025 - 12:53

Tiny crystals in Earth’s crust may have recorded meteorite and comet impacts as our planet traveled through the spiral arms of the Milky Way over more than 4 billion years, according to new research.

The study is one of the first to suggest that galactic-scale processes can affect Earth’s geology, and researchers think similar evidence might be found on other bodies in the solar system, including the Moon and Mars.

“This is something that could connect the Earth, the Moon, and Mars into the wider galactic surroundings.”

“This is so interesting and exciting—we are potentially seeing something that is not just unique to Earth,” explained geologist Chris Kirkland of Australia’s Curtin University, the first author of the new study published in Physical Review Research. “This is something that could connect the Earth, the Moon, and Mars into the wider galactic surroundings.”

Kirkland and his coauthor, University of Lincoln astrophysicist Phil Sutton, studied changes in oxygen isotopes in a database of tens of thousands of dated crystals of zircon—a silicate mineral with the chemical formula ZrSiO4 that is common in Earth’s crust. They compared their findings to maps of the Milky Way galaxy that show its neutral hydrogen, or H1.

H1, with one proton and one electron, is the most abundant element in the universe, and its density is particularly high in the arms of the Milky Way galaxy.

Because they are almost exactly the same size, uranium atoms sometimes replace the zirconium atoms in zircon. Uranium radioactively decays into lead over time, so geologists can study the levels of uranium and lead isotopes in zircon crystals to determine when the crystals formed, sometimes in the first phases of the evolution of Earth’s crust about 4.4 billion years ago.

“Zircon crystals are a geologist’s best friend…we can get a lot of information from a single zircon grain.”

“Zircon crystals are a geologist’s best friend,” Kirkland said. “They have an inbuilt clock, and they carry a chemical signature that tells us how they formed—so we can get a lot of information from a single zircon grain.”

Queen’s University geochemist Christopher Spencer, who was not involved in the study, said that the work was fascinating and provocative. “I think the study is a reminder that Earth does not evolve in isolation and that interdisciplinary thinking, however speculative at first, can open up new ways of framing questions about our planet’s history.”

Oxygen Isotope Ratios

The key to the latest research was in the ratios of isotopes—forms of the same chemical element that have different numbers of neutrons—in the oxygen atoms of zircon’s silicate group.

The relative levels of oxygen isotopes in samples of zircon crystals can tell geologists whether the crystals formed high in the crust, perhaps while interacting with water and sediments, or deeper within Earth’s mantle.

Kirkland said the latest study examined the distribution of the ratios of oxygen isotopes found in a dataset of zircon crystals sampled from around the world. The scientists evaluated the data’s “kurtosis,” or the measure of how flat or peaked a distribution is. A dataset with high kurtosis has a narrow distribution, with most values occurring in the middle and causing a sharp peak in the distribution curve. In contrast, a dataset with low kurtosis has a wide distribution with more high and low values, causing a wider distribution curve with a less pronounced peak.
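As a generic illustration of the statistic (our own sketch, not the authors’ analysis code; the sample distributions are arbitrary), SciPy reports “excess” kurtosis, which is 0 for a normal distribution:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(42)

# A sharply peaked, narrow distribution vs. a flat, wide one.
peaked = rng.laplace(loc=0.0, scale=0.3, size=10_000)  # high kurtosis
flat = rng.uniform(low=-1.0, high=1.0, size=10_000)    # low kurtosis

# Excess kurtosis: positive for a sharp peak, negative for a flat top.
print(kurtosis(peaked))  # ~ +3
print(kurtosis(flat))    # ~ -1.2
```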

The researchers determined that periods of high oxygen isotope kurtosis corresponded to times when our solar system was crossing the dense spiral arms of the Milky Way galaxy. Such crossings occurred roughly every 187 million years on average during our solar system’s 748-million-year orbit around the galactic center at a speed of about 240 kilometers per second, a cadence consistent with crossing about four spiral arms per orbit (748 ÷ 187 ≈ 4).

In addition to H1, the spiral arms are filled with many more stars than the interstellar space between them. The gravity of those stars seems to have disturbed the Oort Cloud—the haze of billions of icy rock fragments that surrounds our solar system. That, in turn, caused more meteors and comets to strike Earth as it passed through the galactic arms, leading to the subsequent melting of the crust in many places, Kirkland said. “By looking at the variability of the [zircon] signal over time, we were able to get an indication of how different the magma production on the planet was at that time.”

Professor Chris Kirkland uses an ion microprobe to date zircon mineral grains. Credit: C. L. Kirkland

He warned that correlation does not mean causation but said that in this case there seemed to be no other plausible cause for the periodic kurtosis of the oxygen isotope ratios in zircons. “It is very important that we are able to see the frequency of [meteor and comet] impacts” on Earth, Kirkland said. “Rather than an internal process, we seem to be looking at an external process.”

Some other experts suggest the new study is notable for outlining the concept that galactic processes could have left geological traces but caution that it does not yet offer conclusive proof.

Earth scientist Craig Storey of the University of Portsmouth in the United Kingdom, who was not involved in the new study, said crustal melting did not necessarily prove an increase in meteorite or comet impacts. Instead, natural processes here on Earth, such as volcanic or tectonic movements, could have caused melting of the crust at several stages of our planet’s geological history.

He is also concerned that some of the proposed correlations in the study may not be correct. “It is an interesting idea, and there are potentially ways to test it, but I don’t think this is the way to test it,” Storey said.

—Tom Metcalfe (@HHAspasia), Science Writer

Citation: Metcalfe, T. (2025), Zircon crystals could reveal Earth’s path among the stars, Eos, 106, https://doi.org/10.1029/2025EO250379. Published on 10 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

New 3D Model Reveals Geophysical Structures Beneath Britain

Fri, 10/10/2025 - 12:53
Source: Journal of Geophysical Research: Solid Earth

Magnetotelluric (MT) data, which contain measurements of electric and magnetic field variations at Earth’s surface, provide insights into the electrical resistivity of Earth’s crust and upper mantle. Changes in resistivity, a measure of how strongly a material resists the flow of electrical current, can indicate the presence of geologic features such as igneous intrusions or sedimentary basins, meaning MT surveys can complement other kinds of geophysical surveys to help reveal Earth’s subsurface. In addition, such surveys can play an important role in improving understanding of the risks space weather poses to human infrastructure.

Montiel-Álvarez et al. present the first 3D electrical resistivity model of Britain, based on long-period MT data (using measurements gathered every second for 4–6 weeks at a time) from across the island. Their model, called BERM-2024, points to previously recognized as well as likely new tectonic and geological structures. The authors also model the effects of a recent solar storm on Earth’s geoelectric field, validating the usefulness of MT-based approaches for space weather impact forecasting.

The BERM-2024 electrical resistivity model is based on MT data from 69 sites in Britain, including both new and legacy datasets. Creating the final model involved processing the raw time series data and accounting for the “coastal effect” caused by the conductivity of ocean water when inverting the data—or calculating causes based on observations.

Sensitivity tests of the new model indicate it resolves features to depths of 200 kilometers (125 miles), including many known from other geophysical surveys and geological observations. It also reveals new anomalies, including highly conductive areas under Scotland’s Southern Uplands Terrane and a resistive anomaly under the island of Anglesey. More intriguing, a large, previously unknown conductive anomaly appears in their model between 85 and 140 kilometers (52–87 miles) beneath the West Midlands region.

The authors tested the utility of their resistivity model for estimating the electric field at Earth’s surface, which is key in forecasting the effects of geomagnetically induced currents caused by space weather. To do so, they obtained a time series of the horizontal electric field across Britain during a solar storm that occurred on 10–11 October 2024, which led to bright displays of aurora borealis across the Northern Hemisphere. They found good agreement between their modeled time series and those measured at observatories, indicating that electrical resistivity models are a tool that can provide accurate information for space weather impact planning. (Journal of Geophysical Research: Solid Earth, https://doi.org/10.1029/2025JB031813, 2025)

—Nathaniel Scharping (@nathanielscharp), Science Writer

Citation: Scharping, N. (2025), New 3D model reveals geophysical structures beneath Britain, Eos, 106, https://doi.org/10.1029/2025EO250381. Published on 10 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Pinpointing Sewage Seeps in Hawaii

Thu, 10/09/2025 - 13:09

In Hawaii, most of the population relies on private septic tanks or cesspools to dispose of sewage and other wastewater. There are more than 88,000 cesspools in the state, with about 55,000 on the Big Island alone. These systems, as opposed to more strictly regulated municipal wastewater treatment units, have a higher risk of sewage leaking into the porous substrate.

A recent study published in Frontiers in Marine Science identifies sewage-contaminated submarine groundwater discharge (SGD) sites, pinpointing specific locations that stakeholders may want to prioritize for mitigation efforts.

Modeling and Mapping

Previous studies estimated that groundwater flows deliver 3 to 4 times more discharge to oceans than rivers do, making them significant pathways for transporting pollutants.

In response to pollution concerns from the local community, a team from Arizona State University, with the support of the Hawaiʻi Marine Education and Research Center, used airborne mapping to identify locations where SGD reached the ocean along the western coastline of the Big Island.

Sewage-contaminated water (colored blue in this photograph) enters the ocean from submarine groundwater discharge sites on the Kona coast of the Big Island. Credit: ASU Global Airborne Observatory

To precisely identify these freshwater-seawater interfaces, researchers built on previous studies that used thermal sensors to capture the temperature difference between the two bodies of water. Figuring out which of these discrete interface points were problematic “was very challenging,” said Kelly Hondula, a researcher at the Center for Global Discovery and Conservation Science and first author of the study.

The team identified more than 1,000 discharge points and collected samples from 47 locations. “We chose points where we could localize freshwater emerging from the land or points of high community interest,” explained Hondula.

In addition to aerial surveys, researchers analyzed the discharge points by monitoring their salinity gradients and measuring levels of Enterococcus, a group of bacteria that frequently serve as key fecal indicators in public health testing. They integrated these data into a statistical model that used upstream land cover and known sewage sites to predict the likelihood of sewage and bacterial contamination for each SGD site along the western Hawaiʻi coastline.

The techniques allowed scientists to identify regions of the built environment that are associated with contamination. Besides areas with septic systems and cesspools, they found a high correlation between sewage discharge and development within the first 500 meters of the coast.

“Sewage going into the ground comes out in the ocean, with often a worrying level of waste contamination.”

The geology of a discharge point also contributes to its risk of contamination. Discharge points around the island’s South Kona region, for instance, feature “some of the youngest and most porous volcanic substrate in the archipelago, with little soil development and a high degree of hydrologic connectivity between point sources of pollution and coastal waters,” the authors wrote. Although South Kona has relatively sparse development, increased land use will likely have a disproportionate effect on groundwater quality, they concluded.

“We were surprised to find such clear results: Sewage going into the ground comes out in the ocean, with often a worrying level of waste contamination,” said Hondula.

Mapping Mitigation

As communities continue to invest in coastal development, understanding the effect of sewage discharge and how to avoid it is becoming an increasingly pressing concern worldwide.

As such, the new study “contributes to the growing body of evidence correlating sewage-tainted groundwater discharge with coastal water quality, showing a strong linkage between wastewater and development in the nearshore area. That’s something that land managers and conservation scientists should really take into account,” said Henrietta Dulai, a geochemist at the University of Hawaiʻi at Mānoa who was not involved in the study.

The state of Hawaii has recognized the particular risk posed by largely unregulated cesspools leaking sewage-contaminated groundwater to the ocean. In fact, there is a state mandate to eliminate cesspools by 2050, but the associated cost is slowing the process.

Many scientists say the cost of phasing out cesspools is far outweighed by the health benefits. “We need to consider the financial sides of replacing cesspools versus the benefit of preserving the water quality for the environment and the people,” said Tristan McKenzie, a researcher at the University of Gothenburg, Sweden, who was not involved in the study. “Studies like this highlight why we need to act now.”

—Anna Napolitano (@anna83nap; @anna83nap.bsky.social), Science Writer

Citation: Napolitano, A. (2025), Pinpointing sewage seeps in Hawaii, Eos, 106, https://doi.org/10.1029/2025EO250376. Published on 9 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

A Step Toward AI Modeling of the Whole Earth System

Thu, 10/09/2025 - 13:08
Source: Journal of Geophysical Research: Machine Learning and Computation

Modelers have demonstrated that artificial intelligence (AI) models can produce climate simulations with more efficiency than physics-based models. However, many AI models are trained on past climate data, making it difficult for them to predict how climate might respond to future changes, such as further increases in the concentration of greenhouse gases.

Clark et al. have taken another step toward using AI to model complex Earth systems by coupling an AI model of the atmosphere (called the Ai2 Climate Emulator, or ACE) with a physical model of the ocean (called a slab ocean model, or SOM) to produce a model they call ACE2-SOM. They trained ACE2-SOM on output of a 100-kilometer-resolution physics-based model from a range of climates.

In response to increased atmospheric carbon dioxide, consistent with its target model, ACE2-SOM predicted well-known responses, such as surface temperature increasing more strongly over land than over ocean, and wet areas becoming wetter and dry areas becoming drier. When the researchers compared their results with those of a 400-kilometer-resolution version of the physics-based model they were emulating, they found that ACE2-SOM produced more accurate and cost-effective predictions: ACE2-SOM used 25 times less power while providing a resolution that was 4 times finer.

But ACE2-SOM struggled when the researchers asked it to predict what would happen if atmospheric carbon dioxide levels rose rapidly (e.g., suddenly quadrupled). While the ocean surface temperature took the appropriate time to adjust, the atmosphere almost immediately shifted to the equilibrium climate under the new carbon dioxide concentration, even though physical laws would dictate a slower response.

To become fully competitive with physics-based models, AI climate models will need to become better able to model unusual situations, the authors write. The slab ocean model used in this study is also highly simplified. So to maintain their efficiency advantage while improving realism, AI models will also need to incorporate additional parts of the Earth system, such as ocean circulation and sea ice coverage, the researchers add. (Journal of Geophysical Research: Machine Learning and Computation, https://doi.org/10.1029/2024JH000575, 2025)

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2025), A step toward AI modeling of the whole Earth system, Eos, 106, https://doi.org/10.1029/2025EO250362. Published on 9 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Southern Ocean Salinity Could Be Triggering Sea Ice Loss

Thu, 10/09/2025 - 13:08

This is an authorized translation of an Eos article. Esta es una traducción al español autorizada de un artículo de Eos.

The Southern Ocean exists in a state of precarious equilibrium. The sea is stratified, with cold water at the surface and relatively warm water below. It is an inherently unstable situation: all else being equal, the warm water should rise to the surface. But it is saltier, and therefore denser, so it stays at depth. The cold upper layer, meanwhile, is kept fresher by snowfall and by sea ice, which forms near the coast and then drifts north into the open ocean before melting.

Over the past 10 years, sea ice cover has been shrinking as ocean temperatures have warmed. The rapid melting has delivered even more fresh water to the surface, which should reinforce the insulating capacity of the cold water layer and allow the sea ice to expand again.

That feedback loop, however, appears to have broken down. New satellite data have revealed that the ocean around Antarctica, against all expectations, is becoming saltier.

The study was published in Proceedings of the National Academy of Sciences of the United States of America (PNAS).

Measuring Where It’s Hard to Measure

Sea ice, rough seas, and permanent darkness make it practically impossible to monitor the salinity of the Southern Ocean from a ship during winter. Only in recent years has it become possible to measure Southern Ocean salinity from space. Satellites can observe the brightness temperature of the ocean surface, a measure of the radiation emitted at the sea surface. The fresher the water, the higher the brightness temperature.

The technique works well in warmer waters, but in cold water the brightness temperature does not vary as much as the salinity changes. Because these changes are generally quite subtle to begin with, satellites had not been able to detect them accurately in the polar regions. In these areas, sea ice also tends to cloud the signal.

Recent advances in satellite technology, however, have markedly improved the sensitivity of brightness readings, and new algorithms allow researchers to remove the noise generated by sea ice.

Oceanographer Alessandro Silvano of the University of Southampton and his colleagues analyzed the past 12 years of salinity records from the European Space Agency’s Soil Moisture and Ocean Salinity (SMOS) satellite. For Alex Haumann, a climate scientist at Ludwig Maximilian University of Munich, Germany, and a member of the team, having these wide-ranging data, which cover the entire Southern Ocean at a resolution of 25 square kilometers, is a revolutionary change. “Because of the large coverage and the time series you can get, it’s super valuable. It’s really a new tool for monitoring this system,” he said.

“With warming, we expect more fresh water to flow into the ocean. So it’s quite surprising that this saltier water is appearing at the surface.”

Yet when the team saw that salinity had increased over that period, they couldn’t help but question the technology. To verify what they were observing, they turned to Argo floats, automated floats that sample water down to depths of 2,000 meters. A network of the floats drifts through the world’s seas, including the Southern Ocean.

To Silvano’s surprise and dismay, the floats corroborated the satellite data. “They show the same signal,” he said. “We thought, okay, this is real. It’s not an error.”

When the team compared the salinity data with sea ice trends, they saw an unsettling pattern. “There is a very high correlation between surface salinity and sea ice cover,” Haumann explained. “When salinity is high, sea ice is low. When salinity is low, there is more sea ice.”

“With warming, we expect more fresh water to flow into the ocean. So it’s quite surprising that this saltier water is appearing at the surface,” said Inga Smith, a sea ice physicist at the University of Otago in New Zealand who was not involved in the research.

A Changing Regime

The most plausible explanation for the rise in salinity, according to Silvano, is that the delicate layering of Antarctic waters has been disrupted and the warmer, saltier water below is now reaching the surface, making the surface too warm for sea ice to form.

Although he stressed that it is too early to pin down the cause of the upwelling, Silvano suggested it could be driven by the strengthening of the westerly winds around Antarctica as a consequence of climate change. He said he fears that Antarctica’s natural damage-control mechanism, in which melting releases fresh water that in turn traps the warm water at depth and ultimately allows more sea ice to form, may have broken down irreversibly.

The weakening of ocean stratification threatens instead to create a dangerous new feedback in which powerful convection currents draw even more warm, salty water up from the depths, leading to runaway ice loss.

“We think this could be a regime shift, a change in the ocean and ice system in which there is permanently less ice,” Silvano said.

“We have to find ways to monitor the system, because it is changing very quickly.”

Wolfgang Rack, a glaciologist at the University of Canterbury in New Zealand who was not involved in the research, said the satellite record is not yet long enough to show whether the rise in salinity is an anomaly or a new normal. Nevertheless, he added, “It’s quite unlikely that it’s just an anomaly, because the signal is so significant.”

Zhaomin Wang, an oceanographer at Hohai University in Nanjing, China, who was not involved in the research, called the study a “very solid result” but cautioned that it is still too early to conclusively attribute the sea ice retreat to upwelling. “It’s quite difficult to disentangle cause and effect between Antarctic sea ice change and surface salinity change,” he said, “because it’s a coupled system, which makes it hard to determine which process initiates the changes.”

For Haumann, the findings show how crucial new technology is for tracking changes in the Southern Ocean. “We have to find ways to monitor the system, because it is changing very quickly,” he said. “This is one of the most remote regions on Earth, but one of the most critical for society. Most of the excess heat in the climate system ends up in this region, and that has helped keep the planet warming at a relatively moderate rate.”

“Now we don’t really know what is going to happen with that,” he said.

—Bill Morris, Science Writer

This translation by Saúl A. Villafañe-Barajas (@villafanne) was made possible by a partnership with Planeteando and Geolatinas. Esta traducción fue posible gracias a una asociación con Planeteando y Geolatinas.

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The report of the Board of Inquiry into the 14 January 2025 McCrae Landslide

Thu, 10/09/2025 - 06:22

The tribunal has concluded that a major leak in a water main, which released 40 million litres of water, triggered the failure.

On 14 January 2025, the McCrae landslide occurred on the Mornington Peninsula in Australia. The site is located at [-38.34631, 144.93500]. I posted about this event at the time, noting that local residents had observed large volumes of water bubbling out of the ground in the period leading up to the failure. The landslide caused property damage and it resulted in serious injuries to one person.

In the aftermath of the landslide, the Victorian Government established a formal Independent Board of Inquiry into the events – a rare response to a landslide of this type. That tribunal has now published its conclusions in a report that is available online. It contains 30 recommendations, some of which are specific to this site, whilst others cover landslide management and response more generally. These have widespread application, and it is worth a read.

The Report includes this image of the aftermath of the McCrae landslide:-

The 14 January 2025 McCrae landslide. Image from the Board of Inquiry report.

The report is admirably definitive about the causes of the landslide. It notes that there were previous periods of movement on the slope, but that the events of 14 January 2025 started with movement that was observed on 5 January 2025. It states that:

“Water was the trigger of the 5 January 2025 landslide and the McCrae Landslide. The source of that water was the burst water main at Bayview Road.”

The Board of Inquiry has calculated that the burst water main released about 40 million litres of water. The leak started at least 150 days before the landslide occurred, and there were numerous reports made to the water authority that there were problems at the site. However, the leak was not detected and repaired.

As I noted above, some of the recommendations pertain to landslide management more generally. One (Recommendation 7) highlights the need for proper protocols to respond to landslide incidents (this is a widespread problem). Others (Recommendations 18 and 21) highlight the need for better training and education with regard to landslides, whilst there is also a focus on a better understanding and identification of landslide risk (Recommendations 20 and 23), and clarity about responsibility for landslide management (Recommendations 29 and 30).

News reports in Australia indicate that the Victorian Government has accepted all the findings of the McCrae landslide inquiry. Plans are now in place to ensure that the issues at the site are addressed and that the householders who have suffered such heavy losses are treated appropriately.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Sharpiegate Scientist Takes the Helm at NOAA

Wed, 10/08/2025 - 18:23
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

Meteorologist and atmospheric scientist Neil Jacobs was confirmed as the new leader of NOAA on Tuesday evening.

Jacobs has a PhD in atmospheric science and worked in weather monitoring before joining NOAA in 2018.

But Jacobs is perhaps best known for his role in “Sharpiegate.” In 2019, during President Trump’s first term, Trump claimed that Alabama was in the path of Hurricane Dorian. After the claim met pushback, the president held a press conference and showed members of the media a map of the hurricane’s path that had been altered with a Sharpie, and NOAA issued a statement backing Trump’s claim.

President Trump displayed a map that altered the projected path of Hurricane Dorian with Sharpie. (The inked-in addition extends the white “Potential track area” and includes the Florida Panhandle, southern Georgia, and southeastern Alabama.) Credit: The White House

At the time, Jacobs was the acting NOAA administrator, and had approved the unsigned statement. A National Academy of Public Administration report later found that his involvement with the statement violated NOAA’s scientific integrity policy.

At Jacobs’ confirmation hearing in July, he said that, if a similar situation arose in the future, he would handle it differently. He also said he supported proposed cuts to NOAA’s budget, and that his top priorities included staffing the National Weather Service office, reducing the seafood trade deficit, and “return[ing] the United States to the world’s leader in global weather forecast modeling capability.”


Jacobs made no mention of climate change in his opening statement. When asked whether he agreed that human activities are the dominant cause of observed warming over the last century, he noted “that natural signals are mixed in there” but that “human influence is certainly there” too.

The Senate voted 51-46 to confirm Jacobs, in a session during which they also confirmed a cluster of attorneys and ambassadors (including former NFL star Herschel Walker as ambassador to the Bahamas).

Carlos Martinez, a senior climate scientist at the Union of Concerned Scientists, expressed concern in a statement published before Jacobs’ confirmation hearing.

“Despite his relevant expertise and career experience, Dr. Jacobs has already demonstrated he’s willing to undermine science and his employees for political purposes as he did during the infamous ‘Sharpiegate’ scandal,” Martinez wrote.

Bluesky users reacted to the news. Credit: Michael Battalio @battalio.com via Bluesky

Others were more cautiously optimistic, noting his experience as a scientist. “It could be worse,” noted one Redditor. “He’s an actual atmospheric scientist and a known quantity.”

“I’m hopeful that he’s learned how to fight within the political system — because he is going to have to fight,” former NOAA administrator Rick Spinrad told Bloomberg in August.

—Emily Gardner (@emfurd.bsky.social), Associate Editor

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

How Might Leftover Corn Stalks Halt Fugitive Carbon?

Wed, 10/08/2025 - 13:12

Across North America, abandoned oil and gas wells are leaking carbon dioxide and other greenhouse gases into the atmosphere. As of 2022, there were more than 123,000 documented orphaned wells in the United States, but researchers suspect the real number may be anywhere from 310,000 to 800,000.

Abandoned wells can be plugged by filling the drill holes with water or oil, but that process requires a substantial amount of liquid, as well as liquid assets. It would take 26 billion gallons—an amount that would fill almost 40,000 Olympic-size swimming pools—to plug 120,000 wells, with each well costing up to $1 million. (That’s $120 billion in total.)
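Those figures are straightforward to sanity-check. Here is a quick sketch (our own arithmetic; the Olympic-pool volume of roughly 660,000 gallons is an assumption we supply for the conversion):

```python
GALLONS_TOTAL = 26e9        # liquid needed to plug ~120,000 wells
WELLS = 120_000
POOL_GALLONS = 660_000      # approximate volume of an Olympic-size pool
COST_PER_WELL = 1_000_000   # upper-end plugging cost, in dollars

print(GALLONS_TOTAL / POOL_GALLONS)  # ~39,000 pools ("almost 40,000")
print(GALLONS_TOTAL / WELLS)         # ~217,000 gallons per well
print(WELLS * COST_PER_WELL / 1e9)   # 120.0, i.e., $120 billion in total
```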

“On the one hand, you have these underutilized waste products. On the other hand, you have abandoned oil wells that need to be plugged. It’s an abundant resource meeting an urgent demand.”

In a new study published in Energy Conversion and Management, researchers weighed the possibility of plugging wells and sequestering carbon with bio-oil made from vegetative waste. Their goal was to see whether the production of bio-oil could be a source of revenue for farmers while the oil itself could prevent greenhouse gases from escaping from abandoned wells.

“On the one hand, you have these underutilized waste products,” explained Mark Mba-Wright in a statement. Mba-Wright is a coauthor of the new paper, engineering professor at Iowa State University, and systems engineer at its Bioeconomy Institute. “On the other hand, you have abandoned oil wells that need to be plugged. It’s an abundant resource meeting an urgent demand.”

Biomass Bounty

The production of bio-oil starts with pyrolysis, the process in which vegetative waste decomposes under intense heat (at least 1,000°F, or about 538°C) in an oxygen-free environment. Pyrolysis produces three products: a liquid (bio-oil), a solid (biochar), and a gas. The gas is used to fuel future pyrolysis efforts, biochar can be sold as a soil amendment, and storing bio-oil underground has long been touted as an effective way to sequester carbon.

The fields and forests of the United States are ripe with plants and thus vegetative waste that could be used to produce bio-oil. For example, “for every kilogram of corn that the farmer produces, an additional kilogram of corn stover or biomass is produced,” said Mba-Wright.

Corn stover—the stalks, husks, and cobs left over after harvest—is a leading source of biomass for Midwestern farmers. In the western United States, woody forest debris is more widely available. To address this diversity of resources, Mba-Wright and his colleagues investigated the bio-oil potential of corn stover, switchgrass, pine, tulip poplar, hybrid poplar, and oriented strand board (an engineered product made with wood flakes and adhesives).

In partnership with Charm Industrial, a private carbon capture company, Mba-Wright and his colleagues sought to understand whether corn stover and other feedstocks would be suitable for bio-oil production, whether the process would be economically helpful to farmers, and whether the processing-to-plugging pathway would be effective at sequestering carbon.

Small-Scale Pyrolysis Feasibility

Charm has been using pyrolysis at a commercial scale for years, said Mba-Wright, but building large plants requires significant capital investment and risk.

Instead of a large, stationary plant, the team modeled the environmental and economic feasibility of an array of mobile pyrolysis units that could be located on farms. “You can imagine a farmer might be using his tractor or his combine on his field, and on the back of the unit have one of Charm’s pyrolysis units. And instead of letting the waste go to the field, it would be processed on site,” Mba-Wright explained.

In the modeled mobile pyrolysis scenario, the researchers found that the process could generate 5.3 tons of bio-oil and 2.5 tons of biochar for every 10 tons of corn stover. This estimate is slightly lower than the yield of bio-oil produced by other pyrolysis methods but is still reasonable.

The process of taking each feedstock from harvest to well plugging was carbon negative, the scientists found. Switchgrass had the highest carbon footprint at −0.62 kilogram of carbon dioxide (CO2) per kilogram of oil, and oriented strand board had the lowest carbon footprint at −1.55 kilograms of CO2 per kilogram of oil. Corn was in the middle, weighing in at −1.18 kilograms of CO2 per kilogram of oil.
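Taken together, the yield and footprint figures imply roughly how much CO2 a single batch of corn stover could remove. A minimal sketch (our own arithmetic using only the numbers quoted above; the constant names are ours):

```python
OIL_TONS_PER_10_TONS_STOVER = 5.3  # bio-oil yield reported for corn stover
CO2_PER_KG_OIL_CORN = 1.18         # kg of CO2 removed per kg of corn stover bio-oil

def net_co2_removed(stover_tons: float) -> float:
    """Approximate tons of CO2 removed by pyrolyzing corn stover and
    storing the resulting bio-oil in an abandoned well."""
    oil_tons = stover_tons * OIL_TONS_PER_10_TONS_STOVER / 10.0
    # The footprint is a mass-to-mass ratio, so it applies to tons as well.
    return oil_tons * CO2_PER_KG_OIL_CORN

print(f"{net_co2_removed(10):.1f} tons of CO2 per 10 tons of stover")  # ~6.3
```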

An Array of Economics

Modeling indicated that the new pyrolysis process would be economically feasible as well, costing between $83.60 and $152 per ton of CO2. (The difference accounts for the added cost of biochar sequestration.) These costs fall within the typical range of carbon credit prices.

“The most important message is that there’s an economic case for carbon removal,” Mba-Wright said.

The scientists admit that to many individual farmers, however, this economic case might not seem like a bargain: The base capital cost of each pyrolysis unit would be $1.28 million.

“My impression was they were looking at this from the firm perspective, not exactly the farmer perspective,” said Sarah Sellars, an assistant professor of agricultural economics at South Dakota State University. “A base capital cost of $1.28 million? No farmer would invest in that. If they were going to spend $1.28 million, they’d probably buy more land.”

Mba-Wright said that although the costs are, indeed, significant, there are different options to consider. “Farmers could lease the equipment,” he suggested, adding that businesses could offer a lease-to-own option. “There are also intermediate solutions,” he added, “where you may have a unit that’s shared among farms.”

He acknowledged other challenges as well. Farmers “have a tight schedule during harvesting and planting. They may not want to have to operate another piece of equipment, so that’s something that suppliers of the unit will have to develop: a system that is easy for the farmer to use.”

Life Is Messy

On paper, sequestering carbon while halting fugitive emissions from orphan wells looks like a slam dunk.

But carbon and climate are complicated. “We can look at things from theory and economics and carbon mitigation, but then when it comes to these other variables, like the policy and the infrastructure to implement them, I think we should be cautious,” said Sellars. “Unfortunately, a lot of scientists don’t like to hear that, though. I mean, that’s why economics is called a dismal science.”

Lauren Gifford, director of the Soil Carbon Solutions Center at Colorado State University, agreed, adding that “a lot of what we’re reading in articles and things are promises or goals, but the industry just hasn’t taken off enough for us to see how these things play out at scale. A lot of what we see now is either hope or plans, and we know that real life is messy.”

—Sarah Derouin (@sarahderouin.com), Science Writer

Citation: Derouin, S. (2025), How might leftover corn stalks halt fugitive carbon?, Eos, 106, https://doi.org/10.1029/2025EO250378. Published on 8 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
