Eos: Science News by AGU

Alaska Awaits Response from FEMA in the Aftermath of Major Floods

Mon, 10/20/2025 - 16:45
Research & Developments is a blog for brief updates that provide context for the flurry of news that impacts science and scientists today.

Major floods in Alaska have killed at least one person and displaced thousands more over the past two weeks. Many of the displaced may not be able to return home for 18 months or longer, according to Alaska Gov. Mike Dunleavy.

Tropical Storm Halong formed in the northern Philippine Sea on 5 October and had strengthened into a Category 4 typhoon by 7 October. Though it was considered an ex-typhoon by the time it reached western Alaska, the storm brought wind speeds of up to 113 miles per hour (181 kilometers per hour), along with severe flooding across the Yukon Delta, Kuskokwim Delta, and Norton Sound.

 

Among the hardest hit population centers were the villages of Kipnuk and Kwigillingok, home to a combined 1,000 people, mostly Alaska Native or American Indian. At this time of year, the remote villages can only be reached by water or by air.

In Kipnuk, water levels rose 5.9 feet (1.8 meters) above the normal highest tide line. In Kwigillingok, water levels measured 6.3 feet (1.9 meters) above the normal highest tide line—more than double the previous record set in 1990. According to a letter from the governor’s office to President Trump, 90% of structures in Kipnuk and 35% of structures in Kwigillingok have been destroyed.

The Alaska Air and Army National Guard, the U.S. Coast Guard, and Alaska State Troopers evacuated hundreds of residents to the regional hub of Bethel, then to Anchorage, in what the Alaska National Guard called the largest airlift operation in state history.

“It’s been an all-hands-on-deck endeavor, and everybody is trying to support their fellow Alaskans in their time of need,” said Col. Christy Brewer, the Alaska National Guard director of joint operations, in a 19 October statement.

Silence From FEMA

But calls for assistance from the Federal Emergency Management Agency seem to have so far gone unanswered, leaving some people asking, “Where is FEMA?”

“An urgent question. According to the FEMA Daily Briefing, a presidential disaster declaration was requested on October 16th. To the best of my knowledge it hasn’t been granted. Any event of this size should be an easy and immediate yes.”

—Dr. Samantha Montano (@samlmontano.bsky.social), in an 18 October post on Bluesky

As reported by the New York Times, the EPA revoked a $20 million grant in May that was intended to protect Kipnuk from extreme flooding. The grant cancellation was likely part of a larger effort by the administration to shift the burden of disaster response to states.

On 16 October, Dunleavy submitted a request to President Trump to declare a major disaster for the state.

The letter notes that Alaska has seen 57 state-declared disasters since November 2018, 14 of which have been approved for federal disaster assistance. There have been 14 state-declared disasters in Alaska in the last 12 months alone, including fires, freezes, landslides, and floods.

“It is anticipated that more than 1,500 Alaskans will be evacuated to our major cities, many of whom will not be able to return to their communities and homes for upwards of 18 months,” Gov. Dunleavy wrote. “This incident is of such magnitude and severity that an effective response exceeds state and local capabilities, necessitating supplementary federal assistance to save lives, protect property, public health, and safety, and mitigate the threat of further disaster.”

On 17 October, Alaska’s senators and state representative (all Republicans) also submitted a letter to President Trump, urging him to approve the governor’s request for a major disaster declaration.

Also on 17 October, Vice President JD Vance said on X that he and the president were “closely tracking the storm devastation,” and that the federal government was working closely with Alaska officials. On 18 October, Sen. Lisa Murkowski (R-AK) said she believed FEMA representatives were “totally on the ground.”

However, as of 20 October, the incident is not listed in FEMA’s disaster declaration database.

—Emily Gardner (@emfurd.bsky.social) Associate Editor

These updates are made possible through information from the scientific community. Do you have a story about science or scientists? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The Southern Ocean May Be Building Up a Massive Burp

Mon, 10/20/2025 - 13:16
Source: AGU Advances

The ocean has helped mitigate global warming by absorbing around a quarter of anthropogenic carbon dioxide (CO2) emissions, along with more than 90% of the excess heat those emissions generate.

Many efforts, including assessments by the Intergovernmental Panel on Climate Change, have looked at how the oceans may continue to mitigate increasing emissions and global warming. However, few have looked at the opposite question: How will the oceans respond if atmospheric CO2 and the associated excess heat begin to decline under net negative emissions?

Frenger et al. examined what might happen in the Southern Ocean if, after more than a century of human-induced warming, global mean temperatures were reduced via CO2 removal from the atmosphere. The Southern Ocean is a dynamic system, with large-scale upwelling and a robust ability to take up excess carbon and heat. To better understand how the Southern Ocean would behave under net negative carbon conditions, the researchers modeled how the ocean and the atmosphere would interact.

They used the University of Victoria climate model, UVic v2.9, to simulate multicentury timescales and carbon cycle feedbacks. UVic combines an atmospheric energy–moisture balance model, an ocean circulation and sea ice model, a land biosphere model, and an ocean biogeochemistry model. The researchers used UVic to run an idealized climate change scenario commonly used in climate modeling: Emissions increase until atmospheric CO2 levels double after 70 years, followed by a steep emissions cut and subsequent sustained net negative emissions.
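As a rough illustration of that idealized scenario (a sketch under stated assumptions, not the actual UVic configuration), the forcing can be thought of as a CO2 trajectory that compounds at about 1% per year, doubling after roughly 70 years, before net negative emissions draw it back down:

```python
import numpy as np

# Rough illustrative sketch of the idealized scenario described above, NOT
# the actual UVic ESCM setup. Assumptions: a 280 ppm preindustrial baseline,
# ~1%/yr growth (so CO2 doubles after ~70 years), and an assumed 0.5%/yr
# decline once net negative emissions begin.
CO2_0 = 280.0          # preindustrial CO2, ppm (assumed)
RAMP_YEARS = 70        # years of growth; 1.01**70 ~ 2.0, i.e., a doubling
DRAWDOWN_RATE = 0.005  # fractional decline per year under net negative emissions (assumed)

def idealized_co2(n_years: int) -> np.ndarray:
    """Return an idealized CO2 trajectory (ppm): ramp up, then draw down."""
    years = np.arange(n_years)
    ramp = CO2_0 * 1.01 ** np.minimum(years, RAMP_YEARS)
    decay = np.where(
        years > RAMP_YEARS,
        ramp * np.exp(-DRAWDOWN_RATE * (years - RAMP_YEARS)),
        ramp,
    )
    # Keep the drawdown from overshooting below the preindustrial baseline.
    return np.maximum(decay, CO2_0)

trajectory = idealized_co2(600)  # a multicentury run, matching the study's timescales
```

The growth and drawdown rates here are placeholders chosen only to reproduce the shape of the scenario: a doubling at year 70, then a gradual multicentury decline.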

The results showed that after several centuries of net negative emissions levels and gradual global cooling, the Southern Ocean abruptly released a burst of accumulated heat—an oceanic “burp”—that led to a decadal- to centennial-scale period of warming. This warming was comparable to average historical anthropogenic warming rates. The team said that because of seawater’s unique chemistry, this burp released relatively little CO2 along with the heat.

Frenger and colleagues note that their work uses a model with intermediate-level complexity and an idealized climate change scenario, but that their findings were consistent when tested with other modeling setups. They say the Southern Ocean’s importance to the global climate system, including its role in heat release to the atmosphere in a cooling climate, should be studied further and contemporary changes closely monitored. (AGU Advances, https://doi.org/10.1029/2025AV001700, 2025)

—Sarah Derouin (@sarahderouin.com), Science Writer

Citation: Derouin, S. (2025), The Southern Ocean may be building up a massive burp, Eos, 106, https://doi.org/10.1029/2025EO250385. Published on 20 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Publishing Participatory Science: The Community Science Exchange

Mon, 10/20/2025 - 12:00
Editors’ Vox is a blog from AGU’s Publications Department.

The Community Science Exchange was founded in 2021 to elevate the work of scientists, scholars, and community members collectively engaged in participatory science and to broaden the reach of their discoveries, results, and science-based solutions. Now more than ever, we would like to recognize the importance of the work of the Community Science Exchange in fostering an inclusive scientific community and strengthening public trust in science. Here, we highlight the publication outlets offered by the Community Science Exchange and encourage the AGU community to contribute.


Within equitable participatory science (a collective scientific endeavor that gives significant voice and weight to both scientists and the public), the Community Science Exchange defines “community” variously as place-based, as a group defined by a shared culture or heritage, and/or as a group defined by a shared experience. From environmental concerns to public health, anthropology to engineering, the Community Science Exchange aims to encourage, foster, and promote co-production between science and community. To aid in the integration of local knowledge and lived experience, the Community Science Exchange specifically includes community voice in its publications: as authors, in sections devoted to community description and community impact, and in quotes from community members involved in and/or affected by the work. Scientists and academic scholars with an interest in elevating their community partners within their publications, instead of hiding them in an acknowledgment, should consider publication within the Exchange.

The American Geophysical Union hosts the Community Science Exchange with further support and guidance from five partnership organizations: the American Anthropological Association (AAA), the American Public Health Association (APHA), the Association for Advancing Participatory Sciences (AAPS), the Unión Geofísica Mexicana (UGM), and Wiley. To broaden the publication venues for community members and organizations, practitioners, boundary spanners, and others who may not receive career benefits from scientific journal publication, the Community Science Exchange has created two new avenues for those who want to publish and share their work: the journal Community Science and the online publication venue managed by AGU, the Hub.

Since its first issue in June 2022, Community Science has published articles discussing a variety of topics of interest to communities and scientists, including water quality, plastic pollution, language as a barrier to equitable access to scientific literature, and the integration of Indigenous knowledge in shellfish monitoring. Community Science has also participated in several special collections, including collections on air quality, equitable co-production, and sustainable agriculture. Growing steadily in submissions, Community Science received a 2024 PROSE Award for Journals from the Association of American Publishers. The journal is open access, allowing anyone to read the published work for free.

Because Community Science is a peer-reviewed journal, manuscripts go through an evaluation and revision process to ensure that research published in the journal rigorously advances both science and community outcomes. As at the other journals within the AGU journal portfolio, those who review for Community Science are welcome to invite a co-reviewer. This practice can help early-career researchers become thorough and constructive reviewers, and it can invite experienced community organizers, boundary spanners, and those with relevant lived expertise to engage in thoughtful reviews complementary to the scientific review. Publications in both Community Science and the Hub are periodically featured in Editor’s Highlights, in which editors explain what they found exciting about a work, or in Research Spotlights, which are written by Eos’s professional science writers and feature recent newsworthy work. These features offer a more approachable point of entry to explore the science.

Unlike any other journal in the AGU portfolio, the Community Science Exchange also supports an alternate publication venue – the Hub – which is hosted on the Community Science Exchange website. Broadening the definition and understanding of scientific research, work, and resources, the Hub seeks to deepen the connection between science and community.

The Hub is home to a wide variety of content, ranging from stand-alone submissions that are intentionally written outside the strictures of a scientific journal format to “complementary materials” that allow journal paper authors to enrich their articles with linked materials furthering community voice. Although the Hub isn’t a scholarly journal in the traditional sense, all submissions are editor-vetted before potential revision and publication. Any new, original content published on the Hub is now eligible to receive a permanent digital object identifier (DOI), allowing it to be cited in the references of scholarly publications and other content.

Authors can submit materials to the Hub that fall into one of four categories:

Project Descriptions are narratives of work done, or even more formalized case studies. They should include a description of the community involved, an explanation of the community knowledge utilized, and a summary of the work done. Example: Climate Safe Neighborhoods [Project Description] (doi.org/10.1029/2024CSE000101)

Protocols and Methods are for describing how the community science work was done. These could be practiced approaches, descriptions of relevant policies to be considered, or outlines of project development.

Tools and Resources are items that can help others along in their own community science work, such as datasets or visualization tools. Descriptions of useful apps are also welcome.

Educational Materials are items geared toward educating or training about community science practices. These could include instruction manuals, guidebooks, or even workshop or webinar curricula.

Because the Hub is a living initiative, evolving with the needs and desires of the community, submissions that don’t cleanly fit into any one of these categories will still be considered.

If you are interested in joining the Community Science Exchange’s efforts to expand how we view, publish, and share science, please email us at communitysci@agu.org. Whether you have a resource to submit to the Hub or an article to submit to the journal, want to serve as a reviewer, or want to apply to be an editor, we’d love to hear from you.

Finally, we want to thank all of those who have served as editors of this initiative so far, both past and present (starred are original editorial board members):

  • Julia Parrish*, current Editor-in-Chief
  • Kathryn Semmens*, current Deputy Editor of the Hub
  • Claire Beveridge*, current editor
  • Gillian Bowser, current editor
  • Muki Haklay*, current editor
  • Rajul Pandya, current editor
  • Jean Schensul*, founding Deputy Editor, current editor
  • Kevin Noone*, founding Editor-in-Chief, past editor
  • Paula Buchanan*, founding Deputy Editor, past editor
  • Shobhana Gupta*, past editor
  • Heidi Roop*, past editor
  • Roopam Shukla*, past editor

—Allison Schuette (aschuette@agu.org, 0009-0007-1055-0937), Program Coordinator, AGU Publications; Julia Parrish (0000-0002-2410-3982), Editor-in-Chief, Community Science Exchange; Kathryn Semmens (0000-0002-8822-3043), Deputy Editor, The Hub; Kristina Vrouwenvelder (0000-0002-5862-2502), Assistant Director, AGU Publications; and Sarah Dedej (0000-0003-3952-4250), Assistant Director, AGU Publications

Citation: Schuette, A., J. Parrish, K. Semmens, K. Vrouwenvelder, and S. Dedej (2025), Publishing participatory science: the Community Science Exchange, Eos, 106, https://doi.org/10.1029/2025EO255032. Published on 20 October 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0

Universities Reject Trump Funding Deal

Fri, 10/17/2025 - 16:09
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

The “Compact for Academic Excellence in Higher Education,” developed by the Trump administration and sent to nine universities on 1 October, proposes that the institutions agree to a series of criteria in exchange for preferential treatment in funding decisions.

The compact’s provisions ask universities to: 

  • Ban the consideration of any demographic factors, including sex, ethnicity, race, sexual orientation, and religion, in any admissions decisions, financial aid decisions, or hiring decisions.
  • Commit to “institutional neutrality,” create an “intellectually open campus environment,” and abolish “institutional units that purposefully punish, belittle, and even spark violence against conservative ideas.”
  • Require all employees to abstain from actions or speech related to social and political events unless such events have a direct impact on their university or they are acting in their individual capacity rather than as university representatives. 
  • Interpret the words “woman” and “man” according to “reproductive function and biological processes.”
  • Stop charging tuition for any admitted student pursuing “hard science” programs. (This applies only to universities with endowments over $2 million per undergraduate student.)
  • Disclose foreign funding and gifts.

The proposed deal was sent to the University of Pennsylvania, the University of Virginia, the University of Arizona, the University of Texas at Austin, the University of Southern California, Vanderbilt University, Dartmouth College, Brown University, and the Massachusetts Institute of Technology.

 

“Any university that refuses this once-in-a-lifetime opportunity to transform higher education isn’t serving its students or their parents—they’re bowing to radical, left-wing bureaucrats,” Liz Huston, a White House spokesperson, told Bloomberg.

Simon Marginson, a professor of higher education at Oxford University, told Time that if successful, the compact would “establish a level of federal control of the national mind that has never been seen before.” 

On 12 October, President Trump opened up the offer to all institutions of higher education in a post on social media website Truth Social.

As of 20 October, the following schools have responded to Trump’s offer:

  • Massachusetts Institute of Technology: MIT was the first to reject Trump’s offer. In a 10 October letter to the administration, MIT President Sally Kornbluth wrote that MIT’s practices “meet or exceed many standards outlined in the document,” but that the compact “also includes principles with which we disagree, including those that would restrict freedom of expression and our independence as an institution.”
  • Brown University: In a 15 October letter to the administration, Brown University President Christina H. Paxson declined the deal. She wrote that Brown “would work with the government to find solutions if there were concerns about the way the University fulfills its academic mission,” but that, like Kornbluth, she was “concerned that the Compact by its nature and by various provisions would restrict academic freedom and undermine the autonomy of Brown’s governance.”
  • University of Southern California: In a 16 October statement, USC Interim President Beong-Soo Kim informed the university community that he had declined the deal, and wrote that the university takes legal obligations seriously and is diligently working to streamline administrative functions, control tuition rates, maintain academic rigor, and ensure that students develop critical thinking skills. “Even though the Compact would be voluntary, tying research benefits to it would, over time, undermine the same values of free inquiry and academic excellence that the Compact seeks to promote,” he wrote.
  • University of Pennsylvania: In a 16 October statement, UPenn President J. Larry Jameson informed the university community that he had declined to sign the compact. “At Penn, we are committed to merit-based achievement and accountability. The long-standing partnership between American higher education and the federal government has greatly benefited society and our nation. Shared goals and investment in talent and ideas will turn possibility into progress,” he wrote.
  • University of Virginia: In a 17 October letter to the administration, UVA Interim President Paul Mahoney declined to sign the compact. “We seek no special treatment in exchange for our pursuit of those foundational goals,” the letter said. “The integrity of science and other academic work requires merit-based assessment of research and scholarship. A contractual arrangement predicating assessment on anything other than merit will undermine the integrity of vital, sometimes lifesaving, research and further erode confidence in American higher education.”
  • Dartmouth College: In an 18 October letter to the administration, Dartmouth President Sian Leah Beilock declined the deal. “I do not believe that the involvement of the government through a compact—whether it is a Republican- or Democratic-led White House—is the right way to focus America’s leading colleges and universities on their teaching and research mission,” Beilock wrote.
  • University of Arizona: In a 20 October announcement, President Suresh Garimella said he had declined to agree to the proposal and had instead submitted a Statement of Principles to the U.S. Department of Education informed by “hundreds of U of A stakeholders and partner organizations.” “This response is our contribution toward a national conversation about the future relationship between universities and the federal government. It is critical for the University of Arizona to take an active role in this discussion and to work toward maintaining a strong relationship with the federal government while staying true to our principles,” Garimella wrote.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

20 October: This article was updated to include the University of Virginia and Dartmouth College.

21 October: This article was updated to include the University of Arizona.

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0

When the Earth Moves: 25 Years of Probabilistic Fault Displacement Hazards

Fri, 10/17/2025 - 16:08
Editors’ Vox is a blog from AGU’s Publications Department.

Earthquake surface ruptures can cause severe damage to infrastructure. While structures can be engineered to accommodate fault movement during an earthquake, one of the best approaches is to avoid unnecessary risk in the first place.

A new article in Reviews of Geophysics explores the history of Probabilistic Fault Displacement Hazard Assessments (PFDHA) and recent efforts to improve them with modern methods. Here, we asked the authors to give an overview of PFDHAs, how scientists’ methods have evolved over time, and future research directions.

What is fault displacement and what kinds of risks are associated with it?

Fault displacement occurs when an earthquake breaks the ground surface along a fault. This displacement can shift the ground horizontally and/or vertically—by several meters in the largest earthquakes. Such ruptures pose serious risks to infrastructure located across faults—such as pipelines, transportation systems, dams, and power generation facilities—because these structures may be torn apart or severely damaged. While some facilities can be engineered to tolerate limited movement, many critical systems are highly vulnerable, making it essential to evaluate this hazard.

This figure shows the Trans-Alaska Pipeline crossing the Denali Fault, which ruptured during the 2002 earthquake. Photos and diagrams illustrate how the pipeline was designed to bend and slide, allowing it to survive several meters of fault movement without breaking. Credit: Valentini et al. [2025], Figure 5

In simple terms, what are Probabilistic Fault Displacement Hazard Assessments (PFDHA)?

A Probabilistic Fault Displacement Hazard Assessment (PFDHA) is a quantitative method for estimating the likelihood that an earthquake will rupture the surface at a specific site and for anticipating the magnitude of the displacement. Instead of giving a single answer, a PFDHA provides the probabilities of exceeding different displacement levels over different reference periods of interest. This allows engineers and planners to evaluate risks in a structured way and make informed decisions about building designs or land use near faults.

This diagram explains how scientists estimate the expected amount of displacement due to an earthquake at a specific site. It shows the main steps and data used in a Probabilistic Fault Displacement Hazard Assessment (PFDHA). Credit: Valentini et al. [2025], Figure 8
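The probabilistic logic of a PFDHA can be sketched in a few lines. This is a minimal, hedged illustration with made-up placeholder parameters (every numeric value below is an assumption, not a published model): the annual rate of exceeding a displacement threshold at a site is the product of the earthquake rate, the probability that the rupture reaches the surface, and the probability that the surface displacement exceeds the threshold, here drawn from an assumed lognormal distribution.

```python
import math

# Minimal sketch of PFDHA-style hazard logic with placeholder numbers.
# Every default parameter below is an assumption for illustration only.
def annual_exceedance_rate(
    d_m: float,
    eq_rate_per_yr: float = 0.01,    # assumed rate of qualifying earthquakes on the fault
    p_surface_rupture: float = 0.6,  # assumed P(rupture breaks the surface)
    median_disp_m: float = 1.0,      # assumed median surface displacement, meters
    sigma_ln: float = 0.8,           # assumed lognormal spread of displacement
) -> float:
    """Annual rate at which surface displacement at the site exceeds d_m meters."""
    # P(D > d) for a lognormal displacement distribution.
    z = (math.log(d_m) - math.log(median_disp_m)) / sigma_ln
    p_exceed = 0.5 * math.erfc(z / math.sqrt(2))
    return eq_rate_per_yr * p_surface_rupture * p_exceed

# A hazard curve: the exceedance rate falls as the displacement threshold rises.
hazard_curve = {d: annual_exceedance_rate(d) for d in (0.1, 0.5, 1.0, 2.0)}
```

Real assessments also integrate over earthquake magnitudes, rupture locations, and distributed (off-fault) ruptures; this sketch shows only the core "rate times conditional probabilities" structure the answer describes.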

How have Fault Displacement Hazard Assessments evolved over time?

The first systematic PFDHA was developed in the early 2000s for the Yucca Mountain nuclear waste repository in the USA. Since then, the methodology has expanded from normal faults to include strike-slip and reverse faults worldwide. Over the past 25 years, new global databases of surface ruptures supporting statistical analysis, advances in statistical modeling, and international benchmark exercises have significantly improved the reliability and comparability of PFDHA approaches. In the future, the field should integrate remote sensing data, artificial intelligence, and physics-based modeling to better capture the complexity of earthquake ruptures.

What are the societal benefits of developing PFDHAs?

By quantifying the hazard of surface fault rupture, PFDHAs provide critical input for the safe design of infrastructure. This helps avoid catastrophic failures such as pipeline leaks, dam collapses and resulting flooding, or road and railway disruptions. Beyond engineering, PFDHAs also support land-use planning by identifying areas where construction should be avoided. Ultimately, these assessments reduce economic losses, improve resilience, and protect human lives in earthquake-prone regions.

What are some real-life examples of PFDHAs being developed and implemented?

One of the earliest and most influential applications was at Yucca Mountain, Nevada, where PFDHA helped assess the safety of a proposed nuclear waste repository. More recently, PFDHA approaches have been adopted internationally, including in Japan and Italy, for assessing risks to dams, tunnels, and other critical infrastructure.

What are some of the most exciting recent developments in this field?

These photos show how earthquakes can damage critical infrastructure such as bridges, dams, railways, and pipelines. The images highlight both principal and distributed fault ruptures, underscoring why engineers and planners must consider both when assessing earthquake hazards. Credit: Valentini et al. [2025], Figure 4

Recent years have seen major advances thanks to new global databases such as the worldwide and unified database of surface ruptures (SURE) and the Fault Displacement Hazard Initiative (FDHI), which collect tens of thousands of observations of past surface ruptures. Remote sensing techniques now allow scientists to map fault ruptures with unprecedented detail. Importantly, these techniques have also awakened the geological and seismological community to the relevance of moderate earthquakes. Since the 2000s and 2010s, it has become clear that earthquakes smaller than magnitude 6.5 can also produce significant surface ruptures, a threat that was often overlooked before these technological advances. Additionally, international collaborations, such as the International Atomic Energy Agency benchmark project, are helping to unify approaches and ensure that PFDHAs are robust and reproducible across different regions.

What are the major unsolved or unresolved questions and where are additional research, data, or modeling efforts needed?

Several challenges remain. A key issue is the limited number of well-documented earthquakes outside North America and Japan, leaving other regions underrepresented in global databases. Another challenge is how to model complex, multi-fault ruptures, which are increasingly observed in large earthquakes. Understanding the controls on off-fault deformation, as revealed by modern geodetic techniques during large to moderate events, is another critical open question. This knowledge could improve our ability to predict rupture patterns and displacement amounts.

Similarly, the role of near-surface geology in controlling the location, size, and distribution of surface ruptures for a given earthquake magnitude remains poorly constrained and deserves further study. Standardizing terminology and methods is also essential for consistent hazard assessments. Looking forward, more high-quality data, integration of physics-based models, and improved computational frameworks will be crucial to advance the field.


—A. Valentini (alessandro.valentini@univie.ac.at, 0000-0001-5149-2090), University of Vienna, Austria; Francesco Visini (0000-0001-9582-6443), Istituto Nazionale di Geofisica e Vulcanologia, Italy; Paolo Boncio (0000-0002-4129-5779), Università degli Studi “G. d’Annunzio,” Italy; Oona Scotti (0000-0002-6640-9090), Autorité de Sûreté Nucléaire et de Radioprotection, France; and Stéphane Baize (0000-0002-7656-1790), Autorité de Sûreté Nucléaire et de Radioprotection, France

Editor’s Note: It is the policy of AGU Publications to invite the authors of articles published in Reviews of Geophysics to write a summary for Eos Editors’ Vox.

Citation: Valentini, A., F. Visini, P. Boncio, O. Scotti, and S. Baize (2025), When the earth moves: 25 years of probabilistic fault displacement hazards, Eos, 106, https://doi.org/10.1029/2025EO255033. Published on 17 October 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0

Scientists Must Join Forces to Solve Forecasting’s Predictability Desert

Fri, 10/17/2025 - 11:55

Should I wear a jacket to work today, or will I be too warm? Will that hurricane miss my town, or should I prepare to evacuate? We rely on accurate short-term weather forecasts both to make mundane daily decisions and to warn us of extreme events on the horizon. At the same time, Earth system scientists focus on understanding what drives variations in temperature, precipitation, and extreme conditions over periods spanning months, decades, and longer.

Between those two ends of the forecasting spectrum are subseasonal-to-seasonal (S2S) predictions on timescales of 2 weeks to 2 months. S2S forecasts bridge the gap between short-term weather forecasts and long-range outlooks and hold enormous potential for supporting effective advance decision-making across sectors ranging from water and agriculture to energy, disaster preparedness, and more. Yet these timescales represent an underdeveloped scientific frontier where our predictive capabilities are weakest. Indeed, the S2S range is often referred to as the predictability desert.

Forecasts at 3- to 4-week lead times, for example, remain inconsistent. Sometimes, so-called windows of opportunity arise when models provide strikingly accurate, or skillful, guidance at this timescale. But these windows of skillful S2S forecasting are themselves unpredictable. Why do they occur when they do? Do they have recognizable precursors? And how does predictability depend on the quantity (e.g., temperature versus precipitation) being predicted?

Three interlocking puzzle pieces represent the integration of weather prediction (left) and long-term outlooks (right) with the “missing middle” of S2S predictability (center). The center piece highlights key applications—agriculture, water availability, and disaster preparedness—and the tools needed to advance S2S skill, including modeling, data assimilation (DA), artificial intelligence (AI), and multiscale process understanding. Credit: Simmi Readle/NSF NCAR

These questions are more than academic curiosities. Answering them would transform our ability to gauge the value of S2S forecasts in real time and to anticipate and respond to high-impact events such as heat waves, flooding rains, drought onset, and wildfires.

Tackling this challenge requires traditionally siloed communities—scientists focused on predicting near-term weather and those focused on projecting long-term changes in the Earth system—to coordinate efforts. Together, these communities can advance scientific understanding and predictive capabilities across scales.

Discovering Windows of Opportunity

The challenges of subseasonal-to-seasonal (S2S) prediction reflect the complex and interconnected dynamics of the Earth system.

The challenges of S2S prediction reflect the complex and interconnected dynamics of the Earth system. At these lead times, forecast skill relies not only on the accuracy of initial input atmospheric conditions—always a vital element for weather forecasts—but also on model treatments of slowly evolving components of the Earth system. These components—including the ocean state, land surface conditions, snow cover, atmospheric composition, and large-scale patterns of variability such as the Madden-Julian Oscillation (MJO), El Niño–Southern Oscillation, stratospheric quasi-biennial oscillation, and sudden stratospheric warmings—interact in ways that enhance or degrade forecast performance. Volcanic eruptions can further influence these interactions, altering circulation patterns and modulating surface climate on S2S timescales.

Researchers have made substantial progress in understanding these individual Earth system components. But we still cannot reliably anticipate when models will yield skillful forecasts because their accuracy at S2S timescales is episodic and state dependent, meaning it comes and goes and depends on various interacting conditions at any given time. A model might perform well for a given region in one season—yielding a window of opportunity—but struggle in another region or season.
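One way to make "episodic skill" concrete is to track a verification metric, such as a windowed anomaly correlation, through time and flag stretches where it exceeds a threshold. A minimal sketch on synthetic data, not an operational diagnostic (the window length, threshold, and series are all illustrative, and anomalies are simply demeaned within each window):

```python
import numpy as np

def rolling_acc(forecast, observed, window=30):
    """Correlation of forecast vs. observed anomalies over a trailing
    window, evaluated at each time step (a simple windowed skill metric)."""
    acc = np.full(len(forecast), np.nan)
    for i in range(window, len(forecast) + 1):
        f = forecast[i - window:i] - forecast[i - window:i].mean()
        o = observed[i - window:i] - observed[i - window:i].mean()
        denom = np.sqrt(np.dot(f, f) * np.dot(o, o))
        acc[i - 1] = np.dot(f, o) / denom if denom > 0 else np.nan
    return acc

# Synthetic record: the forecast tracks the observations only in the
# middle third of the series, i.e., a "window of opportunity."
rng = np.random.default_rng(1)
obs = np.sin(np.linspace(0, 20, 300)) + 0.3 * rng.standard_normal(300)
fcst = rng.standard_normal(300)                                # no skill ...
fcst[100:200] = obs[100:200] + 0.2 * rng.standard_normal(100)  # ... except here

acc = rolling_acc(fcst, obs)
skillful = np.flatnonzero(acc > 0.6)  # illustrative skill threshold
```

Run on real hindcasts, the interesting question is the one posed above: whether the flagged stretches have recognizable precursors in the slowly evolving components of the Earth system.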

So how might we get better at anticipating such windows? For starters, rather than viewing the predictive capability of models as fixed, we can treat it as a dynamic property that changes depending on evolving system conditions. This paradigm shift could help scientists focus on developing tools and metrics that help them anticipate when forecasts will be most reliable. It could also suggest a need to rethink strategies for collecting environmental observations.

Just as predictability is episodic, so too might be the value of strategically enhanced observations. For example, targeted observations of sea surface temperatures, soil moisture, or atmospheric circulation during periods when these conditions strongly influence forecast skill could be far more valuable than the same measurements made at other times. Such adaptive, or state-aware, observing strategies (say, intensifying atmospheric sampling ahead of a developing MJO event) would mean concentrating resources where and when they will matter most. By feeding these strategically enhanced observations into forecast models, scientists could improve both the forecasts themselves and the ability to evaluate their reliability.

Aligning Goals Across Disciplines

S2S timescales fall at the intersection of weather forecasts and seasonal to decadal outlooks, and the communities working on those different types of predictions have different focuses and research strengths.

To drive needed technical advances supporting improved S2S predictability, we also need a cultural shift to remove barriers between scientific disciplines. S2S timescales fall at the intersection of weather forecasts and seasonal to decadal outlooks, and the communities working on those different types of predictions have different focuses and research strengths. Weather prediction emphasizes initial condition accuracy, data assimilation, and high-resolution modeling of fast atmospheric processes. Research on longer-timescale Earth system behavior and variability emphasizes modeling of slowly evolving boundary conditions (e.g., the ocean) and coupled component interactions (e.g., between the land and the atmosphere).

Historically, these communities have operated along parallel tracks, each with its own institutions, funding structures, and research priorities. The challenge of identifying windows of opportunity at S2S timescales offers a unifying scientific problem.

Earth system features that offer potentially promising signals of S2S predictability, such as the MJO, are already shared terrain, studied through the lenses of both weather and longer-term change. Extreme events are another area of convergence: Weather models focus on forecasting specific short-lived, high-impact events, whereas Earth system models explore the conditions and teleconnections that influence the likelihood and persistence of extremes. Together, these complementary perspectives can illuminate not only what might happen but why and when skillful forecasts are possible.

The path to unlocking S2S predictability involves more than simply blending models, though. It requires aligning the communities’ scientific goals, model performance evaluation strategies, and approaches for dealing with uncertainty. These approaches include the design of model ensembles, data assimilation strategies that quantify uncertainty in initial conditions, probabilistic evaluation methods, and ways of communicating forecast confidence to users.

The path forward also entails building modeling systems that capitalize on the weather community’s expertise in initialization and the Earth system modeling community’s insights into boundary forcing and component coupling. Accurate initialization must capture all Earth system components—from soil moisture, ocean heat content, and snow cover, for example, to the state of the atmosphere, including the stratosphere. However, observations and data assimilation for several key variables, especially in the ocean, stratosphere, and other data-sparse regions, remain limited, constraining our ability to represent their influences in prediction systems.

A near-term opportunity for aligning goals and developing models lies in improving prediction of MJO-related extreme rainfall events, which arise from tropical ocean–atmosphere interactions and influence regional circulation and precipitation. This improvement will require that atmospheric convection be better represented in models, a long-standing challenge in both communities.

Emerging kilometer-scale models and machine learning offer shared innovation and collaboration spaces. Kilometer-scale models can explicitly resolve convection, validate and refine model parameterizations, and elucidate interactions between large-scale circulation and small-scale processes. Machine learning provides new avenues to emulate convection-permitting simulations, represent unresolved processes, and reduce systematic model errors.

Success with this challenge could yield immediate value for science and decisionmaking by, for example, enabling earlier warnings for flood-prone areas and supporting more informed planting and irrigation decisions in agriculture.

From Forecast Skill to Societal Resilience

The societal need for more skillful S2S prediction is urgent and growing. Communities worldwide are increasingly vulnerable to extreme conditions whose impacts unfold on weekly to monthly timescales. In scenarios such as a prolonged dry spell that turns into drought, a sudden warming trend that amplifies wildfire risk, or a stalled precipitation pattern that leads to flooding, insights from S2S forecasting could provide foresight and opportunities to prepare in affected areas.

Officials overseeing water management, energy planning, public health, agriculture, and emergency response are all seeking more reliable guidance for S2S time frames. In many cases, forecasts providing a few additional weeks of lead time could enable more efficient resource allocation, preparedness actions, and adaptation strategies. Imagine if forecasts could reliably indicate prolonged heat waves 3–4 weeks in advance. Energy providers could prepare for surges in cooling demand, public health officials could implement heat safety campaigns, and farmers could adjust planting or irrigation schedules to reduce losses.

The resilience of infrastructure, ecosystems, and economies hinges on knowing not only what might happen but also when we can trust our forecasts. By focusing on understanding when and where we have windows of opportunity with S2S modeling, we open the door to developing new, intermediate-term forecasting systems that are both skillful and useful—forecast systems that communicate confidence dynamically and inform real-world decisions with nuance.

Realizing this vision will require alignment of research priorities and investments. S2S forecasting and modeling efforts have often fallen between the traditional mandates of agencies concerned with either weather or longer-term outlooks. As a result, the research and operational efforts of these communities have not always been coordinated or sustained at the scale required to drive progress.

Coordination and Collaboration

With growing public attention on maintaining economic competitiveness internationally and building disaster resilience, S2S prediction represents an untapped opportunity space. And as machine learning and artificial intelligence offer new ways to explore predictability with models and to extract meaningful patterns from model outputs, now is the time to advance the needed coordination.

The many use cases for S2S prediction underscore that it isn’t just a scientific challenge, it’s a strategy for achieving resilience and prosperity.

The many use cases for S2S prediction underscore that it isn’t just a scientific challenge, it’s a strategy for achieving resilience and prosperity. We call on a variety of communities and enterprises to collaborate and rally around the challenge of illuminating windows of opportunity in S2S modeling.

Scientists from traditionally distinct disciplines should codesign research strategies to jointly investigate when, where, and why S2S skill emerges. For example, they could examine weather regimes (e.g., the Pacific or Alaska ridges) and their links to modes of variability (e.g., the North Atlantic Oscillation) and leverage data assimilation to better understand how these phenomena evolve across timescales.

The scientific community could also identify and evaluate critical observational gaps that limit progress in modeling and data assimilation. And they could develop strategies to implement adaptive observing approaches that, for example, target soil moisture, surface energy fluxes, and boundary layer profiles to better capture land-atmosphere interactions at S2S timescales. Such approaches would help to fill gaps and advance understanding of key Earth system processes.

Modeling centers could build flexible prediction systems that allow for advanced data assimilation and incorporate robust coupling of Earth system components—drawing from the weather and Earth system modeling communities, respectively—to explore how initial conditions and boundary forcing jointly influence S2S skill. Using modular components—self-contained pieces of code that represent individual Earth system processes, such as atmospheric aerosols and dynamic vegetation—within these systems could help isolate sources of predictability and improve process-level understanding.

To sustain progress initiated by scientists and modeling centers, agencies and funders must recognize S2S prediction as a distinct priority and commit to investing in the needed modeling, observations, and institutional coordination.

Furthermore, it’s essential that scientists, decisionmakers, and end users codevelop forecast tools and information. Close integration among these groups would focus scientific innovation on what users find useful and actionable, allowing scientists to build tools that meet those needs.

S2S forecasting may never deliver consistent skill across all timescales and regions, but knowing when and where it is skillful could make it profoundly powerful for anticipating high-impact hazards. Can we reliably predict windows of opportunity to help solve the predictability desert? Let’s do the work together to find out.

Author Information

Jadwiga H. Richter (jrichter@ucar.edu) and Everette Joseph, National Science Foundation National Center for Atmospheric Research, Boulder, Colo.

Citation: Richter, J. H., and E. Joseph (2025), Scientists must join forces to solve forecasting’s predictability desert, Eos, 106, https://doi.org/10.1029/2025EO250389. Published on 17 October 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0

A Flash, a Boom, a New Microbe Habitat

Fri, 10/17/2025 - 11:54

A sizable asteroid impact generally obliterates anything alive nearby. But the aftermath of such a cataclysm can actually function like an incubator for life. Researchers studying a Finnish impact structure found minerals whose chemistry implies that microbes were present roughly 4 million years after the impact. These findings, which were published in Nature Communications last month, shed light on how rapidly microscopic life colonizes a site after an asteroid impact.

A Special Lake

Finland is known for its myriad lakes used by boaters, fishers, swimmers, and other outdoor aficionados. Lake Lappajärvi is a particularly special Finnish lake with a storied past: Its basin was created roughly 78 million years ago when an asteroid slammed into the planet. In 2024, the United Nations Educational, Scientific and Cultural Organization (UNESCO) established a geopark in South Ostrobothnia, Finland, dedicated to preserving and sharing the history of the 23-kilometer-diameter lake and the surrounding region.

“It’s one of the places where you think that life could have started.”

Jacob Gustafsson, a geoscientist at Linnaeus University in Kalmar, Sweden, and his colleagues recently analyzed a collection of rocks unearthed from deep beneath Lake Lappajärvi. The team’s goal was to better understand how rapidly microbial life colonized the site after the sterilizing impact, which heated the surrounding rock to around 2,000°C (3,632°F).

There’s an analogy between this type of work and studies of the origin of life, said Henrik Drake, a geochemist at Linnaeus University and a member of the team. That’s because a fresh impact site contains a slew of temperature and chemical gradients and no shortage of shattered rocks with nooks and crannies for tiny life-forms. A similar environment beyond Earth would be a logical place for life to arise, Drake said. “It’s one of the places where you think that life could have started.”

Microbe-Sculpted Minerals

In 2022, Gustafsson and his collaborators traveled to Finland to visit the National Drill Core Archive of the Geological Survey of Finland.

There, in the rural municipality of Loppi, the team pored over sections of cores drilled from beneath Lake Lappajärvi in the 1980s and 1990s. The researchers selected 33 intervals of core that were fractured or shot through with holes. The goal was to find calcite or pyrite crystals that had formed in those interstices as they were washed with mineral-rich fluids.

“It’s amazing what we can find out in tiny crystals.”

The team used tweezers to pick out individual calcite and pyrite crystals from the cores. Gustafsson and his collaborators then estimated the ages of those crystals using uranium-lead dating and a technique known as secondary ion mass spectrometry to calculate the ratios of various carbon, oxygen, and sulfur isotopes within them. Because microbes preferentially take up certain isotopes, measuring the isotopic ratios preserved in minerals can reveal the presence of long-ago microbial activity and even identify types of microbes. “We see the products of the microbial process,” Drake said.
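For context, such isotopic measurements are conventionally reported in delta notation: the per mil deviation of a sample’s isotope ratio from a reference standard. A minimal sketch (the VPDB reference ratio shown is approximate, and the sample ratio is invented for illustration, not taken from the study):

```python
# Delta notation: per mil deviation of a sample's isotope ratio from a
# reference standard (e.g., delta-13C relative to VPDB).
VPDB_R13C = 0.011180  # approximate 13C/12C ratio of the VPDB standard

def delta_per_mil(r_sample, r_standard):
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample strongly depleted in 13C, as microbially cycled carbon often
# is, yields a markedly negative delta value (sample ratio is invented).
print(round(delta_per_mil(0.010620, VPDB_R13C), 1))  # prints -50.1
```

Because different microbial metabolisms impose characteristically different fractionations, the sign and magnitude of such delta values are what let researchers attribute crystals to, say, sulfate reducers or methanogens.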

“It’s amazing what we can find out in tiny crystals,” Gustafsson added.

The researchers also used isotopic ratios of carbon, oxygen, and sulfur to estimate local groundwater temperatures in the distant past. By combining their age and temperature estimates, the team could trace how the Lake Lappajärvi impact site cooled over time.

A Slow Cool

Groundwater temperatures at Lake Lappajärvi had cooled to around 50°C (122°F) roughly 4 million years after the impact, the team found. That’s a far slower cooling rate than has been inferred for other similarly sized impact craters, such as Ries Crater in Germany, in which hydrothermal activity ceased after about 250,000 years, and Haughton Crater in Canada, where such activity lasted only about 50,000 years.

“Four million years is a very long time,” said Teemu Öhman, an impact geologist at the Impact Crater Lake–Lappajärvi UNESCO Global Geopark in South Ostrobothnia, Finland, not involved in the research. “If you compare Lappajärvi with Ries or Haughton, which are the same size, they cooled way, way, way faster.”

That difference is likely due to the type of rocks that predominate at the Lappajärvi impact site, Gustafsson and his collaborators proposed. For starters, there’s only a relatively thin layer of sedimentary rock at the surface. “Sedimentary rocks often don’t fully melt during impact because of their inherent water and carbon dioxide content,” Drake explained. And Lappajärvi has a thick layer of bedrock (including granites and gneisses), which would have melted in the impact, sending temperatures surging to around 2,000°C, earlier research estimated.

About 4 million years after the impact is also when microbial activity in the crater began, according to Gustafsson and his collaborators. Those ancient microbes were likely converting sulfate into sulfide, the team proposed. And roughly 10 million years later, when temperatures had fallen to around 30°C (86°F), methane-producing microbes appeared, the researchers surmised on the basis of their isotopic analysis of calcite.

In the future, Gustafsson and his colleagues plan to study other Finnish impact craters and look for similar microbial features in smaller and older impact structures. In the meantime, the team is carefully packaging up their material from the Lappajärvi site. It’s time to return the core samples to the Geological Survey of Finland, Drake said. “Now we need to ship them back.”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), A flash, a boom, a new microbe habitat, Eos, 106, https://doi.org/10.1029/2025EO250388. Published on 17 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Tectonics and Climate Are Shaping an Alaskan Ecosystem

Thu, 10/16/2025 - 13:24
Source: AGU Advances

Increased warming in high-latitude wetlands seems poised to increase the activity of methanogens, or methane-producing microbes. These ecosystems are complex places, however, making outcomes hard to predict.

In new biogeochemical research taking into account tectonic, climatic, and ecological factors affecting the Copper River Delta in Alaska, Buser-Young et al. found that seismic uplift and glacial meltwater have each contributed to changes in microbial metabolism, with the surprising effect of potentially decreasing methane production.

The Copper River Delta in south central Alaska has a history of large seismic events, most recently a 1964 earthquake that lifted portions of the delta up to 3.4 meters above sea level, turning much of it from a marine environment to a freshwater one. In more recent decades, increasing amounts of iron-rich glacial runoff have also begun flowing through the delta, a result of climate change.

Combining geochemical studies of sediment cores from six wetland locations in the delta with metagenomic analyses of the microbes in the cores, the authors documented a distinct shift in microbial metabolism. Though genes for methanogenesis are still prevalent, and organic matter is available, they found that in an increasingly freshwater, iron-rich environment, the dominant means of energy production among the microbes shifted to involve iron cycling. Their findings are a demonstration of the ways large-scale geological and climatic shifts can affect small-scale processes such as the dynamics of microbial communities.

Looking ahead, the researchers say analyzing deeper sediment core samples could provide more information about how microbial dynamics have changed over time. In addition, they say, further culture-based experiments could improve understanding of the relationships between iron and organic matter within the carbon cycle. (AGU Advances, https://doi.org/10.1029/2025AV001821, 2025)

—Nathaniel Scharping (@nathanielscharp), Science Writer

Citation: Scharping, N. (2025), Tectonics and climate are shaping an Alaskan ecosystem, Eos, 106, https://doi.org/10.1029/2025EO250387. Published on 16 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Scientists Tune In to the Ocean’s Sound Waves

Thu, 10/16/2025 - 13:23

“It’s a good thing we can’t hear it with our ears. Otherwise, we’d just have this constant din from the oceans.”

The steady thrumming of crashing waves is the ocean’s soundtrack. But behind that calming rhythm is a host of hidden chaotic sound waves, most of which are too low in frequency for humans to hear. This acoustic energy travels as infrasound through the air and as seismic waves through the ground. “It’s a good thing we can’t hear it with our ears,” said Stephen Arrowsmith, a geoscientist at Southern Methodist University in Texas. “Otherwise, we’d just have this constant din from the oceans.”

Recently, scientists developed a new method of monitoring the surf’s acoustic and seismic signatures that can identify individual breaking waves within the noise. The approach could enable new ways of monitoring sea conditions from land and even provide insights into conditions in the upper atmosphere.

A Signal in the Noise

Scientists first discovered surf-generated infrasound more than 20 years ago. One study, led by Arrowsmith, even detected infrasound more than 124 miles (200 kilometers) inland. Though the pace of such studies has slowed over the past decade, researchers at the University of California, Santa Barbara (UC Santa Barbara), who typically study volcano seismology, realized they were well positioned to contribute to surf infrasound research. “We have the proximity to the coastline here on campus, so it seemed an interesting thing to explore,” said Robin Matoza, an Earth scientist and senior author on the paper.

While past studies had detected surf infrasound only as a continuous wall of noise, the researchers suspected that with new advances in computation as well as in acoustic and seismic detection, they could identify the acoustic signatures of individual waves.

The team, led by geologist Jeremy Francoeur, who conducted the work for his master’s thesis at UC Santa Barbara, installed a single infrasound sensor that collected near-continuous data for 10 months, from September 2022 to July 2023. Then, in October 2023, they conducted an intensive 6-day field experiment, deploying a network of 12 infrasound sensors and one seismometer across a roughly 500-foot (150-meter) area near the Santa Barbara coast.

“One of the biggest surprises was that the same infrasound signals are being generated by surf nearly every day.”

The researchers also took GoPro videos to correlate specific ocean waves with the infrasound and seismic profiles they generated. They then selected the signatures of five waves as templates to match against the 10 months of single-sensor acoustic data, picking out individual crashing waves among all the infrasound recorded. “One of the biggest surprises was that the same infrasound signals are being generated by surf nearly every day,” said Francoeur in an email. The approach revealed up to tens of thousands of individual surf events per day.
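Picking known signatures out of months of continuous data, as described above, is commonly done with normalized cross-correlation (template matching), a workhorse of seismological event detection. A minimal sketch on synthetic data, not the study’s actual pipeline (the template shape and detection threshold are illustrative):

```python
import numpy as np

def normalized_xcorr(data, template):
    """Sliding normalized cross-correlation (Pearson r at each offset)."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    out = np.empty(len(data) - n + 1)
    for i in range(len(out)):
        w = data[i:i + n]
        s = w.std()
        out[i] = 0.0 if s == 0 else np.dot(t, (w - w.mean()) / s)
    return out

# Synthetic record: background noise plus two copies of a "wave" template.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 4 * np.pi, 50)) * np.hanning(50)
record = 0.1 * rng.standard_normal(1000)
record[200:250] += template
record[700:750] += template

cc = normalized_xcorr(record, template)
detections = np.flatnonzero(cc > 0.8)  # illustrative detection threshold
```

Because the correlation is normalized, the same threshold works across quiet and noisy stretches of the record, which is what makes a handful of hand-picked templates usable against 10 months of data.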

“I liked how they were able to identify discrete surf events using this local array,” said Arrowsmith, who wasn’t involved in the new study. “Previous studies on this, including mine, were not able to do that.”

The researchers found they could detect discrete infrasound signals only when breaking waves were over approximately 6.5 feet (2 meters) high, suggesting that a minimum amount of energy is required to generate detectable infrasound. When waves were detectable, however, their height correlated with acoustic signal strength. This correlation was particularly noticeable in the winter months, when larger storm swells reach the California coast.

By timing when infrasound signals hit each sensor in the network, the scientists triangulated the positions of the waves, pinpointing a hot spot of acoustic activity to a specific rocky reef area just offshore. This suggests that certain bathymetric features might be more effective than others at generating detectable infrasound. The findings were published in Geophysical Journal International.
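Locating a source from arrival times across an array amounts to finding the position whose predicted travel-time differences best match the observed ones. A minimal grid-search sketch on a synthetic geometry, not the study’s method or values (the sensor layout, sound speed, and grid are all illustrative):

```python
import numpy as np

def locate_tdoa(sensors, t_obs, c=340.0, extent=500.0, step=5.0):
    """Grid search for the source position minimizing travel-time
    residuals; the unknown origin time drops out by demeaning.

    sensors: (N, 2) sensor coordinates in meters
    t_obs:   (N,) observed arrival times in seconds
    c:       sound speed in m/s (illustrative value for air)
    """
    xs = np.arange(-extent, extent, step)
    best, best_cost = None, np.inf
    for x in xs:
        for y in xs:
            t_pred = np.hypot(sensors[:, 0] - x, sensors[:, 1] - y) / c
            r = (t_obs - t_obs.mean()) - (t_pred - t_pred.mean())
            cost = np.dot(r, r)
            if cost < best_cost:
                best, best_cost = (float(x), float(y)), cost
    return best

# Synthetic check: four sensors near the origin, a source 400 m "offshore,"
# and an arbitrary origin time of 12 s that is unknown to the solver.
sensors = np.array([[0.0, 0.0], [150.0, 0.0], [0.0, 150.0], [150.0, 150.0]])
source = np.array([-400.0, 100.0])
t_obs = 12.0 + np.hypot(*(sensors - source).T) / 340.0

print(locate_tdoa(sensors, t_obs))  # prints (-400.0, 100.0)
```

With a compact array and a distant source, the arrival-time differences constrain direction much more tightly than range, which is one reason clustering of repeated detections (the hot spot over the reef) is so informative.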

From the Surf to the Sky

Monitoring and locating the infrasound signature of surf could offer a new way to track sea conditions using land-based sensors, which is critical for maritime safety and for coastal management and research. Sea conditions are most often studied using ocean-based buoys or video monitoring, the latter of which is hampered at night and in foggy conditions.

The new method could also have applications far beyond the coast. If the signals from individual waves can be detected at greater distances from shore, they could offer information about conditions in the upper atmosphere. This is possible because infrasound enters the upper atmosphere, and features like temperature and wind speed modulate the waves before they refract in the stratosphere and return to Earth.

By comparing the signatures of individual surf events detected at sensors positioned at different distances, scientists say it could be possible to correlate specific acoustic signals with atmospheric conditions, providing a new tool for studying weather patterns and atmospheric dynamics.

“If you have repetitive signals, you can monitor small changes in those signals,” Matoza said. “You could use that to infer changes in the atmosphere.”

—Andrew Chapman (@andrewchapman.bsky.social), Science Writer

Citation: Chapman, A. (2025), Scientists tune in to the ocean’s sound waves, Eos, 106, https://doi.org/10.1029/2025EO250384. Published on 16 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Panama’s Coastal Waters Missed Their Annual Cooldown This Year

Wed, 10/15/2025 - 12:18

From January to April, strong winds blowing south from the Atlantic side of Panama through gaps in the Cordillera mountain range typically travel over the country and push warm water away from Panama’s Pacific coast. This displacement allows cold, nutrient-rich water to flow up from the depths, a process called upwelling. The Panama Pacific upwelling keeps corals cool and nourishes the complex marine food webs that support Panama’s fishing industry and economy.

In 2025, for the first time on record, this upwelling didn’t occur, according to research published in the Proceedings of the National Academy of Sciences of the United States of America.

During the upwelling period early in the year, ocean temperatures near the coast typically fall to a low of about 19°C, said Andrew Sellers, a marine ecologist at the Smithsonian Tropical Research Institute in Panama. This year, the coastal waters reached just 23.3°C at their coolest.

Waning Winds

Sellers said the Panama Pacific upwelling has likely been happening since the isthmus formed millions of years ago. The phenomenon has been recorded at low resolution for 80 years, and scientists have 40 years’ worth of more detailed records.

The team has identified “a shocking extreme event.”

Scripps Institution of Oceanography climate scientist Shang-Ping Xie, who has studied the weather patterns that usually cause the Panama Pacific upwelling but was not involved with this research, said the team had identified “a shocking extreme event.”

Annual upwelling moderates water temperature along the coast and triggers plankton blooms that nourish marine food webs and Panama’s economy. About 95% of the country’s catch comes from the Pacific side, and most of that marine life is supported by upwelling, said Sellers.

Sellers said that though tropical upwelling plays a critical role in supporting marine food webs and fisheries, it’s understudied. Indeed, it was a happy accident that the research team was able to obtain measurements in 2025. Sellers said the Smithsonian Tropical Research Institute maintains a network of temperature sensors near the coast but does not regularly monitor the temperature of deeper waters. Early this year, the Max Planck Institute research vessel S/Y Eugen Seibold was in the region as part of its mission to study the relationship between the atmosphere and the ocean, and it provided high-resolution temperature measurements, including in deeper waters, during the upwelling failure.

The Panama Pacific upwelling typically causes a rise in chlorophyll concentrations (blue = low concentrations and red = high concentrations) and a phytoplankton bloom, nourishing the area’s rich marine life, as seen here in February 2024. Credit: Aaron O’Dea

These measurements allowed the research team to see that deeper waters offshore were cold as usual but that those waters didn’t make their way to the coast. The cause seems to be a dramatic change in wind patterns in early 2025: Winds hailing from the north were both shorter in duration and 74% less frequent during the study period than in typical years.

Rippling Consequences

“Given how important upwelling is to that region, it’s hard to imagine there wouldn’t be a loss of primary productivity,” the growth of phytoplankton that sustains the ocean’s food chains, said Michael Fox, a coral reef ecologist at the King Abdullah University of Science and Technology. “Upwelling sets the stage for the base of the food web.”

Some models have predicted that climate change will cause upwelling in temperate zones such as California to strengthen, but the dynamics in the tropics are more of a mystery. The Panama Pacific upwelling is strongly influenced by the El Niño–Southern Oscillation (ENSO). Sellers said changes in ENSO might be affecting local dynamics in Panama.

“Studies like this one should motivate people to pay more attention to ocean-atmosphere dynamics in the tropics.”

“Studies like this one should motivate people to pay more attention to ocean-atmosphere dynamics in the tropics,” Fox said.

Sellers said this year’s unprecedented upwelling failure is likely to have adverse effects on the country’s vibrant Pacific marine life, but Panama does not collect extensive data on its fisheries. The team is now examining the exception—a dataset related to small fish such as sardines and anchovies—to see whether the lack of upwelling affected those fish.

Xie said the Smithsonian team hasn’t yet provided enough data to evaluate what caused this year’s unusual wind patterns and whether climate change made the upwelling failure more likely. Early this year, La Niña would likely have raised the pressure on the Pacific side of the country, which would have weakened the winds. But Xie said that La Niña is a frequent phenomenon and it alone can’t explain the unprecedented weather seen in Panama this year. He said something likely happened that changed pressure levels on the country’s northern Atlantic side as well. But more research is needed to say for sure.

Sellers’s team is preparing to gather more detailed measurements of marine life effects in early 2026, in case upwelling fails again. They are planning to assess the population of barnacles and other sessile invertebrates, which rely on plankton whose populations burgeon during upwelling.

Though the Eugen Seibold’s mission is set to end in 2026, Sellers said he’s determined to perform extensive water temperature measurements early next year, with or without a research vessel. “Sensors are cheap, and we can get more of them,” he said.

“In coming years, we’ll know if this is going to be a recurring issue,” Sellers said. “If it is, it’s going to be a hard hit to the economy.”

—Katherine Bourzac (@bourzac.bsky.social), Science Writer

Citation: Bourzac, K. (2025), Panama’s coastal waters missed their annual cooldown this year, Eos, 106, https://doi.org/10.1029/2025EO250382. Published on 15 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Chicago Soil Maps Childhood Lead Exposure Risk

Wed, 10/15/2025 - 12:11
Source: GeoHealth

Lead is a neurotoxin that can damage multiple body systems and cause learning and developmental problems. The element has been phased out of use in paint, gasoline, and other industrial applications for decades, but it can persist for years in the soil. Children, who can be particularly vulnerable to lead poisoning, can accidentally ingest and inhale lead particles when they play in contaminated areas.

Even though one in four U.S. homes likely has soil lead levels over the recommended safety limits, no major U.S. city includes systematic soil monitoring as part of its lead prevention services, and blood testing often happens only after exposure.

Chicago has many homes built before 1978—the year the U.S. government banned the use of lead-based paint—and an industrial history that means many residents could be living with elevated blood lead levels (EBLL) because of the prevalence of lead in the surrounding soil. Testing soil for lead is one way to predict which communities are most at risk for childhood lead exposure.

Thorstenson et al. analyzed 1,750 soil samples from Chicago’s 77 community areas. The researchers then used these data with the EPA’s Integrated Exposure Uptake Biokinetic model (IEUBK) to estimate how much lead children are likely to have in their blood. Comparing these data to actual EBLL findings from the Chicago Department of Public Health and accounting for factors such as household income, the age of housing, and the housing’s proximity to industrial land, the researchers built a comprehensive map that identifies the Chicago communities most at risk for soil lead exposure.

More than half of the citywide soil samples showed lead levels above the EPA’s recommended threshold of 200 parts per million—with some hot spots rising above 300 parts per million. When matched with the modeling from IEUBK, an estimated 27% of children across the city are at risk of EBLL. In the hot spot areas, that risk rises to 57%.

These findings suggest that though median household income is the strongest predictor of EBLL prevalence, soil lead levels are also a significant predictor. Systematic soil testing could become a crucial way to reduce children’s risk of lead exposure in contaminated areas, the authors say. (GeoHealth, https://doi.org/10.1029/2025GH001572, 2025)

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2025), Chicago soil maps childhood lead exposure risk, Eos, 106, https://doi.org/10.1029/2025EO250377. Published on 15 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

JPL Workforce Decimated

Tue, 10/14/2025 - 16:26
body {background-color: #D2D1D5;} Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

Today, NASA’s Jet Propulsion Laboratory (JPL) in Pasadena, Calif., laid off 550 people, a roughly 11% reduction of its workforce.

“This week’s action, while not easy, is essential to securing JPL’s future by creating a leaner infrastructure, focusing on our core technical capabilities, maintaining fiscal discipline and positioning us to compete in the evolving space ecosystem,” JPL director Dave Gallagher wrote in a brief statement released on 13 October. Layoffs were spread across the technical, business, and support areas.

Gallagher said that this workforce reduction is part of a reorganization that began in July and is not related to the current government shutdown that began on 1 October. A 10 October court filing by the White House Office of Management and Budget did not include NASA among the agencies targeted for layoffs by the Trump administration during the ongoing shutdown, reported Space News.

 

JPL is a research and development laboratory federally funded by NASA. While the current government shutdown continues, NASA has been directed to operate and plan as if the appropriations bill passed by the House of Representatives is in effect, which would fund NASA (and most JPL projects) at nearly the same level as the current fiscal year.

Federal whistleblowers, however, have come forward with evidence that NASA leadership has been operating as if the President’s Budget Request (PBR)—not the appropriations bill—is in effect, directing mission wind-down operations and staff reductions under the assumption of a 20% overall budget cut. Some of that lost spending would affect JPL’s ability to plan, build, and operate Earth science missions and space exploration spacecraft.

Despite vocal support from the Trump administration and NASA leadership about putting humans on the Moon again and eventually on Mars, the PBR would also cancel the Mars Sample Return program, which would pick up and return to Earth sample capsules collected and deposited by the Perseverance rover. Analysis of those samples would provide critical support to any future human exploration mission to Mars.

Kevin Hicks, a systems engineer who formerly operated rovers at JPL, said that Perseverance’s budget is being reduced by two-thirds, “just enough to technically keep it going and not get the full PR backlash of canceling a working rover,” he wrote.

Credit: Kevin Hicks (@astro-cowboy.bsky.social) via Bluesky

This is the fourth round of layoffs at JPL since the beginning of 2024, including an 8% reduction in staff that affected mostly engineering-related positions. The mood among current and former JPL employees is grim. Several people commented on a JPL Reddit forum that they expect more layoffs in the future.

“Today was very somber on lab. It felt like everyone [was] grieving,” one Redditor wrote on 13 October. Several other posters echoed that sentiment. “We tried to keep a positive, but realistic attitude and we even took a final group photo in front of the JPL concrete logo. However, there’s no whitewashing the ‘doomsday-eve’ feeling that’s looming over all our heads.”

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0

As Seas Rise, Corals Can’t Keep Up

Tue, 10/14/2025 - 12:14

Coral reefs face myriad challenges, from ocean acidification to warming seas to destructive fishing activities. Sometimes, reefs can rebound from these ecological harms—but only if the coral species assembled on a reef can maintain the required growth rates.

A revised estimate of coral growth rates, published in Nature, suggests that tropical western Atlantic reefs are losing their capacity to build upward. Without upward reef growth, rising seas threaten to drown these reefs and cancel out the benefits they offer to coastal communities, such as minimizing flood damage. Researchers found that reef growth rates at essentially all the 400 sites analyzed won’t be enough to keep up with sea level rise by 2100.

“It’s very critical that we get a handle on what these rates are to be able to adequately gauge the scale of the problem.”

“It’s very critical that we get a handle on what these rates are to be able to adequately gauge the scale of the problem,” said Cody Clements, a coral reef ecologist at the Georgia Institute of Technology who was not involved in the new study. “We have a lot of work ahead of us.”

“Unfortunately, the estimates are worse than before,” said Rich Aronson, a coral reef ecologist at the Florida Institute of Technology who was not involved in the new paper but works closely with its authors. 

Eroding Reefs

Coral reefs grow when corals secrete calcium carbonate, a hard material that forms their exoskeletons.

Scientists can use knowledge of the species that make up a coral reef to estimate its vertical stacking porosity—how much vertical space a reef can build with a given amount of calcium carbonate. 

The skeletons of branching corals, for example, tend to accumulate in an arrangement with more empty space, leading to more upward growth than other corals, such as flat corals, might achieve with the same amount of calcium carbonate.

However, the relationship between coral assemblage and vertical growth ability has so far been poorly defined, said Chris Perry, a coastal geoscientist at the University of Exeter and lead author of the new study. 

The studied reefs “are going to have zero capacity, really, to be able to track future sea level rise.”

Perry and his research group wanted a better estimate. They gathered 66 images of fossilized coral reefs from the tropical western Atlantic and analyzed how those reefs grew over time on the basis of the species of corals within. Then, they applied their revised estimates of growth to previously collected data on the ecology and carbonate production of 400 sites at three reef systems in the tropical western Atlantic: the Mexican Mesoamerican Reef, the Florida Keys, and Bonaire. 

The adjusted estimate of growth revealed a bleaker picture of reef health than the scientists anticipated: Researchers found that on average, reefs at all sites were growing at a sluggish pace—less than 1 millimeter per year—with an average growth rate decline of 12.4% when compared to previous estimates. On average, global sea levels are rising by about 4.5 millimeters per year.

The new calculations are particularly stark for reefs dominated by branching coral species, Didier De Bakker, a coral reef ecologist at the University of Exeter and a coauthor of the new study, wrote in an email. 

If corals can’t grow, they shrink, falling victim to erosion by other marine creatures such as fish and sea urchins. Eventually, corals unable to keep up with sea level rise are drowned, unable to access sufficient light to continue growing at all.

The studied reefs “are going to have zero capacity, really, to be able to track future sea level rise,” Perry said. 

Corals at Limones Reef in the Mexican Caribbean suffered a bleaching event in 2023. Credit: Lorenzo Álvarez-Filip

In general, the new estimates of the link between assemblage type and vertical growth “revise our estimate downward” of how well corals will be able to keep up with sea level rise, Aronson said. The results also align with a 2023 study by Aronson and others that found reef growth in Panama’s Gulf of Chiriquí, part of the Pacific Ocean, is likely already unable to keep up with sea level rise. 

Perry and De Bakker hope the data in the new study will feed into future studies modeling coastal wave exposure. “These new estimates provide a more realistic basis for projecting the vulnerability of adjacent habitats and reef-fronted urban areas,” De Bakker wrote. 

Aronson said one next step for the research would be to apply the research team’s new estimates of vertical growth to reefs elsewhere, such as those in tropical Indo-Pacific waters. There, more species of branching coral still survive, giving Indo-Pacific reefs a slightly better chance of keeping up with sea level rise, said Clements, who studies Indo-Pacific reefs.

Climate Change and Corals

As a last step to their study, the researchers used what they’d learned about reef growth at 400-plus reef sites along with various future climate warming scenarios, called Shared Socioeconomic Pathways, or SSPs, to project how reef growth rates may change as the climate warms and sea levels continue to rise.

Results predicted that more than 70% of tropical western Atlantic reefs will transition into net erosional states by 2040 under an optimistic scenario (SSP1-2.6). But if warming exceeds SSP2-4.5 (a middle-of-the-road scenario in line with current development patterns), nearly all reefs will be eroding by 2100.

“Even if you go by some of the conservative estimates that they’re using, we still have a major problem in terms of coral reef accretion rates,” Clements said. 

Reef Benefits Wash Away

Slower vertical growth means corals will have a tougher time maintaining their crest, or high point. These crests serve as wave breakers that dissipate wave energy and reduce flood damage to coastal communities. One estimate suggests that coral reefs near the U.S. coastline prevent more than $1.8 billion in damage each year.

This coral reef crest in the Mexican Caribbean dissipates wave energy and reduces beach erosion and possible flood damage. Credit: Lorenzo Álvarez-Filip

As coral growth fails to track with sea level rise, these crests fall below the water’s surface. In turn, rising seas and waves from storms face less resistance, and reefs’ protective abilities get washed away.

“It’s quite difficult to see how we turn this around without really, really aggressive action on greenhouse gas emissions.”

Reef restoration is an active area of research, with engineers and ecologists working together to create various solutions, from LEGO-like scaffolding for corals to robots that sprinkle warming reefs with cool water. Previous research by Aronson and others indicated that successful restoration could help reefs keep pace with future sea level rise.

However, restoration will be effective only if it is done in tandem with efforts to rein in climate warming, which could slow sea level rise and reduce the frequency of marine heat waves, Perry said. “It’s quite difficult to see how we turn this around without really, really aggressive action on greenhouse gas emissions.”

“We have to do something about these global-scale stressors, like climate change, or it’s not going to matter,” Clements said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

This news article is included in our ENGAGE resource for educators seeking science news for their classroom lessons. Browse all ENGAGE articles, and share with your fellow educators how you integrated the article into an activity in the comments section below.

Citation: van Deelen, G. (2025), As seas rise, corals can’t keep up, Eos, 106, https://doi.org/10.1029/2025EO250380. Published on 14 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Space Radiation Can Produce Some Organic Molecules Detected on Icy Moons

Tue, 10/14/2025 - 12:10

New laboratory research suggests that some organic molecules previously detected in plumes erupting from Saturn’s moon Enceladus may be products of natural radiation, rather than originating from the moon’s subsurface ocean. This discovery complicates the assessment of the astrobiological relevance of these compounds.

Enceladus hides a global ocean buried beneath its frozen crust. Material from this liquid reservoir is ejected into space from cracks in the ice near the south pole, forming plumes of dust-sized ice particles that extend for hundreds of kilometers. While most of this material falls back onto the surface, some remains in orbit, becoming part of Saturn’s E ring, the planet’s outermost and widest ring.

Between 2005 and 2015, NASA’s Cassini spacecraft flew repeatedly through these plumes and detected a variety of organic molecules. The detection was viewed as evidence of a chemically rich and potentially habitable environment under the ice, where molecules essential to life could be available. However, the new study offers an explanation in which radiation, not biology, is behind the presence of at least some of these organic molecules.

To test the role of space radiation, a team of researchers led by planetary scientist Grace Richards, a postdoc at the National Institute for Astrophysics in Rome, simulated conditions near Enceladus’s surface by creating a mixture of water, carbon dioxide, methane, and ammonia, the main expected components of surface ice on Enceladus. They cooled the concoction to −200°C inside a vacuum chamber and then bombarded it with water ions, which are an important component of the radiation environment that surrounds the moon.

The radiation induced a series of chemical reactions that produced a cocktail of molecules, including carbon monoxide, cyanate, ammonium, and various alcohols, as well as molecular precursors to amino acids such as formamide, acetylene, and acetaldehyde. The presence of these simple molecules indicates that radiation could induce similar reactions on Enceladus.

Richards presented these findings at the Europlanet Science Congress–Division for Planetary Sciences Joint Meeting (EPSC-DPS 2025) in Helsinki, Finland. She and her coauthors also published a detailed report in Planetary and Space Science.

Enceladus and Beyond

The new research raises the question of whether the organic molecules detected in Enceladus’s plumes truly come from the moon’s buried ocean, whether they are formed in space, or whether they form close to the surface after the plumes leave the Enceladean interior.

While the finding doesn’t exclude the possibility of a habitable ocean on Enceladus, Richards urges caution in assuming a direct link between the presence of these molecules in the plumes, their origin, and their possible role as precursors to biochemistry.

“I don’t necessarily think that my experiments discredit anything to do with Enceladus’s habitability.”

“I don’t necessarily think that my experiments discredit anything to do with Enceladus’s habitability,” Richards said.

However, she added, “when you’re trying to infer this ocean composition from what you’re seeing in space, it’s important to understand all the processes that go into modifying this material.” Apart from radiation, these processes include phase changes, interactions with the moon’s ice walls, and interactions with the space environment.

“We need a lot of experiments of that type,” said planetary scientist Alexis Bouquet, a French National Centre for Scientific Research (CNRS) researcher at L’Université d’Aix-Marseille who wasn’t involved in the study. “They demonstrated that you can produce a certain variety of species in conditions that are relevant to the south pole of Enceladus.”

Bouquet highlighted the importance of simulating these environments in a lab for planning future missions to Enceladus and for interpreting the much-anticipated data from current missions to Jupiter’s icy moons. These missions are NASA’s Europa Clipper, which will explore Europa, and the European Space Agency’s (ESA) JUICE (Jupiter Icy Moons Explorer), which will visit all three of the giant planet’s moons with subsurface oceans: Ganymede, Callisto, and Europa.

The intense radiation around Jupiter makes these experiments especially relevant. “Radiation chemistry for Europa or the Jovian moons in general [is] a big deal, a bigger deal than in Enceladus,” Bouquet said.

Another Story Completely

As Richards’s work questions the origin of organic compounds around Enceladus, researchers keep adding more molecules to the puzzle.

After a new analysis of data gathered during one of Cassini’s close approaches to Enceladus in 2008, researchers led by planetary scientist Nozair Khawaja at the Freie Universität Berlin and the University of Stuttgart reported the discovery of new types of organic molecules, seemingly emanating from the icy vents. They include ester and ether groups, as well as chains and cyclic species containing double bonds with oxygen and nitrogen.

On Earth, these molecules are essential links in a series of chemical reactions that ultimately produce complex compounds needed for life. And while these molecules could have an inorganic origin, “they increase the habitability potential of Enceladus,” Khawaja said. The findings appeared in Nature Astronomy.

Khawaja’s team’s analysis suggests that complex organic molecules are present in fresh ice grains just expelled from the vents. During its last flyby, Cassini got as close as 28 kilometers to the moon’s surface.

After modeling the plumes and the icy grains’ residence times in space, they think that the ice grains sampled by Cassini did not spend a lot of time in space, likely just “a few minutes,” Khawaja said. “It is fresh.”

This short duration in space questions whether space radiation had enough time to produce the organic molecules Khawaja detected. Just a few minutes would not be long enough for such complex chemistry to take place, even in a high-radiation environment.

“Big grains coming from the surface full of organics? That is much harder to explain through radiation chemistry,” Bouquet said.

While the types of experiments performed by Richards “are valuable and take the science to the next level,” Khawaja said, “our results tell the other story completely.”

Back to Enceladus

Both studies reinforce the complexity of Enceladus’s chemistry, upholding it as a prime target in the search for extraterrestrial life, or at least life’s building blocks. Enceladus has all three prerequisites for life: liquid water, an energy source, and a rich cocktail of chemical elements and molecules. Even if the subsurface ocean is out of reach—it lies at least a few kilometers beneath the ice close to the poles—the plumes offer the only known opportunity to sample an extraterrestrial liquid ocean.

Studies for a potential ESA mission dedicated to Enceladus are already underway, with plans that include high-speed flybys through the plumes and, potentially, a lander on the south pole. The insights from both recent studies will help researchers design the instrumentation and guide the interpretation of future results.

“There is no better place to look for [life] than Enceladus,” Khawaja said.

—Javier Barbuzano (@javibar.bsky.social), Science Writer

Citation: Barbuzano, J. (2025), Space radiation can produce some organic molecules detected on icy moons, Eos, 106, https://doi.org/10.1029/2025EO250383. Published on 14 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

The 12 July 2024 landslide cluster in Pengshui County, Chongqing, China

Tue, 10/14/2025 - 07:45

About 140 mm of rainfall triggered 143 landslides in an area of about 10 km2, killing two people.

Loyal readers will have noticed that I’m fascinated by dense clusters of landslides triggered by intense rainfall (or earthquakes). Over the years, I have written about these on multiple occasions, but increasing numbers are being described in the literature.

Another very interesting example has just been published in the journal Landslides (Xie et al. 2025). This example occurred on 12 July 2024 close to Puzi in Pengshui County, Chongqing, China. The centre of the cluster is at [29.56790, 108.28781] – this is the marker on the images that follow.

The Planet image below shows the area on 24 May 2024, before the rainfall:-

The site of the 12 July 2024 landslides in Pengshui County, Chongqing, China. Image copyright Planet, used with permission. Image dated 24 May 2024.

And this is the same site after the event on 12 July 2024:-

The aftermath of the 12 July 2024 landslides in Pengshui County, Chongqing, China. Image copyright Planet, used with permission. Image dated 1 August 2024.

And here is an image compare:-

Images copyright Planet, used with permission.

Xie et al. (2025) show that this cluster of landslides was triggered by a rainstorm that deposited about 140 mm of rainfall in a few hours. In total, 143 landslides were triggered in an area of about 10 km2. The failures were mostly disrupted avalanches, some of which formed channelised debris flows. However, Xie et al. (2025) also show that there are a number of interesting aspects of this cluster of landslides.

Note the geographical isolation of these landslides. The slopes to the east and west suffered far fewer failures. Perhaps surprisingly, this cluster of landslides did not occur in the area of highest rainfall – a short distance to the west, more than 200 mm was recorded, but few landslides occurred.

The analysis of Xie et al. (2025) shows that this cluster occurred because of a weak geological unit (sandstone) that was highly fractured, a geological structure that promoted instability, and steep slope gradients (which may be associated with erosion by the river). Thus, it is the combination of the meteorological, geological and geomorphological factors that led to the cluster of landslides.

Fortunately, the area had been mostly evacuated ahead of the rainfall, so there were just two fatalities. There was extensive damage to properties though.

This event illustrates well the ways in which extreme rainfall events are combining with local factors to create clusters of landslides that have the potential to generate high levels of damage.

Many thanks to Xie et al. (2025) for such an interesting example.

References

Xie, X., Liu, S., Macciotta, R. et al. 2025. Spatial heterogeneity in landslide response to a short-duration intense rainfall event on 12 July 2024 in Pengshui County, Chongqing, China. Landslides. https://doi.org/10.1007/s10346-025-02624-6.

Planet Team 2025. Planet Application Program Interface: In Space for Life on Earth. San Francisco, CA. https://www.planet.com/.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0

The 22 May 1960 earthquake-induced landslides and tsunami at Lake Rupanco in Chile

Mon, 10/13/2025 - 06:41

Reconstruction of landslides on the banks of Lake Rupanco in Chile, triggered by the 22 May 1960 Mw 9.5 earthquake, suggests that a slope failure with a volume of 161 million cubic metres triggered a tsunami with a maximum amplitude of 33.3 metres. About 120 people were killed.

A very interesting paper (Quiroga et al. 2025) has just been published in the journal Landslides that examines combined landslide–tsunami threats at Lake Rupanco [40.82, -72.50] in Chile. The context is a series of landslides, and a resultant tsunami, triggered by the 22 May 1960 Mw 9.5 Great Chilean earthquake. The paper reconstructs the landslides and models the tsunami that they generated.

This event is particularly interesting as the loss of life was significant. Quiroga et al. (2025) document about 120 fatalities:-

“The most severely impacted area was Las Gaviotas, a settlement situated on the southeast shore…, where tsunami run-up heights reportedly exceeded 10 m, according to eyewitness accounts… One of the most significant losses was the destruction of the popular Termas de Rupanco hotel located near geothermal springs …, which was swept away by the landslides, resulting in 11 confirmed fatalities … At that time, a road was also under construction along the southern shoreline to connect Osorno with Las Gaviotas; both the road and several worker camps were destroyed…”

The Chilean Enterreno site has a photograph of the Termas de Rupanco hotel prior to the tsunami:-

Hotel Termas de Rupanco, which was destroyed by the landslide-induced tsunami 1960. Image from Enterreno. Posted by Francisco Vidal Guzmán under a by-nc licence.

Quiroga et al. (2025) have tracked the source of the tsunami to a series of landslides that occurred on the north side of Lake Rupanco. The scars of these failures are still very visible on Google Earth:-

Google Earth image of the site of the landslides on the banks of Lake Rupanco triggered by the 22 May 1960 earthquake in Chile.

Quiroga et al. (2025) have identified eight landslide scars in this area, of which the most significant is the bowl-shaped scar in the centre of the image above. This is the most likely source of the tsunami. It is a rotational failure with a lower runout zone, with a volume of 161 million m3. Of this volume, 12.1 million m3 became submerged to generate the wave.

Reconstruction of the wave suggests that it had a maximum amplitude of 33.3 metres close to the landslide itself. At Las Gaviotas, where the hotel was located, the wave had a maximum amplitude of 8.6 metres, arriving 261 seconds after initiation.

This elegant and useful paper illustrates well the threat posed by large landslides into lakes. For those located in the hotel, the events would have been terrifying, starting with a major earthquake for which the shaking would have been intense and long-lasting, followed by the noise and dust generated by the collapsing slopes, and finally the impact of this enormous tsunami. Keeping people safe in such circumstances is a very major challenge.

Reference

Quiroga, J.P., Aránguiz, R., Hernández-Madrigal, V.M. et al. 2025. Reconstruction and numerical modeling of historical and paleo-tsunamigenic landslides in Lake Rupanco, Chile. Landslides. https://doi.org/10.1007/s10346-025-02629-1.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0

Zircon Crystals Could Reveal Earth’s Path Among the Stars

Fri, 10/10/2025 - 12:53

Tiny crystals in Earth’s crust may have recorded meteorite and comet impacts as our planet traveled through the spiral arms of the Milky Way over more than 4 billion years, according to new research.

The study is one of the first to suggest that galactic-scale processes can affect Earth’s geology, and researchers think similar evidence might be found on other bodies in the solar system, including the Moon and Mars.

“This is something that could connect the Earth, the Moon, and Mars into the wider galactic surroundings.”

“This is so interesting and exciting—we are potentially seeing something that is not just unique to Earth,” explained geologist Chris Kirkland of Australia’s Curtin University, the first author of the new study published in Physical Review Research. “This is something that could connect the Earth, the Moon, and Mars into the wider galactic surroundings.”

Kirkland and his coauthor, University of Lincoln astrophysicist Phil Sutton, studied changes in oxygen isotopes in a database of tens of thousands of dated crystals of zircon—a silicate mineral with the chemical formula ZrSiO4 that is common in Earth’s crust. They compared their findings to maps of the Milky Way galaxy that show its neutral hydrogen, or H I.

H I, with one proton and one electron, is the most abundant element in the universe, and its density is particularly high in the arms of the Milky Way galaxy.

Because they are almost exactly the same size, uranium atoms sometimes replace the zirconium atoms in zircon. Uranium radioactively decays into lead over time, so geologists can study the levels of uranium and lead isotopes in zircon crystals to determine when the crystals formed, sometimes in the first phases of the evolution of Earth’s crust about 4.4 billion years ago.
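That built-in clock can be sketched with the standard ²³⁸U→²⁰⁶Pb age equation, t = ln(1 + ²⁰⁶Pb/²³⁸U)/λ. The measured ratio below is illustrative, not a value from the study:

```python
import math

LAMBDA_238 = 1.55125e-10  # decay constant of 238U, per year

def u_pb_age(pb206_u238_ratio):
    """Age in years from a measured radiogenic 206Pb/238U atomic ratio:
    t = ln(1 + Pb/U) / lambda."""
    return math.log(1.0 + pb206_u238_ratio) / LAMBDA_238

# An illustrative ratio close to that expected for the oldest zircons (~4.4 Gyr)
print(f"{u_pb_age(0.979) / 1e9:.2f} billion years")
```

A ratio near 1 (about as much radiogenic lead as remaining uranium) corresponds to an age of roughly 4.4 billion years, matching the oldest zircons known.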

“Zircon crystals are a geologist’s best friend…we can get a lot of information from a single zircon grain.”

“Zircon crystals are a geologist’s best friend,” Kirkland said. “They have an inbuilt clock, and they carry a chemical signature that tells us how they formed—so we can get a lot of information from a single zircon grain.”

Queen’s University geochemist Christopher Spencer, who was not involved in the study, said that the work was fascinating and provocative. “I think the study is a reminder that Earth does not evolve in isolation and that interdisciplinary thinking, however speculative at first, can open up new ways of framing questions about our planet’s history.”

Oxygen Isotope Ratios

The key to the latest research was in the ratios of isotopes—forms of the same chemical element that have different numbers of neutrons—in the oxygen atoms of zircon’s silicate group.

The relative levels of oxygen isotopes in samples of zircon crystals can tell geologists whether the crystals formed high in the crust, perhaps while interacting with water and sediments, or deeper within Earth’s mantle.

Kirkland said the latest study examined the distribution of the ratios of oxygen isotopes found in a dataset of zircon crystals sampled from around the world. The scientists evaluated the data’s “kurtosis,” a measure of how flat or peaked a distribution is. A dataset with high kurtosis has a narrow distribution, with most values clustered near the middle, producing a sharp peak in the distribution curve. In contrast, a dataset with low kurtosis has a wider spread of high and low values and a less pronounced peak.
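The contrast can be demonstrated with a short script. The excess-kurtosis helper and the two synthetic samples below are illustrative only, not the study’s zircon data:

```python
import random
from statistics import fmean

def excess_kurtosis(xs):
    """Fisher (excess) kurtosis: fourth standardized moment minus 3,
    so a normal distribution scores 0."""
    mu = fmean(xs)
    m2 = fmean([(x - mu) ** 2 for x in xs])
    m4 = fmean([(x - mu) ** 4 for x in xs])
    return m4 / m2**2 - 3.0

random.seed(42)
n = 50_000
# Laplace-like sample: sharply peaked with heavy tails -> high kurtosis
peaked = [random.choice([-1, 1]) * random.expovariate(1.0) for _ in range(n)]
# Uniform sample: flat-topped with no tails -> low kurtosis
flat = [random.uniform(-1.0, 1.0) for _ in range(n)]

print(f"peaked: {excess_kurtosis(peaked):+.2f}")  # near +3
print(f"flat:   {excess_kurtosis(flat):+.2f}")    # near -1.2
```

The sharply peaked sample scores well above zero, the flat-topped one well below, which is the kind of distinction the authors tracked through time in the zircon record.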

The researchers determined that periods of high oxygen isotope kurtosis corresponded to times when our solar system was crossing the dense spiral arms of the Milky Way galaxy. Such crossings occurred roughly every 187 million years on average during our solar system’s 748-million-year orbit around the galactic center at a speed of about 240 kilometers per second.

In addition to H I, the spiral arms contain many more stars than the interstellar space between them. The gravity of those stars seems to have disturbed the Oort Cloud—the haze of billions of icy bodies that surrounds our solar system. That, in turn, caused more meteorites and comets to strike Earth as it passed through the galactic arms, leading to the subsequent melting of the crust in many places, Kirkland said. “By looking at the variability of the [zircon] signal over time, we were able to get an indication of how different the magma production on the planet was at that time.”

Professor Chris Kirkland uses an ion microprobe to date zircon mineral grains. Credit: C. L. Kirkland

He warned that correlation does not mean causation but said that in this case there seemed to be no other plausible cause for the periodic kurtosis of the oxygen isotope ratios in zircons. “It is very important that we are able to see the frequency of [meteor and comet] impacts” on Earth, Kirkland said. “Rather than an internal process, we seem to be looking at an external process.”

Other experts say the new study is notable for outlining the concept that galactic processes could have left geological traces, but they caution that it does not yet offer conclusive proof.

Earth scientist Craig Storey of the University of Portsmouth in the United Kingdom, who was not involved in the new study, said crustal melting did not necessarily prove an increase in meteorite or comet impacts. Instead, natural processes here on Earth, such as volcanic or tectonic movements, could have caused melting of the crust at several stages of our planet’s geological history.

He is also concerned that some of the proposed correlations in the study may not be correct. “It is an interesting idea, and there are potentially ways to test it, but I don’t think this is the way to test it,” Storey said.

—Tom Metcalfe (@HHAspasia) Science Writer

Citation: Metcalfe, T. (2025), Zircon crystals could reveal Earth’s path among the stars, Eos, 106, https://doi.org/10.1029/2025EO250379. Published on 10 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

New 3D Model Reveals Geophysical Structures Beneath Britain

Fri, 10/10/2025 - 12:53
Source: Journal of Geophysical Research: Solid Earth

Magnetotelluric (MT) data, which contain measurements of electric and magnetic field variations at Earth’s surface, provide insights into the electrical resistivity of Earth’s crust and upper mantle. Changes in resistivity, or the ability to conduct an electrical current, can indicate the presence of geologic features such as igneous intrusions or sedimentary basins, meaning MT surveys can complement other kinds of geophysical surveys to help reveal Earth’s subsurface. In addition, such surveys can play an important role in improving understanding of the risks space weather poses to human infrastructure.
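The basic MT relationship can be sketched for the simplest possible Earth, a uniform half-space, where the surface impedance has magnitude √(ωμ₀ρ) and a 45° phase. The resistivity and period values below are illustrative, not from the study:

```python
import cmath
import math

MU0 = 4e-7 * math.pi  # magnetic permeability of free space, H/m

def halfspace_impedance(rho, period):
    """Surface impedance (ohms) of a uniform half-space of resistivity rho (ohm·m)
    at the given period (s): |Z| = sqrt(omega * mu0 * rho), phase +45 degrees."""
    omega = 2.0 * math.pi / period
    return math.sqrt(omega * MU0 * rho) * cmath.exp(1j * math.pi / 4.0)

def apparent_resistivity(Z, period):
    """Invert an impedance estimate back to apparent resistivity:
    rho_a = |Z|^2 / (omega * mu0)."""
    omega = 2.0 * math.pi / period
    return abs(Z) ** 2 / (omega * MU0)

Z = halfspace_impedance(100.0, 100.0)                 # 100 ohm·m ground, 100 s period
print(f"{apparent_resistivity(Z, 100.0):.1f} ohm·m")  # recovers 100.0
```

Real MT inversions such as the one described here estimate a full 3D resistivity structure rather than a single half-space value, but the same impedance-to-resistivity relationship underlies them.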

Montiel-Álvarez et al. present the first 3D electrical resistivity model of Britain, based on long-period MT data (using measurements gathered every second for 4–6 weeks at a time) from across the island. Their model, called BERM-2024, points to previously recognized tectonic and geological structures as well as likely new ones. The authors also model the effects of a recent solar storm on Earth’s geoelectric field, validating the usefulness of MT-based approaches for space weather impact forecasting.

The BERM-2024 electrical resistivity model is based on MT data from 69 sites in Britain, including both new and legacy datasets. Creating the final model involved processing the raw time series data and accounting for the “coastal effect” caused by the conductivity of ocean water when inverting the data—or calculating causes based on observations.

Sensitivity tests of the new model indicate it resolves features to depths of 200 kilometers (125 miles), including many known from other geophysical surveys and geological observations. It also reveals new anomalies, including highly conductive areas under Scotland’s Southern Uplands Terrane and a resistive anomaly under the island of Anglesey. More intriguingly, a large, previously unknown conductive anomaly appears in their model between 85 and 140 kilometers (52–87 miles) beneath the West Midlands region.

The authors tested the utility of their resistivity model for estimating the electric field at Earth’s surface, which is key in forecasting the effects of geomagnetically induced currents caused by space weather. To do so, they obtained a time series of the horizontal electric field across Britain during a solar storm that occurred on 10–11 October 2024, which led to bright displays of aurora borealis across the Northern Hemisphere. They found good agreement between their modeled time series and those measured at observatories, indicating that electrical resistivity models are a tool that can provide accurate information for space weather impact planning. (Journal of Geophysical Research: Solid Earth, https://doi.org/10.1029/2025JB031813, 2025)

—Nathaniel Scharping (@nathanielscharp), Science Writer

Citation: Scharping, N. (2025), New 3D model reveals geophysical structures beneath Britain, Eos, 106, https://doi.org/10.1029/2025EO250381. Published on 10 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Pinpointing Sewage Seeps in Hawaii

Thu, 10/09/2025 - 13:09

In Hawaii, most of the population relies on private septic tanks or cesspools to dispose of sewage and other wastewater. There are more than 88,000 cesspools in the state, with about 55,000 on the Big Island alone. Unlike more strictly regulated municipal wastewater treatment systems, these systems carry a higher risk of leaking sewage into the porous substrate.

A recent study published in Frontiers in Marine Science identifies sewage-contaminated submarine groundwater discharge (SGD) sites, pinpointing specific locations that stakeholders may want to prioritize for mitigation efforts.

Modeling and Mapping

Previous studies estimated that groundwater flows deliver 3 to 4 times more discharge to oceans than rivers do, making them significant pathways for transporting pollutants.

In response to pollution concerns from the local community, a team from Arizona State University, with the support of the Hawaiʻi Marine Education and Research Center, used airborne mapping to identify locations where SGD reached the ocean along the western coastline of the Big Island.

Sewage-contaminated water (colored blue in this photograph) enters the ocean from submarine groundwater discharge sites on the Kona coast of the Big Island. Credit: ASU Global Airborne Observatory

To precisely identify these freshwater-seawater interfaces, researchers built on previous studies that used thermal sensors to capture the temperature difference between the two bodies of water. Figuring out which of these discrete interface points were problematic “was very challenging,” said Kelly Hondula, a researcher at the Center for Global Discovery and Conservation Science and first author of the study.

The team identified more than 1,000 discharge points and collected samples from 47 locations. “We chose points where we could localize freshwater emerging from the land or points of high community interest,” explained Hondula.

In addition to aerial surveys, researchers analyzed the discharge points by monitoring their salinity gradients and measuring levels of Enterococcus, a group of bacteria that frequently serve as key fecal indicators in public health testing. They integrated these data into a statistical model that used upstream land cover and known sewage sites to predict the likelihood of sewage and bacterial contamination for each SGD site along the western Hawaiʻi coastline.
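One common choice for this kind of presence/absence prediction is logistic regression. The sketch below is a generic illustration with made-up features and labels; it is not the authors’ model or data:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=3000):
    """Batch gradient descent for logistic regression (no regularization)."""
    w = [0.0] * len(X[0])
    b = 0.0
    n = len(X)
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

# Hypothetical predictors per discharge point (both made up for illustration):
# [fraction of developed land within 500 m of the coast, upstream cesspool density]
X = [[0.8, 0.9], [0.7, 0.6], [0.9, 0.8], [0.1, 0.2], [0.2, 0.1], [0.05, 0.3]]
y = [1, 1, 1, 0, 0, 0]  # 1 = Enterococcus above a contamination threshold

w, b = fit_logistic(X, y)
p = sigmoid(sum(wj * xj for wj, xj in zip(w, [0.85, 0.7])) + b)
print(f"predicted contamination probability for a heavily developed site: {p:.2f}")
```

A fitted model of this form turns each site’s land cover predictors into a contamination probability, which is the kind of per-site likelihood the study produced along the coastline.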

The techniques allowed scientists to identify regions of the built environment that are associated with contamination. Besides areas with septic systems and cesspools, they found a high correlation between sewage discharge and development within the first 500 meters of the coast.

“Sewage going into the ground comes out in the ocean, with often a worrying level of waste contamination.”

The geology of a discharge point also contributes to its risk of contamination. Discharge points around the island’s South Kona region, for instance, feature “some of the youngest and most porous volcanic substrate in the archipelago, with little soil development and a high degree of hydrologic connectivity between point sources of pollution and coastal waters,” the authors wrote. Although South Kona has relatively sparse development, increased land use will likely have a disproportionate effect on groundwater quality, they concluded.

“We were surprised to find such clear results: Sewage going into the ground comes out in the ocean, with often a worrying level of waste contamination,” said Hondula.

Mapping Mitigation

As communities continue to invest in coastal development, understanding the effect of sewage discharge and how to avoid it is becoming an increasingly pressing concern worldwide.

As such, the new study “contributes to the growing body of evidence correlating sewage-tainted groundwater discharge with coastal water quality, showing a strong linkage between wastewater and development in the nearshore area. That’s something that land managers and conservation scientists should really take into account,” said Henrietta Dulai, a geochemist at the University of Hawaiʻi at Mānoa who was not involved in the study.

The state of Hawaii has recognized the particular risk posed by largely unregulated cesspools leaking sewage-contaminated groundwater to the ocean. In fact, there is a state mandate to eliminate cesspools by 2050, but the associated cost is slowing the process.

Many scientists say the costs of phasing out cesspools are far outweighed by the health benefits. “We need to consider the financial sides of replacing cesspools versus the benefit of preserving the water quality for the environment and the people,” said Tristan McKenzie, a researcher at the University of Gothenburg, Sweden, who was not involved in the study. “Studies like this highlight why we need to act now.”

—Anna Napolitano (@anna83nap; @anna83nap.bsky.social), Science Writer

Citation: Napolitano, A. (2025), Pinpointing sewage seeps in Hawaii, Eos, 106, https://doi.org/10.1029/2025EO250376. Published on 9 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

A Step Toward AI Modeling of the Whole Earth System

Thu, 10/09/2025 - 13:08
Source: Journal of Geophysical Research: Machine Learning and Computation

Modelers have demonstrated that artificial intelligence (AI) models can produce climate simulations more efficiently than physics-based models can. However, many AI models are trained on past climate data, making it difficult for them to predict how climate might respond to future changes, such as further increases in the concentration of greenhouse gases.

Clark et al. have taken another step toward using AI to model complex Earth systems by coupling an AI model of the atmosphere (called the Ai2 Climate Emulator, or ACE) with a physical model of the ocean (called a slab ocean model, or SOM) to produce a model they call ACE2-SOM. They trained ACE2-SOM on output from a 100-kilometer-resolution physics-based model run across a range of climates.
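A slab ocean model treats the ocean mixed layer as a single well-mixed heat reservoir obeying the energy balance C dT/dt = F − λT. The sketch below uses illustrative parameter values, not those of ACE2-SOM:

```python
# Minimal slab-ocean energy balance, integrated with forward Euler:
#   C * dT/dt = F - lam * T
C = 4.2e8    # mixed-layer heat capacity for ~100 m of water (J m^-2 K^-1)
F = 3.7      # radiative forcing from doubled CO2 (W m^-2)
lam = 1.2    # climate feedback parameter (W m^-2 K^-1); illustrative value

dt = 86_400.0  # one-day time step (s)
T = 0.0        # sea surface temperature anomaly (K)
for _ in range(365 * 50):  # integrate 50 years
    T += dt * (F - lam * T) / C

print(f"warming after 50 years: {T:.2f} K (equilibrium = F/lam = {F / lam:.2f} K)")
```

The large heat capacity C gives the ocean a multiyear adjustment timescale (C/λ, here roughly a decade), which is exactly the lag the atmosphere-only emulator failed to respect in the rapid-forcing experiment described below.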

In response to increased atmospheric carbon dioxide, ACE2-SOM reproduced well-known responses seen in its target model, such as surface temperature increasing more strongly over land than over ocean, and wet areas becoming wetter and dry areas becoming drier. When the researchers compared their results with those of a 400-kilometer-resolution version of the physics-based model they were emulating, they found that ACE2-SOM was both more accurate and more cost-effective: It used 25 times less power while providing a resolution that was 4 times finer.

But ACE2-SOM struggled when the researchers asked it to predict what would happen if atmospheric carbon dioxide levels rose rapidly (e.g., suddenly quadrupling). While the ocean surface temperature took the appropriate time to adjust, the atmosphere almost immediately shifted to the equilibrium climate under the new carbon dioxide concentration, even though physical laws would dictate a slower response.

To become fully competitive with physics-based models, AI climate models will need to become better able to model unusual situations, the authors write. The slab ocean model used in this study is also highly simplified. So to maintain their efficiency advantage while improving realism, AI models will also need to incorporate additional parts of the Earth system, such as ocean circulation and sea ice coverage, the researchers add. (Journal of Geophysical Research: Machine Learning and Computation, https://doi.org/10.1029/2024JH000575, 2025)

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2025), A step toward AI modeling of the whole Earth system, Eos, 106, https://doi.org/10.1029/2025EO250362. Published on 9 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
