Eos

Science News by AGU

How Plant-Fungi Friendships Are Changing

Wed, 10/22/2025 - 13:30
Source: Journal of Geophysical Research: Biogeosciences

Just as the human body contains a multitude of symbiotic microbial companions, most plant species also live alongside microbial friends. Among these companions are mycorrhizal fungi, which help plants gather water and nutrients—particularly nitrogen—from the soil. In exchange, plants provide mycorrhizal fungi with an average of 3% to 13% of the carbon they pull from the atmosphere through photosynthesis and sometimes as much as 50%.

This donation can represent a significant carbon cost for plants. But few groups have investigated how environmental factors such as soil temperature and nitrogen levels influence the amount of carbon flowing from plants to mycorrhizal fungi, or how this flow is likely to shift with climate change. To fill this gap, Shao et al. developed a model they call Myco-CORPSE (Mycorrhizal Carbon, Organisms, Rhizosphere, and Protection in the Soil Environment) that illustrates how the environment influences interactions between plants and mycorrhizal fungi.

When the researchers fed data from more than 1,800 forest sites in the eastern United States into Myco-CORPSE, they obtained some familiar results and also made some new discoveries. The model echoed previous work in suggesting that increasing the abundance of soil nitrogen (through fertilizer runoff, for example) decreases the dependence of plants on mycorrhizal fungi and therefore reduces the amount of carbon plants allocate to their microbial counterparts. But in contrast to previous studies, these researchers found that rising soil temperatures similarly reduce the amount of nitrogen and carbon exchanged between fungi and plants. That’s because warmth accelerates the breakdown of organic material, which releases nitrogen. Increasing atmospheric carbon dioxide levels, on the other hand, will likely increase the reliance of plants on mycorrhizal fungi by boosting plant growth rates and therefore plants’ need for nutrients.

The Myco-CORPSE model also replicated observed patterns showing that trees associated with the two major kinds of mycorrhizal fungi (arbuscular and ectomycorrhizal) behave differently: Trees with arbuscular mycorrhizae tend to donate less carbon to their associated fungi than ectomycorrhizal trees donate to theirs. The model also found that in forests with a mix of both kinds of fungi, the fungi typically receive less carbon from plants than in forests with less mycorrhizal diversity.

As forest managers navigate the many stresses that forests face today, promoting a diversity of mycorrhizal species within forests could optimize plant growth while minimizing the carbon diverted to mycorrhizal fungi, the researchers wrote. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1029/2025JG009198, 2025)

This article is part of the special collection Biogeosciences Leaders of Tomorrow: JGR: Biogeosciences Special Collection on Emerging Scientists.

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2025), How plant-fungi friendships are changing, Eos, 106, https://doi.org/10.1029/2025EO250397. Published on 22 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

An Asteroid Impact May Have Led to Flooding near the Grand Canyon

Wed, 10/22/2025 - 13:30

When it comes to famous holes in the ground, northern Arizona has two: Grand Canyon and Barringer Meteorite Crater.

New research now suggests that these famous depressions might, in fact, be linked—the impact that created the crater roughly 56,000 years ago might also have unleashed landslides in a canyon that’s part of Grand Canyon National Park today. Those landslides in turn likely dammed the Colorado River and temporarily created an 80-kilometer-long lake, the team proposed. The results were published in Geology.

Driftwood Then and Now

“These are two iconic features of Arizona.”

Karl Karlstrom, a geologist recently retired from the University of New Mexico, grew up in Flagstaff, Ariz. Grand Canyon and Barringer Meteorite Crater were therefore both in his proverbial backyard. “These are two iconic features of Arizona,” said Karlstrom.

Karlstrom’s father—also a geologist—used to regularly explore the caves that dot the walls of Grand Canyon and surrounding canyons. In 1970, he collected two pieces of driftwood from a cavern known as Stanton’s Cave. The mouth of Stanton’s Cave is more than 40 meters above the Colorado River, so finding driftwood in its recesses was unexpected. Routine flooding couldn’t have lofted woody detritus that high, said Karlstrom. “It would have required a flood 10 times bigger than any known flood over the last 2,000 years.”

The best radiocarbon dating available in the 1970s suggested that the driftwood was at least 35,000 years old. A colleague of the elder Karlstrom suggested that the driftwood had floated into Stanton’s Cave when an ancient landslide temporarily dammed the Colorado, raising water levels. The researchers even identified the likely site of the landslide—a wall of limestone in Nankoweap Canyon.

But what had set off that landslide in the first place? That’s the question that Karl Karlstrom and his colleagues sought to answer. In 2023, the researchers collected two additional samples of driftwood from another cave 5 kilometers downriver from Stanton’s Cave.

A “Striking” Coincidence

Modern radiocarbon dating of both the archival and newly collected driftwood samples yielded ages of roughly 56,000 years, with uncertainties of a few thousand years. The team also dated sand collected from the second cave; it too yielded ages consistent, within error, with emplacement roughly 56,000 years ago.
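As a back-of-envelope illustration (not a calculation from the study) of why ages near 56,000 years pushed past the limits of 1970s-era radiocarbon methods, one can compute the fraction of carbon-14 surviving in a sample of that age, assuming the standard 5,730-year half-life:

```python
# Hypothetical sketch, not from the study: radiocarbon decays with a
# half-life of about 5,730 years, so the surviving fraction after a
# given age is 0.5 ** (age / half_life).
HALF_LIFE_YEARS = 5730.0

def c14_fraction_remaining(age_years: float) -> float:
    """Fraction of the original carbon-14 left after age_years."""
    return 0.5 ** (age_years / HALF_LIFE_YEARS)

print(f"{c14_fraction_remaining(56_000):.5f}")  # about 0.00114
print(f"{c14_fraction_remaining(35_000):.4f}")  # about 0.0145
```

At 56,000 years, barely a tenth of a percent of the original carbon-14 survives, roughly a tenth of what remains at 35,000 years, which is why modern accelerator-based measurements were needed to resolve the older ages.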

The potential significance of that timing didn’t sink in until one of Karlstrom’s international collaborators took a road trip to nearby Barringer Meteorite Crater, also known as Meteor Crater. There, he learned that the crater is believed to have formed around 56,000 years ago.

That coincidence was striking, said Karlstrom, and it got the team thinking that perhaps these two famous landmarks of northern Arizona—Meteor Crater and Grand Canyon National Park—might be linked. The impact that created Meteor Crater has been estimated to have produced ground shaking equivalent to that of an M5.2–5.4 earthquake. At the 160-kilometer distance of Nankoweap Canyon, the purported site of the landsliding, that ground movement would have been attenuated to roughly M3.3–3.5.

It’s impossible to know for sure whether such movement could have dislodged the limestone boulders of Nankoweap Canyon, Karlstrom and his colleagues concede; that’s where future modeling work will come in. An asteroid impact likely produces a distinctly different shaking signature than an earthquake caused by slip on a fault, Karlstrom noted. “Fault slip earthquakes release energy from several kilometers depths whereas impacts may produce larger surface waves.”

But there’s good evidence that a cliff in Nankoweap Canyon did, indeed, let go, said Chris Baisan, a dendrochronologist at the Laboratory of Tree-Ring Research at the University of Arizona and a member of the research team. “There was an area where it looked like the canyon wall had collapsed across the river.”

An Ancient Lake

Using the heights above the Colorado where the driftwood and sand samples were collected, the team estimated that an ancient lake extended from Nankoweap Canyon nearly 80 kilometers upstream. At its deepest point, it would have been roughly 90 meters deep. Such a feature likely persisted for several decades until the lake filled with sediment, allowing the river to overtop the dam and quickly erode it, the team concluded.

“They’re certainly close, if not contemporaneous.”

The synchronicity in ages between the Meteor Crater impact and the evidence of a paleolake in Nankoweap Canyon is impressive, said John Spray, a planetary scientist at the University of New Brunswick in Canada not involved in the research. “They’re certainly close, if not contemporaneous.” And while it’s difficult to prove causation, the team’s assertion that an impact set landslides in motion in the area around Grand Canyon is convincing, he added. “I think the likelihood of it being responsible is very high.”

Karlstrom and his collaborators are continuing to collect more samples from caves in Grand Canyon National Park. So far, they’ve found additional evidence of material that dates to roughly 56,000 years ago, as well as even older samples. It seems that there might have been multiple generations of lakes in the Grand Canyon area, said Karlstrom. “The story is getting more complicated.”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), An asteroid impact may have led to flooding near the Grand Canyon, Eos, 106, https://doi.org/10.1029/2025EO250391. Published on 22 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Another landslide dam flood at the site of the Matai’an rock avalanche in Taiwan

Wed, 10/22/2025 - 06:59

Failure of the landslide debris from the Matai’an rock avalanche allowed another barrier lake to form. This breached on 21 October 2025, generating another damaging debris flow.

Newspapers in Taiwan are reporting that a new landslide barrier lake formed and then failed at the site of the giant Matai’an rock avalanche. The breach apparently occurred at about 9 pm local time on 21 October 2025. The risk had been identified in advance and the downstream population had been evacuated successfully this time, so there are no reports of fatalities.

The Taipei Times has an image of the barrier lake that was released by the Hualien branch of the Forestry and Nature Conservation Agency:-

The Matai’an landslide barrier lakes prior to the failure of the lower one on 21 October 2025. Photo courtesy of the Hualien branch of the Forestry and Nature Conservation Agency via the Taipei Times.

There is also a video on Youtube from Focus Taiwan (CNA English News) that includes helicopter footage of the site, also provided by the Forestry and Nature Conservation Agency:-

This includes the following still:-

The lower Matai’an landslide barrier lake prior to the failure on 21 October 2025. Still from a video posted to Youtube by CNA English News – original footage courtesy of the Hualien branch of the Forestry and Nature Conservation Agency.

It appears to me that the barrier lake has formed because of a large landslide in the debris from the original rock avalanche – note the dark coloured landslide scar on the left side of the image.

Loyal readers will remember that I highlighted that this could be an issue in my post on 3 October:-

“So, a very interesting question will now pertain to the stability of these slopes. How will they perform in conditions of intense rainfall and/or earthquake shaking? Is there the potential for a substantial slope failure on either side, allowing a new (enlarged) lake to form.”

“This will need active monitoring (InSAR may well be ideal). The potential problems associated with the Matai’an landslide are most certainly not over yet.”

There is a high probability that this will be a recurring issue in periods of heavy rainfall.

Meanwhile, keep a close eye on Tropical Storm Melissa, which is tracking slowly northwards through the Caribbean. Because it is moving so slowly, it could bring exceptionally high rainfall totals to Haiti and Jamaica. This one looks like a disaster in waiting at the moment.

Text © 2023. The authors. CC BY-NC-ND 3.0

To Find Critical Minerals, Look to Plate Tectonics

Tue, 10/21/2025 - 13:31

For much of the 20th century, “petroleum politics” shaped international policy. In the 21st century, a new set of resources has taken center stage: critical minerals. Sourcing and extracting these minerals have become a priority for countries and communities around the world because they are used in everything from solar panels to cell phones to superconductors.

A new study suggests where prospectors can search for critical minerals: rifting sites left behind by the supercontinent Rodinia, which began breaking up roughly 800 million years ago, during the Neoproterozoic.

“To better find [critical] resources, really, we need a better understanding of geology.”

“Unless it is grown, absolutely everything on the planet that we use as a manufactured good requires something that comes out of a mine,” said Chris Kirkland, a geologist at Curtin University in Australia and a coauthor of the new study, published last month in Geological Magazine. “To better find those resources, really, we need a better understanding of geology.”

Kirkland and his colleagues began by analyzing rocks unearthed by drilling companies in Western Australia. The slabs contain carbonatite, a “weird,” rare, and poorly understood kind of igneous rock formed in the mantle from magmas rich in carbonate minerals. As the magmas rise through Earth’s interior, they react with surrounding rocks, altering the chemical signatures that geologists typically use to trace a sample’s origins.

Carbonatites often contain critical metals, such as niobium. Although niobium can be found in other kinds of rocks, carbonatites are the only ones offering it in amounts economically suitable for extraction. The Western Australia sites are home to more than 200 million metric tons of the metal.

The team “threw the whole kitchen sink of analytical techniques” at the carbonatites, explained Kirkland. The first step was to take a drill core sample and image its structure to see the broad geological ingredients inside. Then the researchers used lasers to sample individual grains and analyze their constituent crystals.

The carbonatites contained zircon, apatite, and mica, all minerals with isotopes that decay at known rates and can tell researchers about a sample’s age and source. The researchers also analyzed the helium present in the zircon, because helium is a volatile element that easily escapes rocks near the surface and can help reveal when the rocks reached the crust.

Written in Stone

The story written in the slabs is one tied to the long history of plate tectonics. The breakup of Rodinia began around 800 million years ago and continued for millions of years as hot, metal-enriched oozes of magma rose up from the mantle. Pressure from this rising rock helped split apart the supercontinent, and the metals encased in carbonatites breached the surface at the margins of once-stable blocks of continental crust called cratons.

Today, said Kirkland, tracking these “old fossil scars” where cratons split could reveal stores of minerals.

More than 200 million metric tons of niobium were recently identified in Australia’s Aileron Province, a likely result of the breakup of Rodinia. Credit: Dröllner et al., 2025, https://doi.org/10.1017/S0016756825100204

“Reconstructing a geologic history for one particular area on Earth is something that I think has potential to help us in better understanding these pretty poorly understood carbonatite systems globally,” said Montana State University geologist Zachary Murguía Burton, who was not involved with the paper.

Burton estimates that some 20% of the carbonatites on Earth contain economically attractive concentrations of critical minerals, although he noted that the rocks in the study experienced a unique confluence of local and regional geologic processes that might influence the minerals they contain.

In particular, analyses of the carbonatites in the new study identified the source of recently discovered niobium deposits beneath central Australia. Niobium is a critical mineral used in lithium-ion batteries and to strengthen and lighten steel. Because 90% of today’s supply of niobium comes from a single operation in Brazil, finding additional deposits is a priority.

In addition to niobium, Kirkland said a geologic “recipe” similar to the one his team identified might work for finding gold.

The work is an important reminder of “how tiny minerals and clever dating techniques can not only solve deep-time geological puzzles, but also help guide the hunt for the critical metals we need,” Kirkland said.

—Hannah Richter (@hannah-richter.bsky.social), Science Writer

Citation: Richter, H. (2025), To find critical minerals, look to plate tectonics, Eos, 106, https://doi.org/10.1029/2025EO250393. Published on 21 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Seismic Anisotropy Reveals Deep-Mantle Dynamics

Tue, 10/21/2025 - 13:31
Source: Geochemistry, Geophysics, Geosystems

In some parts of Earth’s interior, seismic waves travel at different speeds depending on the direction in which they are moving through the rock. This property, known as seismic anisotropy, can offer important information about how the silicate rock of the mantle—particularly at its lowermost depths—deforms. In contrast, areas through which seismic waves travel at the same speed regardless of direction are considered isotropic.

In the bottom 300 kilometers of the mantle, also known as the D″ layer, anisotropy is potentially caused by mantle plumes or by mantle flow interacting with the edges of large low-shear-velocity provinces: continent-sized, dense, hot BLOBs (big lower-mantle basal structures) at the base of the mantle, above the core. Many questions persist about the viscosity, movement, stability, and shape of the BLOBs, as well as about how they are influenced by mantle plumes and subduction.

Roy et al. used ASPECT, a 3D mantle convection modeling code, and ECOMAN, a mantle fabric simulation code, to examine the deep mantle. They tested five different mantle model configurations, adjusting the viscosity and density of the BLOBs. The goal was to see which configuration would most closely re-create the observed seismic anisotropy.

The researchers treated the BLOBs as regions with their own unique chemistry, which form from a 100-kilometer-thick layer at the bottom of the mantle. Their models simulated how mantle plumes formed over the past 250 million years, during which time events such as the breakup of Pangaea, the opening of the Atlantic, and the evolution of various subduction zones occurred.

The study suggests that the observed seismic anisotropy is best explained when the BLOBs are 2% denser and 100 times more viscous than the surrounding mantle. This configuration aligns with anisotropy patterns observed in seismic data. Plumes form mainly at the edges of BLOBs, where strong deformation causes strong anisotropy. (Geochemistry, Geophysics, Geosystems, https://doi.org/10.1029/2025GC012510, 2025)

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2025), Seismic anisotropy reveals deep-mantle dynamics, Eos, 106, https://doi.org/10.1029/2025EO250392. Published on 21 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Alaska Awaits Response from FEMA in the Aftermath of Major Floods

Mon, 10/20/2025 - 16:45
Research & Developments is a blog for brief updates that provide context for the flurry of news that impacts science and scientists today.

Major floods in Alaska have killed at least one person and displaced thousands over the course of the last two weeks. Many of the displaced may not be able to return home for 18 months or longer, according to Alaska Gov. Mike Dunleavy.

Tropical Storm Halong formed in the northern Philippine Sea on 5 October and had become a category 4 typhoon by 7 October. Though it was considered an ex-typhoon by the time it reached western Alaska, the storm brought wind speeds of up to 113 miles per hour (181 kilometers per hour), along with severe flooding across the Yukon Delta, Kuskokwim Delta, and Norton Sound.


Among the hardest hit population centers were the villages of Kipnuk and Kwigillingok, home to a combined 1,000 people, mostly Alaska Native or American Indian. At this time of year, the remote villages can only be reached by water or by air.

In Kipnuk, water levels rose 5.9 feet (1.8 meters) above the normal highest tide line. In Kwigillingok, water levels measured 6.3 feet (1.9 meters) above the normal highest tide line—more than double the previous record set in 1990. According to a letter from the governor’s office to President Trump, 90% of structures in Kipnuk and 35% of structures in Kwigillingok have been destroyed.

The Alaska Air and Army National Guard, the U.S. Coast Guard, and Alaska State Troopers evacuated hundreds of residents to the regional hub of Bethel, then to Anchorage, the state’s largest city, in what the Alaska National Guard called the largest airlift operation in state history.

“It’s been an all-hands-on deck endeavor, and everybody is trying to support their fellow Alaskans in their time of need,” said Col. Christy Brewer, the Alaska National Guard director of joint operations, in a 19 October statement.

Silence From FEMA

But calls for assistance from the Federal Emergency Management Agency seem to have so far gone unanswered, leaving some people asking, “Where is FEMA?”

An urgent question. According to the FEMA Daily Briefing a presidential disaster declaration was requested on October 16th. To the best of my knowledge it hasn’t been granted. Any event of this size should be an easy and immediate yes.

Dr. Samantha Montano (@samlmontano.bsky.social) 2025-10-18T23:13:44.421Z

As reported by the New York Times, the EPA revoked a $20 million grant in May that was intended to protect Kipnuk from extreme flooding. The grant cancellation was likely part of a larger effort by the administration to shift the burden of disaster response to states.

On 16 October, Dunleavy submitted a request to President Trump to declare a major disaster for the state.

The letter notes that Alaska has seen 57 state-declared disasters since November 2018, 14 of which have been approved for federal disaster assistance. There have been 14 state-declared disasters in Alaska in the last 12 months alone, including fires, freezes, landslides, and floods.

“It is anticipated that more than 1,500 Alaskans will be evacuated to our major cities, many of whom will not be able to return to their communities and homes for upwards of 18 months,” Gov. Dunleavy wrote. “This incident is of such magnitude and severity that an effective response exceeds state and local capabilities, necessitating supplementary federal assistance to save lives, protect property, public health, and safety, and mitigate the threat of further disaster.”

On 17 October, Alaska’s senators and state representative (all Republicans) also submitted a letter to President Trump, urging him to approve the governor’s request for a major disaster declaration.

Also on 17 October, Vice President JD Vance said on X that he and the president were “closely tracking the storm devastation” and that the federal government was working closely with Alaska officials. On 18 October, Sen. Lisa Murkowski (R-AK) said she believed FEMA representatives were “totally on the ground.”

However, as of 20 October, the incident is not listed in FEMA’s disaster declaration database.

—Emily Gardner (@emfurd.bsky.social) Associate Editor

These updates are made possible through information from the scientific community. Do you have a story about science or scientists? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0

The Southern Ocean May Be Building Up a Massive Burp

Mon, 10/20/2025 - 13:16
Source: AGU Advances

The ocean has helped mitigate global warming by absorbing around a quarter of anthropogenic carbon dioxide (CO2) emissions, along with more than 90% of the excess heat those emissions generate.

Many efforts, including assessments by the Intergovernmental Panel on Climate Change, have examined how the oceans may continue to mitigate increasing emissions and global warming. However, few have looked at the opposite question: How will the oceans respond if atmospheric CO2 levels and the associated excess heat begin to decline under net negative emissions?

Frenger et al. examined what might happen in the Southern Ocean if, after more than a century of human-induced warming, global mean temperatures were reduced via CO2 removal from the atmosphere. The Southern Ocean is a dynamic system, with large-scale upwelling and a robust ability to take up excess carbon and heat. To better understand how the Southern Ocean will behave under net negative carbon conditions, the researchers modeled how the ocean and the atmosphere would interact.

They used the University of Victoria climate model, UVic v. 2.9, to simulate multicentury timescales and carbon cycle feedbacks. UVic uses a combination of an atmospheric energy–moisture balance model, an ocean circulation and sea ice model, a land biosphere model, and an ocean biochemistry model. The researchers used UVic to model an idealized climate change scenario commonly used in climate modeling: Emissions increase until atmospheric CO2 levels double after 70 years, followed by a steep emissions cut and subsequent sustained net negative emissions.
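As a quick sanity check on that idealized ramp-up (simple arithmetic, not the UVic model itself), doubling atmospheric CO2 over 70 years implies compound growth of about 1% per year:

```python
# Illustrative arithmetic only, not the UVic v2.9 simulation: find the
# constant annual growth rate r that doubles CO2 in 70 years, i.e.
# solve (1 + r) ** 70 == 2.
YEARS_TO_DOUBLE = 70

r = 2 ** (1 / YEARS_TO_DOUBLE) - 1
print(f"implied growth rate: {r * 100:.2f}% per year")  # about 1% per year

# Compounding at r for 70 years recovers the doubling.
print(round((1 + r) ** YEARS_TO_DOUBLE, 6))  # 2.0
```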

The results showed that after several centuries of net negative emissions levels and gradual global cooling, the Southern Ocean abruptly released a burst of accumulated heat—an oceanic “burp”—that led to a decadal- to centennial-scale period of warming. This warming was comparable to average historical anthropogenic warming rates. The team said that because of seawater’s unique chemistry, this burp released relatively little CO2 along with the heat.

Frenger and colleagues note that their work uses a model with intermediate-level complexity and an idealized climate change scenario, but that their findings were consistent when tested with other modeling setups. They say the Southern Ocean’s importance to the global climate system, including its role in heat release to the atmosphere in a cooling climate, should be studied further and contemporary changes closely monitored. (AGU Advances, https://doi.org/10.1029/2025AV001700, 2025)

—Sarah Derouin (@sarahderouin.com), Science Writer

Citation: Derouin, S. (2025), The Southern Ocean may be building up a massive burp, Eos, 106, https://doi.org/10.1029/2025EO250385. Published on 20 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Publishing Participatory Science: The Community Science Exchange

Mon, 10/20/2025 - 12:00
Editors’ Vox is a blog from AGU’s Publications Department.

The Community Science Exchange was founded in 2021 to elevate the work of scientists, scholars, and community members collectively engaged in participatory science and to broaden the reach of their discoveries, results, and science-based solutions. Now more than ever, we would like to recognize the importance of the work of the Community Science Exchange in fostering an inclusive scientific community and strengthening public trust in science. Here, we highlight the publication outlets offered by the Community Science Exchange and encourage the AGU community to contribute.

The Community Science Exchange aims to encourage, foster, and promote co-production between science and community.

Within equitable participatory science, or a collective scientific endeavor giving significant voice and weight to both science and publics, the Community Science Exchange defines “community” variously as place-based, a group defined by a shared culture or heritage, and/or a group defined by a shared experience. From environmental concerns to public health, anthropology to engineering, the Community Science Exchange aims to encourage, foster, and promote co-production between science and community. To aid in the integration of local knowledge and lived experience, the Community Science Exchange specifically includes community voice in its publications: as authors, in sections devoted to community description and community impact, and in quotes from community members involved in and/or affected by the work. Scientists and academic scholars with an interest in elevating their community partners within their publications instead of hiding them in an acknowledgment should consider publication within the Exchange.

The American Geophysical Union hosts the Community Science Exchange with further support and guidance from five partnership organizations: the American Anthropological Association (AAA), the American Public Health Association (APHA), the Association for Advancing Participatory Sciences (AAPS), the Unión Geofísica Mexicana (UGM), and Wiley. To broaden the publication venues for community members and organizations, practitioners, boundary spanners, and others who may not receive career benefits from scientific journal publication, the Community Science Exchange has created two new avenues for those who want to publish and share their work: the journal Community Science and the online publication venue managed by AGU, the Hub.

Since its first issue in June 2022, Community Science has published articles discussing a variety of topics of interest to communities and scientists, including water quality, plastic pollution, language as a barrier to equitable access to scientific literature, and the integration of Indigenous knowledge in shellfish monitoring. Community Science has also participated in several special collections, including on air quality, equitable co-production, and sustainable agriculture. Growing steadily in submissions, Community Science received the PROSE Award for Journals from the Association of American Publishers in 2024. The journal is open access, allowing anyone to read the published work for free.

Because Community Science is a peer-reviewed journal, manuscripts go through an evaluation and revision process to ensure that research published in the journal rigorously advances both science and community outcomes. As with the other journals in the AGU portfolio, those who review for Community Science are welcome to invite a co-reviewer. This practice can help early-career researchers become thorough and constructive reviewers, and it can invite experienced community organizers, boundary spanners, and those with relevant lived expertise to engage in thoughtful reviews complementary to scientific review. Publications in both Community Science and the Hub are periodically featured in Editors’ Highlights, in which editors explain what they found exciting about a work, or in Research Spotlights, which are written by Eos’ professional science writers and feature recent newsworthy work. These features offer a more approachable point of entry to explore the science.

Unlike any other journal in the AGU portfolio, the Community Science Exchange also supports an alternate publication venue – the Hub – which is hosted on the Community Science Exchange website. Broadening the definition and understanding of scientific research, work, and resources, the Hub seeks to deepen the connection between science and community.

The Hub is home to a wide variety of content, ranging from stand-alone submissions that are intentionally written outside the strictures of a scientific journal format to “complementary materials” that allow journal paper authors to enrich their articles with linked materials furthering community voice. Although the Hub isn’t a scholarly journal in the traditional sense, all submissions are editor-vetted before potential revision and publication. Any new, original content published on the Hub is now eligible to receive a permanent digital object identifier (DOI), allowing it to be cited in the references of scholarly publications and other content.

Authors can submit materials to the Hub that fall into one of four categories:

Project Descriptions are narratives of work done, or even more formalized case studies. They should include a description of the community involved, an explanation of the community knowledge utilized, and a summary of the work done. Example: Climate Safe Neighborhoods [Project Description] (doi.org/10.1029/2024CSE000101)

Protocols and Methods are for describing how the community science work was done. These could be practiced approaches, descriptions of relevant policies to be considered, or outlines of project development.

Tools and Resources are items that can help others along in their own community science work, such as datasets or visualization tools. Descriptions of useful apps are also welcome.

Educational Materials are items geared toward educating or training about community science practices. These could include instruction manuals, guidebooks, or even workshop or webinar curricula.

Because the Hub is a living initiative, evolving with the needs and desires of the community, submissions that don’t cleanly fit into any one of these categories will still be considered.

If you are interested in joining in the Community Science Exchange’s efforts to expand how we view, publish, and share science, please email us at communitysci@agu.org. Whether you have a resource to submit to the Hub, an article to submit to the journal, a desire to serve as a reviewer, or an interest in applying to be an editor, we’d love to hear from you.

Finally, we want to thank all of those who have served as editors of this initiative so far, both past and present (starred are original editorial board members):

  • Julia Parrish*, current Editor-in-Chief
  • Kathryn Semmens*, current Deputy Editor of the Hub
  • Claire Beveridge*, current editor
  • Gillian Bowser, current editor
  • Muki Haklay*, current editor
  • Rajul Pandya, current editor
  • Jean Schensul*, founding Deputy Editor, current editor
  • Kevin Noone*, founding Editor-in-Chief, past editor
  • Paula Buchanan*, founding Deputy Editor, past editor
  • Shobhana Gupta*, past editor
  • Heidi Roop*, past editor
  • Roopam Shukla*, past editor

—Allison Schuette (aschuette@agu.org, 0009-0007-1055-0937), Program Coordinator, AGU Publications; Julia Parrish (0000-0002-2410-3982), Editor-in-Chief, Community Science Exchange; Kathryn Semmens (0000-0002-8822-3043), Deputy Editor, The Hub; Kristina Vrouwenvelder (0000-0002-5862-2502), Assistant Director, AGU Publications; and Sarah Dedej (0000-0003-3952-4250), Assistant Director, AGU Publications

Citation: Schuette, A., J. Parrish, K. Semmens, K. Vrouwenvelder, and S. Dedej (2025), Publishing participatory science: the Community Science Exchange, Eos, 106, https://doi.org/10.1029/2025EO255032. Published on 20 October 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Universities Reject Trump Funding Deal

Fri, 10/17/2025 - 16:09
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

The “Compact for Academic Excellence in Higher Education,” developed by the Trump administration and sent to nine universities on 1 October, proposes that the institutions agree to a series of criteria in exchange for preferential treatment in funding decisions.

The compact’s provisions ask universities to: 

  • Ban the consideration of any demographic factors, including sex, ethnicity, race, sexual orientation, and religion, in any admissions decisions, financial aid decisions, or hiring decisions.
  • Commit to “institutional neutrality,” create an “intellectually open campus environment,” and abolish “institutional units that purposefully punish, belittle, and even spark violence against conservative ideas.”
  • Require all employees to abstain from actions or speech related to social and political events unless such events have a direct impact on their university or they are acting in their individual capacity rather than as university representatives. 
  • Interpret the words “woman” and “man” according to “reproductive function and biological processes.”
  • Stop charging tuition for any admitted student pursuing “hard science” programs. (This applies only to universities with endowments over $2 million per undergraduate student.)
  • Disclose foreign funding and gifts.

The proposed deal was sent to the University of Pennsylvania, the University of Virginia, the University of Arizona, the University of Texas at Austin, the University of Southern California, Vanderbilt University, Dartmouth College, Brown University, and the Massachusetts Institute of Technology. 


“Any university that refuses this once-in-a-lifetime opportunity to transform higher education isn’t serving its students or their parents—they’re bowing to radical, left-wing bureaucrats,” Liz Huston, a White House spokesperson, told Bloomberg.

Simon Marginson, a professor of higher education at Oxford University, told Time that if successful, the compact would “establish a level of federal control of the national mind that has never been seen before.” 

On 12 October, President Trump opened up the offer to all institutions of higher education in a post on social media website Truth Social.

As of 20 October, the following schools have responded to Trump’s offer:

  • Massachusetts Institute of Technology: MIT was the first to reject Trump’s offer. In a 10 October letter to the administration, MIT President Sally Kornbluth wrote that MIT’s practices “meet or exceed many standards outlined in the document,” but that the compact “also includes principles with which we disagree, including those that would restrict freedom of expression and our independence as an institution.”
  • Brown University: In a 15 October letter to the administration, Brown University President Christina H. Paxson declined the deal. She wrote that Brown “would work with the government to find solutions if there were concerns about the way the University fulfills its academic mission,” but that, like Kornbluth, she was “concerned that the Compact by its nature and by various provisions would restrict academic freedom and undermine the autonomy of Brown’s governance.”
  • University of Southern California: In a 16 October statement, USC Interim President Beong-Soo Kim informed the university community that he had declined the deal, and wrote that the university takes legal obligations seriously and is diligently working to streamline administrative functions, control tuition rates, maintain academic rigor, and ensure that students develop critical thinking skills. “Even though the Compact would be voluntary, tying research benefits to it would, over time, undermine the same values of free inquiry and academic excellence that the Compact seeks to promote,” he wrote.
  • University of Pennsylvania: In a 16 October statement, UPenn President J. Larry Jameson informed the university community that he had declined to sign the compact. “At Penn, we are committed to merit-based achievement and accountability. The long-standing partnership between American higher education and the federal government has greatly benefited society and our nation. Shared goals and investment in talent and ideas will turn possibility into progress,” he wrote.
  • University of Virginia: In a 17 October letter to the administration, UVA Interim President Paul Mahoney declined to sign the compact. “We seek no special treatment in exchange for our pursuit of those foundational goals,” the letter said. “The integrity of science and other academic work requires merit-based assessment of research and scholarship. A contractual arrangement predicating assessment on anything other than merit will undermine the integrity of vital, sometimes lifesaving, research and further erode confidence in American higher education.”
  • Dartmouth College: In an 18 October letter to the administration, Dartmouth President Sian Leah Beilock declined the deal. “I do not believe that the involvement of the government through a compact—whether it is a Republican- or Democratic-led White House—is the right way to focus America’s leading colleges and universities on their teaching and research mission,” Beilock wrote.
  • University of Arizona: In a 20 October announcement, President Suresh Garimella said he had declined to agree to the proposal and had instead submitted a Statement of Principles to the U.S. Department of Education informed by “hundreds of U of A stakeholders and partner organizations.” “This response is our contribution toward a national conversation about the future relationship between universities and the federal government. It is critical for the University of Arizona to take an active role in this discussion and to work toward maintaining a strong relationship with the federal government while staying true to our principles,” Garimella wrote.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

20 October: This article was updated to include the University of Virginia and Dartmouth College.

21 October: This article was updated to include the University of Arizona.

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

When the Earth Moves: 25 Years of Probabilistic Fault Displacement Hazards

Fri, 10/17/2025 - 16:08
Editors’ Vox is a blog from AGU’s Publications Department.

Earthquake surface ruptures can cause severe damage to infrastructure. While preventative measures can be taken to allow structures to accommodate fault movement during an earthquake, one of the best methods of protection is to avoid unnecessary risks in the first place.

A new article in Reviews of Geophysics explores the history of Probabilistic Fault Displacement Hazard Assessments (PFDHA) and recent efforts to improve them with modern methods. Here, we asked the authors to give an overview of PFDHAs, how scientists’ methods have evolved over time, and future research directions.

What is fault displacement and what kinds of risks are associated with it?

Fault displacement occurs when an earthquake breaks the ground surface along a fault. This displacement along the fault can shift the ground horizontally and/or vertically, by several meters for the largest earthquakes. Such ruptures pose serious risks to infrastructure located across faults—such as pipelines, transportation systems, dams, and power generation facilities—because it may be torn apart or severely damaged. While some facilities can be engineered to tolerate limited movements, many critical systems are highly vulnerable, making it essential to evaluate this hazard.

This figure shows the Trans-Alaska Pipeline crossing the Denali Fault, which ruptured during the 2002 earthquake. Photos and diagrams illustrate how the pipeline was designed to bend and slide, allowing it to survive several meters of fault movement without breaking. Credit: Valentini et al. [2025], Figure 5

In simple terms, what are Probabilistic Fault Displacement Hazard Assessments (PFDHA)?

A Probabilistic Fault Displacement Hazard Assessment (PFDHA) is a quantitative method that estimates the likelihood that an earthquake will rupture the surface at a specific site and the expected magnitude of the displacement. Instead of giving a single answer, PFDHA provides probabilities of different displacement levels for different reference periods of interest. This allows engineers and planners to evaluate risks in a structured way and make informed decisions about building designs or land use near faults.
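The logic of such an assessment can be sketched in a few lines of code. The toy calculation below is not taken from the article under discussion; it is a simplified illustration in the spirit of the standard PFDHA formulation, in which the annual rate of exceeding a displacement level is summed over earthquake scenarios, each weighted by its occurrence rate, the probability of surface rupture at the site, and the conditional probability of exceeding the displacement. All rates, probabilities, and the exponential displacement model here are hypothetical placeholder values.

```python
# Illustrative toy PFDHA-style calculation (hypothetical numbers throughout).
import math

def annual_exceedance_rate(scenarios, displacement_m):
    """Sum, over earthquake scenarios, the annual rate at which surface
    displacement at the site exceeds the given threshold."""
    total = 0.0
    for s in scenarios:
        # rate: annual occurrence rate of the scenario earthquake
        # p_rupture: probability the rupture reaches the surface at the site
        # p_exceed(d): probability displacement exceeds d, given surface rupture
        total += s["rate"] * s["p_rupture"] * s["p_exceed"](displacement_m)
    return total

def exceedance_probability(rate, years):
    """Probability of at least one exceedance in a reference period,
    assuming exceedances occur as a Poisson process."""
    return 1.0 - math.exp(-rate * years)

# One hypothetical scenario: a fault with a 1-in-1,000-year characteristic
# earthquake, a 50% chance of surface rupture at the site, and an
# exponential displacement model with a 1 m mean.
scenarios = [
    {"rate": 1e-3, "p_rupture": 0.5, "p_exceed": lambda d: math.exp(-d / 1.0)},
]

rate = annual_exceedance_rate(scenarios, 0.5)
print(exceedance_probability(rate, 50))  # chance of >0.5 m displacement in 50 years
```

Repeating the calculation across a range of displacement thresholds yields a hazard curve, which is the structured output that engineers use to choose design levels.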

This diagram explains how scientists estimate the expected amount of displacement due to an earthquake and at a specific site. It shows the main steps and data used in a Probabilistic Fault Displacement Hazard Assessment (PFDHA). Credit: Valentini et al. [2025], Figure 8

How have Fault Displacement Hazard Assessments evolved over time?

The first systematic PFDHA was developed in the early 2000s for the Yucca Mountain nuclear waste repository in the USA. Since then, the methodology has expanded from normal faults to include strike-slip and reverse faults worldwide. Over the past 25 years, new global databases of surface ruptures supporting statistical analysis, advances in statistical modeling, and international benchmark exercises have significantly improved the reliability and comparability of PFDHA approaches. In the future, the field should integrate remote sensing data, artificial intelligence, and physics-based modeling to better capture the complexity of earthquake ruptures.

What are the societal benefits of developing PFDHAs?

By quantifying the hazard of surface fault rupture, PFDHAs provide critical input for the safe design of infrastructure. This helps to avoid catastrophic failures such as pipeline leaks, dam collapses and resulting flooding, or road and railway disruption. Beyond engineering, PFDHAs also support land-use planning by identifying areas where construction should be avoided. Ultimately, these assessments reduce economic losses, improve resilience, and protect human lives in earthquake-prone regions.

What are some real-life examples of PFDHAs being developed and implemented?

One of the earliest and most influential applications was at Yucca Mountain, Nevada, where PFDHA helped assess the safety of a proposed nuclear waste repository. More recently, PFDHA approaches have been adopted internationally, including in Japan and Italy, for assessing risks to dams, tunnels, and other critical infrastructure.

What are some of the most exciting recent developments in this field?

These photos show how earthquakes can damage critical infrastructure such as bridges, dams, railways, and pipelines. The images highlight both principal and distributed fault ruptures, underscoring why engineers and planners must consider both when assessing earthquake hazards. Credit: Valentini et al. [2025], Figure 4

Recent years have seen major advances thanks to new global databases such as the worldwide and unified database of surface ruptures (SURE) and the Fault Displacement Hazard Initiative (FDHI), which collect tens of thousands of observations of past surface ruptures. Remote sensing techniques now allow scientists to map fault ruptures with unprecedented detail. Importantly, these techniques have also awakened the geological and seismological community to the relevance of moderate earthquakes. Since the 2000s and 2010s, it has become clear that earthquakes smaller than magnitude 6.5 can also produce significant surface ruptures, a threat that was often overlooked before these technological advances. Additionally, international collaborations, such as the International Atomic Energy Agency benchmark project, are helping to unify approaches and ensure that PFDHAs are robust and reproducible across different regions.

What are the major unsolved or unresolved questions and where are additional research, data, or modeling efforts needed?

Several challenges remain. A key issue is the limited number of well-documented earthquakes outside North America and Japan, leaving other regions underrepresented in global databases. Another challenge is how to model complex, multi-fault ruptures, which are increasingly observed in large earthquakes. Understanding the controls on off-fault deformation, as revealed by modern geodetic techniques during large to moderate events, is another critical open question. This knowledge could improve our ability to predict rupture patterns and displacement amounts.

Similarly, the role of near-surface geology in controlling the location, size, and distribution of surface ruptures for a given earthquake magnitude remains poorly constrained and deserves further study. Standardizing terminology and methods is also essential for consistent hazard assessments. Looking forward, more high-quality data, integration of physics-based models, and improved computational frameworks will be crucial to advance the field.


—A. Valentini (alessandro.valentini@univie.ac.at, 0000-0001-5149-2090), University of Vienna, Austria; Francesco Visini (0000-0001-9582-6443), Istituto Nazionale di Geofisica e Vulcanologia, Italy; Paolo Boncio (0000-0002-4129-5779),  Università degli Studi “G. d’Annunzio,” Italy; Oona Scotti (0000-0002-6640-9090), Autorité de Sureté Nucléaire et de Radioprotection, France; and Stéphane Baize (0000-0002-7656-1790), Autorité de Sureté Nucléaire et de Radioprotection, France

Editor’s Note: It is the policy of AGU Publications to invite the authors of articles published in Reviews of Geophysics to write a summary for Eos Editors’ Vox.

Citation: Valentini, A., F. Visini, P. Boncio, O. Scotti, and S. Baize (2025), When the earth moves: 25 years of probabilistic fault displacement hazards, Eos, 106, https://doi.org/10.1029/2025EO255033. Published on 17 October 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Scientists Must Join Forces to Solve Forecasting’s Predictability Desert

Fri, 10/17/2025 - 11:55

Should I wear a jacket to work today, or will I be too warm? Will that hurricane miss my town, or should I prepare to evacuate? We rely on accurate short-term weather forecasts both to make mundane daily decisions and to warn us of extreme events on the horizon. At the same time, Earth system scientists focus on understanding what drives variations in temperature, precipitation, and extreme conditions over periods spanning months, decades, and longer.

Between those two ends of the forecasting spectrum are subseasonal-to-seasonal (S2S) predictions on timescales of 2 weeks to 2 months. S2S forecasts bridge the gap between short-term weather forecasts and long-range outlooks and hold enormous potential for supporting effective advance decisionmaking across sectors ranging from water and agriculture to energy, disaster preparedness, and more. Yet these timescales represent an underdeveloped scientific frontier where our predictive capabilities are weakest. Indeed, the S2S range is often referred to as the predictability desert.

Forecasts at 3- to 4-week lead times, for example, remain inconsistent. Sometimes, so-called windows of opportunity arise when models provide strikingly accurate, or skillful, guidance at this timescale. But these windows of skillful S2S forecasting are themselves unpredictable. Why do they occur when they do? Do they have recognizable precursors? And how does predictability depend on the quantity (e.g., temperature versus precipitation) being predicted?

Three interlocking puzzle pieces represent the integration of weather prediction (left) and long-term outlooks (right) with the “missing middle” of S2S predictability (center). The center piece highlights key applications—agriculture, water availability, and disaster preparedness—and the tools needed to advance S2S skill, including modeling, data assimilation (DA), artificial intelligence (AI), and multiscale process understanding. Credit: Simmi Readle/NSF NCAR

These questions are more than academic curiosities. Answering them would transform our ability to gauge the value of S2S forecasts in real time and to anticipate and respond to high-impact events such as heat waves, flooding rains, drought onset, and wildfires.

Tackling this challenge requires traditionally siloed communities—scientists focused on predicting near-term weather and those focused on projecting long-term changes in the Earth system—to coordinate efforts. Together, these communities can advance scientific understanding and predictive capabilities across scales.

Discovering Windows of Opportunity

The challenges of subseasonal-to-seasonal (S2S) prediction reflect the complex and interconnected dynamics of the Earth system.

The challenges of S2S prediction reflect the complex and interconnected dynamics of the Earth system. At these lead times, forecast skill relies not only on the accuracy of initial input atmospheric conditions—always a vital element for weather forecasts—but also on model treatments of slowly evolving components of the Earth system. These components—including the ocean state, land surface conditions, snow cover, atmospheric composition, and large-scale patterns of variability such as the Madden-Julian Oscillation (MJO), El Niño–Southern Oscillation, stratospheric quasi-biennial oscillation, and sudden stratospheric warmings—interact in ways that enhance or degrade forecast performance. Volcanic eruptions can further influence these interactions, altering circulation patterns and modulating surface climate on S2S timescales.

Researchers have made substantial progress in understanding these individual Earth system components. But we still cannot reliably anticipate when models will yield skillful forecasts because their accuracy at S2S timescales is episodic and state dependent, meaning it comes and goes and depends on various interacting conditions at any given time. A model might perform well for a given region in one season—yielding a window of opportunity—but struggle in another region or season.

So how might we get better at anticipating such windows? For starters, rather than viewing the predictive capability of models as fixed, we can treat it as a dynamic property that changes depending on evolving system conditions. This paradigm shift could help scientists focus on developing tools and metrics that help them anticipate when forecasts will be most reliable. It could also suggest a need to rethink strategies for collecting environmental observations.

Just as predictability is episodic, so too might be the value of strategically enhanced observations. For example, targeted observations of sea surface temperatures, soil moisture, or atmospheric circulation during periods when these conditions strongly influence forecast skill could be far more valuable than the same measurements made at other times. Such adaptive, or state-aware, observing strategies, say, intensifying atmospheric sampling ahead of a developing MJO event, would mean concentrating resources where and when they will matter most. By feeding these strategically enhanced observations into forecast models, scientists could improve both the forecasts themselves and the ability to evaluate their reliability.

Aligning Goals Across Disciplines

S2S timescales fall at the intersection of weather forecasts and seasonal to decadal outlooks, and the communities working on those different types of predictions have different focuses and research strengths.

To drive needed technical advances supporting improved S2S predictability, we also need a cultural shift to remove barriers between scientific disciplines. S2S timescales fall at the intersection of weather forecasts and seasonal to decadal outlooks, and the communities working on those different types of predictions have different focuses and research strengths. Weather prediction emphasizes initial condition accuracy, data assimilation, and high-resolution modeling of fast atmospheric processes. Studying Earth system behavior and variability over longer timescales focuses on modeling slowly evolving boundary conditions (e.g., the ocean) and coupled component interactions (e.g., between the land and the atmosphere).

Historically, these communities have operated along parallel tracks, each with its own institutions, funding structures, and research priorities. The challenge of identifying windows of opportunity at S2S timescales offers a unifying scientific problem.

Earth system features that offer potentially promising signals of S2S predictability, such as the MJO, are already shared terrain, studied through the lenses of both weather and longer-term change. Extreme events are another area of convergence: Weather models focus on forecasting specific short-lived, high-impact events, whereas Earth system models explore the conditions and teleconnections that influence the likelihood and persistence of extremes. Together, these complementary perspectives can illuminate not only what might happen but why and when skillful forecasts are possible.

The path to unlocking S2S predictability involves more than simply blending models, though. It requires aligning the communities’ scientific goals, model performance evaluation strategies, and approaches for dealing with uncertainty. These approaches include the design of model ensembles, data assimilation strategies that quantify uncertainty in initial conditions, probabilistic evaluation methods, and ways of communicating forecast confidence to users.

The path forward also entails building modeling systems that capitalize on the weather community’s expertise in initialization and the Earth system modeling community’s insights into boundary forcing and component coupling. Accurate initialization must capture all Earth system components—from soil moisture, ocean heat content, and snow cover, for example, to the state of the atmosphere, including the stratosphere. However, observations and data assimilation for several key variables, especially in the ocean, stratosphere, and other data-sparse regions, remain limited, constraining our ability to represent their influences in prediction systems.

A near-term opportunity for aligning goals and developing models lies in improving prediction of MJO-related extreme rainfall events, which arise from tropical ocean–atmosphere interactions and influence regional circulation and precipitation. This improvement will require that atmospheric convection be better represented in models, a long-standing challenge in both communities.

Emerging kilometer-scale models and machine learning offer shared innovation and collaboration spaces. Kilometer-scale models can explicitly resolve convection, validate and refine model parameterizations, and elucidate interactions between large-scale circulation and small-scale processes. Machine learning provides new avenues to emulate convection-permitting simulations, represent unresolved processes, and reduce systematic model errors.

Success with this challenge could yield immediate value for science and decisionmaking by, for example, enabling earlier warnings for flood-prone areas and supporting more informed planting and irrigation decisions in agriculture.

From Forecast Skill to Societal Resilience

The societal need for more skillful S2S prediction is urgent and growing. Communities worldwide are increasingly vulnerable to extreme conditions whose impacts unfold on weekly to monthly timescales. In scenarios such as a prolonged dry spell that turns into drought, a sudden warming trend that amplifies wildfire risk, or a stalled precipitation pattern that leads to flooding, insights from S2S forecasting could provide foresight and opportunities to prepare in affected areas.

Officials overseeing water management, energy planning, public health, agriculture, and emergency response are all seeking more reliable guidance for S2S time frames. In many cases, forecasts providing a few additional weeks of lead time could enable more efficient resource allocation, preparedness actions, and adaptation strategies. Imagine if forecasts could reliably indicate prolonged heat waves 3–4 weeks in advance. Energy providers could prepare for surges in cooling demand, public health officials could implement heat safety campaigns, and farmers could adjust planting or irrigation schedules to reduce losses.

The resilience of infrastructure, ecosystems, and economies hinges on knowing not only what might happen but also when we can trust our forecasts. By focusing on understanding when and where we have windows of opportunity with S2S modeling, we open the door to developing new, intermediate-term forecasting systems that are both skillful and useful—forecast systems that communicate confidence dynamically and inform real-world decisions with nuance.

Realizing this vision will require alignment of research priorities and investments. S2S forecasting and modeling efforts have often fallen between the traditional mandates of agencies concerned with either weather or longer-term outlooks. As a result, the research and operational efforts of these communities have not always been coordinated or sustained at the scale required to drive progress.

Coordination and Collaboration

With growing public attention on maintaining economic competitiveness internationally and building disaster resilience, S2S prediction represents an untapped opportunity space. And as machine learning and artificial intelligence offer new ways to explore predictability with models and to extract meaningful patterns from model outputs, now is the time to advance the needed coordination.

The many use cases for S2S prediction underscore that it isn’t just a scientific challenge, it’s a strategy for achieving resilience and prosperity.

The many use cases for S2S prediction underscore that it isn’t just a scientific challenge, it’s a strategy for achieving resilience and prosperity. We call on a variety of communities and enterprises to collaborate and rally around the challenge of illuminating windows of opportunity in S2S modeling.

Scientists from traditionally distinct disciplines should codesign research strategies to jointly investigate when, where, and why S2S skill emerges. For example, they could examine weather regimes (e.g., the Pacific or Alaska ridges) and their links to modes of variability (e.g., the North Atlantic Oscillation) and leverage data assimilation to better understand how these phenomena evolve across timescales.

The scientific community could also identify and evaluate critical observational gaps that limit progress in modeling and data assimilation. And they could develop strategies to implement adaptive observing approaches that, for example, target soil moisture, surface energy fluxes, and boundary layer profiles to better capture land-atmosphere interactions at S2S timescales. Such approaches would help to fill gaps and advance understanding of key Earth system processes.

Modeling centers could build flexible prediction systems that allow for advanced data assimilation and incorporate robust coupling of Earth system components—drawing from the weather and Earth system modeling communities, respectively—to explore how initial conditions and boundary forcing jointly influence S2S skill. Using modular components—self-contained pieces of code that represent individual Earth system processes, such as atmospheric aerosols and dynamic vegetation—within these systems could help isolate sources of predictability and improve process-level understanding.

To sustain progress initiated by scientists and modeling centers, agencies and funders must recognize S2S prediction as a distinct priority and commit to investing in the needed modeling, observations, and institutional coordination.

Furthermore, it’s essential that scientists, decisionmakers, and end users codevelop forecast tools and information. Close integration among these groups would focus scientific innovation on users’ own definitions of what is useful and actionable, allowing scientists to build tools that meet those needs.

S2S forecasting may never deliver consistent skill across all timescales and regions, but knowing when and where it is skillful could make it profoundly powerful for anticipating high-impact hazards. Can we reliably predict windows of opportunity to help solve the predictability desert? Let’s do the work together to find out.

Author Information

Jadwiga H. Richter (jrichter@ucar.edu) and Everette Joseph, National Science Foundation National Center for Atmospheric Research, Boulder, Colo.

Citation: Richter, J. H., and E. Joseph (2025), Scientists must join forces to solve forecasting’s predictability desert, Eos, 106, https://doi.org/10.1029/2025EO250389. Published on 17 October 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

A Flash, a Boom, a New Microbe Habitat

Fri, 10/17/2025 - 11:54

A sizable asteroid impact generally obliterates anything alive nearby. But the aftermath of such a cataclysm can actually function like an incubator for life. Researchers studying a Finnish impact structure found minerals whose chemistry implies that microbes were present roughly 4 million years after the impact. These findings, which were published in Nature Communications last month, shed light on how rapidly microscopic life colonizes a site after an asteroid impact.

A Special Lake

Finland is known for its myriad lakes used by boaters, fishers, swimmers, and other outdoor aficionados. Lake Lappajärvi is a particularly special Finnish lake with a storied past: Its basin was created roughly 78 million years ago when an asteroid slammed into the planet. In 2024, the United Nations Educational, Scientific and Cultural Organization (UNESCO) established a geopark in South Ostrobothnia, Finland, dedicated to preserving and sharing the history of the 23-kilometer-diameter lake and the surrounding region.

“It’s one of the places where you think that life could have started.”

Jacob Gustafsson, a geoscientist at Linnaeus University in Kalmar, Sweden, and his colleagues recently analyzed a collection of rocks unearthed from deep beneath Lake Lappajärvi. The team’s goal was to better understand how rapidly microbial life colonized the site after the sterilizing impact, which heated the surrounding rock to around 2,000°C (3,632°F).

There’s an analogy between this type of work and studies of the origin of life, said Henrik Drake, a geochemist at Linnaeus University and a member of the team. That’s because a fresh impact site contains a slew of temperature and chemical gradients and no shortage of shattered rocks with nooks and crannies for tiny life-forms. A similar environment beyond Earth would be a logical place for life to arise, Drake said. “It’s one of the places where you think that life could have started.”

Microbe-Sculpted Minerals

In 2022, Gustafsson and his collaborators traveled to Finland to visit the National Drill Core Archive of the Geological Survey of Finland.

There, in the rural municipality of Loppi, the team pored over sections of cores drilled from beneath Lake Lappajärvi in the 1980s and 1990s. The researchers selected 33 intervals of core that were fractured or shot through with holes. The goal was to find calcite or pyrite crystals that had formed in those interstices as they were washed with mineral-rich fluids.

“It’s amazing what we can find out in tiny crystals.”

The team used tweezers to pick out individual calcite and pyrite crystals from the cores. Gustafsson and his collaborators then estimated the ages of those crystals using uranium-lead dating and a technique known as secondary ion mass spectrometry to calculate the ratios of various carbon, oxygen, and sulfur isotopes within them. Because microbes preferentially take up certain isotopes, measuring the isotopic ratios preserved in minerals can reveal the presence of long-ago microbial activity and even identify types of microbes. “We see the products of the microbial process,” Drake said.
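For readers unfamiliar with how such isotope measurements are reported, here is a minimal sketch of the standard delta notation used to express them. The conversion formula is standard geochemistry; the measured ratio below is an invented example, not a value from the study.

```python
# Illustrative sketch (not the study's actual pipeline): converting a measured
# isotope ratio into delta notation, the quantity used to infer microbial activity.

def delta_permil(r_sample: float, r_standard: float) -> float:
    """Delta value in per mil: the relative deviation of a sample's isotope
    ratio from a reference standard, multiplied by 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Vienna Canyon Diablo Troilite (V-CDT) 34S/32S reference ratio
R_VCDT = 0.0441626

# A hypothetical pyrite measurement: microbial sulfate reduction leaves
# sulfide depleted in 34S, giving a strongly negative delta value.
r_measured = 0.0427  # assumed example value, not from the study
d34s = delta_permil(r_measured, R_VCDT)
print(f"delta-34S = {d34s:.1f} per mil")  # → delta-34S = -33.1 per mil
```

A strongly negative δ34S in pyrite, as in this invented example, is the kind of signature that points to microbial sulfate reduction.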

“It’s amazing what we can find out in tiny crystals,” Gustafsson added.

The researchers also used isotopic ratios of carbon, oxygen, and sulfur to estimate local groundwater temperatures in the distant past. By combining their age and temperature estimates, the team could trace how the Lake Lappajärvi impact site cooled over time.

A Slow Cool

Groundwater temperatures at Lake Lappajärvi had cooled to around 50°C (122°F) roughly 4 million years after the impact, the team found. That’s a far slower cooling rate than has been inferred for other similarly sized impact craters, such as Ries Crater in Germany, in which hydrothermal activity ceased after about 250,000 years, and Haughton Crater in Canada, where such activity lasted only about 50,000 years.

“Four million years is a very long time,” said Teemu Öhman, an impact geologist at the Impact Crater Lake–Lappajärvi UNESCO Global Geopark in South Ostrobothnia, Finland, not involved in the research. “If you compare Lappajärvi with Ries or Haughton, which are the same size, they cooled way, way, way faster.”

That difference is likely due to the type of rocks that predominate at the Lappajärvi impact site, Gustafsson and his collaborators proposed. For starters, there’s only a relatively thin layer of sedimentary rock at the surface. “Sedimentary rocks often don’t fully melt during impact because of their inherent water and carbon dioxide content,” Drake explained. And Lappajärvi has a thick layer of bedrock (including granites and gneisses), which would have melted in the impact, sending temperatures surging to around 2,000°C, earlier research estimated.

About 4 million years after the impact is also when microbial activity in the crater began, according to Gustafsson and his collaborators. Those ancient microbes were likely converting sulfate into sulfide, the team proposed. And roughly 10 million years later, when temperatures had fallen to around 30°C (86°F), methane-producing microbes appeared, the researchers surmised on the basis of their isotopic analysis of calcite.

In the future, Gustafsson and his colleagues plan to study other Finnish impact craters and look for similar microbial features in smaller and older impact structures. In the meantime, the team is carefully packaging up their material from the Lappajärvi site. It’s time to return the core samples to the Geological Survey of Finland, Drake said. “Now we need to ship them back.”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), A flash, a boom, a new microbe habitat, Eos, 106, https://doi.org/10.1029/2025EO250388. Published on 17 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Tectonics and Climate Are Shaping an Alaskan Ecosystem

Thu, 10/16/2025 - 13:24
Source: AGU Advances

Warming in high-latitude wetlands seems poised to increase the activity of methanogens, or methane-producing microbes. These ecosystems are complex places, however, making outcomes hard to predict.

In new biogeochemical research taking into account tectonic, climatic, and ecological factors affecting the Copper River Delta in Alaska, Buser-Young et al. found that seismic uplift and glacial meltwater have each contributed to changes in microbial metabolism, with the surprising effect of potentially decreasing methane production.

The Copper River Delta in south central Alaska has a history of large seismic events. That includes, most recently, a 1964 earthquake that lifted portions of the delta as much as 3.4 meters above sea level, turning much of it from a marine environment to a freshwater one. In more recent decades, increasing amounts of iron-rich glacial runoff have also begun flowing through the delta, a result of climate change.

Combining geochemical studies of sediment cores from six wetland locations in the delta with metagenomic analyses of the microbes in the cores, the authors documented a distinct shift in microbial metabolism. Though genes for methanogenesis are still prevalent, and organic matter is available, they found that in an increasingly freshwater, iron-rich environment, the dominant means of energy production among the microbes shifted to involve iron cycling. Their findings are a demonstration of the ways large-scale geological and climatic shifts can affect small-scale processes such as the dynamics of microbial communities.

Looking ahead, the researchers say analyzing deeper sediment core samples could provide more information about how microbial dynamics have changed over time. In addition, they say, further culture-based experiments could improve understanding of the relationships between iron and organic matter within the carbon cycle. (AGU Advances, https://doi.org/10.1029/2025AV001821, 2025)

—Nathaniel Scharping (@nathanielscharp), Science Writer

Citation: Scharping, N. (2025), Tectonics and climate are shaping an Alaskan ecosystem, Eos, 106, https://doi.org/10.1029/2025EO250387. Published on 16 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Scientists Tune In to the Ocean’s Sound Waves

Thu, 10/16/2025 - 13:23

“It’s a good thing we can’t hear it with our ears. Otherwise, we’d just have this constant din from the oceans.”

The steady thrumming of crashing waves is the ocean’s soundtrack. But behind that calming rhythm is a host of hidden chaotic sound waves, most of which are too low in frequency for humans to hear. This acoustic energy travels as infrasound through the air and as seismic waves through the ground. “It’s a good thing we can’t hear it with our ears,” said Stephen Arrowsmith, a geoscientist at Southern Methodist University in Texas. “Otherwise, we’d just have this constant din from the oceans.”

Recently, scientists developed a new method to monitor surf’s acoustic and seismic signatures to identify individual breaking waves within the noise. The findings could enable new ways of monitoring sea conditions from land and even provide insights into conditions in the upper atmosphere.

A Signal in the Noise

Scientists first discovered surf-generated infrasound more than 20 years ago. One study, led by Arrowsmith, even detected infrasound more than 124 miles (200 kilometers) inland. Though the pace of such studies has slowed over the past decade, researchers at the University of California, Santa Barbara (UC Santa Barbara), who typically study volcano seismology, realized they were well positioned to contribute to surf infrasound research. “We have the proximity to the coastline here on campus, so it seemed an interesting thing to explore,” said Robin Matoza, an Earth scientist and senior author on the paper.

While past studies had detected surf infrasound only as a continuous wall of noise, the researchers suspected that with new advances in computation as well as in acoustic and seismic detection, they could identify the acoustic signatures of individual waves.

The team, led by geologist Jeremy Francoeur, who conducted the work for his master’s thesis at UC Santa Barbara, installed a single infrasound sensor that collected near-continuous data for 10 months, from September 2022 to July 2023. Then, in October 2023, they conducted an intensive field experiment over 6 days, deploying a network of 12 infrasound sensors and one seismometer across a roughly 500-foot area near the Santa Barbara coast.

“One of the biggest surprises was that the same infrasound signals are being generated by surf nearly every day.”

The researchers also took GoPro videos to correlate specific ocean waves with the infrasound and seismic profiles they generated. They then selected the signatures of five waves as templates to match against the 10 months of single-sensor acoustic data, picking out individual crashing waves among all the infrasound recorded. “One of the biggest surprises was that the same infrasound signals are being generated by surf nearly every day,” said Francoeur in an email. The approach revealed up to tens of thousands of individual surf events per day.
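The matched-filter idea behind this detection step can be sketched in a few lines of plain Python. The waveforms below are synthetic toys, and the threshold is arbitrary; the study’s templates were real recorded wave signatures, and production workflows typically rely on optimized signal processing libraries.

```python
# Minimal sketch of template matching: slide a known waveform along a longer
# record and flag offsets where the normalized cross-correlation is high.

def normalized_xcorr(signal, template):
    """Normalized cross-correlation coefficient (-1 to 1) at each offset."""
    n = len(template)
    t_mean = sum(template) / n
    t_dev = [t - t_mean for t in template]
    t_norm = sum(d * d for d in t_dev) ** 0.5
    coeffs = []
    for i in range(len(signal) - n + 1):
        window = signal[i:i + n]
        w_mean = sum(window) / n
        w_dev = [w - w_mean for w in window]
        w_norm = sum(d * d for d in w_dev) ** 0.5
        dot = sum(a * b for a, b in zip(t_dev, w_dev))
        coeffs.append(dot / (t_norm * w_norm) if t_norm * w_norm else 0.0)
    return coeffs

def detect_events(signal, template, threshold=0.8):
    """Return offsets where the template matches above the threshold."""
    cc = normalized_xcorr(signal, template)
    return [i for i, c in enumerate(cc) if c >= threshold]

# Toy example: a synthetic "wave signature" embedded twice in a flat record.
template = [0.0, 1.0, 2.0, 1.0, 0.0, -1.0, -2.0, -1.0]
signal = [0.0] * 20 + template + [0.0] * 20 + template + [0.0] * 10
print(detect_events(signal, template))  # → [20, 48]
```

The same logic, scaled up, is what lets a single template pick thousands of repeating events out of months of continuous data.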

“I liked how they were able to identify discrete surf events using this local array,” said Arrowsmith, who wasn’t involved in the new study. “Previous studies on this, including mine, were not able to do that.”

The researchers found they could detect discrete infrasound signals only when breaking waves were over approximately 6.5 feet (2 meters) high, suggesting that a minimum amount of energy is required to generate detectable infrasound. When waves were detectable, however, the size of the water’s waves correlated with acoustic signal strength. This finding was particularly noticeable in the winter months when larger storm swells reach the California coast.

By timing when infrasound signals hit each sensor in the network, the scientists triangulated the positions of the waves, pinpointing a hot spot of acoustic activity to a specific rocky reef area just offshore. This suggests that certain bathymetric features might be more effective than others at generating detectable infrasound. The findings were published in Geophysical Journal International.
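The triangulation step works on the same principle as earthquake location: find the source position whose predicted travel times best explain the observed arrivals. Below is a simplified grid-search sketch with invented sensor positions and an exact synthetic source, not the study’s actual array geometry or location method.

```python
# Simplified arrival-time source location: grid-search for the point whose
# predicted travel times best fit the observed arrivals across a sensor network.
import math

C = 343.0  # approximate speed of sound in air, m/s

def travel_time(src, sensor):
    return math.dist(src, sensor) / C

def locate(sensors, arrivals, grid_step=5.0, extent=200.0):
    """Scan candidate positions, minimizing the spread of residuals between
    observed and predicted arrival times (so the unknown origin time cancels)."""
    best, best_err = None, float("inf")
    steps = int(extent / grid_step)
    for ix in range(-steps, steps + 1):
        for iy in range(-steps, steps + 1):
            cand = (ix * grid_step, iy * grid_step)
            pred = [travel_time(cand, s) for s in sensors]
            resid = [a - p for a, p in zip(arrivals, pred)]
            mean = sum(resid) / len(resid)
            err = sum((r - mean) ** 2 for r in resid)
            if err < best_err:
                best, best_err = cand, err
    return best

# Synthetic example: four sensors in a square, a source at (60, -40), and
# arrival times computed exactly so the search recovers the true location.
sensors = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
true_src = (60.0, -40.0)
arrivals = [travel_time(true_src, s) for s in sensors]
print(locate(sensors, arrivals))  # → (60.0, -40.0)
```

With noisy real data, the minimum of the misfit surface gives the most likely source region rather than an exact point, which is how a persistent hot spot like the offshore reef can emerge.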

From the Surf to the Sky

Monitoring and locating the infrasound signature of surf could offer a new method for monitoring sea conditions using land-based sensors, which is critical for maritime safety and coastal management and research. Sea conditions are most often studied using ocean-based buoys or video monitoring, which is obscured at night and in foggy conditions.

The new method could also have applications far beyond the coast. If the signals from individual waves can be detected at greater distances from shore, they could offer information about conditions in the upper atmosphere. This is possible because infrasound enters the upper atmosphere, and features like temperature and wind speed modulate the waves before they refract in the stratosphere and return to Earth.

By comparing the signatures of individual surf events detected at sensors positioned at different distances, scientists say it could be possible to correlate specific acoustic signals with atmospheric conditions, providing a new tool for studying weather patterns and atmospheric dynamics.

“If you have repetitive signals, you can monitor small changes in those signals,” Matoza said. “You could use that to infer changes in the atmosphere.”

—Andrew Chapman (@andrewchapman.bsky.social), Science Writer

Citation: Chapman, A. (2025), Scientists tune in to the ocean’s sound waves, Eos, 106, https://doi.org/10.1029/2025EO250384. Published on 16 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Panama’s Coastal Waters Missed Their Annual Cooldown This Year

Wed, 10/15/2025 - 12:18

From January to April, strong winds blowing south from the Atlantic side of Panama through gaps in the Cordillera mountain range typically travel over the country and push warm water away from Panama’s Pacific coast. This displacement allows cold, nutrient-rich water to flow up from the depths, a process called upwelling. The Panama Pacific upwelling keeps corals cool and nourishes the complex marine food webs that support Panama’s fishing industry and economy.

In 2025, for the first time on record, this upwelling didn’t occur, according to research published in the Proceedings of the National Academy of Sciences of the United States of America.

During the upwelling period early in the year, ocean temperatures near the coast typically fall to a low of about 19°C, said Andrew Sellers, a marine ecologist at the Smithsonian Tropical Research Institute in Panama. This year, the coastal waters reached just 23.3°C at their coolest.

Waning Winds

Sellers said the Panama Pacific upwelling has likely been happening since the isthmus formed millions of years ago. The phenomenon has been recorded at low resolution for 80 years, and scientists have 40 years’ worth of more detailed records.

The team has identified “a shocking extreme event.”

Scripps Institution of Oceanography climate scientist Shang-Ping Xie, who has studied the weather patterns that usually cause the Panama Pacific upwelling but was not involved with this research, said the team had identified “a shocking extreme event.”

Annual upwelling moderates water temperature along the coast and triggers plankton blooms that nourish marine food webs and Panama’s economy. About 95% of the fish the country catches comes from the Pacific side, and most of that marine life is supported by upwelling, said Sellers.

Sellers said that though tropical upwelling plays a critical role in supporting marine food webs and fisheries, it’s understudied. Indeed, it was a happy accident that the research team was able to obtain measurements in 2025. Sellers said the Smithsonian Tropical Research Institute maintains a network of temperature sensors near the coast but does not regularly monitor the temperature of deeper waters. Early this year, the Max Planck Institute research vessel S/Y Eugen Seibold was in the region as part of its mission to study the relationship between the atmosphere and the ocean, and it provided high-resolution temperature measurements, including in deeper waters, during the upwelling failure.

The Panama Pacific upwelling typically causes a rise in chlorophyll concentrations (blue = low concentrations and red = high concentrations) and a phytoplankton bloom, nourishing the area’s rich marine life, as seen here in February 2024. Credit: Aaron O’Dea

These measurements allowed the research team to see that deeper waters offshore were cold as usual but that those waters didn’t make their way to the coast. The cause seems to be a dramatic change in wind patterns in early 2025: Winds hailing from the north were both shorter in duration and 74% less frequent during the study period than in typical years.

Rippling Consequences

“Given how important upwelling is to that region, it’s hard to imagine there wouldn’t be a loss of primary productivity,” the growth of phytoplankton that sustains the ocean’s food chains, said Michael Fox, a coral reef ecologist at the King Abdullah University of Science and Technology. “Upwelling sets the stage for the base of the food web.”

Some models have predicted that climate change will cause upwelling in temperate zones such as California to strengthen, but the dynamics in the tropics are more of a mystery. The Panama Pacific upwelling is strongly influenced by the El Niño–Southern Oscillation (ENSO). Sellers said changes in ENSO might be affecting local dynamics in Panama.

“Studies like this one should motivate people to pay more attention to ocean-atmosphere dynamics in the tropics.”

“Studies like this one should motivate people to pay more attention to ocean-atmosphere dynamics in the tropics,” Fox said.

Sellers said this year’s unprecedented upwelling failure is likely to have adverse effects on the country’s vibrant Pacific marine life, but Panama does not collect extensive data on its fisheries. The team is now examining the exception—a dataset related to small fish such as sardines and anchovies—to see whether the lack of upwelling affected those fish.

Xie said the Smithsonian team hasn’t yet provided enough data to evaluate what caused this year’s unusual wind patterns and whether climate change made the upwelling failure more likely. Early this year, La Niña would likely have raised the pressure on the Pacific side of the country, which would have weakened the winds. But Xie said that La Niña is a frequent phenomenon and it alone can’t explain the unprecedented weather seen in Panama this year. He said something likely happened that changed pressure levels on the country’s northern Atlantic side as well. But more research is needed to say for sure.

Sellers’s team is preparing to gather more detailed measurements of marine life effects in early 2026, in case upwelling fails again. They are planning to assess the population of barnacles and other sessile invertebrates, which rely on plankton whose populations burgeon during upwelling.

Though the Eugen Seibold’s mission is set to end in 2026, Sellers said he’s determined to perform extensive water temperature measurements early next year, with or without a research vessel. “Sensors are cheap, and we can get more of them,” he said.

“In coming years, we’ll know if this is going to be a recurring issue,” Sellers said. “If it is, it’s going to be a hard hit to the economy.”

—Katherine Bourzac (@bourzac.bsky.social), Science Writer

Citation: Bourzac, K. (2025), Panama’s coastal waters missed their annual cooldown this year, Eos, 106, https://doi.org/10.1029/2025EO250382. Published on 15 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Chicago Soil Maps Childhood Lead Exposure Risk

Wed, 10/15/2025 - 12:11
Source: GeoHealth

Lead is a neurotoxin that can damage multiple body systems and lead to learning and developmental problems. The element has been phased out of use in paint, gasoline, and other industrial applications for decades, but it can persist for years in the soil. Children, who can be particularly vulnerable to lead poisoning, can accidentally ingest and inhale lead particles when they play in contaminated areas.

Even though one in four U.S. homes likely has soil lead levels over the recommended safety limits, no major U.S. city includes systematic soil monitoring as part of its lead prevention services, and blood testing often happens only after exposure.

Chicago is one city with many homes built before 1978—the year the U.S. government banned the use of lead-based paint—and its industrial history means that many residents could be living with elevated blood lead levels (EBLL) because of the prevalence of lead in the surrounding soil. Testing soil for lead is one way to predict which communities are most at risk for childhood lead exposure.

Thorstenson et al. analyzed 1,750 soil samples from Chicago’s 77 community areas. The researchers then used these data with the EPA’s Integrated Exposure Uptake Biokinetic model (IEUBK) to estimate how much lead children are likely to have in their blood. Comparing these data to actual EBLL findings from the Chicago Department of Public Health and accounting for factors such as household income, the age of housing, and the housing’s proximity to industrial land, the researchers built a comprehensive map that identifies the Chicago communities most at risk for soil lead exposure.

More than half of the citywide soil samples showed lead levels above the EPA’s recommended threshold of 200 parts per million—with some hot spots rising above 300 parts per million. When matched with the modeling from IEUBK, an estimated 27% of children across the city are at risk of EBLL. In the hot spot areas, that risk rises to 57%.

These findings suggest that though median household income is the strongest predictor of EBLL prevalence, soil lead levels are also a significant predictor. Systematic soil testing could become a crucial way to reduce children’s risk of lead exposure in contaminated areas, the authors say. (GeoHealth, https://doi.org/10.1029/2025GH001572, 2025)

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2025), Chicago soil maps childhood lead exposure risk, Eos, 106, https://doi.org/10.1029/2025EO250377. Published on 15 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

JPL Workforce Decimated

Tue, 10/14/2025 - 16:26
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

Today, NASA’s Jet Propulsion Laboratory (JPL) in Pasadena, Calif., laid off 550 people, a roughly 11% reduction of its workforce.

“This week’s action, while not easy, is essential to securing JPL’s future by creating a leaner infrastructure, focusing on our core technical capabilities, maintaining fiscal discipline and positioning us to compete in the evolving space ecosystem,” JPL director Dave Gallagher wrote in a brief statement released on 13 October. Layoffs were spread across the technical, business, and support areas.

Gallagher said that this workforce reduction is part of a reorganization that began in July and is not related to the current government shutdown that began on 1 October. A 10 October court filing by the White House Office of Management and Budget did not include NASA among the agencies targeted for layoffs by the Trump administration during the ongoing shutdown, reported Space News.

JPL is a research and development laboratory federally funded by NASA. While the current government shutdown continues, NASA has been directed to operate and plan as if the appropriations bill passed by the House of Representatives is in effect, which would fund NASA (and most JPL projects) at nearly the same level as the current fiscal year.

Federal whistleblowers, however, have come forward with evidence that NASA leadership has been operating as if the President’s Budget Request (PBR)—not the appropriations bill—is in effect, directing mission wind-down operations and staff reductions under the assumption of a 20% overall budget cut. Some of that lost spending would affect JPL’s ability to plan, build, and operate Earth science missions and space exploration spacecraft.

Despite vocal support from the Trump administration and NASA leadership about putting humans on the Moon again and eventually on Mars, the PBR would also cancel the Mars Sample Return program, which would pick up and return to Earth sample capsules collected and deposited by the Perseverance rover. Analysis of those samples would provide critical support to any future human exploration mission to Mars.

Kevin Hicks, a systems engineer who formerly operated rovers at JPL, wrote that Perseverance’s budget is being reduced by two-thirds, “just enough to technically keep it going and not get the full PR backlash of canceling a working rover.”

Credit: Kevin Hicks (@astro-cowboy.bsky.social) via Bluesky

This is the fourth round of layoffs at JPL since the beginning of 2024, including an 8% reduction in staff that affected mostly engineering-related positions. The mood among current and former JPL employees is grim. Several people commented on a JPL Reddit forum that they expect more layoffs in the future.

“Today was very somber on lab. It felt like everyone [was] grieving,” one Redditor wrote on 13 October. Several other posters echoed that sentiment. “We tried to keep a positive, but realistic attitude and we even took a final group photo in front of the JPL concrete logo. However, there’s no whitewashing the ‘doomsday-eve’ feeling that’s looming over all our heads.”

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0

As Seas Rise, Corals Can’t Keep Up

Tue, 10/14/2025 - 12:14

Coral reefs face myriad challenges, from ocean acidification to warming seas to destructive fishing activities. Sometimes, reefs can rebound from these ecological harms—but only if the coral species assembled on a reef can maintain the required growth rates.

A revised estimate of coral growth rates, published in Nature, suggests that tropical western Atlantic reefs are losing their capacity to build upward. Without upward reef growth, rising seas threaten to drown these reefs and cancel out the benefits they offer to coastal communities, such as minimizing flood damage. Researchers found that growth rates at essentially all of the 400 sites analyzed won’t be enough to keep pace with sea level rise by 2100.

“It’s very critical that we get a handle on what these rates are to be able to adequately gauge the scale of the problem.”

“It’s very critical that we get a handle on what these rates are to be able to adequately gauge the scale of the problem,” said Cody Clements, a coral reef ecologist at the Georgia Institute of Technology who was not involved in the new study. “We have a lot of work ahead of us.”

“Unfortunately, the estimates are worse than before,” said Rich Aronson, a coral reef ecologist at the Florida Institute of Technology who was not involved in the new paper but works closely with its authors. 

Eroding Reefs

Coral reefs grow when corals secrete calcium carbonate, a hard material that forms their exoskeletons.

Scientists can use knowledge of the species that make up a coral reef to estimate its vertical stacking porosity—how much vertical space a reef can build with a given amount of calcium carbonate. 

The skeletons of branching corals, for example, tend to accumulate in an arrangement with more empty space, leading to more upward growth than other corals, such as flat corals, might achieve with the same amount of calcium carbonate.

However, the relationship between coral assemblage and vertical growth ability has so far been poorly defined, said Chris Perry, a coastal geoscientist at the University of Exeter and lead author of the new study. 

The studied reefs “are going to have zero capacity, really, to be able to track future sea level rise.”

Perry and his research group wanted a better estimate. They gathered 66 images of fossilized coral reefs from the tropical western Atlantic and analyzed how those reefs grew over time on the basis of the species of corals within. Then, they applied their revised estimates of growth to previously collected data on the ecology and carbonate production of 400 sites at three reef systems in the tropical western Atlantic: the Mexican Mesoamerican Reef, the Florida Keys, and Bonaire. 

The adjusted estimate of growth revealed a bleaker picture of reef health than the scientists anticipated: Researchers found that on average, reefs at all sites were growing at a sluggish pace—less than 1 millimeter per year—with an average growth rate decline of 12.4% when compared to previous estimates. On average, global sea levels are rising by about 4.5 millimeters per year.
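To put those two rates side by side, a back-of-the-envelope calculation shows how quickly a depth deficit accumulates. Constant rates are assumed here purely for illustration; the study’s actual projections vary by site and climate scenario.

```python
# Illustrative arithmetic using the figures quoted above: a reef accreting at
# ~1 mm/yr against ~4.5 mm/yr of sea level rise accumulates a growing deficit.

reef_growth_mm_per_yr = 1.0      # upper end of the observed average growth
sea_level_rise_mm_per_yr = 4.5   # approximate current global mean rate

def depth_deficit_cm(years: int) -> float:
    """Extra water depth (in cm) above the reef crest after `years`,
    assuming both rates stay constant."""
    return (sea_level_rise_mm_per_yr - reef_growth_mm_per_yr) * years / 10.0

print(depth_deficit_cm(75))  # → 26.25 cm of added depth by roughly 2100
```

Even under these deliberately simple assumptions, the gap compounds to tens of centimeters within decades, which is why sluggish accretion rates translate into drowning risk.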

The new calculations are particularly stark for reefs dominated by branching coral species, Didier De Bakker, a coral reef ecologist at the University of Exeter and a coauthor of the new study, wrote in an email. 

If corals can’t grow, they shrink, falling victim to erosion by other marine creatures such as fish and sea urchins. Eventually, corals unable to keep up with sea level rise are drowned, unable to access sufficient light to continue growing at all.

The studied reefs “are going to have zero capacity, really, to be able to track future sea level rise,” Perry said. 

Corals at Limones Reef in the Mexican Caribbean suffered a bleaching event in 2023. Credit: Lorenzo Álvarez-Filip

In general, the new estimates of the link between assemblage type and vertical growth “revise our estimate downward” of how well corals will be able to keep up with sea level rise, Aronson said. The results also align with a 2023 study by Aronson and others that found reef growth in Panama’s Gulf of Chiriquí, part of the Pacific Ocean, is likely already unable to keep up with sea level rise. 

Perry and De Bakker hope the data in the new study will feed into future studies modeling coastal wave exposure. “These new estimates provide a more realistic basis for projecting the vulnerability of adjacent habitats and reef-fronted urban areas,” De Bakker wrote. 

Aronson said one next step for the research would be to apply the research team’s new estimates of vertical growth to reefs elsewhere, such as those in tropical Indo-Pacific waters. There, more species of branching coral still survive, giving Indo-Pacific reefs a slightly better chance of keeping up with sea level rise, said Clements, who studies Indo-Pacific reefs.

Climate Change and Corals

As a final step in their study, the researchers combined what they’d learned about reef growth at the 400-plus reef sites with various future climate warming scenarios, called Shared Socioeconomic Pathways, or SSPs, to project how reef growth rates may change as the climate warms and sea levels continue to rise.

The projections indicated that more than 70% of tropical western Atlantic reefs will transition into net erosional states by 2040 under an optimistic scenario (SSP1-2.6). But if warming exceeds SSP2-4.5 (a middle-of-the-road scenario in line with current development patterns), nearly all reefs will be eroding by 2100.

“Even if you go by some of the conservative estimates that they’re using, we still have a major problem in terms of coral reef accretion rates,” Clements said. 

Reef Benefits Wash Away

Slower vertical growth means corals will have a tougher time maintaining their crest, or high point. These crests serve as wave breakers that dissipate wave energy and reduce flood damage to coastal communities. One estimate suggests that coral reefs near the U.S. coastline prevent more than $1.8 billion in damage each year.

This coral reef crest in the Mexican Caribbean dissipates wave energy and reduces beach erosion and possible flood damage. Credit: Lorenzo Álvarez-Filip

As coral growth fails to track with sea level rise, these crests fall below the water’s surface. In turn, rising seas and waves from storms face less resistance, and reefs’ protective abilities get washed away.


Reef restoration is an active area of research, with engineers and ecologists working together to create various solutions, from LEGO-like scaffolding for corals to robots that sprinkle warming reefs with cool water. Previous research by Aronson and others indicated that successful restoration could help reefs keep pace with future sea level rise.

However, restoration will be effective only if it is done in tandem with efforts to rein in climate warming, which could slow sea level rise and reduce the frequency of marine heat waves, Perry said. “It’s quite difficult to see how we turn this around without really, really aggressive action on greenhouse gas emissions.”

“We have to do something about these global-scale stressors, like climate change, or it’s not going to matter,” Clements said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

This news article is included in our ENGAGE resource for educators seeking science news for their classroom lessons. Browse all ENGAGE articles, and share with your fellow educators how you integrated the article into an activity in the comments section below.

Citation: van Deelen, G. (2025), As seas rise, corals can’t keep up, Eos, 106, https://doi.org/10.1029/2025EO250380. Published on 14 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Space Radiation Can Produce Some Organic Molecules Detected on Icy Moons

Tue, 10/14/2025 - 12:10

New laboratory research suggests that some organic molecules previously detected in plumes erupting from Saturn’s moon Enceladus may be products of natural radiation, rather than originating from the moon’s subsurface ocean. This discovery complicates the assessment of the astrobiological relevance of these compounds.

Enceladus hides a global ocean buried beneath its frozen crust. Material from this liquid reservoir is ejected into space from cracks in the ice near the south pole, forming plumes of dust-sized ice particles that extend for hundreds of kilometers. While most of this material falls back onto the surface, some remains in orbit, becoming part of Saturn’s E ring, the planet’s outermost and widest ring.

Between 2005 and 2015, NASA’s Cassini spacecraft flew repeatedly through these plumes and detected a variety of organic molecules. The detection was viewed as evidence of a chemically rich and potentially habitable environment under the ice, where molecules essential to life could be available. However, the new study offers an explanation in which radiation, not biology, is behind the presence of at least some of these organic molecules.

To test the role of space radiation, a team of researchers led by planetary scientist Grace Richards, a postdoc at the National Institute for Astrophysics in Rome, simulated conditions near Enceladus’s surface by creating a mixture of water, carbon dioxide, methane, and ammonia, the main expected components of surface ice on Enceladus. They cooled the concoction to −200°C inside a vacuum chamber and then bombarded it with water ions, which are an important component of the radiation environment that surrounds the moon.

The radiation induced a series of chemical reactions that produced a cocktail of molecules, including carbon monoxide, cyanate, ammonium, and various alcohols, as well as molecular precursors to amino acids such as formamide, acetylene, and acetaldehyde. The presence of these simple molecules indicates that radiation could induce similar reactions on Enceladus.

Richards presented these findings at the Europlanet Science Congress–Division for Planetary Sciences Joint Meeting (EPSC-DPS 2025) in Helsinki, Finland. She and her coauthors also published a detailed report in Planetary and Space Science.

Enceladus and Beyond

The new research raises the question of whether the organic molecules detected in Enceladus’s plumes truly come from the moon’s buried ocean, whether they are formed in space, or whether they form close to the surface after the plumes leave the Enceladean interior.

While the finding doesn’t exclude the possibility of a habitable ocean on Enceladus, Richards urges caution in assuming a direct link between the presence of these molecules in the plumes, their origin, and their possible role as precursors to biochemistry.


“I don’t necessarily think that my experiments discredit anything to do with Enceladus’s habitability,” Richards said.

However, she added, “when you’re trying to infer this ocean composition from what you’re seeing in space, it’s important to understand all the processes that go into modifying this material.” Apart from radiation, these processes include phase changes, interactions with the moon’s ice walls, and interactions with the space environment.

“We need a lot of experiments of that type,” said planetary scientist Alexis Bouquet, a French National Centre for Scientific Research (CNRS) researcher at L’Université d’Aix-Marseille who wasn’t involved in the study. “They demonstrated that you can produce a certain variety of species in conditions that are relevant to the south pole of Enceladus.”

Bouquet highlighted the importance of simulating these environments in a lab for planning future missions to Enceladus and for interpreting the much-anticipated data from current missions to Jupiter’s icy moons. These missions are NASA’s Europa Clipper, which will explore Europa, and the European Space Agency’s (ESA) JUICE (Jupiter Icy Moons Explorer), which will visit all three of the giant planet’s moons with subsurface oceans: Ganymede, Callisto, and Europa.

The intense radiation around Jupiter makes these experiments especially relevant. “Radiation chemistry for Europa or the Jovian moons in general [is] a big deal, a bigger deal than in Enceladus,” Bouquet said.

Another Story Completely

As Richards’s work questions the origin of organic compounds around Enceladus, researchers keep adding more molecules to the puzzle.

After a new analysis of data gathered during one of Cassini’s close approaches to Enceladus in 2008, researchers led by planetary scientist Nozair Khawaja at the Freie Universität Berlin and the University of Stuttgart reported the discovery of new types of organic molecules seemingly emanating from the icy vents. These include ester and ether groups, as well as chain and cyclic species containing double bonds of oxygen and nitrogen.

On Earth, these molecules are essential links in a series of chemical reactions that ultimately produce complex compounds needed for life. And while these molecules could have an inorganic origin, “they increase the habitability potential of Enceladus,” Khawaja said. The findings appeared in Nature Astronomy.

Khawaja’s team’s analysis suggests that complex organic molecules are present in fresh ice grains just expelled from the vents. During its last flyby, Cassini got as close as 28 kilometers to the moon’s surface.

After modeling the plumes and the icy grains’ residence times in space, they think that the ice grains sampled by Cassini did not spend a lot of time in space, likely just “a few minutes,” Khawaja said. “It is fresh.”

This short residence time in space calls into question whether space radiation had enough time to produce the organic molecules Khawaja detected. Just a few minutes would not be long enough for such complex chemistry to take place, even in a high-radiation environment.

“Big grains coming from the surface full of organics? That is much harder to explain through radiation chemistry,” Bouquet said.

While the types of experiments performed by Richards “are valuable and take the science to the next level,” Khawaja said, “our results tell the other story completely.”

Back to Enceladus

Both studies reinforce the complexity of Enceladus’s chemistry, upholding it as a prime target in the search for extraterrestrial life, or at least life’s building blocks. Enceladus has all three prerequisites for life: liquid water, an energy source, and a rich cocktail of chemical elements and molecules. Even if the subsurface ocean is out of reach—it lies at least a few kilometers beneath the ice close to the poles—the plumes offer the only known opportunity to sample an extraterrestrial liquid ocean.

Studies for a potential ESA mission dedicated to Enceladus are already underway, with plans that include high-speed flybys through the plumes and, potentially, a lander on the south pole. The insights from both recent studies will help researchers design the instrumentation and guide the interpretation of future results.

“There is no better place to look for [life] than Enceladus,” Khawaja said.

—Javier Barbuzano (@javibar.bsky.social), Science Writer

Citation: Barbuzano, J. (2025), Space radiation can produce some organic molecules detected on icy moons, Eos, 106, https://doi.org/10.1029/2025EO250383. Published on 14 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

The 12 July 2024 landslide cluster in Pengshui County, Chongqing, China

Tue, 10/14/2025 - 07:45

About 140 mm of rainfall triggered 143 landslides in an area of about 10 km2, killing two people.

Loyal readers will have noticed that I’m fascinated by dense clusters of landslides triggered by intense rainfall (or earthquakes). Over the years, I have written about these on multiple occasions, and increasing numbers are being described in the literature.

Another very interesting example has just been published in the journal Landslides (Xie et al. 2025). This example occurred on 12 July 2024 close to Puzi in Pengshui County, Chongqing, China. The centre of the cluster is at [29.56790, 108.28781] – this is the marker on the images that follow.

The Planet image below shows the area on 24 May 2024, before the rainfall:-

The site of the 12 July 2024 landslides in Pengshui County, Chongqing, China. Image copyright Planet, used with permission. Image dated 24 May 2024.

And this is the same site after the event on 12 July 2024:-

The aftermath of the 12 July 2024 landslides in Pengshui County, Chongqing, China. Image copyright Planet, used with permission. Image dated 1 August 2024.

And here is an image compare:-

Images copyright Planet, used with permission.

Xie et al. (2025) show that this cluster of landslides was triggered by a rainstorm that deposited about 140 mm of rainfall in a few hours. In total, 143 landslides were triggered in an area of about 10 km2. The failures were mostly disrupted avalanches, some of which formed channelised debris flows. However, Xie et al. (2025) also show that there are a number of interesting aspects of this cluster of landslides.
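For a sense of scale, a simple calculation using only the headline figures quoted above (not a result from the paper) gives the landslide density of the event:

```python
# Back-of-envelope landslide density for the Pengshui cluster,
# using the headline figures reported by Xie et al. (2025).
landslides = 143
area_km2 = 10.0  # approximate extent of the affected area

density = landslides / area_km2  # landslides per square kilometre
print(f"{density:.1f} landslides per km^2")  # 14.3 landslides per km^2
```

That is a remarkably high density of failures for a storm of this duration.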

Note the geographical isolation of these landslides. The slopes to the east and west suffered far fewer failures. Perhaps surprisingly, this cluster of landslides did not occur in the area of highest rainfall – a short distance to the west, more than 200 mm was recorded, but few landslides occurred.

The analysis of Xie et al. (2025) shows that this cluster occurred because of a weak geological unit (sandstone) that was highly fractured, a geological structure that promoted instability, and steep slope gradients (which may be associated with erosion by the river). Thus, it is the combination of the meteorological, geological and geomorphological factors that led to the cluster of landslides.

Fortunately, the area had been mostly evacuated ahead of the rainfall, so there were just two fatalities. There was extensive damage to properties though.

This event illustrates well the ways in which extreme rainfall events are combining with local factors to create clusters of landslides that have the potential to generate high levels of damage.

Many thanks to Xie et al. (2025) for such an interesting example.

References

Xie, X., Liu, S., Macciotta, R., et al. 2025. Spatial heterogeneity in landslide response to a short-duration intense rainfall event on 12 July 2024 in Pengshui County, Chongqing, China. Landslides. https://doi.org/10.1007/s10346-025-02624-6.

Planet Team 2025. Planet Application Program Interface: In Space for Life on Earth. San Francisco, CA. https://www.planet.com/.

Text © 2025. The authors. CC BY-NC-ND 3.0
