Eos
Science News by AGU

Some useful tools for monitoring the evolution and behaviour of Hurricane Melissa

Tue, 10/28/2025 - 08:13

Various online datasets will allow a detailed understanding of Hurricane Melissa as it impacts Jamaica and then Cuba

Hurricane Melissa is now making headlines around the world in anticipation of its landfall today. As always with tropical cyclones, the picture is changing continuously as the storm evolves; their behaviour is highly complex.

I thought I’d highlight some useful tools for monitoring the evolution and behaviour of Hurricane Melissa. First, of course, the NOAA National Hurricane Center (NHC) provides a range of graphics, some of which are adaptable. These include the forecast track of the centre of the storm, the forecast earliest arrival time of the centre of the hurricane and (most usefully in the context of landslides) the rainfall potential:

Precipitation potential for Hurricane Melissa. Graphic from NOAA as at 07:18 UTC on 28 October 2025.

Note that this is the three-day potential rainfall (the graphic that I posted yesterday was for four days). Jamaica is going to start to feel the full brunt of the storm today (Tuesday 28 October), and it will then move on to eastern Cuba. The latest forecast suggests that the most serious rainfall will occur in the central part of Jamaica, but that there will also be very significant rainfall in the west of the island. The change appears to be the result of a slightly later than forecast turn to the north.

The NASA Global Precipitation Measurement site provides near-real-time data – the best tool available for understanding the rainfall that the storm is delivering. This is the latest image showing 24-hour precipitation totals:-

24 hour precipitation accumulation for Hurricane Melissa. Graphic from NASA GPM as at 07:34 UTC on 28 October 2025.

Note that this site also provides a global landslide nowcast, but sadly the site indicates that this is not functioning. I am unsure as to why – maybe this is the effect of the government shutdown.

In terms of the landslides themselves, this map of Jamaica and Cuba provides landslide susceptibility – yet again, this is work from NASA:-

Landslide susceptibility for Jamaica and Cuba. Data from NASA.

Overlaying this with the forecast precipitation is fascinating – the east of Jamaica has the highest landslide susceptibility, but is now forecast to receive less rainfall. Central Jamaica has lower average susceptibility, but may receive more rainfall. But also remember that landslides in storms like this are often driven mostly by rainfall intensity, which is hard to forecast and very variable. There’s also a nice BGS report on landslide hazard for a catchment in Central Jamaica, which gives an idea of the scale of the issues.
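To make the overlay idea concrete, here is a toy sketch (all numbers and grid cells are invented, not the actual NASA susceptibility or NOAA rainfall products) of how a gridded susceptibility score and a forecast rainfall total might be combined into a crude joint index:

```python
import numpy as np

# Hypothetical 3x3 grids, loosely west-to-east across an island.
# Susceptibility is a unitless 0-1 score; rainfall is a 3-day total in mm.
susceptibility = np.array([
    [0.3, 0.4, 0.8],
    [0.2, 0.5, 0.9],
    [0.3, 0.4, 0.7],
])
rainfall_mm = np.array([
    [450.0, 600.0, 350.0],
    [500.0, 750.0, 300.0],
    [400.0, 550.0, 250.0],
])

# Normalise rainfall to 0-1 and take the product as a crude combined index.
rain_norm = rainfall_mm / rainfall_mm.max()
combined = susceptibility * rain_norm

# The peak of the combined index need not sit where susceptibility peaks.
i, j = np.unravel_index(np.argmax(combined), combined.shape)
print(f"Highest combined score at cell ({i}, {j}): {combined[i, j]:.2f}")
```

In this invented example the most susceptible cell sits in the east, but the combined score peaks in the centre where the heaviest rain falls, which is exactly the tension the forecast for Jamaica illustrates. A real analysis would also need to account for rainfall intensity, not just totals.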

In terms of news within Jamaica itself, the Jamaica Observer and the Jamaica Star will be providing local coverage.

Finally, in such situations there is a tendency in the international media to adopt a slightly condescending tone to reporting of such events in countries with lower levels of per capita GDP. Both Jamaica and Cuba have advanced disaster management systems – they are far from helpless victims. Indeed, Cuba has a remarkably successful record of managing disasters and Jamaica fared remarkably well during Hurricane Beryl last year due to its preparedness. But tropical cyclones are complex, and the impact of a Category 5 event is very much greater than that of a Category 4 storm. Even the best prepared nation struggles to cope with such a storm.

Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Building Better Weather Networks

Mon, 10/27/2025 - 12:58

Lake Victoria, Africa’s largest lake, supports a quarter of East Africa’s population with fish, fresh water, and critical transportation routes. It’s also one of the deadliest bodies of water in the world.

Storms, high winds, and waves frequently capsize boats, causing thousands of deaths each year.

Despite the hazard, Lake Victoria has historically lacked enough weather observation stations to provide a clear picture of weather patterns in the region. Overly general and often inaccurate forecasts have meant that those heading out on the water had little idea what weather they’d face.

In 2017, the World Meteorological Organization (WMO), the weather agency of the United Nations, began a multiyear effort to improve weather information for the lake and establish early-warning systems for life-threatening storms. Now, much of the lakeside population uses the program’s tailored forecasts, leading to an estimated 30% reduction in weather-related deaths on the lake.

Still, a dearth of weather data persists across the continent. Because of ongoing economic depression, conflict, and disruptive weather patterns, Africa has gone decades without observational networks that meet international standards.

Today, the continent has the least dense weather observation network in the world. The average density of stations meeting WMO standards is one eighth of the WMO-recommended level; more reporting surface stations exist in Germany than across all of Africa, according to the WMO and the World Bank.

The lack of observations often leaves communities without early warnings of natural disasters, cripples forecasts that farmers rely on, and leaves scientists who are studying global climate and weather with a major data gap.

In 2019, the need for improved weather networks around the world was recognized at the Eighteenth World Meteorological Congress in Geneva, Switzerland. There, the WMO’s 193 member states and territories established the Global Basic Observing Network (GBON), an international agreement that specifies requirements for weather data collection, including which parameters to measure, when and at what resolution to measure them, and how to share the data between countries.

With encouragement from the WMO and smaller organizations, national meteorological agencies in Africa are also recognizing the need for enhanced weather and climate services and planning for them, said Zablon Shilenje, the technical coordinator of services for the WMO’s Regional Office for Africa.

That recognition, combined with increased economic investment, has slowly led to weather stations being added to networks throughout Africa.

“The situation now has improved a lot,” said Frank Annor, a water resources expert at TU Delft and Kwame Nkrumah University of Science and Technology in Ghana. But the continent is still far from meeting GBON reporting standards. And in the face of an ever more variable climate, additional investments and improved ways of working together are needed.

“There is a huge gap, and we need to work on it,” Shilenje said.

Scarce Stations

Climate models used by scientists and forecasts created by meteorologists fundamentally rely on current and past weather data, particularly temperature and precipitation measurements, to extrapolate patterns into the future. “Any time the historical information isn’t perfect, that’s going to cause potential issues,” especially for estimating the impacts of climate change, said Pascal Polonik, a climate scientist at Stanford University.

Forecast accuracy declines as the number of observations drops. That’s particularly problematic when an entire region or large swaths of it have little to no observational data—as is the case in many parts of Africa.

“We lack the ground data. That data is not being ingested into models, so then when you do predictions, your predictions are less accurate,” Annor said.

There wasn’t always such a lack of station density. In the first half of the 20th century, African countries’ networks were growing on par with those in other parts of the world, though they never reached the same densities as in places like North America. But now, Africa has fewer than one third of the weather stations that it once had.

Social and political conflict is one reason for the decline. One 2019 analysis of temperature records available in sub-Saharan Africa found that a country’s civil conflict risk was negatively correlated with both the number and density of weather stations contributing to its temperature record.

Some conflicts or social upheavals have had an outsized effect on monitoring networks. During and after the 1994 genocide in Rwanda, for instance, the average number of actively reporting weather stations in the country dropped from more than 50 to fewer than 10. Nearly 15 years passed before station coverage returned to preconflict levels. In another instance, station density in Uganda declined from a peak of about 500 stations following independence in the 1960s to fewer than 100 by 2001. A civil war in Ethiopia beginning in 2018 resulted in a sharp decline in reporting weather stations in the northwest part of the country, where much of the fighting took place.

“You can see from one year to another how unrest can affect station density,” said Tufa Dinku, a senior research scientist at the Columbia Climate School in New York.

The ongoing conflict in the northwestern part of Ethiopia may have led to a decrease in reported weather data from stations in the same area. Credit: Tola et al., 2025, https://doi.org/10.3389/fclim.2025.1551188, CC BY 4.0

Beyond conflict and the challenges of establishing a stable national government, a lack of economic resources has also contributed to the drop in weather station density. In the late 1980s and 1990s, Africa entered an economic depression that made it difficult for states to update their weather observational systems with technology on par with that used by countries in Europe and North America.

“African countries were not able to recover their meteorological [networks],” said David Mburu, who worked for more than 30 years as a meteorologist at the Kenya Meteorological Department.

Weather itself is partly to blame for slow economic development: Climate variability has caused frequent droughts, floods, heat waves, and land degradation, Dinku wrote in a 2006 article exploring the challenges of managing climate change on the continent.

The places in the world that are most affected by climate change tend to overlap with places in the world that are economically poor and, often, also data poor, Polonik said.

Shilenje said climate change only adds to the challenge of maintaining networks, affecting the durability of the instruments and equipment used to make observations. “There is a strong correlation between climate change and the ability to maintain a stable observing network on the continent,” he said.

A Dearth of Data

The reporting weather stations that do exist are often located along roads or concentrated in urban areas, meaning they’re not dispersed well enough to give an accurate reflection of weather across a whole country or region, Dinku said. Weather station coverage tends to be worse in rural areas, where better weather and forecasting information is most needed.

The dearth of observational stations has far-reaching consequences for Africans and those doing business there. Farming, for example, makes up about 15% of the continent’s economy. Without accurate forecasts, farmers are left without the information they need to make decisions about how to keep their livelihoods afloat.

“If you don’t know when it’s going to rain or how much to expect, then how do you go about your agriculture activities? That is the problem,” Annor said.

Inaccurate accounting of rainfall also means some farmers struggle to get their insurance claims paid, he said. “People then don’t want to invest in insurance again,” Annor said. “What that means is that people take calculated risk: They minimize the amount of food they can grow so they can minimize their risk.”

The observations lost over the past few decades didn’t just limit forecasts then: The holes in the data will exist forever, always needing to be filled with reanalysis data—climate modeling that interpolates data on the basis of available observations—whenever a meteorological service or scientists want to analyze trends in a country’s weather and climate.
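As a toy illustration of the gap-filling idea (real reanalysis products assimilate observations into full physical models; the simple interpolation and the numbers below are invented for illustration only):

```python
import numpy as np

# Hypothetical daily temperatures (degrees C) with a gap where a
# station stopped reporting for two days.
temps = np.array([21.0, 22.5, np.nan, np.nan, 24.0, 23.5])
days = np.arange(len(temps))

# Fill the gap by linear interpolation from the surviving observations,
# a crude stand-in for reanalysis-based infilling.
known = ~np.isnan(temps)
filled = temps.copy()
filled[~known] = np.interp(days[~known], days[known], temps[known])
print(np.round(filled, 2))
```

The filled values are estimates, not observations: however the gap is bridged, the information lost during the outage is gone for good, which is the point the article makes.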

Lost observations also mean policymakers have no long-term data to use to plan adaptation strategies. “The resilience of people in the communities is reduced, and people become very vulnerable,” Annor said.

It’s not just local residents who suffer the consequences of a lack of data, either. Weather patterns in Africa play a role in the genesis of Atlantic hurricanes and spark Saharan dust storms, which can travel thousands of kilometers and affect global atmospheric processes.

“The data from Ethiopia is not just for Ethiopia,” Dinku said. “The more observations you have in Africa, the better forecast we’ll have anywhere else in the world.”

A lack of observational stations leaves scientists without sufficient data to answer research questions, too. For instance, sparse rainfall observations limited a full assessment of whether and how climate change influenced heavy rainfall and flooding around Lake Kivu that killed at least 595 people in Rwanda and the Democratic Republic of Congo in 2023.

“The scarcity and inaccessibility of meteorological data…meant we couldn’t confidently evaluate the role of climate change,” scientists from World Weather Attribution wrote in a summary of their attempt to analyze the event.

Low-Cost Stations as a Solution

In 2006, Oregon State University hydrologist John Selker ran into a similar data problem. He was working in Ghana, attempting to measure how much rainfall trees intercept. He and his collaborators found themselves stymied by a lack of rainfall measurements that kept them from completing the analysis they had planned.

“It was really shocking,” Selker said, adding that the only rainfall data they seemed to be able to find were sparse datasets that they had to apply for access to.

Selker and his colleague at the Delft University of Technology, Nick van de Giesen, brainstormed a solution: a low-cost weather station that could transmit meteorological and hydrological data over cell networks. They called their new project the Trans-African Hydro-Meteorological Observatory, or TAHMO.

With TAHMO, “the question was, What can we do now to improve on the density of stations to ensure that we can have reliable data from Africa that can both help feed the global models and [create] local models that are as accurate and useful as the ones that we have in the U.S. and EU countries?” said Annor, TAHMO’s CEO.

To date, TAHMO has worked with national liaison agencies (most frequently, national meteorological agencies) to install more than 750 stations in 23 countries and has collected more than 7 billion total observations. The stations, owned and installed by TAHMO, measure standard weather parameters such as precipitation, wind speed, wind direction, relative humidity, temperature, and solar radiation. Often, TAHMO approaches national meteorological agencies with a proposal to install stations, though some countries have asked TAHMO for assistance, too.

The data from TAHMO stations are shared directly with each country’s liaison agency. Each agreement between TAHMO and a country allows TAHMO to make these data available for any researchers interested in using them in peer-reviewed studies. It also gives a country the right to halt data collection, if it chooses.

Mburu, the longtime Kenyan meteorologist, became one of TAHMO’s main contacts in that country, helping to establish a relationship between the organization and the Kenya Meteorological Department. Now semiretired, he is a consultant for TAHMO at the organization’s headquarters, located in Nairobi. In Kenya, he said, TAHMO stations have been the most reliable forecasting system over the past decade.

Data from TAHMO stations have given Kenya’s Meteorological Department significant insight into what causes flooding, especially in Nairobi County, said Paul Murage, a climate scientist at the department who also trains other meteorologists at the WMO regional training center in Nairobi. Flash flooding has become a significant issue in the city; Murage recounted a day in March 2024 when the Nairobi Expressway, a major roadway, was impassable during heavy rains.

Murage said having rainfall data from TAHMO stations empowers his agency to persuade policymakers that better, climate-proofed infrastructure is needed. “Policymakers are supposed to be guided by climate scientists, and the climate scientists can only authoritatively talk about that if they have quality data,” he said.

TAHMO stations were included in the High Impact Weather Lake System Project (HIGHWAY), the WMO project to improve early-warning systems across Lake Victoria.

Another U.S.-based project, called 3D-PAWS (for 3D-Printed Automatic Weather Station), works with national meteorological agencies in developing countries to establish 3D printing fabrication facilities, install 3D printed, low-cost observational stations, and train local staff to maintain their own stations long term.

The group has worked with six African countries—Kenya, Malawi, Senegal, Uganda, Zambia, and Zimbabwe—and has deployed more than 250 stations. Prior to being dissolved in July 2025, the U.S. Agency for International Development (USAID) was a major 3D-PAWS funder, connecting the organization with countries via requests from national meteorological agencies in Africa.

Staff from the Zimbabwe Meteorological Services Department install a 3D-PAWS (Printed Automatic Weather Station) tipping bucket rain gauge at the Kutsaga Research Station in 2024. Credit: Paul Kucera

The goal is for each country to eventually run its network completely on its own, said Paul Kucera, an atmospheric scientist at the University Corporation for Atmospheric Research (UCAR) who codeveloped the 3D-PAWS program with his colleague Martin Steinson. They designed the original 3D printed stations themselves and incorporated feedback from international partners in newer iterations of the design.

Each partner country owns the stations once they’re installed. Though the initial training and installations are supported by grants to 3D-PAWS, the expectation is that each country’s own staff will incorporate the costs of operation and maintenance into its annual budgets after a few years.

Barbados, one of the countries outside Africa that 3D-PAWS works with, now has a self-sufficient team independent of the 3D-PAWS group that provides feedback to 3D-PAWS on how to improve their operations. Kucera hopes Kenya will be the first African country to achieve the same level of independence.

The 3D-PAWS data are typically open to the public via a free database system. Near-real-time data from 3D-PAWS stations in Kenya and Zimbabwe are also sent to the Famine Early Warning Systems Network (FEWS NET), a program established by USAID to provide early-warning systems for famine. Kucera’s own research group aims to use the data to develop other tools, such as automatic early alerts for weather events.

Many national governments in Africa are investing in climate services, too, Shilenje said. He’s seen an increase in the number of countries adopting national frameworks for climate services and putting plans in motion to improve weather networks.

As one example, the Tanzania Meteorological Authority installed five new weather radars in 2024, bringing the countrywide total to seven and giving Tanzania the greatest number of radars in East and Central Africa. It’s “quite a significant investment” for a single African country to make, Shilenje said.

Global Support

Larger, international efforts also provide assistance. In 2021, the WMO launched the Systematic Observations Financing Facility (SOFF), a partnership between the WMO, United Nations Environment Programme, United Nations Development Programme, and United Nations Multi-Partner Trust Fund meant to finance advancements in weather observational networks in developing countries through grants. SOFF is also part of the U.N. Early Warnings for All initiative, a project aiming to ensure that everyone on Earth is protected from natural disasters by early-warning systems by 2027.

SOFF, now 3 years old, has partnered with 24 African countries to support station building, radiosonde launches, and continued maintenance of these networks. SOFF, like TAHMO and 3D-PAWS, emphasizes continued support after the initial installation of stations. SOFF does this via a peer-to-peer network of national meteorological agencies. Twenty-eight agencies worldwide have expressed interest in acting as peer mentors, many to African agencies, said Mario Peiró Espí from the partnership office at SOFF.

The concept has seen successes. Peiró Espí recounted a recent interaction with a staff member at the Austrian meteorological agency: “He said the guys in South Sudan write to him all the time to check on questions that before, they didn’t have anyone to check in with. They didn’t have any number to call and say, ‘Hey, we are facing this challenge, we don’t know how to solve it, can you help us?’”

Nine of SOFF’s African partner countries have entered the organization’s investment phase, during which national meteorological agency staff install stations and launch radiosondes with SOFF support. Mozambique, a coastal nation that frequently faces destructive floods and tropical cyclones, is one of those countries.

During a flooding event in 2000, Mozambique lost the majority of its weather stations. A $7.8 million grant from SOFF is helping the country’s national meteorological agency to recover the network by establishing 6 new land-based weather stations, upgrading 15 existing stations, and installing 4 upper-air stations.

Farther north, Chad faces a dire lack of weather data, too—as of October 2024, Chad was reporting just 3% of the surface weather observations required by the GBON agreement. SOFF is working with the country toward the goal of installing or upgrading at least 34 weather stations.

Markus Repnik, SOFF’s director, feels strongly that the world should think of improvements in Africa’s observational networks not just as assistance to Africa but as a global public good. The world is dependent on African meteorological agencies for accurate forecasts everywhere, he said. It’s as much as 20 times more valuable to install a single station in a data-poor area of Africa than to add one to a European network, he said.

While 3D-PAWS, TAHMO, and SOFF focus on station building and radiosonde launching, other groups lend additional support. Beyond investments in SOFF, the WMO is spending roughly $56 million in Africa on climate service projects such as those used to support improved food security and health outcomes.

Dinku’s research group has an additional solution: the so-called ENACTS (Enhancing National Climate Services) approach.

Via ENACTS, Dinku and his colleagues work with countries’ national meteorological and hydrological services to create more comprehensive and usable precipitation and temperature datasets. For rainfall, they combine a country’s meteorological station data with satellite rainfall estimates to improve the coverage of the dataset. For temperature, they use reanalysis to fill in missing past data points.
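ENACTS’s actual merging methods are considerably more sophisticated, but a minimal sketch (with invented numbers and a deliberately simple multiplicative bias correction) conveys the basic idea of blending sparse gauge data with full-coverage satellite estimates:

```python
import numpy as np

# Hypothetical 1-D transect of grid cells. Gauges report rainfall (mm)
# at only a few cells; the satellite estimates every cell, but with bias.
station = np.array([12.0, np.nan, np.nan, 8.0, np.nan, 20.0, np.nan])
satellite = np.array([15.0, 11.0, 9.0, 10.5, 14.0, 24.0, 7.0])

# Estimate a single multiplicative bias where both sources overlap.
have_gauge = ~np.isnan(station)
bias = station[have_gauge].sum() / satellite[have_gauge].sum()

# Use the gauge where available, and corrected satellite elsewhere.
merged = np.where(have_gauge, station, satellite * bias)
print(np.round(merged, 2))
```

The result is a dataset with the coverage of the satellite and, at least at the gauged cells, the accuracy of the ground observations.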

ENACTS prioritizes satellite and modeling work over installing new stations because new stations can’t provide the decades of past data needed to provide reliable forecasts or understand climate trends now. ENACTS has been implemented in 15 African countries at the national level.

An Upward, Uncertain Trend

Thanks to these efforts and others, the number and density of reporting weather stations in Africa continue to tick slowly upward. Philanthropic donors are beginning to understand the importance of a robust, global weather observation system, and the need for improvements has gotten recent exposure on the world stage. But there’s still a long way to go before observational networks on the continent reach the density of those in Europe or North America.

One ever-present barrier is funding. Many African countries still lack the financial resources to improve their meteorological services, according to a recent WMO report. Even SOFF, which has been able to mobilize more than $100 million in grants in the 4 years it’s existed, faces a “very challenging fundraising environment,” Peiró Espí said. SOFF needs an additional $200 million by the end of 2026 to meet the needs of the countries it’s working with.

Facing such fundraising challenges, SOFF plans to announce a new funding mechanism, the SOFF Impact Bond, at COP30 (the 30th Conference of the Parties) in Belém, Brazil. The bond will “make resources available upfront…while allowing donors to spread contributions over a longer period,” according to SOFF.

A changing political landscape in the United States could pose obstacles, too. This summer, the Trump administration officially shut down USAID and said many of its programs would be absorbed by the U.S. State Department. Kucera said 3D-PAWS is still waiting to hear whether the changes will affect its fiscal year 2026 funding, but the group is being “cautiously optimistic” and working to diversify its funding sources.

Fragmentation of efforts also slows progress. Africa is full of fragmented investments, Repnik said. “In each and every country, you have a hodgepodge of investments.”

This hodgepodge leads to scenarios like the one that Dinku witnessed in Kenya, where the national meteorological agency receives data from a handful of different types of weather stations provided by various nongovernmental organizations and intergovernmental organizations, all with different data reporting systems. Shilenje’s seen this too: “You have different companies providing different sets of equipment,” he said. “It may be working very well, but compatibility is a challenge.”

To help with that issue, Dinku and a colleague created a data tool that allows users to access, process, perform quality control on, and visualize data from all their different weather station systems.

The WMO is working to solve the fragmentation issue as well, including efforts to improve national meteorological agencies’ digital capacity, a software tool to homogenize data, and the African Partner Coordination Mechanism, a platform by which nongovernmental organizations, intergovernmental organizations, and companies can exchange plans and objectives to ensure that everybody is working toward the same goal.

Still, collaboration and coordination are an uphill battle, Dinku said. “The last two or three decades, we have been talking about coordination. But everybody talks about coordination, and then they go about doing their own thing.”

Climate change only adds urgency to the efforts. As the climate warms, it will become even more variable. Risky decisions that farmers make about when, where, and what to plant will come with higher consequences than they once did. Mitigating the effects of climate change “will not happen without proper climate services,” Mburu said, adding that a robust observational network is critical to drastically reduce the impacts of climate change in Africa.

“The future benefits [of more investment] will be immense,” he said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Citation: van Deelen, G. (2025), Building better weather networks, Eos, 106, https://doi.org/10.1029/2025EO250386. Published on 27 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

New Earthquake Model Goes Against the Grain

Mon, 10/27/2025 - 12:50
Source: Geophysical Research Letters

When a slab slides beneath an overriding plate in a subduction zone, the slab takes on a property called anisotropy, meaning its strength is not the same in all directions. Anisotropy is what causes a wooden board to break more easily along the grain than in other directions. In rock, the alignment of minerals such as clay, serpentine, and olivine can lead to anisotropy. Pockets of water in rock can also cause and enhance anisotropy, as repeated dehydration and rehydration commonly occur at depth in a subducting slab.

It is well known that an earthquake generates both a compressional wave and a shear wave. If the shear wave passes through anisotropic rock, it can split into a faster shear wave and a slower one with different polarizations.
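The delay between the two split waves grows with the path length through the anisotropic rock and with the speed contrast between the two polarizations. A back-of-envelope sketch, using made-up but plausible values (not figures from the study):

```python
# A shear wave crossing an anisotropic slab splits into a fast and a slow
# polarization; both travel the same path at slightly different speeds.
path_km = 100.0      # hypothetical path length through the slab
v_fast_km_s = 4.6    # hypothetical shear speed, fast polarization
v_slow_km_s = 4.4    # hypothetical shear speed, slow polarization

# The splitting delay is the difference in travel time along the path.
delay_s = path_km / v_slow_km_s - path_km / v_fast_km_s
print(f"Splitting delay: {delay_s:.2f} s")
```

With these numbers the delay comes out on the order of a second, which is the scale on which splitting is typically measured, and why locating where along the path it accumulates is so difficult.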

Although seismologists routinely measure the shear wave splitting in subduction zones by analyzing recorded seismic waveform data, it is challenging to pinpoint where splitting occurs along the wave propagation path.

In the past, researchers have investigated the circulation of Earth’s interior for answers, in particular in the mantle wedge region above and below the slab. However, Appini et al. suggest a different explanation: that, contrary to popular wisdom, it is the downgoing slab that causes most of the shear wave splitting.

The researchers tested their theory using recordings of 2,567 shear waves from the Alaska-Aleutian subduction zone. They found that the way the waves split as they propagate through the slab varied by earthquake location and that these variations were consistent with the anisotropy observed in the dipping slab. They also used a forward model to predict that the splitting pattern would differ depending on the direction the shear wave comes from, a prediction borne out by the observations. Previously, scientists thought the variation in splitting patterns was due to complex mantle flows.

Furthermore, a dipping anisotropic slab also explains why deep earthquakes within a slab have unusual seismic wave radiation patterns. Other recent findings also hint that the composition of subducting plates causes anisotropy, the authors write.

If the slab holds most of the anisotropy, instead of the mantle wedge or subslab region, this finding has far-reaching consequences that could fundamentally change established ideas on how mantle dynamics work and how rock deforms, the authors suggest.

These results drive home the plausibility that slab anisotropy is an understudied component of seismology and geodynamics, the authors say. (Geophysical Research Letters, https://doi.org/10.1029/2025GL116411, 2025)

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2025), New earthquake model goes against the grain, Eos, 106, https://doi.org/10.1029/2025EO250403. Published on 27 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Anticipating the impact of Hurricane Melissa in Jamaica

Mon, 10/27/2025 - 08:32

Hurricane Melissa is bearing down on Jamaica, with many areas likely to see over 500 mm of rainfall. The impacts could be extremely significant.

Hurricane Melissa has strengthened substantially over the weekend, and is now on course to track across Jamaica in the next couple of days. Various media agencies have identified the threats that this storm poses to a country with high vulnerability. As always, NOAA has excellent tracking charts for this storm.

The current forecast track will take the storm directly across Jamaica:-

The forecast track of Hurricane Melissa. Graphic from NOAA as at 07:52 UTC on 27 October 2025.

NOAA also provides data on forecast precipitation (rainfall):-

Precipitation potential for Hurricane Melissa. Graphic from NOAA as at 07:52 UTC on 27 October 2025.

There is a great deal of uncertainty in this type of forecast – the final totals will depend upon the track, the rate at which the storm moves, the intensity of the storm (and how that changes as a result of the contact with the land mass) and orographic effects. But much of Jamaica is forecast to receive over 500 mm of rainfall, and some parts may receive more than 750 mm.
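For scale, Jamaica’s island-wide average annual rainfall is about 2,100 mm, so a quick calculation shows what share of a typical year’s rain these forecast totals represent:

```python
annual_rainfall_mm = 2100  # island-wide average annual rainfall for Jamaica
forecast_mm = {"widespread": 500, "locally": 750}  # forecast storm totals

for label, total in forecast_mm.items():
    share = total / annual_rainfall_mm
    print(f"{label}: {total} mm is {share:.0%} of an average year's rainfall")
```

Roughly a quarter to a third of a year’s rainfall arriving in a few days is well into the range that triggers widespread landsliding on steep terrain.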

Now, the average annual rainfall in Jamaica is 2,100 mm for the island as a whole, and much more in some places, so this must be seen in context. However, as I have noted often before, in most cases the dominant medium through which tropical cyclone losses occur is water (even though wind speed often grabs the headlines). As the Google Earth image below shows, the island is characterised by steep slopes – a recipe for channelised debris flows:-

Google Earth image showing the landscape of eastern Jamaica.

There is active preparation underway in Jamaica, including evacuations; during Hurricane Beryl last year, such measures proved successful. However, we know that many people choose not to move, and this storm is on a different scale.

In the immediate aftermath, the initial focus will inevitably be on the capital, Kingston, as this is where the reporters are likely to be located. Watch out for news from the east of the island though, especially on the coast and on the southern and eastern sides of the mountains. In severe storms, communications are often lost, so in this case no news may well be bad news.

Text © 2023. The authors. CC BY-NC-ND 3.0

The giant Tupaasat rock avalanche in South Greenland

Fri, 10/24/2025 - 14:38

A new paper describes a rock avalanche in Greenland about 10,900 years BP that had a volume of over 1 billion cubic metres and that travelled almost 16 kilometres.

A fascinating paper (Pedersen et al., 2026) has just been published in the journal Geomorphology describing a newly discovered ancient rock avalanche in Greenland. This landslide, located in the Tupaasat Valley, is truly enormous. The authors estimate that it has a volume exceeding 1 km³ (1 billion m³), a runout distance of 15.8 kilometres, and a vertical height difference of 1,440 metres.

The rear scar of the landslide is located at [60.4117, -44.2791]. It is really hard to capture this landslide on Google Earth, but fortunately the paper has been published under a creative commons licence. Here, therefore, is a map of the landslide by Pedersen et al. (2026):-

A) Geomorphological map of the Tupaasat rock avalanche deposits within the landslide outline, together with the paleo-sea-level line at 10 m a.s.l. and the proposed paleo-ice-sheet extent.
B) Map showing the bathymetry data and the landslide outline. The bathymetry data are from the Danish Geodata Agency and are not suitable for navigation.
C) Cross-section of the Tupaasat rock avalanche, with columns indicating the geomorphological features described in the results. The terrain slopes are presented below.
Images from Pedersen et al. (2026).

I have quickly annotated a Google Earth image of the site, showing the source and the track of the landslide. Note that the toe extends into the fjord, and thus is underwater, by a couple of kilometres:-

Annotated Google Earth image of the Tupaasat rock avalanche.

Landslides on this scale are hard to fathom. If this volume of rock was standing on a standard American football field (110 m x 49 m) it would form a column 185.5 km tall.
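That figure is easy to verify: the column height is simply the volume divided by the field’s area.

```python
volume_m3 = 1e9           # 1 cubic kilometre of rock
field_area_m2 = 110 * 49  # American football field, ~110 m x 49 m
column_height_km = volume_m3 / field_area_m2 / 1000  # metres -> kilometres
print(f"{column_height_km:.1f} km")  # ~185.5 km, as stated above
```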

Pedersen et al. (2026) have dated the landslide to about 10,900 years ago. This coincides remarkably well with the dated deglaciation (retreat of the ice sheets) in this area. Thus, the authors suggest that the instability was probably associated with glacial debuttressing (i.e., the removal of the ice adjacent to the slope). They cannot rule out the possibility that the final failure was triggered by an earthquake, though.

A further intriguing question is whether the event triggered a tsunami in the fjord. The distance that the landslide has moved suggests that it was very energetic. Given that it extended to the water (and some of the deposit is now within the lake) it is extremely likely that a displacement wave was triggered.

The latter point is very pertinent as there is increasing concern about the dangers of giant rock slope failures generating damaging tsunami events in fjords. For example, CNN published an article this week in the aftermath of the Tracy Arm landslide and tsunami that highlights the risk to cruise ships. It notes that:

Alaska’s foremost expert on these landslides knows why there hasn’t been a deadly landslide-turned-tsunami disaster, yet: sheer luck.

“It’s not because this isn’t a hazard,” said geologist Bretwood Higman, co-founder and executive director of nonprofit Ground Truth Alaska. “It’s because it just hasn’t happened to be above someone’s house or next to a cruise ship.”

An additional piece of context is the remarkable flooding that occurred in Alaska last weekend as Typhoon Halong tracked across parts of the state. This appears to have received far less attention than might have been anticipated, at least outside the US.

It is surely only a matter of time before we see a really large-scale accident as a result of a tsunami triggered by a rock slope failure. A very serious scenario is that a large cruise ship is overwhelmed and sunk. The loss of life could be very high.

Reference

Pedersen, L.L., et al. 2026. A giant Early Holocene tsunamigenic rock-ice avalanche in South Greenland preconditioned by glacial debuttressing. Geomorphology, 492, 110057, https://doi.org/10.1016/j.geomorph.2025.110057.

Text © 2023. The authors. CC BY-NC-ND 3.0

Tiny Uranian Moon Likely Had a Massive Subsurface Ocean

Fri, 10/24/2025 - 13:25

Uranus’s tiny moon Ariel may have had a subsurface ocean that made up around 55% of its total volume. By mapping craters, crags, and ridges on the moon’s surface, planetary scientists modeled how thick Ariel’s crust was before it cracked under tidal stress and created the geologic features seen today. By subtracting the size of the crust and core, the researchers found that the Arielian ocean could have been about 170 kilometers thick as recently as 1 billion years ago.

“If Ariel had a subsurface ocean, it definitely does imply that other small icy moons could also have [had] subsurface oceans,” said Caleb Strom, who conducted this research as a planetary geologist fellow at the University of North Dakota in Grand Forks.

Maybe “it’s easier to make an ocean world than we thought,” he added.

An Unlikely Ocean World

Ariel is the second closest of the five large moons of Uranus. But large is a bit of a misnomer, as Ariel is only about 1,160 kilometers across, or about a third the size of Earth’s Moon.

When Voyager 2 flew through the Uranus system in 1986, scientists were surprised to see that Ariel’s icy surface was relatively young, was geologically complex, and showed some signs of cryovolcanism. Some features on the moon’s surface are similar to those seen on Europa, Enceladus, and Triton, three confirmed ocean worlds.

“We weren’t necessarily expecting it to be an ocean world.”

“What’s interesting about Ariel is that it’s unexpected,” Strom said. “We weren’t necessarily expecting it to be an ocean world.”

Later studies also found ammonia and carbon oxide compounds on Ariel’s surface, chemistry that often suggests the presence of subsurface liquid. The molecules disappear quickly unless they are frequently replenished.

But with Ariel being so small and unable to retain heat for very long, scientists thought that any subsurface ocean it may once have had was relatively thin and short-lived.

Strom and his colleagues didn’t initially set out to challenge this understanding of Ariel’s interior. They were interested in understanding the forces that could have created the moon’s geologic features.

To do this, the researchers first mapped the moon’s surface using images from the Voyager 2 flyby, cataloging ridges, fractures, and craters. They then modeled Ariel’s internal structure, giving it, from the top down, a brittle crust, a flexible crust, and an ocean all atop a solid core. They then simulated how that crust would deform under different levels of stress from tidal forces from other nearby Uranian moons and the planet itself. By varying the crust and ocean thickness and the strength of the tidal stress, the team sought to match the stress features in their models to the Voyager-derived geologic maps.

In 2023, the James Webb Space Telescope imaged Uranus and several of its major moons and rings. Credit: NASA, ESA, CSA, STScI; Image Processing: Joseph DePasquale (STScI)

The team’s models indicate that a crust less than 30 kilometers thick would have fractured under a moderate amount of tidal stress and created the geologic features seen today. The researchers suggest that the stress arose within the past 1–2 billion years, when an orbital resonance with the nearby moon Miranda stretched Ariel’s orbit about 4% from circular and fractured the surface.

“This is really a prediction about the crustal thickness” and the stress level it can withstand, Strom said. Then, with a core 740 kilometers across and a crust 30 kilometers thick, that would mean that Ariel’s subsurface ocean was 170 kilometers from top to bottom and made up about 55% of its total volume. The researchers published their results in Icarus in September.
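Those figures can be sanity-checked with spherical-shell arithmetic. Using the rounded values quoted here (a 1,160-kilometer moon, a 740-kilometer core, and a 30-kilometer crust) gives a slightly thicker ocean (~180 kilometers, ~59% of total volume) than the study’s 170 kilometers and 55%; the gap presumably reflects rounding and details of the published model, so the sketch below is illustrative only.

```python
R_moon = 1160 / 2  # Ariel's radius in km (diameter ~1,160 km)
r_core = 740 / 2   # core radius in km (diameter ~740 km)
crust_km = 30      # modeled crust thickness in km

r_ocean_top = R_moon - crust_km
ocean_thickness_km = r_ocean_top - r_core
ocean_fraction = (r_ocean_top**3 - r_core**3) / R_moon**3

print(f"ocean ~{ocean_thickness_km:.0f} km thick, "
      f"~{ocean_fraction:.0%} of Ariel's volume")
```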

Is Ariel Odd? Maybe Not

“The possible presence of an ocean in Ariel in the past [roughly] 1 Ga is certainly an exciting prospect,” said Richard Cartwright, an ocean world scientist at Johns Hopkins Applied Physics Laboratory (JHUAPL) in Laurel, Md. “These results track with other studies that suggest the surface geology of Ariel offers key clues in terms of recent activity” and the possibility that Ariel is, or was, an ocean world. Cartwright was not involved with the new research.

Strom cautioned that just because Ariel once had a substantial subsurface ocean doesn’t mean that it still does. The moon is very small and doesn’t retain heat very well, he said. Any ocean that remained would likely be much thinner and probably not a good place to search for life.

However, the fact that tiny Ariel may once have had such a large ocean may mean that ocean worlds are more common and easier to create than scientists once thought. Understanding the conditions that led to Ariel’s subsurface ocean could help scientists better understand how such worlds come about and how they evolve.

“Ariel’s case demonstrates that even comparatively sized moons can, under the right conditions, develop and sustain significant internal oceans.”

“Ariel’s case demonstrates that even comparatively sized moons can, under the right conditions, develop and sustain significant internal oceans,” said Chloe Beddingfield, a planetary scientist also at JHUAPL. “However, that doesn’t mean all similar bodies would have done so. Each moon’s potential for an ocean depends on its particular mix of heat sources, chemistry, and orbital evolution.”

An ocean composing 55% of a planet’s or moon’s total volume might seem pretty huge, but it also might be perfectly within normal range for ocean worlds, added Beddingfield, who was not involved with this research. “The estimated thickness of Ariel’s internal ocean…is striking, but not necessarily unexpected given the diversity of icy satellites.”

Moreover, Voyager 2 did not image all of Ariel’s surface, only the 35% that was illuminated during its flyby. A future long-term mission to the Uranus system could provide higher-resolution global maps of Ariel and other moons to help refine calculations of crustal thickness and determine the existence of subsurface oceans, Strom said.

Strom and his team plan to expand their stress test research to other moons of Uranus such as Miranda, Oberon, and Umbriel and possibly icy moons around other planets.

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

Citation: Cartier, K. M. S. (2025), Tiny Uranian moon likely had a massive subsurface ocean, Eos, 106, https://doi.org/10.1029/2025EO250398. Published on 24 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

As the Arctic Warms, Soils Lose Key Nutrients

Fri, 10/24/2025 - 13:22

This is an authorized translation of an Eos article.

Arctic and subarctic soils store a considerable share of Earth’s carbon. However, rising temperatures could drain these soils of nitrogen, a key nutrient. According to a new study, that nitrogen loss could reduce plant growth, limiting the soils’ capacity to store carbon and amplifying global warming.

High-latitude soils store large amounts of carbon because low temperatures slow microbial activity. Although plants produce organic matter through photosynthesis, microorganisms cannot consume it fast enough, so it accumulates over time. Scientists have worried that a warmer Arctic would accelerate microbial activity, releasing the stored carbon into the atmosphere as carbon dioxide (CO₂). But they also expected warmer temperatures to stimulate plant growth, which would reabsorb part of that carbon and partially offset the emissions.

The new research shows that the latter scenario is highly unlikely, because warming causes soils to lose nitrogen, a loss that could inhibit plant growth.

“We did not expect to see a loss of nitrogen.”

The findings come from a decade-long experiment in a subarctic grassland near Hveragerði, Iceland. In 2008, a powerful earthquake altered geothermal water flows in the region, turning previously normal plots of soil into naturally heated zones with temperature gradients ranging from 0.5°C to 40°C above prior levels. The event created a unique natural laboratory for observing how ecosystems respond to long-term warming.

Using stable nitrogen-15 isotopes to trace nutrient flows across the landscape, the researchers found that for every degree Celsius of warming, soils lose between 1.7% and 2.6% of their nitrogen. The largest losses occurred during winter and early spring, when microbes remained active but plants were dormant. During this period, nitrogen compounds such as ammonium and nitrate were released into the soil, but plants could not absorb them; the compounds were lost either by leaching into groundwater or by escaping to the atmosphere as nitrous oxide, a greenhouse gas nearly 300 times more potent than CO₂.
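Taken at face value, a loss of 1.7%–2.6% per degree Celsius compounds across several degrees of warming. A minimal sketch of that arithmetic (a compounding reading chosen for illustration, not the study’s model):

```python
def nitrogen_remaining(warming_C, loss_per_degree):
    """Fraction of soil nitrogen remaining if each degree of warming
    removes a fixed fraction of what is left (illustrative compounding
    reading of the study's 1.7%-2.6% per-degree losses)."""
    return (1 - loss_per_degree) ** warming_C

# Four degrees of warming at the low and high ends of the reported range:
remaining_low = nitrogen_remaining(4, 0.017)   # ~93% of nitrogen remains
remaining_high = nitrogen_remaining(4, 0.026)  # ~90% of nitrogen remains
```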

The results were published in Global Change Biology.

“We did not expect to see a loss of nitrogen,” said Sara Marañón, a soil scientist at the Center for Ecological Research and Forestry Applications in Spain and first author of the study. “The soil’s mechanisms for storing nitrogen are breaking down.”

A Less Fertile, Faster Ecosystem

The researchers also found that warming weakened the mechanisms that help soils retain nitrogen. In the warmest plots, microbial biomass and fine-root density, both critical for nitrogen storage, were much lower than in cooler plots. Although microbes were less abundant, their metabolism was faster, releasing more CO₂ per unit of biomass. Meanwhile, plants struggled to adapt, lagging behind in both growth and nutrient uptake.

“Microbial communities are able to adapt and reach a new equilibrium with faster activity rates,” Marañón said. “But plants cannot keep up.”

“This is not a very optimistic message.”

The increase in microbial metabolism initially results in greater consumption of the soil’s available nitrogen and carbon. After 5 to 10 years, however, the system appears to reach a new equilibrium, with reduced levels of organic matter and lower fertility. That shift suggests that soil warming can drive a transition to a permanently less fertile state, making it harder for vegetation to recover and leading to an irreversible loss of carbon.

Traditionally, scientists have thought that because organic matter decomposes faster in a warmer climate, the nitrogen it contains becomes more available, leading to higher productivity, according to Erik Verbruggen, a soil ecologist at the University of Antwerp in Belgium who was not involved in the study. “This paper shows that this is actually not happening.”

Instead, nitrogen is being leached from the soil during spring, making it inaccessible for further biomass production. “This is not a very optimistic message,” Verbruggen said.

An Underestimated Source of Greenhouse Gases

Because Arctic regions are warming faster than the global average, this disruption of the nutrient cycle could soon become more pronounced. The loss of nitrogen and carbon from soils in cold regions may represent a significant and previously underestimated source of greenhouse gas emissions, one that current climate models have yet to fully incorporate.

The researchers returned periodically to the heated grasslands near Hveragerði, Iceland, to measure nitrogen. Credit: Sara Marañón.

The researchers plan to explore the early phases of soil warming by transplanting fragments of normal soils into heated areas, and to investigate how different soil types respond to heat. Marañón noted that the Icelandic soils studied are volcanic in origin and very rich in minerals, unlike the organic peat soils common in other Arctic regions.

“Arctic soils also include permafrost in places like northern Russia and parts of Scandinavia, and those are the largest soil carbon reservoirs in the world,” Verbruggen said. The soils analyzed in this research, by contrast, were shallow grassland soils. “They are not necessarily representative of all Arctic soils.”

Still, Verbruggen added, the study’s findings highlight the delicate balance between productivity and nutrient loss in these systems.

Soil’s abundant carbon stores make it a major risk if managed poorly, Marañón said. “But it can also become a potential ally and offset CO₂ emissions.”

—Javier Barbuzano (@javibar.bsky.social), Science Writer

This translation by Saúl A. Villafañe-Barajas (@villafanne) was made possible by a partnership with Planeteando and Geolatinas.

Text © 2025. The authors. CC BY-NC-ND 3.0

A Better Way to Monitor Greenhouse Gases

Fri, 10/24/2025 - 13:21

In recent years, the international community has made progress in slowing increases in the rate of carbon dioxide emissions and in acknowledging the scale of methane leaks from oil and gas facilities. However, carbon dioxide emissions continue to rise, methane releases from the energy sector have not abated, and there is more need than ever for targeted and sustained greenhouse gas (GHG) emissions reductions and other climate change mitigation approaches.

The success of climate change mitigation approaches relies in part on having accurate, timely, and integrated carbon cycle data from surface, airborne, and satellite sensors.

The success of such actions relies in part on having accurate, timely, and integrated carbon cycle data from surface, airborne, and satellite sensors covering local, regional, and international scales. These data improve efforts to track emissions reductions, identify and mitigate unexpected emissions and leaks, and monitor ecosystem feedbacks to inform land management.

In September 2024, researchers in the carbon cycle monitoring community met to discuss how best to establish a more effective system for monitoring GHGs and to help accelerate climate action through better data and decision support.

Here we highlight issues and challenges facing emissions monitoring and documentation efforts illuminated during the September meeting, as well as ideas and proposals for tackling the challenges. The recommendations emphasize the urgency of enhanced monitoring to support the goals of the Paris Agreement and the Global Methane Pledge, particularly in the face of increasing climate extremes and the vulnerability of Earth’s natural carbon reservoirs [Friedlingstein et al., 2025].

Bottom-Up Meets Top-Down

Parties to the Paris Agreement track their progress toward meeting GHG emissions reduction targets through bottom-up accounting methods that track carbon using local ground-based observations. These methods combine information about the spatial extents of carbon sources and sinks with estimates of how much these sources and sinks emit or take up, respectively.

This inventorying approach offers high-precision information at time intervals that support long-term tracking. However, it is also often time intensive, depends on country-specific methodologies, may not accurately reflect spatiotemporal variability in GHG fluxes, and is not suited for operational monitoring of sudden changes or reversals [Elguindi et al., 2020; Nicholls et al., 2015].
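At its core, bottom-up inventorying multiplies activity data by emission factors and sums over source categories. A minimal sketch of that bookkeeping, with category names, activity levels, and factor values all invented for illustration:

```python
# Hypothetical activity data paired with emission factors
# (tonnes CO2-equivalent per unit of activity).
inventory = {
    "electricity_MWh": (120_000, 0.4),       # (activity, emission factor)
    "road_fuel_litres": (9_000_000, 0.0024),
    "cattle_head": (50_000, 2.3),
}

# Total emissions = sum of activity x emission factor over all categories.
total_tco2e = sum(activity * factor for activity, factor in inventory.values())
print(f"bottom-up total: {total_tco2e:,.0f} tCO2e")
```

The precision of such an inventory hinges on the emission factors, which is why country-specific methodologies and slow update cycles limit its use for operational monitoring.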

Top-down approaches using remotely sensed atmospheric GHG and biomass observations offer an independent accounting method [Friedlingstein et al., 2025], with the potential for low-latency (weekly to monthly) monitoring of GHG emissions and removals. Technological advances offered by facility-scale plume imagers (e.g., GHGSat, Earth Surface Mineral Dust Source Investigation (EMIT), Carbon Mapper) and global GHG mappers (e.g., Orbiting Carbon Observatory-2 and -3 (OCO-2 and -3), Tropospheric Monitoring Instrument (TROPOMI), Greenhouse gases Observing Satellite-2 (GOSAT-2)) show promise for monitoring GHG fluxes at the local and global scale, respectively [Joint CEOS-CGMS Working Group on Climate Greenhouse Gas Task Team, 2024].

Greenhouse gas (GHG) observations with existing capabilities alone are insufficient for adequately informing climate change mitigation measures.

However, a significant gap remains in our ability to monitor weaker, spatially distributed emissions and removals at intermediate (10- to 1,000-kilometer) scales [Joint CEOS-CGMS Working Group on Climate Greenhouse Gas Task Team, 2024], particularly in systems managed by humans such as energy production and land use.

Conversations during the 2024 workshop—partly intended to inform the development of the next Decadal Survey for Earth Science and Applications from Space—highlighted limitations in current GHG monitoring capabilities. They also emphasized the critical need for an operational observing system that leverages top-down and bottom-up approaches to support climate action at local, national, and international scales.

Because of a lack of sensitivity to subregional processes, GHG observations with existing capabilities alone are insufficient for adequately informing climate change mitigation measures [e.g., Jacob et al., 2022; Watine-Guiu et al., 2023]. We must also integrate state-of-the-art science and improved understanding of Earth’s changing carbon cycle, as well as data from new observing system technologies, into the information provided to decisionmakers.

This integration requires identifying gaps and opportunities with respect to knowledge, data, and stakeholder needs. It also requires defining a vision for sustained, operational GHG monitoring to support emissions reductions, track carbon cycle feedbacks, and deliver reliable, timely, transparent, and actionable information.

This vision could be achieved with a unified multitiered global system combining models and observations of the atmosphere, land, and ocean collected with surface, airborne, and satellite tools to track carbon fluxes (e.g., atmospheric emissions and removals) and stocks (e.g., biomass, soil carbon) with improved frequency, spatial coverage, and precision (Figure 1).

Fig. 1. An effective multitiered greenhouse gas (GHG) observing system should integrate observations of the atmosphere, land, and ocean from sensors and samples on Earth’s surface, in the air, and aboard satellites. Carbon dioxide is shown as black and red molecules, and methane is shown as black and white molecules. ARGO refers to a fleet of sensors floating in the upper ocean. FTIR is Fourier transform infrared spectroscopy. Credit: Created in BioRender; Carroll, 2025, https://BioRender.com/b77439n

Organizing such a system would require substantial international coordination among governmental, academic, and nongovernmental organizations, perhaps mediated through entities such as the World Meteorological Organization’s Global Greenhouse Gas Watch, the Committee on Earth Observation Satellites, and the U.S. Greenhouse Gas Center (USGHGC).

Addressing Gaps from Space

A globally unified GHG observing system should capitalize on spaceborne technologies to fill spatial and temporal gaps in in situ networks and to monitor the responses of carbon fluxes and stocks to disturbances, weather extremes, and environmental change. This system should prioritize four key elements.

First, gathering more vertically detailed data—from the top of the atmosphere to ground level—is critical. Existing satellites measure the total amounts of carbon dioxide and methane in the atmospheric column. These measurements work well for detecting changes over large (e.g., continental) spatial scales and at facility scale, but they provide less detail about smaller-scale processes. Knowing GHG concentrations near the surface relative to those in the upper atmosphere could, for example, provide improved tracking of fluxes and understanding of the processes responsible.

Sustained vertical GHG profiling, achieved using multichannel passive sensors deployed on missions such as GOSAT-2 or emerging cloud-slicing lidar methods, for example, is foundational to the proposed system. This profiling would provide long-term time series data to help researchers detect weak but consistent flux changes and increased sensitivity to natural and anthropogenic regional sources [e.g., Parazoo et al., 2016].

Sampling the atmosphere every day would enable better detection of sudden changes in GHG concentrations and linking of those changes to particular sources.

Second, more frequent observations—obtained with a constellation of satellites observing from low, geostationary, and highly elliptical Earth orbits—are needed. Sampling the atmosphere every day, or even multiple times per day, would enable better detection of sudden changes in GHG concentrations and linking of those changes to particular sources.

Third, mapping of carbon stocks should be harmonized by combining information from different sensors and methods. Several means exist to map carbon in vegetation from space, for example, including lidar altimetry used to identify treetops and synthetic aperture radar used to estimate the volumes of trees.

Combining the strengths of existing methods and missions would facilitate more accurate and better resolved monitoring of carbon accumulation and loss due to management practices, disturbances, and ecosystem recovery. Future biomass satellite missions should focus on measurements at the scale of forest plots (i.e., hectare-scale systems with many trees) to provide more useful maps with reduced uncertainty, rather than on applying very high resolution sensors that resolve individual trees.

The fourth key is expanded satellite coverage of tropical, high-latitude, and oceanic regions to better monitor carbon cycle feedbacks [Sellers et al., 2018]. This coverage should involve the use of new active and imaging spectrometer techniques, such as those being developed in the Carbon-I mission concept study, to probe through prevalent clouds and darkness that hinder continuous monitoring.

Beyond the primary focus on GHG and biomass data, we also need—and have opportunities to obtain—complementary datasets to better constrain the locations of and processes affecting carbon sources and sinks. Atmospheric measurements of solar-induced fluorescence by vegetation, carbonyl sulfide, oxygen, carbon monoxide, and isotopes of carbon and oxygen could help disentangle fossil sources of emissions from biological sources and provide insights into processes such as photosynthesis and wildfire activity.

Currently, land and ocean ecosystems remove about half of the anthropogenic carbon emitted into the atmosphere, but this amount could change in the future [Friedlingstein et al., 2025]. Sustained monitoring of these ecosystems—and of the indicators of how they are changing—is necessary to understand and track diverse change across the Earth system.

Addressing Gaps from the Ground

Surface and airborne observations are essential for calibrating spaceborne measurements and for monitoring processes that can’t be observed from space.

Expanded surface and airborne networks for gathering data in situ from oceanic, terrestrial, and aquatic ecosystems are also a critical part of the proposed global observing system. These observations are essential for calibrating spaceborne measurements, for improving our understanding of undersampled regions (e.g., nonforest lands, rivers, wetlands, oceans), and for monitoring processes that can’t be observed from space.

Efforts on several fronts are required to provide more comprehensive ground- and air-based information on carbon fluxes and stocks to better meet stakeholder and research needs. Examples of these needed efforts include obtaining more atmospheric GHG profiles from research and commercial aircraft (e.g., through campaigns such as NOAA’s National Observations of Greenhouse Gasses Aircraft Profiles program), expanding measurements of surface-atmosphere GHG exchanges from tower-mounted sensors in undersampled terrestrial and aquatic systems [Baldocchi, 2020], and collecting seawater composition data from autonomous vehicles (e.g., Argo floats) in coastal and open oceans.

Other needed efforts include collecting more in situ measurements of above- and below-ground biomass and soil carbon and airborne sampling of managed and unmanaged (natural) experimental field sites. For example, monitoring of biomass reference measurement networks, such as GEO-TREES, should be expanded to facilitate monitoring and validation of spaceborne biomass data. These complementary measurements of quantities unobserved by remote sensing, such as soil carbon and respiration, are essential for tracking long-term storage [e.g., Konings et al., 2019].

Connecting Users to Data

Workshop participants envisioned a framework to support decisionmaking by scientists and stakeholders that links observing systems with actionable knowledge through a two-way flow of information. This framework involves three key pieces.

Identifying the underlying causes and drivers of changes in GHG emissions and removals is critical for developing effective, targeted mitigation and management policies.

First, integrating information from data-constrained models is crucial. Guan et al. [2023] offered a “system of systems” approach for monitoring agricultural carbon that is also applicable to other ecosystems. This approach leverages multitiered GHG and biomass data as constraints in land, ocean, and inverse models (which start with observed effects and work to determine their causes) to generate multiscale maps of observable and unobservable carbon stock and flux change. The result is a stream of continuous, low-latency information (having minimal delays between information gathering and output) for verifying GHG mitigation strategies.

Second, scientists must work with stakeholders to identify the underlying causes and drivers of changes in GHG emissions and removals. This identification is critical for assessing progress and developing effective, targeted mitigation and management policies.

Third, the actionable knowledge resulting from this framework—and provided through organizations such as the USGHGC—must be applied in practice. Stakeholders, including corporations, regulatory agencies, and policymakers at all levels of government, should use improved understanding of carbon flux change and underlying drivers to track progress toward nationally determined contributions, inform carbon markets, and evaluate near- and long-term GHG mitigation strategies.

Meeting the Needs of the Future

Benchmarking and validation are important parts of building trust in models and improving projections of carbon-climate feedbacks. By using comprehensive observations of carbon fluxes and stocks to assess the performance of Earth system models [e.g., Giorgetta et al., 2013], scientists can generate more reliable predictions to inform climate action policies that, for example, adjust carbon neutrality targets or further augment GHG observing systems to better study regional feedbacks [Ciais et al., 2014].

The globally unified observing system envisioned, which would integrate advanced spaceborne technologies with expanded ground and air networks and a robust decision support framework, could significantly enhance our ability to track and mitigate GHG emissions and manage carbon stocks.

Successful implementation of this system would also hinge on data accessibility and community building. Developing a universal data platform with a straightforward interface that prioritizes data literacy is crucial for ensuring accessibility for a global community of users. In addition, fostering cross-agency partnerships and engagement and collaborative networking opportunities among stakeholders will be essential for building trust, catalyzing further participation in science, and developing innovative solutions for a more sustainable future.

Acknowledgments

The September 2024 workshop and work by the authors on this article were funded as an unsolicited proposal (Proposal #226264: In support of ‘Carbon Stocks Workshop: Sep 23–25, 2024’) by the U.S. Greenhouse Gas Center, Earth Science Division, NASA. A portion of this research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration (80NM0018D0004).

References

Baldocchi, D. D. (2020), How eddy covariance flux measurements have contributed to our understanding of global change biology, Global Change Biol., 26(1), 242–260, https://doi.org/10.1111/gcb.14807.

Ciais, P., et al. (2014), Current systematic carbon-cycle observations and the need for implementing a policy-relevant carbon observing system, Biogeosciences, 11(13), 3,547–3,602, https://doi.org/10.5194/bg-11-3547-2014.

Elguindi, N., et al. (2020), Intercomparison of magnitudes and trends in anthropogenic surface emissions from bottom-up inventories, top-down estimates, and emission scenarios, Earth’s Future, 8(8), e2020EF001520, https://doi.org/10.1029/2020EF001520.

Friedlingstein, P., et al. (2025), Global Carbon Budget 2024, Earth Syst. Sci. Data, 17(3), 965–1,039, https://doi.org/10.5194/essd-17-965-2025.

Giorgetta, M. A., et al. (2013), Climate and carbon cycle changes from 1850 to 2100 in MPI‐ESM simulations for the Coupled Model Intercomparison Project Phase 5, J. Adv. Model. Earth Syst., 5(3), 572–597, https://doi.org/10.1002/jame.20038.

Guan, K., et al. (2023), A scalable framework for quantifying field-level agricultural carbon outcomes, Earth Sci. Rev., 243, 104462, https://doi.org/10.1016/j.earscirev.2023.104462.

Jacob, D. J., et al. (2022), Quantifying methane emissions from the global scale down to point sources using satellite observations of atmospheric methane, Atmos. Chem. Phys., 22(14), 9,617–9,646, https://doi.org/10.5194/acp-22-9617-2022.

Joint CEOS-CGMS Working Group on Climate Greenhouse Gas Task Team (2024), Roadmap for a coordinated implementation of carbon dioxide and methane monitoring from space, 52 pp., ceos.org/document_management/Publications/Publications-and-Key-Documents/Atmosphere/CEOS_CGMS_GHG_Roadmap_Issue_2_V1.0_FINAL.pdf.

Konings, A. G., et al. (2019), Global satellite-driven estimates of heterotrophic respiration, Biogeosciences, 16(11), 2,269–2,284, https://doi.org/10.5194/bg-16-2269-2019.

Nicholls, D., et al. (2015), Top-down and bottom-up approaches to greenhouse gas inventory methods—A comparison between national- and forest-scale reporting methods, Gen. Tech. Rep. PNW-GTR-906, 30 pp., Pac. Northwest Res. Stn., For. Serv., U.S. Dep. of Agric., Portland, Ore., https://doi.org/10.2737/PNW-GTR-906.

Parazoo, N. C., et al. (2016), Detecting regional patterns of changing CO2 flux in Alaska, Proc. Natl. Acad. Sci. U. S. A., 113(28), 7,733–7,738, https://doi.org/10.1073/pnas.1601085113.

Sellers, P. J., et al. (2018), Observing carbon cycle–climate feedbacks from space, Proc. Natl. Acad. Sci. U. S. A., 115(31), 7,860–7,868, https://doi.org/10.1073/pnas.1716613115.

Watine-Guiu, M., et al. (2023), Geostationary satellite observations of extreme and transient methane emissions from oil and gas infrastructure, Proc. Natl. Acad. Sci. U. S. A., 120(52), e2310797120, https://doi.org/10.1073/pnas.2310797120.

Author Information

Dustin Carroll (dustin.carroll@sjsu.edu), Moss Landing Marine Laboratories, San José State University, San José, Calif.; also at Jet Propulsion Laboratory, California Institute of Technology, Pasadena; Nick Parazoo and Hannah Nesser, Jet Propulsion Laboratory, California Institute of Technology, Pasadena; Yinon Bar-On, California Institute of Technology, Pasadena; also at Department of Earth and Planetary Sciences, Weizmann Institute of Science, Rehovot, Israel; and Zoe Pierrat, Jet Propulsion Laboratory, California Institute of Technology, Pasadena

Citation: Carroll, D., N. Parazoo, H. Nesser, Y. Bar-On, and Z. Pierrat (2025), A better way to monitor greenhouse gases, Eos, 106, https://doi.org/10.1029/2025EO250395. Published on 24 October 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

1.5 Million Acres of Alaskan Wildlife Refuge to Open for Drilling

Thu, 10/23/2025 - 21:54
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

A large swath of the Arctic National Wildlife Refuge (ANWR) will soon open for drilling, the Trump administration announced today.

“For too long, many politicians and policymakers in DC treated Alaska like it was some kind of zoo or reserve, and that, somehow, by not empowering the people or having even the slightest ability to tap into the vast resources was somehow good for the country or good for Alaska,” Secretary of the Interior Doug Burgum said during an Alaska Day event.

As of July 2025, Alaska ranked sixth in the nation for crude oil production.


The news is the latest in a saga involving the ANWR, which in total spans 19.6 million acres. The 1.5 million acres to be opened for drilling represent the coastal plain of the refuge.

The 1980 Alaska National Interest Lands Conservation Act, which created most of the state’s national park lands, included a provision that no exploratory drilling or production could occur without congressional action.

Trump first opened the 1.5 million-acre coastal plain region for drilling in 2020, but the sale of drilling leases in early 2021 generated just $14.4 million in bids, rather than the $1.8 billion his administration had estimated.

On his first day in office, Biden placed a temporary moratorium on oil and gas drilling in the refuge, later going on to cancel the existing leases.

Trump resumed his efforts to allow drilling in ANWR early in his second term, though in January 2025, a lease sale attracted zero bidders. Previously, major banks had ruled out financing such drilling efforts, some citing environmental concerns. Cost is also likely a factor, as the area currently has no roads or facilities.

In addition to opening the refuge to drilling, the Department of the Interior also announced today that it is reissuing permits to build a road through Izembek National Wildlife Refuge and plans to greenlight another road.

“Today’s Arctic Refuge announcement puts America — and Alaska — last,” said Erik Grafe, an attorney for the environmental law nonprofit Earthjustice, in a statement. “The Gwich’in people, most Americans, and even major banks and insurance companies know the Arctic Refuge is no place to drill.”

In contrast, Voice of the Arctic Iñupiat (VOICE), a nonprofit dedicated “to preserving and advancing North Slope Iñupiat cultural and economic self-determination,” released a statement on Thursday in favor of the policy shift.

“Developing ANWR’s Coastal Plain is vital for Kaktovik’s future,” said Nathan Gordon, Jr., mayor of Kaktovik, an Iñupiat village on the northern edge of ANWR. “Taxation of development infrastructure in our region funds essential services across the North Slope, including water and sewer systems to clinics, roads, and first responders. Today’s actions by the federal government create the conditions for these services to remain available and for continued progress for our communities.”

The Department of the Interior said it plans to reinstate the 2021 leases that were cancelled by the Biden administration, as well as to hold a new lease sale sometime this winter.

—Emily Gardner (@emfurd.bsky.social) Associate Editor

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0

New Satellite Data Reveal a Shift in Earth’s Once-Balanced Energy System

Thu, 10/23/2025 - 13:22

Years ago, scientists noted something odd: Earth’s Northern and Southern Hemispheres reflect nearly the same amount of sunlight back into space. The symmetry is odd because the Northern Hemisphere has more land, cities, pollution, and industrial aerosols, all of which should raise its albedo, reflecting more sunlight than it absorbs. The Southern Hemisphere, by contrast, is mostly ocean, which is darker and absorbs more sunlight.

New satellite data, however, suggest that symmetry is breaking.

From Balance to Imbalance

In a new study published in the Proceedings of the National Academy of Sciences of the United States of America, Norman Loeb, a climate scientist at NASA’s Langley Research Center, and colleagues analyzed 24 years of observations from NASA’s Clouds and the Earth’s Radiant Energy System (CERES) mission.

They found that the Northern Hemisphere is darkening faster than the Southern Hemisphere. In other words, it’s absorbing more sunlight. That shift may alter weather patterns, rainfall, and the planet’s overall climate in the decades ahead.

Since 2000, CERES has recorded how much sunlight is absorbed and reflected, as well as how much infrared (longwave) radiation escapes back to space. Loeb used these measurements to analyze how Earth’s energy balance changed between 2001 and 2024. The energy balance tells scientists whether the planet is absorbing more energy than it releases and how that difference varies between hemispheres.

“Any object in the universe has a way to maintain equilibrium by receiving energy and giving off energy. That’s the fundamental law governing everything in the universe,” said Zhanqing Li, a climate scientist at the University of Maryland who was not part of the study. “The Earth maintains equilibrium by exchanging energy between the Sun and the Earth’s emitted longwave radiation.”

The team found that the Northern Hemisphere is absorbing about 0.34 watt more solar energy per square meter per decade than the Southern Hemisphere. “This difference doesn’t sound like much, but over the whole planet, that’s a huge number,” said Li.
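To see why 0.34 watt per square meter is "a huge number," it helps to integrate it over a hemisphere. The short sketch below is a back-of-the-envelope illustration only; the Earth radius and the wedge of simple arithmetic are assumptions for scale, not figures from the study.

```python
# Back-of-the-envelope scale check (illustration only): how much extra
# power a 0.34 W/m^2 difference represents when summed over one
# hemisphere's surface area. Earth radius is an assumed round value.
import math

EARTH_RADIUS_M = 6.371e6                      # mean Earth radius (assumed)
hemisphere_area_m2 = 2 * math.pi * EARTH_RADIUS_M**2

trend_w_per_m2 = 0.34                         # reported hemispheric difference
extra_power_w = trend_w_per_m2 * hemisphere_area_m2

print(f"Hemisphere surface area: {hemisphere_area_m2:.2e} m^2")
print(f"Extra absorbed power: {extra_power_w:.2e} W")
# ≈ 8.7e13 W, i.e., on the order of tens of terawatts
```

For comparison, that is thousands of times larger than total human energy consumption, which is why a fraction of a watt per square meter matters at planetary scale.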

Results pointed to three main reasons for the Northern Hemisphere darkening: melting snow and ice, declining air pollution, and rising water vapor.

To figure out what was driving this imbalance, the scientists applied a technique called partial radiative perturbation (PRP) analysis. The PRP method separates the influence of factors such as clouds, aerosols, surface brightness, and water vapor from calculations of how much sunlight each hemisphere absorbs.

The results pointed to three main reasons for the Northern Hemisphere darkening: melting snow and ice, declining air pollution, and rising water vapor.

“It made a lot of sense,” Loeb said. “The Northern Hemisphere’s surface is getting darker because snow and ice are melting. That exposes the land and ocean underneath. And pollution has gone down in places like China, the U.S., and Europe. It means there are fewer aerosols in the air to reflect sunlight. In the Southern Hemisphere, it’s the opposite.”

“Because the north is warming faster, it also holds more water vapor,” Loeb continued. “Water vapor doesn’t reflect sunlight, it absorbs it. That’s another reason the Northern Hemisphere is taking in more heat.”

Curiosity About Cloud Cover

One of the study’s interesting findings is what didn’t change over the past 20 years: cloud cover.

“The clouds are a puzzle to me because of this hemispheric symmetry,” Loeb said. “We kind of questioned whether this was a fundamental property of the climate system. If it were, the clouds should compensate. You should see more cloud reflection in the Northern Hemisphere relative to the Southern Hemisphere, but we weren’t seeing that.”

Loeb worked with models to understand these clouds.

“We are unsure about the clouds,” said Loeb.

“Understanding aerosol and cloud interactions is still a major challenge,” agreed Li. “Clouds remain the dominant factor adjusting our energy balance,” he said. “It’s very important.”

Still, Li said that “Dr. Norman Loeb’s study shows that not only does [the asymmetry] exist, but it’s important enough to worry about what’s behind it.”

Loeb is “excited about the new climate models coming out soon” and how they will further his work. “It’ll be interesting to revisit this question with the latest and greatest models.”

—Larissa G. Capella (@CapellaLarissa), Science Writer

Citation: Capella, L. G. (2025), New satellite data reveal a shift in Earth’s once-balanced energy system, Eos, 106, https://doi.org/10.1029/2025EO250399. Published on 23 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Melting Cylinders of Ice Reveal an Iceberg’s Tipping Point

Thu, 10/23/2025 - 13:22

The titanic dangers icebergs pose to ships are well documented. Sometimes, however, icebergs themselves can capsize, creating earthquakes and tsunamis or even pushing entire glaciers backward. Most of those dramatic events occur right after the chunk of floating ice splits off from its source, but sometimes icebergs flip over in the open ocean.

Earlier lab experiments using simulated plastic icebergs showed that the energy released in capsize events can rival nuclear weapon blasts. But beyond a general understanding that capsizing is likely related to melting induced by ocean warming, why icebergs flip is a harder question to answer. Large variations in iceberg size and shape, along with slow drifting across wide distances, make studying icebergs expensive and challenging.

One solution: make miniature icebergs in the lab and watch them melt under controlled conditions.

“Understanding the mathematics and the physics of what’s going on at a base level is important in order to scale up.”

“We wanted to study the simplest capsize problem we could come up with,” said Bobae Johnson, a physicist and Ph.D. student at the Courant Institute at New York University. She and her colleagues simplified and standardized iceberg shape to a cylinder of pure water ice 8 centimeters in diameter and 24 centimeters long. In their article for Physical Review Fluids, they described how each cylinder flipped several times over the course of a 30-minute experiment.

“It is good to look at these things on smaller scales because even what we were doing in the simplest setting gave us something very complex,” Johnson said. “Understanding the mathematics and the physics of what’s going on at a base level is important in order to scale up.”

From their experiments, Johnson and her colleagues linked the different rates of ice melt above and below the waterline to dynamic changes in the shape of the iceberg—including the location of the center of mass, which makes them flip. Despite the small scale of the experiments, the implications could be enormous.

“Icebergs play a key role in the climate system,” said Sammie Buzzard, a glaciologist at the Centre for Polar Observation and Modelling and Northumbria University who was not involved in the experiments. “When they melt, they add fresh, cold water to the ocean, which can impact currents.”

Icebergs, Soda Pop, and Cheerios

Real-world icebergs range in size from about 15 meters to hundreds of kilometers across, rivaling the size of some small nations. Tolkienesque mountain-like structures (“iceberg” literally means “ice mountain”) split off from glaciers, whereas flat slablike icebergs tend to break off from ice sheets like those surrounding Antarctica.

“An iceberg’s shape determines how it floats in the water and which parts are submerged and which parts sit above the ocean’s surface,” Buzzard said, adding that icebergs change shape as they melt or erode via wind and wave action. But the precise manner of this change is uncertain because in situ measurements are challenging. “If this erosion changes the shape enough that the iceberg is no longer stable in the water, [the iceberg] can suddenly flip over into a position in which it is stable.”

“Even if lab experiments aren’t exactly the same as a natural system, they can go a long way to improving our understanding of [iceberg capsizing].”

Whatever their differences in shape and size, icebergs are fresh water floating on salt water, so they all share a similar property: roughly 10% of their mass sits above the water, with the remaining 90% beneath. That similarity provided the starting point for the cylindrical iceberg experiments performed by Johnson and her collaborators.
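The 90/10 split follows directly from Archimedes' principle: a floating body displaces its own weight in water, so the submerged volume fraction equals the ratio of the ice density to the seawater density. The snippet below is a minimal sketch using typical textbook density values (assumed here, not taken from the paper).

```python
# Archimedes' principle behind the "90% underwater" figure (a sketch;
# typical density values are assumed, not taken from the study).
RHO_ICE = 917.0        # kg/m^3, bubble-free fresh-water ice (assumed)
RHO_SEAWATER = 1025.0  # kg/m^3, typical surface seawater (assumed)

# Floating body displaces its own weight, so:
# submerged volume fraction = rho_ice / rho_seawater
submerged_fraction = RHO_ICE / RHO_SEAWATER
print(f"Below water: {submerged_fraction:.1%}")
print(f"Above water: {1 - submerged_fraction:.1%}")
# roughly 89.5% below and 10.5% above, the familiar ~90/10 split
```

Because real icebergs trap air bubbles, their bulk density (and thus the exact split) varies slightly, which is one reason the experimenters worked with bubble-free ice.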

A sphere or irregular body can rotate in many different directions, but a cylinder with a length greater than the diameter of its circular face floating in water will rotate along only one axis, effectively reducing the problem from three dimensions to two.

Standardizing the shape of the icebergs wasn’t the only simplification the team made. Under natural conditions, ice freezes from the outside in, which traps a lot of air. As icebergs melt, they sometimes release enough trapped air bubbles to make the surrounding water fizz like an opened can of soda pop. This effect can create chaotic motion in samples, so Johnson and collaborators opted to eliminate bubbles entirely in their experiment. To do so, they froze water in cylindrical molds suspended in extremely cold brine and stirred the water to drive residual air out—a process that took 24 to 48 hours for each cylinder.

This video depicts the flow of water beneath the surface of a melting model iceberg. Credit: New York University’s Applied Mathematics Laboratory

Finally, to keep the cylinders from drifting randomly in the ocean simulation tank, the researchers exploited the “Cheerios effect.” Floating cereal pieces tend to group together because of surface tension, so the team 3D printed pieces of flat plastic and coated them with wax. Placing those objects in the tank created a meniscus on either side of the cylinder, keeping it in place so the only motion it exhibited was the rotation they were looking for.

“The ice melts very slowly in the air and very quickly underwater,” Johnson said. In the experiment, that difference resulted in a gravitational instability as the center of mass shifted upward, making the whole cylinder flip. “Every time the ice locks into one position, it carves out a facet above the water and very sharp corners at the waterline, giving you a shape that looks quasi pentagonal about halfway through the experiment. We ran many, many experiments, and this happened across all of them.”

Buzzard emphasized the need for this sort of work. “Even if lab experiments aren’t exactly the same as a natural system, they can go a long way to improving our understanding of [iceberg capsizing],” she said. Every flip of a simulated iceberg could help us understand the effects on the warming ocean and the connection between small occurrences and global consequences.

—Matthew R. Francis (@BowlerHatScience.org), Science Writer

Citation: Francis, M. R. (2025), Melting cylinders of ice reveal an iceberg’s tipping point, Eos, 106, https://doi.org/10.1029/2025EO250390. Published on 23 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

How Plant-Fungi Friendships Are Changing

Wed, 10/22/2025 - 13:30
Source: Journal of Geophysical Research: Biogeosciences

Just as the human body contains a multitude of symbiotic microbial companions, most plant species also live alongside microbial friends. Among these companions are mycorrhizal fungi, which help plants gather water and nutrients—particularly nitrogen—from the soil. In exchange, plants provide mycorrhizal fungi with an average of 3% to 13% of the carbon they pull from the atmosphere through photosynthesis and sometimes as much as 50%.

This carbon donation to support mycorrhizal fungi can incur a significant carbon cost for plants. But few groups have investigated how environmental factors such as soil temperature and nitrogen levels influence the amount of carbon flowing from plants to mycorrhizal fungi and how this flow is likely to shift with climate change. To fill this gap, Shao et al. derived a model that they call Myco-CORPSE (Mycorrhizal Carbon, Organisms, Rhizosphere, and Protection in the Soil Environment) that illustrates how the environment influences interactions between plants and mycorrhizal fungi.

When the researchers fed data from more than 1,800 forest sites in the eastern United States into Myco-CORPSE, they obtained some familiar results and also made some new discoveries. The model echoed previous work in suggesting that increasing the abundance of soil nitrogen, for example, through fertilizer runoff, decreases the dependence of plants on mycorrhizal fungi and therefore reduces the amount of carbon plants allocate to their microbial counterparts. But in contrast to previous studies, these researchers found that rising soil temperatures similarly reduced the amount of nitrogen and carbon exchanged between fungi and plants. That’s because warmth accelerates the breakdown of organic material, which releases nitrogen. Increasing atmospheric carbon dioxide levels, on the other hand, will likely increase the reliance of plants on mycorrhizal fungi by increasing the growth rate of plants and therefore increasing their need for nutrients.

The Myco-CORPSE model also replicated observed patterns, showing that the two major kinds of mycorrhizal fungal species (arbuscular and ectomycorrhizal) behave differently: Arbuscular trees tend to donate less carbon to their associated fungi relative to how much ectomycorrhizal trees donate to theirs. The model also found that forests with a mix of both kinds of species typically accrue less carbon from plants than forests with less mycorrhizal diversity.

As forest managers navigate the many stresses that forests face today, promoting a diversity of mycorrhizal species within forests could optimize plant growth while minimizing the carbon diverted to mycorrhizal fungi, the researchers wrote. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1029/2025JG009198, 2025)

This article is part of the special collection Biogeosciences Leaders of Tomorrow: JGR: Biogeosciences Special Collection on Emerging Scientists.

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2025), How plant-fungi friendships are changing, Eos, 106, https://doi.org/10.1029/2025EO250397. Published on 22 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

An Asteroid Impact May Have Led to Flooding near the Grand Canyon

Wed, 10/22/2025 - 13:30

When it comes to famous holes in the ground, northern Arizona has two: Grand Canyon and Barringer Meteorite Crater.

New research now suggests that these famous depressions might, in fact, be linked—the impact that created the crater roughly 56,000 years ago might also have unleashed landslides in a canyon that’s part of Grand Canyon National Park today. Those landslides in turn likely dammed the Colorado River and temporarily created an 80-kilometer-long lake, the team proposed. The results were published in Geology.

Driftwood Then and Now

“These are two iconic features of Arizona.”

Karl Karlstrom, a geologist recently retired from the University of New Mexico, grew up in Flagstaff, Ariz. Grand Canyon and Barringer Meteorite Crater both were therefore in his proverbial backyard. “These are two iconic features of Arizona,” said Karlstrom.

Karlstrom’s father—also a geologist—used to regularly explore the caves that dot the walls of Grand Canyon and surrounding canyons. In 1970, he collected two pieces of driftwood from a cavern known as Stanton’s Cave. The mouth of Stanton’s Cave is more than 40 meters above the Colorado River, so finding driftwood in its recesses was unexpected. Routine flooding couldn’t have lofted woody detritus that high, said Karlstrom. “It would have required a flood 10 times bigger than any known flood over the last 2,000 years.”

The best radiocarbon dating available in the 1970s suggested that the driftwood was at least 35,000 years old. A colleague of the elder Karlstrom suggested that the driftwood had floated into Stanton’s Cave when an ancient landslide temporarily dammed the Colorado, raising water levels. The researchers even identified the likely site of the landslide—a wall of limestone in Nankoweap Canyon.

But what had set off that landslide in the first place? That’s the question that Karl Karlstrom and his colleagues sought to answer. In 2023, the researchers collected two additional samples of driftwood from another cave 5 kilometers downriver from Stanton’s Cave.

A “Striking” Coincidence

Modern radiocarbon dating of both the archival and newly collected driftwood samples yielded ages of roughly 56,000 years, with uncertainties of a few thousand years, for all samples. The team also dated sand collected from the second cave; it too had ages that, within the errors, were consistent with the sand having been emplaced 56,000 years ago.

The potential significance of that timing didn’t sink in until one of Karlstrom’s international collaborators took a road trip to nearby Barringer Meteorite Crater, also known as Meteor Crater. There, he learned that the crater is believed to have formed around 56,000 years ago.

That coincidence was striking, said Karlstrom, and it got the team thinking that perhaps these two famous landmarks of northern Arizona—Meteor Crater and Grand Canyon National Park—might be linked. The impact that created Meteor Crater has been estimated to have produced ground shaking equivalent to that of an M5.2–5.4 earthquake. At the 160-kilometer distance of Nankoweap Canyon, the purported site of the landsliding, that ground movement would have been attenuated to roughly M3.3–3.5.

It’s impossible to know for sure whether such movement could have dislodged the limestone boulders of Nankoweap Canyon, Karlstrom and his colleagues concede. That’s where future modeling work will come in, said Karlstrom. It’s important to remember that an asteroid impact likely produces a distinctly different shaking signature than an earthquake caused by slip on a fault, said Karlstrom. “Fault slip earthquakes release energy from several kilometers’ depth, whereas impacts may produce larger surface waves.”

But there’s good evidence that a cliff in Nankoweap Canyon did, indeed, let go, said Chris Baisan, a dendrochronologist at the Laboratory of Tree-Ring Research at the University of Arizona and a member of the research team. “There was an area where it looked like the canyon wall had collapsed across the river.”

An Ancient Lake

Using the heights above the Colorado where the driftwood and sand samples were collected, the team estimated that an ancient lake extended from Nankoweap Canyon nearly 80 kilometers upstream. At its deepest point, it would have measured roughly 90 meters. Such a feature likely persisted for several decades until the lake filled with sediment, allowing the river to overtop the dam and quickly erode it, the team concluded.
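As a rough consistency check on that geometry, a lake that backs up 80 kilometers behind a dam holding about 90 meters of water implies an average riverbed gradient of around 1 meter per kilometer. The sketch below is an illustration under a simple wedge-shaped-lake assumption, not the authors' reconstruction.

```python
# Rough consistency check on the paleolake geometry (illustration only;
# assumes a simple wedge-shaped lake, not the authors' reconstruction).
dam_depth_m = 90.0      # reported maximum depth at the landslide dam
lake_length_km = 80.0   # reported upstream extent of the lake

# Average riverbed slope needed for the water surface to pinch out
# at the lake's upstream end.
gradient_m_per_km = dam_depth_m / lake_length_km
print(f"Implied average gradient: {gradient_m_per_km:.2f} m/km")
# ≈ 1.1 m/km, a plausible slope for a large bedrock river
```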

“They’re certainly close, if not contemporaneous.”

The synchronicity in ages between the Meteor Crater impact and the evidence of a paleolake in Nankoweap Canyon is impressive, said John Spray, a planetary scientist at the University of New Brunswick in Canada not involved in the research. “They’re certainly close, if not contemporaneous.” And while it’s difficult to prove causation, the team’s assertion that an impact set landslides in motion in the area around Grand Canyon is convincing, he added. “I think the likelihood of it being responsible is very high.”

Karlstrom and his collaborators are continuing to collect more samples from caves in Grand Canyon National Park. So far, they’ve found additional evidence of material that dates to roughly 56,000 years ago, as well as even older samples. It seems that there might have been multiple generations of lakes in the Grand Canyon area, said Karlstrom. “The story is getting more complicated.”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), An asteroid impact may have led to flooding near the Grand Canyon, Eos, 106, https://doi.org/10.1029/2025EO250391. Published on 22 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Another landslide dam flood at the site of the Matai’an rock avalanche in Taiwan

Wed, 10/22/2025 - 06:59

Failure of the landslide debris from the Matai’an rock avalanche allowed another barrier lake to form. This breached on 21 October 2025, generating another damaging debris flow.

Newspapers in Taiwan are reporting that a new landslide barrier lake formed and then failed at the site of the giant Matai’an rock avalanche. The breach event apparently occurred at about 9 pm local time on 21 October 2025. The risk had been identified in advance and the downstream population had been evacuated successfully this time, so there are no reports of fatalities.

The Taipei Times has an image of the barrier lake that was released by the Hualien branch of the Forestry and Nature Conservation Agency:-

The Matai’an landslide barrier lakes prior to the failure of the lower one on 21 October 2025. Photo courtesy of the Hualien branch of the Forestry and Nature Conservation Agency via the Taipei Times.

There is also a video on Youtube from Focus Taiwan (CNA English News) that includes helicopter footage of the site, also provided by the Forestry and Nature Conservation Agency:-

This includes the following still:-

The lower Matai’an landslide barrier lake prior to the failure on 21 October 2025. Still from a video posted to Youtube by CNA English News – original footage courtesy of the Hualien branch of the Forestry and Nature Conservation Agency.

It appears to me that the barrier lake formed because of a large landslide in the debris from the original rock avalanche; note the dark coloured landslide scar on the left side of the image.

Loyal readers will remember that I highlighted that this could be an issue in my post on 3 October:-

“So, a very interesting question will now pertain to the stability of these slopes. How will they perform in conditions of intense rainfall and/or earthquake shaking? Is there the potential for a substantial slope failure on either side, allowing a new (enlarged) lake to form?”

“This will need active monitoring (InSAR may well be ideal). The potential problems associated with the Matai’an landslide are most certainly not over yet.”

There is a high probability that this will be a recurring issue in periods of heavy rainfall.

Meanwhile, keep a close eye on Tropical Storm Melissa, which is tracking slowly northwards in the Caribbean. Because it is moving so slowly, it could bring exceptionally high rainfall totals to Haiti and Jamaica. This one looks like a disaster in waiting at the moment.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

To Find Critical Minerals, Look to Plate Tectonics

Tue, 10/21/2025 - 13:31

For much of the 20th century, “petroleum politics” shaped international policy. In the 21st century, a new set of resources has taken center stage: critical minerals. Sourcing and extracting these minerals have become a priority for countries and communities around the world because they are used in everything from solar panels to cell phones to superconductors.

A new study suggests where prospectors can search for critical minerals: rifting sites left behind by the supercontinent Rodinia, which broke up in the Proterozoic, more than 800 million years ago.


“Unless it is grown, absolutely everything on the planet that we use as a manufactured good requires something that comes out of a mine,” said Chris Kirkland, a geologist at Curtin University in Australia and a coauthor of the new study, published last month in Geological Magazine. “To better find those resources, really, we need a better understanding of geology.”

Kirkland and his colleagues began by analyzing rocks unearthed by drilling companies in Western Australia. The slabs contain carbonatite, a “weird,” rare, and poorly understood kind of igneous rock formed in the mantle from magmas rich in carbonate minerals. As the magmas rise through Earth’s interior, they react with surrounding rocks, altering the chemical signatures that geologists typically use to trace a sample’s origins.

Carbonatites often host rare earth elements and other critical metals, such as niobium. Although niobium can be found in other rock types, carbonatites are the only ones offering it in amounts economically suitable for extraction. The Western Australia sites are home to more than 200 million metric tons of the metal.

The team “threw the whole kitchen sink of analytical techniques” at the carbonatites, explained Kirkland. The first step was to take a drill core sample and image its structure to see the broad geological ingredients inside. Then the researchers used lasers to sample individual grains and tease out the chemistry of their crystals.

The carbonatites contained zircon, apatite, and mica, all minerals with isotopes that decay at known rates and can tell researchers about the sample’s age and source. The researchers also analyzed the helium present in zircon: Because helium is volatile and escapes from hot rocks at depth but is retained once a crystal cools, it can help reveal when the rocks reached the shallow crust.
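The helium-in-zircon approach rests on a standard ingrowth relation: radiogenic helium-4 accumulates from the alpha decay of uranium-238, uranium-235, and thorium-232 once a crystal cools enough to retain it, so a measured He/U/Th budget can be inverted for an age. A minimal sketch of that inversion (the abundances below are hypothetical, not values from the study):

```python
import math

# Standard decay constants (1/yr)
L238, L235, L232 = 1.55125e-10, 9.8485e-10, 4.9475e-11

def he_produced(t, u238, u235, th232):
    """Radiogenic 4He (same units as the parent abundances) after t years.
    Each 238U decay chain yields 8 alphas, 235U yields 7, 232Th yields 6."""
    return (8 * u238 * (math.exp(L238 * t) - 1)
            + 7 * u235 * (math.exp(L235 * t) - 1)
            + 6 * th232 * (math.exp(L232 * t) - 1))

def he_age(he, u238, u235, th232, lo=0.0, hi=4.6e9, iters=200):
    """Solve he_produced(t) = he for t by bisection (monotonic in t)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if he_produced(mid, u238, u235, th232) < he:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Synthetic check: a grain that began retaining helium 100 Myr ago
# (hypothetical abundances in nmol/g; natural 235U/238U ~ 1/137.88)
u238, th232 = 2.0, 1.0
u235 = u238 / 137.88
he = he_produced(1.0e8, u238, u235, th232)
print(f"Recovered age: {he_age(he, u238, u235, th232) / 1e6:.1f} Myr")
```

Real (U-Th)/He dating also involves corrections (for example, for alpha ejection from grain rims) that this sketch omits.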

Written in Stone

The story written in the slabs is one tied to the long history of plate tectonics. The breakup of Rodinia began around 800 million years ago and continued for millions of years as hot, metal-enriched oozes of magma rose up from the mantle. Pressure from this rising rock helped split apart the supercontinent, and the metals encased in carbonatites breached the surface at once-stable blocks of continental crust called cratons.

Today, said Kirkland, tracking these “old fossil scars” where cratons split could reveal stores of minerals.

More than 200 million metric tons of niobium were recently identified in Australia’s Aileron Province, a likely result of the breakup of Rodinia. Credit: Dröllner et al., 2025, https://doi.org/10.1017/S0016756825100204

“Reconstructing a geologic history for one particular area on Earth is something that I think has potential to help us in better understanding these pretty poorly understood carbonatite systems globally,” said Montana State University geologist Zachary Murguía Burton, who was not involved with the paper.

Burton estimates that some 20% of the carbonatites on Earth contain economically attractive concentrations of critical minerals, although he noted that the rocks in the study experienced a unique confluence of local and regional geologic processes that might influence the minerals they contain.

In particular, the carbonatites analyzed in the new study revealed the source of recently discovered niobium deposits beneath central Australia. Niobium is a critical mineral used in lithium-ion batteries and to strengthen and lighten steel. Because 90% of today’s supply of niobium comes from a single operation in Brazil, finding additional deposits is a priority.

In addition to niobium, Kirkland said a geologic “recipe” similar to the one his team identified might work for finding gold.

The work is an important reminder of “how tiny minerals and clever dating techniques can not only solve deep-time geological puzzles, but also help guide the hunt for the critical metals we need,” Kirkland said.

—Hannah Richter (@hannah-richter.bsky.social), Science Writer

Citation: Richter, H. (2025), To find critical minerals, look to plate tectonics, Eos, 106, https://doi.org/10.1029/2025EO250393. Published on 21 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Seismic Anisotropy Reveals Deep-Mantle Dynamics

Tue, 10/21/2025 - 13:31
Source: Geochemistry, Geophysics, Geosystems

In some parts of Earth’s interior, seismic waves travel at different speeds depending on the direction in which they are moving through the rock. This property, known as seismic anisotropy, can offer important information about how the silicate rock of the mantle—particularly at the mantle’s lowermost depths—deforms. In contrast, areas through which seismic waves travel at the same speed regardless of direction are considered isotropic.

In the bottom 300 kilometers of the mantle, also known as the D’’ layer, anisotropy is potentially caused by mantle plumes or mantle flow interacting with the edges of large low-shear-velocity provinces: continent-sized, dense, hot BLOBs (big lower-mantle basal structures) at the base of the mantle, above the core. Many questions persist about the viscosity, movement, stability, and shape of the BLOBs, as well as about how they can be influenced by mantle plumes and subduction.

Roy et al. used ASPECT, 3D mantle convection modeling software, and ECOMAN, a mantle fabric simulation code, to examine the deep mantle. They tested five different mantle model configurations, adjusting the viscosity and density of the BLOBs. The goal was to see which configuration would most closely re-create the observed seismic anisotropy.

The researchers treated the BLOBs as regions with their own unique chemistry, which form from a 100-kilometer-thick layer at the bottom of the mantle. Their models simulated how mantle plumes formed over the past 250 million years, during which time events such as the breakup of Pangaea, the opening of the Atlantic, and the evolution of various subduction zones occurred.

The study suggests that the observed seismic anisotropy is best explained when the BLOBs are 2% denser and 100 times more viscous than the surrounding mantle; this configuration most closely matches anisotropy patterns in seismic data. Plumes form mainly at the edges of BLOBs, where strong deformation causes strong anisotropy. (Geochemistry, Geophysics, Geosystems, https://doi.org/10.1029/2025GC012510, 2025)

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2025), Seismic anisotropy reveals deep-mantle dynamics, Eos, 106, https://doi.org/10.1029/2025EO250392. Published on 21 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Alaska Awaits Response from FEMA in the Aftermath of Major Floods

Mon, 10/20/2025 - 16:45
Research & Developments is a blog for brief updates that provide context for the flurry of news that impacts science and scientists today.

Major floods in Alaska have caused the death of at least one person and displaced thousands more over the course of the last two weeks. Many of the displaced may not be able to return home for 18 months or longer, according to Alaska Gov. Mike Dunleavy.

Tropical Storm Halong formed in the northern Philippine Sea on 5 October and had become a Category 4 typhoon by 7 October. Though it was considered an ex-typhoon by the time it reached western Alaska, the storm brought wind speeds of up to 113 miles per hour (181 kilometers per hour), along with severe flooding across the Yukon Delta, Kuskokwim Delta, and Norton Sound.

 

Among the hardest hit population centers were the villages of Kipnuk and Kwigillingok, home to a combined 1,000 people, mostly Alaska Native or American Indian. At this time of year, the remote villages can only be reached by water or by air.

In Kipnuk, water levels rose 5.9 feet (1.8 meters) above the normal highest tide line. In Kwigillingok, water levels measured 6.3 feet (1.9 meters) above the normal highest tide line—more than double the previous record set in 1990. According to a letter from the governor’s office to President Trump, 90% of structures in Kipnuk and 35% of structures in Kwigillingok have been destroyed.

The Alaska Air and Army National Guard, the U.S. Coast Guard, and Alaska State Troopers evacuated hundreds of residents to the regional hub of Bethel, then on to Anchorage, in what the Alaska National Guard called the largest airlift operation in state history.

“It’s been an all-hands-on deck endeavor, and everybody is trying to support their fellow Alaskans in their time of need,” said Col. Christy Brewer, the Alaska National Guard director of joint operations, in a 19 October statement.

Silence From FEMA

But calls for assistance from the Federal Emergency Management Agency seem to have so far gone unanswered, leaving some people asking, “Where is FEMA?”

An urgent question. According to the FEMA Daily Briefing a presidential disaster declaration was requested on October 16th. To the best of my knowledge it hasn’t been granted. Any event of this size should be an easy and immediate yes.

Dr. Samantha Montano (@samlmontano.bsky.social) 2025-10-18T23:13:44.421Z

As reported by the New York Times, the EPA revoked a $20 million grant in May that was intended to protect Kipnuk from extreme flooding. The grant cancellation was likely part of a larger effort by the administration to shift the burden of disaster response to states.

On 16 October, Dunleavy submitted a request to President Trump to declare a major disaster for the state.

The letter notes that Alaska has seen 57 state-declared disasters since November 2018, 14 of which have been approved for federal disaster assistance. There have been 14 state-declared disasters in Alaska in the last 12 months alone, including fires, freezes, landslides, and floods.

“It is anticipated that more than 1,500 Alaskans will be evacuated to our major cities, many of whom will not be able to return to their communities and homes for upwards of 18 months,” Gov. Dunleavy wrote. “This incident is of such magnitude and severity that an effective response exceeds state and local capabilities, necessitating supplementary federal assistance to save lives, protect property, public health, and safety, and mitigate the threat of further disaster.”

On 17 October, Alaska’s senators and state representative (all Republicans) also submitted a letter to President Trump, urging him to approve the governor’s request for a major disaster declaration.

Also on 17 October, Vice President JD Vance said on X that he and the president were “closely tracking the storm devastation,” and that the federal government was working closely with Alaska officials. On 18 October, Sen. Lisa Murkowski (R-AK) said she believed FEMA representatives were “totally on the ground.”

However, as of 20 October, the incident is not listed in FEMA’s disaster declaration database.

—Emily Gardner (@emfurd.bsky.social) Associate Editor

These updates are made possible through information from the scientific community. Do you have a story about science or scientists? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The Southern Ocean May Be Building Up a Massive Burp

Mon, 10/20/2025 - 13:16
Source: AGU Advances

The ocean has helped mitigate global warming by absorbing around a quarter of anthropogenic carbon dioxide (CO2) emissions, along with more than 90% of the excess heat those emissions generate.

Many efforts, including assessments by the Intergovernmental Panel on Climate Change, have looked at how the oceans may continue to mitigate increasing emissions and global warming. However, few have looked at the opposite: How will the oceans respond if emissions and associated atmospheric heat levels begin to decrease in response to net negative emissions?

Frenger et al. examined what might happen in the Southern Ocean if, after more than a century of human-induced warming, global mean temperatures were to be reduced via CO2 removal from the atmosphere. The Southern Ocean is a dynamic system, with large-scale upwelling and a robust ability to take up excess carbon and heat. To better understand how the Southern Ocean will behave in net negative carbon conditions, the researchers modeled how the ocean and the atmosphere would interact.

They used the University of Victoria climate model, UVic v. 2.9, to simulate multicentury timescales and carbon cycle feedbacks. UVic uses a combination of an atmospheric energy–moisture balance model, an ocean circulation and sea ice model, a land biosphere model, and an ocean biochemistry model. The researchers used UVic to model an idealized climate change scenario commonly used in climate modeling: Emissions increase until atmospheric CO2 levels double after 70 years, followed by a steep emissions cut and subsequent sustained net negative emissions.
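A CO2 doubling after 70 years is consistent with the widely used idealized scenario in which concentrations grow by about 1% per year under compounding. A quick consistency check (an illustration of the arithmetic, not part of the study’s setup):

```python
import math

# With compound growth at rate g per year, concentration doubles when
# (1 + g)^t = 2, i.e. t = ln(2) / ln(1 + g).
growth = 0.01  # ~1% per year (the common idealized scenario)
doubling_years = math.log(2) / math.log(1 + growth)
print(f"CO2 doubles after about {doubling_years:.0f} years")
```

The result, just under 70 years, matches the timescale described in the simulation.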

The results showed that after several centuries of net negative emissions levels and gradual global cooling, the Southern Ocean abruptly released a burst of accumulated heat—an oceanic “burp”—that led to a decadal- to centennial-scale period of warming. This warming was comparable to average historical anthropogenic warming rates. The team said that because of seawater’s unique chemistry, this burp released relatively little CO2 along with the heat.

Frenger and colleagues note that their work uses a model with intermediate-level complexity and an idealized climate change scenario, but that their findings were consistent when tested with other modeling setups. They say the Southern Ocean’s importance to the global climate system, including its role in heat release to the atmosphere in a cooling climate, should be studied further and contemporary changes closely monitored. (AGU Advances, https://doi.org/10.1029/2025AV001700, 2025)

—Sarah Derouin (@sarahderouin.com), Science Writer

Citation: Derouin, S. (2025), The Southern Ocean may be building up a massive burp, Eos, 106, https://doi.org/10.1029/2025EO250385. Published on 20 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Publishing Participatory Science: The Community Science Exchange

Mon, 10/20/2025 - 12:00
Editors’ Vox is a blog from AGU’s Publications Department.

The Community Science Exchange was founded in 2021 to elevate the work of scientists, scholars, and community members collectively engaged in participatory science and to broaden the reach of their discoveries, results, and science-based solutions. Now more than ever, we would like to recognize the importance of the work of the Community Science Exchange in fostering an inclusive scientific community and strengthening public trust in science. Here, we highlight the publication outlets offered by the Community Science Exchange and encourage the AGU community to contribute.


Within equitable participatory science, or a collective scientific endeavor giving significant voice and weight to both science and publics, the Community Science Exchange defines “community” variously as place-based, a group defined by a shared culture or heritage, and/or a group defined by a shared experience. From environmental concerns to public health, anthropology to engineering, the Community Science Exchange aims to encourage, foster, and promote co-production between science and community. To aid in the integration of local knowledge and lived experience, the Community Science Exchange specifically includes community voice in its publications: as authors, in sections devoted to community description and community impact, and in quotes from community members involved in and/or affected by the work. Scientists and academic scholars with an interest in elevating their community partners within their publications instead of hiding them in an acknowledgment should consider publication within the Exchange.

The American Geophysical Union hosts the Community Science Exchange with further support and guidance from five partnership organizations: the American Anthropological Association (AAA), the American Public Health Association (APHA), the Association for Advancing Participatory Sciences (AAPS), the Unión Geofísica Mexicana (UGM), and Wiley. To broaden the publication venues for community members and organizations, practitioners, boundary spanners, and others who may not receive career benefits from scientific journal publication, the Community Science Exchange has created two new avenues for those who want to publish and share their work: the journal Community Science and the online publication venue managed by AGU, the Hub.

Since its first issue in June 2022, Community Science has published articles discussing a variety of topics of interest to communities and scientists, including water quality, plastic pollution, language as a barrier to equitable access to scientific literature, and integration of Indigenous knowledge in shellfish monitoring. Community Science has also participated in several special collections, including on air quality, equitable co-production, and sustainable agriculture. Growing steadily in submissions, Community Science received a 2024 PROSE Award from the Association of American Publishers. The journal is open access, allowing anyone to read the published work for free.

Because Community Science is a peer-reviewed journal, manuscripts go through an evaluation and revision process to ensure that research published in the journal rigorously advances both science and community outcomes. Like the other journals within the AGU journal portfolio, those who review for Community Science are welcome to invite a co-reviewer. This can help early-career researchers become thorough and constructive reviewers, and can invite experienced community organizers, boundary spanners, and those with relevant lived expertise to engage in thoughtful reviews complementary to scientific review. Publications in both Community Science and the Hub are periodically featured in Editors’ Highlights, in which editors explain what they found exciting about a work, or in Research Spotlights, which are written by Eos’ professional science writers and feature recent newsworthy work. These features offer a more approachable point of entry to explore the science.

Unlike any other journal in the AGU portfolio, the Community Science Exchange also supports an alternate publication venue – the Hub – which is hosted on the Community Science Exchange website. Broadening the definition and understanding of scientific research, work, and resources, the Hub seeks to deepen the connection between science and community.


The Hub is home to a wide variety of content, ranging from stand-alone submissions that are intentionally written outside the strictures of a scientific journal format to “complementary materials” that allow journal paper authors to enrich their articles with linked material furthering community voice. Although the Hub isn’t a scholarly journal in the traditional sense, all submissions are editor vetted before potential revision and publication. Any new, original content published on the Hub is now eligible to receive a permanent digital object identifier (DOI), allowing it to be cited in the references of scholarly publications and other content.

Authors can submit materials to the Hub that fall into one of four categories:

Project Descriptions are narratives of work done, or even more formalized case studies. They should include a description of the community involved, an explanation of the community knowledge utilized, and a summary of the work done. Example: Climate Safe Neighborhoods [Project Description] (doi.org/10.1029/2024CSE000101)

Protocols and Methods are for describing how the community science work was done. These could be practiced approaches, descriptions of relevant policies to be considered, or outlines of project development.

Tools and Resources are items that can help others along on their own community science work, such as datasets or visualization tools. Descriptions of useful apps would also be welcome.

Educational Materials are items geared toward educating or training about community science practices. These could include instruction manuals, guidebooks, or even workshop or webinar curricula.

Because the Hub is a living initiative, evolving with the needs and desires of the community, submissions that don’t cleanly fit into any one of these categories will still be considered.

If you are interested in joining the Community Science Exchange’s efforts to expand how we view, publish, and share science, please email us at communitysci@agu.org. Whether you have a resource to submit to the Hub, an article to submit to the journal, want to be a reviewer, or want to apply to be an editor – we’d love to hear from you.

Finally, we want to thank all of those who have served as editors of this initiative so far, both past and present (starred are original editorial board members):

  • Julia Parrish*, current Editor-in-Chief
  • Kathryn Semmens*, current Deputy Editor of the Hub
  • Claire Beveridge*, current editor
  • Gillian Bowser, current editor
  • Muki Haklay*, current editor
  • Rajul Pandya, current editor
  • Jean Schensul*, founding Deputy Editor, current editor
  • Kevin Noone*, founding Editor-in-Chief, past editor
  • Paula Buchanan*, founding Deputy Editor, past editor
  • Shobhana Gupta*, past editor
  • Heidi Roop*, past editor
  • Roopam Shukla*, past editor

—Allison Schuette (aschuette@agu.org, 0009-0007-1055-0937), Program Coordinator, AGU Publications; Julia Parrish (0000-0002-2410-3982), Editor-in-Chief, Community Science Exchange; Kathryn Semmens (0000-0002-8822-3043), Deputy Editor, The Hub; Kristina Vrouwenvelder (0000-0002-5862-2502), Assistant Director, AGU Publications; and Sarah Dedej (0000-0003-3952-4250), Assistant Director, AGU Publications

Citation: Schuette, A., J. Parrish, K. Semmens, K. Vrouwenvelder, and S. Dedej (2025), Publishing participatory science: the Community Science Exchange, Eos, 106, https://doi.org/10.1029/2025EO255032. Published on 20 October 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Universities Reject Trump Funding Deal

Fri, 10/17/2025 - 16:09
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

The “Compact for Academic Excellence in Higher Education,” developed by the Trump administration and sent to nine universities on 1 October, proposes that the institutions agree to a series of criteria in exchange for preferential treatment in funding decisions.

The compact’s provisions ask universities to: 

  • Ban the consideration of any demographic factors, including sex, ethnicity, race, sexual orientation, and religion, in any admissions decisions, financial aid decisions, or hiring decisions.
  • Commit to “institutional neutrality,” create an “intellectually open campus environment,” and abolish “institutional units that purposefully punish, belittle, and even spark violence against conservative ideas.”
  • Require all employees to abstain from actions or speech related to social and political events unless such events have a direct impact on their university or they are acting in their individual capacity rather than as university representatives. 
  • Interpret the words “woman” and “man” according to “reproductive function and biological processes.”
  • Stop charging tuition for any admitted student pursuing “hard science” programs. (This only applies for universities with endowments over $2 million per undergraduate student.)
  • Disclose foreign funding and gifts.

The proposed deal was sent to the University of Pennsylvania, the University of Virginia, the University of Arizona, the University of Texas at Austin, the University of Southern California, Vanderbilt University, Dartmouth College, Brown University, and the Massachusetts Institute of Technology.

 

“Any university that refuses this once-in-a-lifetime opportunity to transform higher education isn’t serving its students or their parents—they’re bowing to radical, left-wing bureaucrats,” Liz Huston, a White House spokesperson, told Bloomberg.

Simon Marginson, a professor of higher education at Oxford University, told Time that if successful, the compact would “establish a level of federal control of the national mind that has never been seen before.” 

On 12 October, President Trump opened up the offer to all institutions of higher education in a post on social media website Truth Social.

As of 20 October, the following schools have responded to Trump’s offer:

  • Massachusetts Institute of Technology: MIT was the first to reject Trump’s offer. In a 10 October letter to the administration, MIT President Sally Kornbluth wrote that MIT’s practices “meet or exceed many standards outlined in the document,” but that the compact “also includes principles with which we disagree, including those that would restrict freedom of expression and our independence as an institution.”
  • Brown University: In a 15 October letter to the administration, Brown University President Christina H. Paxson declined the deal. She wrote that Brown “would work with the government to find solutions if there were concerns about the way the University fulfills its academic mission,” but that, like Kornbluth, she was “concerned that the Compact by its nature and by various provisions would restrict academic freedom and undermine the autonomy of Brown’s governance.”
  • University of Southern California: In a 16 October statement, USC Interim President Beong-Soo Kim informed the university community that he had declined the deal, and wrote that the university takes legal obligations seriously and is diligently working to streamline administrative functions, control tuition rates, maintain academic rigor, and ensure that students develop critical thinking skills. “Even though the Compact would be voluntary, tying research benefits to it would, over time, undermine the same values of free inquiry and academic excellence that the Compact seeks to promote,” he wrote.
  • University of Pennsylvania: In a 16 October statement, UPenn President J. Larry Jameson informed the university community that he had declined to sign the compact. “At Penn, we are committed to merit-based achievement and accountability. The long-standing partnership between American higher education and the federal government has greatly benefited society and our nation. Shared goals and investment in talent and ideas will turn possibility into progress,” he wrote.
  • University of Virginia: In a 17 October letter to the administration, UVA Interim President Paul Mahoney declined to sign the compact. “We seek no special treatment in exchange for our pursuit of those foundational goals,” the letter said. “The integrity of science and other academic work requires merit-based assessment of research and scholarship. A contractual arrangement predicating assessment on anything other than merit will undermine the integrity of vital, sometimes lifesaving, research and further erode confidence in American higher education.”
  • Dartmouth College: In an 18 October letter to the administration, Dartmouth President Sian Leah Beilock declined the deal. “I do not believe that the involvement of the government through a compact—whether it is a Republican- or Democratic-led White House—is the right way to focus America’s leading colleges and universities on their teaching and research mission,” Beilock wrote.
  • University of Arizona: In a 20 October announcement, President Suresh Garimella said he had declined to agree to the proposal and had instead submitted a Statement of Principles to the U.S. Department of Education informed by “hundreds of U of A stakeholders and partner organizations.” “This response is our contribution toward a national conversation about the future relationship between universities and the federal government. It is critical for the University of Arizona to take an active role in this discussion and to work toward maintaining a strong relationship with the federal government while staying true to our principles,” Garimella wrote.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

20 October: This article was updated to include the University of Virginia and Dartmouth College.

21 October: This article was updated to include the University of Arizona.

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
