Eos

Science News by AGU

Developing Nations Need 12 Times More Financing to Meet Climate Adaptation Needs

Wed, 10/29/2025 - 13:07
Research & Developments is a blog for brief updates that provide context for the flurry of news that impacts science and scientists today.

An annual United Nations report, published 29 October, reveals a “yawning gap” between existing and necessary climate adaptation finance that is “putting lives, livelihoods, and entire economies at risk.”

Adaptation needs in developing countries—estimated to be at least $310 billion per year by 2035—are 12 times higher than international public finance flows, which currently sit at about $26 billion per year.

This so-called adaptation gap limits poor countries’ ability to withstand the changing climate, the report states.

Poorer countries are often the hardest hit by the effects of climate change, despite emitting just a fraction of the world’s greenhouse gases. These countries rely on funding from other countries, from both public and private sources, to finance climate adaptation efforts.

A comparison of adaptation financing needs, drawn from nationally determined contributions (NDCs) and national adaptation plans (NAPs), with international public adaptation finance flows in developing countries. Achieving the Glasgow Pact goal and delivering the multilateral development banks’ (MDB) target would narrow the gap slightly but would not be enough to close it. Credit: UNEP

“We need a global push to increase adaptation finance—from both public and private sources—without adding to the debt burdens of vulnerable nations,” said Inger Andersen, the executive director of the UN Environment Programme, in a press release. “Even amid tight budgets and competing priorities, the reality is simple: If we do not invest in adaptation now, we will face escalating costs every year.”


The report attributes the difficulty of mobilizing the necessary financial resources to “current geopolitical tensions and cuts to overseas development assistance, among other factors.”

The Glasgow Climate Pact, an international agreement adopted in 2021, set a goal to double international public adaptation finance by 2025. The goal will not be met under current trajectories, according to the report. The most recent climate finance target of $300 billion per year by 2035, agreed upon at last year’s UN climate change conference, COP29, is also insufficient to meet the adaptation needs of developing countries.

A failure to meet international finance goals means “many more people will suffer needlessly,” Andersen wrote in the report. “New finance providers and instruments must come on board.”

“The smart choice is to invest in adaptation now,” she wrote.

The report did include some silver linings: The adaptation finance gap is slightly smaller for Least Developed Countries, the UN’s classification for low-income countries facing severe obstacles to sustainable development, and Small Island Developing States. Additionally, national adaptation planning is improving: 172 countries have at least one national adaptation strategy in place, and 21 others have started developing one.

However, the world is failing to reach other climate goals: Another UN report, published 22 October, found that oil and gas companies are still vastly underreporting their methane emissions. And ahead of COP30, scheduled to be held next month in Belém, Brazil, only 64 of the 195 nations party to the Paris Agreement have submitted their required updates to their emissions plans.

 

The new report is expected to inform discussions at COP30. The Brazilian presidency of COP30 has called for the conference to be a “mutirão global,” a global collective effort, to achieve ambitious climate action. In the report, authors advise nations attending the conference in Belém to transition away from fossil fuels, engage additional financial system stakeholders, and avoid expensive but mostly ineffective maladaptations such as seawalls or wildfire suppression. 

In a recent interview with The Guardian about COP30 priorities, UN Secretary-General António Guterres said the world has “failed to avoid an overshooting above 1.5°C [2.7°F] in the next few years” and urged swift action.

“It is absolutely indispensable to change course,” he said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about science or scientists? Send us a tip at eos@agu.org.

Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Judge Stops Shutdown-Related RIFs Indefinitely

Tue, 10/28/2025 - 21:51
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

A judge has ruled that the government cannot issue further reduction-in-force (RIF) notices to federal employees because of the government shutdown, nor implement RIFs that were already issued during the shutdown.

The ruling by U.S. District Judge Susan Illston marks the latest development in a months-long court battle over RIFs at federal agencies.

“I think it’s important that we remember that, although we are here talking about statutes and administrative procedure and the like, we are also talking about human lives, and these human lives are being dramatically affected by the activities that we’re discussing this morning,” Judge Illston said at the top of the hearing, which was held at the U.S. District Court for the Northern District of California in San Francisco.

 

The case, American Federation of Government Employees, AFL-CIO v. United States Office of Personnel Management (OPM) (3:25-cv-01780), was first filed in February. AGU joined as a plaintiff in the case in March. Other plaintiffs include Climate Resilient Communities, the Coalition to Protect America’s National Parks, and the American Public Health Association.

Judge Illston granted a temporary restraining order earlier this month, which prevented the government from executing RIFs during the shutdown until further notice.

However, the Trump administration only paused some RIFs, arguing that most of the thousands of layoffs announced since the shutdown are not covered by the court order.

As part of the temporary restraining order, the court ordered the government to provide an accounting of “all RIFs, actual or imminent,” that it planned to execute during the shutdown. The list included 143 Fish and Wildlife Service employees, 355 USGS employees, 272 National Park Service employees, and 474 Bureau of Land Management employees.

On 22 October, Judge Illston broadened the reach of who was protected by the temporary restraining order by adding several unions representing federal employees as plaintiffs.

In today’s hearing, the plaintiffs argued for a preliminary injunction, a move that essentially preserves the status quo before the final judgment of a trial. Danielle Leonard, an attorney representing the plaintiffs, argued that, in this case, the state of affairs prior to the government shutdown should be considered the “status quo.” In essence, this meant seeking a halt to RIFs that have occurred since the shutdown, not just future RIFs.

The plaintiffs sought to prove that the RIFs were “arbitrary or capricious,” a legal standard that is part of the Administrative Procedure Act, which governs how federal agencies operate.

Michael Velchick, an attorney representing the U.S. government, argued that the government’s actions were not only not arbitrary or capricious, but good policy, and “the right thing to do.”

“Morally it’s the right thing to do, and it’s the democratic thing to do,” he said. “The American people selected someone known above all else for his eloquence in communicating to employees that, ‘You’re fired.’”

This was seemingly a reference to the president’s former reality TV show, The Apprentice.

Leonard argued that Velchick’s statement was offensive to the 1.5 million federal employees represented by her clients. She summed up the defendant’s argument like this:

“There is some general authority, and therefore that blesses the specific actions that are happening here for the reasons that the government has given, regardless of how poor those reasons are. And that’s just not the way the law works.”

Judge Illston seemed to agree, stating that the Office of Personnel Management and Office of Management and Budget were prohibited from issuing more RIF notices or implementing those already issued.

The judge noted that she will likely hold an evidentiary hearing to settle a potential dispute over whether specific RIF notices were issued because of the shutdown, or were “already in the works” and unrelated to the shutdown.

—Emily Gardner (@emfurd.bsky.social), Associate Editor

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org.

Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

AI Is Changing Our Understanding of Earthquakes

Tue, 10/28/2025 - 13:48

This story was originally published by Knowable Magazine.

When the biggest earthquake in more than a decade rattled Russia’s remote Kamchatka Peninsula in July, seismologists around the world knew within moments. For earthquakes big or small, sensors around the globe detect the tremors and relay that information to researchers, who quickly analyze the observations and issue alerts.

Now artificial intelligence is poised to make almost everything about earthquake research much faster—and to rewrite researchers’ very understanding of how earthquakes happen.


By using a subfield of AI called machine learning, some scientists are identifying up to millions of tiny, previously unseen earthquakes in data gathered from seismically active places. These new and improved databases are helping researchers to better understand the geological faults along which quakes happen, and can help to illuminate the risks of future quakes. Some scientists are even using machine learning to improve their forecasts of how many aftershocks may rattle a location that has just experienced a large and damaging earthquake.

More broadly, researchers hope that machine learning, with its ability to crunch through huge amounts of information and learn from the patterns within, will reveal fresh insights into some of the biggest mysteries about earthquakes, including how a quake unfolds in its first devastating seconds.

“Machine learning opened a whole new window,” says Mostafa Mousavi, a seismologist at Harvard University.

Shaking Earth, Exploding Data

Earthquakes happen when geological stress builds up in the ground, such as when two plates of Earth’s crust grind alongside one another, as they do at California’s San Andreas Fault. At some point, the stress reaches a critical threshold and the fault ruptures, breaking the rock and causing seismic energy to ripple outward and shake the ground.

The San Andreas fault, seen here as a dramatic slash across the Carrizo Plain in Southern California, is an example of a geologically active area where seismologists are using AI to better understand earthquake patterns. Credit: John Wiley, Wikimedia Commons, CC BY 3.0

That energy is recorded by seismometers and other instruments around the world, which are positioned in great numbers in geologically active areas like California and Japan. The data feed into national and international systems for tracking earthquakes and alerting the world. The amount of data has exploded in recent years as seismologists find new ways to gather information on ground movements—like detecting seismic signals over fiber optic networks, or using the accelerometers built into smartphones to create a phone-based earthquake warning network.

Just a decade or two ago, much of the analysis of seismic signals was done by hand, with scientists working as quickly as possible to assess recordings coming in from their observing networks. But today, there are just too many data points. “Now the only—almost—way that you can deal with the seismic data is to go to automatic processing,” says Mousavi, who coauthored a 2023 article in the Annual Review of Earth and Planetary Sciences on machine learning in earthquake seismology.

One of the most common uses of machine learning in seismology is measuring the arrival time of seismic waves at a particular location, a process known as phase picking. Earthquakes generate two kinds of seismic waves, known as P and S waves, that affect the ground in different ways and show up as different types of squiggles on a seismogram. In the past, a seismologist would analyze data arriving from seismic sensors and hand-select what they gauged to be the start of P waves or S waves on those seismogram plots. Picking the starts of those waves accurately and precisely is important for understanding factors such as where exactly the earthquake hit. But phase picking is very time consuming.

An earthquake’s energy appears as a squiggly waveform on measurements made by seismometers. The first type of signal to arrive is a ground movement known as a P wave, followed by a type known as an S wave. Picking where the waves first arrive on a seismometer reading is an important part of understanding an earthquake’s impact—this has typically been done by human seismologists but in recent years the process has been made much quicker by incorporating machine-learning algorithms. Credit: Knowable Magazine, adapted from USGS.gov

In the past few years, seismologists have been using machine learning algorithms to pick seismic phases much faster than a human can. There are a number of automated methods that can do phase picking, but machine learning algorithms, which have been trained on huge volumes of data on past quakes, can identify a wide variety of signals from different types of tremors in a way that was not possible before. The practice is now so standard that the term “machine learning” is no longer stated in the titles of research papers, says Mousavi. “By default, everybody knows.”

AI-based phase picking is faster than phase picking by humans and at least as accurate, Mousavi says. Seismologists are now working to expand these tools to other types of seismic analysis.
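To make “automatic picking” concrete, here is a minimal sketch of the classic STA/LTA (short-term average over long-term average) trigger, the pre-machine-learning workhorse that neural pickers now outperform. It runs on synthetic data; the window lengths, threshold, and all other values are illustrative choices, not parameters from any study mentioned here.

```python
import numpy as np

def sta_lta(signal, n_sta, n_lta):
    """Ratio of short-term to long-term average energy; it spikes when a
    seismic phase arrives and the short-window energy suddenly jumps."""
    energy = signal ** 2
    sta = np.convolve(energy, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(energy, np.ones(n_lta) / n_lta, mode="same")
    lta[lta == 0] = 1e-12          # guard against division by zero
    return sta / lta

# Synthetic 100 Hz trace: background noise, with a "phase" arriving at t = 30 s.
rng = np.random.default_rng(42)
fs = 100
trace = rng.normal(0.0, 1.0, 60 * fs)
trace[30 * fs : 35 * fs] += rng.normal(0.0, 6.0, 5 * fs)

ratio = sta_lta(trace, n_sta=1 * fs, n_lta=10 * fs)   # 1 s vs. 10 s windows
pick_s = np.argmax(ratio > 4.0) / fs                  # first sample above threshold
print(f"Picked arrival near t = {pick_s:.1f} s")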

Expanding Quake Catalogs

One area that has already seen big discoveries is the use of machine learning to expand earthquake catalogs—basically, lists of what earthquakes happened where in a particular region. Earthquake catalogs include all the quakes that seismologists can identify from recorded signals—but AI can find vastly more tremors than human scientists can.

Essentially, machine learning can trawl through the data to identify small earthquakes that people don’t have the ability or time to flag. “Either you don’t see them by eye, or there’s no time to go and look at all those tiny events,” says Leila Mizrahi, a seismologist with the Swiss Seismological Service at ETH Zürich. Often, these tremors are obscured by background noise in the data.


In a pioneering 2019 study in Science, researchers used an AI algorithm that matched patterns of seismic waves to identify more than 1.5 million tiny earthquakes that happened in Southern California between 2008 and 2017 but had not been spotted before. These are itty-bitty quakes that most people wouldn’t feel even if they were standing on top of them. But knowing they exist is important in helping seismologists understand patterns of behavior along a geological fault.
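The waveform matching at the heart of that study can be illustrated with a toy version of template matching: slide the waveform of a known quake along continuous data and flag stretches that correlate strongly with it. The sketch below runs on synthetic data with invented parameters; it shows only the core idea, not the study’s actual pipeline.

```python
import numpy as np

def normalized_xcorr(continuous, template):
    """Slide a known-quake waveform along continuous data; return the Pearson
    correlation at each offset (1.0 means an exact repeat)."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    scores = np.empty(len(continuous) - n + 1)
    for i in range(len(scores)):
        w = continuous[i : i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        scores[i] = np.dot(t, w) / n
    return scores

# Toy data: bury two copies of a small-quake waveform in noise of similar size.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 12 * np.pi, 200)) * np.hanning(200)
data = rng.normal(0.0, 0.5, 5000)
for start in (1200, 3700):
    data[start : start + 200] += template

scores = normalized_xcorr(data, template)
print("Candidate detections near samples:", np.where(scores > 0.5)[0])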

In particular, Mousavi says, tiny earthquakes are important as a window into how larger earthquakes begin. Large earthquakes may happen along a particular fault once every century or more—far too long a time period for scientists to observe in order to understand the rupture process. Tiny quakes behave much the same as big ones, but they happen much more frequently. So studying the pattern of tiny quakes in the newly expanded earthquake catalogs could help scientists better understand what gets everything going. In this way, the richer catalogs “have potential to help us to understand and to model better the seismic hazard,” Mousavi says.

Expanded earthquake catalogs can also illuminate the structure of geological faults below a region much better than before. It’s like going from a simplistic sketch of how the faults are arranged to a painting with more photorealistic details. In 2022, a team led by seismologist Yongsoo Park, then at Stanford University, used machine learning to build an expanded catalog of quakes in Oklahoma and Kansas between 2010 and 2019, many of them induced by oil and gas companies injecting wastewater into the ground. The work illuminated fault structures that weren’t visible before, allowing the scientists to map the faults more precisely and to better understand seismic risk.

This dramatic example shows the power of machine learning to enhance scientists’ knowledge of earthquakes. Top is a depiction of an earthquake swarm that occurred near Pawnee, Oklahoma, in September 2016. Each dot represents an earthquake measured by seismometers (with yellow representing quakes that occurred early in the swarm, and red representing quakes that occurred later). Bottom is the same earthquake swarm, but in this case when scientists used machine learning to pinpoint additional, smaller quakes in the observations. The enhanced earthquake catalog shows far more detail of where the quakes occurred, including illuminating the underlying geological faults. Credit: Courtesy of Yongsoo Park

Park and his colleagues showed that 80 percent of the larger earthquakes that happened could have been anticipated based on the smaller earthquakes that occurred before the big ones. “There is always a possibility that the next major earthquake can occur on a fault that is still not mapped,” says Park, who is now at Los Alamos National Laboratory in New Mexico. “Routinely capturing smaller earthquakes might be able to reveal such hidden faults before a major earthquake happens.”

Scientists are applying this approach around the globe. Researchers in Taiwan, for instance, recently used machine learning to produce a more detailed catalog of the earthquake sequence surrounding a magnitude 7.3 tremor in April 2024 that killed at least 18 people on the island and damaged hundreds of buildings. The study, reported at a seismology meeting in April 2025, found the AI-based catalog to be about five times more complete than the one produced by human analysts, and it was made within a day rather than taking months. It revealed new details on the location and orientation of geological faults—information that can help officials better prepare for how the ground might move in future quakes. Such catalogs “will become the standard in every earthquake-prone region in the future,” says team leader and seismologist Hsin-Hua Huang of Academia Sinica in Taiwan.

Forecasting Is Still a Problem

So far, AI hasn’t been as successful in tackling another of seismology’s biggest challenges—forecasting the probability of future quakes.

The field of earthquake forecasting deals with general probabilities—such as the chances of a quake of magnitude X happening in region Y over time period Z. Currently, seismologists create quake forecasts using mathematical analyses of past earthquakes, such as a statistical method that relies on observations of how past earthquakes triggered subsequent quakes. This approach works well enough for specific tasks, like understanding how many aftershocks may rattle a region after a Big One. That sort of information can help people in a disaster zone know whether it’s safe to return to their houses or whether more aftershocks might be on the way, threatening to collapse more buildings.
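The statistical workhorse behind such aftershock forecasts is the modified Omori law, which describes how aftershock rates decay with time since the mainshock. Here is a minimal sketch; the parameter values are invented for illustration, not fit to any real sequence.

```python
import numpy as np

# Modified Omori law: aftershock rate decays as n(t) = K / (t + c)**p.
# In practice K, c, and p are fit to the first hours or days of an observed
# sequence; the values below are illustrative placeholders.
K, c, p = 250.0, 0.05, 1.1   # productivity, early-time offset (days), decay exponent

def omori_rate(t_days):
    """Expected aftershocks per day, t_days after the mainshock."""
    return K / (t_days + c) ** p

# Integrate the rate to get the expected count between day 1 and day 7.
t = np.linspace(1.0, 7.0, 10_000)
dt = t[1] - t[0]
expected = float(np.sum(omori_rate(t)) * dt)
print(f"Day-1 rate: {omori_rate(1.0):.0f}/day; expected events, days 1-7: {expected:.0f}")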

But this kind of analysis can’t always accurately capture the real seismic risk, especially along faults that only rarely yield big quakes and thus aren’t well represented in the seismic record. Seismologists are testing AI-based algorithms for earthquake forecasting to see if they might do better, but so far, the news is tepid. In their best performances, the machine learning analyses are about as good as the standard methods of quake forecasting. “They are not outperforming the traditional ones yet,” says Mousavi, who summarized the state of the field in an August 2025 article in Physics Today.


In one of the more promising experiments, Mizrahi has been trying to use AI to speed up producing aftershock forecasts in the crucial minutes and hours after a large earthquake hits. She and a colleague trained a machine-learning algorithm on the older statistical method of quake forecasting, then unleashed it on its own to see how the AI would do. It did perform much faster than the older, non-AI approach, but there’s still more work to do. “We’re in the process of evaluating how happy we are with it,” says Mizrahi, who published the findings last year in Seismological Research Letters.

In the future, researchers hope to speed up these types of forecasting analyses. Other areas of seismology could eventually benefit, too. Some early research hints that machine learning could be used in earthquake early warning, for instance estimating exactly how much the ground will move in the seconds after an earthquake has started nearby. But the usefulness of this is limited to the few parts of the world that have early warning systems in place, like California and Japan.

Park also cautions about relying too much on machine learning tools. Scientists need to be careful about maintaining quality control so they can be sure they are interpreting the results of any AI analysis correctly, he says.

Overall, though, seismologists see a bright future in using AI to understand earthquakes better. “We’re on the way,” Mizrahi says.

—Alexandra Witze, Knowable Magazine

This article originally appeared in Knowable Magazine, a nonprofit publication dedicated to making scientific knowledge accessible to all. Sign up for Knowable Magazine’s newsletter. Read the original article here.

Martian Dust Devils Reveal Dynamic Surface Winds

Tue, 10/28/2025 - 13:40

In 2020, the scientists and engineers behind NASA’s InSight lander were optimistic. The mission was performing spectacularly, and it had no end in sight. Then, its power began to fade. Fine Martian dust was relentlessly piling on top of its solar panels, blocking sunlight. Mission operators had anticipated this but hoped that occasional wind gusts or passing dust devils would sweep the panels clean. Such fortuitous cleaning had prolonged the lives of earlier robotic explorers, such as the Spirit and Opportunity rovers. But for InSight, no such wind ever came, and its batteries slowly ran out of juice. InSight fell silent in December 2022.

InSight’s demise illustrates a long-standing gap in Martian science: Researchers still know little about how winds move across the planet’s surface and interact with dust. To help fill this gap, a group of researchers has now reviewed decades of orbital imagery from two European Space Agency (ESA) spacecraft—Mars Express and the ExoMars Trace Gas Orbiter, operational since 2004 and 2016, respectively—looking for dust devils and using them as a proxy for surface winds.

Over the years, these orbiters have captured thousands of high-resolution images of Mars’s surface. Hidden within this vast dataset are countless sightings of dust devils, which drift with the prevailing winds. Because surface winds are otherwise impossible to measure directly from Martian orbit with the available instruments, tracking the motion of these vortices provides a rare window into their direction and velocity.

To measure these parameters, the researchers exploited a technical quirk of the spacecraft cameras, namely, the slight temporal delay between capturing different color layers of an image or between the right and left images in stereoscopic views. By measuring the dust devils’ displacement between exposures, the team could infer the velocity and direction of the winds carrying them. Their observations revealed some of the fastest surface wind speeds ever detected on Mars, challenging existing atmospheric models.
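The geometry is simple enough to sketch: drift velocity is just the pixel displacement between exposures, scaled by the camera’s ground resolution and divided by the inter-exposure delay. All numbers below are invented for illustration and are not from the study.

```python
import numpy as np

m_per_px = 4.0   # ground sampling distance, meters per pixel (CaSSIS-like)
dt_s = 9.0       # delay between the two exposures, seconds (assumed)

pos_a = np.array([512.0, 300.0])  # (east, north) pixel position in exposure A
pos_b = np.array([548.0, 322.0])  # same dust devil in exposure B

shift = pos_b - pos_a                               # pixels moved east, north
speed = np.linalg.norm(shift) * m_per_px / dt_s     # meters per second
bearing = np.degrees(np.arctan2(shift[0], shift[1])) % 360  # 0 deg = due north

print(f"Drift speed ~{speed:.1f} m/s toward {bearing:.0f} deg (illustrative)")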

The Colour and Stereo Surface Imaging System (CaSSIS) on board the European Space Agency’s ExoMars Trace Gas Orbiter captured this dust devil tracking across the Martian surface on 28 February 2019. Credit: ESA/TGO/CaSSIS, CC BY-SA 3.0 IGO

“With dust devils, we now have a tool to measure wind velocities across the planet, across space and time,” said Valentin Bickel, a planetary scientist at the University of Bern and lead author of the study. “We get a measurement of wind speeds in a distributed way around the planet, not just in specific lander locations.”

AI-Assisted Research

Detecting dust devils in orbital images, however, is not easy. For instance, the Colour and Stereo Surface Imaging System (CaSSIS) camera on board the ExoMars Trace Gas Orbiter resolves the surface at about 4 meters per pixel, meaning that dust devils dozens of meters wide appear as tiny smudges. Finding all these dust devils in the images is something that “an army of people could do in a few months or years, but nobody can pay for that,” Bickel said.

To automate the search, Bickel and colleagues trained a convolutional neural network—a type of artificial intelligence (AI) commonly used in image recognition—to identify the dust devils. After training the algorithm with about 50 examples labeled by experts, they let it loose on their full dataset of 50,000 orbital images. “Its only function is to identify dust devils in images; it can’t do anything else. It’s very stupid,” Bickel said. However, it needed only a few hours to scan the entire collection.
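For readers curious what such a detector looks like, here is a toy binary classifier in the same spirit: a small convolutional network that outputs the probability that an image patch contains a dust devil. The architecture, patch size, and every number below are invented for illustration; this is not the team’s actual model.

```python
import torch
import torch.nn as nn

class DustDevilNet(nn.Module):
    """Toy patch classifier: one logit for 'contains a dust devil'."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.head(self.features(x))

model = DustDevilNet()
patches = torch.randn(8, 1, 64, 64)       # a batch of 8 fake 64x64 patches
probs = torch.sigmoid(model(patches))     # probability per patch
print(probs.shape)                        # torch.Size([8, 1])
# Training would minimize nn.BCEWithLogitsLoss() on the labeled examples,
# then the trained model would be slid across each of the 50,000 images.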


The neural network detected more than a thousand dust devils across nearly all Martian latitudes. Each detection offered a new data point on local surface winds. The analysis revealed that Martian surface winds are generally faster than current atmospheric models suggest—and occasionally stronger than any speeds directly recorded by landers or rovers equipped with weather instruments. For instance, the researchers detected wind speeds of up to 44 meters per second, substantially faster than the previous record of 32 meters per second measured by the Perseverance rover. Scientists previously assumed that dust devils might not even be able to form at such wind speeds, as they could be destroyed by strong currents, Bickel said.

“The velocities we measured are totally surprising; I didn’t think we would see so many fast dust devils on Mars,” Bickel said. “You always picture them as these slowly moving clouds of dust, but it turns out they’re like superfast, highway speed level objects. I think it’s just crazy.”

The second key finding is that fast winds are more widespread across the planet than previously thought. To showcase this, the researchers produced a map with the locations of all 1,039 dust devils detected, including the direction of motion for 373 of them, confirming that dust devils are found all over Mars, even atop the tallest volcanoes. However, dust devils tend to cluster in specific regions, for instance, in Amazonis Planitia (visible at upper left in the map), a vast area known to be covered by an extensive, fine layer of dust and sand.

Researchers created a map showing 1,039 dust devils that occurred on the Martian surface, as seen in 20 years’ worth of images from European Mars orbiters. Credit: ExoMars TGO data: ESA/TGO/CaSSIS; Mars Express data: ESA/DLR/FU Berlin; Background: NASA Viking color mosaic, CC BY-SA 3.0 IGO

“Of course,” Bickel noted, “we have a bias because we need dust devils to see [the winds], so if there’s no dust at ground level, we don’t see the wind.”

The team also observed a clear seasonal pattern: Dust devils and strong winds appear more frequently during each hemisphere’s spring and summer, typically happening around midday, when surface heating is more intense. The researchers published their findings in Science Advances.

Blowing in the Dusty Wind

Deciphering how Martian winds work is key to understanding how dust shapes the planet’s weather and climate. Wind is the main force that lifts and transports the Red Planet’s abundant dust, which in turn regulates how the Martian atmosphere absorbs and radiates heat.

Understanding dust transport is thus critical for future exploration, both robotic and human. A global map of wind patterns might have helped InSight’s engineers choose a landing site less prone to rapid dust accumulation. On a larger scale, planet-encircling dust storms that erupt every decade or so—sometimes blocking sunlight for months—remain a serious hazard for exploration.


“Wind is one of the holy grails for human exploration and to understand the Martian climate,” said Germán Martínez, a researcher at the Center for Astrobiology in Madrid, Spain, who wasn’t involved with the new study. “Surface winds are very important, for Martian climatology, but especially at this time for human exploration safety, and we know very little.” In that sense, Martínez said, this research is important because it provides a map of surface wind speeds and directions that we didn’t have before, even if it’s a bit coarse.

Bickel agreed that more data, and more tools in orbit, will improve understanding of the Martian wind system. In the meantime, he hopes the new map will be used to validate and improve climate and wind models of Mars.

—Javier Barbuzano (@javibar.bsky.social), Science Writer

Citation: Barbuzano, J. (2025), Martian dust devils reveal dynamic surface winds, Eos, 106, https://doi.org/10.1029/2025EO250404. Published on 28 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Some useful tools for monitoring the evolution and behaviour of Hurricane Melissa

Tue, 10/28/2025 - 08:13

Various online datasets will allow a detailed understanding of Hurricane Melissa as it impacts Jamaica and then Cuba

Hurricane Melissa is now making headlines around the world in anticipation of its landfall today. As always with tropical cyclones, the picture is changing continuously as the storm evolves, and the behaviour of these systems is highly complex.

I thought I’d highlight some useful tools for monitoring the evolution and behaviour of Hurricane Melissa. First, of course, the NOAA National Hurricane Center provides a range of graphics, some of which are adaptable. This includes the forecast track of the centre of the storm, the forecast earliest arrival time of the centre of the hurricane and (most usefully in the context of landslides) the rainfall potential:

Precipitation potential for Hurricane Melissa. Graphic from NOAA as at 07:18 UTC on 28 October 2025.

Note that this is three-day potential rainfall (the graphic that I posted yesterday was for four days). Jamaica is going to start to feel the full brunt of the storm today (Tuesday 28 October), and it will then move on to eastern Cuba. The latest forecast suggests that the most serious rainfall will occur in the central part of Jamaica, but that there will also be very significant rainfall in the west of the island. The change appears to be the result of a slightly later-than-forecast turn to the north.

The NASA Global Precipitation Measurement site provides near real-time data – the best tool available for understanding the rainfall that the storm is delivering. This is the latest image showing 24 hour precipitation totals:-

24 hour precipitation accumulation for Hurricane Melissa. Graphic from NASA GPM as at 07:34 UTC on 28 October 2025.

Note that this site also provides a global landslide nowcast, but sadly the site indicates that this is not functioning. I am unsure as to why – maybe this is the effect of the government shutdown.

In terms of the landslides themselves, this map of Jamaica and Cuba provides landslide susceptibility – yet again, this is work from NASA:-

Landslide susceptibility for Jamaica and Cuba. Data from NASA.

Overlaying this with the forecast precipitation is fascinating – the east of Jamaica has the highest landslide susceptibility, but is now forecast to receive less rainfall. Central Jamaica has lower average susceptibility, but may receive more rainfall. But also remember that landslides in storms like this are often driven mostly by rainfall intensity, which is hard to forecast and very variable. There’s also a nice BGS report on landslide hazard for a catchment in Central Jamaica, which gives an idea of the scale of the issues.

In terms of news within Jamaica itself, the Jamaica Observer and the Jamaica Star will be providing local coverage.

Finally, in such situations there is a tendency in the international media to adopt a slightly condescending tone to reporting of such events in countries with lower levels of per capita GDP. Both Jamaica and Cuba have advanced disaster management systems – they are far from helpless victims. Indeed, Cuba has a remarkably successful record of managing disasters and Jamaica fared remarkably well during Hurricane Beryl last year due to its preparedness. But tropical cyclones are complex, and the impact of a Category 5 event is very much greater than that of a Category 4 storm. Even the best prepared nation struggles to cope with such a storm.

Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Building Better Weather Networks

Mon, 10/27/2025 - 12:58

Lake Victoria, Africa’s largest lake, supports a quarter of East Africa’s population with fish, fresh water, and critical transportation routes. It’s also one of the deadliest bodies of water in the world.

Storms, high winds, and waves frequently capsize boats, causing thousands of deaths each year.

Despite the hazard, Lake Victoria has historically lacked enough weather observation stations to provide a clear picture of weather patterns in the region. Overly general and often inaccurate forecasts have meant that those heading out on the water had little idea what weather they’d face.

In 2017, the World Meteorological Organization (WMO), the weather agency of the United Nations, began a multiyear effort to improve weather information for the lake and establish early-warning systems for life-threatening storms. Now, much of the lakeside population uses the program’s tailored forecasts, leading to an estimated 30% reduction in weather-related deaths on the lake.

Still, a dearth of weather data persists across the continent. Because of ongoing economic depression, conflict, and disruptive weather patterns, Africa has gone decades without observational networks that meet international standards.

Today, the continent has the least dense weather observation network in the world. The average density of stations meeting WMO standards is one-eighth the WMO-recommended level, and more reporting surface stations exist in Germany than across all of Africa, according to the WMO and the World Bank.

The lack of observations often leaves communities without early warnings of natural disasters, cripples forecasts that farmers rely on, and leaves scientists who are studying global climate and weather with a major data gap.

In 2019, the need for improved weather networks around the world was recognized at the Eighteenth World Meteorological Congress in Geneva, Switzerland. There, the WMO’s 193 member states and territories established the Global Basic Observing Network (GBON), an international agreement that specifies requirements for weather data collection, including which parameters to measure, when and at what resolution to measure them, and how to share the data between countries.

With encouragement from the WMO and smaller organizations, national meteorological agencies in Africa are also recognizing the need for enhanced weather and climate services and planning for them, said Zablon Shilenje, the technical coordinator of services for the WMO’s Regional Office for Africa.

That recognition, combined with increased economic investment, has slowly led to weather stations being added to networks throughout Africa.

“The situation now has improved a lot,” said Frank Annor, a water resources expert at TU Delft and Kwame Nkrumah University of Science and Technology in Ghana. But the continent is still far from meeting GBON reporting standards. And in the face of an ever more variable climate, additional investments and improved ways of working together are needed.

“There is a huge gap, and we need to work on it,” Shilenje said.

Scarce Stations

Climate models used by scientists and forecasts created by meteorologists fundamentally rely on current and past weather data, particularly temperature and precipitation measurements, to extrapolate patterns into the future. “Any time the historical information isn’t perfect, that’s going to cause potential issues,” especially for estimating the impacts of climate change, said Pascal Polonik, a climate scientist at Stanford University.

Forecast accuracy declines as the number of observations drops. That’s particularly problematic when an entire region or large swaths of it have little to no observational data—as is the case in many parts of Africa.


“We lack the ground data. That data is not being ingested into models, so then when you do predictions, your predictions are less accurate,” Annor said.

There wasn’t always such a lack of station density. In the first half of the 20th century, African countries’ networks were growing on par with those in other parts of the world, though they never reached the same densities as in places like North America. But now, Africa has fewer than one third of the weather stations that it once had.

Social and political conflict is one reason for the decline. One 2019 analysis of temperature records available in sub-Saharan Africa found that a country’s civil conflict risk was negatively correlated with both the number and density of weather stations contributing to its temperature record.

Some conflicts or social upheavals have had an outsized effect on monitoring networks. During and after the 1994 genocide in Rwanda, for instance, the average number of actively reporting weather stations in the country dropped from more than 50 to fewer than 10. Nearly 15 years passed before station coverage returned to preconflict levels. In another instance, station density in Uganda declined from a peak of about 500 stations following independence in the 1960s to fewer than 100 by 2001. A civil war in Ethiopia beginning in 2018 resulted in a sharp decline in reporting weather stations in the northwest part of the country, where much of the fighting took place.

“You can see from one year to another how unrest can affect station density,” said Tufa Dinku, a senior research scientist at the Columbia Climate School in New York.

The ongoing conflict in the northwestern part of Ethiopia may have led to a decrease in reported weather data from stations in the same area. Credit: Tola et al., 2025, https://doi.org/10.3389/fclim.2025.1551188, CC BY 4.0

Beyond conflict and the challenges of establishing a stable national government, a lack of economic resources has also contributed to the drop in weather station density. In the late 1980s and 1990s, Africa entered an economic depression that made it difficult for states to update their weather observational systems with technology on par with that used by countries in Europe and North America.

“African countries were not able to recover their meteorological [networks],” said David Mburu, who worked for more than 30 years as a meteorologist at the Kenya Meteorological Department.

Weather itself is partly to blame for slow economic development: Climate variability has caused frequent droughts, floods, heat waves, and land degradation, Dinku wrote in a 2006 article exploring the challenges of managing climate change on the continent.

The places in the world that are most affected by climate change tend to overlap with places in the world that are economically poor and, often, also data poor, Polonik said.

Shilenje said climate change only adds to the challenge of maintaining networks, affecting the durability of the instruments and equipment used to make observations. “There is a strong correlation between climate change and the ability to maintain a stable observing network on the continent,” he said.

A Dearth of Data

The reporting weather stations that do exist are often located along roads or concentrated in urban areas, meaning they’re not dispersed well enough to give an accurate reflection of weather across a whole country or region, Dinku said. Weather station coverage tends to be worse in rural areas, where better weather and forecasting information is most needed.

The dearth of observational stations has far-reaching consequences for Africans and those doing business there. Farming, for example, makes up about 15% of the continent’s economy. Without accurate forecasts, farmers are left without the information they need to make decisions about how to keep their livelihoods afloat.

“If you don’t know when it’s going to rain or how much to expect, then how do you go about your agriculture activities? That is the problem,” Annor said.

Inaccurate accounting of rainfall also means some farmers struggle to get their insurance claims paid, he said. “People then don’t want to invest in insurance again,” Annor said. “What that means is that people take calculated risk: They minimize the amount of food they can grow so they can minimize their risk.”


The observations lost over the past few decades didn’t just limit forecasts then: The holes in the data will exist forever, always needing to be filled with reanalysis data—climate modeling that interpolates data on the basis of available observations—whenever a meteorological service or scientists want to analyze trends in a country’s weather and climate.

Lost observations also mean policymakers have no long-term data to use to plan adaptation strategies. “The resilience of people in the communities is reduced, and people become very vulnerable,” Annor said.

It’s not just local residents who suffer the consequences of a lack of data, either. Weather patterns in Africa play a role in the genesis of Atlantic hurricanes and spark Saharan dust storms, which can travel thousands of kilometers and affect global atmospheric processes.

“The data from Ethiopia is not just for Ethiopia,” Dinku said. “The more observations you have in Africa, the better forecast we’ll have anywhere else in the world.”

A lack of observational stations leaves scientists without sufficient data to answer research questions, too. For instance, sparse rainfall observations limited a full assessment of whether and how climate change influenced heavy rainfall and flooding around Lake Kivu that killed at least 595 people in Rwanda and the Democratic Republic of Congo in 2023.

“The scarcity and inaccessibility of meteorological data…meant we couldn’t confidently evaluate the role of climate change,” scientists from World Weather Attribution wrote in a summary of their attempt to analyze the event.

Low-Cost Stations as a Solution

In 2006, Oregon State University hydrologist John Selker ran into a similar data problem. He was working in Ghana, attempting to measure how much rainfall trees intercept. He and his collaborators found themselves stymied by a lack of rainfall measurements that kept them from completing the analysis they had planned.

“It was really shocking,” Selker said, adding that the only rainfall data they seemed to be able to find were sparse datasets that they had to apply for access to.

Selker and his colleague at the Delft University of Technology, Nick van de Giesen, brainstormed a solution: a low-cost weather station that could transmit meteorological and hydrological data over cell networks. They called their new project the Trans-African Hydro-Meteorological Observatory, or TAHMO.

With TAHMO, “the question was, What can we do now to improve on the density of stations to ensure that we can have reliable data from Africa that can both help feed the global models and [create] local models that are as accurate and useful as the ones that we have in the U.S. and EU countries?” said Annor, TAHMO’s CEO.

To date, TAHMO has worked with national liaison agencies (most frequently, national meteorological agencies) to install more than 750 stations in 23 countries and has collected more than 7 billion total observations. The stations, owned and installed by TAHMO, measure standard weather parameters such as precipitation, wind speed, wind direction, relative humidity, temperature, and solar radiation. Often, TAHMO approaches national meteorological agencies with a proposal to install stations, though some countries have asked TAHMO for assistance, too.

The data from TAHMO stations are shared directly with each country’s liaison agency. Each agreement between TAHMO and a country allows TAHMO to make these data available for any researchers interested in using them in peer-reviewed studies. It also gives a country the right to halt data collection, if it chooses.


Mburu, the longtime Kenyan meteorologist, became one of TAHMO’s main contacts in that country, helping to establish a relationship between the organization and the Kenya Meteorological Department. Now semiretired, he is a consultant for TAHMO at the organization’s headquarters, located in Nairobi. In Kenya, he said, TAHMO stations have been the most reliable forecasting system over the past decade.

Data from TAHMO stations have given Kenya’s Meteorological Department significant insight into what causes flooding, especially in Nairobi County, said Paul Murage, a climate scientist at the department who also trains other meteorologists at the WMO regional training center in Nairobi. Flash flooding has become a significant issue in the city; Murage recounted a day in March 2024 when the Nairobi Expressway, a major roadway, was impassable during heavy rains.

Murage said having rainfall data from TAHMO stations empowers his agency to persuade policymakers that better, climate-proofed infrastructure is needed. “Policymakers are supposed to be guided by climate scientists, and the climate scientists can only authoritatively talk about that if they have quality data,” he said.

TAHMO stations were included in the High Impact Weather Lake System Project (HIGHWAY), the WMO project to improve early-warning systems across Lake Victoria.

Another U.S.-based project, called 3D-PAWS (for 3D-Printed Automatic Weather Station), works with national meteorological agencies in developing countries to establish 3D printing fabrication facilities, install 3D printed, low-cost observational stations, and train local staff to maintain their own stations long term.

The group has worked with six African countries—Kenya, Malawi, Senegal, Uganda, Zambia, and Zimbabwe—and has deployed more than 250 stations. Prior to being dissolved in July 2025, the U.S. Agency for International Development (USAID) was a major 3D-PAWS funder, connecting the organization with countries via requests from national meteorological agencies in Africa.

Staff from the Zimbabwe Meteorological Services Department install a 3D-PAWS (Printed Automatic Weather Station) tipping bucket rain gauge at the Kutsaga Research Station in 2024. Credit: Paul Kucera

The goal is for each country to eventually run its network completely on its own, said Paul Kucera, an atmospheric scientist at the University Corporation for Atmospheric Research (UCAR) who codeveloped the 3D-PAWS program with his colleague Martin Steinson. They designed the original 3D printed stations themselves and incorporated feedback from international partners in newer iterations of the design.

Each partner country owns the stations once they’re installed. Though the initial training and installations are supported by grants to 3D-PAWS, the expectation is that each country’s own staff will incorporate the costs of operation and maintenance into its annual budgets after a few years.

Barbados, one of the countries outside Africa that 3D-PAWS works with, now has a self-sufficient team independent of the 3D-PAWS group that provides feedback to 3D-PAWS on how to improve their operations. Kucera hopes Kenya will be the first African country to achieve the same level of independence.

The 3D-PAWS data are typically open to the public via a free database system. Near-real-time data from 3D-PAWS stations in Kenya and Zimbabwe are also sent to the Famine Early Warning Systems Network (FEWS NET), a program established by USAID to provide early-warning systems for famine. Kucera’s own research group aims to use the data to develop other tools, such as automatic early alerts for weather events.

Many national governments in Africa are investing in climate services, too, Shilenje said. He’s seen an increase in the number of countries adopting national frameworks for climate services and putting plans in motion to improve weather networks.

As one example, the Tanzania Meteorological Authority installed five new weather radars in 2024, bringing the countrywide total to seven and giving Tanzania the greatest number of radars in East and Central Africa. It’s “quite a significant investment” for a single African country to make, Shilenje said.

Global Support

Larger, international efforts also provide assistance. In 2021, the WMO launched the Systematic Observations Financing Facility (SOFF), a partnership between the WMO, United Nations Environment Programme, United Nations Development Programme, and United Nations Multi-Partner Trust Fund meant to finance advancements in weather observational networks in developing countries through grants. SOFF is also part of the U.N. Early Warnings for All initiative, a project aiming to ensure that everyone on Earth is protected from natural disasters by early-warning systems by 2027.

SOFF, now 4 years old, has partnered with 24 African countries to support station building, radiosonde launches, and continued maintenance of these networks. SOFF, like TAHMO and 3D-PAWS, emphasizes continued support after the initial installation of stations. SOFF does this via a peer-to-peer network of national meteorological agencies. Twenty-eight agencies worldwide have expressed interest in acting as peer mentors, many to African agencies, said Mario Peiró Espí from the partnership office at SOFF.

The concept has seen successes. Peiró Espí recounted a recent interaction with a staff member at the Austrian meteorological agency: “He said the guys in South Sudan write to him all the time to check on questions that before, they didn’t have anyone to check in with. They didn’t have any number to call and say, ‘Hey, we are facing this challenge, we don’t know how to solve it, can you help us?’”

Nine of SOFF’s African partner countries have entered the organization’s investment phase, during which national meteorological agency staff install stations and launch radiosondes with SOFF support. Mozambique, a coastal nation that frequently faces destructive floods and tropical cyclones, is one of those countries.

During a flooding event in 2000, Mozambique lost the majority of its weather stations. A $7.8 million grant from SOFF is helping the country’s national meteorological agency to recover the network by establishing 6 new land-based weather stations, upgrading 15 existing stations, and launching 4 upper-air stations.

Farther north, Chad faces a dire lack of weather data, too—as of October 2024, Chad was reporting just 3% of the surface weather observations required by the GBON agreement. SOFF is working with the country toward the goal of installing or upgrading at least 34 weather stations.

Markus Repnik, SOFF’s director, feels strongly that the world should think of improvements in Africa’s observational networks not just as assistance to Africa but as a global public good. The world is dependent on African meteorological agencies for accurate forecasts everywhere, he said. It’s as much as 20 times more valuable to install a single station in a data-poor area of Africa than to add one to a European network, he said.

While 3D-PAWS, TAHMO, and SOFF focus on station building and radiosonde launching, other groups lend additional support. Beyond investments in SOFF, the WMO is spending roughly $56 million in Africa on climate service projects such as those used to support improved food security and health outcomes.

Dinku’s research group has an additional solution: the so-called ENACTS (Enhancing National Climate Services) approach.

Via ENACTS, Dinku and his colleagues work with countries’ national meteorological and hydrological services to create more comprehensive and usable precipitation and temperature datasets. For rainfall, they combine a country’s meteorological station data with satellite rainfall estimates to improve the coverage of the dataset. For temperature, they use reanalysis to fill in missing past data points.

ENACTS prioritizes satellite and modeling work over installing new stations because new stations can’t provide the decades of past data needed to provide reliable forecasts or understand climate trends now. ENACTS has been implemented in 15 African countries at the national level.
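One simple way to blend gauges with satellite estimates—far cruder than the methods ENACTS actually uses, but enough to show the idea—is to rescale the satellite field so that it matches the gauges where both exist. Here is a sketch with invented numbers.

```python
import numpy as np

rng = np.random.default_rng(1)
satellite = rng.gamma(2.0, 5.0, size=(50, 50))    # gridded satellite estimate, mm/day

# Five gauges: their grid cells and measured rainfall (mm/day), all invented.
gauge_cells = [(5, 7), (12, 40), (25, 25), (33, 9), (44, 31)]
gauge_mm = np.array([14.2, 6.8, 11.5, 9.9, 4.1])

sat_at_gauges = np.array([satellite[r, c] for r, c in gauge_cells])
bias = gauge_mm.mean() / sat_at_gauges.mean()     # one multiplicative correction

merged = satellite * bias   # satellite's spatial pattern, anchored to the gauges
print(f"Bias factor {bias:.2f}; domain mean {satellite.mean():.1f} -> {merged.mean():.1f} mm/day")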

An Upward, Uncertain Trend

Thanks to these efforts and others, the number and density of reporting weather stations in Africa continue to tick slowly upward. Philanthropic donors are beginning to understand the importance of a robust, global weather observation system, and the need for improvements has gotten recent exposure on the world stage. But there’s still a long way to go before observational networks on the continent reach the density of those in Europe or North America.

One ever-present barrier is funding. Many African countries still lack the financial resources to improve their meteorological services, according to a recent WMO report. Even SOFF, which has been able to mobilize more than $100 million in grants in the 4 years it’s existed, faces a “very challenging fundraising environment,” Peiró Espí said. SOFF needs an additional $200 million by the end of 2026 to meet the needs of the countries it’s working with.

Facing such fundraising challenges, SOFF plans to announce a new funding mechanism, the SOFF Impact Bond, at COP30 (the 30th Conference of the Parties) in Belém, Brazil. The bond will “make resources available upfront…while allowing donors to spread contributions over a longer period,” according to SOFF.

A changing political landscape in the United States could pose obstacles, too. This summer, the Trump administration officially shut down USAID and said many of its programs would be absorbed by the U.S. State Department. Kucera said 3D-PAWS is still waiting to hear whether the changes will affect its fiscal year 2026 funding, but the group is being “cautiously optimistic” and working to diversify its funding sources.

Fragmentation of efforts also slows progress. Africa is full of fragmented investments, Repnik said. “In each and every country, you have a hodgepodge of investments.”

“The future benefits [of more investment] will be immense.”

This hodgepodge leads to scenarios like the one that Dinku witnessed in Kenya, where the national meteorological agency receives data from a handful of different types of weather stations provided by various nongovernmental organizations and intergovernmental organizations, all with different data reporting systems. Shilenje has seen this, too: “You have different companies providing different sets of equipment,” he said. “It may be working very well, but compatibility is a challenge.”

To help with that issue, Dinku and a colleague created a data tool that allows users to access, process, perform quality control on, and visualize data from all their different weather station systems.

The WMO is working to solve the fragmentation issue as well, including efforts to improve national meteorological agencies’ digital capacity, a software tool to homogenize data, and the African Partner Coordination Mechanism, a platform by which nongovernmental organizations, intergovernmental organizations, and companies can exchange plans and objectives to ensure that everybody is working toward the same goal.

Still, collaboration and coordination are an uphill battle, Dinku said. “The last two or three decades, we have been talking about coordination. But everybody talks about coordination, and then they go about doing their own thing.”

Climate change only adds urgency to the efforts. As the climate warms, it will become even more variable. Risky decisions that farmers make about when, where, and what to plant will come with higher consequences than they once did. Mitigating the effects of climate change “will not happen without proper climate services,” Mburu said, adding that a robust observational network is critical to drastically reduce the impacts of climate change in Africa.

“The future benefits [of more investment] will be immense,” he said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Citation: van Deelen, G. (2025), Building better weather networks, Eos, 106, https://doi.org/10.1029/2025EO250386. Published on 27 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

New Earthquake Model Goes Against the Grain

Mon, 10/27/2025 - 12:50
Source: Geophysical Research Letters

When a slab slides beneath an overriding plate in a subduction zone, the slab takes on a property called anisotropy, meaning that characteristics such as its strength, and the speed of waves traveling through it, depend on direction. Anisotropy is what causes a wooden board to break more easily along the grain than in other directions. In rock, the alignment of minerals such as clay, serpentine, and olivine can lead to anisotropy. Pockets of water in rock can also cause and enhance anisotropy, as repeated dehydration and rehydration commonly occur at depth in a subducting slab.

It is well known that an earthquake generates both a compressional wave and a shear wave. If the shear wave passes through anisotropic rock, it can split into a faster shear wave and a slower one with different polarizations.

Although seismologists routinely measure the shear wave splitting in subduction zones by analyzing recorded seismic waveform data, it is challenging to pinpoint where splitting occurs along the wave propagation path.
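The measured quantity is essentially a delay time between the fast and slow shear waves. Here is a minimal sketch of that calculation, using hypothetical round-number velocities not taken from the study:

```python
# Delay time accumulated by a shear wave crossing an anisotropic layer.
# The path length and velocities below are hypothetical round numbers.
layer_km = 100.0  # path length through the anisotropic layer
vs_fast = 4.6     # km/s, speed of the fast-polarized shear wave
vs_slow = 4.4     # km/s, speed of the slow-polarized shear wave

delay_s = layer_km / vs_slow - layer_km / vs_fast
print(f"splitting delay: {delay_s:.2f} s")  # ~0.99 s for ~4% anisotropy
```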

In the past, researchers have looked to the circulation of Earth’s interior for answers, in particular to the mantle wedge above the slab and the mantle below it. However, Appini et al. suggest a different explanation: that, contrary to popular wisdom, it is the downgoing slab itself that causes most of the shear wave splitting.

The researchers tested their theory using recordings of 2,567 shear waves from the Alaska-Aleutian subduction zone. They found that the way the waves split as they propagated through the slab varied by earthquake location and that these variations were consistent with the anisotropy observed in the dipping slab. They also used a forward model to predict that the splitting pattern would differ depending on the direction from which the shear wave arrives, a prediction borne out by the observations. Previously, scientists attributed the variation in splitting patterns to complex mantle flows.

A dipping anisotropic slab also explains why deep earthquakes within a slab have unusual seismic wave radiation patterns. Other recent findings also hint that the composition of subducting plates causes anisotropy, the authors write.

If the slab holds most of the anisotropy, instead of the mantle wedge or subslab region, this finding has far-reaching consequences that could fundamentally change established ideas on how mantle dynamics work and how rock deforms, the authors suggest.

These results drive home that slab anisotropy is a plausible and understudied component of seismology and geodynamics, the authors say. (Geophysical Research Letters, https://doi.org/10.1029/2025GL116411, 2025)

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2025), New earthquake model goes against the grain, Eos, 106, https://doi.org/10.1029/2025EO250403. Published on 27 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Anticipating the impact of Hurricane Melissa in Jamaica

Mon, 10/27/2025 - 08:32

Hurricane Melissa is bearing down on Jamaica, with many areas likely to see over 500 mm of rainfall. The impacts could be extremely significant.

Hurricane Melissa has strengthened substantially over the weekend, and is now on course to track across Jamaica in the next couple of days. Various media agencies have identified the threats that this storm poses to a country with high vulnerability. As always, NOAA has excellent tracking charts for this storm.

The current forecast track will take the storm directly across Jamaica:-

The forecast track of Hurricane Melissa. Graphic from NOAA as at 07:52 UTC on 27 October 2025.

NOAA also provides data on forecast precipitation (rainfall):-

Precipitation potential for Hurricane Melissa. Graphic from NOAA as at 07:52 UTC on 27 October 2025.

There is a great deal of uncertainty in this type of forecast – the final totals will depend upon the track, the rate at which the storm moves, the intensity of the storm (and how that changes as a result of the contact with the land mass) and orographic effects. But much of Jamaica is forecast to receive over 500 mm of rainfall, and some parts may receive more than 750 mm.
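For scale, a quick calculation compares those forecast totals with Jamaica’s island-wide annual average rainfall of about 2,100 mm (discussed below):

```python
# Forecast storm totals as a share of Jamaica's average annual rainfall.
annual_mm = 2100
for forecast_mm in (500, 750):
    share = forecast_mm / annual_mm
    print(f"{forecast_mm} mm is {share:.0%} of the annual average")
# 500 mm is ~24% and 750 mm is ~36% of a typical year's rain, in a day or two.
```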

Now, the average annual rainfall in Jamaica is 2,100 mm for the island as a whole, and much more in some places, so this must be seen in context. However, as I have noted often before, in most cases the dominant medium through which tropical cyclone losses occur is water (even though windspeed often grabs the headlines). As the Google Earth image below shows, the island is characterised by steep slopes – this is a recipe for channelised debris flows:-

Google Earth image showing the landscape of eastern Jamaica.

There is active preparation underway in Jamaica, including evacuations; similar efforts during Hurricane Beryl last year were a success. However, we know that many people choose not to move, and this storm is on a different scale.

In the immediate aftermath, the initial focus will inevitably be on the capital, Kingston, as this is where the reporters are likely to be located. Watch out for news from the east of the island though, especially on the coast and on the southern and eastern sides of the mountains. In severe storms, communications are often lost, so in this case no news may well be bad news.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The giant Tupaasat rock avalanche in South Greenland

Fri, 10/24/2025 - 14:38

A new paper describes a rock avalanche in Greenland about 10,900 years BP that had a volume of over 1 billion cubic metres and that travelled almost 16 kilometres.

A fascinating paper (Pedersen et al. 2026) has just been published in the journal Geomorphology that describes a newly discovered ancient rock avalanche in Greenland. This landslide, which is located in the Tupaasat Valley, is truly enormous. The authors estimate that it has a volume exceeding 1 km3 (1 billion m3), with a runout distance of 15.8 kilometres and a vertical height difference of 1,440 metres.

The rear scar of the landslide is located at [60.4117, -44.2791]. It is really hard to capture this landslide on Google Earth, but fortunately the paper has been published under a creative commons licence. Here, therefore, is a map of the landslide by Pedersen et al. (2026):-

A) Geomorphological map of the Tupaasat rock avalanche deposits within the landslide outline, together with the paleo-sea level line at 10 m a.s.l. and the proposed paleo-ice sheet extent.
B) Map showing the bathymetry data and the landslide outline. The bathymetry data is acquired from the Danish Geodata Agency and is not suitable for navigation. C) Cross-section of the Tupaasat rock avalanche with columns indicating the geomorphological features described in the results. The terrain slopes are presented below.
Images from Pedersen et al. (2026).

I have quickly annotated a Google Earth image of the site, showing the source and the track of the landslide. Note that the toe extends into the fjord, and thus is underwater, for a couple of kilometres:-

Annotated Google Earth image showing the track of the Tupaasat rock avalanche.

Landslides on this scale are hard to fathom. If this volume of rock were standing on a standard American football field (110 m x 49 m), it would form a column 185.5 km tall.
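The column height quoted above checks out; here is a quick sketch of the arithmetic:

```python
# Checking the stated figure: 1 km^3 of rock on a football-field footprint.
volume_m3 = 1e9          # the >1 km^3 volume estimated by Pedersen et al. (2026)
field_m2 = 110 * 49      # field dimensions quoted in the post (m^2)

column_km = volume_m3 / field_m2 / 1000
print(f"column height: {column_km:.1f} km")  # ~185.5 km, as stated
```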

Pedersen et al. (2026) have dated the time of occurrence of this landslide. They conclude that it occurred about 10,900 years ago. This coincides remarkably well with the dated deglaciation (retreat of the ice sheets) in this area. Thus, the authors suggest that the instability was probably associated with debuttressing of the glacier (i.e., the removal of the ice adjacent to the slope). They cannot rule out the possibility that final failure might have been triggered by an earthquake, though.

A further intriguing question is whether the event triggered a tsunami in the fjord. The distance that the landslide moved suggests that it was very energetic. Given that it extended to the water (and some of the deposit is now below the waterline), it is extremely likely that a displacement wave was triggered.

The latter point is very pertinent as there is increasing concern about the dangers of giant rock slope failures generating damaging tsunami events in fjords. For example, CNN published an article this week in the aftermath of the Tracy Arm landslide and tsunami that highlights the risk to cruise ships. It notes that:

Alaska’s foremost expert on these landslides knows why there hasn’t been a deadly landslide-turned-tsunami disaster, yet: sheer luck.

“It’s not because this isn’t a hazard,” said geologist Bretwood Higman, co-founder and executive director of nonprofit Ground Truth Alaska. “It’s because it just hasn’t happened to be above someone’s house or next to a cruise ship.”

An additional piece of context is the remarkable flooding that occurred in Alaska last weekend as the remnants of Typhoon Halong tracked across parts of the state. This appears to have received far less attention than might have been anticipated, at least outside the US.

It is surely only a matter of time before we see a really large-scale disaster as a result of a tsunami triggered by a rock slope failure. A very serious scenario is that a large cruise ship is overwhelmed and sunk. The loss of life could be very high.

Reference

L.L. Pedersen et al. 2026. A giant Early Holocene tsunamigenic rock-ice avalanche in South Greenland preconditioned by glacial debuttressing. Geomorphology, 492, 110057, https://doi.org/10.1016/j.geomorph.2025.110057.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Tiny Uranian Moon Likely Had a Massive Subsurface Ocean

Fri, 10/24/2025 - 13:25

Uranus’s tiny moon Ariel may have had a subsurface ocean that made up around 55% of its total volume. By mapping craters, crags, and ridges on the moon’s surface, planetary scientists modeled how thick Ariel’s crust was before it cracked under tidal stress and created the geologic features seen today. By subtracting the volumes of the crust and core from that of the moon, the researchers found that the Arielian ocean could have been about 170 kilometers thick as recently as 1 billion years ago.

“If Ariel had a subsurface ocean, it definitely does imply that other small icy moons could also have [had] subsurface oceans,” said Caleb Strom, who conducted this research as a planetary geologist fellow at the University of North Dakota in Grand Forks.

Maybe “it’s easier to make an ocean world than we thought,” he added.

An Unlikely Ocean World

Ariel is the second closest of the five large moons of Uranus. But large is a bit of a misnomer, as Ariel is only about 1,160 kilometers across, or about a third the size of Earth’s Moon.

When Voyager 2 flew through the Uranus system in 1986, scientists were surprised to see that Ariel’s icy surface was relatively young, was geologically complex, and showed some signs of cryovolcanism. Some features on the moon’s surface are similar to those seen on Europa, Enceladus, and Triton, three confirmed ocean worlds.

“We weren’t necessarily expecting it to be an ocean world.”

“What’s interesting about Ariel is that it’s unexpected,” Strom said. “We weren’t necessarily expecting it to be an ocean world.”

Later studies also found ammonia and carbon oxide compounds on Ariel’s surface, chemistry that often suggests the presence of subsurface liquid. The molecules disappear quickly unless they are frequently replenished.

But with Ariel being so small and unable to retain heat for very long, scientists thought that any subsurface ocean it may once have had was relatively thin and short-lived.

Strom and his colleagues didn’t initially set out to challenge this understanding of Ariel’s interior. They were interested in understanding the forces that could have created the moon’s geologic features.

To do this, the researchers first mapped the moon’s surface using images from the Voyager 2 flyby, cataloging ridges, fractures, and craters. They then modeled Ariel’s internal structure, giving it, from the top down, a brittle crust, a flexible crust, and an ocean all atop a solid core. They then simulated how that crust would deform under different levels of stress from tidal forces from other nearby Uranian moons and the planet itself. By varying the crust and ocean thickness and the strength of the tidal stress, the team sought to match the stress features in their models to the Voyager-derived geologic maps.

In 2023, the James Webb Space Telescope imaged Uranus and several of its major moons and rings. Credit: NASA, ESA, CSA, STScI; Image Processing: Joseph DePasquale (STScI)

The team’s models indicate that a crust less than 30 kilometers thick would have fractured under a moderate amount of tidal stress and created the geologic features seen today. The researchers suggest that to cause that stress, in the past 1–2 billion years (Ga), an orbital resonance with nearby moon Miranda stretched Ariel’s orbit about 4% from circular and fractured the surface.

“This is really a prediction about the crustal thickness” and the stress level it can withstand, Strom said. Then, with a core 740 kilometers across and a crust 30 kilometers thick, that would mean that Ariel’s subsurface ocean was 170 kilometers from top to bottom and made up about 55% of its total volume. The researchers published their results in Icarus in September.
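The 55% figure can be roughly reproduced with a simple spherical-shell calculation from the numbers quoted in this article. The sketch below uses those rounded values; the team’s structural model is more detailed, so this back-of-envelope geometry lands a few percentage points higher.

```python
from math import pi

# Rough spherical-shell estimate of Ariel's ocean volume fraction,
# using the rounded values quoted in the article.
r_moon = 1160 / 2        # km, Ariel's radius
r_core = 740 / 2         # km, core radius
crust_km = 30            # km, crust thickness
r_ocean_top = r_moon - crust_km

volume = lambda r: 4 / 3 * pi * r**3
ocean_fraction = (volume(r_ocean_top) - volume(r_core)) / volume(r_moon)
print(f"ocean fraction: {ocean_fraction:.0%}")  # ~59% with these rounded inputs
```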

Is Ariel Odd? Maybe Not

“The possible presence of an ocean in Ariel in the past [roughly] 1 Ga is certainly an exciting prospect,” said Richard Cartwright, an ocean world scientist at Johns Hopkins Applied Physics Laboratory (JHUAPL) in Laurel, Md. “These results track with other studies that suggest the surface geology of Ariel offers key clues in terms of recent activity” and the possibility that Ariel is, or was, an ocean world. Cartwright was not involved with the new research.

Strom cautioned that just because Ariel once had a substantial subsurface ocean doesn’t mean that it still does. The moon is very small and doesn’t retain heat very well, he said. Any ocean that remained would likely be much thinner and probably not a good place to search for life.

However, the fact that tiny Ariel may once have had such a large ocean may mean that ocean worlds are more common and easier to create than scientists once thought. Understanding the conditions that led to Ariel’s subsurface ocean could help scientists better understand how such worlds come about and how they evolve.

“Ariel’s case demonstrates that even comparatively sized moons can, under the right conditions, develop and sustain significant internal oceans.”

“Ariel’s case demonstrates that even comparatively sized moons can, under the right conditions, develop and sustain significant internal oceans,” said Chloe Beddingfield, a planetary scientist also at JHUAPL. “However, that doesn’t mean all similar bodies would have done so. Each moon’s potential for an ocean depends on its particular mix of heat sources, chemistry, and orbital evolution.”

An ocean composing 55% of a planet’s or moon’s total volume might seem pretty huge, but it also might be perfectly within normal range for ocean worlds, added Beddingfield, who was not involved with this research. “The estimated thickness of Ariel’s internal ocean…is striking, but not necessarily unexpected given the diversity of icy satellites.”

Too, Voyager 2 did not image all of Ariel’s surface, only the 35% that was illuminated during its flyby. A future long-term mission to the Uranus system could provide higher-resolution global maps of Ariel and other moons to help refine calculations of crustal thickness and determine the existence of subsurface oceans, Strom said.

Strom and his team plan to expand their stress test research to other moons of Uranus such as Miranda, Oberon, and Umbriel and possibly icy moons around other planets.

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

Citation: Cartier, K. M. S. (2025), Tiny Uranian moon likely had a massive subsurface ocean, Eos, 106, https://doi.org/10.1029/2025EO250398. Published on 24 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

As the Arctic Warms, Soils Lose Key Nutrients

Fri, 10/24/2025 - 13:22

This is an authorized translation of an Eos article.

Arctic and subarctic soils store a considerable share of Earth’s carbon. Rising temperatures, however, could drain nitrogen, a key nutrient, from these soils. According to a new study, the nitrogen loss could reduce plant growth, limiting the soils’ capacity to store carbon and amplifying global warming.

High-latitude soils store large amounts of carbon because low temperatures slow microbial activity. Although plants produce organic matter through photosynthesis, microorganisms cannot consume it quickly enough, so it accumulates over time. Scientists have worried that a warmer Arctic would accelerate microbial activity, releasing the stored carbon to the atmosphere as carbon dioxide (CO2). But they also expected warmer temperatures to stimulate plant growth, which would reabsorb part of the carbon and partially offset those emissions.

The new research shows that the latter scenario is very unlikely, because warming causes soils to lose nitrogen, a loss that could inhibit plant growth.

“We were not expecting to see a loss of nitrogen.”

The findings come from a decade-long experiment in a subarctic grassland near Hveragerði, Iceland. In 2008, a powerful earthquake altered geothermal water flows in the region, turning previously normal plots of soil into naturally heated zones with temperature gradients ranging from 0.5°C to 40°C above prior levels. The event created a unique natural laboratory for observing how ecosystems respond to long-term warming.

Using stable nitrogen-15 isotopes to trace nutrient flows across the landscape, the researchers found that for each degree Celsius of warming, soils lose between 1.7% and 2.6% of their nitrogen. The largest losses occurred during winter and early spring, when microbes remained active but plants were dormant. During this time, nitrogen compounds such as ammonium and nitrate were released into the soil, but plants could not absorb them; the compounds were lost either by leaching into groundwater or by escaping to the atmosphere as nitrous oxide, a greenhouse gas nearly 300 times more potent than CO2.
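To see what those per-degree rates imply over larger amounts of warming, here is an illustrative calculation; treating the loss as compounding per degree is an assumption of this sketch, not a claim from the study.

```python
# Illustrative cumulative nitrogen loss if the per-degree rate compounds.
# The compounding assumption is this sketch's, not the study's.
for rate in (0.017, 0.026):
    remaining = (1 - rate) ** 5  # after 5 degrees C of warming
    print(f"{rate:.1%}/degree C -> ~{1 - remaining:.0%} of soil N lost at +5 degrees C")
```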

The results were published in Global Change Biology.

“We were not expecting to see a loss of nitrogen,” said Sara Marañón, a soil scientist at Spain’s Centre for Ecological Research and Forestry Applications and first author of the study. “The soil’s mechanisms for storing nitrogen are deteriorating.”

A Less Fertile Ecosystem, Faster

The researchers also found that warming weakened the mechanisms that help soils retain nitrogen. In the warmest plots, microbial biomass and fine-root density, both critical for nitrogen storage, were much lower than in the cooler plots. Although microbes were less abundant, their metabolism was faster, releasing more CO2 per unit of biomass. Meanwhile, plants struggled to adapt, lagging in both growth and nutrient uptake.

“Microbial communities are able to adapt and reach a new equilibrium with faster activity rates,” Marañón said. “But plants cannot keep up.”

“This is not a very optimistic message.”

The increase in microbial metabolism initially results in greater consumption of the nitrogen and carbon available in the soil. After 5 to 10 years, however, the system appears to reach a new equilibrium, with reduced levels of organic matter and lower fertility. That shift suggests that soil warming can drive a transition to a permanently less fertile state, making it harder for vegetation to recover and leading to an irreversible loss of carbon.

Traditionally, scientists have thought that because organic matter decomposes faster in a warmer climate, the nitrogen it contains becomes more available, leading to greater productivity, according to Erik Verbruggen, a soil ecologist at the University of Antwerp in Belgium who was not involved in the study. “This paper shows that this is actually not happening.”

Instead, nitrogen is being leached out of the soil during spring, making it inaccessible for further biomass production. “This is not a very optimistic message,” Verbruggen said.

An Underestimated Source of Greenhouse Gases

Because Arctic regions are warming faster than the global average, this disruption of the nutrient cycle could soon become more apparent. The loss of nitrogen and carbon from soils in cold regions may represent a significant and previously underestimated source of greenhouse gas emissions, one that current climate models have not yet fully incorporated.

The researchers returned periodically to the heated grasslands near Hveragerði, Iceland, to measure nitrogen. Credit: Sara Marañón

The researchers plan to explore the early phases of soil warming by transplanting fragments of normal soils into heated areas and to investigate how different soil types respond to heat. Marañón noted that the Icelandic soils studied are volcanic in origin and very rich in minerals, unlike the organic peat soils common in other Arctic regions.

“Arctic soils also include permafrost in places like northern Russia and parts of Scandinavia, and those are the largest soil carbon reservoirs in the world,” Verbruggen said. The soils analyzed in this research, by contrast, were shallow grassland soils. “They are not necessarily representative of all Arctic soils.”

Still, Verbruggen added, the study’s findings highlight the delicate balance between productivity and nutrient loss in these systems.

The soil’s abundant carbon stores make it a major liability if managed poorly, Marañón said. “But it can also become a potential ally and offset CO2 emissions.”

—Javier Barbuzano (@javibar.bsky.social), Science Writer

This translation by Saúl A. Villafañe-Barajas (@villafanne) was made possible by a partnership with Planeteando and Geolatinas.

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

A Better Way to Monitor Greenhouse Gases

Fri, 10/24/2025 - 13:21

In recent years, the international community has made progress in slowing increases in the rate of carbon dioxide emissions and in acknowledging the scale of methane leaks from oil and gas facilities. However, carbon dioxide emissions continue to rise, methane releases from the energy sector have not abated, and there is more need than ever for targeted and sustained greenhouse gas (GHG) emissions reductions and other climate change mitigation approaches.

The success of climate change mitigation approaches relies in part on having accurate, timely, and integrated carbon cycle data from surface, airborne, and satellite sensors.

The success of such actions relies in part on having accurate, timely, and integrated carbon cycle data from surface, airborne, and satellite sensors covering local, regional, and international scales. These data improve efforts to track emissions reductions, identify and mitigate unexpected emissions and leaks, and monitor ecosystem feedbacks to inform land management.

In September 2024, researchers in the carbon cycle monitoring community met to discuss how best to establish a more effective system for monitoring GHGs and to help accelerate climate action through better data and decision support.

Here we highlight issues and challenges facing emissions monitoring and documentation efforts illuminated during the September meeting, as well as ideas and proposals for tackling the challenges. The recommendations emphasize the urgency of enhanced monitoring to support the goals of the Paris Agreement and the Global Methane Pledge, particularly in the face of increasing climate extremes and the vulnerability of Earth’s natural carbon reservoirs [Friedlingstein et al., 2025].

Bottom-Up Meets Top-Down

Parties to the Paris Agreement track their progress toward meeting GHG emissions reduction targets through bottom-up accounting methods that track carbon using local ground-based observations. These methods combine information about the spatial extents of carbon sources and sinks with estimates of how much these sources and sinks emit or take up, respectively.

This inventorying approach offers high-precision information at time intervals that support long-term tracking. However, it is also often time intensive, depends on country-specific methodologies, may not accurately reflect spatiotemporal variability in GHG fluxes, and is not suited for operational monitoring of sudden changes or reversals [Elguindi et al., 2020; Nicholls et al., 2015].

Top-down approaches using remotely sensed atmospheric GHG and biomass observations offer an independent accounting method [Friedlingstein et al., 2025], with the potential for low-latency (weekly to monthly) monitoring of GHG emissions and removals. Technological advances offered by facility-scale plume imagers (e.g., GHGSat, Earth Surface Mineral Dust Source Investigation (EMIT), Carbon Mapper) and global GHG mappers (e.g., Orbiting Carbon Observatory-2 and -3 (OCO-2 and -3), Tropospheric Monitoring Instrument (TROPOMI), Greenhouse gases Observing Satellite-2 (GOSAT-2)) show promise for monitoring GHG fluxes at the local and global scale, respectively [Joint CEOS-CGMS Working Group on Climate Greenhouse Gas Task Team, 2024].

Greenhouse gas (GHG) observations with existing capabilities alone are insufficient for adequately informing climate change mitigation measures.

However, a significant gap remains in our ability to monitor weaker, spatially distributed emissions and removals at intermediate (10- to 1,000-kilometer) scales [Joint CEOS-CGMS Working Group on Climate Greenhouse Gas Task Team, 2024], particularly in systems managed by humans such as energy production and land use.

Conversations during the 2024 workshop—partly intended to inform the development of the next Decadal Survey for Earth Science and Applications from Space—highlighted limitations in current GHG monitoring capabilities. They also emphasized the critical need for an operational observing system that leverages top-down and bottom-up approaches to support climate action at local, national, and international scales.

Because of a lack of sensitivity to subregional processes, GHG observations with existing capabilities alone are insufficient for adequately informing climate change mitigation measures [e.g., Jacob et al., 2022; Watine-Guiu et al., 2023]. We must also integrate state-of-the-art science and improved understanding of Earth’s changing carbon cycle, as well as data from new observing system technologies, into the information provided to decisionmakers.

This integration requires identifying gaps and opportunities with respect to knowledge, data, and stakeholder needs. It also requires defining a vision for sustained, operational GHG monitoring to support emissions reductions, track carbon cycle feedbacks, and deliver reliable, timely, transparent, and actionable information.

This vision could be achieved with a unified multitiered global system combining models and observations of the atmosphere, land, and ocean collected with surface, airborne, and satellite tools to track carbon fluxes (e.g., atmospheric emissions and removals) and stocks (e.g., biomass, soil carbon) with improved frequency, spatial coverage, and precision (Figure 1).

Fig. 1. An effective multitiered greenhouse gas (GHG) observing system should integrate observations of the atmosphere, land, and ocean from sensors and samples on Earth’s surface, in the air, and aboard satellites. Carbon dioxide is shown as black and red molecules, and methane is shown as black and white molecules. ARGO refers to a fleet of sensors floating in the upper ocean. FTIR is Fourier transform infrared spectroscopy. Credit: Created in BioRender; Carroll, 2025, https://BioRender.com/b77439n

Organizing such a system would require substantial international coordination among governmental, academic, and nongovernmental organizations, perhaps mediated through entities such as the World Meteorological Organization’s Global Greenhouse Gas Watch, the Committee on Earth Observation Satellites, and the U.S. Greenhouse Gas Center (USGHGC).

Addressing Gaps from Space

A globally unified GHG observing system should capitalize on spaceborne technologies to fill spatial and temporal gaps in in situ networks and to monitor the responses of carbon fluxes and stocks to disturbances, weather extremes, and environmental change. This system should prioritize four key elements.

First, gathering more vertically detailed data—from the top of the atmosphere to ground level—is critical. Existing satellites measure the total amounts of carbon dioxide and methane in the atmospheric column. These measurements work well for detecting changes over large (e.g., continental) spatial scales and at facility scale, but they provide less detail about smaller-scale processes. Knowing GHG concentrations near the surface relative to those in the upper atmosphere could, for example, provide improved tracking of fluxes and understanding of the processes responsible.

Sustained vertical GHG profiling, achieved using multichannel passive sensors deployed on missions such as GOSAT-2 or emerging cloud-slicing lidar methods, for example, is foundational to the proposed system. This profiling would provide long-term time series data to help researchers detect weak but consistent flux changes and increased sensitivity to natural and anthropogenic regional sources [e.g., Parazoo et al., 2016].
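The limitation of column-only measurements can be illustrated with a pressure-weighted average over a made-up profile. The numbers below are hypothetical, and real satellite retrievals use instrument-specific averaging kernels rather than this simple weighting.

```python
import numpy as np

# Hypothetical CO2 profile (ppm) on pressure levels (hPa), surface to top.
pressure = np.array([1000.0, 850.0, 700.0, 500.0, 300.0, 100.0])
co2 = np.array([425.0, 422.0, 420.0, 419.0, 418.0, 417.0])

# Pressure-weighted column average, the quantity column mappers report.
dp = -np.diff(pressure)                 # layer thicknesses (hPa)
layer_mean = 0.5 * (co2[:-1] + co2[1:]) # mean mixing ratio per layer
xco2 = np.sum(layer_mean * dp) / np.sum(dp)

print(f"surface: {co2[0]:.1f} ppm, column average: {xco2:.1f} ppm")
# An ~8 ppm surface enhancement shrinks to ~3 ppm in the column average,
# which is why vertical profiles add sensitivity to near-surface sources.
```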

Sampling the atmosphere every day would enable better detection of sudden changes in GHG concentrations and linking of those changes to particular sources.

Second, more frequent observations—obtained with a constellation of satellites observing from low, geostationary, and highly elliptical Earth orbits—are needed. Sampling the atmosphere every day, or even multiple times per day, would enable better detection of sudden changes in GHG concentrations and linking of those changes to particular sources.

Third, mapping of carbon stocks should be harmonized by combining information from different sensors and methods. Several means exist to map carbon in vegetation from space, for example, including lidar altimetry used to identify treetops and synthetic aperture radar used to estimate the volumes of trees.

Combining the strengths of existing methods and missions would facilitate more accurate and better resolved monitoring of carbon accumulation and loss due to management practices, disturbances, and ecosystem recovery. Future biomass satellite missions should focus on measurements at the scale of forest plots (i.e., hectare-scale systems with many trees) to provide more useful maps with reduced uncertainty, rather than on applying very high resolution sensors that resolve individual trees.

The fourth key is expanded satellite coverage of tropical, high-latitude, and oceanic regions to better monitor carbon cycle feedbacks [Sellers et al., 2018]. This coverage should involve the use of new active and imaging spectrometer techniques, such as those being developed in the Carbon-I mission concept study, to probe through prevalent clouds and darkness that hinder continuous monitoring.

Beyond the primary focus on GHG and biomass data, we also need—and have opportunities to obtain—complementary datasets to better constrain the locations of and processes affecting carbon sources and sinks. Atmospheric measurements of solar-induced fluorescence by vegetation, carbonyl sulfide, oxygen, carbon monoxide, and isotopes of carbon and oxygen could help disentangle fossil sources of emissions from biological sources and provide insights into processes such as photosynthesis and wildfire activity.

Currently, land and ocean ecosystems remove about half of the anthropogenic carbon emitted into the atmosphere, but this amount could change in the future [Friedlingstein et al., 2025]. Sustained monitoring of these ecosystems—and of the indicators of how they are changing—is necessary to understand and track diverse change across the Earth system.

Addressing Gaps from the Ground

Surface and airborne observations are essential for calibrating spaceborne measurements and for monitoring processes that can’t be observed from space.

Expanded surface and airborne networks for gathering data in situ from oceanic, terrestrial, and aquatic ecosystems are also a critical part of the proposed global observing system. These observations are essential for calibrating spaceborne measurements, for improving our understanding of undersampled regions (e.g., nonforest lands, rivers, wetlands, oceans), and for monitoring processes that can’t be observed from space.

Efforts on several fronts are required to provide more comprehensive ground- and air-based information on carbon fluxes and stocks to better meet stakeholder and research needs. Examples of these needed efforts include obtaining more atmospheric GHG profiles from research and commercial aircraft (e.g., through campaigns such as NOAA’s National Observations of Greenhouse Gasses Aircraft Profiles program), expanding measurements of surface-atmosphere GHG exchanges from tower-mounted sensors in undersampled terrestrial and aquatic systems [Baldocchi, 2020], and collecting seawater composition data from autonomous vehicles (e.g., Argo floats) in coastal and open oceans.

Other needed efforts include collecting more in situ measurements of above- and below-ground biomass and soil carbon and airborne sampling of managed and unmanaged (natural) experimental field sites. For example, monitoring of biomass reference measurement networks, such as GEO-TREES, should be expanded to facilitate monitoring and validation of spaceborne biomass data. These complementary measurements of quantities unobserved by remote sensing, such as soil carbon and respiration, are essential for tracking long-term storage [e.g., Konings et al., 2019].

Connecting Users to Data

Workshop participants envisioned a framework to support decisionmaking by scientists and stakeholders that links observing systems with actionable knowledge through a two-way flow of information. This framework involves three key pieces.

Identifying the underlying causes and drivers of changes in GHG emissions and removals is critical for developing effective, targeted mitigation and management policies.

First, integrating information from data-constrained models is crucial. Guan et al. [2023] offered a “system of systems” approach for monitoring agricultural carbon that is also applicable to other ecosystems. This approach leverages multitiered GHG and biomass data as constraints in land, ocean, and inverse models (which start with observed effects and work to determine their causes) to generate multiscale maps of observable and unobservable carbon stock and flux change. The result is a stream of continuous, low-latency information (having minimal delays between information gathering and output) for verifying GHG mitigation strategies.
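The inverse modeling step can be boiled down to a toy linear problem. The following is a minimal sketch under simple assumptions: a known sensitivity matrix, Gaussian noise, and ridge regularization standing in for a full Bayesian prior.

```python
import numpy as np

# Toy inversion: observations y = H @ x + noise, where x holds unknown
# fluxes for two regions and H maps fluxes to sensor concentrations.
rng = np.random.default_rng(0)
x_true = np.array([2.0, -1.0])                  # hypothetical source and sink
H = rng.uniform(0.1, 1.0, size=(20, 2))         # hypothetical sensitivities
y = H @ x_true + rng.normal(0.0, 0.05, size=20) # synthetic observations

# Ridge-regularized least squares, a stand-in for a Bayesian prior.
lam = 1e-2
x_hat = np.linalg.solve(H.T @ H + lam * np.eye(2), H.T @ y)
print("recovered fluxes:", x_hat)  # close to [2.0, -1.0]
```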

Second, scientists must work with stakeholders to identify the underlying causes and drivers of changes in GHG emissions and removals. This identification is critical for assessing progress and developing effective, targeted mitigation and management policies.

Third, the actionable knowledge resulting from this framework—and provided through organizations such as the USGHGC—must be applied in practice. Stakeholders, including corporations, regulatory agencies, and policymakers at all levels of government, should use improved understanding of carbon flux change and underlying drivers to track progress toward nationally determined contributions, inform carbon markets, and evaluate near- and long-term GHG mitigation strategies.

Meeting the Needs of the Future

Benchmarking and validation are important parts of building trust in models and improving projections of carbon-climate feedbacks. By using comprehensive observations of carbon fluxes and stocks to assess the performance of Earth system models [e.g., Giorgetta et al., 2013], scientists can generate more reliable predictions to inform climate action policies that, for example, adjust carbon neutrality targets or further augment GHG observing systems to better study regional feedbacks [Ciais et al., 2014].

The globally unified observing system envisioned, which would integrate advanced spaceborne technologies with expanded ground and air networks and a robust decision support framework, could significantly enhance our ability to track and mitigate GHG emissions and manage carbon stocks.

Successful implementation of this system would also hinge on data accessibility and community building. Developing a universal data platform with a straightforward interface that prioritizes data literacy is crucial for ensuring accessibility for a global community of users. In addition, fostering cross-agency partnerships, stakeholder engagement, and collaborative networking opportunities will be essential for building trust, catalyzing further participation in science, and developing innovative solutions for a more sustainable future.

Acknowledgments

The September 2024 workshop and work by the authors on this article were funded as an unsolicited proposal (Proposal #226264: In support of ‘Carbon Stocks Workshop: Sep 23–25, 2024’) by the U.S. Greenhouse Gas Center, Earth Science Division, NASA. A portion of this research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration (80NM0018D0004).

References

Baldocchi, D. D. (2020), How eddy covariance flux measurements have contributed to our understanding of global change biology, Global Change Biol., 26(1), 242–260, https://doi.org/10.1111/gcb.14807.

Ciais, P., et al. (2014), Current systematic carbon-cycle observations and the need for implementing a policy-relevant carbon observing system, Biogeosciences, 11(13), 3,547–3,602, https://doi.org/10.5194/bg-11-3547-2014.

Elguindi, N., et al. (2020), Intercomparison of magnitudes and trends in anthropogenic surface emissions from bottom-up inventories, top-down estimates, and emission scenarios, Earth’s Future, 8(8), e2020EF001520, https://doi.org/10.1029/2020EF001520.

Friedlingstein, P., et al. (2025), Global Carbon Budget 2024, Earth Syst. Sci. Data, 17(3), 965–1,039, https://doi.org/10.5194/essd-17-965-2025.

Giorgetta, M. A., et al. (2013), Climate and carbon cycle changes from 1850 to 2100 in MPI‐ESM simulations for the Coupled Model Intercomparison Project Phase 5, J. Adv. Model. Earth Syst., 5(3), 572–597, https://doi.org/10.1002/jame.20038.

Guan, K., et al. (2023), A scalable framework for quantifying field-level agricultural carbon outcomes, Earth Sci. Rev., 243, 104462, https://doi.org/10.1016/j.earscirev.2023.104462.

Jacob, D. J., et al. (2022), Quantifying methane emissions from the global scale down to point sources using satellite observations of atmospheric methane, Atmos. Chem. Phys., 22(14), 9,617–9,646, https://doi.org/10.5194/acp-22-9617-2022.

Joint CEOS-CGMS Working Group on Climate Greenhouse Gas Task Team (2024), Roadmap for a coordinated implementation of carbon dioxide and methane monitoring from space, 52 pp., ceos.org/document_management/Publications/Publications-and-Key-Documents/Atmosphere/CEOS_CGMS_GHG_Roadmap_Issue_2_V1.0_FINAL.pdf.

Konings, A. G., et al. (2019), Global satellite-driven estimates of heterotrophic respiration, Biogeosciences, 16(11), 2,269–2,284, https://doi.org/10.5194/bg-16-2269-2019.

Nicholls, D., et al. (2015), Top-down and bottom-up approaches to greenhouse gas inventory methods—A comparison between national- and forest-scale reporting methods, Gen. Tech. Rep. PNW-GTR-906, 30 pp., Pac. Northwest Res. Stn., For. Serv., U.S. Dep. of Agric., Portland, Ore., https://doi.org/10.2737/PNW-GTR-906.

Parazoo, N. C., et al. (2016), Detecting regional patterns of changing CO2 flux in Alaska, Proc. Natl. Acad. Sci. U. S. A., 113(28), 7,733–7,738, https://doi.org/10.1073/pnas.1601085113.

Sellers, P. J., et al. (2018), Observing carbon cycle–climate feedbacks from space, Proc. Natl. Acad. Sci. U. S. A., 115(31), 7,860–7,868, https://doi.org/10.1073/pnas.1716613115.

Watine-Guiu, M., et al. (2023), Geostationary satellite observations of extreme and transient methane emissions from oil and gas infrastructure, Proc. Natl. Acad. Sci. U. S. A., 120(52), e2310797120, https://doi.org/10.1073/pnas.2310797120.

Author Information

Dustin Carroll (dustin.carroll@sjsu.edu), Moss Landing Marine Laboratories, San José State University, San José, Calif.; also at Jet Propulsion Laboratory, California Institute of Technology, Pasadena; Nick Parazoo and Hannah Nesser, Jet Propulsion Laboratory, California Institute of Technology, Pasadena; Yinon Bar-On, California Institute of Technology, Pasadena; also at Department of Earth and Planetary Sciences, Weizmann Institute of Science, Rehovot, Israel; and Zoe Pierrat, Jet Propulsion Laboratory, California Institute of Technology, Pasadena

Citation: Carroll, D., N. Parazoo, H. Nesser, Y. Bar-On, and Z. Pierrat (2025), A better way to monitor greenhouse gases, Eos, 106, https://doi.org/10.1029/2025EO250395. Published on 24 October 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

1.5 Million Acres of Alaskan Wildlife Refuge to Open for Drilling

Thu, 10/23/2025 - 21:54
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

A large swath of the Arctic National Wildlife Refuge (ANWR) will soon open for drilling, the Trump administration announced today.

“For too long, many politicians and policymakers in DC treated Alaska like it was some kind of zoo or reserve, and that, somehow, by not empowering the people or having even the slightest ability to tap into the vast resources was somehow good for the country or good for Alaska,” Secretary of the Interior Doug Burgum said during an Alaska Day event.

As of July 2025, Alaska ranked sixth in the nation for crude oil production.

 

The news is the latest in a saga involving the ANWR, which in total spans 19.6 million acres. The 1.5 million acres to be opened for drilling represent the coastal plain of the refuge.

The 1980 Alaska National Interest Lands Conservation Act, which created most of the state’s national park lands, included a provision that no exploratory drilling or production could occur without congressional action.

Trump first opened the 1.5 million-acre coastal plain region for drilling in 2020, but the sale of drilling leases in early 2021 generated just $14.4 million in bids, rather than the $1.8 billion his administration had estimated.

On his first day in office, Biden placed a temporary moratorium on oil and gas drilling in the refuge, later going on to cancel the existing leases.

Trump resumed his efforts to allow drilling in ANWR early in his second term, though in January 2025, a lease sale attracted zero bidders. Previously, major banks had ruled out financing such drilling efforts, some citing environmental concerns. Cost is also likely a factor, as the area currently has no roads or facilities.

In addition to opening drilling, the Department of the Interior also announced today the reissuing of permits to build a road through Izembek National Wildlife Refuge and a plan to greenlight another road.

“Today’s Arctic Refuge announcement puts America — and Alaska — last,” said Erik Grafe, an attorney for the environmental law nonprofit Earthjustice, in a statement. “The Gwich’in people, most Americans, and even major banks and insurance companies know the Arctic Refuge is no place to drill.”

In contrast, Voice of the Arctic Iñupiat (VOICE), a nonprofit dedicated “to preserving and advancing North Slope Iñupiat cultural and economic self-determination,” released a statement on Thursday in favor of the policy shift.

“Developing ANWR’s Coastal Plain is vital for Kaktovik’s future,” said Nathan Gordon, Jr., mayor of Kaktovik, an Iñupiat village on the northern edge of ANWR. “Taxation of development infrastructure in our region funds essential services across the North Slope, including water and sewer systems to clinics, roads, and first responders. Today’s actions by the federal government create the conditions for these services to remain available and for continued progress for our communities.”

The Department of the Interior said it plans to reinstate the 2021 leases that were canceled by the Biden administration, as well as to hold a new lease sale sometime this winter.

—Emily Gardner (@emfurd.bsky.social), Associate Editor

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

New Satellite Data Reveal a Shift in Earth’s Once-Balanced Energy System

Thu, 10/23/2025 - 13:22

Years ago, scientists noted something odd: Earth’s Northern and Southern Hemispheres reflect nearly the same amount of sunlight back into space. The symmetry is odd because the Northern Hemisphere has more land, cities, pollution, and industrial aerosols, all of which should give it a higher albedo, meaning a greater fraction of incoming sunlight reflected rather than absorbed. The Southern Hemisphere is mostly ocean, which is darker and absorbs more sunlight.

New satellite data, however, suggest that symmetry is breaking.

From Balance to Imbalance

In a new study published in the Proceedings of the National Academy of Sciences of the United States of America, Norman Loeb, a climate scientist at NASA’s Langley Research Center, and colleagues analyzed 24 years of observations from NASA’s Clouds and the Earth’s Radiant Energy System (CERES) mission.

They found that the Northern Hemisphere is darkening faster than the Southern Hemisphere. In other words, it’s absorbing more sunlight. That shift may alter weather patterns, rainfall, and the planet’s overall climate in the decades ahead.

Since 2000, CERES has recorded how much sunlight is absorbed and reflected, as well as how much infrared (longwave) radiation escapes back to space. Loeb and his colleagues used these measurements to analyze how Earth’s energy balance changed between 2001 and 2024. The energy balance tells scientists whether the planet is absorbing more energy than it releases and how that difference varies between hemispheres.

“Any object in the universe has a way to maintain equilibrium by receiving energy and giving off energy. That’s the fundamental law governing everything in the universe,” said Zhanqing Li, a climate scientist at the University of Maryland who was not part of the study. “The Earth maintains equilibrium by exchanging energy between the Sun and the Earth’s emitted longwave radiation.”

The team found that the Northern Hemisphere is absorbing about 0.34 watt more solar energy per square meter per decade than the Southern Hemisphere. “This difference doesn’t sound like much, but over the whole planet, that’s a huge number,” said Li.
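A back-of-envelope conversion shows why. The sketch below treats the trend as if it applied uniformly over half of Earth’s surface, a simplification of how the hemispheric means are actually defined:

```python
import math

# What 0.34 W/m^2 amounts to when spread over a hemisphere's surface.
EARTH_RADIUS_M = 6.371e6
hemisphere_m2 = 2 * math.pi * EARTH_RADIUS_M**2  # ~2.55e14 m^2

extra_watts = 0.34 * hemisphere_m2
print(f"{extra_watts:.2e} W")  # ~8.7e13 W, i.e., tens of terawatts
```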

Results pointed to three main reasons for the Northern Hemisphere darkening: melting snow and ice, declining air pollution, and rising water vapor.

To figure out what was driving this imbalance, the scientists applied a technique called partial radiative perturbation (PRP) analysis. The PRP method isolates the influence of individual factors, such as clouds, aerosols, surface brightness, and water vapor, on calculations of how much sunlight each hemisphere absorbs.

The results pointed to three main reasons for the Northern Hemisphere darkening: melting snow and ice, declining air pollution, and rising water vapor.

“It made a lot of sense,” Loeb said. “The Northern Hemisphere’s surface is getting darker because snow and ice are melting. That exposes the land and ocean underneath. And pollution has gone down in places like China, the U.S., and Europe. It means there are fewer aerosols in the air to reflect sunlight. In the Southern Hemisphere, it’s the opposite.”

“Because the north is warming faster, it also holds more water vapor,” Loeb continued. “Water vapor doesn’t reflect sunlight, it absorbs it. That’s another reason the Northern Hemisphere is taking in more heat.”

Curiosity About Cloud Cover

One of the study’s interesting findings is what didn’t change over the past 20 years: cloud cover.

“The clouds are a puzzle to me because of this hemispheric symmetry,” Loeb said. “We kind of questioned whether this was a fundamental property of the climate system. If it were, the clouds should compensate. You should see more cloud reflection in the Northern Hemisphere relative to the Southern Hemisphere, but we weren’t seeing that.”

Loeb turned to climate models to try to explain the cloud behavior, so far without a definitive answer.

“We are unsure about the clouds,” said Loeb.

“Understanding aerosol and cloud interactions is still a major challenge,” agreed Li. “Clouds remain the dominant factor adjusting our energy balance,” he said. “It’s very important.”

Still, Li said that “Dr. Norman Loeb’s study shows that not only does [the asymmetry] exist, but it’s important enough to worry about what’s behind it.”

Loeb is “excited about the new climate models coming out soon” and how they will further his work. “It’ll be interesting to revisit this question with the latest and greatest models.”

—Larissa G. Capella (@CapellaLarissa), Science Writer

Citation: Capella, L. G. (2025), New satellite data reveal a shift in Earth’s once-balanced energy system, Eos, 106, https://doi.org/10.1029/2025EO250399. Published on 23 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Melting Cylinders of Ice Reveal an Iceberg’s Tipping Point

Thu, 10/23/2025 - 13:22

The titanic dangers icebergs pose to ships are well documented. Sometimes, however, icebergs themselves can capsize, creating earthquakes and tsunamis or even pushing entire glaciers backward. Most of those dramatic events occur right after the chunk of floating ice splits off from its source, but sometimes icebergs flip over in the open ocean.

Earlier lab experiments using simulated plastic icebergs showed that the energy released in capsize events can rival nuclear weapon blasts. But although capsize events are likely related to melting induced by ocean warming, exactly why icebergs flip has been a harder question to answer. Large variations in iceberg size and shape, along with slow drifting across wide distances, make studying icebergs expensive and challenging.

One solution: make miniature icebergs in the lab and watch them melt under controlled conditions.

“Understanding the mathematics and the physics of what’s going on at a base level is important in order to scale up.”

“We wanted to study the simplest capsize problem we could come up with,” said Bobae Johnson, a physicist and Ph.D. student at the Courant Institute at New York University. She and her colleagues simplified and standardized iceberg shape to a cylinder of pure water ice 8 centimeters in diameter and 24 centimeters long. In their article for Physical Review Fluids, they described how each cylinder flipped several times over the course of a 30-minute experiment.

“It is good to look at these things on smaller scales because even what we were doing in the simplest setting gave us something very complex,” Johnson said. “Understanding the mathematics and the physics of what’s going on at a base level is important in order to scale up.”

From their experiments, Johnson and her colleagues linked the different rates of ice melt above and below the waterline to dynamic changes in the shape of the iceberg—including the location of the center of mass, which makes them flip. Despite the small scale of the experiments, the implications could be enormous.

“Icebergs play a key role in the climate system,” said Sammie Buzzard, a glaciologist at the Centre for Polar Observation and Modelling and Northumbria University who was not involved in the experiments. “When they melt, they add fresh, cold water to the ocean, which can impact currents.”

Icebergs, Soda Pop, and Cheerios

Real-world icebergs range in size from about 15 meters to hundreds of kilometers across, rivaling the size of some small nations. Tolkienesque mountain-like structures (“iceberg” literally means “ice mountain”) split off from glaciers, whereas flat, slablike icebergs tend to break off from ice shelves like those surrounding Antarctica.

“An iceberg’s shape determines how it floats in the water and which parts are submerged and which parts sit above the ocean’s surface,” Buzzard said, adding that icebergs change shape as they melt or erode via wind and wave action. But the precise manner of this change is uncertain because in situ measurements are challenging. “If this erosion changes the shape enough that the iceberg is no longer stable in the water, [the iceberg] can suddenly flip over into a position in which it is stable.”

“Even if lab experiments aren’t exactly the same as a natural system, they can go a long way to improving our understanding of [iceberg capsizing].”

Whatever their differences in shape and size, all icebergs are fresh water floating on salt water, so they share a defining property: Roughly 10% of their mass sits above the waterline, with the remaining 90% submerged. These similarities provided the starting point for the cylindrical iceberg experiments performed by Johnson and her collaborators.
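That 10/90 split follows directly from Archimedes’ principle: A floating body displaces its own weight of water, so the submerged fraction equals the ratio of ice density to seawater density. A minimal sketch of the arithmetic, using typical textbook densities rather than values from the study:

```python
# Archimedes' principle: a floating body displaces its own weight of
# water, so its submerged volume fraction is rho_ice / rho_seawater.
RHO_ICE = 917.0        # kg/m^3, typical pure water ice (assumed)
RHO_SEAWATER = 1025.0  # kg/m^3, typical surface seawater (assumed)

submerged = RHO_ICE / RHO_SEAWATER
print(f"Submerged: {submerged:.1%}, above water: {1 - submerged:.1%}")
# Submerged: 89.5%, above water: 10.5%
```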

A sphere or irregular body can rotate in many different directions, but a cylinder with a length greater than the diameter of its circular face floating in water will rotate along only one axis, effectively reducing the problem from three dimensions to two.

Standardizing the shape of the icebergs wasn’t the only simplification the team made. Under natural conditions, ice freezes from the outside in, which traps a lot of air. As icebergs melt, they sometimes release enough trapped air bubbles to make the surrounding water fizz like an opened can of soda pop. This effect can create chaotic motion in samples, so Johnson and collaborators opted to eliminate bubbles entirely in their experiment. To do so, they froze water in cylindrical molds suspended in extremely cold brine and stirred the water to drive residual air out—a process that took 24 to 48 hours for each cylinder.

This video depicts the flow of water beneath the surface of a melting model iceberg. Credit: New York University’s Applied Mathematics Laboratory

Finally, to keep the cylinders from drifting randomly in the ocean simulation tank, the researchers exploited the “Cheerios effect.” Floating cereal pieces tend to group together because of surface tension, so the team 3D printed pieces of flat plastic and coated them with wax. Placing those objects in the tank created a meniscus on either side of the cylinder, keeping it in place so the only motion it exhibited was the rotation they were looking for.

“The ice melts very slowly in the air and very quickly underwater,” Johnson said. In the experiment, that difference resulted in a gravitational instability as the center of mass shifted upward, making the whole cylinder flip. “Every time the ice locks into one position, it carves out a facet above the water and very sharp corners at the waterline, giving you a shape that looks quasi pentagonal about halfway through the experiment. We ran many, many experiments, and this happened across all of them.”

Buzzard emphasized the need for this sort of work. “Even if lab experiments aren’t exactly the same as a natural system, they can go a long way to improving our understanding of [iceberg capsizing],” she said. Every flip of a simulated iceberg could help us understand the effects on the warming ocean and the connection between small occurrences and global consequences.

—Matthew R. Francis (@BowlerHatScience.org), Science Writer

Citation: Francis, M. R. (2025), Melting cylinders of ice reveal an iceberg’s tipping point, Eos, 106, https://doi.org/10.1029/2025EO250390. Published on 23 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

How Plant-Fungi Friendships Are Changing

Wed, 10/22/2025 - 13:30
Source: Journal of Geophysical Research: Biogeosciences

Just as the human body contains a multitude of symbiotic microbial companions, most plant species also live alongside microbial friends. Among these companions are mycorrhizal fungi, which help plants gather water and nutrients—particularly nitrogen—from the soil. In exchange, plants provide mycorrhizal fungi with an average of 3% to 13% of the carbon they pull from the atmosphere through photosynthesis and sometimes as much as 50%.

This donation can represent a significant carbon cost for plants. But few groups have investigated how environmental factors such as soil temperature and nitrogen levels influence the amount of carbon flowing from plants to mycorrhizal fungi, or how this flow is likely to shift with climate change. To fill this gap, Shao et al. developed a model called Myco-CORPSE (Mycorrhizal Carbon, Organisms, Rhizosphere, and Protection in the Soil Environment) that illustrates how the environment influences interactions between plants and mycorrhizal fungi.

When the researchers fed data from more than 1,800 forest sites in the eastern United States into Myco-CORPSE, they obtained some familiar results and made some new discoveries. The model echoed previous work in suggesting that increasing the abundance of soil nitrogen, for example through fertilizer runoff, decreases plants’ dependence on mycorrhizal fungi and therefore reduces the amount of carbon plants allocate to their microbial counterparts. But in contrast to previous studies, the researchers found that rising soil temperatures had the same effect, reducing the amount of nitrogen and carbon exchanged between fungi and plants; warmth accelerates the breakdown of organic material, which releases nitrogen. Increasing atmospheric carbon dioxide levels, on the other hand, will likely increase plants’ reliance on mycorrhizal fungi by boosting their growth rate and therefore their need for nutrients.
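Those directional results can be caricatured in a few lines of code. The sketch below only illustrates the reported trends; it is not Myco-CORPSE itself, and the functional form and every coefficient are invented for the example:

```python
import math

def toy_carbon_to_fungi(soil_n, soil_temp_c, co2_ppm):
    """Toy fraction of photosynthate a plant allocates to its fungi.

    Captures only the directions reported for the model: allocation
    falls as soil nitrogen rises, falls as warming frees more soil
    nitrogen, and rises with CO2 as faster growth demands nutrients.
    The form and coefficients here are made up for illustration.
    """
    n_term = math.exp(-0.5 * soil_n)        # more soil N -> less reliance
    t_term = math.exp(-0.05 * soil_temp_c)  # warmth frees N -> less reliance
    c_term = co2_ppm / 400.0                # more CO2 -> more nutrient demand
    return min(0.08 * n_term * t_term * c_term, 0.5)  # ~50% reported ceiling

print(f"{toy_carbon_to_fungi(soil_n=1.0, soil_temp_c=10.0, co2_ppm=420):.1%}")
```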

The Myco-CORPSE model also replicated observed patterns showing that the two major kinds of mycorrhizal fungi (arbuscular and ectomycorrhizal) behave differently: Trees hosting arbuscular fungi tend to donate less carbon to them than trees hosting ectomycorrhizal fungi donate to theirs. The model also found that in forests with a mix of both kinds of fungi, the fungi typically accrue less carbon from plants than in forests with less mycorrhizal diversity.

As forest managers navigate the many stresses that forests face today, promoting a diversity of mycorrhizal species within forests could optimize plant growth while minimizing the carbon diverted to mycorrhizal fungi, the researchers wrote. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1029/2025JG009198, 2025)

This article is part of the special collection Biogeosciences Leaders of Tomorrow: JGR: Biogeosciences Special Collection on Emerging Scientists.

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2025), How plant-fungi friendships are changing, Eos, 106, https://doi.org/10.1029/2025EO250397. Published on 22 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

An Asteroid Impact May Have Led to Flooding near the Grand Canyon

Wed, 10/22/2025 - 13:30

When it comes to famous holes in the ground, northern Arizona has two: Grand Canyon and Barringer Meteorite Crater.

New research now suggests that these famous depressions might, in fact, be linked—the impact that created the crater roughly 56,000 years ago might also have unleashed landslides in a canyon that’s part of Grand Canyon National Park today. Those landslides in turn likely dammed the Colorado River and temporarily created an 80-kilometer-long lake, the team proposed. The results were published in Geology.

Driftwood Then and Now

“These are two iconic features of Arizona.”

Karl Karlstrom, a geologist recently retired from the University of New Mexico, grew up in Flagstaff, Ariz. Both Grand Canyon and Barringer Meteorite Crater were therefore in his proverbial backyard. “These are two iconic features of Arizona,” said Karlstrom.

Karlstrom’s father—also a geologist—used to regularly explore the caves that dot the walls of Grand Canyon and surrounding canyons. In 1970, he collected two pieces of driftwood from a cavern known as Stanton’s Cave. The mouth of Stanton’s Cave is more than 40 meters above the Colorado River, so finding driftwood in its recesses was unexpected. Routine flooding couldn’t have lofted woody detritus that high, said Karlstrom. “It would have required a flood 10 times bigger than any known flood over the last 2,000 years.”

The best radiocarbon dating available in the 1970s suggested that the driftwood was at least 35,000 years old. A colleague of the elder Karlstrom suggested that the driftwood had floated into Stanton’s Cave when an ancient landslide temporarily dammed the Colorado, raising water levels. The researchers even identified the likely site of the landslide—a wall of limestone in Nankoweap Canyon.

But what had set off that landslide in the first place? That’s the question that Karl Karlstrom and his colleagues sought to answer. In 2023, the researchers collected two additional samples of driftwood from another cave 5 kilometers downriver from Stanton’s Cave.

A “Striking” Coincidence

Modern radiocarbon dating of both the archival and newly collected driftwood samples yielded ages of roughly 56,000 years, with uncertainties of a few thousand years. The team also dated sand collected from the second cave; its ages, too, were consistent within error with emplacement roughly 56,000 years ago.
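An age of 56,000 years sits near the practical ceiling of the radiocarbon method, which is why the 1970s measurement could be read only as a minimum age. A back-of-envelope sketch of the decay arithmetic (the half-life is the accepted value for carbon-14, not a number from the study):

```python
HALF_LIFE_C14 = 5730.0  # years, accepted carbon-14 half-life

def fraction_remaining(age_years):
    """Fraction of the original carbon-14 left after a given time."""
    return 0.5 ** (age_years / HALF_LIFE_C14)

for age in (35_000, 56_000):
    print(f"{age} yr: {fraction_remaining(age):.3%} of original C-14")
# ~1.4% at 35,000 yr and ~0.1% at 56,000 yr: a signal too faint for
# 1970s techniques to resolve but within reach of modern methods.
```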

The potential significance of that timing didn’t sink in until one of Karlstrom’s international collaborators took a road trip to nearby Barringer Meteorite Crater, also known as Meteor Crater. There, he learned that the crater is believed to have formed around 56,000 years ago.

That coincidence was striking, said Karlstrom, and it got the team thinking that perhaps these two famous landmarks of northern Arizona—Meteor Crater and Grand Canyon National Park—might be linked. The impact that created Meteor Crater has been estimated to have produced ground shaking equivalent to that of an M5.2–5.4 earthquake. At the 160-kilometer distance of Nankoweap Canyon, the purported site of the landsliding, that ground movement would have been attenuated to roughly M3.3–3.5.

It’s impossible to know for sure whether such movement could have dislodged the limestone boulders of Nankoweap Canyon, Karlstrom and his colleagues concede. That’s where future modeling work will come in, said Karlstrom. It’s important to remember that an asteroid impact likely produces a distinctly different shaking signature than an earthquake caused by slip on a fault, said Karlstrom. “Fault slip earthquakes release energy from several kilometers depths whereas impacts may produce larger surface waves.”

But there’s good evidence that a cliff in Nankoweap Canyon did, indeed, let go, said Chris Baisan, a dendrochronologist at the Laboratory of Tree-Ring Research at the University of Arizona and a member of the research team. “There was an area where it looked like the canyon wall had collapsed across the river.”

An Ancient Lake

Using the heights above the Colorado where the driftwood and sand samples were collected, the team estimated that an ancient lake extended from Nankoweap Canyon nearly 80 kilometers upstream. At its deepest point, it would have measured roughly 90 meters. Such a feature likely persisted for several decades until the lake filled with sediment, allowing the river to overtop the dam and quickly erode it, the team concluded.
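The reported dimensions are mutually consistent: Water pooling behind a dam backs upstream until the lake surface meets the rising riverbed, so lake length is roughly pool depth divided by channel slope. A back-of-envelope check (the slope comparison is an assumption based on the modern river, not a figure from the study):

```python
# For a roughly uniform riverbed, a dam pool of depth d backs water
# upstream a distance L ~ d / S, where S is the channel slope.
depth_at_dam_m = 90.0     # deepest point reported for the paleolake
lake_length_m = 80_000.0  # reported upstream extent

implied_slope = depth_at_dam_m / lake_length_m
print(f"Implied slope: {implied_slope * 1000:.1f} m per km")  # ~1.1 m/km
# Similar in order to the modern Colorado's gradient through Grand
# Canyon, so the reported depth and length hang together.
```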

“They’re certainly close, if not contemporaneous.”

The synchronicity in ages between the Meteor Crater impact and the evidence of a paleolake in Nankoweap Canyon is impressive, said John Spray, a planetary scientist at the University of New Brunswick in Canada not involved in the research. “They’re certainly close, if not contemporaneous.” And while it’s difficult to prove causation, the team’s assertion that an impact set landslides in motion in the area around Grand Canyon is convincing, he added. “I think the likelihood of it being responsible is very high.”

Karlstrom and his collaborators are continuing to collect more samples from caves in Grand Canyon National Park. So far, they’ve found additional evidence of material that dates to roughly 56,000 years ago, as well as even older samples. It seems that there might have been multiple generations of lakes in the Grand Canyon area, said Karlstrom. “The story is getting more complicated.”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), An asteroid impact may have led to flooding near the Grand Canyon, Eos, 106, https://doi.org/10.1029/2025EO250391. Published on 22 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Another landslide dam flood at the site of the Matai’an rock avalanche in Taiwan

Wed, 10/22/2025 - 06:59

A fresh failure within the debris left by the Matai’an rock avalanche allowed another barrier lake to form. That lake breached on 21 October 2025, generating another damaging debris flow.

Newspapers in Taiwan are reporting that a new landslide barrier lake formed and then failed at the site of the giant Matai’an rock avalanche. The breach apparently occurred at about 9 pm local time on 21 October 2025. The risk had been identified in advance, and this time the downstream population had been successfully evacuated, so there are no reports of fatalities.

The Taipei Times has an image of the barrier lake that was released by the Hualien branch of the Forestry and Nature Conservation Agency:-

The Matai’an landslide barrier lakes prior to the failure of the lower one on 21 October 2025. Photo courtesy of the Hualien branch of the Forestry and Nature Conservation Agency via the Taipei Times.

There is also a video on YouTube from Focus Taiwan (CNA English News) that includes helicopter footage of the site, also provided by the Forestry and Nature Conservation Agency:-

This includes the following still:-

The lower Matai’an landslide barrier lake prior to the failure on 21 October 2025. Still from a video posted to YouTube by CNA English News – original footage courtesy of the Hualien branch of the Forestry and Nature Conservation Agency.

It appears to me that the barrier lake formed because of a large landslide in the debris from the original rock avalanche; note the dark-coloured landslide scar on the left side of the image.

Loyal readers will remember that I highlighted that this could be an issue in my post on 3 October:-

“So, a very interesting question will now pertain to the stability of these slopes. How will they perform in conditions of intense rainfall and/or earthquake shaking? Is there the potential for a substantial slope failure on either side, allowing a new (enlarged) lake to form.”

“This will need active monitoring (InSAR may well be ideal). The potential problems associated with the Matai’an landslide are most certainly not over yet.”

There is a high probability that this will be a recurring issue in periods of heavy rainfall.

Meanwhile, keep a close eye on Tropical Storm Melissa, which is tracking slowly northwards in the Caribbean. Because it is moving so slowly, it could bring exceptionally high levels of rainfall to Haiti and Jamaica. This one looks like a disaster in waiting at the moment.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

To Find Critical Minerals, Look to Plate Tectonics

Tue, 10/21/2025 - 13:31

For much of the 20th century, “petroleum politics” shaped international policy. In the 21st century, a new set of resources has taken center stage: critical minerals. Sourcing and extracting these minerals have become a priority for countries and communities around the world because they are used in everything from solar panels to cell phones to superconductors.

A new study suggests where prospectors can search for critical minerals: rifting sites left behind by the supercontinent Rodinia, which broke up in the Proterozoic, more than 800 million years ago.

“To better find [critical] resources, really, we need a better understanding of geology.”

“Unless it is grown, absolutely everything on the planet that we use as a manufactured good requires something that comes out of a mine,” said Chris Kirkland, a geologist at Curtin University in Australia and a coauthor of the new study, published last month in Geological Magazine. “To better find those resources, really, we need a better understanding of geology.”

Kirkland and his colleagues began by analyzing rocks unearthed by drilling companies in Western Australia. The slabs contain carbonatite, a “weird,” rare, and poorly understood kind of igneous rock formed in the mantle from magmas rich in carbonate minerals. As the magmas rise through Earth’s interior, they react with surrounding rocks, altering the chemical signatures that geologists typically use to trace a sample’s origins.

Carbonatites often contain rare earth elements and other critical metals, such as niobium. Although niobium can be found in other rock types, carbonatites are the only ones offering it in amounts economically suitable for extraction. The Western Australia sites are home to more than 200 million metric tons of the metal.

The team “threw the whole kitchen sink of analytical techniques” at the carbonatites, explained Kirkland. The first step was to image the structure of a drill core sample to see the broad geological ingredients inside. Then the researchers used lasers to sample individual grains and tease apart their constituent crystals.

The carbonatites contained zircon, apatite, and mica, all crystals with isotopes that decay at known rates and can tell researchers about the sample’s age and source. The researchers also analyzed the helium present in zircon, because helium is a volatile element that easily escapes rocks near the surface and can help reveal when the rocks reached the crust.
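The dating logic behind those minerals is the standard radiometric age equation: A parent isotope decays into a daughter at a known rate, so the measured daughter-to-parent ratio fixes the time since the crystal locked up. A generic illustration using uranium-lead with textbook constants; this is not the team’s specific workflow:

```python
import math

LAMBDA_U238 = 1.55125e-10  # per year, U-238 -> Pb-206 decay constant

def radiometric_age_years(daughter_parent_ratio):
    """Age from the standard equation t = ln(1 + D/P) / lambda."""
    return math.log(1.0 + daughter_parent_ratio) / LAMBDA_U238

# A hypothetical zircon with a Pb-206/U-238 ratio of 0.135 would be
# roughly 800 million years old, the era of Rodinia's breakup.
print(f"{radiometric_age_years(0.135) / 1e6:.0f} Myr")  # ~816 Myr
```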

Written in Stone

The story written in the slabs is tied to the long history of plate tectonics. The breakup of Rodinia began around 800 million years ago and continued for millions of years as hot, metal-enriched magmas oozed up from the mantle. Pressure from this rising rock helped split the supercontinent apart, and the metals encased in carbonatites breached the surface at long-stable blocks of continental crust called cratons.

Today, said Kirkland, tracking these “old fossil scars” where cratons split could reveal stores of minerals.

More than 200 million metric tons of niobium were recently identified in Australia’s Aileron Province, a likely result of the breakup of Rodinia. Credit: Dröllner et al., 2025, https://doi.org/10.1017/S0016756825100204

“Reconstructing a geologic history for one particular area on Earth is something that I think has potential to help us in better understanding these pretty poorly understood carbonatite systems globally,” said Montana State University geologist Zachary Murguía Burton, who was not involved with the paper.

Burton estimates that some 20% of the carbonatites on Earth contain economically attractive concentrations of critical minerals, although he noted that the rocks in the study experienced a unique confluence of local and regional geologic processes that might influence the minerals they contain.

In particular, analyses of the carbonatites in the new study pinpointed the source of recently discovered niobium deposits beneath central Australia. Niobium is a critical mineral used in lithium-ion batteries and to strengthen and lighten steel. Because 90% of today’s supply of niobium comes from a single operation in Brazil, finding additional deposits is a priority.

In addition to niobium, Kirkland said a geologic “recipe” similar to the one his team identified might work for finding gold.

The work is an important reminder of “how tiny minerals and clever dating techniques can not only solve deep-time geological puzzles, but also help guide the hunt for the critical metals we need,” Kirkland said.

—Hannah Richter (@hannah-richter.bsky.social), Science Writer

Citation: Richter, H. (2025), To find critical minerals, look to plate tectonics, Eos, 106, https://doi.org/10.1029/2025EO250393. Published on 21 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Seismic Anisotropy Reveals Deep-Mantle Dynamics

Tue, 10/21/2025 - 13:31
Source: Geochemistry, Geophysics, Geosystems

In some parts of Earth’s interior, seismic waves travel at different speeds depending on the direction in which they move through the rock. This property is known as seismic anisotropy, and it can offer important information about how the silicate rock of the mantle, particularly at its lowermost depths, deforms. In contrast, areas through which seismic waves travel at the same speed regardless of direction are considered isotropic.
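Seismologists often quantify the effect as a percentage: the spread between the fast and slow propagation speeds relative to their mean. A minimal sketch of that common definition (the velocities below are illustrative, not values from the study):

```python
def percent_anisotropy(v_fast, v_slow):
    """Velocity spread between fast and slow directions, as % of mean."""
    return 100.0 * (v_fast - v_slow) / (0.5 * (v_fast + v_slow))

# Illustrative shear wave speeds (km/s) for an anisotropic patch of
# lowermost mantle; an isotropic region would return 0.
print(f"{percent_anisotropy(7.35, 7.20):.1f}% anisotropy")  # ~2.1%
```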

In the bottom 300 kilometers of the mantle, also known as the D’’ layer, anisotropy is potentially caused by mantle plumes or by mantle flow interacting with the edges of large low-shear-velocity provinces: continent-sized, dense, hot BLOBs (big lower-mantle basal structures) sitting at the base of the mantle, above the core. Many questions persist about the viscosity, movement, stability, and shape of the BLOBs, as well as about how they are influenced by mantle plumes and subduction.

Roy et al. used ASPECT, a 3D mantle convection modeling software, and ECOMAN, a mantle fabric simulation code, to examine the deep mantle. They tested five different mantle model configurations, adjusting the viscosity and density of the BLOBs. The goal was to see which configuration would most closely re-create the observed seismic anisotropy.

The researchers treated the BLOBs as regions with their own unique chemistry, which form from a 100-kilometer-thick layer at the bottom of the mantle. Their models simulated how mantle plumes formed over the past 250 million years, during which time events such as the breakup of Pangaea, the opening of the Atlantic, and the evolution of various subduction zones occurred.

The study suggests that observed seismic anisotropy is best reproduced when the BLOBs are 2% denser and 100 times more viscous than the surrounding mantle. This configuration aligns with anisotropy patterns observed in seismic data. Plumes form mainly at the edges of BLOBs, where strong deformation causes strong anisotropy. (Geochemistry, Geophysics, Geosystems, https://doi.org/10.1029/2025GC012510, 2025)
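One way to see why that configuration keeps the piles coherent is the buoyancy number B, a standard measure in thermochemical convection comparing the chemical density excess to thermal density variations; B near or above 1 favors piles that resist being stirred away. A rough sketch with assumed thermal parameters (expansivity, background density, and temperature contrast are typical literature values, not from the study):

```python
# Buoyancy number B = delta_rho_chem / (rho * alpha * delta_T).
RHO_MANTLE = 5500.0  # kg/m^3, lowermost-mantle density (assumed)
ALPHA = 1.0e-5       # 1/K, thermal expansivity at depth (assumed)
DELTA_T = 1500.0     # K, thermal anomaly scale (assumed)

delta_rho_chem = 0.02 * RHO_MANTLE  # the study's preferred 2% excess
B = delta_rho_chem / (RHO_MANTLE * ALPHA * DELTA_T)
print(f"B ~ {B:.2f}")  # ~1.33: dense enough to persist, not inert
```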

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2025), Seismic anisotropy reveals deep-mantle dynamics, Eos, 106, https://doi.org/10.1029/2025EO250392. Published on 21 October 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
