EOS

Science News by AGU

Inside a Georgia Beach’s High-Tech Fight Against Erosion

Tue, 09/02/2025 - 13:09

This story was originally published by Grist. Sign up for Grist’s weekly newsletter here. This coverage is made possible through a partnership between Grist and WABE, Atlanta’s NPR station.

At low tide on Tybee Island, Georgia, the beach stretches out as wide as it gets with the small waves breaking far away across the sand—you’ll have a long walk if you want to take a dip. But these conditions are perfect for a team of researchers from the University of Georgia’s Skidaway Institute of Oceanography.

Every three months, at low tide, they set out a miniature helipad near the foot of the dune and send up their drone equipped with lidar—technology that points a laser down at the sand and uses it to measure the elevation of the beach and dunes. The team flies it back and forth from the breakers to the far side of the dune and back until they have a complete, detailed map of the island’s 7-mile beach, about 400 acres.

“I see every flip-flop on the beach.”

“It’s high accuracy, it’s a high resolution,” explained research technician Claudia Venherm, who leads this project. “I see every flip-flop on the beach.”

That detailed information is crucial because Tybee is a barrier island, and rising seas are constantly eating away at the sandy beach and dunes that protect the island’s homes and businesses as well as a stretch of the Georgia mainland. Knowing exactly where the island is eroding and how the dunes are holding up to constant battering can help local leaders protect this piece of coastline.

“Tybee wants to retain its beach. It also wants to maintain, obviously, its dune. It’s a protection for them,” said Venherm. “We also give some of our data to the Corps of Engineers so they know what’s going on and when they have to renourish the beach.”

Since the 1970s, the Army Corps of Engineers has helped maintain Tybee Island’s beaches with regular renourishment: Every seven years or so, the Corps dredges up sand from the ocean floor and deposits it on the beach to replace sand that’s washed away. The data from the Skidaway team can only help the Corps do this work more effectively. Lidar isn’t new, and neither is aerial coastal mapping: Several federal agencies monitor coastlines with lidar, but those surveys of any one location are typically separated by years, not months.

The last renourishment finished in January 2020, and Venherm and her team got to work a few months later. That means they have five years of high-resolution beach data, recorded every three months and after major storms like Hurricane Helene, creating a precise picture of how the beach is changing.

“I can compute what the elevation of the dune is, as well as how much volume has been lost or gained since a previous survey.”

“I can compute what the elevation of the dune is, as well as how much volume has been lost or gained since a previous survey,” said Venherm. “I can also compute how long it will take until the beach is completely gone, or how long will it take until water reaches the dune system.”
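Conceptually, that volume calculation is a straightforward differencing of two gridded elevation surveys. Below is a minimal sketch in Python of how such a computation might look, assuming the lidar point clouds have already been processed into elevation grids (NumPy arrays) on a common grid; the function name and values are illustrative, not the Skidaway team's actual code.

```python
import numpy as np

def volume_change(dem_before, dem_after, cell_size_m):
    """Estimate sand volume gained (+) or lost (-) between two
    lidar-derived elevation grids covering the same area.

    dem_before, dem_after: 2D arrays of elevation (meters)
    cell_size_m: grid spacing in meters (e.g., 0.5 for 50 cm cells)
    """
    diff = dem_after - dem_before       # per-cell elevation change (m)
    cell_area = cell_size_m ** 2        # area of one grid cell (m^2)
    return np.nansum(diff) * cell_area  # net volume change (m^3)

# Illustrative example: a 100 m x 100 m patch at 0.5 m resolution that
# lost 0.05 m of elevation on average -> about -500 cubic meters of sand.
before = np.zeros((200, 200))
after = before - 0.05
print(volume_change(before, after, 0.5))  # -> -500.0
```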

The Corps conducts regular renourishment projects on beaches all along the East Coast and uses a template to inform that work, said Alan Robertson, a consultant who leads the city of Tybee’s resilience planning. But he hopes that such granular evidence of specific changes over time can shift exactly where the sand gets placed within the bounds of that template. An area near the island’s north end, for instance, is a clear hot spot for erosion, so the city may push for concentrating sand there and just to the north, so that it can drift south to fill in the eroding stretch.

“We know exactly where the hotspots of erosion are. We know where there’s accretion,” he said, referring to areas where sand tends to build up. “[We] never had that before.”

The data can also inform the city’s own decision-making, because it provides a much clearer picture of what happens to the dunes and beach over time after the fresh sand is added. In the past, they’ve been able to see the most obvious erosion, but now they can compare how different methods of dune-building and even sources of sand hold up. The vegetation that’s critical to holding dunes together, for instance, takes root far better in sand dredged from the ocean compared to sand trucked in from the mainland, Robertson said.

“There’s an example of the research and the monitoring. I actually can make that statement,” he said. “I actually know where you should get your sand from if you can, and why. No one could have told you that eight years ago.”

That sort of proven information is key in resilience projects, which are often expensive and funded by grants from agencies that want confirmation their money is being spent well.

“Everything we do now on resiliency, measuring, and monitoring has become a priority,” said Robertson. “We’ve been able over these years through proof statements of ‘look at what this does for you’ to make it part of the project.”

—Emily Jones (@ejreports.bsky.social), Grist

This article originally appeared in Grist at https://grist.org/science/inside-a-georgia-beachs-high-tech-fight-against-erosion/.

Grist is a nonprofit, independent media organization dedicated to telling stories of climate solutions and a just future. Learn more at Grist.org

How Researchers Have Studied the Where, When, and Eye of Hurricanes Since Katrina

Fri, 08/29/2025 - 12:02

On 28 August 2005, New Orleans area residents received a bulletin from the National Weather Service (NWS) office in Slidell, La., warning them of “a most powerful hurricane with unprecedented strength.” One excerpt of the chilling announcement, issued via NOAA radio and the Federal Communications Commission’s Emergency Alert System, read,

BLOWN DEBRIS WILL CREATE ADDITIONAL DESTRUCTION. PERSONS…PETS…AND LIVESTOCK EXPOSED TO THE WINDS WILL FACE CERTAIN DEATH IF STRUCK.

POWER OUTAGES WILL LAST FOR WEEKS…AS MOST POWER POLES WILL BE DOWN AND TRANSFORMERS DESTROYED. WATER SHORTAGES WILL MAKE HUMAN SUFFERING INCREDIBLE BY MODERN STANDARDS.

Hurricane Katrina, which caused 1,833 fatalities and about $108 billion in damage (more than $178 billion in 2025 dollars), remains the costliest hurricane on record to hit the United States and among the top five deadliest.

“If we were to have a Katrina today, that [forecast] cone would be half the size that it was in 2005.”

In the 20 years since the hurricane, meteorologists, modelers, computer scientists, and other experts have worked to improve the hurricane forecasting capabilities that inform bulletins like that one.

Consider the forecast cone, for instance. Also known as the cone of uncertainty, this visualization outlines the likely path of a hurricane with decreasing specificity further into the future: The narrower part of the cone might represent the forecasted path 12 hours in advance, and the wider part might represent the forecasted path 36 hours in advance.

“If we were to have a Katrina today, that cone would be half the size that it was in 2005,” said Jason Beaman, meteorologist-in-charge at the National Weather Service Mobile/Pensacola office.

How to Make a Hurricane

The ingredients for a hurricane boil down to warm water and low pressure. When an atmospheric low-pressure area moves over warm ocean water, surface water evaporates, rises, then condenses into clouds. Earth’s rotation causes the mass of clouds to spin as the low pressure pulls air toward its center.

Storms born in the Gulf of Mexico, or that traverse it as Katrina did, benefit from the gulf’s sheltered, warm water, and the region’s shallow continental shelf makes storm surges particularly destructive for Gulf Coast communities.

Hurricanes gain strength as long as they remain over warm ocean waters. But countless factors contribute to how intense a storm becomes and what path it takes, from water temperature and wind speed to humidity and proximity to the equator.

Because predicting the behavior of hurricanes requires understanding how they work, data gathered by satellites, radar, and aircraft are crucial for researchers. Feeding these data into computer simulations helps researchers understand the mechanisms behind hurricanes and predict how future storms may behave.

“Since 2005, [there have been] monumental leaps in observation skill,” Beaman said.

Seeing a Storm More Clearly

Many observations of the weather conditions leading up to hurricanes come from satellites, which can offer a year-round bird’s-eye view of Earth.

NOAA operates a pair of geostationary satellites that collect imagery and monitor weather over the United States and most of the Atlantic and Pacific oceans. The mission, known as the Geostationary Operational Environmental Satellite (GOES) program, has been around since 1975; the current satellites are GOES-18 and GOES-19.

When Beaman started his career just a few years before Katrina hit, satellite imagery from GOES-8 to GOES-12 was typically beamed to Earth every 30–45 minutes—sometimes as often as every 15 minutes. Now it’s routine to receive images every 5 minutes or even as often as every 30 seconds. Having more frequent updates makes for much smoother animations of a hurricane’s track, meaning fewer gaps in the understanding of a storm’s path and intensification.

For Beaman, the launch of the GOES-16 satellite in 2016 marked a particularly important advance: In addition to beaming data to scientists more frequently, it scanned Earth with 4 times the resolution of the previous generation of satellites. It could even detect lightning flashes, which can sometimes affect the structure and intensity of a hurricane.

The transition to GOES-16 “was like going from black-and-white television to 4K television.”

The transition to GOES-16 “was like going from black-and-white television to 4K television,” Beaman said.

NOAA also has three polar-orbiting satellites, launched between 2011 and 2017, that orbit Earth from north to south 14 times a day. As part of the Joint Polar Satellite System (JPSS) program, the satellites’ instruments collect data such as temperature, moisture, rainfall rates, and wind for large swaths of the planet. They also provide microwave imagery using radiation emitted from water droplets and ice. NOAA’s earlier polar-orbiting satellites had lower resolution at the edges of scans, a more difficult time differentiating clouds from snow and fog, and less accurate measurements of sea surface temperature.

“With geostationary satellites, you’re really just looking at the cloud tops,” explained Daniel Brown, branch chief of the Hurricane Specialist Unit at NOAA’s National Hurricane Center in Miami. “With those microwave images, you can really kind of see into the storm, looking at structure, whether an eye has formed. It’s really helpful for seeing the signs of what could be rapid intensification.”

NOAA’s Geostationary Operational Environmental Satellites (GOES) monitor weather over the United States and most of the Atlantic and Pacific oceans. Credit: NOAA/Lockheed Martin, Public Domain

Rapid intensification is commonly defined as an increase in maximum sustained wind speed of at least 30 knots (nautical miles per hour) in a 24-hour period. Katrina had two periods of rapid intensification, and they were one reason the storm was so deadly. In the second period, the storm strengthened from a low-end category 3 hurricane (in which winds blow between 178 and 208 kilometers per hour, or between 111 and 129 miles per hour) to a category 5 hurricane (in which winds blow faster than 252 kilometers per hour, or 157 miles per hour) in less than 12 hours.
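In practice, checking that criterion reduces to comparing each wind observation with the one 24 hours earlier. Here is a minimal Python sketch, assuming 6-hourly advisories in knots; the function name and the example values are hypothetical, not an operational tool.

```python
def rapid_intensification(winds_kt, hours_per_step=6,
                          threshold_kt=30, window_h=24):
    """Flag times at which a storm meets the common rapid intensification
    criterion: a rise of >= 30 kt in maximum sustained wind within 24 hours.

    winds_kt: sequence of max sustained winds (knots) at fixed intervals.
    """
    steps = window_h // hours_per_step
    return [winds_kt[i] - winds_kt[i - steps] >= threshold_kt
            for i in range(steps, len(winds_kt))]

# Katrina-like illustration: strengthening by 55 kt over 24 hours easily
# exceeds the 30 kt threshold (values invented for the example).
print(rapid_intensification([95, 100, 115, 150, 150]))  # -> [True]
```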

New Angles

Radar technology has also made strides in the decades since Katrina. Hurricane-tracking radar works via a ground- or aircraft-based transmitter sending out a radio signal. When the signal encounters an obstacle in the atmosphere, such as a raindrop, it bounces back to a receiver. The amount of time it takes for the signal to return provides information about the location of the obstacle.
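The range calculation at the heart of this is simple: the echo’s round-trip travel time, multiplied by the speed of light and halved to account for the out-and-back path, gives the distance to the target. A short illustrative sketch:

```python
C = 299_792_458  # speed of light in m/s

def echo_range_km(round_trip_seconds):
    """Distance to a radar target from the echo's round-trip time.
    The signal travels out and back, so the path is divided by two."""
    return C * round_trip_seconds / 2 / 1000

# An echo returning after 1 millisecond puts the raindrop ~150 km away.
print(round(echo_range_km(1e-3)))  # -> 150
```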

Between 2011 and 2013, NWS upgraded its 150+ ground-based radars throughout the United States with dual-polarization technology—a change a 2013 NWS news release called “the most significant enhancement made to the nation’s radar network since Doppler radar was first installed in the early 1990s.”

So-called dual-pol technology sends both horizontal and vertical pulses through the atmosphere. With earlier technology, a radar signal might tell researchers only the location of precipitation. Dual-pol can offer information about how much precipitation is falling, the sizes of raindrops, and the type of precipitation or can even help researchers identify debris being transported in a storm.

Credit: NOAA

“That’s not something that we had back in Katrina’s time,” Beaman said. In 2005, forecasters used “much more crude ways of trying to calculate, from radar, how much rain may have fallen.”

Radar updates have become more frequent as well. Beaman said his office used to receive routine updates every 5 or 6 minutes. Now they receive updated radar imagery as often as every minute.

Hunting Hurricanes from the Skies

For a more close-up view of a hurricane, NOAA and the U.S. Air Force employ Hurricane Hunters—planes that fly directly through or around a storm to take measurements of pressure, humidity, temperature, and wind speed and direction. These aircraft also scan the storms with radar and release devices called dropwindsondes, which take similar measurements at various altitudes on their way down to the ocean.

NOAA’s P-3 Orion planes and the 53rd Weather Reconnaissance Squadron’s WC-130J planes fly through the eyes of storms. NOAA’s Gulfstream IV jet takes similar measurements from above hurricanes and thousands of square kilometers around them, also releasing dropwindsondes along the way. These planes gather information about the environment in which storms form. A 2025 study showed that hurricane forecasts that use data from the Gulfstream IV are 24% more accurate than forecasts based only on satellite imagery and ground observations.

The NOAA P-3 Hurricane Hunter aircraft captured this image from within the eye of Hurricane Katrina on 28 August 2005, 1 day before the storm made landfall. Credit: NOAA, Public Domain

Hurricane Hunters’ tactics have changed little since Katrina, but Brown said that in the past decade or so, more Hurricane Hunter data have been incorporated into models and have contributed to down-to-Earth forecasting.

Sundararaman “Gopal” Gopalakrishnan, senior meteorologist with NOAA’s Atlantic Oceanographic and Meteorological Laboratory’s (AOML) Hurricane Research Division, emphasized that Hurricane Hunter data have been “pivotal” for improving both the initial conditions of models and the forecasting of future storms.

With Hurricane Hunters, “you get direct, inner-core structure of the storm,” he said.

Hurricane Hunters are responsible for many of the improvements in hurricane intensity forecasting over the past 10–15 years, said Ryan Torn, an atmospheric and environmental scientist at the University at Albany and an author of the recent study about Gulfstream IVs. One part of this improvement, he explained, is that NOAA began flying Hurricane Hunters not just for the largest storms but for weaker and smaller ones as well, allowing scientists to compare what factors differentiate the different types.

“We now have a very comprehensive observation dataset that’s come from years of flying Hurricane Hunters into storms,” he said. These datasets, he added, make it possible to test how accurately a model is predicting wind, temperature, precipitation, and humidity.

In 2021, NOAA scientists also began deploying uncrewed saildrones in the Caribbean Sea and western Atlantic to measure changes in momentum at the sea surface. The drones are designed to fill observational gaps between floats and buoys on the sea surface and Hurricane Hunters above.

Modeling Track and Intensity

From the 1980s to the early 2000s, researchers were focused on improving their ability to forecast the path of a hurricane, not necessarily what that hurricane might look like when it made landfall, Gopalakrishnan explained.

Brown said a storm’s track is easier to forecast than its intensity because a hurricane generally moves “like a cork in the stream,” influenced by large-scale weather features like fronts, which are more straightforward to identify. Intensity forecasting, on the other hand, requires a more granular look at factors ranging from wind speed and air moisture to water temperature and wind shear.

Storms like 2005’s Katrina and Rita “showed the importance of [tracking a storm’s] intensity, especially rapid intensification.”

Gopalakrishnan said storms like 2005’s Katrina and Rita “showed the importance of [tracking a storm’s] intensity, especially rapid intensification.”

Without intensity forecasting, Gopalakrishnan said, some of the most destructive storms might appear “innocuous” not long before they wreak havoc on coastlines and lives. “Early in the evening, nobody knows about it,” he explained. “And then, early in the morning, you see a category 3 appear from nowhere.”

Gopalakrishnan came to AOML in 2007 to set up both the Hurricane Modeling Group and NOAA’s Hurricane Forecast Improvement Project. He had begun working on what is now known as the Hurricane Weather Research and Forecasting model (HWRF) in 2002 in his role at NOAA’s Environmental Modeling Center. With the formation of the Hurricane Modeling Group in 2007, scientists decided to focus on using HWRF to forecast intensity changes.

HWRF used a technique called moving nests to model the path of a storm in higher resolution than surrounding areas. Gopalakrishnan compared a nest to using a magnifying glass focused on the path of a storm. Though a model might simulate a large area to provide plenty of context for a storm’s environment, capturing most of an area in lower resolution and the storm path itself in higher resolution can save computing power.
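A back-of-the-envelope comparison shows why nesting pays off. The sketch below uses made-up domain sizes and counts only grid cells; real models also pay extra for the shorter time steps that fine grids require, so this is a rough illustration of the idea, not HWRF's actual cost model.

```python
def relative_cost(domain_km, nest_km, coarse_km, fine_km):
    """Rough cost ratio (grid cells only) between running a whole square
    domain at fine resolution and running it coarse with one fine nest."""
    uniform_fine = (domain_km / fine_km) ** 2
    coarse_plus_nest = (domain_km / coarse_km) ** 2 + (nest_km / fine_km) ** 2
    return coarse_plus_nest / uniform_fine

# A 3,000 km domain at 2 km everywhere vs. an 18 km coarse grid with a
# 600 km nest at 2 km following the storm: roughly 20x fewer grid cells.
print(relative_cost(3000, 600, 18, 2))  # -> ~0.05
```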

By 2014, Gopalakrishnan said, the model’s track and intensity forecasting capabilities had improved 25% since 2007. The model’s resolution also improved, from a 9-kilometer grid in 2007 to a 1.5-kilometer grid by the time it was retired in 2023.

Since 2007, the National Hurricane Center’s official (OFCL) track forecast errors decreased between 30% and 50%, and intensity errors shrank by up to 55%. MAE = mean absolute error; VMAX = maximum sustained 10-meter winds. Credit: Alaka et al., 2024, https://doi.org/10.1175/BAMS-D-23-0139.1

Over time, advances in how data are introduced into models meant that the better data researchers were receiving from satellites, radars, and Hurricane Hunters improved modeling abilities even further. Gopalakrishnan estimated that by 2020, his office could predict hurricane track and intensity with somewhere between 50% and 54% more accuracy than in 2007.

NOAA began transitioning operations to a new model known as the Hurricane Analysis and Forecast System (HAFS) in 2019, and HAFS became the National Hurricane Center’s operational forecasting model in 2023. HAFS, developed jointly by several NOAA offices, can more reliably forecast storms, in part by increasing the use of multiple nests—or multiple high-resolution areas in a model—to follow multiple storms at the same time. HAFS predicted the rapid intensification of Hurricanes Helene and Milton in 2024.

Just as they did with HWRF, scientists run multiple versions of HAFS each year: an operational model, used to inform the public, and a handful of experimental models. At the end of hurricane season, researchers examine which versions performed best and combine the strongest elements into the next generation of the operational model. The team expects that as HAFS improves, it will lengthen the forecast window beyond the 5 days offered by previous models.

“As a developer [in 2007], I would have been happy to even get 2 days forecast correctly,” Gopalakrishnan said. “And today, I’m aiming to get a 7-day forecast.”

NOAA’s budget plan for 2026 could throw a wrench into this progress, as it proposes eliminating all NOAA labs, including AOML.

The Role of Communication

An accurate hurricane forecast does little good if the information isn’t shared with the people who need it. And communication about hurricane forecasts has seen its own improvements in the past 2 decades. NWS has partnered with social scientists to learn how to craft the most effective messages for the public, something Beaman said has paid dividends.

Communication between the National Hurricane Center and local weather service offices can now happen over video calls rather than by phone, as was once the case. Sharing information visually makes these calls more straightforward and efficient. NWS began sending wireless emergency alerts directly to cell phones in 2012.

In 2017, the National Hurricane Center began issuing storm surge watches and warnings in addition to hurricane watches and warnings. Beaman said storm surge inundation graphics, which show which areas may experience flooding, may have contributed to a reduction in storm surge–related fatalities. In the 50-year period between 1963 and 2012, around 49% of storm fatalities were related to storm surge, but by 2022, that number was down to 11%.

“You take [the lack of visualization] back to Katrina in 2005, one of the greatest storm surge disasters our country has seen, we’re trying to express everything in words,” Beaman said. “There’s no way a human can properly articulate all the nuances of that.”

Efforts to create storm data visualization go beyond NOAA.

Carola and Hartmut Kaiser moved to Baton Rouge, La., just weeks before Hurricane Katrina made landfall. Hartmut, a computer scientist, and Carola, an information technology consultant with a cartography background, were both working at Louisiana State University. When the historic storm struck, Hartmut said they wondered, “What did we get ourselves into?”

Shortly after the storm, the Kaisers combined their expertise and began work on the Coastal Emergency Risks Assessment (CERA). The project, led by Carola, is an easy-to-use interface that creates visual representations of data, including storm path, wind speed, and water height, from the National Hurricane Center, the Advanced Circulation Model (ADCIRC), and other sources.

The Coastal Emergency Risks Assessment tool aims to help the public understand the potential timing and impacts of storm surge. Here, it shows a forecast cone for Hurricane Erin in August 2025, along with predicted maximum water height levels. Credit: Coastal Emergency Risks Assessment

“We know of a lot of people who said, ‘Yes, thank you, [looking at CERA] caused me to evacuate.’”

What started as an idea for how to make information more user-friendly for the public, emergency managers, and the research community grew quickly: Hundreds of thousands of people now use the tool during incoming storm events, Hartmut said. The Coast Guard often moves its ships to safe regions on the basis of CERA’s predictions, and the team frequently receives messages of thanks.

“We know of a lot of people who said, ‘Yes, thank you, [looking at CERA] caused me to evacuate,’” Hartmut said. “And now my house is gone, and I don’t know what would have happened if I didn’t go.”

Looking Forward

Unlike hurricane season itself, the work of hurricane modelers has no end. When the season is over, teams such as Gopalakrishnan’s review the single operational and several experimental models that ran throughout the season, then work all year on building an upgraded operational model.

“It’s 365 days of model developments, testing, and evaluation,” he said.

NOAA scientists aren’t the only ones working to improve hurricane forecasting. For instance, researchers at the University of South Florida’s Ocean Circulation Lab (OCL) and the Florida Flood Hub created a storm surge forecast visualization tool based on the lab’s models. The West Florida Coastal Ocean Model, East Florida Coastal Ocean Model, and Tampa Bay Coastal Ocean Model were designed for the coastal ocean with a sufficiently high resolution to model small estuaries and shipping channels.

Though Yonggang Liu, a coastal oceanographer and director of OCL, cited examples of times his lab’s models have outperformed NOAA’s models, the tool is not used in operational NOAA forecasts. But it is publicly available on the OCL website (along with a disclaimer that the analyses and data are “research products under development”).

The Cyclone Global Navigation Satellite System (CYGNSS) is a NASA mission that pairs signals from existing GPS satellites with a specialized radar receiver to measure reflections off the ocean surface—a proxy for surface wind speed. The constellation of eight satellites can take measurements more frequently than GOES satellites, allowing for better measurement of rapid intensification, said Chris Ruf, a University of Michigan climate and space scientist and CYGNSS principal investigator.

It might seem that if a method or mission offers a way to more accurately forecast hurricanes, it should be promptly integrated into NOAA’s operational models. But Ruf explained NOAA’s hesitation to use data from university-led efforts: Because they are outside of NOAA’s control and could therefore lose funding or otherwise stop running, it’s too risky for NOAA to rely on such projects.

“CYGNSS is a one-off mission that was funded to go up there and do its thing, and then, when it deorbits, it’s over,” Ruf said. “They [at NWS] don’t want to invest a lot of time learning how to assimilate some new data source and then have the data disappear later. They want to have operational usage where they can trust that it’s going to be there later on.”

“These improvements cannot happen as a one-man army.”

Whatever office they’re in, it’s scientists who make the work of hurricane forecasting possible. Gopalakrishnan said that during Katrina, there were two or three people at NOAA associated with model development. He credits the modeling improvements made since then to the fact that, now, there’s a team of several dozen. And more advances may be on the horizon. For instance, NOAA expects a new Hurricane Hunter jet, a G550, to join the ranks by 2026.

However, some improvements are stalling. The Geostationary Extended Observations (GeoXO) satellite system is slated to begin expanding the observations of the GOES satellites in the early 2030s. But the 2026 U.S. budget proposal, which would slash $209 million from NOAA’s efforts to procure weather satellites and infrastructure, specifically suggests a “rescope” of the GeoXO program.

Hundreds of NOAA scientists have been laid off since January 2025, including Hurricane Hunter flight directors and researchers at AOML (though NWS received permission to rehire hundreds of meteorologists, hydrologists, and radar technicians, as well as hire for previously approved positions, in August).

In general, hurricane fatalities are decreasing: As of 2024, the 10-year average in the United States was 27, whereas the 30-year average was 51. But this decrease is not because storms are becoming less dangerous.

“Improved data assimilation, improved computing, improved physics, improved observations, and more importantly, the research team that I could bring together [were] pivotal” in enabling the past 2 decades of forecasting improvements, said Gopalakrishnan. “These improvements cannot happen as a one-man army. It’s a team.”

—Emily Dieckman (@emfurd.bsky.social), Associate Editor

Citation: Dieckman, E. (2025), How researchers have studied the where, when, and eye of hurricanes since Katrina, Eos, 106, https://doi.org/10.1029/2025EO250320. Published on 28 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

A Debate over Wakes in the Land of 10,000 Lakes

Fri, 08/29/2025 - 12:01

Wakeboats are causing a stir in Minnesota.

Though all powerboats create wakes, these specialty craft have heavier sterns and engines specifically designed to shape water into surfable waves. That extra turbulence is drawing ire from other lake-lovers.

Across the state, Minnesotans are reporting eroding banks, murky waters, and shredded vegetation. When considering wakeboats, one person’s recreation is another’s resentment.

“It’s divisive,” said Joe Shneider, president of the Minnesota Coalition of Lake Associations. “The three big issues we hear all the time [about wakeboats] are personal safety, bank erosion, and lake bed disruption.”

Specialty wakeboats are designed to shape water into surfable waves, allowing riders to follow behind without needing a towrope. New research shows how those wakes can affect the lake bed below. Credit: Colin Van Dervort/Flickr, CC BY 2.0

As the popularity and size of wakeboats grow, so does the need for data. Communities are wrestling with issues of regulation and education, and both approaches require information. That’s why Shneider and more than 200 others helped crowdfund recent research from the University of Minnesota’s Saint Anthony Falls Laboratory. (The state also supported the project.) The resulting public dataset shows how wakeboats can churn lake beds, information that can help communities navigate the brewing conflict.

The Stakes

Minnesota is not the only state navigating a great wake debate. In 2024, Maine implemented wakeboat regulations and Vermont restricted wake surfing to its 30 largest lakes. (Some residents want the number further reduced to 20.) In Wisconsin, individual municipalities are debating bans on wake surfing at hundreds of lakes, prompting at least one lawsuit.

Minnesota, in contrast, has issued wakeboat regulations at only one of its 10,000 lakes.

“There’s a whole lot of people out there that need to make decisions about their lake.”

The environmental issues at stake arise in shallow water, where powerboats can stir up obvious trails of sediment. Resuspended sediment absorbs sunlight, which heats the water column. Turbidity reduces the feeding rates of some fishes. Once-buried nutrients again become available, triggering toxic algal blooms that choke beaches and rob fish of oxygen.

But to connect the dots between wakeboat use and ecosystem disruption, researchers needed to document how various powerboats affect sediment dispersal.

“We want to understand how boats are interacting with the water column and provide data, because there’s a whole lot of people out there that need to make decisions about their lake,” said Jeff Marr, a hydraulic engineer at the University of Minnesota and a coauthor of the study.

The Wake

On Lake Minnetonka, just west of Minneapolis, seven locals lent their boats for the research. These watercraft ranged from relatively light, low-power deck boats (150-horsepower, 2,715 pounds) to burly bowriders (760-horsepower, 14,530 pounds) and included two boats built for wake surfing.

On test days, volunteers piloted their boats between buoy-marked goalposts. Acoustic sensors on the lake bed tracked pressure changes in the water column.

Powerboats mostly operate at either displacement speed (chugging low in the water) or planing speed (skipping faster along the surface). But there’s a transition called semidisplacement, in which the stern sinks in the water and waves spike in size.

“It’s right at that transition that [wakeboats] like to operate,” said Andy Riesgraf, an aquatic biologist at the University of Minnesota and a coauthor of the study.

Boaters drove the course five times at planing speed (21–25 miles per hour, common for water-skiing and tubing) and five times at displacement or semidisplacement mode (7–11 miles per hour, common for cruising and wake surfing). Researchers in rowboats paddled to collect water samples at various intervals in the track.

Researchers Chris Feist and Jessica Kozarek stand by the research rowboat. To minimize disruption in the water column, the human-powered sampling team paddled into the wake racetrack to collect 1-liter water samples at three different depths. Credit: Saint Anthony Falls Laboratory

The acoustic sensors showed that three types of waves affected the water column. Pressure waves, created by the immediate shift and rebound of water around a boat, were short-lived but strong enough to shake loose sediments. Transverse waves, which follow the boat’s path, and propeller wash, the frothy vortex generated by its engines, both elevated loose sediment and caused minutes-long disturbances.

Though all boats created these waves, the wakeboats churned the most sediment.

In planing mode, all seven boats caused brief and minimal disturbances. Sediments settled in less than 10 seconds at 9- and 14-foot depths. But when operating in slower, semidisplacement mode, wakeboats created a distinct disturbance. Following a pass from a wakeboat, sediment needed 8 minutes to settle at 14-foot depth and more than 15 minutes at 9-foot depth.

The research team released simple recommendations based on their findings. One recommendation is that all recreational powerboats should operate in at least 10 feet of water to minimize disturbances. Another is that wakeboats, when used for surfing, need 20 feet of water to avoid stirring up sediments and altering the ecosystem.

The Uptake

The new research adds to the group’s existing dataset on powerboats’ hydrologic impacts on lake surfaces.

Whether the suggestions lead to regulations is up to lake managers.

“Our goal is just to get the data out,” Marr said. The researchers published their findings in the University of Minnesota’s open-access digital library so that everyday lake-goers can find the information. Three external experts reviewed the material.

“The more we continue to collect this data, the more we start to fill in those other gaps.”

The results add information to the policy debate. “If there is going to be some type of environmental regulation [on powerboating], you need very clear evidence that under these conditions, it’s detrimental,” said Chris Houser, a coastal geomorphologist at the University of Waterloo who was not involved in the project.

There are other variables to study—such as the number of boats on the water and the paths they’re carving—but “the more we continue to collect this data, the more we start to fill in those other gaps of different depths and different configurations,” Houser said.

For Shneider, the new data add much-needed clarity. The latest report “is monumental,” he said.

Marr, Riesgraf, and their colleagues are now comparing the impacts of boat-generated wakes against wind-driven waves. Those data could further isolate the impacts powerboats have on lakes.

—J. Besl (@J_Besl, @jbesl.bsky.social), Science Writer

This news article is included in our ENGAGE resource for educators seeking science news for their classroom lessons. Browse all ENGAGE articles, and share with your fellow educators how you integrated the article into an activity in the comments section below.

Citation: Besl, J. (2025), A debate over wakes in the land of 10,000 lakes, Eos, 106, https://doi.org/10.1029/2025EO250316. Published on 29 August 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

USDA Moves to Rescind Roadless Rule Protecting 45 Million Acres of Wild Area

Thu, 08/28/2025 - 21:11
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

The U.S. Department of Agriculture (USDA) is proposing to rescind the 2001 Roadless Area Conservation Rule, which protects about 45 million acres (182,000 square kilometers) of National Forest System lands from road construction, reconstruction, and timber harvests.

Of the land that would be affected by the rescission, more than 95% is in 10 western states: Alaska, Montana, California, Utah, Wyoming, Nevada, Washington, Oregon, New Mexico, and Arizona. The change would not apply to Colorado and Idaho, which have state-specific roadless rules.

Secretary of Agriculture Brooke L. Rollins first announced the USDA’s plan to rescind the rule on 23 June, prompting negative responses from several environmental, conservation, and Native groups.

“The Tongass is more than an ecosystem—it is our home. It is the foundation of our identity, our culture, and our way of life,” said a letter from the Central Council of the Tlingit and Haida Indian Tribes of Alaska to the USDA and the U.S. Forest Service. “We understand the need for sustainable industries and viable resource development in Southeast Alaska. Our communities need opportunities for economic growth, but that growth must be guided by those who call this place home.”

 

On 27 August, the USDA released a statement about the agency taking “the next step in the rulemaking process,” noting that the proposal aligned with several recent executive orders, including Executive Order 14192, Unleashing Prosperity Through Deregulation, and Executive Order 14153, Unleashing Alaska’s Extraordinary Resource Potential.

“This administration is dedicated to removing burdensome, outdated, one-size-fits-all regulations that not only put people and livelihoods at risk but also stifle economic growth in rural America,” Rollins said in the release.

A notice of intent seeking public comment on the proposal was published in the Federal Register on Friday, 29 August, but a preview of the document became available for public inspection on 28 August. The document suggests that the rule has posed an “undue burden on production of the Nation’s timber and identification, development, and use of domestic energy and mineral resources.” Repealing the rule, the document states, would allow local land managers to make more tailored decisions and would allow for better wildfire suppression.

“This scam is cloaked in efficiency and necessity,” said Nicole Whittington-Evans, senior director of Alaska and Northwest programs at Defenders of Wildlife, in a statement. “But in reality, it will liquidate precious old-growth forest lands critical to Alaska Natives, local communities, tourists and countless wildlife, who all depend on intact habitat for subsistence harvesting, recreation and shelter. Rare and ancient trees will be shipped off at a loss to taxpayers, meaning that Americans will subsidize the destruction of our own natural heritage.”  

The proposal will be open for public comment through 19 September.

–Emily Dieckman, Associate Editor (@emfurd.bsky.social)

29 August 2025: This article was updated with a link to the notice of intent published in the Federal Register.

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

A Burst of Subglacial Water Cracked the Greenland Ice Sheet

Thu, 08/28/2025 - 13:12

Greenland, despite its name, is largely blanketed in ice. And beneath that white expanse lies a world of hidden lakes. Researchers have now used satellite observations to infer that one such subglacial lake recently burst through the surface of the Greenland Ice Sheet, an unexpected and unprecedented event. By connecting this outburst with changes in the velocity and calving of a nearby glacier, the researchers helped to unravel how subglacial lakes affect ice sheet dynamics. These results were published in Nature Geoscience.

Researchers have known for decades that pools of liquid water exist beneath the Antarctic Ice Sheet, but scientific understanding of subglacial lakes in Greenland is much more nascent. “We first discovered them about 10 years ago,” said Mal McMillan, a polar scientist at Lancaster University and the Centre for Polar Observation and Modelling, both in the United Kingdom.

Subglacial lakes can exert a significant influence on an ice sheet. That’s because they affect how water drains from melting glaciers, a mechanism that in turn causes sea level rise, water freshening, and a host of other processes that affect local and global ecosystems.

McMillan is part of a team that recently studied an unusual subglacial lake beneath the Greenland Ice Sheet. The work was led by Jade Bowling, who was a graduate student of McMillan’s at the time; Bowling is now employed by Natural England.

Old, but Not Forgotten, Data

In the course of mining archival satellite observations of the height of the Greenland Ice Sheet, the team spotted something unusual in a 2014 dataset: An area of roughly 2 square kilometers had dropped in elevation by more than 80 meters (260 feet) between two satellite passes just 10 days apart. That deflation reflected something going on deep beneath the surface of the ice, the researchers surmised.

A subglacial lake that previously sat at the interface between the ice and the underlying bedrock must have drained, said McMillan, leaving the ice above it hanging unsupported until it tumbled down. The team used the volume of the depression to estimate that roughly 90 million cubic meters (more than 3.1 billion cubic feet) of water had drained from the lake between the two satellite passes, making the event one of Greenland’s biggest subglacial floods in recorded history.
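The arithmetic behind such an estimate is essentially area times average depth of the collapse. As a rough order-of-magnitude check (the mean deflation here is an assumption chosen for illustration; the article reports only the ~2-square-kilometer footprint and the maximum drop of more than 80 meters):

```python
area_m2 = 2e6      # collapsed surface: roughly 2 square kilometers
mean_drop_m = 45   # hypothetical average deflation (maximum exceeded 80 m)
volume_m3 = area_m2 * mean_drop_m
print(f"{volume_m3:,.0f} m^3")  # -> 90,000,000 m^3, the reported ~90 million
```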

“We haven’t seen this before.”

Subglacial lakes routinely grow and shrink, however, so that observation by itself wasn’t surprising. What was truly unexpected lay nearby.

“We also saw an appearance, about a kilometer downstream, of a huge area of fractures and crevassing,” McMillan said. And beyond that lay 6 square kilometers (2.3 square miles)—an area roughly the size of lower Manhattan—that was unusually smooth.

The researchers concluded that after the subglacial lake drained, its waters likely encountered ice frozen to the underlying bedrock and were forced upward and through the surface of the ice. The water then flowed across the Greenland Ice Sheet before reentering the ice several kilometers downstream, leaving behind the polished, 6-square-kilometer expanse.

“This was unexpected,” said McMillan. “We haven’t seen this before.”

A Major Calving, a Slowing Glacier

It’s most likely that the floodwater traveled under northern Greenland’s Harder Glacier before finally flowing into the ocean.

Within the same 10-day period, Harder Glacier experienced its seventh-largest calving event in the past 3 decades. It’s impossible to know whether there’s a direct link between the subglacial lake draining and the calving, but it’s suggestive, said McMillan. “The calving event that happened at the same point is consistent with lots of water flooding out” from the glacier.

Using data from several Earth-observing satellites, scientists discovered that a huge subglacial flood beneath the Greenland Ice Sheet occurred with such force that it fractured the ice sheet, resulting in a vast quantity of meltwater bursting upward through the ice surface. Credit: ESA/CPOM/Planetary Visions

“It’s like you riding on a waterslide versus a rockslide. You’re going to slide a lot faster on the waterslide.”

The team also found that Harder Glacier rapidly decelerated—3 times more quickly than normal—in 2014. That’s perhaps because the influx of water released by the draining lake carved channels in the ice that acted as conduits for subsequent meltwater, the team suggested. “When you have normal melting, it can just drain through these channels,” said McMillan. Less water in and around the glacier means less lubrication. “That’s potentially why the glacier slowed down.”

That reasoning makes sense, said Winnie Chu, a polar geophysicist at the Georgia Institute of Technology in Atlanta who was not involved in the research. “It’s like you riding on a waterslide versus a rockslide. You’re going to slide a lot faster on the waterslide.”

Just a One-Off?

In the future, McMillan and his colleagues hope to pinpoint similar events. “We don’t have a good understanding currently of whether it was a one-off,” he said.

Getting access to higher temporal resolution data will be important, McMillan added, because such observations would help researchers understand just how rapidly subglacial lakes are draining. Right now, it’s unclear whether this event occurred over the course of hours or days, because the satellite observations were separated by 10 days, McMillan said.

It’s also critical to dig into the mechanics of why the meltwater traveled vertically upward and ultimately made it to the surface of the ice sheet, Chu said. The mechanism that this paper is talking about is novel and not well reproduced in models, she added. “They need to explain a lot more about the physical mechanism.”

But something this investigation clearly shows is the value of digging through old datasets, said Chu. “They did a really good job combining tons and tons of observational data.”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), A burst of subglacial water cracked the Greenland Ice Sheet, Eos, 106, https://doi.org/10.1029/2025EO250317. Published on 28 August 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Fossilized Micrometeorites Record Ancient CO2 Levels

Thu, 08/28/2025 - 13:10

Micrometeorites, unlike their larger brethren, rarely get a spotlight at museums. But there’s plenty to learn from these extraterrestrial particles, despite the largest of them measuring just millimeters across.

Nearly 50 tons of extraterrestrial material fall on Earth every day, and the majority of that cosmic detritus is minuscule. Micrometeorites are, by definition, smaller than 2 millimeters in diameter, and they’re ubiquitous, said Fabian Zahnow, an isotope geochemist at Ruhr-Universität Bochum in Germany. “You can basically find them everywhere.”

Researchers recently analyzed fossilized micrometeorites that fell to Earth millions of years ago. They extracted whiffs of atmospheric oxygen incorporated into the particles and showed that carbon dioxide (CO2) levels during the Miocene and Cretaceous did not differ wildly from modern-day values. The results were published in Communications Earth & Environment.

Extraterrestrial Needles in Rocky Haystacks

Newly fallen micrometeorites can be swept from rooftops and dredged from the bottoms of lakes.

Zahnow and his collaborators, however, opted to turn back the clock: The team analyzed a cadre of micrometeorites that fell to Earth millions of years ago and have since been fossilized. The team sifted through more than a hundred kilograms of sedimentary rocks, mostly unearthed in Europe, to discover 92 micrometeorites rich in iron. They added eight other iron-dominated micrometeorites from personal collections to bring their sample to 100 specimens.

Metal-rich micrometeorites such as these are special, said Zahnow, because they function like atmospheric time capsules. As they hurtle through the upper atmosphere on their way to Earth, they melt and oxidize, meaning that atmospheric oxygen gets incorporated into their otherwise oxygen-free makeup.

“When we extract them from the rock record, we have our oxygen, in the best case, purely from the Earth’s atmosphere,” said Zahnow.

Ancient Carbon Dioxide Levels

And that oxygen holds secrets about the past. It turns out that atmospheric oxygen isotope ratios—that is, the relative concentrations of the three isotopes of oxygen, 16O, 17O, and 18O—correlate with the amount of photosynthesis occurring and how much CO2 is present at the time. That fact, paired with model simulations of ancient photosynthesis, allowed Zahnow and his colleagues to infer long-ago atmospheric CO2 concentrations.

“The story of the atmosphere is the story of life on Earth.”

Reconstructing Earth’s atmosphere as it was millions of years ago is important because atmospheric gases affect our planet so fundamentally, said Matt Genge, a planetary scientist at Imperial College London not involved in the work. “The story of the atmosphere is the story of life on Earth.”

But Zahnow and his collaborators first had to make sure the oxygen in their micrometeorites hadn’t been contaminated. Terrestrial water, with its own unique oxygen isotope ratios, can seep into micrometeorites that would otherwise reflect atmospheric oxygen isotope ratios from long ago. That’s a common problem, said Zahnow, given the ubiquity of water on Earth. “There’s always some water present.”

The team found that the presence of manganese in their micrometeorites was a tip-off that contamination had occurred. “Extraterrestrial metal has basically no manganese,” said Zahnow. “Manganese is really a tracer for alteration.”

Unfortunately, the vast majority of the researchers’ micrometeorites contained measurable quantities of manganese. In the end, Zahnow and his collaborators deemed that only four of their micrometeorites were uncontaminated.

Those micrometeorites, which fell to Earth during the Miocene (9 million years ago) and the Late Cretaceous (87 million years ago), suggested that CO2 levels during those time periods were, on average, roughly 250–300 parts per million. That’s a bit lower than modern-day levels, which hover around 420 parts per million.

“What we really hoped for was to get pristine micrometeorites from periods where the reconstructions say really high concentrations.”

The team’s findings are consistent with values suggested previously, said Genge, but unfortunately, the team’s numbers just aren’t precise enough to conclude anything meaningful. “You have a really huge uncertainty,” he said.

The team’s methods are solid, however, said Genge, and the researchers made a valiant effort to measure what are truly faint whiffs of ancient oxygen. “It’s a brave attempt.”

In the future, it would be valuable to collect a larger number of pristine micrometeorites dating to time periods when model reconstructions suggest anomalously high CO2 levels, said Zahnow. “What we really hoped for was to get pristine micrometeorites from periods where the reconstructions say really high concentrations.”

Confirming with data whether periods such as the Triassic truly had off-the-charts CO2 levels would be valuable for understanding how life on Earth responded to such an abundance of CO2.

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), Fossilized micrometeorites record ancient CO2 levels, Eos, 106, https://doi.org/10.1029/2025EO250319. Published on 28 August 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

As Simple as Possible: The Importance of Idealized Climate Models

Thu, 08/28/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

“Everything should be made as simple as possible, but not simpler.” This popular saying paraphrases a sentiment expressed by Einstein about the need for simplicity, though not at the expense of accuracy. Modeling of Earth’s climate system has become an incredibly complex endeavor, especially when the physics of atmospheric motion is coupled with complex, nonlinear feedbacks from the ocean and land surface and with forcing by collective human actions. Such complexity can make the underlying causes of model behaviors hard to diagnose and can make targeted experiments prohibitively expensive to perform.

Two very recent developments, the emergence of kilometer-scale simulations and the rapid growth of machine learning (ML) approaches, have further increased the computational complexity of modeling global climate. In their commentary, Reed et al. [2025] remind us of the benefits of maintaining and applying a hierarchy of models with different levels of complexity. They make a special plea not to forget the power of using idealized, or simplified, climate models for hypothesis testing, model development, and teaching. 

Citation: Reed, K. A., Medeiros, B., Jablonowski, C., Simpson, I. R., Voigt, A., & Wing, A. A. (2025). Why idealized models are more important than ever in Earth system science. AGU Advances, 6, e2025AV001716. https://doi.org/10.1029/2025AV001716

—Susan Trumbore, Editor, AGU Advances

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The 26 August 2025 landslide on the Vaishno Devi pilgrimage route in India

Thu, 08/28/2025 - 06:51

On 26 August 2025, a landslide triggered by extraordinary rainfall killed at least 34 people and injured another 20 individuals.

On 26 August 2025, extremely intense late monsoon rainfall struck parts of Jammu and Kashmir in northern India, triggering extensive flooding and landslides. Unfortunately, a significant landslide occurred on the route to the Vaishno Devi shrine, a sacred Hindu site that attracts large numbers of pilgrims. At the time of writing, the reported loss of life is 34 people, with 20 more injured.

I can find little detailed information about this landslide at present – the site is remote and access is clearly extremely difficult. However, the event highlights a major issue that India faces during the monsoon.

The Google Earth image below shows the terrain around the Vaishno Devi shrine (located at 33.03004, 74.948032):

Google Earth image showing the terrain around the Vaishno Devi shrine in northern India.

The landscape here is steep and geologically vulnerable, and the shrine sits on a remote mountainside, accessed by tracks. There is a good account of making the pilgrimage here – this person started the journey at 19:15 and arrived at 02:00 the next day. The route is well established, but the journey is long (13 km), and most people travel on foot. According to the temple itself, over 5.2 million people have made the journey so far in 2025. Travel during the monsoon is not recommended, but many people inevitably make the trip at this time.

Thus, this pilgrimage, and others that also take devotees into the Himalayas, places people in a dynamic landscape at a time when landslides are most likely. Inevitably, the vulnerability of those people is high. The tragedy at Vaishno Devi is the consequence.

Unfortunately, this event is not isolated. On 14 August, another major landslide occurred at Chasoti in Kishtwar district, also in Jammu and Kashmir, on the route of the Machail Mata Yatra pilgrimage. The final loss of life is unclear, but at least 66 people were killed and some reports suggest as many as 75 more people may be missing. There have been a number of other fatal landslides this year on Hindu pilgrimage routes.

And loyal readers of this blog may remember the 2013 Kedarnath disaster, when vicious debris flows struck the route of the Chardham pilgrimage while it was packed with pilgrims. The remains of 733 victims were recovered, but 3,075 people remain missing. With a total of 3,808 victims, this was one of the worst landslide disasters of the last 30 years.

There are news reports that the Jammu and Kashmir chief minister, Omar Abdullah, is questioning why the Shri Mata Vaishno Devi Shrine Board did not suspend the pilgrimage. Reports indicate that the area received 629.4 mm of rainfall in a rolling 24-hour period, exceeding the previous record (342 mm) by a huge margin. In the view of the chief minister, these totals should have alerted the authorities to the potential for a disaster.

Whilst this is a pertinent question, it addresses a short-term issue rather than the underlying problems. The reality is that peak rainfall intensities in the summer monsoon are rapidly increasing across South Asia as a result of climate change, triggering landslides (especially channelised debris flows) and floods. At the same time, huge numbers of pilgrims are travelling into this landscape, where they are extremely vulnerable.

Managing this risk is very taxing, but many more people will lose their lives if systematic action is not taken to protect the pilgrims. These levels of loss cannot be tolerated.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

FEMA Puts Dissenting Staff on Indefinite Leave

Wed, 08/27/2025 - 14:52
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

Update 27 August 2025: This article has been updated to reflect newly released numbers of staff placed on leave.

On 25 August, 182 current and former staff members of the Federal Emergency Management Agency (FEMA) signed a declaration opposing the Trump administration’s actions to obstruct FEMA’s mission to provide relief and recovery assistance after natural disasters. The following evening, 36 FEMA staff, all signatories of the declaration, known as the Katrina Declaration, were placed on indefinite administrative leave.

Colette Delawalla, the executive director of Stand Up for Science, an advocacy group that helped publicize the letter, told the New York Times that the move appeared to be an act of retaliation.

“Once again, we are seeing the federal government retaliate against our civil servants for whistleblowing—which is both illegal and a deep betrayal of the most dedicated among us,” she said.

This is illegal, plain and simple. FEMA workers are doing their duty as public servants by blowing the whistle on the dismantling of their agency — and whistleblowing is protected under federal law.

—Stand Up for Science! (@standupforscience.bsky.social), via Bluesky, 27 August 2025

Employees were told the leave was effective immediately. Stand Up for Science and the Washington Post both confirmed that two of those suspended were taken off duty from recovery work at the site of Texas floods that killed at least 135 people in early July.

The notice of placement on administrative leave stated that the decision “is not a disciplinary action and is not intended to be punitive.” However, FEMA spokesperson Daniel Llargues said in a statement that “It is not surprising that some of the same bureaucrats who presided over decades of inefficiency are now objecting to reform.”

The staff who were placed on administrative leave will receive pay and benefits but do no work.

FEMA staff sent their letter of dissent to Congress 20 years after Hurricane Katrina, one of the deadliest natural disasters in modern U.S. history. Experts have long argued that many more people died than should have because of human failures in disaster planning and implementation. Twenty years later, FEMA staff warn that recent changes to the organization’s structure and procedures put the nation at risk for future Katrina-like disasters.

 

The letter specifically calls out reductions in disaster workforce, failure to appoint a Senate-confirmed FEMA administrator, elimination or reduction of risk reduction programs, interference with preparedness programs, censorship of climate science, and new policies regarding spending that have already delayed FEMA deployment to disaster areas.

The Katrina Declaration followed similar letters of dissent, also facilitated by Stand Up for Science, from the National Institutes of Health, NSF, EPA, and NASA. Not long after EPA staff sent their letter of dissent, 144 signatories were placed on administrative leave.

Only 36 individuals signed their names to the Katrina Declaration, while the rest chose to remain anonymous, likely for fear of similar retribution. (More people have signed the letter since 25 August, all anonymously.) The fears seem to have been well-founded: All those who signed their names were placed on leave.

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Machine Learning Simulates 1,000 Years of Climate

Wed, 08/27/2025 - 13:19
Source: AGU Advances

In recent years, scientists have found that machine learning–based weather models can make weather predictions more quickly and with less energy than traditional models. However, many of those models are unable to accurately predict the weather more than 15 days into the future and begin to simulate unrealistic weather by day 60.

A new model aims to overcome these limitations. The Deep Learning Earth System Model, or DLESyM, is built on two neural networks that run in parallel: One simulates the ocean while the other simulates the atmosphere. During model runs, predictions for the state of the ocean update every 4 model days. Because atmospheric conditions evolve more rapidly, predictions for the atmosphere update every 12 model hours.
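
In scheduling terms, the two networks simply advance on different clocks. The sketch below is a minimal illustration of that kind of asynchronous coupling, not the authors' code; `atmosphere_net` and `ocean_net` are hypothetical stand-ins for the trained networks.

```python
# Minimal sketch of asynchronous two-component stepping (illustrative only;
# `atmosphere_net` and `ocean_net` are hypothetical stand-ins for the
# trained networks, and the states are whatever arrays those networks expect).
ATMOS_STEP_HOURS = 12        # atmosphere advances every 12 model hours
OCEAN_STEP_HOURS = 4 * 24    # ocean advances every 4 model days

def simulate(atmos_state, ocean_state, n_hours, atmosphere_net, ocean_net):
    """Advance the coupled system, each component on its own clock."""
    for hour in range(0, n_hours, ATMOS_STEP_HOURS):
        if hour % OCEAN_STEP_HOURS == 0:
            # The slowly evolving ocean updates every 4 model days,
            # conditioned on the most recent atmospheric state.
            ocean_state = ocean_net(ocean_state, atmos_state)
        # The faster atmosphere updates every 12 model hours,
        # conditioned on the most recent ocean state.
        atmos_state = atmosphere_net(atmos_state, ocean_state)
    return atmos_state, ocean_state
```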

The model’s creators, Cresswell-Clay et al., found that DLESyM closely matches the past observed climate and creates accurate short-term forecasts. Using Earth’s current climate as a baseline, it can also accurately simulate climate and interannual variability over 1,000-year periods in less than 12 hours of computing time. It generally equals or outperforms models from the Coupled Model Intercomparison Project Phase 6, or CMIP6, which is widely used in computational climate research today.

The DLESyM model outperformed CMIP6 models in replicating tropical cyclones and Indian summer monsoons. It captured the frequency and spatial distribution of Northern Hemisphere atmospheric “blocking” events, which can cause extreme weather, at least as well as CMIP6 models. In addition, the storms the model predicts are highly realistic. For instance, the structure of a nor’easter generated at the end of a 1,000-year simulation (in the model year 3016) is very similar to that of a nor’easter observed in 2018.

However, both the new model and CMIP6 models poorly represent Atlantic hurricane climatology. Also, DLESyM is less accurate than other machine learning models for medium-range forecasts, or those made up to about 15 days into the future. Crucially, the DLESyM model only conducts simulations of the current climate, meaning it does not account for anthropogenic climate change.

The key benefit of the DLESyM model, the authors suggest, is that it uses far less computational power than running a CMIP6 model, making it more accessible than traditional models. (AGU Advances, https://doi.org/10.1029/2025AV001706, 2025)

—Madeline Reinsel, Science Writer

Citation: Reinsel, M. (2025), Machine learning simulates 1,000 years of climate, Eos, 106, https://doi.org/10.1029/2025EO250318. Published on 27 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Scientists Track Down Fresh Boulder Falls on the Moon

Wed, 08/27/2025 - 13:17

As a boulder rolls down a cliff slope on the Moon, it kicks up lunar dust, leaving behind a telltale herringbone pattern of ejecta.

In a recent study, for the first time, scientists geolocated and dated evidence of such boulder falls. They identified 245 fresh tracks created as boulders rolled, bounced, and slid down crater walls.

“For a long time, there was this belief that the Moon is geologically dead.…Our study shows that boulders with sizes ranging [from] tens to hundreds of meters and [with] weights in tons have moved from their places over time,” said Sivaprahasam Vijayan, the study’s lead author and an associate professor at the Physical Research Laboratory in Ahmedabad, India. “It is equally important to know how recent these boulder fall events are to understand the time periods when the geological agents were active.”

Tracking Boulder Falls

As lunar boulders bounce, they scoop up bright, unweathered subsurface material and bring it to the surface. As a result, fresh boulder fall tracks appear brighter than older ones.

“One can identify a boulder fall to be a recent one considering the boulder fall ejecta,” said Senthil Kumar Perumal, principal scientist with the Planetary Sciences Group at the National Geophysical Research Institute in Hyderabad, India, who was not involved in the new study.


To identify relatively recent boulder tracks, Vijayan and his colleagues first manually searched thousands of images of the lunar surface between 40°S and 40°N. At these latitudes, the Sun makes the bright boulder tracks distinguishable from the rest of the lunar surface. Once they identified a track, the researchers studied corresponding images taken by NASA’s Lunar Reconnaissance Orbiter Narrow Angle Camera between 2009 and 2022.

Next, scientists estimated the age of the tracks by studying regions with both boulder fall ejecta (BFE) and distinct impact ejecta blankets. (Such blankets, nicknamed the “lunar equivalent of fossils,” have long been used to estimate the age of impact events.) The craters analyzed by Vijayan and his colleagues were found to be around 400,000 years old—which means the BFE tracks are more recent.

Finally, the scientists identified possible seismic faults or impact craters nearby that could have triggered the boulder falls.

Mapping the Moon

The new geological map of boulder falls, published in Icarus, highlights seismically active spots and fresh impact sites on the Moon. Researchers say these regions could be potential landing sites for future lunar missions focused on recent surface and subsurface activity.

The study authors plan to integrate artificial intelligence methods into the next iteration of their work, but ultimately, Vijayan said, “the next step is to more precisely determine whether the cause [of a fall] is endogenic or exogenic, which can be achieved by deploying additional seismometers in upcoming missions.”

Kumar concurred. “We need to have a large network of seismometers covering the entire [Moon] that monitors seismic activity continuously for several decades,” he said.

—Unnati Ashar, Science Writer

Citation: Ashar, U. (2025), Scientists track down fresh boulder falls on the Moon, Eos, 106, https://doi.org/10.1029/2025EO250314. Published on 27 August 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Equatorial Deep Ocean Response to the Madden-Julian Oscillation

Wed, 08/27/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Journal of Geophysical Research: Oceans

The Madden-Julian Oscillation (MJO) is the dominant weather system in the tropics. It lasts several weeks and changes rainfall, cloudiness, and winds across the region. The MJO is well known for triggering an extratropical and global atmospheric circulation response. Recently, several case studies have also examined a deeper ocean response to the MJO.

Using 18 years of output from a high-resolution ocean reanalysis product (GLORYS12) largely constrained by Argo data, Robbins et al. [2025] detect intraseasonal (20-200 day) signals in currents, temperature, and salinity in the tropical oceans down to at least 2,000 meters. They show that these deep-penetrating structures are equatorial Kelvin waves forced by the MJO in the equatorial Pacific and Indian Oceans. This is one of the first studies to examine the impact of the MJO on the deep ocean, and it will be beneficial for future investigations into deep-ocean changes.
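
Isolating a 20-200 day band from a daily record is typically done with a band-pass filter. Below is a minimal sketch of that step, assuming a generic daily series rather than the study's actual GLORYS12 processing.

```python
# Minimal sketch of extracting intraseasonal (20-200 day) anomalies from a
# daily time series with a band-pass filter. Illustrative only: `series` is
# a generic daily 1-D array, not the study's GLORYS12 fields.
import numpy as np
from scipy.signal import butter, filtfilt

def intraseasonal(series, dt_days=1.0):
    """Band-pass a daily series to periods between 20 and 200 days."""
    nyquist = 0.5 / dt_days                    # cycles per day
    low, high = 1.0 / 200.0, 1.0 / 20.0        # band edges, cycles per day
    b, a = butter(4, [low / nyquist, high / nyquist], btype="band")
    return filtfilt(b, a, series)              # zero-phase, no time shift

# Example: a synthetic record with a 60-day oscillation plus an annual cycle;
# the filter keeps the 60-day part and suppresses the annual cycle.
t = np.arange(3650)
series = np.sin(2 * np.pi * t / 60) + 0.5 * np.sin(2 * np.pi * t / 365)
anomalies = intraseasonal(series)
```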

Citation: Robbins, C., Matthews, A. J., Hall, R. A., Webber, B. G. M., & Heywood, K. J. (2025). The equatorial deep ocean structure associated with the Madden-Julian Oscillation from an ocean reanalysis. Journal of Geophysical Research: Oceans, 130, e2025JC022457.  https://doi.org/10.1029/2025JC022457

—Xin Wang, Editor, JGR: Oceans

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Fatal landslides in June 2025

Wed, 08/27/2025 - 07:43

In June 2025, I recorded 51 fatal landslides worldwide, resulting in 479 fatalities. The number of fatal landslides is significantly above the long-term mean.

Yesterday, I provided an update on fatal landslides that occurred in May 2025. This post is a follow-up, providing the data for June.

As always, allow me to remind you that this is a dataset on landslides that cause loss of life, following the methodology of Froude and Petley (2018). At this point, the monthly data is provisional.

The headline is that I recorded 51 landslides over the course of the month, claiming 479 lives. Note that the landslide total is lower than that for May (n=66), which is a little unusual. However, 51 landslides is still substantially higher than the 2004-2016 mean (n=40.8), whilst the number of fatalities (479) is below the mean (n=746).

So, this is the monthly total graph to the end of June 2025:-

The number of fatal landslides to the end of June 2025 by month.

Plotting the data by pentad to the end of pentad 36 (29 June), the trend looks like this (with the exceptional year of 2024 plus the 2004-2016 mean for comparison):-

The number of fatal landslides to 29 June 2025, displayed in pentads. For comparison, the long term mean (2004 to 2016) and the exceptional year of 2024 are also shown.
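
For readers reproducing such plots, a pentad is simply a fixed 5-day bin, 73 per year. A minimal sketch of the binning, assuming a plain day-of-year convention:

```python
# Minimal sketch of pentad binning (fixed 5-day bins, 73 per year).
# Assumes a plain day-of-year convention; day 366 is folded into pentad 73.
from datetime import date

def pentad(d: date) -> int:
    """Return the 1-based pentad index (1-73) for a given date."""
    return min((d.timetuple().tm_yday - 1) // 5 + 1, 73)

# Example: 29 June 2025 is the last day of pentad 36, as used in the plot.
assert pentad(date(2025, 6, 29)) == 36
```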

Through to about 10 June, the trend for 2025 very closely matched that of 2024. However, by the end of the month a significant difference had emerged, with the landslide rate this year being somewhat lower. The data for July and August will start to tell us whether this is a trend.

So, what lies behind a monthly figure that is above the long-term average but below that of the exceptional year of 2024? The Copernicus surface air temperature data for June 2025 notes the following:-

“June 2025 was 0.47°C warmer than the 1991-2020 average for June with an absolute surface air temperature of 16.46°C. [It was the] third-warmest June on record, 0.20°C cooler than the warmest June in 2024, and 0.06°C cooler than 2023, the second warmest.”

Thus, if the hypothesis that landslide numbers are driven in part by atmospheric temperature is correct, the lower total than in 2024 is perhaps unsurprising.

Reference

Froude M.J. and Petley D.N. 2018. Global fatal landslide occurrence from 2004 to 2016. Natural Hazards and Earth System Science, 18, 2161-2181. https://doi.org/10.5194/nhess-18-2161-2018

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Fallout from the Fires

Tue, 08/26/2025 - 13:49
From Devastation to Data

Some effects of wildfire are immediately apparent: burned vegetation, smoldering ruins, dissipating smoke. The massive Palisades and Eaton Fires cut charred wakes through western Los Angeles County that remain long after firefighters contained the blazes earlier this year.

This month, we shadow geoscientists investigating the fires’ less tangible, if no less serious, consequences for regional air, soil, and water quality.

“Where There’s Fire, There’s Smoke,” writes Emily Dieckman in her profile of air quality following the fires—and where there’s smoke, there are particulates, including organic compounds, toxic chemicals, and hazardous dust and ash.

For Earth scientists, the liminal space between what is urban and what is wild is crucial for understanding postfire debris flows and the ground below. As profiled by Kimberly Cartier, these researchers consider the L.A. fires to be a case study of “how this urban-rural interface is changing and what…recovery looks like.”

Watersheds, those ever-changing interfaces between earth and water, are no less fraught, writes Grace van Deelen in “Scrambling to Study Smoke on the Water.” Scientists are documenting how ash-laden runoff is changing, if only ephemerally, both freshwater and marine ecosystems.

Perhaps the most elusive and powerful consequences of the fires are their effects on human health. And in places like Los Angeles, writes Dieckman, “Access to Air-Conditioning May Affect Wildfire-Related Health Outcomes.” The L.A. fires are yet another test case for extreme events augmented by a warming climate. The importance of thoughtful, science-based policy has never been more relevant for the health of both our planet and ourselves.

—Caryl-Sue Micalizio, Editor in Chief

Citation: Micalizio, C.-S. (2025), Fallout from the fires, Eos, 106, https://doi.org/10.1029/2025EO250311. Published on 26 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

When Disaster Science Strikes Close to Home

Tue, 08/26/2025 - 13:48
From Devastation to Data

Over 24 days in January, the Eaton and Palisades fires burned nearly 38,000 acres of Los Angeles County. Whole neighborhoods were destroyed, 29 people died, and thousands were displaced. The conditions that led to the fires were estimated to be 35% more likely because of climate change, and damage to public and private infrastructure made the blazes among the costliest wildfire disasters in U.S. history.

In the wake of the fires, multiple local, state, and federal disaster response agencies mobilized to contain the flames, document dangers, and communicate those findings to the public. Agencies’ emergency response playbooks are tried and tested and often require interagency cooperation.

Within this massive, coordinated effort in postfire monitoring and response, where have non-agency scientists with relevant skills and a desire to help fit in?

This is a question Michael Lamb has wrestled with this year. Lamb is a geomorphologist at the California Institute of Technology (Caltech) in Pasadena who was evacuated from his home and left without power for several days when the Eaton Fire tore through Altadena.

Lamb, who researched debris flow patterns after the 2009 Station Fire in the Angeles National Forest, wondered how to apply his knowledge to help with this latest disaster, and whether he should. He worried that members of his lab group, by inserting themselves into the disaster response apparatus, might inadvertently confuse official communications or make it harder for first responders to do their jobs.

“We don’t want to take time away [from agency scientists], especially when they’re in the middle of the emergency management part of work,” Lamb said.

Lamb wasn’t alone in his concern, or in his desire to help. The areas of Los Angeles County affected by the Palisades and Eaton fires are home to a high concentration of scientists who work or study at the area’s many scientific institutions. Some of them study fires and fire impacts and realized they could help, while many outside that niche were surprised to find that their work might have new, immediate applications close to home.

Scientists Spot Need

When the Palisades Fire was still burning in early January, Adit Ghosh watched the coverage at home on television. Ghosh, an Earth science graduate student at the University of Southern California (USC) in Los Angeles, had helped evacuate his in-laws and some of his friends from at-risk areas and couldn’t go in to work because campus was closed.

“They were showing the fires nonstop,” Ghosh recalled of news reports. In one broadcast, the camera zoomed in on a house in Mandeville Canyon near Topanga State Park. “I saw it on TV catching fire and then burning to the ground.”


By the third week in January, Ghosh was back in his geochemistry class. His adviser, who works closely with the professor teaching the course, suggested a way for Ghosh and his fellow graduate students to contribute to the ongoing efforts to understand contamination in water runoff. They were eager to help in whatever ways they could.

That weekend, Ghosh went out with a team of other USC students to collect water runoff in burned areas. They hoped to analyze the samples for chemicals that might prove harmful to human and environmental health. A helpful resident showed the team around the burned area, pointing out places they might collect samples from.

A home burned by the Palisades Fire. Credit: Adit Ghosh

“He took us to this house,” Ghosh said. “Then it clicked. This is the house that I saw burning on TV.”

The postfire landscape and environmental conditions can change rapidly. Many scientists felt a sense of urgency to collect samples of ash, dust, soil, and water, as well as to study sediment and debris built up along the mountainside, because much of these data are considered “perishable,” Lamb explained.

In an area burned by the Palisades Fire, a University of Southern California student collects water runoff from a drainpipe. Credit: Adit Ghosh

Lamb’s team rushed to obtain flight permits and conduct drone flyovers of debris channels along the San Gabriel Mountains above Altadena. Knowing that weather reports anticipated rain soon after the fires, debris flow researchers wanted to obtain postfire, prerain lidar scans of the channels’ topographies to better understand how debris accumulates and what conditions can trigger dangerous flows.

If measurements weren’t taken quickly enough, information about immediate postfire impacts could be washed away. They shared their results with disaster response agencies and affected communities.

Serendipitous Science

In the wake of such a disaster, doing something, anything, to help others can be a powerful tool of healing and recovery.

“As soon as they were safe, people really wanted to contribute,” said Kimberley Miner, a climate scientist and representative of NASA’s Disasters Response Coordination System (DRCS) at the Jet Propulsion Laboratory (JPL) in Pasadena, Calif. NASA and JPL coordinated to fly the Airborne Visible/Infrared Imaging Spectrometer 3 (AVIRIS-3) instrument to survey damage immediately after the fire.

The first AVIRIS-3 flight on 11 January was serendipitous, explained Robert Green, principal investigator for AVIRIS-3 at JPL. The instrument had already been installed on a plane and approved to fly over a completely different area. The team was able to divert the flight path to cover the Eaton burn area instead.


“There were folks working out of hotel rooms while they were evacuated,” Miner said. On some science teams, only one or two people had not been displaced.

The AVIRIS team has been on the scene after some of the most infamous disasters in modern U.S. history. The team flew an earlier version of the instrument over the ruins of the World Trade Center after the terrorist attack on 11 September 2001 to look for asbestos and residual hot spots burning under the rubble. After the 2010 Deepwater Horizon disaster, data from an earlier AVIRIS instrument yielded the first estimates of how much oil had been released into the Gulf of Mexico.

Even in the context of those disasters, Green said that flying over the L.A. burn scars was “heartbreaking and poignant.”

“It’s especially poignant, I would say, because it is a local disaster,” Green said. “But for 9/11, the Gulf oil spill, or wherever we contribute, our team is committed to offer information via this unique spectroscopy to be helpful.”

The first AVIRIS-3 flyover provided some of the first aerial data assessing the scope of the fires. NASA’s DRCS provided those data to federal and state disaster response teams, and those data helped justify and expedite approval for subsequent flyovers.

Getting Involved but Not Being in the Way

As official emergency responders worked to contain the fires and rapidly document the damage, collecting samples from the air, ground, rivers, or ocean outside of those efforts presented logistical quandaries.

The USC team that Ghosh worked with to collect water runoff samples had been organized within his department and went out of its own volition. But getting to sample sites was a challenge.

“We’re trying to focus on whatever we can get our hands on, essentially, because access is really hard,” he said earlier this year. In some burned areas where runoff sampling would have yielded important science results, for example, the National Guard had restricted access to prevent looting.

“Even in sites that are open, the residents still didn’t really want us hanging around over there. And understandably, because their house almost burnt down,” Ghosh said. When members of his team encountered resistance from residents, he said, they respectfully moved to another location.


Lamb said that his research group considered a broad range of science that they might contribute before contacting government agencies operating in the area. “We reached out via email to people…leading debris flow hazard teams and just said, ‘We are interested in helping. These are some of the capabilities we have. We also don’t want to get in the way. Please let us know if this can be of help.’”

Lamb’s team was told it could help by monitoring the accumulation of sediment and debris in ravines on the slopes of the San Gabriel Mountains, and they gained approval to fly drones over certain landslide-prone areas. Those aerial lidar measurements will be helpful in assessing the ongoing risk of debris flows and landslides and also in monitoring for future hazards.

“Emergency managers and the federal agencies are mostly tasked with trying to deal with the immediate situation,” Lamb said. “Whereas something that we can try to help with more as research scientists is to think about real forward-looking measurements.”

Their lidar flights focused on areas of burned mountainside rather than on urban areas. “It’s sad to say, but in some of the areas that were really devastated by the fires, there aren’t homes there [anymore] to be damaged by the debris flows,” Lamb said.

Working with Their Communities

The public messaging that agencies provide is critical for residents to find out about the immediate risks they face, but non-agency scientists also have found ways to engage these communities more deeply in the scientific discoveries that are helping them stay safe.

As crews started containing the fires, scientists at the Natural History Museum of Los Angeles County (NHMLA) recognized the need to collect and analyze samples of the ash, not only for the immediate emergency response but also to curate a catalog that scientists could use for longer-term and future studies. Because they have a small staff, the museum’s team solicited community members for ash samples rather than going in to the field themselves.


“We didn’t want to reach out right away, because that would appear as insensitive and not really caring about the people but rather more caring about the science,” said Aaron Celestian, curator of mineral sciences at NHMLA. But once it started raining, they couldn’t wait any longer.

The museum’s community science team approached their existing community partners about collecting ash and found that people were already doing it themselves. The team pivoted, instead showing people how to collect ash without risking personal health or contaminating the samples.

“We didn’t want anybody to do anything that would have any kind of health effects on them long term,” Celestian said. “We had to develop a protocol that could be understood by the community at large, and so that we get the best kind of science out of it in the end.”

Celestian analyzed his first sample on 27 January, measuring the chemical composition of forest ash. He plans to compare the results with those from urban ash.

Natural History Museum of Los Angeles County mineral sciences curator Aaron Celestian prepares one of the collected ash samples for total element analysis to reveal its chemical composition. The whole process takes about 2 hours. Credit: Aaron Celestian/Natural History Museum of Los Angeles County

Then came the question of how to communicate the results. Celestian and the museum’s communications team came up with a two-pronged approach. First and foremost, they consulted with the community member who sent in the ash sample. “They get to decide on how they want their samples to be treated and communicated with everybody else,” Celestian said.

With a resident’s permission, the ash sample was entered into a museum collection for other scientists to check out and analyze. They received 11 samples for the collection.

“Even though I’m collecting the data, it really is their property,” Celestian said. “That’s a big part of making them feel comfortable, making them feel confident in the results.”

“They just lost their homes,” he emphasized. “They want to be treated with respect,” he said, adding that the samples “are really like a family member’s ashes.”

At the same time, Celestian recognized the importance of transparency and that timely information can not only protect people but also help them feel confident in their safety. He began live-streaming his analysis on social media and his blog using anonymized samples.

“People want to know,” Celestian said.

Lamb’s group took a similar approach. They shared their lidar data directly with emergency response managers so they could be incorporated into official responses. They also communicated directly with the public. Lamb had been scheduled to give a public science talk in late January, and he decided to center it on the science of postfire debris flows.

“I was going to talk about something completely different, and I changed the topic last minute because of this very heightened community interest in understanding what’s happening in the mountains,” Lamb said. Nearly 900 people showed up to listen.

Strong, and Mixed, Emotions

Having a way to help after a disaster—whether through distributing supplies or figuring out whether playground soil has elevated lead levels—can aid community recovery and empower personal healing. In some, it can also evoke a sense of duty.

“I think we have a responsibility to use our skill sets to help the greater Los Angeles area where we live,” Ghosh said. Logically, he knew that sampling water runoff and analyzing it for harmful chemicals is an important part of postfire recovery. But sometimes, it didn’t feel like enough.

“You go up there and you’re collecting water, and people have almost lost their homes,” Ghosh said. “It feels like, ‘Why the hell are you collecting water?’ It may not seem in the moment as important a thing to do. I definitely felt that.”


Some residents questioned what the sampling team was doing and whether they were focusing on the right problems. “But we also had neighbors who were like, ‘Thank you so much for doing this, coming out and helping us understand whether we can drink our water, or whether it’s safe to be out,’” Ghosh said. “In fact, some people even let us in to their house, and [we] collected tap water from their house.” Ghosh and his colleagues shared the results of those in-home water tests directly with the homeowners when they got them.

“It’s a lot of mixed emotions,” he added.

Studying the risks from these fires “does feel more personal” for local scientists, Lamb said. “We know people that live in those areas. There’s faculty from Caltech and graduate students that live there, and postdocs and friends. It’s very close to where we live and work. It certainly adds more motivation to try to do anything that we can to help.”

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

Citation: Cartier, K. M. S. (2025), When disaster science strikes close to home, Eos, 106, https://doi.org/10.1029/2025EO250315. Published on 26 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Where There’s Fire, There’s Smoke

Tue, 08/26/2025 - 13:41
From Devastation to Data

Gale Sinatra and her husband fled their Altadena, Calif., home on 7 January with little more than overnight bags, taking just one of their two cars.

“We thought we were going to be gone overnight,” Sinatra said. “We thought they’d get the fire under control and we’d get back in.”

When the couple did return, weeks later, it was to dig through the rubble of their former home, burned to the ground by the Eaton Fire.

Though they escaped with their lives, health hazards were not behind Sinatra, her husband (who chose not to be named for this story), and others from their neighborhood. The Eaton and nearby Palisades fires filled the Los Angeles Basin with a toxic haze for days, and cleanup efforts threatened to loft charred particles long after the fires were out.

Teams of scientists from across the country, along with community members, monitored air quality in the weeks following the fire, seeking to learn more about respiratory health risks and inform community members about how to protect themselves.

Urban Fires Versus Wildfires

Inhaling smoke from any fire can be harmful. Smoke contains hazardous components, including volatile organic compounds (VOCs), emitted by burning vegetation and products such as paint and cleaning supplies; and particulate matter, such as dust and soot.

About 90% of the particulate matter (PM) in wildfire smoke is PM2.5, or particles smaller than 2.5 micrometers in diameter—small enough to enter the bloodstream and deep areas of the lungs.

These instruments are used by Michael Kleeman to monitor air quality from the back of a car in Victory Park in Altadena, as far north as he can go in the area without entering the evacuation zone. Credit: Michael Kleeman

Urban wildfires present their own dangers, because they burn not just through trees and other vegetation but through homes and infrastructure as well.

When Sinatra returned to her former home, she was struck by everything the fire had burned, from her jewelry to her car. “I just found it very eerie standing in my kitchen, going, ‘Where’s my refrigerator?’” she said. “How do you melt an entire refrigerator?”

In January 2025, the Palisades and Eaton fires ravaged more than 150 square kilometers across cities and wildlands in Los Angeles County. Even as they were personally affected, LA-area scientists worked diligently to understand how fires at the urban-wildland interface create unique hazards via air, land, and water.

In the future, hot and dry conditions enhanced by climate change will continue to raise the risks of fires like these. The work of these scientists can provide a blueprint for rapid hazard assessment, health risk mitigation, and urban planning in other fire-prone communities.


“From mattresses to carpets to paint to electronics, everything like that burns,” said Roya Bahreini, an environmental scientist at the University of California, Riverside (UCR). Bahreini is also co–principal investigator of the Atmospheric Science and Chemistry Measurement Network (ASCENT), a long-term air quality monitoring project led by the Georgia Institute of Technology, UCR, and the University of California, Davis (UC Davis).

ASCENT, which launched in 2021, has stations across the country, including three in Southern California. During the January fires in Los Angeles, which tore through not only Altadena (an unincorporated inland community) but also neighborhoods along the coast, these stations detected lead, chlorine, and bromine at levels orders of magnitude higher than usual.

Older houses sometimes have lead paint, asbestos ceilings, or wooden decks and fences treated with preservatives containing arsenic. PVC piping contains chlorine. And flame retardant often contains brominated organic compounds. In these forms, such materials don’t necessarily pose a high risk to human health. But when they are burned and released to the air, they can be dangerous.

Smoke plumes from the Palisades Fire (left) and the Eaton Fire are seen from space on 9 January. Credit: ESA, contains modified Copernicus Sentinel data, CC BY-SA 3.0 IGO

Michael Kleeman, a civil and environmental engineer at UC Davis, explained that the short-term mortality associated with high PM2.5 events such as wildfires often comes in the form of a heart attack. But inhaling urban wildfire smoke or the particles kicked up from dust and ash during remediation efforts can present risks that aren’t immediately apparent. “It’s not a heart attack a day or three after the exposure. It’s, like, a cancer risk way down [the road],” Kleeman said. “The long-term exposure [risk] can be insidious.”

Air Quality Maps

Southern California is no stranger to wildfires. (Neither is Sinatra, who has evacuated several times during her 15 years in Altadena.) Frequent droughts in the Los Angeles Basin result in large swaths of parched vegetation. The infamous Santa Ana winds, which blow into the basin from the east and northeast, can cause fires to quickly grow out of control, as was the case in the Palisades and Eaton blazes.

Real-time air quality maps, such as those hosted by the South Coast Air Quality Management District (AQMD) and U.S. EPA, pull from several sources to provide data year-round. More detailed data come from sophisticated instruments set up by the agencies themselves; South Coast AQMD hosts 32 permanent air monitoring stations throughout Los Angeles, Orange, Riverside, and San Bernardino counties.

The Air Quality Management District has permanent installations for monitoring air quality, but in the wake of the January 2025 Los Angeles wildfires, it launched supplemental efforts, gathering real-time air quality data from mobile monitoring vans. Credit: South Coast AQMD

Less detailed but more widespread data on particulate matter come from networks of off-the-shelf air quality measuring tools, such as PurpleAir monitors and Clarity Sensors, that are set up by residents or community organizations.

“It turns out that the areas that the fires were in had [a] really, really dense network of these low-cost sensors,” said Scott Epstein, planning and rules manager at South Coast AQMD. “When you combine that with our regulatory network, we had very good coverage of fine particle pollution.”

This density meant researchers could watch the Eaton and Palisades wildfire plumes as they traveled toward the coast.

An existing AQMD station in Compton, about 23 miles (37 kilometers) south of the Eaton Fire, showed highly elevated levels of toxic metals, including arsenic and lead, between 7 and 11 January as the plume passed over the area. These levels returned to normal within a few days. ASCENT instruments in Pico Rivera, about 14 miles (23 kilometers) south of the Eaton Fire, recorded a 110-fold increase in lead levels from 8 to 11 January.

Permanent air quality measuring stations like these offer one source of public information that residents like Sinatra could consult to make decisions about when to stay indoors or return to a burned area. But when the Palisades and Eaton fires broke out, researchers from AQMD and other institutions set out to supplement these efforts with more granular monitoring.

Mobilizing Quickly

Melissa Bumstead (left) and Jeni Knack volunteered to gather air and ash samples in the wake of the Eaton and Palisades fires. Credit: Shelly Magier

In January, researchers from Harvard University; the University of California, Los Angeles (UCLA); the University of Texas at Austin; the University of Southern California (USC); and UC Davis launched the Los Angeles Fire Human Exposure and Long-Term Health Study, or LA Fire HEALTH.

While many Los Angeles residents, including Sinatra, were still under evacuation orders, LA Fire HEALTH researchers were traveling into evacuation zones.

One such researcher was Nicholas Spada, an aerosol scientist who headed down to Los Angeles from UC Davis on 14 January to set up four cascading impactors in Santa Monica (near the Palisades Fire), Pasadena (near the Eaton Fire), Hollywood, and West Hills. These briefcase-sized instruments act like coin-sorting machines, Spada said: They take an air sample, then sort particles into eight different size categories, ranging from 10 micrometers (about 1/9 the average width of a human hair) to 90 nanometers (about 1/1,000 the width of a human hair). The instruments collected eight samples every 2 hours until 10 February.
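
Conceptually, each impactor stage is a size bin. The sketch below illustrates that bookkeeping with illustrative, log-spaced cut points between the endpoints quoted above; the instrument's real stage diameters differ.

```python
# Minimal sketch of assigning particle diameters to eight size bins, the way
# a cascading impactor's stages partition a sample. The cut points are
# illustrative (log-spaced between 90 nm and 10 um), not the instrument's
# actual stage diameters, and stage 0 here is the finest bin.
import numpy as np

# Seven log-spaced interior cut points define eight bins from 0.09 to 10 um.
cuts_um = np.geomspace(0.09, 10.0, num=9)[1:-1]

def stage(diameter_um: float) -> int:
    """Return the 0-based size bin (0 = finest) for a particle diameter."""
    return int(np.searchsorted(cuts_um, diameter_um))

print(stage(0.1), stage(2.5), stage(9.0))  # finest, mid-range, coarsest bins
```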

A cascading impactor allows scientists to “associate the particle size profiles with time,” Spada said. The instrument “picks up the changes in the smoke plumes as the fire progresses from active to smoldering to being put out, and then to the mitigation effects.”

The measurements showed that not only were toxic elements such as lead and arsenic present in the air throughout the sampling period but also a high proportion of their mass—about 25%—was in the form of ultrafine particles (on the order of nanometers). Such particles aren’t filtered by N95 masks and can penetrate deep into the body when inhaled, Spada explained.

A team of University of Texas researchers arrived in a van that doubled as a mobile laboratory on 2 February, at which point the fires were out but dust-disturbing remediation efforts had begun. They found that outdoor air quality in the weeks after the fires was back to prefire levels and within EPA guidelines. Indoor samples—especially from homes within the burn zones—showed higher levels of VOCs compared with the outdoor samples.

Neighbors Lend a Hand


Southern California community members got in on the efforts to monitor air quality, too. Melissa Bumstead and Jeni Knack, codirectors of Parents Against Santa Susana Field Laboratory, worked with researchers to create and distribute flyers about appropriate measures regarding personal protective equipment, as well as a self-sampling protocol for residents who wanted to gather ash samples from their own properties.

About twice a week from 14 January to 19 February, they gathered air and ash samples in Pasadena, Altadena, Santa Monica, Topanga, and Pacific Palisades, then sent them to laboratories, including Spada’s, for testing. Arsenic in all of the ash samples and lead in about a third of them exceeded EPA regional screening levels. Spada noted in communications to residents that these screening levels are based on what’s safe for ingestion by a child and are relatively conservative.

“This is going to help people in the next iteration of fires to know what to do,” Bumstead recalled telling residents in sampling areas.

After the Ashes

Sinatra lost her Altadena home in the January 2025 Eaton Fire. When she returned to dig through the rubble, she drove past “chimney after chimney after chimney with no house attached.” Credit: Gale Sinatra

The next fire, Sinatra said, is something that weighs on her as she and her neighbors consider the prospect of rebuilding.

When rain finally arrived in Southern California on 26 January, it helped extinguish the fires and tame the dust disturbed by remediation efforts, reducing the risk that people would inhale toxins.

Still, those toxins were also present in the soil and water. When Sinatra and her husband returned to the charred site of their home, they took every precaution they’d heard about from the news, the EPA, community leaders, and neighbors: They wore respirators, hazmat suits, goggles, and two pairs of gloves each to protect themselves.

Concerns about potential long-term consequences of the air they had already breathed, as well as the soil beneath them, linger as they wait for more data.

“Everyone feels there’s a significant chance of a future fire,” Sinatra said. She and her neighbors are “wondering about whether it would be safe to live up there, [in] regards to the soil quality and the air quality, and whether it’s going to happen again.”

—Emily Dieckman (@emfurd.bsky.social), Associate Editor

Citation: Dieckman, E. (2025), Where there’s fire, there’s smoke, Eos, 106, https://doi.org/10.1029/2025EO250308. Published on 26 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Burning Urban and Wild Land Alike

Tue, 08/26/2025 - 13:40
From Devastation to Data

The 2025 Palisades and Eaton Fires torched scrub-lined slopes of the San Gabriel and Santa Monica Mountains, as well as buildings, streets, cars, and infrastructure.

Before the flames were even contained, scientists from throughout the Los Angeles (LA) metro area turned to the skies, their labs, and their communities to study the extent of the damage. Their work is informing residents and other researchers about how hazards from the fires have shifted in the weeks and months after containment, and how they could change in the years to come.

Scientists are learning how the fires, which burned along the urban-wildland interface, were distinct from strictly urban or rural fires in terms of chemistry, topographic changes, and follow-on hazards.

During Containment

While firefighters were still battling flames on the ground, an airborne surveyor team took stock of the damage, mapping the scope and extent of the fires and assessing which areas needed immediate remediation.

On 11 January 2025, NASA’s Airborne Visible/Infrared Imaging Spectrometer 3 (AVIRIS-3) flew over Los Angeles County mounted on a B200 research plane. The instrument provided some of the first aerial data assessing the scope of the Palisades Fire while it was still smoldering. It flew over the Eaton Fire soon thereafter.

“We immediately looked for methane or natural gas,” said Robert Green, principal investigator for AVIRIS-3 at NASA’s Jet Propulsion Laboratory in Pasadena. Natural gas leaks can pose health and safety risks to first responders. The instrument did not detect any abnormalities.

“But then we realized we had this dataset and could do advanced spectroscopic mapping of the burn severity and the burn products—the char and the ash,” Green said.

One of AVIRIS-3’s quick-turnaround data products is this aerial map of the fraction of burn material, char, and ash within the spectrum of each pixel. Values from 0 (no burn, dark) to 1 (entirely burned, red) describe the burn fraction of the exposed surfaces. Nearly all structures have moderate to severe surface burning (yellow), and few pockets of vegetation (green) survive. Credit: AVIRIS-3/NASA/JPL, via Rob Green

AVIRIS-3 data showed that the Eaton Fire burned more wildland area than urban area, largely because fire managers focused prevention efforts on areas in which people were endangered, Green explained. Initial maps of burn severity, which were provided to first responders, showed that within the burn area, nearly all structures suffered moderate to severe surface-level burns. Very few patches of urban vegetation survived.


The instrument completed subsequent surveys of the Eaton and Palisades burn scars in late May.

“We’ve never had an urban fire like this to collect datasets,” Green said. “There’s a whole bunch of new spectroscopic compounds established by burning these [urban] materials, which are different than those you would find in a natural environment.” These surveys will help scientists track combustion products and how long they persist in the soil.

Lead Laden

After the fires were contained, many local scientists mobilized with the tools and people they had at hand—themselves, their students, and even their children—to collect ash, dust, and soil samples. They and fellow residents wanted to know what chemicals enter the environment when wildland-urban fires burn, how bioavailable they are, and how long they present a danger.


Many scientists homed in on lead contamination. Of the more than 7,000 homes and structures that burned, most were built before 1975, when lead-based paint was commonly used. Lead that enters the body is easily absorbed into blood and can cause significant neurological harm, especially in children. It is particularly dangerous when inhaled or ingested as fine dust, and fire ash and dust are key exposure routes. Residents wondered when it would be safe for children and pets to play outside and whether it was safe to eat vegetables from their gardens.

Time was of the essence to collect samples, explained Joshua West of the University of Southern California (USC) in LA, not only because of the potential health impact but also because imminent wind and rain threatened to redistribute or wash away the fires’ by-products.

“We were scrambling to get ready for the rain,” West said. “A lot of these data, from a scientific point of view, are perishable.”

USC researcher Seth John used a handheld X-ray fluorescence machine to test the lead concentration in roadside dust near the Eaton Fire burn scar in early February. Credit: Cecilia John

A team of USC scientists, including West and Earth scientist Seth John, tested dust and ash along the edges of the Altadena burn scar in late January. Over several weekends, the scientists used handheld instruments to measure the lead concentrations in dust that had accumulated on streets and playgrounds. They also collected samples for further analysis in the lab.

EPA’s thresholds for levels of lead in residential soil and playgrounds are 200 parts per million generally and 100 parts per million for sites with multiple sources of exposure. The researchers found that a few roadside spots exceeded those thresholds but no playground samples did.

The team also found a strong correlation between lead levels and proximity to burned structures but not to wildland.

“It seems very clear that there is higher lead in the areas with destroyed structures,” John said.


A team from the California Institute of Technology (Caltech) in Pasadena similarly found that ash closest to burned structures contained higher concentrations of lead. The Caltech analysis could differentiate wildland and urban ash and found that wildland ash had low concentrations of lead.

Researchers at the University of Southern California tested lead concentrations in dust samples from streets (circles) and playgrounds (triangles) along the Eaton Fire burn scar. They found higher concentrations of lead (darker blue) in dust near burned and damaged structures (black dots). Credit: map: Seth John, Josh West, and Sam Silva; data collection: Mia Bradshaw, Katherine Thomas, and Cecilia John

“No amount of lead is safe,” John cautioned. “That said, these levels are elevated, but I wouldn’t say they’re elevated to extremely toxic levels.” The researchers found similar trends in dust near the Palisades burn scar.

The USC team returned to the same locations every few weeks to collect more roadside dust samples. As of June, lead concentrations were slowly going down but were still elevated in most of the sample locations in Altadena, John said.

Another Caltech team measured lead concentrations in dust—a fine mix of ash particles, soot, and aerosols—that accumulated on indoor and outdoor surfaces. Many dust samples had lead concentrations exceeding EPA limits. Although simple cleaning with water was often enough to remove that lead, cleanup efforts can disturb new dust and prolong exposure risk.

A survey of garden soil samples found that about 35% had lead concentrations exceeding California’s recommended lead limit, but only about 7% exceeded the EPA’s limit (which is higher). They also found that soil lead levels could vary significantly within a single residential yard and that finer soil particles (smaller than 250 micrometers) had higher lead levels than a mixed soil sample.

The Risks of Rain

As cleanup efforts began in earnest and residents started returning, they were aware of the lingering hazard of debris flows and landslides from the charred slopes of the San Gabriel and Santa Monica Mountains.

The mountains are steep, and gravity slowly moves soil, rocks, and sand downhill, explained Emily Geyman, a graduate student researching climate and surface processes at Caltech. Low brush and scrappy mountainside shrubs typically interrupt that flow and accumulate that debris before it reaches foothill neighborhoods.

“Once you incinerate that [vegetation], those sand grains that are perched at this angle that should be unstable come cascading down,” Geyman said.

What’s more, fires can mobilize contaminants, change soil chemistry so that the ground repels, rather than absorbs, water, and loosen soil so that it is more likely to fail downslope, explained West, who also studied the potential for debris flows in the San Gabriels. Gravity funnels sand and debris into natural channels where it builds up, compounding runoff and erosion risks.


Studies of past wildfires in the San Gabriels have shown that between 20 and 50 years’ worth of soil erosion happens during the first 2 years following a wildfire. Sudden erosion can damage infrastructure, fill debris basins, and strip ecosystems of critical soil nutrients and structure.

Between 25 January and 24 March, Geyman and other Caltech researchers conducted 10 uncrewed aerial vehicle (UAV) lidar surveys above the Eaton Fire burn scar. They surveyed mountain catchments and debris basins—artificially dug repositories to contain flows—before and after every major rain since the fires.

“These are probably some of the first fires in which we have the post-fire, pre-rain topography at really high resolution,” West said. “Being able to capture that time sequence going forward is one of our big goals.”

A significant rainfall on 13 February triggered debris flows in almost every mountain catchment in the Eaton Fire burn area, Geyman said. Using pre-rain and post-rain lidar scans, the researchers estimated that 680,000 cubic meters of material cascaded downhill, “equivalent in volume to about 270 Olympic-sized swimming pools,” Geyman added.
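
Estimates like this come from differencing co-registered pre- and post-rain elevation grids and summing the loss over the surveyed area. Below is a minimal sketch of that calculation, with synthetic grids standing in for the team's lidar data.

```python
# Minimal sketch of a DEM-differencing volume estimate (illustrative only,
# not the Caltech team's code). Assumes two co-registered elevation grids
# with the same cell size, in meters.
import numpy as np

def eroded_volume(pre_dem, post_dem, cell_size_m):
    """Estimate eroded volume (m^3) from pre- and post-rain lidar DEMs."""
    dz = post_dem - pre_dem                  # elevation change per cell (m)
    loss = np.where(dz < 0, -dz, 0.0)        # keep only elevation loss
    return float(loss.sum() * cell_size_m**2)

# Example with synthetic 1-m grids: a 100 x 100 m patch losing 0.5 m
# everywhere yields 5,000 cubic meters of eroded material.
pre = np.zeros((100, 100))
post = pre - 0.5
print(eroded_volume(pre, post, cell_size_m=1.0))  # -> 5000.0
```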

Debris basins captured most of the flow before it affected residential neighborhoods. “The debris basin infrastructure largely did its job,” Geyman said. Debris flows have not caused any additional loss of life.

Similar UAV lidar work by geomorphologist Seulgi Moon and her colleagues at the University of California, Los Angeles (UCLA), resulted in digital elevation maps of debris channels in Topanga Canyon, an area in the Santa Monica Mountains affected by the Palisades Fire, before and after the February rains. They continue to assess debris flow hazards in the canyon.

On 19 February, a debris flow in Topanga Canyon blocks roads west of the Palisades burn area. Credit: Seulgi Moon

The Paths Ahead

UAV surveys continue, and scientists are monitoring debris channels to assess the ongoing risk of landslides and debris flows.

“We plan to continue the surveys before and after each major rain event [through] winter 2026 or until…the loose sediment released from the hillslopes during the fire is cleared out,” Geyman said.

Moon and her colleagues plan to install a geophone and five debris flow monitoring stations in the canyon to monitor ongoing hazards in the area for 2 years. Any ground motion, whether from a skittering animal or an imminent debris flow, will create vibrations. A geophone converts them into a measurable voltage.
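
A standard way to turn that voltage stream into event detections is a short-term/long-term average (STA/LTA) trigger. The sketch below illustrates the idea with assumed window lengths and threshold; it is not the monitoring stations' actual configuration.

```python
# Minimal sketch of an STA/LTA event detector, a standard trigger for
# flagging transient ground motion in a geophone voltage stream. The window
# lengths and threshold are assumptions, chosen only for illustration.
import numpy as np

def sta_lta_trigger(signal, fs, sta_s=1.0, lta_s=30.0, threshold=4.0):
    """Return sample indices where the STA/LTA ratio exceeds the threshold."""
    energy = signal.astype(float) ** 2
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    # Short window tracks bursts; long window tracks background noise level.
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same") + 1e-12
    return np.flatnonzero(sta / lta > threshold)
```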

“We want to make the connection between how much rain is coming and how much sediment will come downstream,” she said.

Data from both groups will help scientists understand which debris flow mechanisms are most likely, track the volume of material at which flows become imminent, and help inform hazard maps to aid emergency response.

After the initial rush of dust, ash, and soil collection, many research groups shifted to community-led sampling efforts. John and West, for example, set up a free community sampling program for lead in soils and received more than 1,000 samples by mid-May. Some groups are reaching out to residents who want their soil tested or who want to contribute to scientific efforts. Other teams at UCLA and Loyola Marymount University created similar lead soil testing programs for communities.

The AVIRIS-3 team is working with laboratory scientists to match the aerial spectral signatures to those of burn products in ash samples. Green said that every burn compound the team catalogs will help efforts to protect first responders during future urban fires and inform future instruments that could identify when burnable material builds up in fire-prone areas.

Additional flyovers may happen when AVIRIS-3 flies to or from its home base in LA, Green said. Those data could be used to track how environmental damage such as toxic ash contamination and soil erosion changes over time.

“That might inform our understanding [of] how this urban-rural interface is changing and what the recovery looks like,” Green said.

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

Citation: Cartier, K. M. S. (2025), Burning urban and wild land alike, Eos, 106, https://doi.org/10.1029/2025EO250309. Published on 26 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Scrambling to Study Smoke on the Water

Tue, 08/26/2025 - 13:40
From Devastation to Data

As multiple fires raged through Los Angeles in January 2025, Bernadeth Tolentino had one more thing to worry about: kelp.

Tolentino, a marine biologist and graduate student at the University of Southern California, is part of a lab that runs a gene bank of kelp spores. The repository preserves genetic diversity and allows scientists to bolster struggling populations.

As the roaring fires turned homes, cars, and businesses into chemical-laden ash, Tolentino realized that runoff from postfire rains would eventually carry that ash to the sea.

In the ocean, the ash threatened to block sunlight and pollute the water surrounding one particular kelp population in Santa Monica Bay—a population not represented in the gene bank. She needed to reach the kelp before runoff damaged viable spores.

The dive team, including Tolentino, scrambled to apply for permits, gather their equipment, and coordinate dives before rainstorms carried too much toxic runoff to the site. “It was a little bit of a rush job,” she said. Accelerated permitting from the California Department of Fish and Wildlife allowed the team to reach the kelp population just in time.

They dove four times, collecting spores from southern sea palm, feather boa, and golden kombu kelp, which may be used to restore regional ecosystems in the future.

AltaSeeds Conservancy curator Michael Marty-Rivera places kelp spores into biobank storage. Credit: Taylor Griffith

Tolentino was one of many water quality and marine scientists who collected valuable, time-sensitive information after the LA fires. “It felt great to be able to apply what I know to jump on this really urgent matter,” she said.

Ash in the Water

On 8 January, a day after the Eaton Fire began, scientists from the California Cooperative Oceanic Fisheries Investigations (CalCOFI) were about 80 kilometers (50 miles) away on a routine monitoring cruise. CalCOFI has been monitoring the state’s coastal waters for more than 75 years, collecting oceanographic and ecological data.

But as the fires raged, scientists on the deck of the CalCOFI vessel—and their colleagues on land—recognized the unique opportunity posed by the disastrous event. Here was a chance to collect real-time data on the environmental impacts of an urban fire without needing to plan and launch a separate expedition. They pivoted, used a planned crew exchange to gather more equipment, and increased their sampling.

Even kilometers off the Pacific coast, those on board a CalCOFI (California Cooperative Oceanic Fisheries Investigations) monitoring cruise observed ash in the air on 9 January. Credit: Rasmus Swalethorp/Scripps Institution of Oceanography, University of California, San Diego

The event provided “a perfect opportunity to study the ocean impact of this very devastating urban fire,” said Julie Dinasquet, a marine microbiologist at the Scripps Institution of Oceanography, University of California, San Diego, who works closely with the CalCOFI team. Dinasquet was not on board the monitoring cruise but helped to coordinate the work during the fires.

Those on board were shocked at the effect the fires were having on the ocean, Dinasquet said. She and her colleagues had to wear masks and goggles when the smoke became too potent.

On board the CalCOFI cruise on 9 January, researchers from NOAA Fisheries’ Southwest Fisheries Science Center hold up a plankton net full of ash and debris collected from the ocean surface. Credit: Rasmus Swalethorp/Scripps Institution of Oceanography, University of California, San Diego

Some particles that landed on the water were big enough to see with the naked eye. The largest chunks the group measured were 5 centimeters wide—quite unusual given how far from the fires the samples were taken, said Douglas Hamilton, an Earth systems scientist at North Carolina State University who collaborates with the CalCOFI team.

Closer to shore, the onboard team pulled up a plankton net that was full of black ash. And water samples, typically filled with plankton, were filled instead with soot and debris.

Hamilton thinks that particles traveling from the primarily urban fires to the ocean contained more toxic material than those from blazes burning primarily biological fuels (such as brush).

Scientists know that falling ash and runoff from wildfires that burn mostly vegetation add nutrients to the ocean, sometimes spurring primary production and altering ocean biogeochemistry. The samples from the CalCOFI cruise will shed light on how urban fires can also affect ocean biogeochemistry—a rather new field, Hamilton said.

“This is really the first time this has been able to be observed.”

Ash from urban fires contains very different chemicals than ash from burned areas of less developed land, and therefore might have very different effects on the ocean. “We’re adding this extra layer of complexity,” Hamilton said. “How does this urban wildfire change the narrative of the way that we’ve been thinking about how wildfires might be impacting ocean ecosystems? This is really the first time this has been able to be observed.”

In April, a very large bloom of the toxic phytoplankton Pseudo-nitzschia unfolded along the California coast, killing sea lions and dolphins. Though the event is not unusual under La Niña conditions, scientists are questioning whether material that entered the ocean from the fires may have contributed to the bloom.

Dinasquet plans to analyze the composition of the ash and water samples collected on board the CalCOFI cruise and compare them with ash transported from fires in less developed areas. Hamilton will create models that could be used to project how future urban fires may affect the ocean.

The January fires were the first large coastal urban fires to have affected the ocean at such a scale. “This is an absolutely tragic event, but it’s the first of its kind,” Hamilton said. “So we need to learn from that.”

Rushing for Runoff

Fires’ effects on marine ecosystems don’t come from just the air: As rainwater percolates through burned neighborhoods, runoff carries pollutants through the watershed and into the ocean, too.

When rainstorms hit after the fires, Adit Ghosh was ready. More rainfall meant more runoff and a chance to collect samples that might shed light on how toxic metals such as lead, arsenic, iron, and vanadium, as well as organic pollutants, move through the watershed.

Rain events after the January Los Angeles wildfires caused debris flows, like this one in Mandeville Canyon on 13 February 2025. Credit: Rain Blankenship

Ghosh, a geobiologist at the University of Southern California, and his colleagues monitored the forecast in the weeks after the fires. No matter the hour, when they expected rain, they headed out to four field sites to collect runoff.

The team especially wanted to sample runoff from the first storms to hit the area after the fires, as that water, they hypothesized, would contain the highest amounts of pollutants. During the first storm, which occurred in late January, the team stayed out until nearly midnight, filling bottles and vials with runoff. Since then, Ghosh has ventured out into rainstorms more than 2 dozen times, trying to capture runoff each time there’s enough flow to sample.

At one point during the sampling, a University of California, Los Angeles professor led the research team around a burned neighborhood in Mandeville Canyon. Ghosh did a double take at the remains of one of the houses—he’d seen it burn down, live, on television.

“You see it on TV, but when you walk up to it, you see the devastation,” he said. “It really hit home. It was really sad.”

Ghosh said he feels that as a member of the Los Angeles community, he has a responsibility to use his skill set to help area residents understand how the fires may be affecting their water.

“It’s important that we do this work, that we try to find these things out quickly and let the public know what we found.”

“It’s important that we do this work, that we try to find these things out quickly and let the public know what we found,” he said.

At the time this article was written, the team had not finished analyzing all of the samples, but preliminary results show that lead and arsenic are elevated in runoff from burned urban areas, though not above EPA limits. Lead and arsenic may have been elevated in the area already, as they are regularly derived from several natural and urban sources.

Ghosh and his team want to collect streamflow samples from the unburned watersheds during next year’s winter storms to see how the contamination they’ve found compares with background levels.

Ghosh hopes that a full analysis of the data will help scientists and the public understand how chemicals in runoff differ between burned and unburned, and urban and less developed, areas. Ghosh and his collaborators also plan to create time series analyses for each of the pollutants they sampled to show how concentrations of pollutants in runoff change over time after a fire and over the course of multiple rainstorms.

“If there’s another fire somewhere, we can have a better understanding of how those [water quality] risks are going to linger after the event,” he said.

Spotting Plumes from Above

One public agency is helping coordinate much of the aquatic research. After the fires, the Southern California Coastal Water Research Project started to keep track of what samples scientists were collecting to reduce redundancies and help everyone involved know what data are available.

“We’re going to learn some great things from this.”

“We’re going to learn some great things from this,” said Michelle Gierach, an oceanographer at NASA’s Jet Propulsion Laboratory (JPL) working on ocean monitoring projects.

Gierach has been struck by that high level of collaboration in the postfire research. As just one example, she’s been in conversation with the CalCOFI team to determine whether their samples of ash-laden water can help her team at JPL validate satellite observations and derive new algorithms to assess the impact of future urban fires on the ocean. Her team at JPL is also working with colleagues at the University of California, Los Angeles and the University of California, Merced to assess the impacts of the fires on aquatic ecosystems.

“If our data can support the efforts of state agencies, municipalities, NGOs [nongovernmental organizations], academic groups, and others conducting fieldwork and analyses, and ultimately enhance understanding of urban wildfire impacts on nearshore and coastal aquatic environments, then sharing [them] is not only valuable—it’s essential,” Gierach wrote in an email.

She finds a “glimmer of hope” in seeing the scientific community rallying together so rapidly and coordinating so well. “It’s inspiring that in the face of this horror, something positive can happen.”

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Kimberly M. S. Cartier also contributed to this reporting.

Citation: van Deelen, G. (2025), Scrambling to study smoke on the water, Eos, 106, https://doi.org/10.1029/2025EO250310. Published on 26 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Long-Term Strain Record of Mount Etna Captures 84 Fountaining Eruptions

Tue, 08/26/2025 - 12:43
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Journal of Geophysical Research: Solid Earth

Seismic monitoring has served as one of the most common and reliable tools for volcanic eruption forecasting. While seismic networks have many advantages over other types of monitoring, the data can be complex to interpret because they include signals from a large variety of sources.

Carleo et al. [2025] analyze 84 eruptions from the South-East Crater of Mount Etna between 2012 and 2023 using data from a 200-meter-deep borehole strainmeter. To complement the low-frequency strain signals associated with the eruptions, the authors produce a strain tremor record from the higher-frequency strain data that is consistent with signals from a nearby seismometer. This large dataset permits analysis using an automated clustering approach.

Interestingly, the classification based on the strain data comes much closer to matching the classifications made by volcanologists from eruption behavior than does the tremor-based classification. So while tremor may reflect both deep and shallow magmatic fluid processes, which can be complex and time varying, the strain data may better capture the amount of magma flowing out of the reservoir. This implies that broader implementation of real-time strain monitoring could provide important information for eruption early warning.
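As a rough illustration of the general approach (not the authors’ actual algorithm or data), the sketch below clusters synthetic per-eruption feature vectors with k-means; the two features and their values are assumptions chosen for the example.

```python
import numpy as np
from sklearn.cluster import KMeans

# Minimal sketch of automated clustering of eruption records, the kind
# of approach the highlighted study applies to 84 fountaining episodes.
# The features and synthetic values are illustrative assumptions only.

rng = np.random.default_rng(42)

# One row per eruption: [peak strain amplitude, duration (hours)].
weak = rng.normal([1.0, 2.0], [0.2, 0.5], size=(40, 2))
strong = rng.normal([5.0, 6.0], [0.5, 1.0], size=(44, 2))
features = np.vstack([weak, strong])

# Standardize so neither feature dominates the distance metric.
features = (features - features.mean(axis=0)) / features.std(axis=0)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print("cluster sizes:", np.bincount(labels))
```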

Citation: Carleo, L., Currenti, G., Bonaccorso, A., & Sicali, A. (2025). Relation between volcanic tremor and geodetic strain signals during basaltic explosive eruptions at Etna. Journal of Geophysical Research: Solid Earth, 130, e2025JB031564. https://doi.org/10.1029/2025JB031564

—Gregory P. Waite, Associate Editor, JGR: Solid Earth

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Fatal landslides in May 2025

Tue, 08/26/2025 - 07:04

In May 2025, I recorded 66 fatal landslides worldwide, resulting in 313 fatalities. The number of fatal landslides is significantly above the long-term mean.

Somewhat later than planned, owing to other workload challenges, this is my latest update on fatal landslides in 2025, covering the month of May. I hope to post the data for June in the next few days.

As always, allow me to remind you that this is a dataset on landslides that cause loss of life, following the methodology of Froude and Petley (2018). At this point, the monthly data is provisional.

In May 2025, I recorded 66 fatal landslides worldwide, resulting in 313 fatalities. This is very significantly above the 2004-2016 average number of landslides (n=28.3) and slightly above the average number of fatalities (n=308.8).

This is the histogram by month for the number of fatal landslides in 2025 through to the end of May:-

The number of fatal landslides to the end of May 2025 by month.

The higher monthly total in May than in the previous months primarily reflects the increase in rainfall that starts to occur in many Northern Hemisphere areas in May, most notably pre-monsoonal precipitation in large parts of Asia. As such, the trend is quite typical.

This is the graph of the cumulative total number of landslides, organised by pentads. This goes to pentad 30, which ends on 30 May:-

The number of fatal landslides to 30 May 2025, displayed in pentads. For comparison, the long term mean (2004 to 2016) and the exceptional year of 2024 are also shown.
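As a rough illustration of how such a pentad series can be built, here is a minimal Python sketch; the event dates are invented for the example and are not taken from the dataset.

```python
from datetime import date

# Minimal sketch of binning event dates into pentads (5-day bins,
# pentad 1 = 1-5 January). The example dates are made up.

def pentad(d: date) -> int:
    """1-based pentad index within the year (pentad 30 ends 30 May in a non-leap year)."""
    return (d.timetuple().tm_yday - 1) // 5 + 1

events = [date(2025, 1, 3), date(2025, 2, 14), date(2025, 5, 28), date(2025, 5, 30)]

counts = {}
for d in events:
    counts[pentad(d)] = counts.get(pentad(d), 0) + 1

# Cumulative total of fatal landslides through pentad 30 (ending 30 May 2025).
cumulative = sum(v for p, v in counts.items() if p <= 30)
print(pentad(date(2025, 5, 30)), cumulative)  # -> 30 4
```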

As the above graph shows, at the end of May, 2025 was trending very much above the long-term average and was remarkably similar to the exceptional year of 2024. This is quite surprising but may reflect continued high atmospheric temperatures. The EU Copernicus atmospheric temperature note says the following:

May 2025 was the second-warmest May globally, with an average ERA5 surface air temperature of 15.79°C, 0.53°C above the 1991-2020 average for May.

As a teaser for the June data, this trend of exceptional landslide occurrence did not continue into the following month.

Reference

Froude, M. J., and Petley, D. N. (2018), Global fatal landslide occurrence from 2004 to 2016, Natural Hazards and Earth System Sciences, 18, 2161–2181, https://doi.org/10.5194/nhess-18-2161-2018.

Return to The Landslide Blog homepage

Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
