EOS

Science News by AGU

Perseverance Sample Shows Possible Evidence of Ancient Martian Microbial Metabolisms

Wed, 09/10/2025 - 17:44
Research & Developments is a blog for brief updates that provide context for the flurry of news that impacts science and scientists today.

A sample collected in July 2024 by NASA’s Perseverance Mars rover may be “the closest we’ve actually come to discovering ancient life on Mars,” according to Nicky Fox, head of NASA’s Science Mission Directorate.

In a press conference today, NASA officials shared new results of an analysis of a sample named “Sapphire Canyon,” the 25th sample Perseverance collected from Mars. The analysis was published today in Nature.

Perseverance has been looking for signs of life on the Red Planet since 2021, exploring the 28-mile-wide (45-kilometer-wide) Jezero Crater that, billions of years ago, repeatedly flooded with water. The crater’s past conditions mean it could have been a suitable habitat for microbial life. 

The rover collected Sapphire Canyon from a vein of sedimentary rock in a river valley called Neretva Vallis that is situated in the crater. The rock bore a distinct, leopard-print pattern visible to the naked eye. “We hadn’t seen anything like that before on Mars,” Fox said at the press conference. 

An image taken by Perseverance shows the leopard spot pattern that scientists believe could be a signature of ancient microbial life. NASA/JPL-Caltech/MSSS

Perseverance has collected 30 samples in total from Jezero Crater. Though none of the samples have been returned to Earth, scientists have been able to study them via Perseverance’s on-board instrumentation. “We basically threw the entire rover science payload at this rock,” said Katie Stack Morgan, a Perseverance project scientist at NASA’s Jet Propulsion Laboratory and coauthor of the new study. 

“When we see features like this in sediment on Earth, these minerals are often the byproduct of microbial metabolisms that are consuming organic matter.”

Scientists analyzed the sample’s leopard spots using the rover’s Scanning Habitable Environments with Raman & Luminescence for Organics and Chemicals (SHERLOC) spectrometer and Planetary Instrument for X-ray Lithochemistry (PIXL) X-ray spectrometer. The analysis revealed the presence of organic matter in the mud that formed the rock, as well as the presence of iron-, phosphorus-, and sulfur-bearing minerals called vivianite and greigite in the leopard spots. 

The combination of minerals and the organic matter in the mud indicated the past occurrence of chemical reactions that could have been driven by ancient microorganisms, said Joel Hurowitz, a planetary scientist at Stony Brook University and lead author of the new study.

“What’s exciting about this combination of mud and organic matter that has reacted to produce these minerals and these textures, is that when we see features like this in sediment on Earth, these minerals are often the byproduct of microbial metabolisms that are consuming organic matter,” Hurowitz said. 

Hurowitz added that there are also abiotic processes that could have created the patterns detected in the rock, and that with Perseverance’s instrumentation it isn’t possible to rule out abiotic explanations. 

NASA, in partnership with the European Space Agency, has been planning to develop the Mars Sample Return mission to eventually return samples collected by Perseverance to Earth, allowing scientists to get their hands on the rocks.

But budget changes have left the future of Mars Sample Return uncertain: President Trump’s May budget proposal suggested a $6 billion cut to NASA funding, including a proposal to “terminate unaffordable missions such as the Mars Sample Return.” Congress has not yet passed final appropriations bills and could still decide to allocate funding to Mars Sample Return. 

 

Acting NASA Administrator Sean Duffy fielded multiple questions at the press conference about the future of Mars Sample Return. In his responses, he implied that human space exploration, a priority of NASA under Trump, could allow for the return of samples from Mars on a faster timescale and with less expensive methods. That notion may be inconsistent with revised cost estimates for a fully robotic sample return mission compared with those for a human exploration mission to Mars.

“If we don’t have the resources for the right missions or the right people, I will go to the President, I’ll go to the Congress, I’ll ask for more money. But I feel pretty confident that with the money that we’ve been given in the President’s budget, we can accomplish our mission,” Duffy said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about science or scientists? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

When Is a Climate Model “Good Enough”?

Wed, 09/10/2025 - 12:59
Source: Earth’s Future

Global climate models are software behemoths, often containing more than a million lines of code.

Inevitably, such complex models will contain mistakes, or “bugs.” But because model outputs are widely used to inform climate policy, it’s important that they generate trustworthy results.

Proske and Melsen set out to understand how climate modelers think about, identify, and address bugs. They interviewed 11 scientists and scientific programmers from the Max-Planck-Institut für Meteorologie who work on the ICON climate model.

When new code is developed for ICON, it’s screened and tested to catch bugs before being integrated into the model itself, the interviewees said.

After code is integrated, however, such testing usually stops. The code is assumed to be bug free until the model behaves weirdly or a programmer serendipitously discovers a bug while examining the code for other reasons. Even when the model crashes, it’s not necessarily a sign that a bug needs to be fixed because researchers are always making trade-offs between the speed and the stability of the model, and sometimes they simply push the model outside the bounds of what it can handle given those constraints.

Tracking down bugs and fixing them can be time-consuming, so even if the team suspects the presence of a bug, they sometimes estimate its impact to be minor enough that it doesn’t warrant correction. When the researchers do decide to fix a bug, many view the process as an extension of climate science: They generate hypotheses about how the bug might cause the model to behave, then test those hypotheses to discern the exact nature of the bug and how to address it.

The best way to avoid bugs is to test code thoroughly before it’s integrated into the full model, many interviewees said. Tools exist to facilitate testing, such as Buildbot and the GitLab development platform, and the scientists said such tools could be leveraged more fully in ICON’s development process. However, they also said there are inherent limits to how thoroughly researchers can test climate models, because researchers don’t always know what a 100% accurate model output would look like and thus lack a definitive baseline against which to compare actual output.
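
To make the kind of pre-integration testing described above concrete, here is a minimal sketch of an automated regression test that continuous integration tools such as Buildbot or GitLab could run before new code is merged. The function, reference values, and tolerance are invented for illustration; they are not taken from ICON’s test suite.

import numpy as np

def saturation_vapor_pressure(temp_k):
    """Toy Magnus-type formula for saturation vapor pressure over water, in Pa."""
    temp_c = temp_k - 273.15
    return 610.94 * np.exp(17.625 * temp_c / (temp_c + 243.04))

def test_saturation_vapor_pressure_reference_values():
    # Regression test: compare against previously accepted values so that a
    # future code change that silently alters results is caught before merging.
    temps = np.array([273.15, 293.15, 303.15])       # 0, 20, and 30 degrees C
    expected = np.array([610.94, 2333.4, 4237.0])    # Pa, stored reference values
    np.testing.assert_allclose(saturation_vapor_pressure(temps), expected, rtol=1e-3)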

Though the interviewees acknowledged that ICON is imperfect, they also considered it to be “good enough” to forecast weather or to answer research questions such as how increased atmospheric carbon will affect global temperatures. The authors write that although “the principle of ‘good enoughness’” is pragmatic and understandable, it could also lead to misunderstandings if users don’t appreciate a model’s limits. (Earth’s Future, https://doi.org/10.1029/2025EF006318, 2025)

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2025), When is a climate model “good enough”?, Eos, 106, https://doi.org/10.1029/2025EO250332. Published on 10 September 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Extreme Heat in U.S. Cities Revealed at High Resolution

Wed, 09/10/2025 - 12:57
Source: GeoHealth

Recent heat waves in the United States underscore a growing public health threat: Extreme heat events are growing longer, hotter, and more frequent. Soaring temperatures raise the risk of various health problems, such as heat stroke and cardiovascular disease—particularly for older people, people with preexisting conditions, and people who work outdoors.

Understanding these risks, and how to handle them, requires epidemiological research on heat exposure in cities, where most U.S. residents live. However, scientific instruments for measuring urban temperatures are often located at airports, rather than in city centers, where temperatures are typically higher than in surrounding rural regions. Thus, these tools often do not adequately capture the so-called urban heat island effect.

A novel method for measuring heat exposure, created by Marquès and Messier, can pinpoint urban heat islands that previously went undetected. The researchers’ approach harnesses crowdsourced data from the thousands of personal weather stations already installed by residents seeking precise weather information.

The new method employs a statistical technique known as Bayesian hierarchical modeling, which helps account for uncertainty in the crowdsourced temperature data. To demonstrate its capabilities, the researchers applied the method to four urban areas with distinct climates and geography: New York City, Philadelphia, Phoenix, and North Carolina’s “Triangle,” which includes Raleigh, Durham, and Chapel Hill.
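
For readers unfamiliar with the technique, the following sketch shows the general shape of a Bayesian hierarchical model for noisy crowdsourced station temperatures, written in Python with the PyMC library. The data are synthetic and the model structure is generic; it is not the authors’ code or model.

import numpy as np
import arviz as az
import pymc as pm

# Synthetic "crowdsourced" readings: 40 stations spread over 5 neighborhoods.
rng = np.random.default_rng(0)
n_neighborhoods, n_stations = 5, 40
neighborhood = rng.integers(0, n_neighborhoods, size=n_stations)
obs_temp = (30.0
            + rng.normal(0, 1.5, size=n_neighborhoods)[neighborhood]  # local heat islands
            + rng.normal(0, 0.8, size=n_stations))                    # station noise

with pm.Model() as model:
    city_mean = pm.Normal("city_mean", mu=30, sigma=5)           # citywide mean temperature
    sigma_nbhd = pm.HalfNormal("sigma_nbhd", sigma=3)             # spread between neighborhoods
    nbhd_mean = pm.Normal("nbhd_mean", mu=city_mean, sigma=sigma_nbhd,
                          shape=n_neighborhoods)
    sigma_obs = pm.HalfNormal("sigma_obs", sigma=2)               # station measurement noise
    pm.Normal("obs", mu=nbhd_mean[neighborhood], sigma=sigma_obs, observed=obs_temp)
    idata = pm.sample(500, tune=500, chains=2, random_seed=0)

print(az.summary(idata, var_names=["city_mean", "nbhd_mean"]))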

Compared with existing tools, the new method captured urban air temperatures at much higher resolution. It identified urban heat islands that were previously detected imprecisely or not at all, such as hot spots clustered in Philadelphia. In addition, it recognized the cooling effects of urban green spaces, such as New York’s Central Park. It performed well at both high and low temperatures, including during Phoenix’s hottest month on record (July 2023) and a cold blizzard event in Philadelphia and New York in January 2021. The new method also revealed that compared with other areas in the same city, more densely populated neighborhoods were more likely to experience hot temperatures and longer hot nights.

The researchers have made their method publicly available in the hope that it will aid research into the health impacts of heat. This work could also help inform public health initiatives to support communities facing extreme heat, they say. (GeoHealth, https://doi.org/10.1029/2025GH001451, 2025).

—Sarah Stanley, Science Writer

Citation: Stanley, S. (2025), Extreme heat in U.S. cities revealed at high resolution, Eos, 106, https://doi.org/10.1029/2025EO250296. Published on 10 September 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Smallholder Farmers Face Risks in China’s Push for Modern Agriculture

Tue, 09/09/2025 - 20:37
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Community Science

In China, efforts to modernize agriculture through large-scale farming have pushed many smallholder farmers—who produce most of the country’s food—to the margins. One promising solution is “circular agriculture,” which focuses on sustainability, productivity, and rural economic development by encouraging cooperation between large- and small-scale farming operations.

In Community Science’s special collection on Transdisciplinary Collaboration for Sustainable Agriculture, Li and Nielsen [2025] examine a circular agriculture project in southwest China that combines pomelo growing with pig breeding. The authors conducted 35 interviews with smallholder farmers, government officials, employees from financial institutions, and various other stakeholders, capturing a wide range of interests and risks faced in this model.

Their findings show that local governments play a key role in creating platforms for cooperation, while agricultural cooperatives are central to business management. The study also reveals a key challenge: Government involvement is often politically motivated, and smallholders can lose both autonomy and fair representation in decision-making. The authors suggest that for circular agriculture to truly benefit everyone, smallholders need both a voice and power in shaping their future—without having their interests exploited.

Citation: Li, H., & Nielsen, J. Ø. (2025). Smallholders, capital, and circular agriculture—The case of combined pomelo and pig farming in southwest China. Community Science, 4, e2025CSJ000127. https://doi.org/10.1029/2025CSJ000127

—Claire Beveridge, Editor, Community Science

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

How an Interstellar Interloper Spurred Astronomers into Action

Tue, 09/09/2025 - 13:23

On 1 July 2025, astronomers detected a visitor from the deep reaches of space. At the time of discovery, the object was just inside Jupiter’s orbit and was zipping across our solar system 4 times faster than the New Horizons probe sped past Pluto. It was first spotted by the Asteroid Terrestrial-impact Last Alert System (ATLAS) in Chile, which was specifically designed to spot small, fast-moving objects like this. ATLAS sent out a public, automated alert, and when astronomers saw it, they quickly went to work calculating the object’s orbit and trajectory.

That’s when things got interesting. Backtracking the object’s path showed that its origins were not in the Oort cloud, the outermost region of our solar system responsible for most of the comets we see. Instead, the object’s journey started a long time ago in a star system far, far away.

The earliest observations of the object—now labeled 3I/ATLAS for being the third confirmed interstellar object (3I)—showed a distinct coma, or haze of material, surrounding a dense center.

“We knew we were going to get a 3I. We didn’t know when we were going to get a 3I.”

The trajectory of 3I/ATLAS suggests that it will escape the modest gravitational clutches of the Sun in mid-2026, and that time frame has contributed to a flurry of activity among scientists in the emergent field focused on studying interstellar objects (ISOs). Teams of researchers have secured time on some of the most prominent telescopes around the world and in space, combed through telescope archives for “precovery” images, run computer models and simulations, and released nearly three dozen quick-look research papers in astronomy’s preferred preprint repository.

“We knew we were going to get a 3I. We didn’t know when we were going to get a 3I,” said Michele Bannister, who researches small solar system objects at the University of Canterbury in Ōtautahi-Christchurch, Aotearoa New Zealand.

The speed of discoveries about this interstellar visitor outpaced efforts made when the first and second interstellar objects were discovered: 1I/’Oumuamua in 2017 and 2I/Borisov in 2019. One ISO might be a fluke, and two may be a coincidence, but three seemed inevitable. Astronomers took no chances in preparing for the likely arrival of another interstellar visitor.

Teams’ carefully laid plans have borne fruit, enabling rapid-response science, close international collaborations, and a united global effort to learn as much as possible about 3I/ATLAS before it disappears forever.

Planning for 3I

The arrival of ‘Oumuamua caught astronomers by surprise. It was the first discovery of its kind and wasn’t spotted until it was on its way out of the solar system. Researchers had a mere 2 weeks to get all the data they possibly could, taking their best guesses about what telescopes, instruments, and wavelengths would provide the best data on such short notice.

When something like ‘Oumuamua shows up, “you immediately write what’s called a director’s discretionary [DD] proposal,” explained Karen Meech, a planetary astronomer at the University of Hawai‘i’s Institute for Astronomy. “You scramble, you write a proposal, you submit it. The [telescope] director reads it and makes a decision without a review panel.” Bypassing a review panel speeds up the process but is less democratic.

Having found one ISO, researchers started putting in DD proposals every semester in case another one showed up.

“Astronomers are always trying to use these facilities as efficiently as possible.”

When Borisov appeared 2 years later, it was immediately obvious that it was radically different from ‘Oumuamua. The way observations were allotted on telescopes was also different—facilities became overwhelmed with the sheer volume of DD proposals, Meech said. That led to duplicate observations and some teams’ observations being bumped entirely when a newer, but identical, proposal came in. Telescopes have since worked out those kinks in the system to streamline the DD proposal process.

Anticipating the inevitable detection of a third interstellar object, many ISO observers took a different approach: target of opportunity (TOO) proposals. TOO is a process commonly used in branches of astronomy that study unpredictable phenomena like supernovas, kilonovas, gravitational waves, and gamma ray bursts. Researchers submit observing proposals for short observations of events that could happen at any time. If the event occurs, the team can trigger those telescope observations.

“Most collaborations, including ours, have preapproved dormant programs at the world’s largest telescopes ready to be activated when a suitable [ISO] candidate is confirmed,” said Raúl de la Fuente Marcos, who researches small solar system objects at the Universidad Complutense de Madrid in Spain. Before ‘Oumuamua, “such a discovery was considered highly unlikely. Now all the collaborations that have been involved in early data releases of 3I/ATLAS have such systems.”

Four images taken by the Hubble Space Telescope on 21 July track the motion of 3I/ATLAS through the solar system. Background stars are visible as streaks because the telescope followed the comet’s motion. Credit: Images taken by David Jewitt/NASA/ESA/Space Telescope Science Institute (STScI), processed by Nrco0e via Wikimedia Commons, Public Domain

“Basically, if you give us more than a semester to plan, we will plan,” Bannister said. “Astronomers are always trying to use these facilities as efficiently as possible.”

De la Fuente Marcos and his team imaged and obtained spectra of 3I/ATLAS with the Gran Telescopio Canarias and the Two-meter Twin Telescope, both in Spain’s Canary Islands. Their observing program was triggered a mere 6 hours after 3I/ATLAS was confirmed as an interstellar object, allowing them to observe the comet from 2 to 5 July. Their results, published in Astronomy and Astrophysics, were the first to show that 3I/ATLAS’s spectrum is red and dusty, not too dissimilar from dusty solar system comets.

Teddy Kareta’s observations were more serendipitous. Kareta, a planetary scientist at Villanova University in Pennsylvania, already had time scheduled on the NASA Infrared Telescope Facility (IRTF) for 3 and 4 July. He learned about 3I/ATLAS the evening before his observing run and thought, “That’s too cool to be real,” he recalled.

“And then I woke up to about seven text messages, three missed calls, a dozen emails, most of which were saying, ‘Hey, I noticed you’re on the telescope because I checked the schedule— You’re gonna go out, right?’” Kareta said.

But the comet was coming in much faster than past ISOs and from a direction that made it challenging to observe.

“It was a very communal planning process, which I think for science often doesn’t happen so quick and on the fly.”

“People were coming up with observational plans on the fly,” Kareta said. “I pointed a 4-meter telescope at it for 2 full hours, and I think I got three useful images.”

There were plenty of emails, group chats, and Zoom calls trying to figure out the best telescope and camera settings.

“It was a very communal planning process, which I think for science often doesn’t happen so quick and on the fly,” Kareta said. “It felt more like a readiness exercise than it did like a traditional kind of planning….You need as many hands on deck as possible to make it work at all.”

Kareta and his colleagues’ infrared spectral observations, accepted for publication in Astrophysical Journal Letters, suggest that the comet may have a complex grain size distribution, grain compositions unlike solar system comets, or both.

A Broad Research Umbrella

By its galaxy-traveling nature, 3I/ATLAS quite literally connects comet science with the study of stars, planetary systems, and the galaxy.

ISO theorists have spent the time since Borisov’s departure working on a computer model that predicts the properties of interstellar objects across the galaxy. They had timed the release of their Ōtautahi-Oxford model for the beginning of science operations of the Vera C. Rubin Observatory and its Legacy Survey of Space and Time (LSST), which is expected to discover dozens of potential interstellar objects.

“We knew that LSST and Rubin were going to find loads, but we just thought this was going to happen in 6 months’ time, not now,” said Matthew Hopkins, who studies both ISOs and galaxy evolution at the University of Oxford in the United Kingdom.

Comet 3I/ATLAS “really did arrive with fantastic timing.”

Luckily, the model team, composed of people studying interstellar objects, comets, stars, and galaxy dynamics, was putting the finishing touches on a program that could analyze an ISO’s speed and orbital information and predict where in the galaxy it may have come from.

Comet 3I/ATLAS “really did arrive with fantastic timing,” Hopkins said.

The team jumped into action when the comet’s orbital characteristics were announced. It was detected when it was 670 million kilometers (420 million miles) away, traveling at nearly 60 kilometers per second and coming in at a steep angle. Bannister, part of Ōtautahi-Oxford’s New Zealand contingent, said that her team was able to share its results so quickly because it had members scattered from western Europe to New Zealand. After working all day, the New Zealanders could hand off the research to European team members, whose day was just starting. By tag teaming the science, they submitted their analysis to Astrophysical Journal Letters about 84 hours after the comet’s discovery. (It has since been published.)

The orbit of 3I/ATLAS will take it within the orbit of Mars, with close passes to both Mars and Jupiter. Credit: CSS, D. Rankin; Video recorded and edited by Renerpho via Wikimedia Commons, CC BY-SA 4.0

“Especially for 3I, given that it was time sensitive, we definitely wanted to share our results as we had them,” Hopkins said.

The Ōtautahi-Oxford model showed that because 3I/ATLAS entered the solar system at a much steeper angle than either ‘Oumuamua or Borisov, it likely came from a different region of the galaxy, a part known as the thick disk. Though most young and middle-aged stars, including the Sun, live in the narrow thin disk of the Milky Way, many older stars live in the thick disk. The trajectory of 3I/ATLAS suggests that it originated from a star system that could be more than 7.6 billion years old. Indeed, its parent star may already be dead.

The age of 3I/ATLAS has intrigued many researchers who study stellar populations, galaxy dynamics, the birth of exoplanetary systems, and astrobiology, fields that are usually disparate and siloed.

“If you’re studying interstellar objects, you’re sitting cleanly at the division between planetary science and traditional astrophysics.”

“If you’re studying interstellar objects, you’re sitting cleanly at the division between planetary science and traditional astrophysics,” Kareta said. “And I think that means that people from both groups immediately know these are important.”

“Our colleagues who do extragalactic science and supernovae are really excited to help with 3I, and so we’re trying to trigger everything we can on the big telescopes,” Meech said. Her group had been hoping to use the Keck II telescope in Hawaii to obtain high-resolution infrared spectra of the comet, but the telescope had been experiencing technical issues. A student studying kilonovas had TOO time on the nearby James Clerk Maxwell Telescope and donated it.

“He said, ‘You know what, [the kilonova is] not going to go off in the next 2 weeks. Let’s use it for this,’” Meech recalled. “And so we got five nights of observations on this object.” Meech and her colleagues are still analyzing those data to understand the abundances of certain gases in 3I/ATLAS’s coma.

The Long-Term Strategy

Several weeks after its initial discovery, it is clear that 3I/ATLAS looks and behaves like a comet. It’s now millions of kilometers closer to the Sun than it was upon detection in early July, and more recent observations, including from the Hubble Space Telescope, James Webb Space Telescope, Very Large Telescope, and more, have shown a dusty coma emitted from the Sun-facing side and the beginnings of a traditional comet tail behind it.

Most of the earliest 3I/ATLAS papers are still undergoing peer review, and Kareta said that more research analyzing July observations will continue to trickle out. In addition, groups that wrote early papers will be going back over their data to put them in context with newer information and provide deeper analyses of those initial quick looks.

Hubble imaged 3I/ATLAS on 21 July. The comet is shedding dust in the direction of the Sun (right) and is haloed by a coma. Background stars are streaked, as the telescope followed the comet’s movement. NASA, ESA, D. Jewitt (UCLA); Image Processing: J. DePasquale (STScI), Public Domain

However, with the early rush of observations mostly completed, some scientists are turning their attention to what they want to learn about 3I/ATLAS in the coming months.

“A lot of teams are still scrambling to get telescope time,” Meech said.

The comet will reach its closest approach to the Sun, a mere 35% farther than the Earth-Sun distance, on 29 October. Earth will lose sight of it in the Sun’s glare in early September, but by mid-August, 3I/ATLAS had already started outgassing, as predicted. Astronomers were eager to analyze the chemistry of the gases it emitted because that could give clues about its history.

“Stellar encounters this close are actually really rare for interstellar objects,” Hopkins said. This is probably 3I/ATLAS’s first encounter with a star since it was booted out of its own system, and its surface material has likely been frozen in time since then. “We can use that to learn some really cool things about the chemistry of its parent star halfway around the galaxy, even if it’s dead.”

Spectra obtained from 3I/ATLAS’s coma in mid-August showed strong signs of water ice, carbon dioxide, nickel, and cyanide—all expected of a comet emitting a mixture of gas and dust as it heats up. “Typically for comets, the first thing you see is CN, cyanide, not because it’s particularly abundant but because it interacts so strongly with sunlight,” Meech said.

“There’ll be a lot of happy arguments around ‘Where did this form in the disk of its home star, and what does that tell us about the conditions that were like in that protoplanetary disk.’”

Indeed, scientists are seeing an object not too unlike a domestic comet, and they’ll continue to monitor its outgassing as it gets closer to the Sun.

The outgassing of carbon monoxide would be particularly telling, as the compound freezes solid only in extremely cold conditions like those that exist in the outer reaches of a star system. So if 3I/ATLAS outgasses carbon monoxide, Hopkins explained, it would be a strong hint that the object may have formed in the coldest outer regions of its system’s protoplanetary disk.

“There’ll be a lot of happy arguments around ‘Where did this form in the disk of its home star, and what does that tell us about the conditions that were like in that protoplanetary disk,’” Bannister added.

Still, who knows? “These are representative fragments of star formation elsewhere. There’s no reason that every protoplanetary disk has the same chemical distribution,” Meech said.

Every snapshot researchers get from now until 3I/ATLAS’s departure will help them put together a holistic, time series picture of the comet as it heats up and evolves. No one even knows whether it will survive its closest approach to the Sun in October.

All eyes, and telescopes, will be trained on its predicted point of emergence in late November.

Time Enough for Everyone

The biggest advantage that scientists have with 3I/ATLAS that they did not have with 1I/’Oumuamua is time—time not only to make more observations and analyses but to enable the widest participation possible.

‘Oumuamua arrived in October, the middle of the academic semester. Scientists who could respond quickly tended to be senior-level researchers, those with fewer teaching responsibilities, and those at institutions with easier access to telescope facilities, Kareta explained. Early-career scientists, those involved with research programs, or those who had inflexible responsibilities were less able to contribute to the groundbreaking discovery in the two-ish weeks before the object disappeared.

“The longer we have to study it, that means more people can work on it, more brains can take a crack at the problem and…leave their mark on this object.”

With 2I/Borisov and now with 3I/ATLAS, a monthslong observation window has enabled a larger, more diverse group of scientists from around the world to participate in observing, analyzing, and discussing this discovery.

“The longer we have to study it, that means more people can work on it, more brains can take a crack at the problem and…leave their mark on this object,” Kareta said.

And that can be only a positive thing for this nascent, but growing, field of science.

“We’re 7 years into this field of small-body galactic studies,” Bannister said. “There’s a whole different generation of people coming into this than were involved in 1I and even 2I. That’s really exciting to see.”

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

Citation: Cartier, K. M. S. (2025), How an interstellar interloper spurred astronomers into action, Eos, 106, https://doi.org/10.1029/2025EO250329. Published on 9 September 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Can Microorganisms Thrive in Earth’s Atmosphere, or Do They Merely Survive There?

Tue, 09/09/2025 - 13:19
Source: Journal of Geophysical Research: Biogeosciences

This is an authorized translation of an Eos article. Esta es una traducción al español autorizada de un artículo de Eos.

Earth’s atmosphere carries tiny cellular life forms such as fungal spores, pollen, bacteria, and viruses. Along the way, these microorganisms face challenging conditions, including low temperatures, ultraviolet radiation, and a lack of available nutrients. Previous research has shown that certain microorganisms can withstand these extreme conditions and potentially remain dormant until they settle in a more favorable environment. But could the atmosphere itself also host an active microbial system, one harboring growing, adapted, resident microorganisms?

The study of these drifting life forms is called aerobiology, but advancing the field is difficult: There is no standardized method for sampling the aeromicrobiome, microbial samples are often contaminated, and atmospheric conditions are hard to reproduce in a laboratory setting.

Martinez-Rabert and colleagues suggest that computational modeling and theoretical approaches could help improve understanding of the aeromicrobiome. Drawing on what is known about the metabolism and bioenergetics of microbial life, especially in extreme environments, as well as on atmospheric chemistry and physics, specialized modeling frameworks can provide insight into the aeromicrobiome.

This bottom-up modeling approach, the researchers propose, would let them test how changing individual elements of Earth’s atmosphere would affect the proliferation of the microbial life it contains. For example: Are microbes better adapted to a “free-floating” lifestyle in atmospheric gases, to life within droplets, or to attachment to solid particles? What energy sources are available to these microorganisms? How does the acidity of atmospheric aerosols influence the ability of atmospheric microorganisms to thrive?

The group suggests that, combined with data from sampling, experiments, and observations, theoretical models could help researchers assess our atmosphere’s capacity to sustain a microbial biosphere and even better understand how microorganisms influence the chemical composition of the atmosphere. This work, they note, could also prove useful in the future for modeling how life might exist in other planetary atmospheres. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1029/2025JG009071, 2025)

—Rebecca Owen (@beccapox.bsky.social), Science Writer

This translation by Saúl A. Villafañe-Barajas (@villafanne) was made possible by a partnership with Planeteando and Geolatinas. Esta traducción fue posible gracias a una asociación con Planeteando y Geolatinas.

Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Heat Spurs Unequal Consumption of Sweet Treats

Mon, 09/08/2025 - 17:12

The United States could consume the added-sugar equivalent of as much as 7 billion additional cans of soda per year by 2095 as a result of climate change, according to a new study. 

A new analysis of food consumption patterns and weather data in the country, published in Nature Climate Change, showed that warmer temperatures increase household purchases of food and beverage products with added sugar, especially among low-income and less educated populations. 

“There’s a huge difference across different socioeconomic groups.”

“There’s a huge difference across different socioeconomic groups,” said Duo Chan, a climate scientist at the University of Southampton and coauthor on the new study. 

Researchers analyzed retail food and drink purchases in more than 40,000 U.S. households from 2004 to 2019 along with monthly average temperatures at the county scale. They found that the average adult male purchased about 0.70 gram of additional added sugar per day for every additional 1°C (1.8°F) of temperature between 12°C (53.6°F) and 30°C (86°F).

Researchers ran multiple regression analyses to rule out other factors that could have influenced the purchasing of sugary products. With the effects of other variables, such as changes in product price, removed, added-sugar intake remained significantly associated with temperature, Chan said.

“It’s the change in temperature that [leads] to the sugar intake,” said Pengfei Liu, an environmental economist at the University of Rhode Island and coauthor of the new study.

The trends, according to the authors, are probably driven by people choosing hydrating, sweet beverages and cold desserts to mitigate the effects of heat. The patterns are “common sense,” said Thalia Sparling, a public health researcher at the London School of Hygiene & Tropical Medicine who was not involved in the new research. “Of course, when it gets hotter, you’re going to want to sit on the porch with your friends and have a cold drink or eat more ice cream.”

Education and income levels of the heads of households influenced how sensitive those households’ sugar consumption patterns were to increases in temperature. Households whose heads had lower incomes and less education increased their added sugar purchases more per degree of temperature increase. Purchases of sweetened beverages and frozen desserts constituted the bulk of the increased sugary purchases.

The study authors used Coupled Model Intercomparison Project Phase 6 (CMIP6) climate models to project that warming temperatures in a high-emissions world could change diets enough to add nearly 3 grams of sugar per day to the average U.S. diet by the end of the century—equivalent to about 30 cans of soda per person per year. The projected effects were unequal among socioeconomic groups, too, with lower-income and less educated households expected to increase their sugar intake more than their higher-income, more educated counterparts. 
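
A back-of-the-envelope calculation shows how the per-person figure scales to the headline national number. The added sugar per can (roughly 39 grams for a 12-ounce soda) and the U.S. adult population (roughly 260 million) are assumptions chosen for illustration, not values taken from the study.

# Rough check of the soda-can equivalence (illustrative only).
grams_per_day = 3.0      # projected end-of-century increase in added sugar per person
grams_per_can = 39.0     # assumed added sugar in one 12-ounce can of soda
adults = 260e6           # assumed U.S. adult population

cans_per_person_per_year = grams_per_day * 365 / grams_per_can
cans_nationwide_per_year = cans_per_person_per_year * adults

print(f"{cans_per_person_per_year:.0f} cans per person per year")        # about 28
print(f"{cans_nationwide_per_year / 1e9:.1f} billion cans nationwide")   # about 7.3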

Dietary Inequality

“This is just another piece of evidence showing that the impacts of climate change on people are not equal.”

“This is just another piece of evidence showing that the impacts of climate change on people are not equal,” Sparling said.

Nutrition researchers have long known that lower-income groups eat less healthily, Sparling said, because of economic factors, lack of access to healthier foods, and even work environment. “People in communities with lower average [socioeconomic status] are less likely to have air-conditioned workplaces, schools, homes, or respite in other ways,” she said. Those without a way to escape the heat may be more likely to reach for cold, sweetened drinks or desserts for relief.

“Low-income people are most vulnerable to climate change in a lot of cases, and also in our case, in terms of excessive sugar intake,” Chan said. 

Higher added-sugar consumption can increase the risk of various health problems such as obesity, diabetes, and heart disease. But because the causes behind health problems are so complex, much more research would be needed to link warming to specific increases in disease, Sparling said.

She stressed that the onus to improve dietary choices should not be all on the individual: “You have to look at systems level change,” she said. Policies such as taxes on sweetened beverages have decreased added sugar consumption in some cities and countries. Education in communities, schools, and churches can also help people form healthier habits, Sparling said. 

More research is needed to determine whether the trends seen in the United States are reflected across the globe, said Pan He, an environmental social scientist at Cardiff University and first author of the new study. Then, scientists could have an even more comprehensive understanding of how global food consumption patterns may adapt to climate change, she said. 

“I hope our study may shed light on future evidence in developing countries, where sugary-beverage intake is already high and rising heat could further threaten nutrition security,” wrote Yan Bai, an economist at the World Bank and coauthor of the new study, in an email.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Citation: van Deelen, G. (2025), Heat spurs unequal consumption of sweet treats, Eos, 106, https://doi.org/10.1029/2025EO250333. Published on 8 September 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Protein-Powered Biosensors with a Nose for Environmental Ills

Mon, 09/08/2025 - 13:54

Imagine a farmer standing in her field—or even sitting at home—when she gets an alert from a handheld device: Her crops are showing signs of stress, not from heat, drought, or lack of nutrients in this case, but from a pesticide spill detected upstream. The alert doesn’t come from a lab test conducted days after the initial contamination. Instead, it is generated in real time by a portable biosensor containing a protein derived from a pig’s nose.

Chemical-sniffing sensors, though not yet widely adopted in agriculture or environmental science, represent an emerging field of research and development.

The protein has been reprogrammed to mimic the molecular recognition capabilities of animal olfaction, allowing it to detect specific volatile chemicals associated with pesticide contamination and enabling rapid on-site detection. With this early warning, the farmer can act swiftly to mitigate negative impacts and protect both her crops and future yield.

Chemical-sniffing sensors like this, though not yet widely adopted in agriculture or environmental science, represent an emerging field of research and development. They also offer viable new tools for addressing urgent 21st century environmental challenges related to climate change, rapid industrialization, urban sprawl, deforestation, and agricultural intensification, which threaten biodiversity, food security, and public health globally.

In response to these challenges, the United Nations’ Sustainable Development Goals (SDGs)—particularly SDG 6 (Clean Water and Sanitation), SDG 12 (Responsible Consumption and Production), SDG 13 (Climate Action), SDG 14 (Life Below Water), and SDG 15 (Life on Land)—call for integrated, data-driven approaches to environmental monitoring and management that support environmentally, economically, and socially responsible practices.

Implementing these goals, especially in remote regions and developing nations, requires affordable, scalable methods for monitoring air, water, and soil resources that deliver timely and actionable information. Naturally occurring animal proteins, paired with biosensing technology, offer a promising foundation.

A Transformative New Approach

Detecting pollutants in air, water, or soil often requires sending samples to distant laboratories for gas chromatography, mass spectrometry, or other high-precision analyses. These tools are indispensable for regulatory science because they deliver highly accurate, standardized measurements of trace contaminants that can withstand legal and policy scrutiny. However, the time required to collect, ship, and analyze samples delays results and limits their usefulness for rapid, local decision-making.

Conventional in situ monitoring systems, such as stationary air- and water-quality stations, provide continuous data but are expensive to install and maintain. As a result, they are typically sparsely distributed, provide limited spatial coverage, and require significant power, connectivity, and upkeep. Together the high costs and infrastructure demands of current methods make them impractical for widespread field deployment, especially in remote, resource-limited, or rapidly changing environments.

Biosensors present a viable, transformative alternative. Compact, energy-efficient, and often portable, these devices combine biological recognition elements, such as enzymes, antibodies, or odorant-binding proteins (OBPs), with signal transducers to detect specific compounds on-site, in real time, and at low cost. Notably, these devices can sense volatile chemicals and bioavailable pollutant fractions, making them well-suited to complement or even replace traditional environmental monitoring tools in certain settings.

Fig. 1. The structure of a porcine odorant-binding protein (pOBP) is shown by this ribbon diagram. A molecule of butanal, a volatile organic compound used in industrial manufacturing applications, is depicted within the binding cavity of the protein. Credit: Cennamo et al. [2015], CC BY 4.0

OBPs, a class of tiny but mighty proteins found in the olfactory systems of insects and vertebrates, are especially appealing options (Figure 1). They detect trace amounts of odorants—the molecules behind scents—in complex, chemically noisy environments. Whether it is a moth navigating miles to find a mate, or a mammal sniffing out food, OBPs enable detection of a few key molecules amid thousands.

Today researchers are repurposing OBPs to sniff out the chemical by-products of modern life. These proteins possess high thermal and chemical stability, are easy to synthesize, and are remarkably versatile. They can be integrated into portable devices and miniaturized sensors, affixed to biodegradable materials, and genetically engineered to target specific chemicals in soil, air, or water.

OBPs in Action

Despite their promise, biosensors remain underrepresented in discourse and planning related to environmental monitoring and sustainability. More often than not, prototypes developed and tested in the laboratory fail to reach broad application in the field. Specific uses of OBPs have remained largely siloed within biomedical and entomological research.

However, emerging applications of OBPs align closely with key geoscience priorities, including tracking pesticide and industrial runoff, monitoring volatile compounds and mapping soil emissions, and identifying plant health indicators tied to environmental stress and drought. Several proof-of-concept and real-world demonstrations are already underway, highlighting how OBPs can detect a range of pollutants across different environments.

Fig. 2. These prototype biosensors (top), with the gold-plated sensing area at the far left of each, were designed to detect benzene in the environment. The diagram (bottom) illustrates the process by which the sensing surface was chemically functionalized with pOBP (pink ribbon diagram). Credit: Capo et al. [2022], CC BY 4.0

Porcine OBPs have been engineered to detect BTEX pollutants (benzene, toluene, ethylbenzene, and xylene) originating from pesticides and petroleum runoff that threaten groundwater and soil health (Figure 2) [Capo et al., 2022, 2018]. Bovine OBPs, immobilized on cartridge-like devices, can selectively bind and remove triazine herbicides from water, demonstrating potential for both detection and remediation of the pollutant in water treatment [Bianchi et al., 2013].

Sensors coated with bovine and porcine OBPs detect trace, mold-related volatile organic compounds (VOCs) such as octenol and carvone [Di Pietrantonio et al., 2015, 2013], which is relevant to both indoor and outdoor air quality monitoring and mitigation of post-harvest crop losses. Low-cost, OBP-functionalized devices have also demonstrated selective detection of butanal, a common VOC linked to industrial and urban particulate matter [Cennamo et al., 2015].

In addition to bovine and porcine OBPs, rat OBP derivatives have been customized and immobilized on sensing platforms to enable simultaneous VOC profiling for air and water pollution diagnostics [Hurot et al., 2019]. Furthermore, insect OBPs, embedded in fluorescence-based biosensors, have shown efficacy for detecting bacterial metabolites, offering a possible approach for rapid coliform bacteria screening in drinking water [Dimitratos et al., 2019].

Beyond environmental and water quality applications, OBPs from multiple species have also been used to monitor for plant-emitted VOCs that signal stress, disease, drought, or pest infestation in agricultural systems [Wilson, 2013], providing valuable insights into crop health and enabling early intervention strategies.

The Potential Is Enormous

Integrating OBPs into environmental monitoring systems opens new frontiers in climate-smart agriculture, distributed sensing networks, and adaptive land use management.

Integrating OBPs into environmental monitoring systems opens new frontiers in climate-smart agriculture, distributed sensing networks, and adaptive land use management. These sensors offer lab-grade sensing of emissions from sources such as livestock waste, fertilizer application, and wetland activity. They may also enable real-time monitoring of greenhouse gas precursors and early detection of soil degradation, microbial shifts, or drought stress—all delivered through devices small enough to fit in your pocket.

Early detection of pollutant leaks or VOC hot spots could inform land use strategies that mitigate volatile emissions, improve air quality, and strengthen climate adaptation. OBP sensors’ low power requirements and biodegradability make them ideal for decentralized deployments, especially in low-resource or remote areas. Engineered differently, these proteins could even serve in preventative technologies as molecular sponges or scavengers that capture and bind VOCs before they accumulate or disperse.

Ultimately, OBPs could enable more data-driven decisions in conservation and climate policy, while offering novel tools for mapping environmental dynamics (e.g., tracking the spread of wildfire smoke plumes, monitoring methane emissions, or detecting waterborne coliforms across river networks) at finer spatial and temporal resolutions than current technologies permit.

From the Bench to the Biosphere

We envision a future in which OBPs are central to smart agriculture platforms, mobile environmental sensing labs, and biodegradable field-deployable kits. The underlying technology is sound, but breakthroughs like this don’t happen in isolation. Cross-disciplinary collaboration is crucial to accelerate and scale this development, reduce risks of field deployments, and ensure that innovations are aligned with real-world policy and practice.

We propose several pathways to support this collaboration and innovation. For example, targeted workshops and research consortia could facilitate dialogue among molecular biologists, environmental engineers, and Earth scientists to identify priority research questions and focus efforts on specific environmental challenges.

Key questions for advancing OBP-based sensing include the following: Which pollutants and ecosystem signals are most critical for understanding today’s environmental challenges? How can OBPs be tuned to target specific compounds under varying soil, air, or water conditions? What substrates can effectively host OBPs for real-world sensing without compromising environmental safety?

As part of this dialogue, environmental scientists could contribute by generating regional maps of priority VOCs linked to specific issues such as crop stress, emissions from peatlands, or urban air pollutants, guiding optimization of OBP-based sensors. Similarly, chemists and bioengineers could collaborate to expand the library of OBPs with tailored affinities for emerging pollutants, such as pharmaceutical residues, industrial solvents, or novel agrochemicals, broadening the range of compounds detectable in real-world settings. In parallel, data scientists and systems engineers could develop machine learning models to decode complex VOC signatures captured by OBP sensors, enabling real-time diagnostics, pattern recognition, and predictive analytics across environmental monitoring networks.

Expanding access to knowledge and resources represents another key pathway for advancing OBP-based sensing.

Expanding access to knowledge and resources represents another key pathway for advancing OBP-based sensing. Developing curated, open-source, and searchable repositories of OBPs from diverse organisms with characterized binding affinities for high-priority VOCs would accelerate biosensor design and prototyping. Such repositories should follow FAIR (Findable, Accessible, Interoperable, Reusable) data principles to maximize their usefulness across disciplines and platforms.

In the United States, agencies such as the National Science Foundation, the Department of Agriculture, EPA, and the Department of Energy could accelerate progress by hosting seed funding workshops to define shared goals, barriers, and applications and by providing joint funding for interdisciplinary biosensing projects.

Establishing and sharing experimental field test beds such as smart farms, urban air zones, and wetlands would enable pilot testing of OBP-based sensors alongside conventional instruments. These biosensors could be integrated into existing monitoring networks like the National Ecological Observatory Network and the Long Term Ecological Research Network. Their outputs could feed into land use, emissions, and ecological models to improve the spatiotemporal resolution of environmental data.

Building on these test beds and integrated networks, collaborating researchers could report cross-disciplinary benchmark studies and coauthor seminal papers detailing protocols, use cases, and best practices for OBP-based biosensing. This coordinated effort would guide future research and help establish the field’s credibility with regulators and funding agencies.

Clearing Barriers on the Road Ahead

For all of the upsides of OBP-based biosensing, several technical and logistical issues must be addressed before such sensors can be widely deployed.

Despite their superior stability compared to enzymes or typical antibodies [Dimitratos et al., 2019], OBPs remain susceptible to denaturation or degradation during prolonged environmental exposure. Environmental conditions such as humidity, pH, and salinity can affect their performance, underscoring the need for robust protocols to stabilize and calibrate these proteins across diverse ecosystems.

Advancing real-time data acquisition and remote monitoring with OBP-based biosensing also requires progress toward integrating the proteins with digital platforms in scalable and reproducible formats. Key challenges include reducing sensor-to-sensor variability, increasing sensor lifespans, and converting biological signals into stable, digitized outputs.

Realizing the technology’s broader potential will require rigorous technical validation, clear regulatory guidance, and proactive efforts to educate and engage stakeholders.

In addition to technical barriers, regulatory frameworks and approval pathways for OBP-based sensing technology remain underdeveloped, and concerns about the lack of standardized validation protocols and the effects of releasing recombinant proteins into agricultural or environmental settings persist. Moreover, low awareness among end users, including farmers and land managers, may hinder trust and uptake of the technology. Realizing its broader potential will thus require rigorous technical validation, clear regulatory guidance, and proactive efforts to educate and engage stakeholders across sectors.

Notwithstanding these challenges, the promise is clear: OBPs offer a flexible and powerful approach for monitoring environmental changes and climate risk, helping to protect ecosystems, food systems, and communities. Once known primarily to entomologists, these little scent-sniffing proteins could become an unexpectedly powerful tool for advancing environmental resilience.

References

Bianchi, F., et al. (2013), An innovative bovine odorant binding protein-based filtering cartridge for the removal of triazine herbicides from water, Anal. Bioanal. Chem., 405, 1,067–1,075, https://doi.org/10.1007/s00216-012-6499-0.

Capo, A., et al. (2018), The porcine odorant-binding protein as molecular probe for benzene detection, PLOS One, 13(9), e0202630, https://doi.org/10.1371/journal.pone.0202630.

Capo, A., et al. (2022), The porcine odorant-binding protein as a probe for an impedenziometric-based detection of benzene in the environment, Int. J. Mol. Sci., 23(7), 4039, https://doi.org/10.3390/ijms23074039.

Cennamo, N., et al. (2015), Easy to use plastic optical fiber–based biosensor for detection of butanal, PLOS One, 10(3), e0116770, https://doi.org/10.1371/journal.pone.0116770.

Dimitratos, S. D., et al. (2019), Biosensors to monitor water quality utilizing insect odorant-binding proteins as detector elements, Biosensors, 9(2), 62, https://doi.org/10.3390/bios9020062.

Di Pietrantonio, F., et al. (2013), Detection of odorant molecules via surface acoustic wave biosensor array based on odorant-binding proteins, Biosensors Bioelectron., 41, 328–334, https://doi.org/10.1016/j.bios.2012.08.046.

Di Pietrantonio, F., et al. (2015), A surface acoustic wave bio-electronic nose for detection of volatile odorant molecules, Biosensors Bioelectron., 67, 516–523, https://doi.org/10.1016/j.bios.2014.09.027.

Hurot, C., et al. (2019), Highly sensitive olfactory biosensors for the detection of volatile organic compounds by surface plasmon resonance imaging, Biosensors Bioelectron., 123, 230–236, https://doi.org/10.1016/j.bios.2018.08.072.

Wilson, A. D. (2013), Diverse applications of electronic-nose technologies in agriculture and forestry, Sensors, 13(2), 2,295–2,348, https://doi.org/10.3390/s130202295.

Author Information

Ishani Ray (isray@okstate.edu) and Smita Mohanty, Department of Chemistry, Oklahoma State University, Stillwater

Citation: Ray, I., and S. Mohanty (2025), Protein-powered biosensors with a nose for environmental ills, Eos, 106, https://doi.org/10.1029/2025EO250330. Published on 8 September 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Strong Tides Speed Melting of Antarctic Ice Shelves

Mon, 09/08/2025 - 13:53
Source: Journal of Geophysical Research: Oceans

Antarctic ice is melting. But exactly which forces are causing it to melt and how melting will influence sea level rise are areas of active research. Understanding the decay of ice shelves, which extend off the edges of the continent, is particularly pressing because these shelves act as barriers between ocean water and land. Without ice shelves, the continent’s glaciers would flow freely into the ocean, hastening sea level rise.

In January 2015, a group of researchers used hot water to drill a hole through 740 meters of the Ronne Ice Shelf. They then lowered a mooring carrying temperature, salinity, and current sensors through the hole into the ocean below. A radio echo sounder deployed 15 meters from the mooring kept tabs on ice thickness. For the next 3 years, the instruments took measurements every 2 hours; these measurements were sent to a solar-powered data logger on the surface and then on to researchers via satellite.

Anselin et al. recently used these measurements to probe the forces responsible for melting ice shelves.

The tide is a major force contributing to ice shelf melting, the researchers found. As the tide rises, the water rushes across the bottom of the shelf. Friction between the shelf and the water causes the current just beneath the ice to slow. This slowdown leads to strong mixing within the ocean, and this mixing brings heat to the ice base, driving high melt rates. Because the strength of tides varies depending on the positions of the Sun and the Moon relative to Earth, ice shelf melting has a cyclical pattern, with melting ebbing and flowing every 2 weeks.
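
One way to see the mechanism is through the simple scaling often used for ice shelf basal melt: The melt rate is roughly proportional to the friction velocity of the current beneath the ice multiplied by how far the water temperature sits above the local freezing point, so stronger tidal currents mix more heat up to the ice base. The short Python sketch below illustrates that scaling over an idealized fortnightly tidal cycle; it is not the authors' model, and every number in it is an assumption chosen purely for illustration.

```python
import numpy as np

# Illustrative sketch only (not the study's model): a common scaling treats basal
# melt as proportional to friction velocity times thermal driving,
#   melt_rate ~ gamma_T * u_star * (T_water - T_freeze),
# so stronger tidal currents -> more mixing -> more heat delivered to the ice base.

days = np.arange(0, 28, 0.25)  # four weeks at 6-hour steps
# Tidal current amplitude (m/s) with an assumed ~14.77-day spring-neap modulation
current = 0.10 + 0.05 * np.cos(2 * np.pi * days / 14.77)
u_star = np.sqrt(2.5e-3) * current   # friction velocity from a quadratic drag law (Cd assumed)
thermal_driving = 0.05               # K above the in situ freezing point, assumed
gamma_T = 1.1e-2                     # dimensionless heat-transfer coefficient, assumed

# Convert the turbulent heat flux to an ice-equivalent melt rate (m/yr)
rho_w, rho_i, c_w, L_f = 1028.0, 917.0, 3974.0, 3.34e5
melt_m_per_yr = (rho_w * c_w * gamma_T * u_star * thermal_driving) / (rho_i * L_f) * 3.15e7

print(f"melt rate over the spring-neap cycle: "
      f"{melt_m_per_yr.min():.2f} to {melt_m_per_yr.max():.2f} m of ice per year")
```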

However, current models of melt rates fail to capture the full extent to which tidal mixing and warm ocean water combine to melt ice. When analyzing data from additional sites, scientists should focus on how the interaction between tides and ice shelves leads to melting, the researchers say. (Journal of Geophysical Research: Oceans, https://doi.org/10.1029/2025JC022524, 2025)

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2025), Strong tides speed melting of Antarctic ice shelves, Eos, 106, https://doi.org/10.1029/2025EO250331. Published on 8 September 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Quantifying Predictability of the Middle Atmosphere

Fri, 09/05/2025 - 13:43
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Journal of Geophysical Research: Atmospheres

Atmospheric circulations are chaotic and unpredictable beyond a certain time limit. Quantifying predictability helps determine which forecast problems are potentially tractable. However, while the predictability of weather close to the surface is a much-studied problem, with a prediction limit of approximately 10 days, less is known about how predictable the atmosphere is in higher layers.

Garny [2025] applies a high-resolution global model to study atmospheric predictability from the surface to the mesosphere/lower thermosphere (MLT; 50–120 kilometers altitude), providing new understanding of coupling between atmospheric levels and fundamental behavior of the upper atmosphere. The author shows that the MLT is somewhat less predictable than lower atmospheric layers due to rapid growth of ubiquitous small-scale waves, with predictability horizons of about 5 days. However, atmospheric flows in the MLT on larger horizontal scales of a few thousand kilometers can remain predictable for up to 3 weeks.

This research highlights the importance of high-resolution, ‘whole atmosphere’ models to understand and predict circulations in the middle atmosphere and coupling from the surface to the edge of space.

Citation: Garny, H. (2025). Intrinsic predictability from the troposphere to the mesosphere/lower thermosphere (MLT). Journal of Geophysical Research: Atmospheres, 130, e2025JD043363. https://doi.org/10.1029/2025JD043363

—William Randel, Editor, JGR: Atmospheres

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Dust Is the Sky’s Ice Maker

Fri, 09/05/2025 - 13:10

Dust plays a major role in the formation of ice in the atmosphere. A new analysis of satellite data, published in Science, shows that dust can cause a cloud’s water droplets to freeze at warmer temperatures than they otherwise would. The finding brings what researchers had observed in the laboratory to the scale of the atmosphere and may help climate scientists better model future climate changes.

In 1804, French scientist Joseph Louis Gay-Lussac ascended to about 23,000 feet (7,000 meters) in a hydrogen balloon from Paris, without supplemental oxygen, to collect air samples. He noted that clouds with more dust particles tended to have more frozen droplets.

In the 20th century, scientists found that pure water can remain liquid even when cooled to −34.5°C. But once even tiny amounts of material, such as dust, are introduced, it freezes at much warmer temperatures.

“It’s like Schrödinger’s cat. Either there’s an ice crystal, or there’s a liquid droplet.”

In 2012, researchers in Germany were finally able to test this directly in a cloud chamber experiment. They re-created cloud conditions in the lab, introduced different types of desert dust, and gradually cooled the chamber to observe the temperatures at which droplets froze.

For Diego Villanueva, an atmospheric scientist at ETH Zürich in Switzerland and lead author of the new study, it was striking that scientists had uncovered these processes in the lab, yet no one had examined them in such detail in nature.

The challenges were obvious. To watch an ice crystal nucleate, researchers would need instruments on an aircraft or balloon to catch a micrometer-sized droplet in a cloud at just the right moment. “It’s like Schrödinger’s cat,” said Daniel Knopf, an atmospheric scientist at Stony Brook University who was not involved in the work. “Either there’s an ice crystal, or there’s a liquid droplet.”

In the new study, Villanueva and his colleagues analyzed 35 years of satellite data on cloud tops across the Northern Hemisphere’s extratropics—a region spanning the U.S. Midwest, southern Canada, western Europe, and northern Asia. The researchers wanted to see how dust influenced whether cloud tops were liquid or ice. They focused on cloud tops, rather than entire clouds, simply because the tops are visible in satellite imagery.

Desert Dust and Cold Clouds

Villanueva and his colleagues examined two satellite datasets covering 1982–2016, trying to infer microscopic details of cloud tops such as the number of ice crystals or droplet sizes. One dataset tracked whether cloud tops were liquid or ice, and the other measured how much dust was in the air at the same time. Although the team examined global patterns, they focused on the northern extratropical belt, where mixed-phase clouds are common and large amounts of dust from deserts like the Sahara and Gobi circulate.

But the “dataset quality was just so poor that everything that came out was basically just noise,” Villanueva added. In the end, the researchers focused on a simpler detail: the fraction of clouds with ice at their tops. “This took me nearly 3 years,” Villanueva said.

The analysis revealed that regions with more dust had more ice-topped clouds. The effect was strongest in summer, when desert winds lift the most dust.

A distinctive pattern emerged: A tenfold increase in dust roughly doubled the likelihood of cloud tops freezing. “You’d need 100 times more dust to see freezing become 4 times as frequent,” Villanueva explained.
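
Put another way, the reported relationship is roughly logarithmic, with each tenfold increase in dust doubling the odds of an ice-topped cloud. A minimal arithmetic sketch (illustrative only, not the study's statistical model) follows.

```python
import math

# If a 10x increase in dust doubles the chance of an ice-topped cloud, then the
# relative freezing frequency scales as (dust ratio) ** log10(2). Illustrative only.
exponent = math.log10(2)  # ~0.301

for dust_ratio in (10, 100, 1000):
    freq_ratio = dust_ratio ** exponent
    print(f"{dust_ratio:>5}x dust -> roughly {freq_ratio:.1f}x as many ice-topped clouds")

# Prints ~2.0x, ~4.0x, and ~8.0x, matching the "100 times more dust for
# 4 times the freezing" rule of thumb quoted above.
```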

“I think the study is quite elegant.”

The new work showed that the same processes researchers have observed at the microscale in laboratories occur at much larger scales in Earth’s atmosphere. Even after accounting for humidity and air movement, dust remained the key factor for ice nucleation in most instances, though there are exceptions. In some places, such as above the Sahara, few clouds form despite the presence of dust, perhaps, the authors suggest, because the movement of large swaths of hot air prevents freezing.

“I think the study is quite elegant,” Knopf said. He explained that taking 35 years of satellite data, finding a relationship between dust levels and frozen cloud top rates, and then showing that it lines up perfectly with lab experiments is basically “the nail in the coffin” for proving dust’s role in ice nucleation. Scientists now have robust satellite evidence of dust aerosols directly affecting cloud freezing, matching what laboratory experiments had predicted.

The finding has implications for climate modeling. To predict the effects of climate change more accurately, models must account for dust and the ways it affects cloud freezing and helps shape precipitation. Liquid-topped clouds reflect more sunlight and cool the planet, whereas ice-topped clouds let in more sunlight and trap heat.

However, Knopf noted that there is more work to be done to understand exactly what the new observations mean for scientists’ understanding of climate. “If you want to really know the precipitation or climate impacts [of dust], you really need to know the number of liquid droplets or the number of ice crystals,” he said.

Villanueva is motivated to keep looking at clouds and aerosols. In the next 10–20 years, the Earth may have drier surfaces because of climate change, which will likely produce more dust aerosols in the atmosphere. He added, “I want to know how clouds will respond in the scenario.”

—Saugat Bolakhe (@saugat_optimist), Science Writer

Citation: Bolakhe, S. (2025), Dust is the sky’s ice maker, Eos, 106, https://doi.org/10.1029/2025EO250328. Published on 5 September 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Cruise to Measure Gulf Dead Zone Faces Stormy Funding Future

Fri, 09/05/2025 - 13:05

This story was originally published in the Louisiana Illuminator.

Despite its being called a “cruise,” the people on board The Pelican described the experience on the hypoxia monitoring expedition as very different from the elaborate dinners of a towering vacation ship or a booze- and buffet-filled Caribbean itinerary.

Passengers described waves up to 5 feet high in the Gulf of Mexico swinging the 116-foot research vessel like a pendulum, plaguing anyone who didn’t have sturdy sea legs with bouts of seasickness. Daytime temperatures in late July soared ever higher as sweat dripped down the backs of hard-hat-covered heads.

The guests on board The Pelican weren’t seeking pleasure or status. They were unpaid students and researchers who say they weathered the conditions in the name of science itself.

“It’s not glamorous, but it is very important.”

“It’s not glamorous, but it is very important,” said Cassandra Glaspie, assistant professor at Louisiana State University and the chief scientist for the National Oceanic and Atmospheric Administration’s annual hypoxia cruise.

The 11-day voyage provides vital information on the sea life and environmental conditions within the seasonal low-oxygen zone that develops off the coast of Louisiana. The data the cruise collects informs state and federal efforts to reduce the size of the “dead zone” and sheds light on impacts to those who rely on the water for their livelihoods, like shrimpers and fishermen.

Now, after its 40th year and 38th hypoxia cruise, The Pelican’s annually planned journey faces challenges to stay afloat, potentially undermining decades of research and future plans to get the dead zone under control.

A Decades-Long Struggle

Biologists, undergraduate student researchers and crew alike celebrated the cruise’s 40th anniversary aboard The Pelican with a party that had an “old bird” theme, chosen to honor the boat, which has also been sailing for 40 years.

The 40th anniversary party for The Pelican and the hypoxia cruise, on the water. Credit: Yuanheng Xiong

More than just an excuse to eat cake (with rainbow sprinkles), the purpose of the cruise is to capture information snapshots of just how bad conditions get in the dead zone.

“We bring water up to the surface. We have a little chemistry lab…to figure out what the oxygen level is chemically, and then we can validate that against what our sensors are telling us,” Glaspie said.

The low-oxygen area appears annually as nutrients, primarily from agricultural fertilizers from the massive Mississippi River Basin, drain downriver and spur algae overgrowth.

Algae eat, defecate and die, using up the oxygen in the water when they decompose and sink to the bottom. Fish, shrimp and other marine life leave the low oxygen area when they can and suffocate when they can’t, putting pressure on the vital commercial Gulf fishery and the people who rely on it. Exposure to low-oxygen waters can also alter reproduction, growth rates and diet in fish species.

Glaspie took over the work of coastal scientist Nancy Rabalais, who launched the maiden cruise in 1985 and led it for decades after. Every summer begins with a forecast of the zone’s predicted size, estimated with various scientific models and with measurements of nitrogen and phosphorus taken across the river basin over the course of the year.

“A lot of times with pollution, you hear anecdotal evidence of how it might be increasing cancer rates or it might be causing fisheries to fail,” Glaspie said. “Here, we have an actual, measurable impact of nutrient pollution in the Mississippi River watershed.”

The Mississippi River/Gulf of America Hypoxia Task Force, an interagency federal, state and tribal effort to reduce the size of the dead zone, uses data from the cruise to determine whether it is meeting its goals.

In the past five years, the dead zone has been as large as 6,700 square miles, and even larger in previous years, reaching nearly the size of New Jersey.

While still more than twice the size the Task Force is aiming for, the Gulf dead zone was slightly smaller than forecasted this year: about the size of Connecticut, at around 4,400 square miles.

Federal and state officials lauded the limited success of the zone’s smaller size in a July 31 press conference held to discuss the results of the hypoxia cruise’s 2025 findings. They also called for continued work.

“It requires strong collaboration between states, tribes, federal partners and stakeholders,” said Brian Frazer, the EPA’s Office of Wetlands, Oceans and Watersheds director.

Mike Naig, Iowa’s agriculture secretary, said states should be “scaling up” initiatives to reduce nutrient pollution.

Whether or not this will actually happen is uncertain.

Funding Cuts

Since the Trump administration took office, funding for nutrient reduction efforts upriver, as well as money to operate the cruise itself, has been scaled back or cut entirely.

The Environmental Protection Agency’s 319 and 106 funding programs under the Clean Water Act are the main funding mechanisms for states to reduce nutrient pollution throughout the Basin. Those grants aren’t funded in President Trump’s proposed FY 2026 budget, said Frazer.

The 106 programs have historically doled out $18.5 million annually, according to the EPA, with additional money sometimes allocated from Congress. The 319 program provided $174.3 million in FY 2025.

The cuts to these programs are not yet final. Congress can decide to add in additional funding, and has in past years.

States rely on both funds to reduce and monitor nutrient runoff in their waters, said Matt Rota, senior policy director for Healthy Gulf, a nonprofit research group. Rota has monitored policy changes surrounding the Gulf dead zone for more than 20 years, and he questions whether current reduction strategies can be maintained, let alone efforts redoubled.

“It’s always good to see a dead zone that’s smaller than what was predicted,” Rota said. “I am not confident that this trend will continue.”

“It’s a relatively inexpensive program. … This is baseline stuff that our government should be doing.”

Aside from cuts to reduction efforts, money for The Pelican’s annual cruise is also slipping away. Glaspie said that, ideally, the cruise has 11 days of funding. It costs about $13,000 a day to operate the vessel, she said.

“It’s a relatively inexpensive program” with big payoffs for seafood industry workers who rely on the water for their livelihoods, Rota said. “This is baseline stuff that our government should be doing.”

Funding for the hypoxia cruise has been part of the National Oceanic and Atmospheric Administration’s annual operational budget, making it a more reliable source than grant funding. But with the Trump administration taking a hatchet to government-backed research, there is increasing uncertainty over whether The Pelican and its crew will embark upon future missions.

This year, Glaspie said, NOAA defunded a day of the cruise. The Gulf of America Alliance, a partnership group that supports the Gulf’s economic and environmental health among the five bordering states, stepped in to make up the difference. Glaspie said having that additional day was a saving grace for the research.

“This is a fine-tuned machine, and the consequences for cutting it short are really predictable and well-known,” she said. “If I’m asked to create an estimate of the surface area of hypoxia, and we’re not able to cap off the end in Texas waters, I’m not really going to be able to give a reliable estimate.”

Even without additional cuts, Glaspie said she already conducts the hypoxia cruise “on a shoestring budget.” Researchers on board don’t get paid, and every person who supports its mission—besides the crew that runs the boat—is a volunteer.

“It’s tough for me to not pay people. I mean, they’re working solid 12-hour shifts. It is not easy, and they are seasick for a lot of this, and they can’t call home,” Glaspie said. “It doesn’t sit well with me to not pay people for all this work, but this is what we’ve had to do because we don’t have the money to pay them.”

Students Jorddy Gonzalez and Lily Tubbs retrieve the CTD sensor package after measuring dissolved oxygen at a regular stop on the annual hypoxia cruise while students watch. Credit: Cassandra Glaspie

A Rapidly Changing Gulf

Defunding research as climate change intensifies—creating extreme heat in the Gulf—could further undermine hypoxia containment efforts and the consistency of decades’ worth of data collection.

“I think the rising temperatures is a big question,” Rota said.

“We have 40 years of data, which is almost a gold standard,” Glaspie said. “We’ve just reached that threshold where we can really start to ask some more detailed questions about the impacts of hypoxia, and maybe the future of hypoxia.”

Despite this year’s smaller zone surface area, low oxygen levels went deeper into the water than Glaspie had ever seen before.

“The temperature drops [as the water gets deeper], the salinity increases, and the oxygen just goes basically to zero,” she said.

In some areas, Glaspie’s measurements showed negative oxygen levels.

“Oxygen doesn’t go in the negative. It was just so low that the sensor was having trouble with it,” she said. “It’s the first time I’ve seen it like this.”

She also noticed unusually large amounts of algae on the surface of the water “like ectoplasm in Ghostbusters.”

The smaller-than-forecasted size of the dead zone surprised researchers on The Pelican who saw just how deep the low oxygen levels went.

“None of us really thought until the estimate came out that it was below average size because we’re able to see the three-dimensionality of it. That’s not really incorporated into that estimate,” Glaspie said.

She also noticed unusually large amounts of algae on the surface of the water “like ectoplasm in Ghostbusters.” Toxic algae blooms can kill fish and other sea life as well as poison humans.

“If I had to say what would be important for us to monitor in the future, it would be these algal blooms, and making sure that we’ve got a good handle on which ones have harmful species,” she said.

This is why Glaspie, clad in her sun-protective clothes and work boots, braves the waves, the heat and the journey across the Gulf every year.

“This is our finger on the pulse of our nutrient pollution problem that Louisiana is inheriting from the entire country,” Glaspie said. “We cannot take our finger off that pulse. It is unfair to Louisiana. We have this pollution problem. We need to understand it.”

—Elise Plunk (@elise_plunk), Louisiana Illuminator

This story is a product of the Mississippi River Basin Ag & Water Desk, an independent reporting network based at the University of Missouri in partnership with Report for America, with major funding from the Walton Family Foundation.

Radar Surveys Reveal Permafrost Recovery After Wildfires

Thu, 09/04/2025 - 14:31
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Permafrost is considered a critical global component of the cryosphere given its climate-sensitive nature and its key geomorphological and ecosystem roles. Permafrost is also affected by wildfires, which may push cryospheric systems past a tipping point: Fires can reduce vegetation, destroy organic layers, and modify surface albedo, leading to active layer thickening and ground subsidence. Permafrost also undergoes long-term deformation after wildfires, but this deformation is currently poorly understood.

Cao and Furuya [2025] use remote sensing to explore ground surface deformation signals across multiple fire scars from the past five decades in North Yukon. The authors find that post-fire permafrost evolution follows three distinct phases, characterized by land subsidence soon after the fire and eventual recovery of the permafrost over a roughly 50-year timescale, which implies soil uplift. Such an uplift phase is rarely reported and is related to vegetation regeneration and soil greening after the fire, which provide thermal protection, suggesting a critical mechanism of permafrost recovery. These findings highlight the resilience of boreal permafrost systems against wildfires, but continued monitoring is needed as wildfires and climate change intensify.

Citation: Cao, Z., & Furuya, M. (2025). Decades-long evolution of post-fire permafrost deformation detected by InSAR: Insights from chronosequence in North Yukon. AGU Advances, 6, e2025AV001849. https://doi.org/10.1029/2025AV001849

—Alberto Montanari, Editor-in-Chief, AGU Advances

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

An Accessible Alternative for Undergraduate Research Experiences

Thu, 09/04/2025 - 13:33

Undergraduate research experiences (UREs) in science, technology, engineering, and mathematics (STEM) offer students hands-on research experience and essential professional skills and connections to prepare them to succeed in the workforce. They also cultivate students’ sense of belonging, confidence, and identity—and promote retention—in STEM fields [National Academies of Sciences, Engineering, and Medicine, 2017; Rodenbusch et al., 2016].

To be effective, UREs should be thoughtfully designed to meet the needs of students who may otherwise miss out on career opportunities tied to networking and community-building through such programs. Existing URE programs have followed a range of approaches, but traditionally, many have been centered around short-duration, time-intensive, individual, mentor-directed experiences, such as full-time summer internships in field or laboratory settings. However, these traits can inadvertently exclude some student populations, a concern that is leading many programs to modify their structure and design to engage broader groups.

To lower barriers to participation in UREs, we developed the Authentic Research through Collaborative Learning (ARC-Learn) program at Oregon State University (OSU). ARC-Learn, which ran from 2021 to 2024 and comprised two overlapping student cohorts, offered a long-term, low-intensity program focused on Arctic science and inclusive mentorship. It was designed to help students engage in a science community, foster identities as STEM professionals, and develop critical scientific and data literacy skills and 21st century competencies such as teamwork and communication.

Table 1. Design Features of ARC-Learn

Duration: 18 months (2 academic years)
Intensity: 2–4 hours per week
Location: On campus or remote
Mentorship: Multiple mentors working in teams with multiple students
Topic selection: Student driven
Student support: Mentors, peers, program administrators, academic advisors
Mentorship development: Inclusive mentorship training, facilitated peer learning community
Research tasks: Develop research question, find and analyze data, draw conclusions, and present findings
Student development: Discover own strengths as researchers, work with a team, supplemental training in missing skills

ARC-Learn incorporated alternative design features (Table 1) to meet the needs of students who do not typically have access to time-intensive field or lab-based UREs, such as transfer students, remote students, and those with dependent care, military service, and other work commitments [Scott et al., 2019] or who have nontraditional enrollment patterns (e.g., dual enrollment in both university and community college, varying enrollment from term to term).

The program was framed in the context of Arctic science because of the region’s outsize effects on climate, ecosystems, and communities globally and to engage students with long-term research investments in polar regions [Marrongelle and Easterling, 2019]. The Arctic also offers a dynamic and interdisciplinary context for a URE program, enabling students to follow their interests in investigating complex science questions. In addition, numerous long-term Arctic monitoring programs offer rich datasets useful in all kinds of STEM careers.

Despite encountering challenges, the ARC-Learn model proved successful at engaging and motivating students and also adaptive as program organizers made adjustments from one cohort to the next in response to participant feedback.

The ARC-Learn Model

With support from mentors and peers, students experienced the whole research arc and gradually took ownership of their work.

Each ARC-Learn cohort lasted 2 academic years and included a dozen or more students. Participants received a stipend to offset costs associated with participation, such as childcare and missed work time, and had the option of obtaining a course credit each term to meet experiential learning requirements. With support from mentors and peers, they experienced the whole research arc and gradually took ownership of their work through three key phases of the program.

Early year 1: Build research teams. Some URE mentorship models involve a mentor primarily driving selection of a research topic and the student completing the work. In ARC-Learn, students learned from multiple mentors and peers, while mentors supported each other and received feedback from students (Figure 1). The students self-selected into research teams focused on a broad topic (e.g., marine heat waves or primary productivity), then developed individual research questions based on their strengths and interests.

Fig. 1. Some models of undergraduate research experiences have involved a mostly one-way transfer of knowledge from a single mentor to a single student, with the mentor deciding the research topic and the student completing the work. In ARC-Learn, students learned from multiple mentors and peers as part of small-group research teams, while mentors supported each other and received feedback from students.

Mentor-student teams met every other week—and students met one-on-one with mentors as needed—to support individual projects. The entire cohort also met twice a month to discuss topics including the fundamentals of Arctic science and the scientific process and to report out on progress toward milestones.

Late year 1 to middle of year 2: Develop research questions and find and analyze data. With no field or lab component to the program, ARC-Learn students worked exclusively with existing data. These data came from NASA and NOAA satellite-based sources such as the Moderate Resolution Imaging Spectroradiometer (MODIS), Advanced Very High Resolution Radiometer, and Soil Moisture Active Passive (SMAP) instruments; shipboard sources such as NOAA’s Distributed Biological Observatory, the Alaska Ocean Observing System, and the University-National Oceanographic Laboratory System’s Rolling Deck to Repository; and the National Science Foundation’s (NSF) Arctic Data Center and NOAA’s National Centers for Environmental Information.

Students often revised their research questions or the datasets they used multiple times to produce meaningful findings (Figure 2). Notably, access to these datasets proved critical to the educational experience of ARC-Learn students, highlighting the importance of maintaining them in public archives for future URE activities.

Fig. 2. ARC-Learn students developed their own research questions and worked exclusively with existing data to answer them. Students often revised their research questions or datasets multiple times to produce meaningful findings.

Many students struggled with finding, cleaning, analyzing, and interpreting data, often because of limited experience with tools such as geographic information system software and programming languages such as Python and R. At times, the required expertise was beyond even their mentors’ knowledge. Hands-on skill development workshops during cohort meetings connected students with additional mentors proficient in specific platforms and tools to help fill knowledge gaps and help students overcome obstacles.
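
For a sense of the routine data wrangling these workshops covered, the short Python snippet below cleans and aggregates an invented shipboard temperature record. It is a hypothetical example: the file name, column names, and flag values are assumptions, not materials from the program.

```python
import pandas as pd

# Hypothetical example of basic data cleaning; the file, columns, and sentinel
# value are invented for illustration.
df = pd.read_csv("arctic_sst_timeseries.csv", parse_dates=["date"])

# Drop sentinel values and physically implausible readings before analysis.
df = df.replace(-9999.0, pd.NA).dropna(subset=["sst_c"])
df = df[df["sst_c"].between(-2.0, 15.0)]

# Aggregate to monthly means to smooth over gaps in shipboard sampling.
monthly = df.set_index("date")["sst_c"].resample("MS").mean()
print(monthly.describe())
```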

Although the students encountered occasional setbacks, they reported that achievements such as settling on a final research question and creating rich data visualizations proved deeply rewarding and motivated further progress.

Late year 2: Share the results. Over several months, students created research posters with feedback and support along the way from their teammates, mentors, and the entire cohort. The program concluded with a grand finale, featuring on-campus gatherings for remote and in-person students, a dress rehearsal poster session, a celebratory dinner, and final presentations at a university-wide undergraduate research symposium.

Zoe’s Story

After a successful 7-year military career, Zoe enrolled at OSU to study the Arctic through her participation in ARC-Learn. As a student in cohort 2, she experienced several challenges along the research arc before finding success, and her experience helps illustrate the program’s model.

Zoe joined fellow students and mentors in the Marine Heatwaves research team and then narrowed her focus by exploring scientific literature and talking with her primary mentor to understand physical and chemical factors associated with marine heat waves as well as their effects on ocean ecosystems. She developed several research questions focused on how factors such as atmospheric pressure and temperature have affected the development and extent of marine heat waves off Alaska since 2010.

As Zoe and her mentor considered available datasets and relevant literature further, they realized that her questions were still too broad given the number of variables affecting ocean-atmosphere interactions. At one of the full-cohort meetings, she shared her difficulties and frustrations, prompting another mentor to offer their help. This mentor worked with Zoe to understand a key meteorological feature—the Aleutian Low—in the area she was studying, as well as relevant data available through the European Union’s Copernicus Climate Change Service [Hersbach et al., 2023] and the appropriate analysis platform.

“We jumped in and learned it together. She helped me find the right data, which in turn, allowed me to finalize my research question,” Zoe said.

Nuanced and iterative feedback from mentors and peers guided ARC-Learn participants, including Zoe, to design posters that balanced visual presentations of data alongside descriptive text to explain research findings. Credit: Ryan Brown

From that point, Zoe quickly landed on a focused question that she could address: Does a disruption in the Aleutian Low lead to marine heat waves over the North Pacific region? The final step was to develop a visually striking poster to invite attention, questions, and ideas during the research symposium.

“Seeing other people interested in my research…was validating of me as a scientist.”

Zoe’s experience at the poster session captured what we heard from many other students in the program. Even after her 2 years of being immersed in her project and working with mentors and peers, she said she felt imposter syndrome as a student trying to become a scientist and thought no one would care about her research.

“But people were really interested,” she said. “Seeing other people interested in my research, able to read and understand it on a poster, [and] ask me questions and suggest ideas was validating of me as a scientist.”

A Responsive Approach to URE Design

Through ARC-Learn, program leads sought to expand knowledge about the benefits and challenges of a long-duration, lower-intensity, team-based URE model. Because it was a design-based research program, mentor, student, and coordinator feedback was collected and continually used to make program adjustments [Preston et al., 2022, 2024].

Feedback was collected through pre-, mid-, and end-of-program surveys, as well as pre- and end-of-program interviews, and analyzed by a research and evaluation team. Findings were reported to the program leads, who also met regularly with external expert advisers to get additional recommendations for adjustments. By running two overlapping cohorts (the second started when the first was halfway completed), organizers could address issues that arose for the first cohort to improve the experience of the second one.

Lessons from ARC-Learn are documented in a practitioner guidebook, which discusses practical considerations for others interested in implementing alternative URE models [Brown et al., 2024]. In the guidebook, we examine each design component of ARC-Learn and offer recommendations for designing UREs that meet enrolled students’ specific learning needs and develop their science skills to meet relevant workforce demands.

Novel elements of the Authentic Research through Collaborative Learning (ARC-Learn) program were important in influencing participants’ persistence and success.

A few valuable lessons learned include the following.

Attrition. Expect high attrition rates in UREs designed for nontraditional students, and do not react by making drastic program changes that risk sacrificing otherwise successful program elements. We observed a 45% attrition rate in each cohort, which is indeed high but perhaps not surprising considering the population involved in the program—largely transfer students and those with caregiving or work responsibilities.

Most participants who left did so because of life crises or obligations that paused their research and educational goals. This observation embodies the complexity of students’ lives and reinforces the need for continually creative, flexible, inclusive program structures. For those who completed ARC-Learn, novel elements of the program (e.g., working in teams) were important in influencing their persistence and success.

Remote research applications. The first cohort started in 2021 entirely via remote instruction during the COVID-19 pandemic, before eventually transitioning to a hybrid approach as in-person instruction resumed. All ARC-Learn students in cohort 1 returned to campus, except one Ecampus student, who remained online. The program team and mentors struggled to balance the needs of the remote student, who eventually became somewhat detached from their research team.

As teamwork, camaraderie, and inclusivity are important qualities of the program, we decided for cohort 2 to recruit enough Ecampus students (plus two dedicated mentors) to form a research team of their own. The remote team was engaged and productive—meeting deadlines and producing high-quality work—highlighting the potential of all-remote URE models for students who might otherwise lack access to meaningful research opportunities.

Student-driven research. ARC-Learn empowered students to pursue their own research questions, fostering their autonomy and ownership of their work. However, the open-endedness of selecting their own research paths and the lack of guardrails proved challenging for participants.

We thus hired a program coordinator to provide one-on-one logistical support; establish clear expectations, timelines, and scaffolded assignments; and arrange workshops to teach programming and data analysis skills. This approach, as reported by students who worked with the coordinator, helped many program participants stay on track and ultimately complete their research project.

Mentor coordination. Enabling student success also meant supporting mentors. Organizers provided inclusive mentorship trainings and facilitated a peer learning community. They also made programmatic adjustments in response to experiences in the first cohort.

The student-driven nature of the research sometimes resulted in mismatches between student interests and mentor expertise in cohort 1. So in cohort 2, we engaged mentors earlier in the planning process to define thematic areas for the research teams, creating topics broad enough for students to find an area of interest but narrow enough for mentors to provide guidance. In addition, many mentors had field schedules typical of polar scientists, often resulting in weeks to months at sea. We purposefully paired mentors and asked about planned absences so we could fill any gaps with additional support.

Overall, students in cohort 2 reported feeling highly supported and valued by their mentors and said that mentors created welcoming environments in which to ask questions and solve problems together.

A Foundation to Build On

Participants gained a deep understanding of the complexities and challenges of modern science as well as knowledge and skills needed in scientific education and careers.

From students’ feedback—and the research they did—it’s clear that participants who completed the ARC-Learn program gained a deep understanding of the complexities and challenges of modern science as well as knowledge and skills needed in scientific education and careers. The program thus highlights paths and lessons for others looking to develop successful alternatives to traditional UREs.

Many former ARC-Learn students are continuing to develop research skills, particularly in polar science, through internships and employment in field and lab research efforts. Zoe is working toward a bachelor’s degree in environmental sciences and exploring interests in environmental hazards, conservation, and restoration. For her, the program served as a foundation from which she is building a career and establishing confidence in herself as a scientist.

“I thought I’d have to play catch-up the whole time as an older, nontraditional student,” she said. But through the experience, “I realized I could start anywhere.”

Acknowledgments

ARC-Learn was a collaboration between OSU’s College of Earth, Ocean and Atmospheric Sciences and STEM Research Center. This work is supported by the U.S. NSF (award 2110854). Opinions, findings, conclusions, and recommendations in these materials are those of the authors and do not necessarily reflect the views of NSF.

References

Brown, R., et al. (2024), ARC-Learn Practitioner Guidebook: Practical considerations for implementing an alternative model of undergraduate research experience, Ore. State Univ., Corvallis, https://doi.org/10.5399/osu/1177.

Hersbach, H., et al. (2023), ERA5 monthly averaged data on single levels from 1940 to present, Copernicus Clim. Change Serv. Clim. Data Store, https://doi.org/10.24381/cds.f17050d7.

Marrongelle, K., and W. E. Easterling (2019), Support for engaging students and the public in polar research, Dear Colleague Letter prepared for the U.S. National Science Foundation, Alexandria, Va., www.nsf.gov/funding/opportunities/dcl-support-engaging-students-public-polar-research/nsf19-086.

National Academies of Sciences, Engineering, and Medicine (2017), Undergraduate Research Experiences for STEM Students: Successes, Challenges, and Opportunities, 278 pp., Natl. Acad. Press, Washington, D.C., https://doi.org/10.17226/24622.

Preston, K., J. Risien, and K. B. O’Connell (2022), Authentic Research through Collaborative Learning (ARC-Learn): Undergraduate research experiences in data rich Arctic science formative evaluation report, STEM Res. Cent., Ore. State Univ., Corvallis, https://doi.org/10.5399/osu/1156.

Preston, K., J. Risien, and N. Staus (2024), Authentic Research through Collaborative Learning (ARC-Learn): Undergraduate research experiences in data rich science summative evaluation report, STEM Res. Cent., Ore. State Univ., Corvallis, https://doi.org/10.5399/osu/1178.

Rodenbusch, S. E., et al. (2016), Early engagement in course-based research increases graduation rates and completion of science, engineering, and mathematics degrees, CBE Life Sci. Educ., 15(2), ar20, https://doi.org/10.1187/cbe.16-03-0117.

Scott, G. W., S. Humphries, and D. C. Henri (2019), Expectation, motivation, engagement and ownership: Using student reflections in the conative and affective domains to enhance residential field courses, J. Geogr. Higher Educ., 43(3), 280–298, https://doi.org/10.1080/03098265.2019.1608516.

Author Information

Ryan Brown (ryan.brown@oregonstate.edu), Laurie Juranek, and Miguel Goñi, College of Earth, Ocean and Atmospheric Sciences, Oregon State University, Corvallis; and Julie Risien and Kimberley Preston, STEM Research Center, Oregon State University, Corvallis

Citation: Brown, R., L. Juranek, M. Goñi, J. Risien, and K. Preston (2025), An accessible alternative for undergraduate research experiences, Eos, 106, https://doi.org/10.1029/2025EO250326. Published on 4 September 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Spacecraft Surveys Shed New Light on Auroral Kilometric Radiation

Wed, 09/03/2025 - 18:53
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Auroral Kilometric Radiation (AKR) is a type of radio wave emitted from Earth’s auroral regions. It is the dominant radio emission from Earth and has been extensively studied, though previous analyses were constrained by limited spacecraft coverage.

Today, with the availability of more spacecraft observations, it is possible to improve our understanding of Earth’s most intense natural radio emission. Thanks to these data, Wu et al. [2025] find that AKR preferentially occurs at high latitudes and on Earth’s nightside. They also find that the dense plasmasphere, a region of high-density plasma around Earth, blocks AKR from traveling through it, forming an equatorial shadow zone around the plasmasphere. Furthermore, the authors show that low-density ducts within the plasmasphere act as waveguides, enabling AKR to penetrate the dense plasmasphere and propagate along these channels.

The findings provide valuable insights into Earth’s electromagnetic environments, space weather events and geomagnetic storms that may adversely affect satellites, communication systems, GPS, and power grids on Earth.  

Citation: Wu, S., Whiter, D. K., Zhang, S., Taubenschuss, U., Zarka, P., Fischer, G., et al. (2025). Spatial distribution and plasmaspheric ducting of auroral kilometric radiation revealed by Wind, Polar, and Arase. AGU Advances, 6, e2025AV001743. https://doi.org/10.1029/2025AV001743

—Alberto Montanari, Editor-in-Chief, AGU Advances

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Bridging Old and New Gravity Data Adds 10 Years to Sea Level Record

Wed, 09/03/2025 - 13:38

As climate change accelerates, it’s more important than ever to understand the individual drivers of sea level rise, from land subsidence and coastal erosion to changes in ocean volume. For the past 20 years, scientists have had access to high-resolution, satellite-derived maps of Earth’s gravity field, which allows them to calculate fluctuations in global ocean mass.

Recently, geodesists have found a way to extend that record back 10 more years, significantly extending the time frame by which they can consistently measure global ocean mass change.

“This is the first observation-based global ocean mass time series” from 1993 to the present, said Jianli Chen, a geodesy researcher at Hong Kong Polytechnic University in China and a coauthor on the research.

By reconciling older and newer techniques for measuring ocean mass change, the team’s work improves calculations of long-term trends and provides a potential stopgap should satellite data no longer be available.

Shooting Lasers into Space

When scientists measure sea level rise, they consider two main components: how much the ocean’s volume has grown because of changes in water density—the steric component—and how much it has grown because it has gained mass from melted ice—the barystatic component.

Past estimates of total ocean mass change have relied on indirect methods like adding up mass loss from ice sheets, glaciers, and land water storage, explained Yufeng Nie, a geodesy researcher also at Hong Kong Polytechnic University and lead researcher on the new study. Mass lost from these areas is assumed to translate to an increase in ocean mass.

“But these individual estimates are not necessarily consistent, because they are developed by different groups” with different methodologies, Nie said.

In light of this, some researchers adapted satellite laser ranging (SLR), a technique in which scientists bounce ground-based laser pulses off orbiting satellites, to track changes in ocean mass. SLR has been used for decades to measure Earth’s nonuniform gravity field by observing shifts in satellite orbits. A satellite’s altitude depends on Earth’s gravity at any given point, and gravity in turn depends on the distribution of mass beneath that point. Measuring satellite altitudes thus provides a window into measuring ocean mass changes.

“How can you observe, for example, ocean mass change from Antarctic melting using a technique with 4,000-kilometer spatial resolution?”

However, one key drawback to using SLR to measure barystatic sea level (BSL) change is that it can measure changes only on very large spatial scales, which limits its application in climate research, Chen said.

“How can you observe, for example, ocean mass change from Antarctic melting using a technique with 4,000-kilometer spatial resolution?” asked Chen.

Enter NASA’s Gravity Recovery and Climate Experiment (GRACE) missions. GRACE and its successor, GRACE Follow-On (GRACE-FO), each consisted of two satellites chasing each other along the same orbit, continuously sending laser beams back and forth. Like SLR, this process allowed the GRACE missions to provide maps of Earth’s surface mass, but at 10 times the resolution of SLR. And like with SLR, scientists have used GRACE gravity maps to track global ocean mass change.

But GRACE data, too, have their caveats. The first GRACE mission spanned 2002–2017, and GRACE-FO has spanned from 2018 to the present, a short time for understanding long-term trends. What’s more, the 11-month gap between GRACE and its successor meant that scientists were not able to calibrate the two satellites with each other, leaving some uncertainty about systematic differences between the missions.

A Near-Perfect Match

Nie, Chen, and their team were able to address both of these caveats by comparing SLR-based measurements of global ocean mass change with those from GRACE/-FO for the same time period, 2003–2022.

According to gravity maps provided by SLR, barystatic sea level change was 2.16 millimeters per year from 2003 to 2022, while GRACE/-FO measured 2.13 millimeters per year.

The new analysis shows that SLR and GRACE/-FO “agree quite well for the long-term trends,” Nie said. What’s more, researchers found no significant change in the calculation when the data transitioned from GRACE to GRACE-FO. “This gives us confidence that the SLR data, although it is of very low spatial resolution, can be used to tell us the ocean mass variations before 2002,” he added.

“Our SLR measurements…can provide a global constraint of the mass changes for the pre-GRACE era.”

The researchers were able to extend the time frame of their analysis back to 1993 by using SLR data, and they calculated a barystatic sea level change of 1.75 millimeters per year for 1993–2022. They attribute the lower rate of sea level rise in the past to recent acceleration of ice loss in Greenland.

“Our SLR measurements…can provide a global constraint of the mass changes for the pre-GRACE era,” Nie said.

This study was published in Proceedings of the National Academy of Sciences of the United States of America in June.

“Extending the record of measured BSL using satellite laser ranging back to 1993 is an important achievement,” said Bryant Loomis, chief of the Geodesy and Geophysics Laboratory at NASA’s Goddard Space Flight Center in Greenbelt, Md. “It allows the disaggregation of total sea level change, which is measured by altimetry, into its barystatic and steric components.”

“The long-term BSL estimate is also useful for assessing the accuracy of previous efforts to quantify the major land ice contributions to BSL prior to the launch of GRACE,” he added, referring to the method of adding together mass changes from glaciers, ice sheets, and land water storage. Loomis was not involved in the new research.
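
The disaggregation Loomis describes is, at its core, a subtraction once an independent ocean-mass estimate exists. A minimal sketch follows, using the study's 1.75-millimeter-per-year barystatic trend for 1993–2022 together with an altimetry trend that is assumed here purely for illustration.

```python
# Total sea level change (from altimetry) = steric part + barystatic (mass) part,
# so an independent mass estimate lets you back out the steric contribution.
barystatic_mm_per_yr = 1.75  # SLR-based ocean-mass trend for 1993-2022, from the study
total_mm_per_yr = 3.0        # assumed altimetry-derived trend, illustration only

steric_mm_per_yr = total_mm_per_yr - barystatic_mm_per_yr
print(f"implied steric contribution: {steric_mm_per_yr:.2f} mm/yr")
```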

Nie, Chen, and their team are working to push the limits of SLR-derived barystatic sea level measurements to smaller spatial scales and lower uncertainties. They hope to demonstrate that SLR data can be used to measure mass change in Antarctica.

GRACE Continuity?

GRACE-FO launched in 2018 and is 7 years into its nominal 5-year mission. The satellites are in good health, and the nearly identical GRACE mission set a good precedent—it lived for more than 15 years. GRACE-FO might well overlap with its planned successor, GRACE-Continuity (GRACE-C), which is scheduled to launch in 2028.

The GRACE missions are designed to measure minute changes in Earth’s gravity at high spatial resolution. However, there was a coverage gap between the end of the GRACE mission and the start of GRACE-FO, and there may be a similar gap between GRACE-FO and GRACE-C. Credit: NASA/JPL-Caltech, Public Domain

However, recent woes for federally funded science in the United States have put GRACE-C’s future in doubt. Although NASA requested funding for GRACE-C for fiscal year 2026 through the mission’s launch, NASA’s acting administrator, Sean Duffy, recently stated his, and presumably President Donald Trump’s, desire to eliminate all Earth science at the agency (including healthy satellites). That cutback would likely nix GRACE-C.

In the near future, both Europe and China plan to launch satellite-to-satellite laser ranging missions that will provide GRACE-like measurements of Earth’s gravity, Chen said. However, the loss of GRACE-quality data would hamper climate scientists’ ability to accurately track drivers of sea level rise, he added. The SLR-derived measurements demonstrated in this recent research could help mitigate the loss, but only somewhat.

“There’s no way SLR can reach the same [resolution] as GRACE,” Chen said. “We can only use SLR to see the long-term, the largest scale, to fill the gap. But for many of GRACE’s applications—regional water storage or glacial mass change—no, there’s no way SLR can help.”

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

Citation: Cartier, K. M. S. (2025), Bridging old and new gravity data adds 10 years to sea level record, Eos, 106, https://doi.org/10.1029/2025EO250321. Published on 3 September 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

First Species-Level Assessment Reveals Extinction Risk in Mesoamerica

Wed, 09/03/2025 - 13:35

This is an authorized translation of an Eos article.

La reforestación es más compleja que simplemente plantar árboles. Esta incluye la evaluación de hábitats y ecosistemas, la identificación de la salud y la sostenibilidad de diferentes especies y el estudio de las estrategias para establecer nuevos asentamientos de árboles.

En regiones como Mesoamérica, donde los bosques están gravemente amenazados por las actividades humanas y el cambio climático, los conservacionistas interesados en la reforestación deben priorizar las especies cuyas poblaciones están disminuyendo. Para facilitar esta tarea, un grupo de investigadores evaluó el estado de conservación de las 4,046 especies de árboles endémicas de Mesoamérica, descritas en el proyecto Global Tree Assessment (Evaluación global de árboles). Es así como descubrieron que el 46% de estos árboles se encuentran en cierto riesgo de extinción.

Este estudio es el primero en evaluar el estado de todos los árboles endémicos en Mesoamérica.

El estudio, publicado en la revista Plants, People, Planet, es el primero en evaluar el estado de todos los árboles endémicos en Mesoamérica.

Emily Beech, autora principal del estudio y jefa de conservación en Botanic Gardens Conservation International (Conservación Internacional de Jardines Botánicos), enfatizó la importancia de enfocarse en esta región debido a sus altos niveles de biodiversidad, que con frecuencia están subrepresentados. Los países centroamericanos (Belice, Costa Rica, El Salvador, Guatemala, Honduras, Nicaragua y Panamá), dijo Beech, rara vez figuran entre los de mayor biodiversidad o como el hogar del mayor número de especies en peligro de extinción. Esta ausencia no se debe a una falta de biodiversidad, explicó, sino que es simplemente atribuible a su tamaño. El tamaño reducido de estos países hace que sean eclipsados por países grandes con bosques más extensos, como Brasil y la República Democrática del Congo. Pero, junto con México, Centroamérica alberga el 10% de la diversidad vegetal del mundo a pesar de representar menos del 1% de su superficie terrestre.

Para abordar esta brecha, los científicos primero identificaron árboles endémicos mesoamericanos a partir de evaluaciones presentadas en la Lista Roja de especies amenazadas de la Unión Internacional para la Conservación de la Naturaleza (IUCN, por sus siglas en inglés). Posteriormente, para evaluar el estado de conservación de los árboles, los investigadores superpusieron mapas de distribución de las especies arbóreas seleccionadas sobre mapas de la Base de Datos Mundial de Áreas Protegidas.

Of the 4,046 tree species analyzed, the researchers found that 1,867 are threatened with extinction. Mexico was the only country with tree species in the database listed as extinct or extinct in the wild. Among extant trees, Mexico and Costa Rica had the greatest numbers of threatened species, with 888 and 227, respectively. The most common threat overall was habitat loss driven by agricultural expansion.

Most species (3,349) had at least one data point within a protected area. However, 72% of the Mesoamerican species found in protected areas are threatened.

A Tailored Approach

Neptalí Ramírez Marcial was not involved in the new research, but as head of the restoration group at the South Border College in Mexico, he works with tree species that fall into different threat categories. The forests of Chiapas, where he and his colleagues are based, were once full of oaks, which supported high levels of biodiversity. Because of human influence, there are now more pines than oaks, and the climate is less favorable for sensitive species on the IUCN Red List.

Although Ramírez Marcial uses the Red List, he remains critical of the tool and of how it is applied in research. For example, he noted that the new assessment of Mesoamerican trees classifies Furcraea macdougallii (MacDougall's century plant) as extinct in Mexico. Ramírez Marcial believes this agave-like plant should not be considered a tree at all and therefore should not have been included in the study.

He also pointed out that the new study treats all of Mexico as part of Mesoamerica. Ecologically, he said, the Mesoamerican biogeographic region extends only through central Mexico and excludes the northern part of the country, which has distinct ecosystems not shared with Central America.

Ocotea monteverdensis “went from not even being included on the list to being in the most vulnerable conservation category.”

Ramírez Marcial agreed with the new study's conclusions but argued that restoration strategies must take into account the biodiversity of the areas being protected. For example, he noted that Mexican government programs prioritize distributing pines for reforestation across the entire country rather than designing strategies tailored to each region.

Daniela Quesada, a conservationist at the Monteverde Institute in Costa Rica, said the new study offers a more complete picture of the status of trees in Mesoamerica. Still, like Ramírez Marcial, she views IUCN Red List information as a starting point for research. The accuracy of the Red List, she explained, depends on how much information is submitted to it.

Quesada noted that the next step for tree conservation in Mesoamerica is for scientists to “look in more detail at each species that appeared” in the new study. A rigorous analysis of each species' presence and influence in each region could shape the development of targeted conservation projects.

As an example, she cited Ocotea monteverdensis, a tree that “went from not even being included on the list to being in the most vulnerable conservation category” (critically endangered) thanks to the work of ecologist John Devereux Joslin Jr. That recognition led to the development of a dedicated, ongoing community conservation program for the tree.

—Roberto González (@perrobertogg.bsky.social), Science Writer

This translation by Oriana Venturi Herrera (@OrianaVenturiH) was made possible by a partnership with Planeteando and GeoLatinas.

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Experienced Researcher Book Publishing: Sharing Deep Expertise

Wed, 09/03/2025 - 12:00
Editors’ Vox is a blog from AGU’s Publications Department.

Being an experienced researcher can come with a lot of heavy professional responsibilities, such as leading grant proposals, managing research teams or labs, supervising doctoral students and postdoctoral scientists, serving on committees, mentoring younger colleagues … the list goes on. This may also be a time filled with greater personal responsibilities beyond the job. Why add to the workload by taking on a book project? In the third installment of career-focused articles, three scientists who wrote or edited books as experienced researchers reflect on their motivations and how their networks paved the way for—and grew during—the publishing process.

Douglas Alsdorf co-edited Congo Basin Hydrology, Climate, and Biogeochemistry: A Foundation for the Future, which discusses new scientific discoveries in the Congo Basin and is published in both English and French. Nancy French co-edited Landscape Fire, Smoke, and Health: Linking Biomass Burning Emissions to Human Well-Being, which presents a foundational knowledge base for interdisciplinary teams to interact more effectively in addressing the impacts of air pollution. Michael Liemohn authored Data Analysis for the Geosciences: Essentials of Uncertainty, Comparison, and Visualization, a textbook on scientific data analysis and hypothesis testing in the Earth, ocean, atmospheric, space, and planetary sciences. We asked these scientists why they decided to write or edit a book, what impacts they saw as a result, and what advice they would impart to prospective authors and editors.

Why did you decide to write or edit a book? Why at that point in your career?

ML: I was assigned to develop a new undergraduate class on data-model comparison techniques. I realized that the best textbooks for it were either quite advanced or rather old. One book I love included the line, “if the student has access to a computer…” in one of the homework questions. I also was not finding a book with the content set that I wanted to cover in the class. So, I developed my own course content set and note pack, which provided the foundation for the chapters of the book.

DA: Our 2022 book was a result of a 2018 AGU Chapman Conference in Washington, DC, that I was involved in organizing. About 100 researchers, including 25 from sub-Saharan Africa, attended the conference, and together we decided that an edited book in the AGU Geophysical Monograph Series would become a launching point for the next decade of research in the Congo Basin.

The motivation for the book was not to advance my career, but because the topic was important to get out there.

NF: The motivation for the book was not to advance my career, but because the topic was important to get out there. The book looks at how science is trying to better inform how to manage smoke from wildland fires. The work was important because people in fire, smoke modeling, and health sciences do not work together often, and there were some real misconceptions about how others do the research and how detailed the topics can be.

What were some benefits of completing a book as an experienced researcher? 

NF: Once you have been working in a field for a while you want to see how your deep expertise can benefit more than just the community of researchers that you know or know of. Reaching into other disciplines allows you to understand how your work can have broader impact. And, you are ready to know more about other, adjacent topics, rather than a deeper view of what you know already. I think these feelings grow more true as you move to later stages of a career.

I think that I would have greatly struggled with this breadth of content if I had tried to write this particular book 10 years earlier.

ML: I was developing my data-model comparison techniques course and textbook for all students in my department, so I wanted to include examples across that diverse list of disciplines—Earth, atmosphere, space, and planetary sciences. Luckily, over the years I had taught a number of classes spanning these topics. Additionally, I had attended quite a few presentations across these fields, not only at seminars on campus but also at the annual AGU meeting. I felt comfortable including examples and assignments from all these topics. Also, I knew colleagues in these fields, and I called on them for advice when I got stuck. I think that I would have greatly struggled with this breadth of content if I had tried to write this particular book 10 years earlier.

What impact do you hope your book will have?

The next great discoveries will happen in the Congo Basin and our monograph motivates researchers toward those exciting opportunities. 

DA: There are ten times fewer peer-reviewed papers on the Congo Basin compared to the Amazon Basin. Our monograph changes that! We have brought new attention to the Congo Basin, demonstrating to the global community of Earth scientists that there is a large, vibrant group of researchers working daily in the Congo Basin. The next great discoveries will happen in the Congo Basin and our monograph motivates researchers toward those exciting opportunities. 

ML: I hope that the book has two major impacts. The first expected benefit is to the students that use it with a course on data-model comparison methods. I want it to be a useful resource regardless of their future career direction. The second impact I wish for is on Earth and space scientist researchers; I hope that our conversations about data-model comparisons are ratcheted up to a higher level, allowing us to more thoughtfully conduct such assessments and therefore maximize scientific progress.

What advice would you give to experienced researchers who are considering pursuing a book project?

NF: Here are a few thoughts: One: Choose co-authors, editors, and contributors that you can count on. Don’t try to “mend fences” with people you have not been able to connect with. That said, if you do admire a specific person or know their point of view is valuable, this is the time to overcome any barriers to your relationship. Two: Give people assignments, and they will better understand your point of view. Three: Listen to your book production people. They are all skilled professionals who know more about this than you do. They can be great allies in getting it done!

DA: Do it! Because we publish papers, our thinking tends to focus on the one topic of a particular paper. A book, however, broadens our thinking so that we more fully understand the larger field of work. Each part of that bigger space has important advances as well as unknowns that beg for answers. A book author who can see each one of these past solutions and future challenges becomes a community resource who provides insights and directions for new research. 

—Douglas Alsdorf (alsdorf.1@osu.edu, 0000-0001-7858-1448), The Ohio State University, USA; Nancy French (nhfrench@mtu.edu, 0000-0002-2389-3003), Michigan Tech Research Institute, USA; and Michael Liemohn (liemohn@umich.edu, 0000-0002-7039-2631), University of Michigan, USA

This post is the third in a set of three. Learn about leading a book project as an early-career or mid-career researcher.

Citation: Alsdorf, D., N. French, and M. Liemohn (2025), Experienced researcher book publishing: sharing deep expertise, Eos, 106, https://doi.org/10.1029/2025EO255028. Published on 3 September 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Inside a Georgia Beach’s High-Tech Fight Against Erosion

Tue, 09/02/2025 - 13:09

This story was originally published by Grist. Sign up for Grist’s weekly newsletter here. This coverage is made possible through a partnership between Grist and WABE, Atlanta’s NPR station.

At low tide on Tybee Island, Georgia, the beach stretches out as wide as it gets with the small waves breaking far away across the sand—you’ll have a long walk if you want to take a dip. But these conditions are perfect for a team of researchers from the University of Georgia’s Skidaway Institute of Oceanography.

Every three months, at low tide, they set out a miniature helipad near the foot of the dune and send up their drone equipped with lidar—technology that points a laser down at the sand and uses it to measure the elevation of the beach and dunes. The team flies it back and forth from the breakers to the far side of the dune and back until they have a complete, detailed map of the island’s 7-mile beach, about 400 acres.

“I see every flip-flop on the beach.”

“It’s high accuracy, it’s a high resolution,” explained research technician Claudia Venherm, who leads this project. “I see every flip-flop on the beach.”

That detailed information is crucial because Tybee is a barrier island, and rising seas are constantly eating away at the sandy beach and dunes that protect the island’s homes and businesses as well as a stretch of the Georgia mainland. Knowing exactly where the island is eroding and how the dunes are holding up to constant battering can help local leaders protect this piece of coastline.

“Tybee wants to retain its beach. It also wants to maintain, obviously, its dune. It’s a protection for them,” said Venherm. “We also give some of our data to the Corps of Engineers so they know what’s going on and when they have to renourish the beach.”

Since the 1970s, the Army Corps of Engineers has helped maintain Tybee Island’s beaches with regular renourishment: Every seven years or so, the Corps dredges up sand from the ocean floor and deposits it on the beach to replace sand that has washed away. The data from the Skidaway team can only help the Corps do this work more effectively. Lidar isn’t new, and neither is aerial coastal mapping. Several federal agencies monitor coastlines with lidar, but those surveys are typically several years apart for any one location, rather than a few months apart.

The last renourishment finished in January 2020, and Venherm and her team got to work a few months later. That means they have five years of high-resolution beach data, recorded every three months and after major storms like Hurricane Helene, creating a precise picture of how the beach is changing.

“I can compute what the elevation of the dune is, as well as how much volume has been lost or gained since a previous survey.”

“I can compute what the elevation of the dune is, as well as how much volume has been lost or gained since a previous survey,” said Venherm. “I can also compute how long it will take until the beach is completely gone, or how long will it take until water reaches the dune system.”
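Each survey yields a gridded elevation model of the beach, so comparing two flights amounts to subtracting one grid from the other and converting the per-cell differences into volumes. A minimal sketch of that arithmetic, assuming two co-registered elevation grids and a made-up cell size (this is not the Skidaway team's actual code), could look like:

    # Sketch: elevation and volume change between two co-registered lidar surveys.
    # Grid files and the 0.5-meter cell size are illustrative assumptions.
    import numpy as np

    cell_size = 0.5                              # grid spacing in meters
    dem_before = np.load("tybee_survey_q1.npy")  # elevations in meters, shape (ny, nx)
    dem_after = np.load("tybee_survey_q2.npy")

    diff = dem_after - dem_before                # elevation change per cell (m)
    cell_area = cell_size ** 2                   # square meters per cell

    accretion = np.nansum(np.where(diff > 0, diff, 0.0)) * cell_area   # cubic meters gained
    erosion = -np.nansum(np.where(diff < 0, diff, 0.0)) * cell_area    # cubic meters lost

    print(f"Sand gained: {accretion:,.0f} m^3; sand lost: {erosion:,.0f} m^3")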

The Corps conducts regular renourishment projects on beaches all along the East Coast, and uses a template to inform that work, said Alan Robertson, a consultant who leads the city of Tybee’s resilience planning. But he hopes that such granular evidence of specific changes over time can shift where exactly the sand gets placed within the bounds of that template. An area near the island’s north end, for instance, is a clear hot spot for erosion, so the city may push for concentrating sand there and north of that point, so that the sand can travel south and fill in the eroded area.

“We know exactly where the hotspots of erosion are. We know where there’s accretion,” he said, referring to areas where sand tends to build up. “[We] never had that before.”

The data can also inform the city’s own decision-making, because it provides a much clearer picture of what happens to the dunes and beach over time after the fresh sand is added. In the past, they’ve been able to see the most obvious erosion, but now they can compare how different methods of dune-building and even sources of sand hold up. The vegetation that’s critical to holding dunes together, for instance, takes root far better in sand dredged from the ocean compared to sand trucked in from the mainland, Robertson said.

“There’s an example of the research and the monitoring. I actually can make that statement,” he said. “I actually know where you should get your sand from if you can, and why. No one could have told you that eight years ago.”

That sort of proven information is key in resilience projects, which are often expensive and funded by grants from agencies that want confirmation their money is being spent well.

“Everything we do now on resiliency, measuring, and monitoring has become a priority,” said Robertson. “We’ve been able over these years through proof statements of ‘look at what this does for you’ to make it part of the project.”

—Emily Jones (@ejreports.bsky.social), Grist

This article originally appeared in Grist at https://grist.org/science/inside-a-georgia-beachs-high-tech-fight-against-erosion/.

Grist is a nonprofit, independent media organization dedicated to telling stories of climate solutions and a just future. Learn more at Grist.org

How Researchers Have Studied the Where, When, and Eye of Hurricanes Since Katrina

Fri, 08/29/2025 - 12:02

On 28 August 2005, New Orleans area residents received a bulletin from the National Weather Service (NWS) office in Slidell, La., warning them of “a most powerful hurricane with unprecedented strength.” One excerpt of the chilling announcement, issued via NOAA radio and the Federal Communications Commission’s Emergency Alert Service, read,

BLOWN DEBRIS WILL CREATE ADDITIONAL DESTRUCTION. PERSONS…PETS…AND LIVESTOCK EXPOSED TO THE WINDS WILL FACE CERTAIN DEATH IF STRUCK.

POWER OUTAGES WILL LAST FOR WEEKS…AS MOST POWER POLES WILL BE DOWN AND TRANSFORMERS DESTROYED. WATER SHORTAGES WILL MAKE HUMAN SUFFERING INCREDIBLE BY MODERN STANDARDS.

Hurricane Katrina, which caused 1,833 fatalities and about $108 billion in damage (more than $178 billion in 2025 dollars), remains the costliest hurricane on record to hit the United States and among the top five deadliest.

“If we were to have a Katrina today, that [forecast] cone would be half the size that it was in 2005.”

In the 20 years since the hurricane, meteorologists, modelers, computer scientists, and other experts have worked to improve the hurricane forecasting capabilities that inform bulletins like that one.

Consider the forecast cone, for instance. Also known as the cone of uncertainty, this visualization outlines the likely path of a hurricane with decreasing specificity into the future: The wider part of the cone might represent the forecasted path 36 hours in advance, and the narrower part might represent the forecasted path 12 hours in advance.

“If we were to have a Katrina today, that cone would be half the size that it was in 2005,” said Jason Beaman, meteorologist-in-charge at the National Weather Service Mobile/Pensacola office.

How to Make a Hurricane

The ingredients for a hurricane boil down to warm water and low pressure. When an atmospheric low-pressure area moves over warm ocean water, surface water evaporates, rises, then condenses into clouds. Earth’s rotation causes the mass of clouds to spin as the low pressure pulls air toward its center.

Storms born in the Gulf of Mexico or that traverse it, as Katrina did, benefit from the body’s sheltered, warm water, and the region’s shallow continental shelf makes storm surges particularly destructive for Gulf Coast communities.

Hurricanes gain strength as long as they remain over warm ocean waters. But countless factors contribute to how intense a storm becomes and what path it takes, from water temperature and wind speed to humidity and proximity to the equator.

Because predicting the behavior of hurricanes requires understanding how they work, data gathered by satellites, radar, and aircraft are crucial for researchers. Feeding these data into computer simulations helps researchers understand the mechanisms behind hurricanes and predict how future storms may behave.

“Since 2005, [there have been] monumental leaps in observation skill,” Beaman said.

Seeing a Storm More Clearly

Many observations of the weather conditions leading up to hurricanes come from satellites, which can offer a year-round bird’s-eye view of Earth.

NOAA operates a pair of geostationary satellites that collect imagery and monitor weather over the United States and most of the Atlantic and Pacific oceans. The mission, known as the Geostationary Operational Environmental Satellite (GOES) program, has been around since 1975; the current satellites are GOES-18 and GOES-19.

When Beaman started his career just a few years before Katrina hit, satellite imagery from GOES-8 to GOES-12 was typically beamed to Earth every 30–45 minutes—sometimes as often as every 15 minutes. Now it’s routine to receive images every 5 minutes or even as often as every 30 seconds. Having more frequent updates makes for much smoother animations of a hurricane’s track, meaning fewer gaps in the understanding of a storm’s path and intensification.

For Beaman, the launch of the GOES-16 satellite in 2016 marked a particularly important advance: In addition to beaming data to scientists more frequently, it scanned Earth with 4 times the resolution of the previous generation of satellites. It could even detect lightning flashes, which can sometimes affect the structure and intensity of a hurricane.

The transition to GOES-16 “was like going from black-and-white television to 4K television.”

The transition to GOES-16 “was like going from black-and-white television to 4K television,” Beaman said.

NOAA also has three polar-orbiting satellites, launched between 2011 and 2017, that orbit Earth from north to south 14 times a day. As part of the Joint Polar Satellite System (JPSS) program, the satellites’ instruments collect data such as temperature, moisture, rainfall rates, and wind for large swaths of the planet. They also provide microwave imagery using radiation emitted from water droplets and ice. NOAA’s earlier polar-orbiting satellites had lower resolution at the edges of scans, a more difficult time differentiating clouds from snow and fog, and less accurate measurements of sea surface temperature.

“With geostationary satellites, you’re really just looking at the cloud tops,” explained Daniel Brown, branch chief of the Hurricane Specialist Unit at NOAA’s National Hurricane Center in Miami. “With those microwave images, you can really kind of see into the storm, looking at structure, whether an eye has formed. It’s really helpful for seeing the signs of what could be rapid intensification.”

NOAA’s Geostationary Operational Environmental Satellites (GOES) monitor weather over the United States and most of the Atlantic and Pacific oceans. Credit: NOAA/Lockheed Martin, Public Domain

Rapid intensification is commonly defined as an increase in maximum sustained wind speed of 30 or more knots (about 55 kilometers per hour, or 35 miles per hour) within a 24-hour period. Katrina had two periods of rapid intensification, and they were one reason the storm was so deadly. In the second period, the storm strengthened from a low-end category 3 hurricane (in which winds blow between 178 and 208 kilometers per hour, or between 111 and 129 miles per hour) to a category 5 hurricane (in which winds blow faster than 252 kilometers per hour, or 157 miles per hour) in less than 12 hours.
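Checked against a time series of advisories, that definition is simple arithmetic: compare each maximum sustained wind value with the value 24 hours later and flag any jump of 30 knots or more. A toy check, using made-up wind values rather than Katrina's actual advisories:

    # Toy rapid-intensification check: a gain of 30+ knots in maximum sustained wind over 24 hours.
    # The wind values below are illustrative, not real advisory data.
    winds_kt = {0: 70, 6: 80, 12: 95, 18: 110, 24: 120, 30: 140, 36: 150}  # forecast hour -> knots

    RI_THRESHOLD_KT = 30
    for hour, wind in winds_kt.items():
        later = winds_kt.get(hour + 24)
        if later is not None and later - wind >= RI_THRESHOLD_KT:
            print(f"Rapid intensification: {wind} kt at hour {hour} -> {later} kt at hour {hour + 24}")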

New Angles

Radar technology has also made strides in the decades since Katrina. Hurricane-tracking radar works via a ground- or aircraft-based transmitter sending out a radio signal. When the signal encounters an obstacle in the atmosphere, such as a raindrop, it bounces back to a receiver. The amount of time it takes for the signal to return provides information about the location of the obstacle.
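The timing arithmetic is straightforward: the pulse travels at the speed of light, so the distance to the target is half the round-trip delay multiplied by that speed. A short worked example, using an illustrative delay:

    # Worked example: turning a radar echo delay into a target range.
    SPEED_OF_LIGHT = 299_792_458      # meters per second
    echo_delay = 400e-6               # illustrative round-trip delay of 400 microseconds

    target_range = SPEED_OF_LIGHT * echo_delay / 2
    print(f"Target range: {target_range / 1000:.1f} km")   # roughly 60 km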

Between 2011 and 2013, NWS upgraded its 150+ ground-based radars throughout the United States with dual-polarization technology—a change a 2013 NWS news release called “the most significant enhancement made to the nation’s radar network since Doppler radar was first installed in the early 1990s.”

So-called dual-pol technology sends both horizontal and vertical pulses through the atmosphere. With earlier technology, a radar signal might tell researchers only the location of precipitation. Dual-pol can offer information about how much precipitation is falling, the sizes of raindrops, and the type of precipitation or can even help researchers identify debris being transported in a storm.

Credit: NOAA

“That’s not something that we had back in Katrina’s time,” Beaman said. In 2005, forecasters used “much more crude ways of trying to calculate, from radar, how much rain may have fallen.”

Radar updates have become more frequent as well. Beaman said his office used to receive routine updates every 5 or 6 minutes. Now they receive updated radar imagery as often as every minute.

Hunting Hurricanes from the Skies

For a more close-up view of a hurricane, NOAA and the U.S. Air Force employ Hurricane Hunters—planes that fly directly through or around a storm to take measurements of pressure, humidity, temperature, and wind speed and direction. These aircraft also scan the storms with radar and release devices called dropwindsondes, which take similar measurements at various altitudes on their way down to the ocean.

NOAA’s P-3 Orion planes and the 53rd Weather Reconnaissance Squadron’s WC-130J planes fly through the eyes of storms. NOAA’s Gulfstream IV jet takes similar measurements from above hurricanes and thousands of square kilometers around them, also releasing dropwindsondes along the way. These planes gather information about the environment in which storms form. A 2025 study showed that hurricane forecasts that use data from the Gulfstream IV are 24% more accurate than forecasts based only on satellite imagery and ground observations.

The NOAA P-3 Hurricane Hunter aircraft captured this image from within the eye of Hurricane Katrina on 28 August 2005, 1 day before the storm made landfall. Credit: NOAA, Public Domain

Hurricane Hunters’ tactics have changed little since Katrina, but Brown said that in the past decade or so, more Hurricane Hunter data have been incorporated into models and have contributed to down-to-Earth forecasting.

Sundararaman “Gopal” Gopalakrishnan, senior meteorologist with NOAA’s Atlantic Oceanographic and Meteorological Laboratory’s (AOML) Hurricane Research Division, emphasized that Hurricane Hunter data have been “pivotal” for improving both the initial conditions of models and the forecasting of future storms.

With Hurricane Hunters, “you get direct, inner-core structure of the storm,” he said.

Hurricane Hunters are responsible for many of the improvements in hurricane intensity forecasting over the past 10–15 years, said Ryan Torn, an atmospheric and environmental scientist at the University at Albany and an author of the recent study about Gulfstream IVs. One part of this improvement, he explained, is that NOAA began flying Hurricane Hunters not just for the largest storms but for weaker and smaller ones as well, allowing scientists to compare what factors differentiate the different types.

“We now have a very comprehensive observation dataset that’s come from years of flying Hurricane Hunters into storms,” he said. These datasets, he added, make it possible to test how accurately a model is predicting wind, temperature, precipitation, and humidity.

In 2021, NOAA scientists also began deploying uncrewed saildrones in the Caribbean Sea and western Atlantic to measure changes in momentum at the sea surface. The drones are designed to fill observational gaps between floats and buoys on the sea surface and Hurricane Hunters above.

Modeling Track and Intensity

From the 1980s to the early 2000s, researchers were focused on improving their ability to forecast the path of a hurricane, not necessarily what that hurricane might look like when it made landfall, Gopalakrishnan explained.

Brown said a storm’s track is easier to forecast than its intensity because a hurricane generally moves “like a cork in the stream,” influenced by large-scale weather features like fronts, which are more straightforward to identify. Intensity forecasting, on the other hand, requires a more granular look at factors ranging from wind speed and air moisture to water temperature and wind shear.

Storms like 2005’s Katrina and Rita “showed the importance of [tracking a storm’s] intensity, especially rapid intensification.”

Gopalakrishnan said storms like 2005’s Katrina and Rita “showed the importance of [tracking a storm’s] intensity, especially rapid intensification.”

Without intensity forecasting, Gopalakrishnan said, some of the most destructive storms might appear “innocuous” not long before they wreak havoc on coastlines and lives. “Early in the evening, nobody knows about it,” he explained. “And then, early in the morning, you see a category 3 appear from nowhere.”

Gopalakrishnan came to AOML in 2007 to set up both the Hurricane Modeling Group and NOAA’s Hurricane Forecast Improvement Project. He had begun working on what is now known as the Hurricane Weather Research and Forecasting model (HWRF) in 2002 in his role at NOAA’s Environmental Modeling Center. With the formation of the Hurricane Modeling Group in 2007, scientists decided to focus on using HWRF to forecast intensity changes.

HWRF used a technique called moving nests to model the path of a storm in higher resolution than surrounding areas. Gopalakrishnan compared a nest to using a magnifying glass focused on the path of a storm. Though a model might simulate a large area to provide plenty of context for a storm’s environment, capturing most of an area in lower resolution and the storm path itself in higher resolution can save computing power.
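In code, the bookkeeping behind a moving nest is simple: at each step, re-center a small high-resolution window on the storm while keeping it inside the coarse parent grid. The sketch below is a conceptual illustration of that idea, not HWRF's actual implementation; the grid sizes and storm track are made up.

    # Conceptual sketch of a moving nest: a fine-resolution window that follows the storm
    # center inside a coarse parent grid. Grid sizes and the storm track are made up.
    parent_nx = parent_ny = 200        # coarse parent grid dimensions (e.g., 9-km cells)
    half_width = 20                    # nest spans 41 x 41 coarse cells around the storm

    def nest_bounds(storm_i, storm_j):
        """Center the nest on the storm, clipped so it stays inside the parent grid."""
        i0 = max(0, min(parent_nx - 2 * half_width - 1, storm_i - half_width))
        j0 = max(0, min(parent_ny - 2 * half_width - 1, storm_j - half_width))
        return i0, i0 + 2 * half_width, j0, j0 + 2 * half_width

    track = [(50, 40), (62, 55), (75, 72), (88, 90)]   # storm center indices per time step
    for step, (si, sj) in enumerate(track):
        i0, i1, j0, j1 = nest_bounds(si, sj)
        print(f"step {step}: nest covers parent cells [{i0}:{i1}, {j0}:{j1}]")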

By 2014, Gopalakrishnan said, the model’s track and intensity forecasting capabilities had improved 25% since 2007. The model’s resolution also sharpened, from a 9-kilometer grid spacing in 2007 to 1.5 kilometers by the time it was retired in 2023.

Since 2007, the National Hurricane Center’s official (OFCL) track forecast errors decreased between 30% and 50%, and intensity errors shrank by up to 55%. MAE = mean absolute error; VMAX = maximum sustained 10-meter winds. Credit: Alaka et al., 2024, https://doi.org/10.1175/BAMS-D-23-0139.1

Over time, advances in how data are introduced into models meant that the better data researchers were receiving from satellites, radars, and Hurricane Hunters improved modeling abilities even further. Gopalakrishnan estimated that by 2020, his office could predict hurricane track and intensity with somewhere between 50% and 54% more accuracy than in 2007.

NOAA began transitioning operations to a new model known as the Hurricane Analysis and Forecast System (HAFS) in 2019, and HAFS became the National Hurricane Center’s operational forecasting model in 2023. HAFS, developed jointly by several NOAA offices, can more reliably forecast storms, in part by increasing the use of multiple nests—or multiple high-resolution areas in a model—to follow multiple storms at the same time. HAFS predicted the rapid intensification of Hurricanes Helene and Milton in 2024.

Just as they did with HWRF, scientists run multiple versions of HAFS each year: an operational model, used to inform the public, and a handful of experimental models to see which of them work the best. At the end of hurricane season, researchers examine which versions performed the best and begin combining elements to develop the next generation of the operational model. The team expects that as HAFS improves, it will lengthen the forecast from the 5 days offered by previous models.

“As a developer [in 2007], I would have been happy to even get 2 days forecast correctly,” Gopalakrishnan said. “And today, I’m aiming to get a 7-day forecast.”

NOAA’s budget plan for 2026 could throw a wrench into this progress, as it proposes eliminating all NOAA labs, including AOML.

The Role of Communication

An accurate hurricane forecast does little good if the information isn’t shared with the people who need it. And communication about hurricane forecasts has seen its own improvements in the past 2 decades. NWS has partnered with social scientists to learn how to craft the most effective messages for the public, something Beaman said has paid dividends.

Communication between the National Hurricane Center and local weather service offices can now happen over video calls rather than by phone, as it once did. Sharing information visually can make these calls more straightforward and efficient. NWS began sending wireless emergency alerts directly to cell phones in 2012.

In 2017, the National Hurricane Center began issuing storm surge watches and warnings in addition to hurricane watches and warnings. Beaman said storm surge inundation graphics, which show which areas may experience flooding, may have contributed to a reduction in storm surge–related fatalities. In the 50-year period between 1963 and 2012, around 49% of storm fatalities were related to storm surge, but by 2022, that number was down to 11%.

“You take [the lack of visualization] back to Katrina in 2005, one of the greatest storm surge disasters our country has seen, we’re trying to express everything in words,” Beaman said. “There’s no way a human can properly articulate all the nuances of that.”

Efforts to create storm data visualization go beyond NOAA.

Carola and Hartmut Kaiser moved to Baton Rouge, La., just weeks before Hurricane Katrina made landfall. Hartmut, a computer scientist, and Carola, an information technology consultant with a cartography background, were both working at Louisiana State University. When the historic storm struck, Hartmut said they wondered, “What did we get ourselves into?”

Shortly after the storm, the Kaisers combined their expertise and began work on the Coastal Emergency Risks Assessment (CERA). The project, led by Carola, is an easy-to-use interface that creates visual representations of data, including storm path, wind speed, and water height, from the National Hurricane Center, the Advanced Circulation Model (ADCIRC), and other sources.

The Coastal Emergency Risks Assessment tool aims to help the public understand the potential timing and impacts of storm surge. Here, it shows a forecast cone for Hurricane Erin in August 2025, along with predicted maximum water height levels. Credit: Coastal Emergency Risks Assessment

“We know of a lot of people who said, ‘Yes, thank you, [looking at CERA] caused me to evacuate.’”

What started as an idea for how to make information more user-friendly for the public, emergency managers, and the research community grew quickly: Hundreds of thousands of people now use the tool during incoming storm events, Hartmut said. The Coast Guard often moves its ships to safe regions on the basis of CERA’s predictions, and the team frequently receives messages of thanks.

“We know of a lot of people who said, ‘Yes, thank you, [looking at CERA] caused me to evacuate,’” Hartmut said. “And now my house is gone, and I don’t know what would have happened if I didn’t go.”

Looking Forward

Unlike hurricane season itself, the work of hurricane modelers has no end. When the season is over, teams such as Gopalakrishnan’s review the single operational and several experimental models that ran throughout the season, then work all year on building an upgraded operational model.

“It’s 365 days of model developments, testing, and evaluation,” he said.

NOAA scientists aren’t the only ones working to improve hurricane forecasting. For instance, researchers at the University of South Florida’s Ocean Circulation Lab (OCL) and the Florida Flood Hub created a storm surge forecast visualization tool based on the lab’s models. The West Florida Coastal Ocean Model, East Florida Coastal Ocean Model, and Tampa Bay Coastal Ocean Model were designed for the coastal ocean with a sufficiently high resolution to model small estuaries and shipping channels.

Though Yonggang Liu, a coastal oceanographer and director of OCL, cited examples of times his lab’s models have outperformed NOAA’s models, the tool is not used in operational NOAA forecasts. But it is publicly available on the OCL website (along with a disclaimer that the analyses and data are “research products under development”).

The Cyclone Global Navigation Satellite System (CYGNSS) is a NASA mission that pairs signals from existing GPS satellites with a specialized radar receiver to measure reflections off the ocean surface—a proxy for surface wind speed. The constellation of eight satellites can take measurements more frequently than GOES satellites, allowing for better measurement of rapid intensification, said Chris Ruf, a University of Michigan climate and space scientist and CYGNSS principal investigator.

It might seem that if a method or mission offers a way to more accurately forecast hurricanes, it should be promptly integrated into NOAA’s operational models. But Ruf explained NOAA’s hesitation to use data from university-led efforts: Because they are outside of NOAA’s control and could therefore lose funding or otherwise stop running, it’s too risky for NOAA to rely on such projects.

“CYGNSS is a one-off mission that was funded to go up there and do its thing, and then, when it deorbits, it’s over,” Ruf said. “They [at NWS] don’t want to invest a lot of time learning how to assimilate some new data source and then have the data disappear later. They want to have operational usage where they can trust that it’s going to be there later on.”

“These improvements cannot happen as a one-man army.”

Whatever office they’re in, it’s scientists who make the work of hurricane forecasting possible. Gopalakrishnan said that during Katrina, there were two or three people at NOAA associated with model development. He credits the modeling improvements made since then to the fact that, now, there’s a team of several dozen. And more advances may be on the horizon. For instance, NOAA expects a new Hurricane Hunter jet, a G550, to join the ranks by 2026.

However, some improvements are stalling. The Geostationary Extended Observations (GeoXO) satellite system is slated to begin expanding on the observations of the GOES satellites in the early 2030s. But the 2026 U.S. budget proposal, which suggests slashing $209 million from NOAA’s efforts to procure weather satellites and infrastructure, specifically suggests a “rescope” of the GeoXO program.

Hundreds of NOAA scientists have been laid off since January 2025, including Hurricane Hunter flight directors and researchers at AOML (though NWS received permission to rehire hundreds of meteorologists, hydrologists, and radar technicians, as well as hire for previously approved positions, in August).

In general, hurricane fatalities are decreasing: As of 2024, the 10-year average in the United States was 27, whereas the 30-year average was 51. But this decrease is not because storms are becoming less dangerous.

“Improved data assimilation, improved computing, improved physics, improved observations, and more importantly, the research team that I could bring together [were] pivotal” in enabling the past 2 decades of forecasting improvements, said Gopalakrishnan. “These improvements cannot happen as a one-man army. It’s a team.”

—Emily Dieckman (@emfurd.bsky.social), Associate Editor

Citation: Dieckman, E. (2025), How researchers have studied the where, when, and eye of hurricanes since Katrina, Eos, 106, https://doi.org/10.1029/2025EO250320. Published on 28 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
