EOS

Science News by AGU

Can Microorganisms Thrive in Earth’s Atmosphere, or Do They Merely Survive There?

Tue, 09/09/2025 - 13:19
Source: Journal of Geophysical Research: Biogeosciences

This is an authorized translation of an Eos article.

Earth’s atmosphere carries tiny cellular life-forms such as fungal spores, pollen, bacteria, and viruses. Along their journeys, these microorganisms face challenging conditions, including low temperatures, ultraviolet radiation, and a lack of available nutrients. Previous research has shown that certain microorganisms can withstand these extreme conditions and potentially remain dormant until they settle in a more favorable environment. But could the atmosphere itself also be the site of an active microbial system, one that harbors growing, adapted, resident microorganisms?

The study of these drifting life-forms is called aerobiology, but advancing the field is difficult: There is no standardized method for sampling the aeromicrobiome, microbial samples are commonly contaminated, and atmospheric conditions are hard to reproduce in a laboratory setting.

Martinez-Rabert and colleagues suggest that computational modeling and theoretical approaches could help improve understanding of the aeromicrobiome. Drawing on what is known about the metabolism and bioenergetics of microbial life (especially in extreme environments), as well as on atmospheric chemistry and physics, specialized modeling frameworks can provide insights into the aeromicrobiome.

This bottom-up modeling approach, the researchers propose, would let them test how changing individual elements of Earth’s atmosphere would affect the proliferation of the microbial life it contains. For example: Are microbes better adapted to a “free” lifestyle among atmospheric gases, to life within droplets, or to attachment to solid particles? What energy sources are available to these microorganisms? How does the acidity of atmospheric aerosols influence the ability of airborne microorganisms to thrive?

The group suggests that, combined with data from sampling, experiments, and observations, theoretical models could help researchers evaluate our atmosphere’s capacity to sustain a microbial biosphere and even better understand how microorganisms influence the atmosphere’s chemical composition. This work, they note, could also prove useful in the future for modeling how life might exist in other planetary atmospheres. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1029/2025JG009071, 2025)

—Rebecca Owen (@beccapox.bsky.social), Science Writer

This translation by Saúl A. Villafañe-Barajas (@villafanne) was made possible by a partnership with Planeteando and Geolatinas.

Text © 2025. AGU. CC BY-NC-ND 3.0

Heat Spurs Unequal Consumption of Sweet Treats

Mon, 09/08/2025 - 17:12

The United States could consume the added-sugar equivalent of as much as 7 billion additional cans of soda per year by 2095 as a result of climate change, according to a new study. 

A new analysis of food consumption patterns and weather data in the country, published in Nature Climate Change, showed that warmer temperatures increase household purchases of food and beverage products with added sugar, especially among low-income and less educated populations. 

“There’s a huge difference across different socioeconomic groups,” said Duo Chan, a climate scientist at the University of Southampton and coauthor on the new study. 

Researchers analyzed retail food and drink purchases in more than 40,000 U.S. households from 2004 to 2019 along with monthly average temperatures at the county scale. They found that the average adult male purchased about 0.70 gram of additional added sugar per day for every additional 1°C (1.8°F) of temperature between 12°C (53.6°F) and 30°C (86°F).
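The relationship reported above is linear over the studied temperature range, so it can be restated in a few lines of Python. This is an illustration of the quoted numbers, not the study’s statistical model; the function name and the clamping behavior at the range limits are assumptions.

```python
# Illustrative sketch (not the study's model): the reported average marginal
# effect of ~0.70 g of added sugar per day per 1°C, applied only within the
# 12°C-30°C range over which the effect was observed.
def extra_sugar_g_per_day(temp_c, baseline_c=12.0, max_c=30.0, slope_g_per_c=0.70):
    """Estimate additional daily added-sugar purchases (grams) at temp_c."""
    effective = min(max(temp_c, baseline_c), max_c)  # clamp to the studied range
    return slope_g_per_c * (effective - baseline_c)

print(extra_sugar_g_per_day(20))  # ~5.6 g/day at 20°C relative to the 12°C baseline
print(extra_sugar_g_per_day(30))  # ~12.6 g/day at the top of the range
```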

Researchers ran multiple regression analyses to rule out other factors that could have influenced the purchasing of sugary products. With the effects of other variables, such as changes in product price, removed, added-sugar intake remained significantly associated with temperature, Chan said.
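For readers unfamiliar with the approach, a generic multiple regression with a control variable can be sketched as below. The data are synthetic and the specification is purely illustrative; it is not the authors’ model.

```python
# A generic illustration of controlling for a confounder (here, price) in a
# regression of sugar purchases on temperature. All values are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "temp_c": rng.uniform(12, 30, n),
    "price": rng.normal(1.5, 0.2, n),
})
df["sugar_g"] = 0.7 * df["temp_c"] - 2.0 * df["price"] + rng.normal(0, 2, n)

model = smf.ols("sugar_g ~ temp_c + price", data=df).fit()
print(model.params)  # temperature effect estimated alongside the price control
```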

“It’s the change in temperature that [leads] to the sugar intake,” said Pengfei Liu, an environmental economist at the University of Rhode Island and coauthor of the new study.

The trends, according to the authors, are probably driven by people choosing hydrating, sweet beverages and cold desserts to mitigate the effects of heat. The patterns are “common sense,” said Thalia Sparling, a public health researcher at the London School of Hygiene & Tropical Medicine who was not involved in the new research. “Of course, when it gets hotter, you’re going to want to sit on the porch with your friends and have a cold drink or eat more ice cream.”

Education and income levels of the heads of households influenced how sensitive those households’ sugar consumption patterns were to increases in temperature. Households in which the head of household had a lower income and was less educated increased their added sugar purchases more per degree of increased temperature. Purchases of sweetened beverages and frozen desserts constituted the bulk of the increased sugary purchases.

The study authors used Coupled Model Intercomparison Project Phase 6 (CMIP6) climate models to project that warming temperatures in a high-emissions world could change diets enough to add nearly 3 grams of sugar per day to the average U.S. diet by the end of the century—equivalent to about 30 cans of soda per person per year. The projected effects were unequal among socioeconomic groups, too, with lower-income and less educated households expected to increase their sugar intake more than their higher-income, more educated counterparts. 
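A quick back-of-envelope check of the cans-of-soda conversion, assuming roughly 36 grams of added sugar per 12-ounce can (the per-can figure is an assumption, not taken from the study):

```python
# Rough check of the "about 30 cans per person per year" equivalence.
extra_g_per_day = 3.0        # projected dietary increase by 2095
sugar_per_can_g = 36.0       # assumed added sugar per 12-oz can (~35-40 g is typical)
cans_per_year = extra_g_per_day * 365 / sugar_per_can_g
print(round(cans_per_year))  # ~30 cans per person per year
```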

Dietary Inequality

“This is just another piece of evidence showing that the impacts of climate change on people are not equal,” Sparling said.

Nutrition researchers have long known that lower-income groups eat less healthily, Sparling said, because of economic factors, lack of access to healthier foods, and even work environment. “People in communities with lower average [socioeconomic status] are less likely to have air-conditioned workplaces, schools, homes, or respite in other ways,” she said. Those without a way to escape the heat may be more likely to reach for cold, sweetened drinks or desserts for relief.

“Low-income people are most vulnerable to climate change in a lot of cases, and also in our case, in terms of excessive sugar intake,” Chan said. 

Higher added-sugar consumption can increase the risk of various health problems such as obesity, diabetes, and heart disease. But because the causes behind health problems are so complex, much more research would be needed to link warming to specific increases in disease, Sparling said.

She stressed that the onus to improve dietary choices should not be all on the individual: “You have to look at systems level change,” she said. Policies such as taxes on sweetened beverages have decreased added sugar consumption in some cities and countries. Education in communities, schools, and churches can also help people form healthier habits, Sparling said. 

More research is needed to determine whether the trends seen in the United States are reflected across the globe, said Pan He, an environmental social scientist at Cardiff University and first author of the new study. Then, scientists could have an even more comprehensive understanding of how global food consumption patterns may adapt to climate change, she said. 

“I hope our study may shed light on future evidence in developing countries, where sugary-beverage intake is already high and rising heat could further threaten nutrition security,” wrote Yan Bai, an economist at the World Bank and coauthor of the new study, in an email.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Citation: van Deelen, G. (2025), Heat spurs unequal consumption of sweet treats, Eos, 106, https://doi.org/10.1029/2025EO250333. Published on 8 September 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Protein-Powered Biosensors with a Nose for Environmental Ills

Mon, 09/08/2025 - 13:54

Imagine a farmer standing in her field—or even sitting at home—when she gets an alert from a handheld device: Her crops are showing signs of stress, not from heat, drought, or lack of nutrients in this case, but from a pesticide spill detected upstream. The alert doesn’t come from a lab test conducted days after the initial contamination. Instead, it is generated in real time by a portable biosensor containing a protein derived from a pig’s nose.

The protein has been reprogrammed to mimic the molecular recognition capabilities of animal olfaction, allowing it to detect specific volatile chemicals associated with pesticide contamination and enabling rapid on-site detection. With this early warning, the farmer can act swiftly to mitigate negative impacts and protect both her crops and future yield.

Chemical-sniffing sensors like this, though not yet widely adopted in agriculture or environmental science, represent an emerging field of research and development. They also offer viable new tools for addressing urgent 21st century environmental challenges related to climate change, rapid industrialization, urban sprawl, deforestation, and agricultural intensification, which threaten biodiversity, food security, and public health globally.

In response to these challenges, the United Nations’ Sustainable Development Goals (SDGs)—particularly SDG 6 (Clean Water and Sanitation), SDG 12 (Responsible Consumption and Production), SDG 13 (Climate Action), SDG 14 (Life Below Water), and SDG 15 (Life on Land)—call for integrated, data-driven approaches to environmental monitoring and management that support environmentally, economically, and socially responsible practices.

Implementing these goals, especially in remote regions and developing nations, requires affordable, scalable methods for monitoring air, water, and soil resources that deliver timely and actionable information. Naturally occurring animal proteins, paired with biosensing technology, offer a promising foundation.

A Transformative New Approach

Detecting pollutants in air, water, or soil often requires sending samples to distant laboratories for gas chromatography, mass spectrometry, or other high-precision analyses. These tools are indispensable for regulatory science because they deliver highly accurate, standardized measurements of trace contaminants that can withstand legal and policy scrutiny. However, the time required to collect, ship, and analyze samples delays results and limits their usefulness for rapid, local decisionmaking.

Conventional in situ monitoring systems, such as stationary air- and water-quality stations, provide continuous data but are expensive to install and maintain. As a result, they are typically sparsely distributed, provide limited spatial coverage, and require significant power, connectivity, and upkeep. Together the high costs and infrastructure demands of current methods make them impractical for widespread field deployment, especially in remote, resource-limited, or rapidly changing environments.

Biosensors present a viable, transformative alternative. Compact, energy-efficient, and often portable, these devices combine biological recognition elements, such as enzymes, antibodies, or odorant-binding proteins (OBPs), with signal transducers to detect specific compounds on-site, in real time, and at low cost. Notably, these devices can sense volatile chemicals and bioavailable pollutant fractions, making them well-suited to complement or even replace traditional environmental monitoring tools in certain settings.

Fig. 1. The structure of a porcine odorant-binding protein (pOBP) is shown by this ribbon diagram. A molecule of butanal, a volatile organic compound used in industrial manufacturing applications, is depicted within the binding cavity of the protein. Credit: Cennamo et al. [2015], CC BY 4.0

OBPs, a class of tiny but mighty proteins found in the olfactory systems of insects and vertebrates, are especially appealing options (Figure 1). They detect trace amounts of odorants—the molecules behind scents—in complex, chemically noisy environments. Whether it is a moth navigating miles to find a mate, or a mammal sniffing out food, OBPs enable detection of a few key molecules amid thousands.

Today researchers are repurposing OBPs to sniff out the chemical by-products of modern life. These proteins possess high thermal and chemical stability, are easy to synthesize, and are remarkably versatile. They can be integrated into portable devices and miniaturized sensors, affixed to biodegradable materials, and genetically engineered to target specific chemicals in soil, air, or water.

OBPs in Action

Despite their promise, biosensors remain underrepresented in discourse and planning related to environmental monitoring and sustainability. More often than not, prototypes developed and tested in the laboratory fail to reach broad application in the field. Specific uses of OBPs have remained largely siloed within biomedical and entomological research.

However, emerging applications of OBPs align closely with key geoscience priorities, including tracking pesticide and industrial runoff, monitoring volatile compounds and mapping soil emissions, and identifying plant health indicators tied to environmental stress and drought. Several proof-of-concept and real-world demonstrations are already underway, highlighting how OBPs can detect a range of pollutants across different environments.

Fig. 2. These prototype biosensors (top), with the gold-plated sensing area at the far left of each, were designed to detect benzene in the environment. The diagram (bottom) illustrates the process by which the sensing surface was chemically functionalized with pOBP (pink ribbon diagram). Credit: Capo et al. [2022], CC BY 4.0

Porcine OBPs have been engineered to detect BTEX pollutants (benzene, toluene, ethylbenzene, and xylene) originating from pesticides and petroleum runoff that threaten groundwater and soil health (Figure 2) [Capo et al., 2022, 2018]. Bovine OBPs, immobilized on cartridge-like devices, can selectively bind and remove triazine herbicides from water, demonstrating potential for both detection and remediation of the pollutant in water treatment [Bianchi et al., 2013].

Sensors coated with bovine and porcine OBPs detect trace, mold-related volatile organic compounds (VOCs) such as octenol and carvone [Di Pietrantonio et al., 2015, 2013], which is relevant to both indoor and outdoor air quality monitoring and mitigation of post-harvest crop losses. Low-cost, OBP-functionalized devices have also demonstrated selective detection of butanal, a common VOC linked to industrial and urban particulate matter [Cennamo et al., 2015].

In addition to bovine and porcine OBPs, rat OBP derivatives have been customized and immobilized on sensing platforms to enable simultaneous VOC profiling for air and water pollution diagnostics [Hurot et al., 2019]. Furthermore, insect OBPs, embedded in fluorescence-based biosensors, have shown efficacy for detecting bacterial metabolites, offering a possible approach for rapid coliform bacteria screening in drinking water [Dimitratos et al., 2019].

Beyond environmental and water quality applications, OBPs from multiple species have also been used to monitor for plant-emitted VOCs that signal stress, disease, drought, or pest infestation in agricultural systems [Wilson, 2013], providing valuable insights into crop health and enabling early intervention strategies.

The Potential Is Enormous

Integrating OBPs into environmental monitoring systems opens new frontiers in climate-smart agriculture, distributed sensing networks, and adaptive land use management. These sensors offer lab-grade sensing of emissions from sources such as livestock waste, fertilizer application, and wetland activity. They may also enable real-time monitoring of greenhouse gas precursors and early detection of soil degradation, microbial shifts, or drought stress—all delivered through devices small enough to fit in your pocket.

Early detection of pollutant leaks or VOC hot spots could inform land use strategies that mitigate volatile emissions, improve air quality, and strengthen climate adaptation. OBP sensors’ low power requirements and biodegradability make them ideal for decentralized deployments, especially in low-resource or remote areas. Engineered differently, these proteins could even serve in preventative technologies as molecular sponges or scavengers that capture and bind VOCs before they accumulate or disperse.

Ultimately, OBPs could enable more data-driven decisions in conservation and climate policy, while offering novel tools for mapping environmental dynamics (e.g., tracking the spread of wildfire smoke plumes, monitoring methane emissions, or detecting waterborne coliforms across river networks) at finer spatial and temporal resolutions than current technologies permit.

From the Bench to the Biosphere

We envision a future in which OBPs are central to smart agriculture platforms, mobile environmental sensing labs, and biodegradable field-deployable kits. The underlying technology is sound, but breakthroughs like this don’t happen in isolation. Cross-disciplinary collaboration is crucial to accelerate and scale this development, reduce risks of field deployments, and ensure that innovations are aligned with real-world policy and practice.

We propose several pathways to support this collaboration and innovation. For example, targeted workshops and research consortia could facilitate dialogue among molecular biologists, environmental engineers, and Earth scientists to identify priority research questions and focus efforts on specific environmental challenges.

Key questions for advancing OBP-based sensing include the following: Which pollutants and ecosystem signals are most critical for understanding today’s environmental challenges? How can OBPs be tuned to target specific compounds under varying soil, air, or water conditions? What substrates can effectively host OBPs for real-world sensing without compromising environmental safety?

As part of this dialogue, environmental scientists could contribute by generating regional maps of priority VOCs linked to specific issues such as crop stress, emissions from peatlands, or urban air pollutants, guiding optimization of OBP-based sensors. Similarly, chemists and bioengineers could collaborate to expand the library of OBPs with tailored affinities for emerging pollutants, such as pharmaceutical residues, industrial solvents, or novel agrochemicals, broadening the range of compounds detectable in real-world settings. In parallel, data scientists and systems engineers could develop machine learning models to decode complex VOC signatures captured by OBP sensors, enabling real-time diagnostics, pattern recognition, and predictive analytics across environmental monitoring networks.
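As a rough illustration of the machine learning step described above, the sketch below trains a classifier on synthetic responses from a hypothetical eight-channel OBP sensor array. The array size, features, and labels are invented for demonstration; real work would use calibrated sensor readings.

```python
# Hypothetical pattern-recognition sketch: classify whether a target VOC is
# present from the response pattern of an OBP sensor array. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_sensors = 200, 8                         # hypothetical 8-channel array
X = rng.normal(size=(n_samples, n_sensors))           # simulated sensor responses
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)         # stand-in label for "VOC present"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```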

Expanding access to knowledge and resources represents another key pathway for advancing OBP-based sensing. Developing curated, open-source, and searchable repositories of OBPs from diverse organisms with characterized binding affinities for high-priority VOCs would accelerate biosensor design and prototyping. Such repositories should follow FAIR (Findable, Accessible, Interoperable, Reusable) data principles to maximize their usefulness across disciplines and platforms.
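A minimal sketch of what one record in such a repository might look like, expressed as a Python data structure. The field names, identifiers, and values are hypothetical and do not represent an existing standard.

```python
# Hypothetical record schema for an open, FAIR-aligned OBP repository entry.
from dataclasses import dataclass, field

@dataclass
class OBPRecord:
    protein_id: str                              # illustrative internal identifier
    source_organism: str                         # species the OBP was derived from
    target_compounds: list[str]                  # VOCs with characterized affinity
    binding_affinity_uM: dict[str, float] = field(default_factory=dict)
    reference_doi: str = ""                      # citation for characterization data
    license: str = "CC-BY-4.0"                   # open licensing supports reuse

record = OBPRecord(
    protein_id="OBP-example-001",                # placeholder, not a real accession
    source_organism="Sus scrofa",
    target_compounds=["benzene", "butanal"],
    binding_affinity_uM={"butanal": 1.0},        # placeholder value
    reference_doi="10.1371/journal.pone.0116770",
)
print(record.protein_id, record.target_compounds)
```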

In the United States, agencies such as the National Science Foundation, the Department of Agriculture, EPA, and the Department of Energy could accelerate progress by hosting seed funding workshops to define shared goals, barriers, and applications and by providing joint funding for interdisciplinary biosensing projects.

Establishing and sharing experimental field test beds such as smart farms, urban air zones, and wetlands would enable pilot testing of OBP-based sensors alongside conventional instruments. These biosensors could be integrated into existing monitoring networks like the National Ecological Observatory Network and the Long Term Ecological Research Network. Their outputs could feed into land use, emissions, and ecological models to improve the spatiotemporal resolution of environmental data.

Building on these test beds and integrated networks, collaborating researchers could report cross-disciplinary benchmark studies and coauthor seminal papers detailing protocols, use cases, and best practices for OBP-based biosensing. This coordinated effort would guide future research and help establish the field’s credibility with regulators and funding agencies.

Clearing Barriers on the Road Ahead

For all of the upsides of OBP-based biosensing, several technical and logistical issues must be addressed before these sensors can be widely deployed.

Despite their superior stability compared to enzymes or typical antibodies [Dimitratos et al., 2019], OBPs remain susceptible to denaturation or degradation during prolonged environmental exposure. Environmental conditions such as humidity, pH, and salinity can affect their performance, underscoring the need for robust protocols to stabilize and calibrate these proteins across diverse ecosystems.

Advancing real-time data acquisition and remote monitoring with OBP-based biosensing also requires progress toward integrating the proteins with digital platforms in scalable and reproducible formats. Key challenges include reducing sensor-to-sensor variability, increasing sensor lifespans, and converting biological signals into stable, digitized outputs.

In addition to technical barriers, regulatory frameworks and approval pathways for OBP-based sensing technology remain underdeveloped, and concerns about the lack of standardized validation protocols and the effects of releasing recombinant proteins into agricultural or environmental settings persist. Moreover, low awareness among end users, including farmers and land managers, may hinder trust and uptake of the technology. Realizing its broader potential will thus require rigorous technical validation, clear regulatory guidance, and proactive efforts to educate and engage stakeholders across sectors.

Notwithstanding these challenges, the promise is clear: OBPs offer a flexible and powerful approach for monitoring environmental changes and climate risk, helping to protect ecosystems, food systems, and communities. Once known primarily to entomologists, these little scent-sniffing proteins could become an unexpectedly powerful tool for advancing environmental resilience.

References

Bianchi, F., et al. (2013), An innovative bovine odorant binding protein-based filtering cartridge for the removal of triazine herbicides from water, Anal. Bioanal. Chem., 405, 1,067–1,075, https://doi.org/10.1007/s00216-012-6499-0.

Capo, A., et al. (2018), The porcine odorant-binding protein as molecular probe for benzene detection, PLOS One, 13(9), e0202630, https://doi.org/10.1371/journal.pone.0202630.

Capo, A., et al. (2022), The porcine odorant-binding protein as a probe for an impedenziometric-based detection of benzene in the environment, Int. J. Mol. Sci., 23(7), 4039, https://doi.org/10.3390/ijms23074039.

Cennamo, N., et al. (2015), Easy to use plastic optical fiber–based biosensor for detection of butanal, PLOS One, 10(3), e0116770, https://doi.org/10.1371/journal.pone.0116770.

Dimitratos, S. D., et al. (2019), Biosensors to monitor water quality utilizing insect odorant-binding proteins as detector elements, Biosensors, 9(2), 62, https://doi.org/10.3390/bios9020062.

Di Pietrantonio, F., et al. (2013), Detection of odorant molecules via surface acoustic wave biosensor array based on odorant-binding proteins, Biosensors Bioelectron., 41, 328–334, https://doi.org/10.1016/j.bios.2012.08.046.

Di Pietrantonio, F., et al. (2015), A surface acoustic wave bio-electronic nose for detection of volatile odorant molecules, Biosensors Bioelectron., 67, 516–523, https://doi.org/10.1016/j.bios.2014.09.027.

Hurot, C., et al. (2019), Highly sensitive olfactory biosensors for the detection of volatile organic compounds by surface plasmon resonance imaging, Biosensors Bioelectron., 123, 230–236, https://doi.org/10.1016/j.bios.2018.08.072.

Wilson, A. D. (2013), Diverse applications of electronic-nose technologies in agriculture and forestry, Sensors, 13(2), 2,295–2,348, https://doi.org/10.3390/s130202295.

Author Information

Ishani Ray (isray@okstate.edu) and Smita Mohanty, Department of Chemistry, Oklahoma State University, Stillwater

Citation: Ray, I., and S. Mohanty (2025), Protein-powered biosensors with a nose for environmental ills, Eos, 106, https://doi.org/10.1029/2025EO250330. Published on 8 September 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0

Strong Tides Speed Melting of Antarctic Ice Shelves

Mon, 09/08/2025 - 13:53
Source: Journal of Geophysical Research: Oceans

Antarctic ice is melting. But exactly which forces are causing it to melt and how melting will influence sea level rise are areas of active research. Understanding the decay of ice shelves, which extend off the edges of the continent, is particularly pressing because these shelves act as barriers between ocean water and land. Without ice shelves, the continent’s glaciers would flow freely into the ocean, hastening sea level rise.

In January 2015, a group of researchers used hot water to drill a hole through 740 meters of the Ronne Ice Shelf. They then lowered a mooring carrying temperature, salinity, and current sensors through the hole into the ocean below. A radio echo sounder deployed 15 meters from the mooring kept tabs on ice thickness. For the next 3 years, the instruments took measurements every 2 hours; these measurements were sent to a solar-powered data logger on the surface and then on to researchers via satellite.

Anselin et al. recently used these measurements to probe the forces responsible for melting ice shelves.

The tide is a major force contributing to ice shelf melting, the researchers found. As the tide rises, the water rushes across the bottom of the shelf. Friction between the shelf and the water causes the current just beneath the ice to slow. This slowdown leads to strong mixing within the ocean, and this mixing brings heat to the ice base, driving high melt rates. Because the strength of tides varies depending on the positions of the Sun and the Moon relative to Earth, ice shelf melting has a cyclical pattern, with melting ebbing and flowing every 2 weeks.
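For context, the roughly 2-week rhythm described above corresponds to the spring-neap cycle, the beat between the principal lunar (M2) and solar (S2) semidiurnal tides. This calculation is general background, not taken from the paper:

$$
T_{\mathrm{spring\text{-}neap}} = \left(\frac{1}{T_{S_2}} - \frac{1}{T_{M_2}}\right)^{-1} = \left(\frac{1}{12.00\,\mathrm{h}} - \frac{1}{12.42\,\mathrm{h}}\right)^{-1} \approx 355\,\mathrm{h} \approx 14.8\ \mathrm{days}
$$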

However, current models of melt rates fail to capture the full extent to which tidal mixing and warm ocean water combine to melt ice. When analyzing data from additional sites, scientists should focus on how the interaction between tides and ice shelves leads to melting, the researchers say. (Journal of Geophysical Research: Oceans, https://doi.org/10.1029/2025JC022524, 2025)

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2025), Strong tides speed melting of Antarctic ice shelves, Eos, 106, https://doi.org/10.1029/2025EO250331. Published on 8 September 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Quantifying Predictability of the Middle Atmosphere

Fri, 09/05/2025 - 13:43
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Journal of Geophysical Research: Atmospheres

Atmospheric circulations are chaotic and unpredictable beyond a certain time limit. Quantifying predictability helps determine which forecast problems are potentially tractable. However, while the predictability of weather close to the surface is a much-studied problem, with a prediction limit of approximately 10 days, less is known about how predictable the atmosphere is at higher altitudes.

Garny [2025] applies a high-resolution global model to study atmospheric predictability from the surface to the mesosphere/lower thermosphere (MLT; 50–120 kilometers altitude), providing new understanding of coupling between atmospheric levels and fundamental behavior of the upper atmosphere. The author shows that the MLT is somewhat less predictable than lower atmospheric layers due to rapid growth of ubiquitous small-scale waves, with predictability horizons of about 5 days. However, atmospheric flows in the MLT on larger horizontal scales of a few thousand kilometers can remain predictable for up to 3 weeks.

This research highlights the importance of high-resolution, ‘whole atmosphere’ models to understand and predict circulations in the middle atmosphere and coupling from the surface to the edge of space.

Citation: Garny, H. (2025). Intrinsic predictability from the troposphere to the mesosphere/lower thermosphere (MLT). Journal of Geophysical Research: Atmospheres, 130, e2025JD043363. https://doi.org/10.1029/2025JD043363

—William Randel, Editor, JGR: Atmospheres

Text © 2025. The authors. CC BY-NC-ND 3.0

Dust Is the Sky’s Ice Maker

Fri, 09/05/2025 - 13:10

Dust plays a major role in the formation of ice in the atmosphere. A new analysis of satellite data, published in Science, shows that dust can cause a cloud’s water droplets to freeze at warmer temperatures than they otherwise would. The finding brings what researchers had observed in the laboratory to the scale of the atmosphere and may help climate scientists better model future climate changes.

In 1804, French scientist Joseph Louis Gay-Lussac ascended to about 23,000 feet (7,000 meters) in a hydrogen balloon from Paris, without supplemental oxygen, to collect air samples. He noted that clouds with more dust particles tended to have more frozen droplets.

In the 20th century, scientists found that pure water can remain liquid even when cooled to −34.5°C. But once even tiny amounts of material, such as dust, are introduced, it freezes at much warmer temperatures.

In 2012, researchers in Germany were finally able to test this directly in a cloud chamber experiment. They re-created cloud conditions in the lab, introduced different types of desert dust, and gradually cooled the chamber to observe the temperatures at which droplets froze.

For Diego Villanueva, an atmospheric scientist at ETH Zürich in Switzerland and lead author of the new study, it was striking that scientists had uncovered these processes in the lab, yet no one had examined them in such detail in nature.

The challenges were obvious. To watch an ice crystal nucleate, researchers would need instruments on an aircraft or balloon to catch a micrometer-sized droplet in a cloud at just the right moment. “It’s like Schrödinger’s cat,” said Daniel Knopf, an atmospheric scientist at Stony Brook University who was not involved in the work. “Either there’s an ice crystal, or there’s a liquid droplet.”

In the new study, Villanueva and his colleagues analyzed 35 years of satellite data on cloud tops across the Northern Hemisphere’s extratropics—a region spanning the U.S. Midwest, southern Canada, western Europe, and northern Asia. The researchers wanted to see whether dust influenced whether cloud tops were liquid or ice. They focused on cloud tops, rather than entire clouds, simply because the tops are visible in satellite imagery.

Desert Dust and Cold Clouds

Villanueva and his colleagues examined two satellite datasets covering 1982–2016, trying to infer microscopic details of cloud tops such as the number of ice crystals or droplet sizes. One dataset tracked whether cloud tops were liquid or ice, and the other measured how much dust was in the air at the same time. Although the team examined global patterns, they focused on the northern extratropical belt, where mixed-phase clouds are common and large amounts of dust from deserts like the Sahara and Gobi circulate.

But the “dataset quality was just so poor that everything that came out was basically just noise,” Villanueva added. In the end, the researchers focused on a simpler detail: the fraction of clouds with ice at their tops. “This took me nearly 3 years,” Villanueva said.

The analysis revealed that regions with more dust had more ice-topped clouds. The effect was strongest in summer, when desert winds lift the most dust.

A distinctive pattern emerged: A tenfold increase in dust roughly doubled the likelihood of cloud tops freezing. “You’d need 100 times more dust to see freezing become 4 times as frequent,” Villanueva explained.
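That scaling is logarithmic in dust load: each factor-of-10 increase in dust contributes the same doubling. The snippet below is a minimal numerical restatement of the quoted numbers, not the study’s fitted model.

```python
# Restating the quoted scaling: each tenfold increase in dust roughly doubles
# the likelihood of ice-topped clouds, so the effect grows with log10(dust).
import math

def freezing_multiplier(dust_ratio):
    """Factor by which freezing likelihood increases for a given dust ratio."""
    return 2 ** math.log10(dust_ratio)

print(freezing_multiplier(10))   # 2.0 -> tenfold dust, twice as likely
print(freezing_multiplier(100))  # 4.0 -> hundredfold dust, four times as likely
```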

The new work showed that the same processes researchers have observed at the microscale in laboratories occur at much larger scales in Earth’s atmosphere. Even after accounting for humidity and air movement, dust remained the key factor for ice nucleation in most instances, though there are exceptions. In some places, such as above the Sahara, few clouds form despite the presence of dust, perhaps, the authors suggest, because the movement of large swaths of hot air prevents freezing.

“I think the study is quite elegant,” Knopf said. He explained that taking 35 years of satellite data, finding a relationship between dust levels and frozen cloud top rates, and then showing that it lines up perfectly with lab experiments is basically “the nail in the coffin” for proving dust’s role in ice nucleation. Scientists now have robust satellite evidence of dust aerosols directly affecting cloud freezing, matching what laboratory experiments had predicted.

The finding has implications for climate modeling. To predict the effects of climate change more accurately, models must account for dust and the ways it affects cloud freezing and helps shape precipitation. Liquid-topped clouds reflect more sunlight and cool the planet, whereas ice-topped clouds let in more sunlight and trap heat.

However, Knopf noted that there is more work to be done to understand exactly what the new observations mean for scientists’ understanding of climate. “If you want to really know the precipitation or climate impacts [of dust], you really need to know the number of liquid droplets or the number of ice crystals,” he said.

Villanueva is motivated to keep looking at clouds and aerosols. In the next 10–20 years, the Earth may have drier surfaces because of climate change, which will likely produce more dust aerosols in the atmosphere. He added, “I want to know how clouds will respond in the scenario.”

—Saugat Bolakhe (@saugat_optimist), Science Writer

Citation: Bolakhe, S. (2025), Dust is the sky’s ice maker, Eos, 106, https://doi.org/10.1029/2025EO250328. Published on 5 September 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Cruise to Measure Gulf Dead Zone Faces Stormy Funding Future

Fri, 09/05/2025 - 13:05

This story was originally published in the Louisiana Illuminator.

Despite being called a “cruise,” the people on board The Pelican described the experience on the hypoxia monitoring expedition as very different from the elaborate dinners on a towering vacation ship or booze- and buffet-filled Caribbean itinerary.

Passengers described waves up to 5 feet high in the Gulf of Mexico swinging the 116-foot research vessel like a pendulum and plaguing anyone who didn’t have sturdy sea legs with bouts of seasickness. Daytime temperatures in late July soared ever higher as sweat dripped down the backs of hard-hat-covered heads.

The guests on board The Pelican weren’t seeking pleasure or status. They were unpaid students and researchers who say they weathered the conditions in the name of science itself.

“It’s not glamorous, but it is very important,” said Cassandra Glaspie, assistant professor at Louisiana State University and the chief scientist for the National Oceanic and Atmospheric Administration’s annual hypoxia cruise.

The 11-day voyage provides vital information on the sea life and environmental conditions within the seasonal low-oxygen zone that develops off the coast of Louisiana. The data the cruise collects informs state and federal efforts to reduce the size of the “dead zone” and sheds light on impacts to those who rely on the water for their livelihoods, like shrimpers and fishermen.

Now, after its 40th year and 38th hypoxia cruise, The Pelican’s annually planned journey faces challenges to stay afloat, potentially undermining decades of research and future plans to get the dead zone under control.

A Decades-Long Struggle

Biologists, undergraduate student researchers and crew alike celebrated the cruise’s 40th anniversary aboard The Pelican with a party that had an “old bird” theme, chosen to honor the boat, which has also been sailing for 40 years.

The party celebrating the 40th anniversaries of The Pelican and the hypoxia cruise, held on the water. Credit: Yuanheng Xiong

More than just an excuse to eat cake (with rainbow sprinkles), the purpose of the cruise is to capture information snapshots of just how bad conditions get in the dead zone.

“We bring water up to the surface. We have a little chemistry lab…to figure out what the oxygen level is chemically, and then we can validate that against what our sensors are telling us,” Glaspie said.

The low-oxygen area appears annually as nutrients, primarily from agricultural fertilizers from the massive Mississippi River Basin, drain downriver and spur algae overgrowth.

Algae eat, defecate and die, using up the oxygen in the water when they decompose and sink to the bottom. Fish, shrimp and other marine life leave the low oxygen area when they can and suffocate when they can’t, putting pressure on the vital commercial Gulf fishery and the people who rely on it. Exposure to low-oxygen waters can also alter reproduction, growth rates and diet in fish species.

Glaspie took over the work of coastal scientist Nancy Rabalais, who launched the maiden cruise in 1985 and led it for decades after. Every summer begins with a forecast of the zone’s predicted size, estimated from various scientific models and from measurements of nitrogen and phosphorus taken across the river basin over the course of the year.

“A lot of times with pollution, you hear anecdotal evidence of how it might be increasing cancer rates or it might be causing fisheries to fail,” Glaspie said. “Here, we have an actual, measurable impact of nutrient pollution in the Mississippi River watershed.”

The Mississippi River/Gulf of America Hypoxia Task Force, an interagency federal, state and tribal effort to reduce the size of the dead zone, uses data from the cruise to determine whether it is meeting its goals.

In the past five years, the dead zone has been as large as 6,700 square miles, and even larger in previous years, reaching nearly the size of New Jersey.

While still more than two times the size that the Task Force wants, the Gulf dead zone was slightly smaller than forecasted this year, about the size of Connecticut at around 4,400 square miles.

Federal and state officials lauded the limited success of the zone’s smaller size in a July 31 press conference held to discuss the results of the hypoxia cruise’s 2025 findings. They also called for continued work.

“It requires strong collaboration between states, tribes, federal partners and stakeholders,” said Brian Frazer, the EPA’s Office of Wetlands, Oceans and Watersheds director.

Mike Naig, Iowa’s agriculture secretary, said states should be “scaling up” initiatives to reduce nutrient pollution.

Whether or not this will actually happen is uncertain.

Funding Cuts

Since the Trump administration took office, funding for nutrient reduction efforts upriver as well as money to operate the cruise itself have been scaled back or cut entirely.

The Environmental Protection Agency’s 319 and 106 funding programs under the Clean Water Act are the main funding mechanisms for states to reduce nutrient pollution throughout the Basin. Those grants aren’t funded in President Trump’s proposed FY 2026 budget, said Frazer.

The 106 program has historically doled out $18.5 million annually, according to the EPA, with additional money sometimes allocated by Congress. The 319 program provided $174.3 million in FY 2025.

The cuts to these programs are not yet final. Congress can decide to add in additional funding, and has in past years.

States rely on both funds to reduce and monitor nutrient runoff in their waters, said Matt Rota, senior policy director for Healthy Gulf, a nonprofit research group. Rota has monitored policy changes surrounding the Gulf dead zone for more than 20 years, and he questions whether current reduction strategies can be maintained, let alone efforts redoubled.

“It’s always good to see a dead zone that’s smaller than what was predicted,” Rota said. “I am not confident that this trend will continue.”

Aside from cuts to reduction efforts, money for The Pelican’s annual cruise is also slipping away. Glaspie said that, ideally, the cruise has 11 days of funding. It costs about $13,000 a day to operate the vessel, she said.

“It’s a relatively inexpensive program” with big payoffs for seafood industry workers who rely on the water for their livelihoods, Rota said. “This is baseline stuff that our government should be doing.”

Funding for the hypoxia cruise has been part of the National Oceanic and Atmospheric Administration’s annual operational budget, making it a more reliable source than grant funding. But with the Trump administration taking a hatchet to government-backed research, there is increasing uncertainty over whether The Pelican and its crew will embark upon future missions.

This year, Glaspie said, NOAA defunded a day of the cruise. The Gulf of America Alliance, a partnership group to support the Gulf’s economic and environmental health amongst the five bordering states, stepped in to make up the difference. Glaspie said having that additional day was a saving grace for the research.

“This is a fine-tuned machine, and the consequences for cutting it short are really predictable and well-known,” she said. “If I’m asked to create an estimate of the surface area of hypoxia, and we’re not able to cap off the end in Texas waters, I’m not really going to be able to give a reliable estimate.”

Even without additional cuts, Glaspie said she already conducts the hypoxia cruise “on a shoestring budget.” Researchers on board don’t get paid, and every person who supports its mission—besides the crew that runs the boat—is a volunteer.

“It’s tough for me to not pay people. I mean, they’re working solid 12-hour shifts. It is not easy, and they are seasick for a lot of this, and they can’t call home,” Glaspie said. “It doesn’t sit well with me to not pay people for all this work, but this is what we’ve had to do because we don’t have the money to pay them.”

Students Jorddy Gonzalez and Lily Tubbs retrieve the CTD sensor package after measuring dissolved oxygen at a regular stop on the annual hypoxia cruise while students watch. Credit: Cassandra Glaspie

A Rapidly Changing Gulf

Defunding research as climate change intensifies—creating extreme heat in the Gulf—could further undermine hypoxia containment efforts and the consistency of decades’ worth of data collection.

“I think the rising temperatures is a big question,” Rota said.

“We have 40 years of data, which is almost a gold standard,” Glaspie said. “We’ve just reached that threshold where we can really start to ask some more detailed questions about the impacts of hypoxia, and maybe the future of hypoxia.”

Despite this year’s smaller zone surface area, low oxygen levels went deeper into the water than Glaspie had ever seen before.

“The temperature drops [as the water gets deeper], the salinity increases, and the oxygen just goes basically to zero,” she said.

In some areas, Glaspie’s measurements showed negative oxygen levels.

“Oxygen doesn’t go in the negative. It was just so low that the sensor was having trouble with it,” she said. “It’s the first time I’ve seen it like this.”

The smaller-than-forecasted size of the dead zone surprised researchers on The Pelican who saw just how deep the low oxygen levels went.

“None of us really thought until the estimate came out that it was below average size because we’re able to see the three-dimensionality of it. That’s not really incorporated into that estimate,” Glaspie said.

She also noticed unusually large amounts of algae on the surface of the water “like ectoplasm in Ghostbusters.” Toxic algae blooms can kill fish and other sea life as well as poison humans.

“If I had to say what would be important for us to monitor in the future, it would be these algal blooms, and making sure that we’ve got a good handle on which ones have harmful species,” she said.

This is why Glaspie, donned in her sun-protective clothes and work boots, braves the waves, the heat and the journey across the Gulf every year.

“This is our finger on the pulse of our nutrient pollution problem that Louisiana is inheriting from the entire country,” Glaspie said. “We cannot take our finger off that pulse. It is unfair to Louisiana. We have this pollution problem. We need to understand it.”

—Elise Plunk (@elise_plunk), Louisiana Illuminator

This story is a product of the Mississippi River Basin Ag & Water Desk, an independent reporting network based at the University of Missouri in partnership with Report for America, with major funding from the Walton Family Foundation.

Radar Surveys Reveal Permafrost Recovery After Wildfires

Thu, 09/04/2025 - 14:31
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Permafrost is considered a critical global component of the cryosphere given its climate-sensitive nature and its key geomorphological and ecosystem roles. Permafrost is also affected by wildfires, which may push cryospheric systems past a tipping point: Fires can reduce vegetation, destroy organic layers, and modify surface albedo, leading to active layer thickening and ground subsidence. Permafrost is also subject to long-term deformation after wildfires, but this deformation is currently poorly understood.

Cao and Furuya [2025] use remote sensing to explore ground surface deformation signals across multiple fire scars from the past five decades in North Yukon. The authors find that post-fire permafrost evolution follows three distinct phases, marked by land subsidence soon after a fire and eventual recovery of the permafrost, which implies soil uplift, over a roughly 50-year timescale. Such an uplift phase is rarely reported and is related to vegetation regeneration and soil greening after the fire, which provide thermal protection, suggesting a critical mechanism of permafrost recovery. These findings highlight the resilience of boreal permafrost systems against wildfires, but continued monitoring is needed as wildfires and climate change intensify.

Citation: Cao, Z., & Furuya, M. (2025). Decades-long evolution of post-fire permafrost deformation detected by InSAR: Insights from chronosequence in North Yukon. AGU Advances, 6, e2025AV001849. https://doi.org/10.1029/2025AV001849

—Alberto Montanari, Editor-in-Chief, AGU Advances

Text © 2025. The authors. CC BY-NC-ND 3.0

An Accessible Alternative for Undergraduate Research Experiences

Thu, 09/04/2025 - 13:33

Undergraduate research experiences (UREs) in science, technology, engineering, and mathematics (STEM) offer students hands-on research experience and essential professional skills and connections to prepare them to succeed in the workforce. They also cultivate students’ sense of belonging, confidence, and identity—and promote retention—in STEM fields [National Academies of Sciences, Engineering, and Medicine, 2017; Rodenbusch et al., 2016].

To be effective, UREs should be thoughtfully designed to meet the needs of students who may otherwise miss out on career opportunities tied to networking and community-building through such programs. Existing URE programs have followed a range of approaches, but traditionally, many have been centered around short-duration, time-intensive, individual, mentor-directed experiences, such as full-time summer internships in field or laboratory settings. However, these traits can inadvertently exclude some student populations, a concern that is leading many programs to modify their structure and design to engage broader groups.

To lower barriers to participation in UREs, we developed the Authentic Research through Collaborative Learning (ARC-Learn) program at Oregon State University (OSU). ARC-Learn, which ran from 2021 to 2024 and comprised two overlapping student cohorts, offered a long-term, low-intensity program focused on Arctic science and inclusive mentorship. It was designed to help students engage in a science community, foster identities as STEM professionals, and develop critical scientific and data literacy skills and 21st century competencies such as teamwork and communication.

Table 1. Design Features of ARC-Learn

Duration: 18 months (2 academic years)
Intensity: 2–4 hours per week
Location: On campus or remote
Mentorship: Multiple mentors working in teams with multiple students
Topic selection: Student driven
Student support: Mentors, peers, program administrators, academic advisors
Mentorship development: Inclusive mentorship training, facilitated peer learning community
Research tasks: Develop research question, find data and analyze data, draw conclusions, and present findings
Student development: Discover own strengths as researchers, work with a team, supplemental training in missing skills

ARC-Learn incorporated alternative design features (Table 1) to meet the needs of students who do not typically have access to time-intensive field or lab-based UREs, such as transfer students, remote students, and those with dependent care, military service, and other work commitments [Scott et al., 2019] or who have nontraditional enrollment patterns (e.g., dual enrollment in both university and community college, varying enrollment from term to term).

The program was framed in the context of Arctic science because of the region’s outsize effects on climate, ecosystems, and communities globally and to engage students with long-term research investments in polar regions [Marrongelle and Easterling, 2019]. The Arctic also offers a dynamic and interdisciplinary context for a URE program, enabling students to follow their interests in investigating complex science questions. In addition, numerous long-term Arctic monitoring programs offer rich datasets useful in all kinds of STEM careers.

Despite encountering challenges, the ARC-Learn model proved successful at engaging and motivating students and also adaptive as program organizers made adjustments from one cohort to the next in response to participant feedback.

The ARC-Learn Model

Each ARC-Learn cohort lasted 2 academic years and included a dozen or more students. Participants received a stipend to offset costs associated with participation, such as childcare and missed work time, and had the option of obtaining a course credit each term to meet experiential learning requirements. With support from mentors and peers, they experienced the whole research arc and gradually took ownership of their work through three key phases of the program.

Early year 1: Build research teams. Some URE mentorship models involve a mentor primarily driving selection of a research topic and the student completing the work. In ARC-Learn, students learned from multiple mentors and peers, while mentors supported each other and received feedback from students (Figure 1). The students self-selected into research teams focused on a broad topic (e.g., marine heat waves or primary productivity), then developed individual research questions based on their strengths and interests.

Fig. 1. Some models of undergraduate research experiences have involved a mostly one-way transfer of knowledge from a single mentor to a single student, with the mentor deciding the research topic and the student completing the work. In ARC-Learn, students learned from multiple mentors and peers as part of small-group research teams, while mentors supported each other and received feedback from students.

Mentor-student teams met every other week—and students met one-on-one with mentors as needed—to support individual projects. The entire cohort also met twice a month to discuss topics including the fundamentals of Arctic science and the scientific process and to report out on progress toward milestones.

Late year 1 to middle of year 2: Develop research questions and find and analyze data. With no field or lab component to the program, ARC-Learn students worked exclusively with existing data. These data came from NASA and NOAA satellite-based sources such as the Moderate Resolution Imaging Spectroradiometer (MODIS), Advanced Very High Resolution Radiometer, and Soil Moisture Active Passive (SMAP) instruments; shipboard sources such as NOAA’s Distributed Biological Observatory, the Alaska Ocean Observing System, and the University-National Oceanographic Laboratory System’s Rolling Deck to Repository; and the National Science Foundation’s (NSF) Arctic Data Center and NOAA’s National Centers for Environmental Information.
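As a concrete but hypothetical illustration of how students typically started with such archived data, the sketch below opens a NetCDF file with xarray and computes a monthly regional mean. The file name, variable name, and coordinate names are assumptions, not a specific ARC-Learn dataset.

```python
# Minimal sketch of working with an archived gridded dataset. Assumes a local
# NetCDF file with 'sst', 'lat', 'lon', and 'time' coordinates (hypothetical).
import xarray as xr

ds = xr.open_dataset("arctic_sst_example.nc")     # hypothetical local file
arctic = ds["sst"].sel(lat=slice(66.5, 90))       # assumes latitude increases northward
monthly_mean = arctic.mean(dim=["lat", "lon"]).resample(time="1MS").mean()
print(monthly_mean)
```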

Students often revised their research questions or the datasets they used multiple times to produce meaningful findings (Figure 2). Notably, access to these datasets proved critical to the educational experience of ARC-Learn students, highlighting the importance of maintaining them in public archives for future URE activities.

Fig. 2. ARC-Learn students developed their own research questions and worked exclusively with existing data to answer them. Students often revised their research questions or datasets multiple times to produce meaningful findings.

Many students struggled with finding, cleaning, analyzing, and interpreting data, often because of limited experience with tools such as geographic information system software and programming languages such as Python and R. At times, the required expertise was beyond even their mentors’ knowledge. Hands-on skill development workshops during cohort meetings connected students with additional mentors proficient in specific platforms and tools to help fill knowledge gaps and help students overcome obstacles.
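A small, hypothetical example of the routine cleaning steps involved, using pandas; the file name, column names, and thresholds are illustrative only.

```python
# Typical first-pass cleaning of a tabular observational dataset (hypothetical
# file and columns): parse dates, drop missing values, remove implausible values,
# and aggregate to monthly means.
import pandas as pd

df = pd.read_csv("buoy_observations.csv", parse_dates=["date"])
df = df.dropna(subset=["temperature_c"])          # drop rows missing the key variable
df = df[df["temperature_c"].between(-2, 35)]      # remove physically implausible values
monthly = df.set_index("date")["temperature_c"].resample("MS").mean()
print(monthly.head())
```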

Although the students encountered occasional setbacks, they reported that achievements such as settling on a final research question and creating rich data visualizations proved deeply rewarding and motivated further progress.

Late year 2: Share the results. Over several months, students created research posters with feedback and support along the way from their teammates, mentors, and the entire cohort. The program concluded with a grand finale, featuring on-campus gatherings for remote and in-person students, a dress rehearsal poster session, a celebratory dinner, and final presentations at a university-wide undergraduate research symposium.

Zoe’s Story

After a successful 7-year military career, Zoe enrolled at OSU to study the Arctic through her participation in ARC-Learn. As a student in cohort 2, she experienced several challenges along the research arc before finding success, and her experience helps illustrate the program’s model.

Zoe joined fellow students and mentors in the Marine Heatwaves research team and then narrowed her focus by exploring scientific literature and talking with her primary mentor to understand physical and chemical factors associated with marine heat waves as well as their effects on ocean ecosystems. She developed several research questions focused on how factors such as atmospheric pressure and temperature have affected the development and extent of marine heat waves off Alaska since 2010.

As Zoe and her mentor considered available datasets and relevant literature further, they realized that her questions were still too broad given the number of variables affecting ocean-atmosphere interactions. At one of the full-cohort meetings, she shared her difficulties and frustrations, prompting another mentor to offer their help. This mentor worked with Zoe to understand a key meteorological feature—the Aleutian Low—in the area she was studying, as well as relevant data available through the European Union’s Copernicus Climate Change Service [Hersbach et al., 2023] and the appropriate analysis platform.

“We jumped in and learned it together. She helped me find the right data, which in turn, allowed me to finalize my research question,” Zoe said.

Nuanced and iterative feedback from mentors and peers guided ARC-Learn participants, including Zoe, to design posters that balanced visual presentations of data alongside descriptive text to explain research findings. Credit: Ryan Brown

From that point, Zoe quickly landed on a focused question that she could address: Does a disruption in the Aleutian Low lead to marine heat waves over the North Pacific region? The final step was to develop a visually striking poster to invite attention, questions, and ideas during the research symposium.

“Seeing other people interested in my research…was validating of me as a scientist.”

Zoe’s experience at the poster session captured what we heard from many other students in the program. Even after her 2 years of being immersed in her project and working with mentors and peers, she said she felt imposter syndrome as a student trying to become a scientist and thought no one would care about her research.

“But people were really interested,” she said. “Seeing other people interested in my research, able to read and understand it on a poster, [and] ask me questions and suggest ideas was validating of me as a scientist.”

A Responsive Approach to URE Design

Through ARC-Learn, program leads sought to expand knowledge about the benefits and challenges of a long-duration, lower-intensity, team-based URE model. Because it was a design-based research program, mentor, student, and coordinator feedback was collected and continually used to make program adjustments [Preston et al., 2022, 2024].

Feedback was collected through pre-, mid-, and end-of-program surveys, as well as pre- and end-of-program interviews, and analyzed by a research and evaluation team. Findings were reported to the program leads, who also met regularly with external expert advisers to get additional recommendations for adjustments. By running two overlapping cohorts (the second started when the first was halfway completed), organizers could address issues that arose for the first cohort to improve the experience of the second one.

Lessons from ARC-Learn are documented in a practitioner guidebook, which discusses practical considerations for others interested in implementing alternative URE models [Brown et al., 2024]. In the guidebook, we examine each design component of ARC-Learn and offer recommendations for designing UREs that meet enrolled students’ specific learning needs and develop their science skills to meet relevant workforce demands.

Novel elements of the Authentic Research through Collaborative Learning (ARC-Learn) program were important in influencing participants’ persistence and success.

A few valuable lessons learned include the following.

Attrition. Expect high attrition rates in UREs designed for nontraditional students, and do not react by making drastic program changes that risk sacrificing otherwise successful program elements. We observed a 45% attrition rate in each cohort, which is indeed high but perhaps not surprising considering the population involved in the program—largely transfer students and those with caregiving or work responsibilities.

Most participants who left did so because of life crises or obligations that paused their research and educational goals. This observation embodies the complexity of students’ lives and reinforces the need for continually creative, flexible, inclusive program structures. For those who completed ARC-Learn, novel elements of the program (e.g., working in teams) were important in influencing their persistence and success.

Remote research applications. The first cohort started in 2021 entirely via remote instruction during the COVID-19 pandemic, before eventually transitioning to a hybrid approach as in-person instruction resumed. All ARC-Learn students in cohort 1 returned to campus, except one Ecampus student, who remained online. The program team and mentors struggled to balance the needs of the remote student, who eventually became somewhat detached from their research team.

As teamwork, camaraderie, and inclusivity are important qualities of the program, we decided for cohort 2 to recruit enough Ecampus students (plus two dedicated mentors) to form a research team of their own. The remote team was engaged and productive—meeting deadlines and producing high-quality work—highlighting the potential of all-remote URE models for students who might otherwise lack access to meaningful research opportunities.

Student-driven research. ARC-Learn empowered students to pursue their own research questions, fostering their autonomy and ownership of their work. However, the open-endedness of selecting their own research paths and the lack of guardrails proved challenging for participants.

We thus hired a program coordinator to provide one-on-one logistical support; establish clear expectations, timelines, and scaffolded assignments; and arrange workshops to teach programming and data analysis skills. This approach, as reported by students who worked with the coordinator, helped many program participants stay on track and ultimately complete their research project.

Mentor coordination. Enabling student success also meant supporting mentors. Organizers provided inclusive mentorship trainings and facilitated a peer learning community. They also made programmatic adjustments in response to experiences in the first cohort.

The student-driven nature of the research sometimes resulted in mismatches between student interests and mentor expertise in cohort 1. So in cohort 2, we engaged mentors earlier in the planning process to define thematic areas for the research teams, creating topics broad enough for students to find an area of interest but narrow enough for mentors to provide guidance. In addition, many mentors had field schedules typical of polar scientists, often resulting in weeks to months at sea. We purposefully paired mentors and asked about planned absences so we could fill any gaps with additional support.

Overall, students in cohort 2 reported feeling highly supported and valued by their mentors and that mentors created welcoming environments to ask questions and solve problems together.

A Foundation to Build On

Participants gained a deep understanding of the complexities and challenges of modern science as well as knowledge and skills needed in scientific education and careers.

From students’ feedback—and the research they did—it’s clear that participants who completed the ARC-Learn program gained a deep understanding of the complexities and challenges of modern science as well as knowledge and skills needed in scientific education and careers. The program thus highlights paths and lessons for others looking to develop successful alternatives to traditional UREs.

Many former ARC-Learn students are continuing to develop research skills, particularly in polar science, through internships and employment in field and lab research efforts. Zoe is working toward a bachelor’s degree in environmental sciences and exploring interests in environmental hazards, conservation, and restoration. For her, the program served as a foundation from which she is building a career and establishing confidence in herself as a scientist.

“I thought I’d have to play catch-up the whole time as an older, nontraditional student,” she said. But through the experience, “I realized I could start anywhere.”

Acknowledgments

ARC-Learn was a collaboration between OSU’s College of Earth, Ocean and Atmospheric Sciences and STEM Research Center. This work is supported by the U.S. NSF (award 2110854). Opinions, findings, conclusions, and recommendations in these materials are those of the authors and do not necessarily reflect the views of NSF.

References

Brown, R., et al. (2024), ARC-Learn Practitioner Guidebook: Practical considerations for implementing an alternative model of undergraduate research experience, Ore. State Univ., Corvallis, https://doi.org/10.5399/osu/1177.

Hersbach, H., et al. (2023), ERA5 monthly averaged data on single levels from 1940 to present, Copernicus Clim. Change Serv. Clim. Data Store, https://doi.org/10.24381/cds.f17050d7.

Marrongelle, K., and W. E. Easterling (2019), Support for engaging students and the public in polar research, Dear Colleague Letter prepared for the U.S. National Science Foundation, Alexandria, Va., www.nsf.gov/funding/opportunities/dcl-support-engaging-students-public-polar-research/nsf19-086.

National Academies of Sciences, Engineering, and Medicine (2017), Undergraduate Research Experiences for STEM Students: Successes, Challenges, and Opportunities, 278 pp., Natl. Acad. Press, Washington, D.C., https://doi.org/10.17226/24622.

Preston, K., J. Risien, and K. B. O’Connell (2022), Authentic Research through Collaborative Learning (ARC-Learn): Undergraduate research experiences in data rich Arctic science formative evaluation report, STEM Res. Cent., Ore. State Univ., Corvallis, https://doi.org/10.5399/osu/1156.

Preston, K., J. Risien, and N. Staus (2024), Authentic Research through Collaborative Learning (ARC-Learn): Undergraduate research experiences in data rich science summative evaluation report, STEM Res. Cent., Ore. State Univ., Corvallis, https://doi.org/10.5399/osu/1178.

Rodenbusch, S. E., et al. (2016), Early engagement in course-based research increases graduation rates and completion of science, engineering, and mathematics degrees, CBE Life Sci. Educ., 15(2), ar20, https://doi.org/10.1187/cbe.16-03-0117.

Scott, G. W., S. Humphries, and D. C. Henri (2019), Expectation, motivation, engagement and ownership: Using student reflections in the conative and affective domains to enhance residential field courses, J. Geogr. Higher Educ., 43(3), 280–298, https://doi.org/10.1080/03098265.2019.1608516.

Author Information

Ryan Brown (ryan.brown@oregonstate.edu), Laurie Juranek, and Miguel Goñi, College of Earth, Ocean and Atmospheric Sciences, Oregon State University, Corvallis; and Julie Risien and Kimberley Preston, STEM Research Center, Oregon State University, Corvallis

Citation: Brown, R., L. Juranek, M. Goñi, J. Risien, and K. Preston (2025), An accessible alternative for undergraduate research experiences, Eos, 106, https://doi.org/10.1029/2025EO250326. Published on 4 September 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Spacecraft Surveys Shed New Light on Auroral Kilometric Radiation

Wed, 09/03/2025 - 18:53
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Auroral Kilometric Radiation (AKR) is a type of radio wave emitted from Earth’s auroral regions. It is the dominant radio emission from Earth and has been extensively studied, though previous analyses were constrained by limited spacecraft coverage.

Today, with the availability of more spacecraft observations, it is possible to improve our understanding of Earth's most intense natural radio emission. Using these data, Wu et al. [2025] find that AKR preferentially occurs at high latitudes and on Earth's nightside. They also find that the dense plasmasphere—a region of high-density plasma surrounding Earth—blocks AKR from propagating through it, forming an equatorial shadow zone around the plasmasphere. Furthermore, the authors show that low-density ducts within the plasmasphere act as waveguides, enabling AKR to penetrate the dense plasmasphere and propagate along these channels.

The findings provide valuable insights into Earth's electromagnetic environment and into space weather events and geomagnetic storms that may adversely affect satellites, communication systems, GPS, and power grids on Earth.

Citation: Wu, S., Whiter, D. K., Zhang, S., Taubenschuss, U., Zarka, P., Fischer, G., et al. (2025). Spatial distribution and plasmaspheric ducting of auroral kilometric radiation revealed by Wind, Polar, and Arase. AGU Advances, 6, e2025AV001743. https://doi.org/10.1029/2025AV001743

—Alberto Montanari, Editor-in-Chief, AGU Advances

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Bridging Old and New Gravity Data Adds 10 Years to Sea Level Record

Wed, 09/03/2025 - 13:38

As climate change accelerates, it’s more important than ever to understand the individual drivers of sea level rise, from land subsidence and coastal erosion to changes in ocean volume. For the past 20 years, scientists have had access to high-resolution, satellite-derived maps of Earth’s gravity field, which allows them to calculate fluctuations in global ocean mass.

Recently, geodesists have found a way to push that record back 10 more years, significantly lengthening the period over which they can consistently measure global ocean mass change.

“This is the first observation-based global ocean mass time series” from 1993 to the present, said Jianli Chen, a geodesy researcher at Hong Kong Polytechnic University in China and a coauthor on the research.

By reconciling older and newer techniques for measuring ocean mass change, the team’s work improves calculations of long-term trends and provides a potential stopgap should satellite data no longer be available.

Shooting Lasers into Space

When scientists measure sea level rise, they consider two main components: how much the ocean’s volume has grown because of changes in water density—the steric component—and how much it has grown because it has gained mass from melted ice—the barystatic component.
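Written as a simple equation (with symbols chosen here for illustration, not drawn from the study), that decomposition is

$$\Delta h_{\mathrm{total}} = \Delta h_{\mathrm{steric}} + \Delta h_{\mathrm{barystatic}},$$

so an independent estimate of either component can be subtracted from the total sea level change measured by altimetry to isolate the other.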

Past estimates of total ocean mass change have relied on indirect methods like adding up mass loss from ice sheets, glaciers, and land water storage, explained Yufeng Nie, a geodesy researcher also at Hong Kong Polytechnic University and lead researcher on the new study. Mass lost from these areas is assumed to translate to an increase in ocean mass.

“But these individual estimates are not necessarily consistent, because they are developed by different groups” with different methodologies, Nie said.

In light of this, some researchers adapted satellite laser ranging (SLR), a technique in which scientists bounce ground-based lasers off orbiting satellites to track changes in ocean mass. SLR has been used for decades to measure Earth’s nonuniform gravity field by observing shifts in satellite orbits. A satellite’s altitude depends on Earth’s gravity at any given point, and gravity in turn depends on the distribution of mass beneath that point. Measuring satellite altitudes thus provides a window into measuring ocean mass changes.

“How can you observe, for example, ocean mass change from Antarctic melting using a technique with 4,000-kilometer spatial resolution?”

However, one key drawback to using SLR to measure barystatic sea level (BSL) change is that it can measure changes only on very large spatial scales, which limits its application in climate research, Chen said.

“How can you observe, for example, ocean mass change from Antarctic melting using a technique with 4,000-kilometer spatial resolution?” asked Chen.

Enter NASA’s Gravity Recovery and Climate Experiment (GRACE) missions. GRACE and its successor, GRACE Follow-On (GRACE-FO), each consisted of two satellites chasing each other along the same orbit, continuously sending laser beams back and forth. Like SLR, this process allowed the GRACE missions to provide maps of Earth’s surface mass, but at 10 times the resolution of SLR. And like with SLR, scientists have used GRACE gravity maps to track global ocean mass change.

But GRACE data, too, have their caveats. The first GRACE mission spanned 2002–2017, and GRACE-FO has spanned from 2018 to the present—a short time for understanding long-term trends. What's more, the 11-month gap between GRACE and its successor meant that scientists were not able to cross-calibrate the two missions, leaving some uncertainty about systematic differences between them.

A Near-Perfect Match

Nie, Chen, and their team were able to address both of these caveats by comparing SLR-based measurements of global ocean mass change with those from GRACE/-FO for the same time period, 2003–2022.

According to gravity maps provided by SLR, barystatic sea level change was 2.16 millimeters per year from 2003 to 2022, while GRACE/-FO measured 2.13 millimeters per year.

The new analysis shows that SLR and GRACE/-FO “agree quite well for the long-term trends,” Nie said. What’s more, researchers found no significant change in the calculation when the data transitioned from GRACE to GRACE-FO. “This gives us confidence that the SLR data, although it is of very low spatial resolution, can be used to tell us the ocean mass variations before 2002,” he added.

“Our SLR measurements…can provide a global constraint of the mass changes for the pre-GRACE era.”

The researchers were able to extend the time frame of their analysis back to 1993 by using SLR data, and they calculated a barystatic sea level change of 1.75 millimeters per year for 1993–2022. They attribute the lower rate of sea level rise in the past to recent acceleration of ice loss in Greenland.
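As a rough back-of-the-envelope check, the Python sketch below uses only the trend values quoted above, treats them as linear, and approximates the two periods as 20 and 30 years; it shows that the reported rates imply several centimeters of cumulative barystatic rise.

```python
# Back-of-the-envelope arithmetic using only the rates quoted in the article,
# treated as linear trends over approximate spans; illustration only.
rates_mm_per_yr = {
    "SLR, 2003-2022": 2.16,
    "GRACE/-FO, 2003-2022": 2.13,
    "SLR, 1993-2022": 1.75,
}
spans_yr = {"2003-2022": 20, "1993-2022": 30}

for label, rate in rates_mm_per_yr.items():
    span = spans_yr[label.split(", ")[1]]
    print(f"{label}: ~{rate * span:.0f} mm of cumulative barystatic rise over ~{span} years")
```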

“Our SLR measurements…can provide a global constraint of the mass changes for the pre-GRACE era,” Nie said.

This study was published in Proceedings of the National Academy of Sciences of the United States of America in June.

“Extending the record of measured BSL using satellite laser ranging back to 1993 is an important achievement,” said Bryant Loomis, chief of the Geodesy and Geophysics Laboratory at NASA’s Goddard Space Flight Center in Greenbelt, Md. “It allows the disaggregation of total sea level change, which is measured by altimetry, into its barystatic and steric components.”

“The long-term BSL estimate is also useful for assessing the accuracy of previous efforts to quantify the major land ice contributions to BSL prior to the launch of GRACE,” he added, referring to the method of adding together mass changes from glaciers, ice sheets, and land water storage. Loomis was not involved in the new research.

Nie, Chen, and their team are working to push the limits of SLR-derived barystatic sea level measurements to smaller spatial scales and lower uncertainties. They hope to demonstrate that SLR data can be used to measure mass change in Antarctica.

GRACE Continuity?

GRACE-FO launched in 2018 and is 7 years into its nominal 5-year mission. The satellites are in good health, and the nearly identical GRACE mission set a good precedent—it lived for more than 15 years. GRACE-FO might well overlap with its planned successor, GRACE-Continuity (GRACE-C), which is scheduled to launch in 2028.

The GRACE missions are designed to measure minute changes in Earth’s gravity at high spatial resolution. However, there was a coverage gap between the end of the GRACE mission and the start of GRACE-FO, and there may be a similar gap between GRACE-FO and GRACE-C. Credit: NASA/JPL-Caltech, Public Domain

However, recent woes for federally funded science in the United States have put GRACE-C’s future in doubt. Although NASA requested funding for GRACE-C for fiscal year 2026 through the mission’s launch, NASA’s acting administrator, Sean Duffy, recently stated his, and presumably President Donald Trump’s, desire to eliminate all Earth science at the agency (including healthy satellites). That cutback would likely nix GRACE-C.

In the near future, both Europe and China plan to launch satellite-to-satellite laser ranging missions that will provide GRACE-like measurements of Earth’s gravity, Chen said. However, the loss of GRACE-quality data would hamper climate scientists’ ability to accurately track drivers of sea level rise, he added. The SLR-derived measurements demonstrated in this recent research could help mitigate the loss, but only somewhat.

“There’s no way SLR can reach the same [resolution] as GRACE,” Chen said. “We can only use SLR to see the long-term, the largest scale, to fill the gap. But for many of GRACE’s applications—regional water storage or glacial mass change—no, there’s no way SLR can help.”

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

Citation: Cartier, K. M. S. (2025), Bridging old and new gravity data adds 10 years to sea level record, Eos, 106, https://doi.org/10.1029/2025EO250321. Published on 3 September 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

First Species-Level Assessment Reveals Extinction Risk in Mesoamerica

Wed, 09/03/2025 - 13:35

This is an authorized translation of an Eos article.

Reforestation is more complex than simply planting trees. It involves assessing habitats and ecosystems, evaluating the health and sustainability of different species, and studying strategies for establishing new stands of trees.

In regions such as Mesoamerica, where forests are severely threatened by human activities and climate change, conservationists interested in reforestation must prioritize species whose populations are declining. To aid that work, a group of researchers assessed the conservation status of the 4,046 tree species endemic to Mesoamerica described in the Global Tree Assessment project. They found that 46% of these trees face some risk of extinction.

The study is the first to assess the status of all endemic trees in Mesoamerica.

The study, published in the journal Plants, People, Planet, is the first to assess the status of all endemic trees in Mesoamerica.

Emily Beech, lead author of the study and head of conservation at Botanic Gardens Conservation International, emphasized the importance of focusing on this region because of its high levels of biodiversity, which are often underrepresented. The Central American countries (Belize, Costa Rica, El Salvador, Guatemala, Honduras, Nicaragua, and Panama), Beech said, rarely rank among the most biodiverse nations or as home to the largest numbers of endangered species. That absence is not due to a lack of biodiversity, she explained, but simply to their size: These small countries are overshadowed by large countries with more extensive forests, such as Brazil and the Democratic Republic of the Congo. Yet together with Mexico, Central America harbors 10% of the world's plant diversity despite accounting for less than 1% of its land area.

To address this gap, the scientists first identified endemic Mesoamerican trees from assessments submitted to the International Union for Conservation of Nature (IUCN) Red List of Threatened Species. Then, to evaluate the trees' conservation status, the researchers overlaid distribution maps of the selected tree species on maps from the World Database on Protected Areas.

Of the 4,046 tree species analyzed, they found that 1,867 are at risk of extinction. Mexico was the only country with tree species listed in the database as extinct or extinct in the wild. Among extant trees, Mexico and Costa Rica had the largest numbers of threatened species, with 888 and 227, respectively. The most common threat overall was habitat loss driven by agricultural expansion.

Most of the species (3,349) had at least one data point within a protected area. However, 72% of the Mesoamerican species in protected areas are threatened.

A Tailored Approach

Neptalí Ramírez Marcial was not involved in the new research, but as head of the restoration group at South Border College in Mexico, he works with tree species that fall into different threat categories. The forests of Chiapas, where he and his colleagues live, were once full of oaks, which harbored high levels of biodiversity. Because of human influence, there are now more pines than oaks, and the climate is less favorable for sensitive species on the IUCN Red List.

Although Ramírez Marcial uses the Red List, he remains critical of the tool and its use in research. For example, he noted that the new assessment of Mesoamerican trees classifies Furcraea macdougallii (MacDougall's century plant) as extinct in Mexico. Ramírez Marcial believes the plant is similar to an agave and should not be considered a tree at all, and therefore should not have been included in the study.

He also pointed out that the new study treats all of Mexico as part of Mesoamerica. Ecologically, he said, the Mesoamerican biogeographic region extends only through central Mexico and excludes the northern part of the country, which has distinct ecosystems not shared with Central America.

Ocotea monteverdensis "went from not even being on the list to being in the most vulnerable conservation category."

Ramírez Marcial agreed with the new study's conclusions but argued that restoration strategies must account for the biodiversity of the areas being protected. For example, he noted that Mexican government programs prioritize distributing pines for reforestation across the country rather than designing strategies tailored to each region.

Daniela Quesada, a conservationist at the Instituto Monteverde in Costa Rica, said the new study offers a more complete picture of the status of trees in Mesoamerica. Nevertheless, like Ramírez Marcial, she regards IUCN Red List information as a starting point for research. The accuracy of the Red List, she explained, depends on how much information is submitted to it.

Quesada said the next step for tree conservation in Mesoamerica is for scientists to "look in more detail at each species that appeared" in the new study. A rigorous analysis of each species' presence and influence in each region could shape the development of targeted conservation projects.

As an example, she cited Ocotea monteverdensis, a tree that "went from not even being on the list to being in the most vulnerable conservation category" (critically endangered) thanks to the work of ecologist John Devereux Joslin Jr. That recognition led to the development of a dedicated, ongoing community conservation program for the tree.

—Roberto González (@perrobertogg.bsky.social), Science Writer

This translation by Oriana Venturi Herrera (@OrianaVenturiH) was made possible by a partnership with Planeteando and GeoLatinas.

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Experienced Researcher Book Publishing: Sharing Deep Expertise

Wed, 09/03/2025 - 12:00
Editors’ Vox is a blog from AGU’s Publications Department.

Being an experienced researcher can come with a lot of heavy professional responsibilities, such as leading grant proposals, managing research teams or labs, supervising doctoral students and postdoctoral scientists, serving on committees, mentoring younger colleagues … the list goes on. This may also be a time filled with greater personal responsibilities beyond the job. Why add to the workload by taking on a book project? In the third installment of career-focused articles, three scientists who wrote or edited books as experienced researchers reflect on their motivations and how their networks paved the way for—and grew during—the publishing process.

Douglas Alsdorf co-edited Congo Basin Hydrology, Climate, and Biogeochemistry: A Foundation for the Future, which discusses new scientific discoveries in the Congo Basin and is published in both English and French. Nancy French co-edited Landscape Fire, Smoke, and Health: Linking Biomass Burning Emissions to Human Well-Being, which presents a foundational knowledge base for interdisciplinary teams to interact more effectively in addressing the impacts of air pollution. Michael Liemohn authored Data Analysis for the Geosciences: Essentials of Uncertainty, Comparison, and Visualization, a textbook on scientific data analysis and hypothesis testing in the Earth, ocean, atmospheric, space, and planetary sciences. We asked these scientists why they decided to write or edit a book, what impacts they saw as a result, and what advice they would impart to prospective authors and editors.

Why did you decide to write or edit a book? Why at that point in your career?

ML: I was assigned to develop a new undergraduate class on data-model comparison techniques. I realized that the best textbooks for it were either quite advanced or rather old. One book I love included the line, “if the student has access to a computer…” in one of the homework questions. I also was not finding a book with the content set that I wanted to cover in the class. So, I developed my own course content set and note pack, which provided the foundation for the chapters of the book.

DA: Our 2022 book was a result of a 2018 AGU Chapman Conference in Washington, DC, that I was involved in organizing. About 100 researchers, including 25 from sub-Saharan Africa, attended the conference, and together we decided that an edited book in the AGU Geophysical Monograph Series would become a launching point for the next decade of research in the Congo Basin.

The motivation for the book was not to advance my career, but because the topic was important to get out there.

NF: The motivation for the book was not to advance my career, but because the topic was important to get out there. The book looks at how science is trying to better inform how to manage smoke from wildland fires. The work was important because people in fire, smoke modeling, and health sciences do not work together often, and there were some real misconceptions about how others do the research and how detailed the topics can be.

What were some benefits of completing a book as an experienced researcher? 

NF: Once you have been working in a field for a while you want to see how your deep expertise can benefit more than just the community of researchers that you know or know of. Reaching into other disciplines allows you to understand how your work can have broader impact. And, you are ready to know more about other, adjacent topics, rather than a deeper view of what you know already. I think these feelings grow more true as you move to later stages of a career.

I think that I would have greatly struggled with this breadth of content if I had tried to write this particular book 10 years earlier.

ML: I was developing my data-model comparison techniques course and textbook for all students in my department, so I wanted to include examples across that diverse list of disciplines—Earth, atmosphere, space, and planetary sciences. Luckily, over the years I had taught a number of classes spanning these topics. Additionally, I had attended quite a few presentations across these fields, not only at seminars on campus but also at the annual AGU meeting. I felt comfortable including examples and assignments from all these topics. Also, I knew colleagues in these fields, and I called on them for advice when I got stuck. I think that I would have greatly struggled with this breadth of content if I had tried to write this particular book 10 years earlier.

What impact do you hope your book will have?

The next great discoveries will happen in the Congo Basin and our monograph motivates researchers toward those exciting opportunities. 

DA: There are ten times fewer peer-reviewed papers on the Congo Basin compared to the Amazon Basin. Our monograph changes that! We have brought new attention to the Congo Basin, demonstrating to the global community of Earth scientists that there is a large, vibrant group of researchers working daily in the Congo Basin. The next great discoveries will happen in the Congo Basin and our monograph motivates researchers toward those exciting opportunities. 

ML: I hope that the book has two major impacts. The first expected benefit is to the students that use it with a course on data-model comparison methods. I want it to be a useful resource regardless of their future career direction. The second impact I wish for is on Earth and space scientist researchers; I hope that our conversations about data-model comparisons are ratcheted up to a higher level, allowing us to more thoughtfully conduct such assessments and therefore maximize scientific progress.

What advice would you give to experienced researchers who are considering pursuing a book project?

NF: Here are a few thoughts: One: Choose co-authors, editors, and contributors that you can count on. Don’t try to “mend fences” with people you have not been able to connect with. That said, if you do admire a specific person or know their point of view is valuable, this is the time to overcome any barriers to your relationship. Two: Give people assignments, and they will better understand your point of view. Three: Listen to your book production people. They are all skilled professionals who know more about this than you do. They can be great allies in getting it done!

DA: Do it! Because we publish papers, our thinking tends to focus on the one topic of a particular paper. A book, however, broadens our thinking so that we more fully understand the larger field of work. Each part of that bigger space has important advances as well as unknowns that beg for answers. A book author who can see each one of these past solutions and future challenges becomes a community resource who provides insights and directions for new research. 

—Douglas Alsdorf (alsdorf.1@osu.edu, 0000-0001-7858-1448), The Ohio State University, USA; Nancy French (nhfrench@mtu.edu, 0000-0002-2389-3003), Michigan Tech Research Institution, USA; and Michael Liemohn (liemohn@umich.edu, 0000-0002-7039-2631), University of Michigan, USA

This post is the third in a set of three. Learn about leading a book project as an early-career or mid-career researcher.

Citation: Alsdorf, D., N. French, and M. Liemohn (2025), Experienced researcher book publishing: sharing deep expertise, Eos, 106, https://doi.org/10.1029/2025EO255028. Published on 3 September 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Inside a Georgia Beach’s High-Tech Fight Against Erosion

Tue, 09/02/2025 - 13:09

This story was originally published by Grist. Sign up for Grist’s weekly newsletter here. This coverage is made possible through a partnership between Grist and WABE, Atlanta’s NPR station.

At low tide on Tybee Island, Georgia, the beach stretches out as wide as it gets with the small waves breaking far away across the sand—you’ll have a long walk if you want to take a dip. But these conditions are perfect for a team of researchers from the University of Georgia’s Skidaway Institute of Oceanography.

Every three months, at low tide, they set out a miniature helipad near the foot of the dune and send up their drone equipped with lidar—technology that points a laser down at the sand and uses it to measure the elevation of the beach and dunes. The team flies it back and forth from the breakers to the far side of the dune and back until they have a complete, detailed map of the island’s 7-mile beach, about 400 acres.

“I see every flip-flop on the beach.”

“It’s high accuracy, it’s a high resolution,” explained research technician Claudia Venherm, who leads this project. “I see every flip-flop on the beach.”

That detailed information is crucial because Tybee is a barrier island, and rising seas are constantly eating away at the sandy beach and dunes that protect the island’s homes and businesses as well as a stretch of the Georgia mainland. Knowing exactly where the island is eroding and how the dunes are holding up to constant battering can help local leaders protect this piece of coastline.

“Tybee wants to retain its beach. It also wants to maintain, obviously, its dune. It’s a protection for them,” said Venherm. “We also give some of our data to the Corps of Engineers so they know what’s going on and when they have to renourish the beach.”

Since the 1970s, the Army Corps of Engineers has helped maintain Tybee Island's beaches with regular renourishment: Every seven years or so, the Corps dredges up sand from the ocean floor and deposits it on the beach to replace sand that has washed away. The data from the Skidaway team will only help the Corps do this work more effectively. Lidar isn't new, and neither is aerial coastal mapping. Several federal agencies monitor coastlines with lidar, but those surveys are typically several years apart for any one location, rather than a few months.

The last renourishment finished in January 2020, and Venherm and her team got to work a few months later. That means they have five years of high-resolution beach data, recorded every three months and after major storms like Hurricane Helene, creating a precise picture of how the beach is changing.

“I can compute what the elevation of the dune is, as well as how much volume has been lost or gained since a previous survey.”

“I can compute what the elevation of the dune is, as well as how much volume has been lost or gained since a previous survey,” said Venherm. “I can also compute how long it will take until the beach is completely gone, or how long will it take until water reaches the dune system.”
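A minimal sketch of that DEM-differencing idea appears below, assuming two co-registered elevation grids from successive surveys. This is illustrative Python with synthetic arrays, not the Skidaway Institute's processing code.

```python
# Illustrative DEM differencing: subtract two gridded elevation surveys and
# convert the elevation change to a net sand volume change.
import numpy as np

def volume_change(dem_new: np.ndarray, dem_old: np.ndarray, cell_size_m: float) -> float:
    """Net volume change (cubic meters) between two co-registered elevation grids."""
    diff = dem_new - dem_old                      # meters of elevation gain (+) or loss (-)
    return float(np.nansum(diff) * cell_size_m**2)

# Hypothetical 1-meter-resolution grids standing in for two quarterly lidar surveys.
rng = np.random.default_rng(0)
old = rng.normal(loc=2.0, scale=0.3, size=(100, 100))
new = old - 0.05                                  # a uniform 5-centimeter loss, for illustration
print(f"Net change: {volume_change(new, old, cell_size_m=1.0):.0f} m^3")
```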

The Corps conducts regular renourishment projects on beaches all along the East Coast, and uses a template to inform that work, said Alan Robertson, a consultant who leads the city of Tybee’s resilience planning. But he hopes that such granular evidence of specific changes over time can shift where exactly the sand gets placed within the bounds of that template. An area near the island’s north end, for instance, is a clear hot spot for erosion, so the city may push for concentrating sand there, and north of that point so that it can travel south to fill in the erosion.

“We know exactly where the hotspots of erosion are. We know where there’s accretion,” he said, referring to areas where sand tends to build up. “[We] never had that before.”

The data can also inform the city’s own decision-making, because it provides a much clearer picture of what happens to the dunes and beach over time after the fresh sand is added. In the past, they’ve been able to see the most obvious erosion, but now they can compare how different methods of dune-building and even sources of sand hold up. The vegetation that’s critical to holding dunes together, for instance, takes root far better in sand dredged from the ocean compared to sand trucked in from the mainland, Robertson said.

“There’s an example of the research and the monitoring. I actually can make that statement,” he said. “I actually know where you should get your sand from if you can, and why. No one could have told you that eight years ago.”

That sort of proven information is key in resilience projects, which are often expensive and funded by grants from agencies that want confirmation their money is being spent well.

“Everything we do now on resiliency, measuring, and monitoring has become a priority,” said Robertson. “We’ve been able over these years through proof statements of ‘look at what this does for you’ to make it part of the project.”

—Emily Jones (@ejreports.bsky.social), Grist

This article originally appeared in Grist at https://grist.org/science/inside-a-georgia-beachs-high-tech-fight-against-erosion/.

Grist is a nonprofit, independent media organization dedicated to telling stories of climate solutions and a just future. Learn more at Grist.org

How Researchers Have Studied the Where, When, and Eye of Hurricanes Since Katrina

Fri, 08/29/2025 - 12:02

On 28 August 2005, New Orleans area residents received a bulletin from the National Weather Service (NWS) office in Slidell, La., warning them of “a most powerful hurricane with unprecedented strength.” One excerpt of the chilling announcement, issued via NOAA radio and the Federal Communications Commission’s Emergency Alert Service, read,

BLOWN DEBRIS WILL CREATE ADDITIONAL DESTRUCTION. PERSONS…PETS…AND LIVESTOCK EXPOSED TO THE WINDS WILL FACE CERTAIN DEATH IF STRUCK.

POWER OUTAGES WILL LAST FOR WEEKS…AS MOST POWER POLES WILL BE DOWN AND TRANSFORMERS DESTROYED. WATER SHORTAGES WILL MAKE HUMAN SUFFERING INCREDIBLE BY MODERN STANDARDS.

Hurricane Katrina, which caused 1,833 fatalities and about $108 billion in damage (more than $178 billion in 2025 dollars), remains the costliest hurricane on record to hit the United States and among the top five deadliest.

“If we were to have a Katrina today, that [forecast] cone would be half the size that it was in 2005.”

In the 20 years since the hurricane, meteorologists, modelers, computer scientists, and other experts have worked to improve the hurricane forecasting capabilities that inform bulletins like that one.

Consider the forecast cone, for instance. Also known as the cone of uncertainty, this visualization outlines the likely path of a hurricane with decreasing specificity into the future: The wider part of the cone might represent the forecasted path 36 hours in advance, and the narrower part might represent the forecasted path 12 hours in advance.

“If we were to have a Katrina today, that cone would be half the size that it was in 2005,” said Jason Beaman, meteorologist-in-charge at the National Weather Service Mobile/Pensacola office.

How to Make a Hurricane

The ingredients for a hurricane boil down to warm water and low pressure. When an atmospheric low-pressure area moves over warm ocean water, surface water evaporates, rises, then condenses into clouds. Earth’s rotation causes the mass of clouds to spin as the low pressure pulls air toward its center.

Storms born in the Gulf of Mexico or that traverse it, as Katrina did, benefit from the body’s sheltered, warm water, and the region’s shallow continental shelf makes storm surges particularly destructive for Gulf Coast communities.

Hurricanes gain strength as long as they remain over warm ocean waters. But countless factors contribute to how intense a storm becomes and what path it takes, from water temperature and wind speed to humidity and proximity to the equator.

Because predicting the behavior of hurricanes requires understanding how they work, data gathered by satellites, radar, and aircraft are crucial for researchers. Feeding these data into computer simulations helps researchers understand the mechanisms behind hurricanes and predict how future storms may behave.

“Since 2005, [there have been] monumental leaps in observation skill,” Beaman said.

Seeing a Storm More Clearly

Many observations of the weather conditions leading up to hurricanes come from satellites, which can offer a year-round bird’s-eye view of Earth.

NOAA operates a pair of geostationary satellites that collect imagery and monitor weather over the United States and most of the Atlantic and Pacific oceans. The mission, known as the Geostationary Operational Environmental Satellite (GOES) program, has been around since 1975; the current satellites are GOES-18 and GOES-19.

When Beaman started his career just a few years before Katrina hit, satellite imagery from GOES-8 to GOES-12 was typically beamed to Earth every 30–45 minutes—sometimes as often as every 15 minutes. Now it’s routine to receive images every 5 minutes or even as often as every 30 seconds. Having more frequent updates makes for much smoother animations of a hurricane’s track, meaning fewer gaps in the understanding of a storm’s path and intensification.

For Beaman, the launch of the GOES-16 satellite in 2016 marked a particularly important advance: In addition to beaming data to scientists more frequently, it scanned Earth with 4 times the resolution of the previous generation of satellites. It could even detect lightning flashes, which can sometimes affect the structure and intensity of a hurricane.

The transition to GOES-16 “was like going from black-and-white television to 4K television.”

The transition to GOES-16 “was like going from black-and-white television to 4K television,” Beaman said.

NOAA also has three polar-orbiting satellites, launched between 2011 and 2017, that orbit Earth from north to south 14 times a day. As part of the Joint Polar Satellite System (JPSS) program, the satellites’ instruments collect data such as temperature, moisture, rainfall rates, and wind for large swaths of the planet. They also provide microwave imagery using radiation emitted from water droplets and ice. NOAA’s earlier polar-orbiting satellites had lower resolution at the edges of scans, a more difficult time differentiating clouds from snow and fog, and less accurate measurements of sea surface temperature.

“With geostationary satellites, you’re really just looking at the cloud tops,” explained Daniel Brown, branch chief of the Hurricane Specialist Unit at NOAA’s National Hurricane Center in Miami. “With those microwave images, you can really kind of see into the storm, looking at structure, whether an eye has formed. It’s really helpful for seeing the signs of what could be rapid intensification.”

NOAA’s Geostationary Operational Environmental Satellites (GOES) monitor weather over the United States and most of the Atlantic and Pacific oceans. Credit: NOAA/Lockheed Martin, Public Domain

Rapid intensification is commonly defined as an increase in maximum sustained wind speed of 30 or more nautical miles per hour in a 24-hour period. Katrina had two periods of rapid intensification, and they were one reason the storm was so deadly. In the second period, the storm strengthened from a low-end category 3 hurricane (in which winds blow between 178 and 208 kilometers per hour, or between 111 and 129 miles per hour) to a category 5 hurricane (in which winds blow faster than 252 kilometers per hour, or 157 miles per hour) in less than 12 hours.
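In code, the rapid intensification criterion described above amounts to a simple sliding-window check on a storm's maximum sustained winds. The Python sketch below uses hypothetical 6-hourly wind values, not Katrina's actual best-track record.

```python
# Check the rapid intensification criterion: an increase of 30 knots or more
# in maximum sustained winds within any 24-hour window. Values are hypothetical.
def rapid_intensification(winds_kt, hours_per_step=6, threshold_kt=30, window_hr=24):
    steps = window_hr // hours_per_step
    return any(
        winds_kt[i + steps] - winds_kt[i] >= threshold_kt
        for i in range(len(winds_kt) - steps)
    )

example = [70, 80, 95, 110, 125, 140]    # knots, every 6 hours (hypothetical)
print(rapid_intensification(example))     # True: a 55-knot jump within 24 hours
```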

New Angles

Radar technology has also made strides in the decades since Katrina. Hurricane-tracking radar works via a ground- or aircraft-based transmitter sending out a radio signal. When the signal encounters an obstacle in the atmosphere, such as a raindrop, it bounces back to a receiver. The amount of time it takes for the signal to return provides information about the location of the obstacle.
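The ranging arithmetic behind that last step is simple: Because the signal travels out to the target and back at roughly the speed of light, the distance is half the round-trip time multiplied by that speed (ignoring atmospheric effects). A brief Python illustration:

```python
# Radar ranging: distance is half the round-trip travel time times the signal speed.
C_M_PER_S = 299_792_458  # speed of light in a vacuum, meters per second

def target_range_km(round_trip_seconds: float) -> float:
    return C_M_PER_S * round_trip_seconds / 2 / 1000

# A 1-millisecond round trip corresponds to a target roughly 150 kilometers away.
print(f"{target_range_km(1e-3):.0f} km")
```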

Between 2011 and 2013, NWS upgraded its 150+ ground-based radars throughout the United States with dual-polarization technology—a change a 2013 NWS news release called “the most significant enhancement made to the nation’s radar network since Doppler radar was first installed in the early 1990s.”

So-called dual-pol technology sends both horizontal and vertical pulses through the atmosphere. With earlier technology, a radar signal might tell researchers only the location of precipitation. Dual-pol can offer information about how much precipitation is falling, the sizes of raindrops, and the type of precipitation or can even help researchers identify debris being transported in a storm.

Credit: NOAA

“That’s not something that we had back in Katrina’s time,” Beaman said. In 2005, forecasters used “much more crude ways of trying to calculate, from radar, how much rain may have fallen.”

Radar updates have become more frequent as well. Beaman said his office used to receive routine updates every 5 or 6 minutes. Now they receive updated radar imagery as often as every minute.

Hunting Hurricanes from the Skies

For a more close-up view of a hurricane, NOAA and the U.S. Air Force employ Hurricane Hunters—planes that fly directly through or around a storm to take measurements of pressure, humidity, temperature, and wind speed and direction. These aircraft also scan the storms with radar and release devices called dropwindsondes, which take similar measurements at various altitudes on their way down to the ocean.

NOAA’s P-3 Orion planes and the 53rd Weather Reconnaissance Squadron’s WC-130J planes fly through the eyes of storms. NOAA’s Gulfstream IV jet takes similar measurements from above hurricanes and thousands of square kilometers around them, also releasing dropwindsondes along the way. These planes gather information about the environment in which storms form. A 2025 study showed that hurricane forecasts that use data from the Gulfstream IV are 24% more accurate than forecasts based only on satellite imagery and ground observations.

The NOAA P-3 Hurricane Hunter aircraft captured this image from within the eye of Hurricane Katrina on 28 August 2005, 1 day before the storm made landfall. Credit: NOAA, Public Domain

Hurricane Hunters’ tactics have changed little since Katrina, but Brown said that in the past decade or so, more Hurricane Hunter data have been incorporated into models and have contributed to down-to-Earth forecasting.

Sundararaman “Gopal” Gopalakrishnan, senior meteorologist with NOAA’s Atlantic Oceanographic and Meteorological Laboratory’s (AOML) Hurricane Research Division, emphasized that Hurricane Hunter data have been “pivotal” for improving both the initial conditions of models and the forecasting of future storms.

With Hurricane Hunters, “you get direct, inner-core structure of the storm,” he said.

Hurricane Hunters are responsible for many of the improvements in hurricane intensity forecasting over the past 10–15 years, said Ryan Torn, an atmospheric and environmental scientist at the University at Albany and an author of the recent study about Gulfstream IVs. One part of this improvement, he explained, is that NOAA began flying Hurricane Hunters not just for the largest storms but for weaker and smaller ones as well, allowing scientists to compare what factors differentiate the different types.

“We now have a very comprehensive observation dataset that’s come from years of flying Hurricane Hunters into storms,” he said. These datasets, he added, make it possible to test how accurately a model is predicting wind, temperature, precipitation, and humidity.

In 2021, NOAA scientists also began deploying uncrewed saildrones in the Caribbean Sea and western Atlantic to measure changes in momentum at the sea surface. The drones are designed to fill observational gaps between floats and buoys on the sea surface and Hurricane Hunters above.

Modeling Track and Intensity

From the 1980s to the early 2000s, researchers were focused on improving their ability to forecast the path of a hurricane, not necessarily what that hurricane might look like when it made landfall, Gopalakrishnan explained.

Brown said a storm’s track is easier to forecast than its intensity because a hurricane generally moves “like a cork in the stream,” influenced by large-scale weather features like fronts, which are more straightforward to identify. Intensity forecasting, on the other hand, requires a more granular look at factors ranging from wind speed and air moisture to water temperature and wind shear.

Storms like 2005’s Katrina and Rita “showed the importance of [tracking a storm’s] intensity, especially rapid intensification.”

Gopalakrishnan said storms like 2005’s Katrina and Rita “showed the importance of [tracking a storm’s] intensity, especially rapid intensification.”

Without intensity forecasting, Gopalakrishnan said, some of the most destructive storms might appear “innocuous” not long before they wreak havoc on coastlines and lives. “Early in the evening, nobody knows about it,” he explained. “And then, early in the morning, you see a category 3 appear from nowhere.”

Gopalakrishnan came to AOML in 2007 to set up both the Hurricane Modeling Group and NOAA's Hurricane Forecast Improvement Project. He had begun working on what is now known as the Hurricane Weather Research and Forecasting model (HWRF) in 2002 in his role at NOAA's Environmental Modeling Center. With the formation of the Hurricane Modeling Group in 2007, scientists decided to focus on using HWRF to forecast intensity changes.

HWRF used a technique called moving nests to model the path of a storm in higher resolution than surrounding areas. Gopalakrishnan compared a nest to using a magnifying glass focused on the path of a storm. Though a model might simulate a large area to provide plenty of context for a storm’s environment, capturing most of an area in lower resolution and the storm path itself in higher resolution can save computing power.

By 2014, Gopalakrishnan said, the model's tracking and intensity forecasting capabilities had improved 25% since 2007. The model's resolution also improved, from 9 kilometers in 2007 to 1.5 kilometers by the time it was retired in 2023.

Since 2007, the National Hurricane Center’s official (OFCL) track forecast errors decreased between 30% and 50%, and intensity errors shrank by up to 55%. MAE = mean absolute error; VMAX = maximum sustained 10-meter winds. Credit: Alaka et al., 2024, https://doi.org/10.1175/BAMS-D-23-0139.1

Over time, advances in how data are introduced into models meant that the better data researchers were receiving from satellites, radars, and Hurricane Hunters improved modeling abilities even further. Gopalakrishnan estimated that by 2020, his office could predict hurricane track and intensity with somewhere between 50% and 54% more accuracy than in 2007.

NOAA began transitioning operations to a new model known as the Hurricane Analysis and Forecast System (HAFS) in 2019, and HAFS became the National Hurricane Center’s operational forecasting model in 2023. HAFS, developed jointly by several NOAA offices, can more reliably forecast storms, in part by increasing the use of multiple nests—or multiple high-resolution areas in a model—to follow multiple storms at the same time. HAFS predicted the rapid intensification of Hurricanes Helene and Milton in 2024.

Just as they did with HWRF, scientists run multiple versions of HAFS each year: an operational model, used to inform the public, and a handful of experimental models. At the end of hurricane season, researchers examine which versions performed the best and begin combining elements to develop the next generation of the operational model. The team expects that as HAFS improves, it will extend forecasts beyond the 5 days offered by previous models.

“As a developer [in 2007], I would have been happy to even get 2 days forecast correctly,” Gopalakrishnan said. “And today, I’m aiming to get a 7-day forecast.”

NOAA’s budget plan for 2026 could throw a wrench into this progress, as it proposes eliminating all NOAA labs, including AOML.

The Role of Communication

An accurate hurricane forecast does little good if the information isn’t shared with the people who need it. And communication about hurricane forecasts has seen its own improvements in the past 2 decades. NWS has partnered with social scientists to learn how to craft the most effective messages for the public, something Beaman said has paid dividends.

Communication between the National Hurricane Center and local weather service offices can now happen over video calls rather than by phone, as was once the case. Sharing information visually can make these calls more straightforward and efficient. NWS began sending wireless emergency alerts directly to cell phones in 2012.

In 2017, the National Hurricane Center began issuing storm surge watches and warnings in addition to hurricane watches and warnings. Beaman said storm surge inundation graphics, which show which areas may experience flooding, may have contributed to a reduction in storm surge–related fatalities. In the 50-year period between 1963 and 2012, around 49% of storm fatalities were related to storm surge, but by 2022, that number was down to 11%.

“You take [the lack of visualization] back to Katrina in 2005, one of the greatest storm surge disasters our country has seen, we’re trying to express everything in words,” Beaman said. “There’s no way a human can properly articulate all the nuances of that.”

Efforts to create storm data visualization go beyond NOAA.

Carola and Hartmut Kaiser moved to Baton Rouge, La., just weeks before Hurricane Katrina made landfall. Hartmut, a computer scientist, and Carola, an information technology consultant with a cartography background, were both working at Louisiana State University. When the historic storm struck, Hartmut said they wondered, “What did we get ourselves into?”

Shortly after the storm, the Kaisers combined their expertise and began work on the Coastal Emergency Risks Assessment (CERA). The project, led by Carola, is an easy-to-use interface that creates visual representations of data, including storm path, wind speed, and water height, from the National Hurricane Center, the Advanced Circulation Model (ADCIRC), and other sources.

The Coastal Emergency Risks Assessment tool aims to help the public understand the potential timing and impacts of storm surge. Here, it shows a forecast cone for Hurricane Erin in August 2025, along with predicted maximum water height levels. Credit: Coastal Emergency Risks Assessment

“We know of a lot of people who said, ‘Yes, thank you, [looking at CERA] caused me to evacuate.’”

What started as an idea for how to make information more user-friendly for the public, emergency managers, and the research community grew quickly: Hundreds of thousands of people now use the tool during incoming storm events, Hartmut said. The Coast Guard often moves its ships to safe regions on the basis of CERA’s predictions, and the team frequently receives messages of thanks.

“We know of a lot of people who said, ‘Yes, thank you, [looking at CERA] caused me to evacuate,’” Hartmut said. “And now my house is gone, and I don’t know what would have happened if I didn’t go.”

Looking Forward

Unlike hurricane season itself, the work of hurricane modelers has no end. When the season is over, teams such as Gopalakrishnan’s review the single operational and several experimental models that ran throughout the season, then work all year on building an upgraded operational model.

“It’s 365 days of model developments, testing, and evaluation,” he said.

NOAA scientists aren’t the only ones working to improve hurricane forecasting. For instance, researchers at the University of South Florida’s Ocean Circulation Lab (OCL) and the Florida Flood Hub created a storm surge forecast visualization tool based on the lab’s models. The West Florida Coastal Ocean Model, East Florida Coastal Ocean Model, and Tampa Bay Coastal Ocean Model were designed for the coastal ocean with a sufficiently high resolution to model small estuaries and shipping channels.

Though Yonggang Liu, a coastal oceanographer and director of OCL, cited examples of times his lab’s models have outperformed NOAA’s models, the tool is not used in operational NOAA forecasts. But it is publicly available on the OCL website (along with a disclaimer that the analyses and data are “research products under development”).

The Cyclone Global Navigation Satellite System (CYGNSS) is a NASA mission that pairs signals from existing GPS satellites with a specialized radar receiver to measure reflections off the ocean surface—a proxy for surface wind speed. The constellation of eight satellites can take measurements more frequently than GOES satellites, allowing for better measurement of rapid intensification, said Chris Ruf, a University of Michigan climate and space scientist and CYGNSS principal investigator.

It might seem that if a method or mission offers a way to more accurately forecast hurricanes, it should be promptly integrated into NOAA’s operational models. But Ruf explained NOAA’s hesitation to use data from university-led efforts: Because they are outside of NOAA’s control and could therefore lose funding or otherwise stop running, it’s too risky for NOAA to rely on such projects.

“CYGNSS is a one-off mission that was funded to go up there and do its thing, and then, when it deorbits, it’s over,” Ruf said. “They [at NWS] don’t want to invest a lot of time learning how to assimilate some new data source and then have the data disappear later. They want to have operational usage where they can trust that it’s going to be there later on.”

“These improvements cannot happen as a one-man army.”

Whatever office they’re in, it’s scientists who make the work of hurricane forecasting possible. Gopalakrishnan said that during Katrina, there were two or three people at NOAA associated with model development. He credits the modeling improvements made since then to the fact that, now, there’s a team of several dozen. And more advances may be on the horizon. For instance, NOAA expects a new Hurricane Hunter jet, a G550, to join the ranks by 2026.

However, some improvements are stalling. The Geostationary Extended Observations (GeoXO) satellite system is slated to begin extending the observations of the GOES satellites in the early 2030s. But the 2026 U.S. budget proposal, which suggests slashing $209 million from NOAA’s efforts to procure weather satellites and infrastructure, specifically calls for a “rescope” of the GeoXO program.

Hundreds of NOAA scientists have been laid off since January 2025, including Hurricane Hunter flight directors and researchers at AOML (though NWS received permission to rehire hundreds of meteorologists, hydrologists, and radar technicians, as well as hire for previously approved positions, in August).

In general, hurricane fatalities are decreasing: As of 2024, the 10-year average in the United States was 27, whereas the 30-year average was 51. But this decrease is not because storms are becoming less dangerous.

“Improved data assimilation, improved computing, improved physics, improved observations, and more importantly, the research team that I could bring together [were] pivotal” in enabling the past 2 decades of forecasting improvements, said Gopalakrishnan. “These improvements cannot happen as a one-man army. It’s a team.”

—Emily Dieckman (@emfurd.bsky.social), Associate Editor

Citation: Dieckman, E. (2025), How researchers have studied the where, when, and eye of hurricanes since Katrina, Eos, 106, https://doi.org/10.1029/2025EO250320. Published on 28 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

A Debate over Wakes in the Land of 10,000 Lakes

Fri, 08/29/2025 - 12:01

Wakeboats are causing a stir in Minnesota.

Though all powerboats create wakes, these specialty craft have heavier sterns and engines specifically designed to shape water into surfable waves. That extra turbulence is drawing ire from other lake-lovers.

Across the state, Minnesotans are reporting eroding banks, murky waters, and shredded vegetation. When considering wakeboats, one person’s recreation is another’s resentment.

“It’s divisive,” said Joe Shneider, president of the Minnesota Coalition of Lake Associations. “The three big issues we hear all the time [about wakeboats] are personal safety, bank erosion, and lake bed disruption.”

Specialty wakeboats are designed to shape water into surfable waves, allowing riders to follow behind without needing a towrope. New research shows how those wakes can affect the lake bed below. Credit: Colin Van Dervort/Flickr, CC BY 2.0

As the popularity and size of wakeboats grow, so does the need for data. Communities are wrestling with issues of regulation and education, and both approaches require information. That’s why Shneider and more than 200 others helped crowdfund recent research from the University of Minnesota’s Saint Anthony Falls Laboratory. (The state also supported the project.) The resulting public dataset shows how wakeboats can churn lake beds, information that can help communities navigate the brewing conflict.

The Stakes

Minnesota is not the only state navigating a great wake debate. In 2024, Maine implemented wakeboat regulations and Vermont restricted wake surfing to its 30 largest lakes. (Some residents want the number further reduced to 20.) In Wisconsin, individual municipalities are debating bans on wake surfing at hundreds of lakes, prompting at least one lawsuit.

Minnesota, in contrast, has issued wakeboat regulations at only one of its 10,000 lakes.

“There’s a whole lot of people out there that need to make decisions about their lake.”

The environmental issues at stake arise in shallow water, where powerboats can stir up obvious trails of sediment. Resuspended sediment absorbs sunlight, which heats the water column. Turbidity reduces the feeding rates of some fishes. Once-buried nutrients again become available, triggering toxic algal blooms that choke beaches and rob fish of oxygen.

But to connect the dots between wakeboat use and ecosystem disruption, researchers needed to document how various powerboats affect sediment dispersal.

“We want to understand how boats are interacting with the water column and provide data, because there’s a whole lot of people out there that need to make decisions about their lake,” said Jeff Marr, a hydraulic engineer at the University of Minnesota and a coauthor of the study.

The Wake

On Lake Minnetonka, just west of Minneapolis, seven locals lent their boats for the research. These watercraft ranged from relatively light, low-power deck boats (150-horsepower, 2,715 pounds) to burly bowriders (760-horsepower, 14,530 pounds) and included two boats built for wake surfing.

On test days, volunteers piloted their boats between buoy-marked goalposts. Acoustic sensors on the lake bed tracked pressure changes in the water column.

Powerboats mostly operate at either displacement speed (chugging low in the water) or planing speed (skipping faster along the surface). But there’s a transition called semidisplacement, in which the stern sinks in the water and waves spike in size.

“It’s right at that transition that [wakeboats] like to operate,” said Andy Riesgraf, an aquatic biologist at the University of Minnesota and a coauthor of the study.

Boaters drove the course five times at planing speed (21–25 miles per hour, common for water-skiing and tubing) and five times at displacement or semidisplacement mode (7–11 miles per hour, common for cruising and wake surfing). Researchers in rowboats paddled to collect water samples at various intervals in the track.

Researchers Chris Feist and Jessica Kozarek stand by the research rowboat. To minimize disruption in the water column, the human-powered sampling team paddled into the wake racetrack to collect 1-liter water samples at three different depths. Credit: Saint Anthony Falls Laboratory

The acoustic sensors showed that three types of waves affected the water column. Pressure waves, created by the immediate shift and rebound of water around a boat, were short-lived but strong enough to shake loose sediments. Transverse waves, which follow the boat’s path, and propeller wash, the frothy vortex generated by its engines, both elevated loose sediment and caused minutes-long disturbances.

Though all boats created these waves, the wakeboats churned the most sediment.

In planing mode, all seven boats caused brief and minimal disturbances. Sediments settled in less than 10 seconds at 9- and 14-foot depths. But when operating in slower, semidisplacement mode, wakeboats created a distinct disturbance. Following a pass from a wakeboat, sediment needed 8 minutes to settle at 14-foot depth and more than 15 minutes at 9-foot depth.

The research team released simple recommendations based on their findings. One recommendation is that all recreational powerboats should operate in at least 10 feet of water to minimize disturbances. Another is that wakeboats, when used for surfing, need 20 feet of water to avoid stirring up sediments and altering the ecosystem.

The Uptake

The new research adds to the group’s existing dataset on powerboats’ hydrologic impacts on lake surfaces.

Whether the suggestions lead to regulations is up to lake managers.

“Our goal is just to get the data out,” Marr said. The researchers published their findings in the University of Minnesota’s open-access digital library so that everyday lake-goers can find the information. Three external experts reviewed the material.

“The more we continue to collect these data, the more that we start to fill in those other gaps.”

The results add information to the policy debate. “If there is going to be some type of environmental regulation [on powerboating], you need very clear evidence that under these conditions, it’s detrimental,” said Chris Houser, a coastal geomorphologist at the University of Waterloo who was not involved in the project.

There are other variables to study—such as the number of boats on the water and the paths they’re carving—but “the more we continue to collect this data, the more we start to fill in those other gaps of different depths and different configurations,” Houser said.

For Shneider, the new data add much-needed clarity. The latest report “is monumental,” he said.

Marr, Riesgraf, and their colleagues are now comparing the impacts of boat-generated wakes against wind-driven waves. Those data could further isolate the impacts powerboats have on lakes.

—J. Besl (@J_Besl, @jbesl.bsky.social), Science Writer

This news article is included in our ENGAGE resource for educators seeking science news for their classroom lessons. Browse all ENGAGE articles, and share with your fellow educators how you integrated the article into an activity in the comments section below.

Citation: Besl, J. (2025), A debate over wakes in the land of 10,000 lakes, Eos, 106, https://doi.org/10.1029/2025EO250316. Published on 29 August 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

USDA Moves to Rescind Roadless Rule Protecting 45 Million Acres of Wild Area

Thu, 08/28/2025 - 21:11
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

The U.S. Department of Agriculture (USDA) is proposing rescinding the 2001 Roadless Area Conservation Rule, which protects about 45 million acres (182,000 square kilometers) of National Forest System lands from road construction, reconstruction, and timber harvests.

Of the land that would be affected by the rescission, more than 95% is in 10 western states: Alaska, Montana, California, Utah, Wyoming, Nevada, Washington, Oregon, New Mexico, and Arizona. The change would not apply to Colorado and Idaho, which have state-specific roadless rules.

Secretary of Agriculture Brooke L. Rollins first announced the USDA’s intent to rescind the rule on 23 June, prompting negative responses from several environmental, conservation, and Native groups.

“The Tongass is more than an ecosystem—it is our home. It is the foundation of our identity, our culture, and our way of life,” said a letter from the Central Council of the Tlingit and Haida Indian Tribes of Alaska to the USDA and the U.S. Forest Service. “We understand the need for sustainable industries and viable resource development in Southeast Alaska. Our communities need opportunities for economic growth, but that growth must be guided by those who call this place home.”

 

On 27 August, the USDA released a statement about the agency taking “the next step in the rulemaking process,” noting that the proposal aligned with several recent executive orders, including Executive Order 14192, Unleashing Prosperity Through Deregulation and Executive Order 14153, Unleashing Alaska’s Extraordinary Resource Potential.

“This administration is dedicated to removing burdensome, outdated, one-size-fits-all regulations that not only put people and livelihoods at risk but also stifle economic growth in rural America,” Rollins said in the release.

A notice of intent seeking public comment on the proposal was published in the Federal Register on Friday, 29 August, but a preview of the document became available for public inspection on 28 August. The document suggests that the rule has posed “undue burden on production of the Nation’s timber and identification, development, and use of domestic energy and mineral resources.” Repealing the rule, the document states, would allow for local land managers to make more tailored decisions and would allow for better wildfire suppression.

“This scam is cloaked in efficiency and necessity,” said Nicole Whittington-Evans, senior director of Alaska and Northwest programs at Defenders of Wildlife, in a statement. “But in reality, it will liquidate precious old-growth forest lands critical to Alaska Natives, local communities, tourists and countless wildlife, who all depend on intact habitat for subsistence harvesting, recreation and shelter. Rare and ancient trees will be shipped off at a loss to taxpayers, meaning that Americans will subsidize the destruction of our own natural heritage.”  

The proposal will be open for public comment through 19 September.

—Emily Dieckman (@emfurd.bsky.social), Associate Editor

29 August 2025: This article was updated with a link to the notice of intent published in the Federal Register.

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

A Burst of Subglacial Water Cracked the Greenland Ice Sheet

Thu, 08/28/2025 - 13:12

Greenland, despite its name, is largely blanketed in ice. And beneath that white expanse lies a world of hidden lakes. Researchers have now used satellite observations to infer that one such subglacial lake recently burst through the surface of the Greenland Ice Sheet, an unexpected and unprecedented event. By connecting this outburst with changes in the velocity and calving of a nearby glacier, the researchers helped to unravel how subglacial lakes affect ice sheet dynamics. These results were published in Nature Geoscience.

Researchers have known for decades that pools of liquid water exist beneath the Antarctic Ice Sheet, but scientific understanding of subglacial lakes in Greenland is much more nascent. “We first discovered them about 10 years ago,” said Mal McMillan, a polar scientist at Lancaster University and the Centre for Polar Observation and Modelling, both in the United Kingdom.

Subglacial lakes can exert a significant influence on an ice sheet. That’s because they affect how water drains from melting glaciers, which in turn influences sea level rise, ocean freshening, and a host of other processes that affect local and global ecosystems.

McMillan is part of a team that recently studied an unusual subglacial lake beneath the Greenland Ice Sheet. The work was led by Jade Bowling, who was a graduate student of McMillan’s at the time; Bowling is now employed by Natural England.

Old, but Not Forgotten, Data

In the course of mining archival satellite observations of the height of the Greenland Ice Sheet, the team spotted something unusual in a 2014 dataset: An area of roughly 2 square kilometers had dropped in elevation by more than 80 meters (260 feet) between two satellite passes just 10 days apart. That deflation reflected something going on deep beneath the surface of the ice, the researchers surmised.

A subglacial lake that previously was situated at the interface between the ice and the underlying bedrock must have drained, said McMillan, leaving the ice above it hanging unsupported until it tumbled down. The team used the volume of the depression to estimate that roughly 90 million cubic meters (more than 3.1 billion cubic feet) of water had drained from the lake between subsequent satellite observations, making the event one of Greenland’s biggest subglacial floods in recorded history.
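As a rough consistency check, the reported volume can be bracketed by two simple geometries. The minimal Python sketch below is illustrative only and assumes idealized depression shapes; the researchers derived their estimate from the depression’s actual measured geometry.

```python
# Back-of-the-envelope bounds on the drained volume, using only the figures
# quoted above. Illustrative only; the study used the depression's real shape.
area_km2 = 2.0        # approximate area of the collapsed region
max_drop_m = 80.0     # maximum elevation drop between satellite passes

area_m2 = area_km2 * 1e6

# Upper bound: a flat-bottomed depression that drops uniformly by 80 m.
v_flat = area_m2 * max_drop_m      # 1.6e8 cubic meters

# A smoothly tapering, bowl-shaped depression holds roughly half as much.
v_bowl = 0.5 * v_flat              # 8.0e7 cubic meters

print(f"flat-bottomed bound: {v_flat:.1e} m^3")
print(f"bowl-shaped estimate: {v_bowl:.1e} m^3")
# The reported ~9 x 10^7 m^3 sits between these simple bounds.
```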

“We haven’t seen this before.”

Subglacial lakes routinely grow and shrink, however, so that observation by itself wasn’t surprising. What was truly unexpected lay nearby.

“We also saw an appearance, about a kilometer downstream, of a huge area of fractures and crevassing,” McMillan said. And beyond that lay 6 square kilometers (2.3 square miles)—an area roughly the size of lower Manhattan—that was unusually smooth.

The researchers concluded that after the subglacial lake drained, its waters likely encountered ice frozen to the underlying bedrock and were forced upward and through the surface of the ice. The water then flowed across the Greenland Ice Sheet before reentering the ice several kilometers downstream, leaving behind the polished, 6-square-kilometer expanse.

“This was unexpected,” said McMillan. “We haven’t seen this before.”

A Major Calving, a Slowing Glacier

It’s most likely that the floodwater traveled under northern Greenland’s Harder Glacier before finally flowing into the ocean.

Within the same 10-day period, Harder Glacier experienced its seventh-largest calving event in the past 3 decades. It’s impossible to know whether there’s a direct link between the subglacial lake draining and the calving, but it’s suggestive, said McMillan. “The calving event that happened at the same point is consistent with lots of water flooding out” from the glacier.

Using data from several Earth-observing satellites, scientists discovered that a huge subglacial flood beneath the Greenland Ice Sheet occurred with such force that it fractured the ice sheet, resulting in a vast quantity of meltwater bursting upward through the ice surface. Credit: ESA/CPOM/Planetary Visions

“It’s like you riding on a waterslide versus a rockslide. You’re going to slide a lot faster on the waterslide.”

The team also found that Harder Glacier rapidly decelerated—3 times more quickly than normal—in 2014. That’s perhaps because the influx of water released by the draining lake carved channels in the ice that acted as conduits for subsequent meltwater, the team suggested. “When you have normal melting, it can just drain through these channels,” said McMillan. Less water in and around the glacier means less lubrication. “That’s potentially why the glacier slowed down.”

That reasoning makes sense, said Winnie Chu, a polar geophysicist at the Georgia Institute of Technology in Atlanta who was not involved in the research. “It’s like you riding on a waterslide versus a rockslide. You’re going to slide a lot faster on the waterslide.”

Just a One-Off?

In the future, McMillan and his colleagues hope to pinpoint similar events. “We don’t have a good understanding currently of whether it was a one-off,” he said.

Getting access to higher temporal resolution data will be important, McMillan added, because such observations would help researchers understand just how rapidly subglacial lakes are draining. Right now, it’s unclear whether this event occurred over the course of hours or days, because the satellite observations were separated by 10 days, McMillan said.

It’s also critical to dig into the mechanics of why the meltwater traveled vertically upward and ultimately made it to the surface of the ice sheet, Chu said. The mechanism that this paper is talking about is novel and not well reproduced in models, she added. “They need to explain a lot more about the physical mechanism.”

But something this investigation clearly shows is the value of digging through old datasets, said Chu. “They did a really good job combining tons and tons of observational data.”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), A burst of subglacial water cracked the Greenland Ice Sheet, Eos, 106, https://doi.org/10.1029/2025EO250317. Published on 28 August 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Fossilized Micrometeorites Record Ancient CO2 Levels

Thu, 08/28/2025 - 13:10

Micrometeorites, unlike their larger brethren, rarely get a spotlight at museums. But there’s plenty to learn from these extraterrestrial particles, despite the largest of them measuring just millimeters across.

Nearly 50 tons of extraterrestrial material fall on Earth every day, and the majority of that cosmic detritus is minuscule. Micrometeorites are, by definition, smaller than 2 millimeters in diameter, and they’re ubiquitous, said Fabian Zahnow, an isotope geochemist at Ruhr-Universität Bochum in Germany. “You can basically find them everywhere.”

Researchers recently analyzed fossilized micrometeorites that fell to Earth millions of years ago. They extracted whiffs of atmospheric oxygen incorporated into the particles and showed that carbon dioxide (CO2) levels during the Miocene and Cretaceous did not differ wildly from modern-day values. The results were published in Communications Earth & Environment.

Extraterrestrial Needles in Rocky Haystacks

Newly fallen micrometeorites can be swept from rooftops and dredged from the bottoms of lakes.

Zahnow and his collaborators, however, opted to turn back the clock: The team analyzed a cadre of micrometeorites that fell to Earth millions of years ago and have since been fossilized. The team sifted through more than a hundred kilograms of sedimentary rocks, mostly unearthed in Europe, to discover 92 micrometeorites rich in iron. They added eight other iron-dominated micrometeorites from personal collections to bring their sample to 100 specimens.

Metal-rich micrometeorites such as these are special, said Zahnow, because they function like atmospheric time capsules. As they hurtle through the upper atmosphere on their way to Earth, they melt and oxidize, meaning that atmospheric oxygen gets incorporated into their otherwise oxygen-free makeup.

“When we extract them from the rock record, we have our oxygen, in the best case, purely from the Earth’s atmosphere,” said Zahnow.

Ancient Carbon Dioxide Levels

And that oxygen holds secrets about the past. It turns out that atmospheric oxygen isotope ratios—that is, the relative concentrations of the three isotopes of oxygen, 16O, 17O, and 18O—correlate with the amount of photosynthesis occurring and how much CO2 is present at the time. That fact, paired with model simulations of ancient photosynthesis, allowed Zahnow and his colleagues to infer long-ago atmospheric CO2 concentrations.

“The story of the atmosphere is the story of life on Earth.”

Reconstructing Earth’s atmosphere as it was millions of years ago is important because atmospheric gases affect our planet so fundamentally, said Matt Genge, a planetary scientist at Imperial College London not involved in the work. “The story of the atmosphere is the story of life on Earth.”

But Zahnow and his collaborators first had to make sure the oxygen in their micrometeorites hadn’t been contaminated. Terrestrial water, with its own unique oxygen isotope ratios, can seep into micrometeorites that would otherwise reflect atmospheric oxygen isotope ratios from long ago. That’s a common problem, said Zahnow, given the ubiquity of water on Earth. “There’s always some water present.”

The team found that the presence of manganese in their micrometeorites was a tip-off that contamination had occurred. “Extraterrestrial metal has basically no manganese,” said Zahnow. “Manganese is really a tracer for alteration.”

Unfortunately, the vast majority of the researchers’ micrometeorites contained measurable quantities of manganese. In the end, Zahnow and his collaborators deemed that only four of their micrometeorites were uncontaminated.

Those micrometeorites, which fell to Earth during the Miocene (9 million years ago) and the Late Cretaceous (87 million years ago), suggested that CO2 levels during those time periods were, on average, roughly 250–300 parts per million. That’s a bit lower than modern-day levels, which hover around 420 parts per million.

“What we really hoped for was to get pristine micrometeorites from periods where the reconstructions say really high concentrations.”

The team’s findings are consistent with values suggested previously, said Genge, but unfortunately, the team’s numbers just aren’t precise enough to conclude anything meaningful. “You have a really huge uncertainty,” he said.

The team’s methods are solid, however, said Genge, and the researchers made a valiant effort to measure what are truly faint whiffs of ancient oxygen. “It’s a brave attempt.”

In the future, it would be valuable to collect a larger number of pristine micrometeorites dating to time periods when model reconstructions suggest anomalously high CO2 levels, said Zahnow. “What we really hoped for was to get pristine micrometeorites from periods where the reconstructions say really high concentrations.”

Confirming, with data, whether such time periods, such as the Triassic, truly had off-the-charts CO2 levels would be valuable for understanding how life on Earth responded to such an abundance of CO2.

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), Fossilized micrometeorites record ancient CO2 levels, Eos, 106, https://doi.org/10.1029/2025EO250319. Published on 28 August 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

As Simple as Possible: The Importance of Idealized Climate Models

Thu, 08/28/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

“Everything should be made as simple as possible, but not simpler.” This popular saying paraphrases a sentiment expressed by Einstein about the need for simplicity, though not at the expense of accuracy. Modeling of Earth’s climate system has become an incredibly complex endeavor, especially when the physics of atmospheric motion is coupled with complex, nonlinear feedbacks from the ocean and land surface and with forcing by collective human actions. Such complexity can make the underlying causes of model behaviors hard to diagnose and can make targeted experiments prohibitively expensive.

Two very recent developments, the emergence of kilometer-scale simulations and the rapid growth of machine learning (ML) approaches, have further increased the computational complexity of modeling global climate. In their commentary, Reed et al. [2025] remind us of the benefits of maintaining and applying a hierarchy of models with different levels of complexity. They make a special plea not to forget the power of using idealized, or simplified, climate models for hypothesis testing, model development, and teaching. 

Citation: Reed, K. A., Medeiros, B., Jablonowski, C., Simpson, I. R., Voigt, A., & Wing, A. A. (2025). Why idealized models are more important than ever in Earth system science. AGU Advances, 6, e2025AV001716. https://doi.org/10.1029/2025AV001716

—Susan Trumbore, Editor, AGU Advances

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
