Eos: Science News by AGU

An Accessible Alternative for Undergraduate Research Experiences

Thu, 09/04/2025 - 13:33

Undergraduate research experiences (UREs) in science, technology, engineering, and mathematics (STEM) offer students hands-on research experience and essential professional skills and connections to prepare them to succeed in the workforce. They also cultivate students’ sense of belonging, confidence, and identity—and promote retention—in STEM fields [National Academies of Sciences, Engineering, and Medicine, 2017; Rodenbusch et al., 2016].

To be effective, UREs should be thoughtfully designed to meet the needs of students who may otherwise miss out on career opportunities tied to networking and community-building through such programs. Existing URE programs have followed a range of approaches, but traditionally, many have been centered around short-duration, time-intensive, individual, mentor-directed experiences, such as full-time summer internships in field or laboratory settings. However, these traits can inadvertently exclude some student populations, a concern that is leading many programs to modify their structure and design to engage broader groups.

To lower barriers to participation in UREs, we developed the Authentic Research through Collaborative Learning (ARC-Learn) program at Oregon State University (OSU). ARC-Learn, which ran from 2021 to 2024 and comprised two overlapping student cohorts, offered a long-term, low-intensity program focused on Arctic science and inclusive mentorship. It was designed to help students engage in a science community, foster identities as STEM professionals, and develop critical scientific and data literacy skills and 21st century competencies such as teamwork and communication.

Table 1. Design Features of ARC-Learn
Duration: 18 months (2 academic years)
Intensity: 2–4 hours per week
Location: On campus or remote
Mentorship: Multiple mentors working in teams with multiple students
Topic selection: Student driven
Student support: Mentors, peers, program administrators, academic advisors
Mentorship development: Inclusive mentorship training, facilitated peer learning community
Research tasks: Develop research question, find and analyze data, draw conclusions, and present findings
Student development: Discover own strengths as researchers, work with a team, supplemental training in missing skills

ARC-Learn incorporated alternative design features (Table 1) to meet the needs of students who do not typically have access to time-intensive field or lab-based UREs, such as transfer students, remote students, and those with dependent care, military service, and other work commitments [Scott et al., 2019] or who have nontraditional enrollment patterns (e.g., dual enrollment in both university and community college, varying enrollment from term to term).

The program was framed in the context of Arctic science because of the region’s outsize effects on climate, ecosystems, and communities globally and to engage students with long-term research investments in polar regions [Marrongelle and Easterling, 2019]. The Arctic also offers a dynamic and interdisciplinary context for a URE program, enabling students to follow their interests in investigating complex science questions. In addition, numerous long-term Arctic monitoring programs offer rich datasets useful in all kinds of STEM careers.

Despite encountering challenges, the ARC-Learn model proved successful at engaging and motivating students and also adaptive as program organizers made adjustments from one cohort to the next in response to participant feedback.

The ARC-Learn Model

With support from mentors and peers, students experienced the whole research arc and gradually took ownership of their work.

Each ARC-Learn cohort lasted 2 academic years and included a dozen or more students. Participants received a stipend to offset costs associated with participation, such as childcare and missed work time, and had the option of obtaining a course credit each term to meet experiential learning requirements. With support from mentors and peers, they experienced the whole research arc and gradually took ownership of their work through three key phases of the program.

Early year 1: Build research teams. Some URE mentorship models involve a mentor primarily driving selection of a research topic and the student completing the work. In ARC-Learn, students learned from multiple mentors and peers, while mentors supported each other and received feedback from students (Figure 1). The students self-selected into research teams focused on a broad topic (e.g., marine heat waves or primary productivity), then developed individual research questions based on their strengths and interests.

Fig. 1. Some models of undergraduate research experiences have involved a mostly one-way transfer of knowledge from a single mentor to a single student, with the mentor deciding the research topic and the student completing the work. In ARC-Learn, students learned from multiple mentors and peers as part of small-group research teams, while mentors supported each other and received feedback from students.

Mentor-student teams met every other week—and students met one-on-one with mentors as needed—to support individual projects. The entire cohort also met twice a month to discuss topics including the fundamentals of Arctic science and the scientific process and to report out on progress toward milestones.

Late year 1 to middle of year 2: Develop research questions and find and analyze data. With no field or lab component to the program, ARC-Learn students worked exclusively with existing data. These data came from NASA and NOAA satellite-based sources such as the Moderate Resolution Imaging Spectroradiometer (MODIS), Advanced Very High Resolution Radiometer, and Soil Moisture Active Passive (SMAP) instruments; shipboard sources such as NOAA’s Distributed Biological Observatory, the Alaska Ocean Observing System, and the University-National Oceanographic Laboratory System’s Rolling Deck to Repository; and the National Science Foundation’s (NSF) Arctic Data Center and NOAA’s National Centers for Environmental Information.

Students often revised their research questions or the datasets they used multiple times to produce meaningful findings (Figure 2). Notably, access to these datasets proved critical to the educational experience of ARC-Learn students, highlighting the importance of maintaining them in public archives for future URE activities.

Fig. 2. ARC-Learn students developed their own research questions and worked exclusively with existing data to answer them. Students often revised their research questions or datasets multiple times to produce meaningful findings.

Many students struggled with finding, cleaning, analyzing, and interpreting data, often because of limited experience with tools such as geographic information system software and programming languages such as Python and R. At times, the required expertise was beyond even their mentors’ knowledge. Hands-on skill development workshops during cohort meetings connected students with additional mentors proficient in specific platforms and tools to help fill knowledge gaps and help students overcome obstacles.
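To give a flavor of the skills these workshops targeted, a first pass at tidying a tabular dataset in Python might look like the sketch below. The file, column names, and thresholds are hypothetical illustrations, not ARC-Learn materials.

```python
import pandas as pd

# Hypothetical example: tidy a CSV of Arctic sea surface temperature
# observations (file and column names are illustrative only).
df = pd.read_csv("arctic_sst_observations.csv", parse_dates=["date"])

# Drop rows with missing temperatures and discard physically implausible values
df = df.dropna(subset=["sst_celsius"])
df = df[df["sst_celsius"].between(-2.0, 35.0)]

# Aggregate to monthly means per station to make plotting and comparison easier
monthly = (
    df.groupby(["station_id", pd.Grouper(key="date", freq="MS")])["sst_celsius"]
      .mean()
      .reset_index()
)
print(monthly.head())
```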

Although the students encountered occasional setbacks, they reported that achievements such as settling on a final research question and creating rich data visualizations proved deeply rewarding and motivated further progress.

Late year 2: Share the results. Over several months, students created research posters with feedback and support along the way from their teammates, mentors, and the entire cohort. The program concluded with a grand finale, featuring on-campus gatherings for remote and in-person students, a dress rehearsal poster session, a celebratory dinner, and final presentations at a university-wide undergraduate research symposium.

Zoe’s Story

After a successful 7-year military career, Zoe enrolled at OSU to study the Arctic through her participation in ARC-Learn. As a student in cohort 2, she experienced several challenges along the research arc before finding success, and her experience helps illustrate the program’s model.

Zoe joined fellow students and mentors in the Marine Heatwaves research team and then narrowed her focus by exploring scientific literature and talking with her primary mentor to understand physical and chemical factors associated with marine heat waves as well as their effects on ocean ecosystems. She developed several research questions focused on how factors such as atmospheric pressure and temperature have affected the development and extent of marine heat waves off Alaska since 2010.

As Zoe and her mentor considered available datasets and relevant literature further, they realized that her questions were still too broad given the number of variables affecting ocean-atmosphere interactions. At one of the full-cohort meetings, she shared her difficulties and frustrations, prompting another mentor to offer their help. This mentor worked with Zoe to understand a key meteorological feature—the Aleutian Low—in the area she was studying, as well as relevant data available through the European Union’s Copernicus Climate Change Service [Hersbach et al., 2023] and the appropriate analysis platform.

“We jumped in and learned it together. She helped me find the right data, which in turn, allowed me to finalize my research question,” Zoe said.
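The ERA5 monthly fields cited above [Hersbach et al., 2023] are served through the Copernicus Climate Data Store, which provides a Python client called cdsapi. The sketch below shows what a minimal retrieval of monthly sea level pressure and sea surface temperature over the Gulf of Alaska could look like; the exact variables, years, and bounding box are assumptions for illustration, not the request Zoe used.

```python
import cdsapi  # requires a free Climate Data Store account and an API key in ~/.cdsapirc

client = cdsapi.Client()

# Hypothetical request: monthly means relevant to the Aleutian Low and
# North Pacific marine heat waves. Key names follow the Climate Data Store
# web form; adjust if the catalog or API changes.
client.retrieve(
    "reanalysis-era5-single-levels-monthly-means",
    {
        "product_type": "monthly_averaged_reanalysis",
        "variable": ["mean_sea_level_pressure", "sea_surface_temperature"],
        "year": [str(y) for y in range(2010, 2023)],
        "month": [f"{m:02d}" for m in range(1, 13)],
        "time": "00:00",
        "area": [65, -180, 45, -130],  # North, West, South, East
        "format": "netcdf",
    },
    "era5_monthly_north_pacific.nc",
)
```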

Nuanced and iterative feedback from mentors and peers guided ARC-Learn participants, including Zoe, to design posters that balanced visual presentations of data alongside descriptive text to explain research findings. Credit: Ryan Brown

From that point, Zoe quickly landed on a focused question that she could address: Does a disruption in the Aleutian Low lead to marine heat waves over the North Pacific region? The final step was to develop a visually striking poster to invite attention, questions, and ideas during the research symposium.

“Seeing other people interested in my research…was validating of me as a scientist.”

Zoe’s experience at the poster session captured what we heard from many other students in the program. Even after her 2 years of being immersed in her project and working with mentors and peers, she said she felt imposter syndrome as a student trying to become a scientist and thought no one would care about her research.

“But people were really interested,” she said. “Seeing other people interested in my research, able to read and understand it on a poster, [and] ask me questions and suggest ideas was validating of me as a scientist.”

A Responsive Approach to URE Design

Through ARC-Learn, program leads sought to expand knowledge about the benefits and challenges of a long-duration, lower-intensity, team-based URE model. Because it was a design-based research program, mentor, student, and coordinator feedback was collected and continually used to make program adjustments [Preston et al., 2022, 2024].

Feedback was collected through pre-, mid-, and end-of-program surveys, as well as pre- and end-of-program interviews, and analyzed by a research and evaluation team. Findings were reported to the program leads, who also met regularly with external expert advisers to get additional recommendations for adjustments. By running two overlapping cohorts (the second started when the first was halfway completed), organizers could address issues that arose for the first cohort to improve the experience of the second one.

Lessons from ARC-Learn are documented in a practitioner guidebook, which discusses practical considerations for others interested in implementing alternative URE models [Brown et al., 2024]. In the guidebook, we examine each design component of ARC-Learn and offer recommendations for designing UREs that meet enrolled students’ specific learning needs and develop their science skills to meet relevant workforce demands.

Novel elements of the Authentic Research through Collaborative Learning (ARC-Learn) program were important in influencing participants’ persistence and success.

A few valuable lessons learned include the following.

Attrition. Expect high attrition rates in UREs designed for nontraditional students, and do not react by making drastic program changes that risk sacrificing otherwise successful program elements. We observed a 45% attrition rate in each cohort, which is indeed high but perhaps not surprising considering the population involved in the program—largely transfer students and those with caregiving or work responsibilities.

Most participants who left did so because of life crises or obligations that paused their research and educational goals. This observation embodies the complexity of students’ lives and reinforces the need for continually creative, flexible, inclusive program structures. For those who completed ARC-Learn, novel elements of the program (e.g., working in teams) were important in influencing their persistence and success.

Remote research applications. The first cohort started in 2021 entirely via remote instruction during the COVID-19 pandemic, before eventually transitioning to a hybrid approach as in-person instruction resumed. All ARC-Learn students in cohort 1 returned to campus, except one Ecampus student, who remained online. The program team and mentors struggled to balance the needs of the remote student, who eventually became somewhat detached from their research team.

As teamwork, camaraderie, and inclusivity are important qualities of the program, we decided for cohort 2 to recruit enough Ecampus students (plus two dedicated mentors) to form a research team of their own. The remote team was engaged and productive—meeting deadlines and producing high-quality work—highlighting the potential of all-remote URE models for students who might otherwise lack access to meaningful research opportunities.

Student-driven research. ARC-Learn empowered students to pursue their own research questions, fostering their autonomy and ownership of their work. However, the open-endedness of selecting their own research paths and the lack of guardrails proved challenging for participants.

We thus hired a program coordinator to provide one-on-one logistical support; establish clear expectations, timelines, and scaffolded assignments; and arrange workshops to teach programming and data analysis skills. This approach, as reported by students who worked with the coordinator, helped many program participants stay on track and ultimately complete their research project.

Mentor coordination. Enabling student success also meant supporting mentors. Organizers provided inclusive mentorship trainings and facilitated a peer learning community. They also made programmatic adjustments in response to experiences in the first cohort.

The student-driven nature of the research sometimes resulted in mismatches between student interests and mentor expertise in cohort 1. So in cohort 2, we engaged mentors earlier in the planning process to define thematic areas for the research teams, creating topics broad enough for students to find an area of interest but narrow enough for mentors to provide guidance. In addition, many mentors had field schedules typical of polar scientists, often resulting in weeks to months at sea. We purposefully paired mentors and asked about planned absences so we could fill any gaps with additional support.

Overall, students in cohort 2 reported that they felt highly supported and valued by their mentors and that mentors created welcoming environments in which to ask questions and solve problems together.

A Foundation to Build On

Participants gained a deep understanding of the complexities and challenges of modern science as well as knowledge and skills needed in scientific education and careers.

From students’ feedback—and the research they did—it’s clear that participants who completed the ARC-Learn program gained a deep understanding of the complexities and challenges of modern science as well as knowledge and skills needed in scientific education and careers. The program thus highlights paths and lessons for others looking to develop successful alternatives to traditional UREs.

Many former ARC-Learn students are continuing to develop research skills, particularly in polar science, through internships and employment in field and lab research efforts. Zoe is working toward a bachelor’s degree in environmental sciences and exploring interests in environmental hazards, conservation, and restoration. For her, the program served as a foundation from which she is building a career and establishing confidence in herself as a scientist.

“I thought I’d have to play catch-up the whole time as an older, nontraditional student,” she said. But through the experience, “I realized I could start anywhere.”

Acknowledgments

ARC-Learn was a collaboration between OSU’s College of Earth, Ocean and Atmospheric Sciences and STEM Research Center. This work is supported by the U.S. NSF (award 2110854). Opinions, findings, conclusions, and recommendations in these materials are those of the authors and do not necessarily reflect the views of NSF.

References

Brown, R., et al. (2024), ARC-Learn Practitioner Guidebook: Practical considerations for implementing an alternative model of undergraduate research experience, Ore. State Univ., Corvallis, https://doi.org/10.5399/osu/1177.

Hersbach, H., et al. (2023), ERA5 monthly averaged data on single levels from 1940 to present, Copernicus Clim. Change Serv. Clim. Data Store, https://doi.org/10.24381/cds.f17050d7.

Marrongelle, K., and W. E. Easterling (2019), Support for engaging students and the public in polar research, Dear Colleague Letter prepared for the U.S. National Science Foundation, Alexandria, Va., www.nsf.gov/funding/opportunities/dcl-support-engaging-students-public-polar-research/nsf19-086.

National Academies of Sciences, Engineering, and Medicine (2017), Undergraduate Research Experiences for STEM Students: Successes, Challenges, and Opportunities, 278 pp., Natl. Acad. Press, Washington, D.C., https://doi.org/10.17226/24622.

Preston, K., J. Risien, and K. B. O’Connell (2022), Authentic Research through Collaborative Learning (ARC-Learn): Undergraduate research experiences in data rich Arctic science formative evaluation report, STEM Res. Cent., Ore. State Univ., Corvallis, https://doi.org/10.5399/osu/1156.

Preston, K., J. Risien, and N. Staus (2024), Authentic Research through Collaborative Learning (ARC-Learn): Undergraduate research experiences in data rich science summative evaluation report, STEM Res. Cent., Ore. State Univ., Corvallis, https://doi.org/10.5399/osu/1178.

Rodenbusch, S. E., et al. (2016), Early engagement in course-based research increases graduation rates and completion of science, engineering, and mathematics degrees, CBE Life Sci. Educ., 15(2), ar20, https://doi.org/10.1187/cbe.16-03-0117.

Scott, G. W., S. Humphries, and D. C. Henri (2019), Expectation, motivation, engagement and ownership: Using student reflections in the conative and affective domains to enhance residential field courses, J. Geogr. Higher Educ., 43(3), 280–298, https://doi.org/10.1080/03098265.2019.1608516.

Author Information

Ryan Brown (ryan.brown@oregonstate.edu), Laurie Juranek, and Miguel Goñi, College of Earth, Ocean and Atmospheric Sciences, Oregon State University, Corvallis; and Julie Risien and Kimberley Preston, STEM Research Center, Oregon State University, Corvallis

Citation: Brown, R., L. Juranek, M. Goñi, J. Risien, and K. Preston (2025), An accessible alternative for undergraduate research experiences, Eos, 106, https://doi.org/10.1029/2025EO250326. Published on 4 September 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Spacecraft Surveys Shed New Light on Auroral Kilometric Radiation

Wed, 09/03/2025 - 18:53
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

Auroral Kilometric Radiation (AKR) is a type of radio wave emitted from Earth’s auroral regions. It is the dominant radio emission from Earth and has been extensively studied, though previous analyses were constrained by limited spacecraft coverage.

Today, with the availability of more spacecraft observations, it is possible to improve our understanding of Earth’s most intense natural radio emission. Thanks to these data, Wu et al. [2025] find that AKR preferentially occurs at high latitudes and on Earth’s nightside. They also find that the dense plasmasphere, a region of high-density plasma around Earth, blocks AKR from propagating through it, forming an equatorial shadow zone around the plasmasphere. Furthermore, the authors discover that low-density ducts within the plasmasphere act as waveguides, enabling AKR to penetrate the dense plasmasphere and propagate along these channels.

The findings provide valuable insights into Earth’s electromagnetic environment and into space weather events and geomagnetic storms that may adversely affect satellites, communication systems, GPS, and power grids on Earth.

Citation: Wu, S., Whiter, D. K., Zhang, S., Taubenschuss, U., Zarka, P., Fischer, G., et al. (2025). Spatial distribution and plasmaspheric ducting of auroral kilometric radiation revealed by Wind, Polar, and Arase. AGU Advances, 6, e2025AV001743. https://doi.org/10.1029/2025AV001743

—Alberto Montanari, Editor-in-Chief, AGU Advances

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Bridging Old and New Gravity Data Adds 10 Years to Sea Level Record

Wed, 09/03/2025 - 13:38

As climate change accelerates, it’s more important than ever to understand the individual drivers of sea level rise, from land subsidence and coastal erosion to changes in ocean volume. For the past 20 years, scientists have had access to high-resolution, satellite-derived maps of Earth’s gravity field, which allows them to calculate fluctuations in global ocean mass.

Recently, geodesists found a way to push that record back 10 more years, significantly extending the time frame over which they can consistently measure global ocean mass change.

“This is the first observation-based global ocean mass time series” from 1993 to the present, said Jianli Chen, a geodesy researcher at Hong Kong Polytechnic University in China and a coauthor on the research.

By reconciling older and newer techniques for measuring ocean mass change, the team’s work improves calculations of long-term trends and provides a potential stopgap should satellite data no longer be available.

Shooting Lasers into Space

When scientists measure sea level rise, they consider two main components: how much the ocean’s volume has grown because of changes in water density—the steric component—and how much it has grown because it has gained mass from melted ice—the barystatic component.
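Written out (our shorthand, not the researchers’ notation), total sea level change measured by altimetry is simply the sum of these two parts,

$$\Delta h_{\text{total}} = \Delta h_{\text{steric}} + \Delta h_{\text{barystatic}},$$

so an independent estimate of the barystatic term also constrains the steric term.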

Past estimates of total ocean mass change have relied on indirect methods like adding up mass loss from ice sheets, glaciers, and land water storage, explained Yufeng Nie, a geodesy researcher also at Hong Kong Polytechnic University and lead researcher on the new study. Mass lost from these areas is assumed to translate to an increase in ocean mass.

“But these individual estimates are not necessarily consistent, because they are developed by different groups” with different methodologies, Nie said.

In light of this, some researchers adapted satellite laser ranging (SLR), a technique in which scientists bounce ground-based lasers off orbiting satellites, to track changes in ocean mass. SLR has been used for decades to measure Earth’s nonuniform gravity field by observing shifts in satellite orbits. A satellite’s altitude depends on Earth’s gravity at any given point, and gravity in turn depends on the distribution of mass beneath that point. Measuring satellite altitudes thus provides a window into ocean mass changes.

“How can you observe, for example, ocean mass change from Antarctic melting using a technique with 4,000-kilometer spatial resolution?”

However, one key drawback to using SLR to measure barystatic sea level (BSL) change is that it can measure changes only on very large spatial scales, which limits its application in climate research, Chen said.

“How can you observe, for example, ocean mass change from Antarctic melting using a technique with 4,000-kilometer spatial resolution?” asked Chen.

Enter NASA’s Gravity Recovery and Climate Experiment (GRACE) missions. GRACE and its successor, GRACE Follow-On (GRACE-FO), each consisted of two satellites chasing each other along the same orbit, continuously exchanging ranging signals (microwave on GRACE, with an added laser ranging instrument on GRACE-FO) to track the distance between them. Like SLR, this approach allowed the GRACE missions to provide maps of Earth’s surface mass, but at 10 times the resolution of SLR. And as with SLR, scientists have used GRACE gravity maps to track global ocean mass change.

But GRACE data, too, have their caveats. The first GRACE mission spanned 2002–2017, and GRACE-FO has spanned from 2018 to the present, a short time for understanding long-term trends. What’s more, the 11-month gap between GRACE and its successor meant that scientists were not able to calibrate the two satellites with each other, leaving some uncertainty about systematic differences between the missions.

A Near-Perfect Match

Nie, Chen, and their team were able to address both of these caveats by comparing SLR-based measurements of global ocean mass change with those from GRACE/-FO for the same time period, 2003–2022.

According to gravity maps provided by SLR, barystatic sea level change was 2.16 millimeters per year from 2003 to 2022, while GRACE/-FO measured 2.13 millimeters per year.
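A rate like 2.16 millimeters per year is, in essence, a linear trend fitted to a monthly ocean mass time series. The snippet below reproduces that kind of fit with synthetic numbers; it is only an illustration of the arithmetic, not the study’s actual data or processing.

```python
import numpy as np

# Synthetic monthly barystatic sea level anomalies (mm), 2003-2022,
# with a 2.16 mm/yr trend built in for illustration.
years = 2003 + np.arange(240) / 12.0
rng = np.random.default_rng(0)
bsl_mm = 2.16 * (years - years[0]) + rng.normal(0.0, 1.5, years.size)

# Ordinary least-squares trend in mm per year
slope_mm_per_yr, intercept = np.polyfit(years, bsl_mm, 1)
print(f"fitted trend: {slope_mm_per_yr:.2f} mm/yr")
```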

The new analysis shows that SLR and GRACE/-FO “agree quite well for the long-term trends,” Nie said. What’s more, researchers found no significant change in the calculation when the data transitioned from GRACE to GRACE-FO. “This gives us confidence that the SLR data, although it is of very low spatial resolution, can be used to tell us the ocean mass variations before 2002,” he added.

“Our SLR measurements…can provide a global constraint of the mass changes for the pre-GRACE era.”

The researchers were able to extend the time frame of their analysis back to 1993 by using SLR data, and they calculated a barystatic sea level change of 1.75 millimeters per year for 1993–2022. They attribute the lower rate over this longer period to the fact that ice loss in Greenland has accelerated only in recent years.

“Our SLR measurements…can provide a global constraint of the mass changes for the pre-GRACE era,” Nie said.

This study was published in Proceedings of the National Academy of Sciences of the United States of America in June.

“Extending the record of measured BSL using satellite laser ranging back to 1993 is an important achievement,” said Bryant Loomis, chief of the Geodesy and Geophysics Laboratory at NASA’s Goddard Space Flight Center in Greenbelt, Md. “It allows the disaggregation of total sea level change, which is measured by altimetry, into its barystatic and steric components.”

“The long-term BSL estimate is also useful for assessing the accuracy of previous efforts to quantify the major land ice contributions to BSL prior to the launch of GRACE,” he added, referring to the method of adding together mass changes from glaciers, ice sheets, and land water storage. Loomis was not involved in the new research.

Nie, Chen, and their team are working to push the limits of SLR-derived barystatic sea level measurements to smaller spatial scales and lower uncertainties. They hope to demonstrate that SLR data can be used to measure mass change in Antarctica.

GRACE Continuity?

GRACE-FO launched in 2018 and is 7 years into its nominal 5-year mission. The satellites are in good health, and the nearly identical GRACE mission set a good precedent—it lived for more than 15 years. GRACE-FO might well overlap with its planned successor, GRACE-Continuity (GRACE-C), which is scheduled to launch in 2028.

The GRACE missions are designed to measure minute changes in Earth’s gravity at high spatial resolution. However, there was a coverage gap between the end of the GRACE mission and the start of GRACE-FO, and there may be a similar gap between GRACE-FO and GRACE-C. Credit: NASA/JPL-Caltech, Public Domain

However, recent woes for federally funded science in the United States have put GRACE-C’s future in doubt. Although NASA requested funding for GRACE-C for fiscal year 2026 through the mission’s launch, NASA’s acting administrator, Sean Duffy, recently stated his, and presumably President Donald Trump’s, desire to eliminate all Earth science at the agency (including healthy satellites). That cutback would likely nix GRACE-C.

In the near future, both Europe and China plan to launch satellite-to-satellite laser ranging missions that will provide GRACE-like measurements of Earth’s gravity, Chen said. However, the loss of GRACE-quality data would hamper climate scientists’ ability to accurately track drivers of sea level rise, he added. The SLR-derived measurements demonstrated in this recent research could help mitigate the loss, but only somewhat.

“There’s no way SLR can reach the same [resolution] as GRACE,” Chen said. “We can only use SLR to see the long-term, the largest scale, to fill the gap. But for many of GRACE’s applications—regional water storage or glacial mass change—no, there’s no way SLR can help.”

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

Citation: Cartier, K. M. S. (2025), Bridging old and new gravity data adds 10 years to sea level record, Eos, 106, https://doi.org/10.1029/2025EO250321. Published on 3 September 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

First Species-Level Assessment Reveals Extinction Risk in Mesoamerica

Wed, 09/03/2025 - 13:35

This is an authorized translation of an Eos article.

Reforestation is more complex than simply planting trees. It includes assessing habitats and ecosystems, identifying the health and sustainability of different species, and studying strategies for establishing new stands of trees.

In regions such as Mesoamerica, where forests are severely threatened by human activities and climate change, conservationists interested in reforestation must prioritize species whose populations are declining. To facilitate that task, a group of researchers assessed the conservation status of the 4,046 tree species endemic to Mesoamerica described in the Global Tree Assessment project. They found that 46% of these trees are at some risk of extinction.

This study is the first to assess the status of all endemic trees in Mesoamerica.

The study, published in the journal Plants, People, Planet, is the first to assess the status of all endemic trees in Mesoamerica.

Emily Beech, lead author of the study and head of conservation at Botanic Gardens Conservation International, emphasized the importance of focusing on this region because of its high, and frequently underrepresented, levels of biodiversity. The Central American countries (Belize, Costa Rica, El Salvador, Guatemala, Honduras, Nicaragua, and Panama), Beech said, rarely rank among the most biodiverse or as home to the greatest number of endangered species. That absence is not due to a lack of biodiversity, she explained, but is simply attributable to their size. The small size of these countries means they are overshadowed by large countries with more extensive forests, such as Brazil and the Democratic Republic of the Congo. Yet together with Mexico, Central America harbors 10% of the world’s plant diversity despite covering less than 1% of its land area.

To address this gap, the scientists first identified endemic Mesoamerican trees from assessments submitted to the International Union for Conservation of Nature (IUCN) Red List of Threatened Species. Then, to evaluate the trees’ conservation status, the researchers overlaid distribution maps of the selected tree species on maps from the World Database on Protected Areas.

Of the 4,046 tree species analyzed, they found that 1,867 are at risk of extinction. Mexico was the only country with tree species listed in the database as extinct, or extinct in the wild. Among extant trees, Mexico and Costa Rica had the greatest numbers of threatened species, with 888 and 227, respectively. The most common threat overall was habitat loss driven by agricultural expansion.

Most species (3,349) had at least one data point within a protected area. However, 72% of the Mesoamerican species in protected areas are threatened.
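The overlay described above, intersecting species occurrence points with protected-area polygons, can be sketched in a few lines of Python with geopandas. The file and column names below are hypothetical placeholders, not the study’s data.

```python
import geopandas as gpd

# Hypothetical inputs: one point per occurrence record, and WDPA-style
# protected-area polygons (file and column names are illustrative).
occurrences = gpd.read_file("tree_occurrences.gpkg")
protected = gpd.read_file("protected_areas.gpkg")

# Keep only occurrence points that fall within a protected area
inside = gpd.sjoin(occurrences, protected, how="inner", predicate="within")

# Count species with at least one data point inside a protected area
n_protected_species = inside["species"].nunique()
print(f"{n_protected_species} species have at least one record in a protected area")
```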

A Tailored Approach

Neptalí Ramírez Marcial was not involved in the new research, but as head of the restoration group at South Border College in Mexico, he works with tree species that fall into different threat categories. The forests of Chiapas, where he and his colleagues are based, used to be full of oaks, which harbored high levels of biodiversity. Because of human influence, there are now more pines than oaks, and the climate is less favorable for sensitive species on the IUCN Red List.

Although Ramírez Marcial uses the Red List, he remains critical of the tool and its use in research. For example, he noted that the new assessment of Mesoamerican trees classifies Furcraea macdougallii (MacDougall’s century plant) as extinct in Mexico. Ramírez Marcial believes this plant is similar to an agave and should not be considered a tree at all, and therefore should not have been included in the study.

He also pointed out that the new study treats all of Mexico as part of Mesoamerica. Ecologically speaking, he said, the Mesoamerican biogeographic region extends only through central Mexico and excludes the northern part of the country, which has distinct ecosystems not shared with Central America.

Ocotea monteverdensis “went from not even being on the list to being in the most vulnerable conservation category.”

Ramírez Marcial agreed with the new study’s conclusions but argues that restoration strategies must account for the biodiversity of the areas to be protected. For example, he noted that Mexican government programs prioritize distributing pines for reforestation across the country rather than designing strategies tailored to each region.

Daniela Quesada, a conservationist at the Monteverde Institute in Costa Rica, said the new study offers a more complete picture of the status of trees in Mesoamerica. Nevertheless, like Ramírez Marcial, she views IUCN Red List information as a starting point for research. The accuracy of the Red List, she explained, depends on how much information is submitted to it.

Quesada noted that the next step for tree conservation in Mesoamerica is for scientists to “look in more detail at each species that appeared” in the new study. A rigorous analysis of each species’ presence and influence in each region could shape the development of targeted conservation projects.

As an example, she mentioned Ocotea monteverdensis, a tree that “went from not even being on the list to being in the most vulnerable conservation category” (critically endangered) thanks to the work of ecologist John Devereux Joslin Jr. That recognition led to the development of a dedicated, ongoing community conservation program for the tree.

—Roberto González (@perrobertogg.bsky.social), Science Writer

This translation by Oriana Venturi Herrera (@OrianaVenturiH) was made possible by a partnership with Planeteando and GeoLatinas.

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Experienced Researcher Book Publishing: Sharing Deep Expertise

Wed, 09/03/2025 - 12:00
Editors’ Vox is a blog from AGU’s Publications Department.

Being an experienced researcher can come with a lot of heavy professional responsibilities, such as leading grant proposals, managing research teams or labs, supervising doctoral students and postdoctoral scientists, serving on committees, mentoring younger colleagues … the list goes on. This may also be a time filled with greater personal responsibilities beyond the job. Why add to the workload by taking on a book project? In the third installment of career-focused articles, three scientists who wrote or edited books as experienced researchers reflect on their motivations and how their networks paved the way for—and grew during—the publishing process.

Douglas Alsdorf co-edited Congo Basin Hydrology, Climate, and Biogeochemistry: A Foundation for the Future, which discusses new scientific discoveries in the Congo Basin and is published in both English and French. Nancy French co-edited Landscape Fire, Smoke, and Health: Linking Biomass Burning Emissions to Human Well-Being, which presents a foundational knowledge base for interdisciplinary teams to interact more effectively in addressing the impacts of air pollution. Michael Liemohn authored Data Analysis for the Geosciences: Essentials of Uncertainty, Comparison, and Visualization, a textbook on scientific data analysis and hypothesis testing in the Earth, ocean, atmospheric, space, and planetary sciences. We asked these scientists why they decided to write or edit a book, what impacts they saw as a result, and what advice they would impart to prospective authors and editors.

Why did you decide to write or edit a book? Why at that point in your career?

ML: I was assigned to develop a new undergraduate class on data-model comparison techniques. I realized that the best textbooks for it were either quite advanced or rather old. One book I love included the line, “if the student has access to a computer…” in one of the homework questions. I also was not finding a book with the content set that I wanted to cover in the class. So, I developed my own course content set and note pack, which provided the foundation for the chapters of the book.

DA: Our 2022 book was a result of a 2018 AGU Chapman Conference in Washington, DC, that I was involved in organizing. About 100 researchers, including 25 from sub-Saharan Africa, attended the conference, and together we decided that an edited book in the AGU Geophysical Monograph Series would become a launching point for the next decade of research in the Congo Basin.

The motivation for the book was not to advance my career, but because the topic was important to get out there.

NF: The motivation for the book was not to advance my career, but because the topic was important to get out there. The book looks at how science is trying to better inform how to manage smoke from wildland fires. The work was important because people in fire, smoke modeling, and health sciences do not work together often, and there were some real misconceptions about how others do the research and how detailed the topics can be.

What were some benefits of completing a book as an experienced researcher? 

NF: Once you have been working in a field for a while you want to see how your deep expertise can benefit more than just the community of researchers that you know or know of. Reaching into other disciplines allows you to understand how your work can have broader impact. And, you are ready to know more about other, adjacent topics, rather than a deeper view of what you know already. I think these feelings grow more true as you move to later stages of a career.

I think that I would have greatly struggled with this breadth of content if I had tried to write this particular book 10 years earlier.

ML: I was developing my data-model comparison techniques course and textbook for all students in my department, so I wanted to include examples across that diverse list of disciplines—Earth, atmosphere, space, and planetary sciences. Luckily, over the years I had taught a number of classes spanning these topics. Additionally, I had attended quite a few presentations across these fields, not only at seminars on campus but also at the annual AGU meeting. I felt comfortable including examples and assignments from all these topics. Also, I knew colleagues in these fields, and I called on them for advice when I got stuck. I think that I would have greatly struggled with this breadth of content if I had tried to write this particular book 10 years earlier.

What impact do you hope your book will have?

The next great discoveries will happen in the Congo Basin and our monograph motivates researchers toward those exciting opportunities. 

DA: There are ten times fewer peer-reviewed papers on the Congo Basin compared to the Amazon Basin. Our monograph changes that! We have brought new attention to the Congo Basin, demonstrating to the global community of Earth scientists that there is a large, vibrant group of researchers working daily in the Congo Basin. The next great discoveries will happen in the Congo Basin and our monograph motivates researchers toward those exciting opportunities. 

ML: I hope that the book has two major impacts. The first expected benefit is to the students that use it with a course on data-model comparison methods. I want it to be a useful resource regardless of their future career direction. The second impact I wish for is on Earth and space scientist researchers; I hope that our conversations about data-model comparisons are ratcheted up to a higher level, allowing us to more thoughtfully conduct such assessments and therefore maximize scientific progress.

What advice would you give to experienced researchers who are considering pursuing a book project?

NF: Here are a few thoughts: One: Choose co-authors, editors, and contributors that you can count on. Don’t try to “mend fences” with people you have not been able to connect with. That said, if you do admire a specific person or know their point of view is valuable, this is the time to overcome any barriers to your relationship. Two: Give people assignments, and they will better understand your point of view. Three: Listen to your book production people. They are all skilled professionals who know more about this than you do. They can be great allies in getting it done!

DA: Do it! Because we publish papers, our thinking tends to focus on the one topic of a particular paper. A book, however, broadens our thinking so that we more fully understand the larger field of work. Each part of that bigger space has important advances as well as unknowns that beg for answers. A book author who can see each one of these past solutions and future challenges becomes a community resource who provides insights and directions for new research. 

—Douglas Alsdorf (alsdorf.1@osu.edu, 0000-0001-7858-1448), The Ohio State University, USA; Nancy French (nhfrench@mtu.edu, 0000-0002-2389-3003), Michigan Tech Research Institution, USA; and Michael Liemohn (liemohn@umich.edu, 0000-0002-7039-2631), University of Michigan, USA

This post is the third in a set of three. Learn about leading a book project as an early-career or mid-career researcher.

Citation: Alsdorf, D., N. French, and M. Liemohn (2025), Experienced researcher book publishing: sharing deep expertise, Eos, 106, https://doi.org/10.1029/2025EO255028. Published on 3 September 2025. This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s). Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Inside a Georgia Beach’s High-Tech Fight Against Erosion

Tue, 09/02/2025 - 13:09

This story was originally published by Grist. Sign up for Grist’s weekly newsletter here. This coverage is made possible through a partnership between Grist and WABE, Atlanta’s NPR station.

At low tide on Tybee Island, Georgia, the beach stretches out as wide as it gets with the small waves breaking far away across the sand—you’ll have a long walk if you want to take a dip. But these conditions are perfect for a team of researchers from the University of Georgia’s Skidaway Institute of Oceanography.

Every three months, at low tide, they set out a miniature helipad near the foot of the dune and send up their drone equipped with lidar—technology that points a laser down at the sand and uses it to measure the elevation of the beach and dunes. The team flies it back and forth from the breakers to the far side of the dune and back until they have a complete, detailed map of the island’s 7-mile beach, about 400 acres.

“I see every flip-flop on the beach.”

“It’s high accuracy, it’s a high resolution,” explained research technician Claudia Venherm, who leads this project. “I see every flip-flop on the beach.”

That detailed information is crucial because Tybee is a barrier island, and rising seas are constantly eating away at the sandy beach and dunes that protect the island’s homes and businesses as well as a stretch of the Georgia mainland. Knowing exactly where the island is eroding and how the dunes are holding up to constant battering can help local leaders protect this piece of coastline.

“Tybee wants to retain its beach. It also wants to maintain, obviously, its dune. It’s a protection for them,” said Venherm. “We also give some of our data to the Corps of Engineers so they know what’s going on and when they have to renourish the beach.”

Since the 1970s, the Army Corps of Engineers has helped maintain Tybee Island’s beaches with regular renourishment: Every seven years or so, the Corps dredges up sand from the ocean floor and deposits it on the beach to replace sand that’s washed away. The data from the Skidaway team will only help the Corps do this work more effectively. Lidar isn’t new, and neither is aerial coastal mapping. Several federal agencies monitor coastlines with lidar, but those surveys are typically several years apart for any one location, rather than a few months.

The last renourishment finished in January 2020, and Venherm and her team got to work a few months later. That means they have five years of high-resolution beach data, recorded every three months and after major storms like Hurricane Helene, creating a precise picture of how the beach is changing.

“I can compute what the elevation of the dune is, as well as how much volume has been lost or gained since a previous survey.”

“I can compute what the elevation of the dune is, as well as how much volume has been lost or gained since a previous survey,” said Venherm. “I can also compute how long it will take until the beach is completely gone, or how long will it take until water reaches the dune system.”
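Venherm did not spell out her workflow, but the volume calculation she describes boils down to differencing two gridded elevation surveys and multiplying by the grid cell area. A minimal sketch with made-up numbers:

```python
import numpy as np

# Two hypothetical 1 m resolution elevation grids (meters) covering the
# same patch of beach, from successive lidar surveys (values are made up).
cell_area_m2 = 1.0 * 1.0
survey_jan = np.array([[2.1, 2.3], [1.8, 2.0]])
survey_apr = np.array([[2.0, 2.2], [1.6, 1.9]])

# Per-cell elevation change, then net volume change in cubic meters
dz = survey_apr - survey_jan
volume_change_m3 = dz.sum() * cell_area_m2
print(f"net volume change: {volume_change_m3:.2f} m^3")  # negative means erosion
```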

The Corps conducts regular renourishment projects on beaches all along the East Coast, and uses a template to inform that work, said Alan Robertson, a consultant who leads the city of Tybee’s resilience planning. But he hopes that such granular evidence of specific changes over time can shift where exactly the sand gets placed within the bounds of that template. An area near the island’s north end, for instance, is a clear hot spot for erosion, so the city may push for concentrating sand there, and north of that point so that it can travel south to fill in the erosion.

“We know exactly where the hotspots of erosion are. We know where there’s accretion,” he said, referring to areas where sand tends to build up. “[We] never had that before.”

The data can also inform the city’s own decision-making, because it provides a much clearer picture of what happens to the dunes and beach over time after the fresh sand is added. In the past, they’ve been able to see the most obvious erosion, but now they can compare how different methods of dune-building and even sources of sand hold up. The vegetation that’s critical to holding dunes together, for instance, takes root far better in sand dredged from the ocean compared to sand trucked in from the mainland, Robertson said.

“There’s an example of the research and the monitoring. I actually can make that statement,” he said. “I actually know where you should get your sand from if you can, and why. No one could have told you that eight years ago.”

That sort of proven information is key in resilience projects, which are often expensive and funded by grants from agencies that want confirmation their money is being spent well.

“Everything we do now on resiliency, measuring, and monitoring has become a priority,” said Robertson. “We’ve been able over these years through proof statements of ‘look at what this does for you’ to make it part of the project.”

—Emily Jones (@ejreports.bsky.social), Grist

This article originally appeared in Grist at https://grist.org/science/inside-a-georgia-beachs-high-tech-fight-against-erosion/.

Grist is a nonprofit, independent media organization dedicated to telling stories of climate solutions and a just future. Learn more at Grist.org

How Researchers Have Studied the Where, When, and Eye of Hurricanes Since Katrina

Fri, 08/29/2025 - 12:02

On 28 August 2005, New Orleans area residents received a bulletin from the National Weather Service (NWS) office in Slidell, La., warning them of “a most powerful hurricane with unprecedented strength.” One excerpt of the chilling announcement, issued via NOAA radio and the Federal Communications Commission’s Emergency Alert Service, read,

BLOWN DEBRIS WILL CREATE ADDITIONAL DESTRUCTION. PERSONS…PETS…AND LIVESTOCK EXPOSED TO THE WINDS WILL FACE CERTAIN DEATH IF STRUCK.

POWER OUTAGES WILL LAST FOR WEEKS…AS MOST POWER POLES WILL BE DOWN AND TRANSFORMERS DESTROYED. WATER SHORTAGES WILL MAKE HUMAN SUFFERING INCREDIBLE BY MODERN STANDARDS.

Hurricane Katrina, which caused 1,833 fatalities and about $108 billion in damage (more than $178 billion in 2025 dollars), remains the costliest hurricane on record to hit the United States and among the top five deadliest.

“If we were to have a Katrina today, that [forecast] cone would be half the size that it was in 2005.”

In the 20 years since the hurricane, meteorologists, modelers, computer scientists, and other experts have worked to improve the hurricane forecasting capabilities that inform bulletins like that one.

Consider the forecast cone, for instance. Also known as the cone of uncertainty, this visualization outlines the likely path of a hurricane with decreasing specificity into the future: The wider part of the cone might represent the forecasted path 36 hours in advance, and the narrower part might represent the forecasted path 12 hours in advance.

“If we were to have a Katrina today, that cone would be half the size that it was in 2005,” said Jason Beaman, meteorologist-in-charge at the National Weather Service Mobile/Pensacola office.

How to Make a Hurricane

The ingredients for a hurricane boil down to warm water and low pressure. When an atmospheric low-pressure area moves over warm ocean water, surface water evaporates, rises, then condenses into clouds. Earth’s rotation causes the mass of clouds to spin as the low pressure pulls air toward its center.

Storms born in the Gulf of Mexico or that traverse it, as Katrina did, benefit from the body’s sheltered, warm water, and the region’s shallow continental shelf makes storm surges particularly destructive for Gulf Coast communities.

Hurricanes gain strength as long as they remain over warm ocean waters. But countless factors contribute to how intense a storm becomes and what path it takes, from water temperature and wind speed to humidity and proximity to the equator.

Because predicting the behavior of hurricanes requires understanding how they work, data gathered by satellites, radar, and aircraft are crucial for researchers. Feeding these data into computer simulations helps researchers understand the mechanisms behind hurricanes and predict how future storms may behave.

“Since 2005, [there have been] monumental leaps in observation skill,” Beaman said.

Seeing a Storm More Clearly

Many observations of the weather conditions leading up to hurricanes come from satellites, which can offer a year-round bird’s-eye view of Earth.

NOAA operates a pair of geostationary satellites that collect imagery and monitor weather over the United States and most of the Atlantic and Pacific oceans. The mission, known as the Geostationary Operational Environmental Satellite (GOES) program, has been around since 1975; the current satellites are GOES-18 and GOES-19.

When Beaman started his career just a few years before Katrina hit, satellite imagery from GOES-8 to GOES-12 was typically beamed to Earth every 30–45 minutes—sometimes as often as every 15 minutes. Now it’s routine to receive images every 5 minutes or even as often as every 30 seconds. Having more frequent updates makes for much smoother animations of a hurricane’s track, meaning fewer gaps in the understanding of a storm’s path and intensification.

For Beaman, the launch of the GOES-16 satellite in 2016 marked a particularly important advance: In addition to beaming data to scientists more frequently, it scanned Earth with 4 times the resolution of the previous generation of satellites. It could even detect lightning flashes, which can sometimes affect the structure and intensity of a hurricane.

The transition to GOES-16 “was like going from black-and-white television to 4K television.”

The transition to GOES-16 “was like going from black-and-white television to 4K television,” Beaman said.

NOAA also has three polar-orbiting satellites, launched between 2011 and 2017, that orbit Earth from north to south 14 times a day. As part of the Joint Polar Satellite System (JPSS) program, the satellites’ instruments collect data such as temperature, moisture, rainfall rates, and wind for large swaths of the planet. They also provide microwave imagery using radiation emitted from water droplets and ice. NOAA’s earlier polar-orbiting satellites had lower resolution at the edges of scans, a more difficult time differentiating clouds from snow and fog, and less accurate measurements of sea surface temperature.

“With geostationary satellites, you’re really just looking at the cloud tops,” explained Daniel Brown, branch chief of the Hurricane Specialist Unit at NOAA’s National Hurricane Center in Miami. “With those microwave images, you can really kind of see into the storm, looking at structure, whether an eye has formed. It’s really helpful for seeing the signs of what could be rapid intensification.”

NOAA’s Geostationary Operational Environmental Satellites (GOES) monitor weather over the United States and most of the Atlantic and Pacific oceans. Credit: NOAA/Lockheed Martin, Public Domain

Rapid intensification is commonly defined as an increase in maximum sustained wind speed of 30 or more nautical miles per hour in a 24-hour period. Katrina had two periods of rapid intensification, and they were one reason the storm was so deadly. In the second period, the storm strengthened from a low-end category 3 hurricane (in which winds blow between 178 and 208 kilometers per hour, or between 111 and 129 miles per hour) to a category 5 hurricane (in which winds blow faster than 252 kilometers per hour, or 157 miles per hour) in less than 12 hours.
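That definition, a gain of at least 30 knots in maximum sustained winds within 24 hours, is easy to check against a wind time series. The sketch below is an illustrative implementation with made-up observations, not an operational algorithm.

```python
import numpy as np

def rapid_intensification(times_h, winds_kt, window_h=24.0, threshold_kt=30.0):
    """Return True if maximum sustained winds increase by >= threshold_kt
    within any window_h-hour span (simplified; operational practice uses
    fixed synoptic times and interpolation)."""
    times_h = np.asarray(times_h, dtype=float)
    winds_kt = np.asarray(winds_kt, dtype=float)
    for i in range(times_h.size):
        in_window = (times_h > times_h[i]) & (times_h <= times_h[i] + window_h)
        if in_window.any() and (winds_kt[in_window] - winds_kt[i]).max() >= threshold_kt:
            return True
    return False

# Made-up 6-hourly winds resembling a storm strengthening from category 3 to 5
print(rapid_intensification([0, 6, 12, 18, 24], [100, 110, 125, 140, 150]))  # True
```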

New Angles

Radar technology has also made strides in the decades since Katrina. Hurricane-tracking radar works via a ground- or aircraft-based transmitter sending out a radio signal. When the signal encounters an obstacle in the atmosphere, such as a raindrop, it bounces back to a receiver. The amount of time it takes for the signal to return provides information about the location of the obstacle.
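In its simplest form, that timing converts to distance through the standard pulse-ranging relation (a textbook formula, not anything specific to the NWS network),

$$d = \frac{c\,\Delta t}{2},$$

where $d$ is the range to the target, $c$ is the speed of light, and $\Delta t$ is the round-trip travel time of the pulse; the factor of 2 accounts for the signal traveling out and back.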

Between 2011 and 2013, NWS upgraded its 150+ ground-based radars throughout the United States with dual-polarization technology—a change a 2013 NWS news release called “the most significant enhancement made to the nation’s radar network since Doppler radar was first installed in the early 1990s.”

So-called dual-pol technology sends both horizontal and vertical pulses through the atmosphere. With earlier technology, a radar signal might tell researchers only the location of precipitation. Dual-pol can offer information about how much precipitation is falling, the sizes of raindrops, and the type of precipitation or can even help researchers identify debris being transported in a storm.

Credit: NOAA

“That’s not something that we had back in Katrina’s time,” Beaman said. In 2005, forecasters used “much more crude ways of trying to calculate, from radar, how much rain may have fallen.”

Radar updates have become more frequent as well. Beaman said his office used to receive routine updates every 5 or 6 minutes. Now they receive updated radar imagery as often as every minute.

Hunting Hurricanes from the Skies

For a more close-up view of a hurricane, NOAA and the U.S. Air Force employ Hurricane Hunters—planes that fly directly through or around a storm to take measurements of pressure, humidity, temperature, and wind speed and direction. These aircraft also scan the storms with radar and release devices called dropwindsondes, which take similar measurements at various altitudes on their way down to the ocean.

NOAA’s P-3 Orion planes and the 53rd Weather Reconnaissance Squadron’s WC-130J planes fly through the eyes of storms. NOAA’s Gulfstream IV jet takes similar measurements from above hurricanes and thousands of square kilometers around them, also releasing dropwindsondes along the way. These planes gather information about the environment in which storms form. A 2025 study showed that hurricane forecasts that use data from the Gulfstream IV are 24% more accurate than forecasts based only on satellite imagery and ground observations.

The NOAA P-3 Hurricane Hunter aircraft captured this image from within the eye of Hurricane Katrina on 28 August 2005, 1 day before the storm made landfall. Credit: NOAA, Public Domain

Hurricane Hunters’ tactics have changed little since Katrina, but Brown said that in the past decade or so, more Hurricane Hunter data have been incorporated into models and have contributed to down-to-Earth forecasting.

Sundararaman “Gopal” Gopalakrishnan, senior meteorologist with NOAA’s Atlantic Oceanographic and Meteorological Laboratory’s (AOML) Hurricane Research Division, emphasized that Hurricane Hunter data have been “pivotal” for improving both the initial conditions of models and the forecasting of future storms.

With Hurricane Hunters, “you get direct, inner-core structure of the storm,” he said.

Hurricane Hunters are responsible for many of the improvements in hurricane intensity forecasting over the past 10–15 years, said Ryan Torn, an atmospheric and environmental scientist at the University at Albany and an author of the recent study about Gulfstream IVs. One part of this improvement, he explained, is that NOAA began flying Hurricane Hunters not just for the largest storms but for weaker and smaller ones as well, allowing scientists to compare what factors differentiate the different types.

“We now have a very comprehensive observation dataset that’s come from years of flying Hurricane Hunters into storms,” he said. These datasets, he added, make it possible to test how accurately a model is predicting wind, temperature, precipitation, and humidity.

In 2021, NOAA scientists also began deploying uncrewed saildrones in the Caribbean Sea and western Atlantic to measure changes in momentum at the sea surface. The drones are designed to fill observational gaps between floats and buoys on the sea surface and Hurricane Hunters above.

Modeling Track and Intensity

From the 1980s to the early 2000s, researchers were focused on improving their ability to forecast the path of a hurricane, not necessarily what that hurricane might look like when it made landfall, Gopalakrishnan explained.

Brown said a storm’s track is easier to forecast than its intensity because a hurricane generally moves “like a cork in the stream,” influenced by large-scale weather features like fronts, which are more straightforward to identify. Intensity forecasting, on the other hand, requires a more granular look at factors ranging from wind speed and air moisture to water temperature and wind shear.

Gopalakrishnan said storms like 2005’s Katrina and Rita “showed the importance of [tracking a storm’s] intensity, especially rapid intensification.”

Without intensity forecasting, Gopalakrishnan said, some of the most destructive storms might appear “innocuous” not long before they wreak havoc on coastlines and lives. “Early in the evening, nobody knows about it,” he explained. “And then, early in the morning, you see a category 3 appear from nowhere.”

Gopalakrishnan came to AOML in 2007 to set up both the Hurricane Modeling Group and NOAA’s Hurricane Forecast Improvement Project. He had begun working on what is now known as the Hurricane Weather Research and Forecasting (HWRF) model in 2002 in his role at NOAA’s Environmental Modeling Center. With the formation of the Hurricane Modeling Group in 2007, scientists decided to focus on using HWRF to forecast intensity changes.

HWRF used a technique called moving nests to model the path of a storm in higher resolution than surrounding areas. Gopalakrishnan compared a nest to using a magnifying glass focused on the path of a storm. Though a model might simulate a large area to provide plenty of context for a storm’s environment, capturing most of an area in lower resolution and the storm path itself in higher resolution can save computing power.
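
A back-of-the-envelope sketch shows why a nest saves computing power: It compares the number of grid columns needed to cover a hypothetical domain at fine resolution everywhere versus a coarse domain with one fine nest. The domain sizes below are invented; only the 9- and 1.5-kilometer spacings echo figures mentioned in this article.

```python
# Back-of-the-envelope comparison of grid columns with and without a nest.
# Domain sizes are invented; only the 9 km and 1.5 km spacings echo the text.

def columns(domain_km, spacing_km):
    per_side = domain_km / spacing_km
    return per_side ** 2

outer_domain_km = 6000   # hypothetical region around the storm
nest_domain_km = 600     # hypothetical high-resolution nest following the storm

fine_everywhere = columns(outer_domain_km, 1.5)
coarse_plus_nest = columns(outer_domain_km, 9) + columns(nest_domain_km, 1.5)

print(f"Fine grid everywhere:      {fine_everywhere:,.0f} columns")
print(f"Coarse grid + moving nest: {coarse_plus_nest:,.0f} columns")
```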

By 2014, Gopalakrishnan said, the model’s track and intensity forecasting capabilities had improved 25% since 2007. The model’s horizontal resolution also improved, from 9 kilometers in 2007 to 1.5 kilometers by the time it was retired in 2023.

Since 2007, the National Hurricane Center’s official (OFCL) track forecast errors decreased between 30% and 50%, and intensity errors shrank by up to 55%. MAE = mean absolute error; VMAX = maximum sustained 10-meter winds. Credit: Alaka et al., 2024, https://doi.org/10.1175/BAMS-D-23-0139.1

Over time, advances in data assimilation, the process of feeding observations into models, meant that the better data coming from satellites, radars, and Hurricane Hunters improved forecasts even further. Gopalakrishnan estimated that by 2020, his office could predict hurricane track and intensity with somewhere between 50% and 54% more accuracy than in 2007.

NOAA began transitioning operations to a new model known as the Hurricane Analysis and Forecast System (HAFS) in 2019, and HAFS became the National Hurricane Center’s operational forecasting model in 2023. HAFS, developed jointly by several NOAA offices, can more reliably forecast storms, in part by increasing the use of multiple nests—or multiple high-resolution areas in a model—to follow multiple storms at the same time. HAFS predicted the rapid intensification of Hurricanes Helene and Milton in 2024.

Just as they did with HWRF, scientists run multiple versions of HAFS each year: an operational model, used to inform the public, and a handful of experimental models. At the end of hurricane season, researchers examine which versions performed best and begin combining their strongest elements into the next generation of the operational model. The team expects that as HAFS improves, it will extend forecasts beyond the 5 days offered by previous models.

“As a developer [in 2007], I would have been happy to even get 2 days forecast correctly,” Gopalakrishnan said. “And today, I’m aiming to get a 7-day forecast.”

NOAA’s budget plan for 2026 could throw a wrench into this progress, as it proposes eliminating all NOAA labs, including AOML.

The Role of Communication

An accurate hurricane forecast does little good if the information isn’t shared with the people who need it. And communication about hurricane forecasts has seen its own improvements in the past 2 decades. NWS has partnered with social scientists to learn how to craft the most effective messages for the public, something Beaman said has paid dividends.

Communication between the National Hurricane Center and local weather service offices now takes place over video calls rather than by phone, as it once did. Sharing information visually can make these calls more straightforward and efficient. NWS began sending wireless emergency alerts directly to cell phones in 2012.

In 2017, the National Hurricane Center began issuing storm surge watches and warnings in addition to hurricane watches and warnings. Beaman said storm surge inundation graphics, which show which areas may experience flooding, may have contributed to a reduction in storm surge–related fatalities. In the 50-year period between 1963 and 2012, around 49% of storm fatalities were related to storm surge, but by 2022, that number was down to 11%.

“You take [the lack of visualization] back to Katrina in 2005, one of the greatest storm surge disasters our country has seen, we’re trying to express everything in words,” Beaman said. “There’s no way a human can properly articulate all the nuances of that.”

Efforts to create storm data visualization go beyond NOAA.

Carola and Hartmut Kaiser moved to Baton Rouge, La., just weeks before Hurricane Katrina made landfall. Hartmut, a computer scientist, and Carola, an information technology consultant with a cartography background, were both working at Louisiana State University. When the historic storm struck, Hartmut said they wondered, “What did we get ourselves into?”

Shortly after the storm, the Kaisers combined their expertise and began work on the Coastal Emergency Risks Assessment (CERA). The project, led by Carola, is an easy-to-use interface that creates visual representations of data, including storm path, wind speed, and water height, from the National Hurricane Center, the Advanced Circulation Model (ADCIRC), and other sources.

The Coastal Emergency Risks Assessment tool aims to help the public understand the potential timing and impacts of storm surge. Here, it shows a forecast cone for Hurricane Erin in August 2025, along with predicted maximum water height levels. Credit: Coastal Emergency Risks Assessment

What started as an idea for how to make information more user-friendly for the public, emergency managers, and the research community grew quickly: Hundreds of thousands of people now use the tool during incoming storm events, Hartmut said. The Coast Guard often moves its ships to safe regions on the basis of CERA’s predictions, and the team frequently receives messages of thanks.

“We know of a lot of people who said, ‘Yes, thank you, [looking at CERA] caused me to evacuate,’” Hartmut said. “And now my house is gone, and I don’t know what would have happened if I didn’t go.”

Looking Forward

Unlike hurricane season itself, the work of hurricane modelers has no end. When the season is over, teams such as Gopalakrishnan’s review the single operational and several experimental models that ran throughout the season, then work all year on building an upgraded operational model.

“It’s 365 days of model developments, testing, and evaluation,” he said.

NOAA scientists aren’t the only ones working to improve hurricane forecasting. For instance, researchers at the University of South Florida’s Ocean Circulation Lab (OCL) and the Florida Flood Hub created a storm surge forecast visualization tool based on the lab’s models. The West Florida Coastal Ocean Model, East Florida Coastal Ocean Model, and Tampa Bay Coastal Ocean Model were designed for the coastal ocean with a sufficiently high resolution to model small estuaries and shipping channels.

Though Yonggang Liu, a coastal oceanographer and director of OCL, cited examples of times his lab’s models have outperformed NOAA’s models, the tool is not used in operational NOAA forecasts. But it is publicly available on the OCL website (along with a disclaimer that the analyses and data are “research products under development”).

The Cyclone Global Navigation Satellite System (CYGNSS) is a NASA mission that pairs signals from existing GPS satellites with a specialized radar receiver to measure reflections off the ocean surface—a proxy for surface wind speed. The constellation of eight satellites can take measurements more frequently than GOES satellites, allowing for better measurement of rapid intensification, said Chris Ruf, a University of Michigan climate and space scientist and CYGNSS principal investigator.

It might seem that if a method or mission offers a way to more accurately forecast hurricanes, it should be promptly integrated into NOAA’s operational models. But Ruf explained NOAA’s hesitation to use data from university-led efforts: Because they are outside of NOAA’s control and could therefore lose funding or otherwise stop running, it’s too risky for NOAA to rely on such projects.

“CYGNSS is a one-off mission that was funded to go up there and do its thing, and then, when it deorbits, it’s over,” Ruf said. “They [at NWS] don’t want to invest a lot of time learning how to assimilate some new data source and then have the data disappear later. They want to have operational usage where they can trust that it’s going to be there later on.”

Whatever office they’re in, it’s scientists who make the work of hurricane forecasting possible. Gopalakrishnan said that during Katrina, there were two or three people at NOAA associated with model development. He credits the modeling improvements made since then to the fact that, now, there’s a team of several dozen. And more advances may be on the horizon. For instance, NOAA expects a new Hurricane Hunter jet, a G550, to join the ranks by 2026.

However, some improvements are stalling. The Geostationary Extended Observations (GeoXO) satellite system is slated to begin expanding on the observations made by the GOES satellites in the early 2030s. But the 2026 U.S. budget proposal, which proposes slashing $209 million from NOAA’s efforts to procure weather satellites and infrastructure, specifically calls for a “rescope” of the GeoXO program.

Hundreds of NOAA scientists have been laid off since January 2025, including Hurricane Hunter flight directors and researchers at AOML (though NWS received permission to rehire hundreds of meteorologists, hydrologists, and radar technicians, as well as hire for previously approved positions, in August).

In general, hurricane fatalities are decreasing: As of 2024, the 10-year average in the United States was 27, whereas the 30-year average was 51. But this decrease is not because storms are becoming less dangerous.

“Improved data assimilation, improved computing, improved physics, improved observations, and more importantly, the research team that I could bring together [were] pivotal” in enabling the past 2 decades of forecasting improvements, said Gopalakrishnan. “These improvements cannot happen as a one-man army. It’s a team.”

—Emily Dieckman (@emfurd.bsky.social), Associate Editor

Citation: Dieckman, E. (2025), How researchers have studied the where, when, and eye of hurricanes since Katrina, Eos, 106, https://doi.org/10.1029/2025EO250320. Published on 28 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

A Debate over Wakes in the Land of 10,000 Lakes

Fri, 08/29/2025 - 12:01

Wakeboats are causing a stir in Minnesota.

Though all powerboats create wakes, these specialty craft have heavier sterns and engines specifically designed to shape water into surfable waves. That extra turbulence is drawing ire from other lake-lovers.

Across the state, Minnesotans are reporting eroding banks, murky waters, and shredded vegetation. When considering wakeboats, one person’s recreation is another’s resentment.

“It’s divisive,” said Joe Shneider, president of the Minnesota Coalition of Lake Associations. “The three big issues we hear all the time [about wakeboats] are personal safety, bank erosion, and lake bed disruption.”

Specialty wakeboats are designed to shape water into surfable waves, allowing riders to follow behind without needing a towrope. New research shows how those wakes can affect the lake bed below. Credit: Colin Van Dervort/Flickr, CC BY 2.0

As the popularity and size of wakeboats grow, so does the need for data. Communities are wrestling with issues of regulation and education, and both approaches require information. That’s why Shneider and more than 200 others helped crowdfund recent research from the University of Minnesota’s Saint Anthony Falls Laboratory. (The state also supported the project.) The resulting public dataset shows how wakeboats can churn lake beds, information that can help communities navigate the brewing conflict.

The Stakes

Minnesota is not the only state navigating a great wake debate. In 2024, Maine implemented wakeboat regulations and Vermont restricted wake surfing to its 30 largest lakes. (Some residents want the number further reduced to 20.) In Wisconsin, individual municipalities are debating bans on wake surfing at hundreds of lakes, prompting at least one lawsuit.

Minnesota, in contrast, has issued wakeboat regulations at only one of its 10,000 lakes.

The environmental issues at stake arise in shallow water, where powerboats can stir up obvious trails of sediment. Resuspended sediment absorbs sunlight, which heats the water column. Turbidity reduces the feeding rates of some fishes. Once-buried nutrients again become available, triggering toxic algal blooms that choke beaches and rob fish of oxygen.

But to connect the dots between wakeboat use and ecosystem disruption, researchers needed to document how various powerboats affect sediment dispersal.

“We want to understand how boats are interacting with the water column and provide data, because there’s a whole lot of people out there that need to make decisions about their lake,” said Jeff Marr, a hydraulic engineer at the University of Minnesota and a coauthor of the study.

The Wake

On Lake Minnetonka, just west of Minneapolis, seven locals lent their boats for the research. These watercraft ranged from relatively light, low-power deck boats (150-horsepower, 2,715 pounds) to burly bowriders (760-horsepower, 14,530 pounds) and included two boats built for wake surfing.

On test days, volunteers piloted their boats between buoy-marked goalposts. Acoustic sensors on the lake bed tracked pressure changes in the water column.

Powerboats mostly operate at either displacement speed (chugging low in the water) or planing speed (skipping faster along the surface). But there’s a transition called semidisplacement, in which the stern sinks in the water and waves spike in size.

“It’s right at that transition that [wakeboats] like to operate,” said Andy Riesgraf, an aquatic biologist at the University of Minnesota and a coauthor of the study.

Boaters drove the course five times at planing speed (21–25 miles per hour, common for water-skiing and tubing) and five times at displacement or semidisplacement mode (7–11 miles per hour, common for cruising and wake surfing). Researchers in rowboats paddled to collect water samples at various intervals in the track.

Researchers Chris Feist and Jessica Kozarek stand by the research rowboat. To minimize disruption in the water column, the human-powered sampling team paddled into the wake racetrack to collect 1-liter water samples at three different depths. Credit: Saint Anthony Falls Laboratory

The acoustic sensors showed that three types of waves affected the water column. Pressure waves, created by the immediate shift and rebound of water around a boat, were short-lived but strong enough to shake loose sediments. Transverse waves, which follow the boat’s path, and propeller wash, the frothy vortex generated by its engines, both elevated loose sediment and caused minutes-long disturbances.

Though all boats created these waves, the wakeboats churned the most sediment.

In planing mode, all seven boats caused brief and minimal disturbances. Sediments settled in less than 10 seconds at 9- and 14-foot depths. But when operating in slower, semidisplacement mode, wakeboats created a distinct disturbance. Following a pass from a wakeboat, sediment needed 8 minutes to settle at 14-foot depth and more than 15 minutes at 9-foot depth.

The research team released simple recommendations based on their findings. One recommendation is that all recreational powerboats should operate in at least 10 feet of water to minimize disturbances. Another is that wakeboats, when used for surfing, need 20 feet of water to avoid stirring up sediments and altering the ecosystem.

The Uptake

The new research adds to the group’s existing dataset on powerboats’ hydrologic impacts on lake surfaces.

Whether the suggestions lead to regulations is up to lake managers.

“Our goal is just to get the data out,” Marr said. The researchers published their findings in the University of Minnesota’s open-access digital library so that everyday lake-goers can find the information. Three external experts reviewed the material.

The results add information to the policy debate. “If there is going to be some type of environmental regulation [on powerboating], you need very clear evidence that under these conditions, it’s detrimental,” said Chris Houser, a coastal geomorphologist at the University of Waterloo who was not involved in the project.

There are other variables to study—such as the number of boats on the water and the paths they’re carving—but “the more we continue to collect this data, the more we start to fill in those other gaps of different depths and different configurations,” Houser said.

For Shneider, the new data add much-needed clarity. The latest report “is monumental,” he said.

Marr, Riesgraf, and their colleagues are now comparing the impacts of boat-generated wakes against wind-driven waves. Those data could further isolate the impacts powerboats have on lakes.

—J. Besl (@J_Besl, @jbesl.bsky.social), Science Writer

This news article is included in our ENGAGE resource for educators seeking science news for their classroom lessons. Browse all ENGAGE articles, and share with your fellow educators how you integrated the article into an activity in the comments section below.

Citation: Besl, J. (2025), A debate over wakes in the land of 10,000 lakes, Eos, 106, https://doi.org/10.1029/2025EO250316. Published on 29 August 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

USDA Moves to Rescind Roadless Rule Protecting 45 Million Acres of Wild Area

Thu, 08/28/2025 - 21:11
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

The U.S. Department of Agriculture (USDA) is proposing to rescind the 2001 Roadless Area Conservation Rule, which protects about 45 million acres (182,000 square kilometers) of National Forest System lands from road construction, reconstruction, and timber harvests.

Of the land that would be affected by the rescission, more than 95% is in 10 western states: Alaska, Montana, California, Utah, Wyoming, Nevada, Washington, Oregon, New Mexico, and Arizona. The change would not apply to Colorado and Idaho, which have state-specific roadless rules.

Secretary of Agriculture Brooke L. Rollins first announced the USDA’s rescission of the rule on 23 June, prompting negative responses from several environmental, conservation, and Native groups.

“The Tongass is more than an ecosystem—it is our home. It is the foundation of our identity, our culture, and our way of life,” said a letter from the Central Council of the Tlingit and Haida Indian Tribes of Alaska to the USDA and the U.S. Forest Service. “We understand the need for sustainable industries and viable resource development in Southeast Alaska. Our communities need opportunities for economic growth, but that growth must be guided by those who call this place home.”

 

On 27 August, the USDA released a statement about the agency taking “the next step in the rulemaking process,” noting that the proposal aligned with several recent executive orders, including Executive Order 14192, Unleashing Prosperity Through Deregulation, and Executive Order 14153, Unleashing Alaska’s Extraordinary Resource Potential.

“This administration is dedicated to removing burdensome, outdated, one-size-fits-all regulations that not only put people and livelihoods at risk but also stifle economic growth in rural America,” Rollins said in the release.

A notice of intent seeking public comment on the proposal was published in the Federal Register on Friday, 29 August, but a preview of the document became available for public inspection on 28 August. The document suggests that the rule has posed “undue burden on production of the Nation’s timber and identification, development, and use of domestic energy and mineral resources.” Repealing the rule, the document states, would allow local land managers to make more tailored decisions and would enable better wildfire suppression.

“This scam is cloaked in efficiency and necessity,” said Nicole Whittington-Evans, senior director of Alaska and Northwest programs at Defenders of Wildlife, in a statement. “But in reality, it will liquidate precious old-growth forest lands critical to Alaska Natives, local communities, tourists and countless wildlife, who all depend on intact habitat for subsistence harvesting, recreation and shelter. Rare and ancient trees will be shipped off at a loss to taxpayers, meaning that Americans will subsidize the destruction of our own natural heritage.”  

The proposal will be open for public comment through 19 September.

–Emily Dieckman, Associate Editor (@emfurd.bsky.social)

29 August 2025: This article was updated with a link to the notice of intent published in the Federal Register.

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

A Burst of Subglacial Water Cracked the Greenland Ice Sheet

Thu, 08/28/2025 - 13:12

Greenland, despite its name, is largely blanketed in ice. And beneath that white expanse lies a world of hidden lakes. Researchers have now used satellite observations to infer that one such subglacial lake recently burst through the surface of the Greenland Ice Sheet, an unexpected and unprecedented event. By connecting this outburst with changes in the velocity and calving of a nearby glacier, the researchers helped to unravel how subglacial lakes affect ice sheet dynamics. These results were published in Nature Geoscience.

Researchers have known for decades that pools of liquid water exist beneath the Antarctic Ice Sheet, but scientific understanding of subglacial lakes in Greenland is much more nascent. “We first discovered them about 10 years ago,” said Mal McMillan, a polar scientist at Lancaster University and the Centre for Polar Observation and Modelling, both in the United Kingdom.

Subglacial lakes can exert a significant influence on an ice sheet. That’s because they affect how water drains from melting glaciers, a mechanism that in turn causes sea level rise, water freshening, and a host of other processes that affect local and global ecosystems.

McMillan is part of a team that recently studied an unusual subglacial lake beneath the Greenland Ice Sheet. The work was led by Jade Bowling, who was a graduate student of McMillan’s at the time; Bowling is now employed by Natural England.

Old, but Not Forgotten, Data

In the course of mining archival satellite observations of the height of the Greenland Ice Sheet, the team spotted something unusual in a 2014 dataset: An area of roughly 2 square kilometers had dropped in elevation by more than 80 meters (260 feet) between two satellite passes just 10 days apart. That deflation reflected something going on deep beneath the surface of the ice, the researchers surmised.

A subglacial lake that previously was situated at the interface between the ice and the underlying bedrock must have drained, said McMillan, leaving the ice above it hanging unsupported until it tumbled down. The team used the volume of the depression to estimate that roughly 90 million cubic meters (more than 3.1 billion cubic feet) of water had drained from the lake between subsequent satellite observations, making the event one of Greenland’s biggest subglacial floods in recorded history.

Subglacial lakes routinely grow and shrink, however, so that observation by itself wasn’t surprising. What was truly unexpected lay nearby.

“We also saw an appearance, about a kilometer downstream, of a huge area of fractures and crevassing,” McMillan said. And beyond that lay 6 square kilometers (2.3 square miles)—an area roughly the size of lower Manhattan—that was unusually smooth.

The researchers concluded that after the subglacial lake drained, its waters likely encountered ice frozen to the underlying bedrock and were forced upward and through the surface of the ice. The water then flowed across the Greenland Ice Sheet before reentering the ice several kilometers downstream, leaving behind the polished, 6-square-kilometer expanse.

“This was unexpected,” said McMillan. “We haven’t seen this before.”

A Major Calving, a Slowing Glacier

It’s most likely that the floodwater traveled under northern Greenland’s Harder Glacier before finally flowing into the ocean.

Within the same 10-day period, Harder Glacier experienced its seventh-largest calving event in the past 3 decades. It’s impossible to know whether there’s a direct link between the subglacial lake draining and the calving, but it’s suggestive, said McMillan. “The calving event that happened at the same point is consistent with lots of water flooding out” from the glacier.

Using data from several Earth-observing satellites, scientists discovered that a huge subglacial flood beneath the Greenland Ice Sheet occurred with such force that it fractured the ice sheet, resulting in a vast quantity of meltwater bursting upward through the ice surface. Credit: ESA/CPOM/Planetary Visions

The team also found that Harder Glacier rapidly decelerated—3 times more quickly than normal—in 2014. That’s perhaps because the influx of water released by the draining lake carved channels in the ice that acted as conduits for subsequent meltwater, the team suggested. “When you have normal melting, it can just drain through these channels,” said McMillan. Less water in and around the glacier means less lubrication. “That’s potentially why the glacier slowed down.”

That reasoning makes sense, said Winnie Chu, a polar geophysicist at the Georgia Institute of Technology in Atlanta who was not involved in the research. “It’s like you riding on a waterslide versus a rockslide. You’re going to slide a lot faster on the waterslide.”

Just a One-Off?

In the future, McMillan and his colleagues hope to pinpoint similar events. “We don’t have a good understanding currently of whether it was a one-off,” he said.

Getting access to higher temporal resolution data will be important, McMillan added, because such observations would help researchers understand just how rapidly subglacial lakes are draining. Right now, it’s unclear whether this event occurred over the course of hours or days, because the satellite observations were separated by 10 days, McMillan said.

It’s also critical to dig into the mechanics of why the meltwater traveled vertically upward and ultimately made it to the surface of the ice sheet, Chu said. The mechanism that this paper is talking about is novel and not well reproduced in models, she added. “They need to explain a lot more about the physical mechanism.”

But something this investigation clearly shows is the value of digging through old datasets, said Chu. “They did a really good job combining tons and tons of observational data.”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), A burst of subglacial water cracked the Greenland Ice Sheet, Eos, 106, https://doi.org/10.1029/2025EO250317. Published on 28 August 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Fossilized Micrometeorites Record Ancient CO2 Levels

Thu, 08/28/2025 - 13:10

Micrometeorites, unlike their larger brethren, rarely get a spotlight at museums. But there’s plenty to learn from these extraterrestrial particles, despite the largest of them measuring just millimeters across.

Nearly 50 tons of extraterrestrial material fall on Earth every day, and the majority of that cosmic detritus is minuscule. Micrometeorites are, by definition, smaller than 2 millimeters in diameter, and they’re ubiquitous, said Fabian Zahnow, an isotope geochemist at Ruhr-Universität Bochum in Germany. “You can basically find them everywhere.”

Researchers recently analyzed fossilized micrometeorites that fell to Earth millions of years ago. They extracted whiffs of atmospheric oxygen incorporated into the particles and showed that carbon dioxide (CO2) levels during the Miocene and Cretaceous did not differ wildly from modern-day values. The results were published in Communications Earth and Environment.

Extraterrestrial Needles in Rocky Haystacks

Newly fallen micrometeorites can be swept from rooftops and dredged from the bottoms of lakes.

Zahnow and his collaborators, however, opted to turn back the clock: The team analyzed a cadre of micrometeorites that fell to Earth millions of years ago and have since been fossilized. The team sifted through more than a hundred kilograms of sedimentary rocks, mostly unearthed in Europe, to discover 92 micrometeorites rich in iron. They added eight other iron-dominated micrometeorites from personal collections to bring their sample to 100 specimens.

Metal-rich micrometeorites such as these are special, said Zahnow, because they function like atmospheric time capsules. As they hurtle through the upper atmosphere on their way to Earth, they melt and oxidize, meaning that atmospheric oxygen gets incorporated into their otherwise oxygen-free makeup.

“When we extract them from the rock record, we have our oxygen, in the best case, purely from the Earth’s atmosphere,” said Zahnow.

Ancient Carbon Dioxide Levels

And that oxygen holds secrets about the past. It turns out that atmospheric oxygen isotope ratios—that is, the relative concentrations of the three isotopes of oxygen, 16O, 17O, and 18O—correlate with the amount of photosynthesis occurring and how much CO2 is present at the time. That fact, paired with model simulations of ancient photosynthesis, allowed Zahnow and his colleagues to infer long-ago atmospheric CO2 concentrations.

Reconstructing Earth’s atmosphere as it was millions of years ago is important because atmospheric gases affect our planet so fundamentally, said Matt Genge, a planetary scientist at Imperial College London not involved in the work. “The story of the atmosphere is the story of life on Earth.”

But Zahnow and his collaborators first had to make sure the oxygen in their micrometeorites hadn’t been contaminated. Terrestrial water, with its own unique oxygen isotope ratios, can seep into micrometeorites that would otherwise reflect atmospheric oxygen isotope ratios from long ago. That’s a common problem, said Zahnow, given the ubiquity of water on Earth. “There’s always some water present.”

The team found that the presence of manganese in their micrometeorites was a tip-off that contamination had occurred. “Extraterrestrial metal has basically no manganese,” said Zahnow. “Manganese is really a tracer for alteration.”

Unfortunately, the vast majority of the researchers’ micrometeorites contained measurable quantities of manganese. In the end, Zahnow and his collaborators deemed that only four of their micrometeorites were uncontaminated.

Those micrometeorites, which fell to Earth during the Miocene (9 million years ago) and the Late Cretaceous (87 million years ago), suggested that CO2 levels during those time periods were, on average, roughly 250–300 parts per million. That’s a bit lower than modern-day levels, which hover around 420 parts per million.

The team’s findings are consistent with values suggested previously, said Genge, but unfortunately, the team’s numbers just aren’t precise enough to conclude anything meaningful. “You have a really huge uncertainty,” he said.

The team’s methods are solid, however, said Genge, and the researchers made a valiant effort to measure what are truly faint whiffs of ancient oxygen. “It’s a brave attempt.”

In the future, it would be valuable to collect a larger number of pristine micrometeorites dating to time periods when model reconstructions suggest anomalously high CO2 levels, said Zahnow. “What we really hoped for was to get pristine micrometeorites from periods where the reconstructions say really high concentrations.”

Confirming, with data, whether such time periods, such as the Triassic, truly had off-the-charts CO2 levels would be valuable for understanding how life on Earth responded to such an abundance of CO2.

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), Fossilized micrometeorites record ancient CO2 levels, Eos, 106, https://doi.org/10.1029/2025EO250319. Published on 28 August 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

As Simple as Possible: The Importance of Idealized Climate Models

Thu, 08/28/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

“Everything should be made as simple as possible, but not simpler.” This popular saying paraphrases a sentiment expressed by Einstein about the need for simplicity, though not at the expense of accuracy. Modeling of the Earth’s climate system has become an incredibly complex endeavor, especially when the physics of atmospheric motion is coupled with complex, nonlinear feedbacks from the ocean and land surface and with forcing by collective human actions. Such complexity can make the underlying causes of model behaviors hard to diagnose and can make it prohibitively expensive to perform targeted experiments.

Two very recent developments, the emergence of kilometer-scale simulations and the rapid growth of machine learning (ML) approaches, have further increased the computational complexity of modeling global climate. In their commentary, Reed et al. [2025] remind us of the benefits of maintaining and applying a hierarchy of models with different levels of complexity. They make a special plea not to forget the power of using idealized, or simplified, climate models for hypothesis testing, model development, and teaching. 

Citation: Reed, K. A., Medeiros, B., Jablonowski, C., Simpson, I. R., Voigt, A., & Wing, A. A. (2025). Why idealized models are more important than ever in Earth system science. AGU Advances, 6, e2025AV001716. https://doi.org/10.1029/2025AV001716

—Susan Trumbore, Editor, AGU Advances

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The 26 August 2025 landslide on the Vaishno Devi pilgrimage route in India

Thu, 08/28/2025 - 06:51

On 26 August 2025, a landslide triggered by extraordinary rainfall killed at least 34 people and injured another 20 individuals.

On 26 August 2025, extremely intense late monsoon rainfall struck parts of Jammu and Kashmir in northern India, triggering extensive flooding and landslides. Unfortunately, a significant landslide occurred on the route to the Vaishno Devi shrine, a sacred Hindu site that attracts large numbers of pilgrims. At the time of writing, the reported loss of life is 34 people, with 20 more injured.

I can find little detailed information about this landslide at present – the site is remote and access is clearly extremely difficult. However, this highlights a very major issue that India faces during the monsoon.

The Google Earth image below shows the terrain around the Vaishno Devi shrine (which is located at [33.03004, 74.948032]):-

Google Earth image showing the terrain around the Vaishno Devi shrine in northern India.

The landscape here is steep and geologically vulnerable, and the shrine is located on a remote mountain side, accessed by tracks. There is a good account of making the pilgrimage here – this person started the journey at 19:15 and they arrived at 02:00 the next day. The route is well-established but the journey is long (13 km). Most people travel on foot. According to the temple itself, over 5.2 million people have the journey so far in 2025. Travel during the monsoon is not recommended, but many people inevitably make the trip at this time.

Thus, this pilgrimage, and others that also take devotees into the Himalayas, places people in a dynamic landscape at a time when landslides are most likely. Inevitably, the vulnerability of those people is high. The tragedy at Vaishno Devi is the consequence.

Unfortunately, this event is not isolated. On 14 August, another major landslide occurred at Chasoti in Kishtwar district, also in Jammu and Kashmir, on the route of the Machail Mata Yatra pilgrimage. The final loss of life is unclear, but at least 66 people were killed and some reports suggest as many as 75 more people may be missing. There have been a number of other fatal landslides this year on Hindu pilgrimage routes.

And loyal readers of this blog may remember the 2013 Kedarnath disaster, when vicious debris flows struck the route of the Chardham pilgrimage when it was packed with pilgrims. The remains of 733 victims were recovered, but 3,075 people remain missing. With a total of 3,808 victims, this was one of the worst landslide disasters of the last 30 years.

There are news reports that the Jammu and Kashmir chief minister, Omar Abdullah, is questioning why the Shri Mata Vaishno Devi Shrine Board did not suspend the pilgrimage. Reports indicate that the area received 629.4 mm of rainfall in a rolling 24-hour period, exceeding the previous record (342 mm) by a huge margin. In the view of the chief minister, these totals should have alerted the authorities to the potential for a disaster.

Whilst this is a pertinent question, it addresses a short-term issue rather than considering the underlying problems. The reality is that peak rainfall intensities in the summer monsoon are rapidly increasing across South Asia as a result of climate change, triggering landslides (especially channelised debris flows) and floods. At the same time, huge numbers of pilgrims are travelling into the landscape, where they are extremely vulnerable.

Managing this risk is very taxing, but many more people will lose their lives if systematic action is not taken to protect the pilgrims. These levels of loss cannot be tolerated.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

FEMA Puts Dissenting Staff on Indefinite Leave

Wed, 08/27/2025 - 14:52
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

Update 27 August 2025: This article has been updated to reflect newly released numbers of staff placed on leave.

On 25 August, 182 current and former staff members of the Federal Emergency Management Agency (FEMA) signed a declaration opposing the Trump administration’s actions to obstruct FEMA’s mission to provide relief and recovery assistance after natural disasters. The following evening, 36 FEMA staff, all signatories of that Katrina Declaration, were placed on indefinite administrative leave.

Colette Delawalla, the executive director of Stand Up for Science, an advocacy group that helped publicize the letter, told the New York Times that the move appeared to be an act of retaliation.

“Once again, we are seeing the federal government retaliate against our civil servants for whistleblowing—which is both illegal and a deep betrayal of the most dedicated among us,” she said.

This is illegal, plain and simple. FEMA workers are doing their duty as public servants by blowing the whistle on the dismantling of their agency — and whistleblowing is protected under federal law.

Stand Up for Science! (@standupforscience.bsky.social) 2025-08-27T01:25:29.308Z

Employees were told the leave was effective immediately. Stand Up for Science and the Washington Post both confirmed that two of those suspended were taken off duty from recovery work at the site of Texas floods that killed at least 135 people in early July.

The notice of placement on administrative leave stated that the decision “is not a disciplinary action and is not intended to be punitive.” However, FEMA spokesperson Daniel Llargues said in a statement, “It is not surprising that some of the same bureaucrats who presided over decades of inefficiency are now objecting to reform.”

The staff who were placed on administrative leave will receive pay and benefits but do no work.

FEMA staff sent their letter of dissent to Congress 20 years after Hurricane Katrina, one of the deadliest natural disasters in modern U.S. history. Experts have long argued that many more people died than should have because of human failures in disaster planning and implementation. Twenty years later, FEMA staff have warned that recent changes to the organization’s structure and procedures put the nation at risk for future Katrina-like disasters.

 

The letter specifically calls out reductions in disaster workforce, failure to appoint a Senate-confirmed FEMA administrator, elimination or reduction of risk reduction programs, interference with preparedness programs, censorship of climate science, and new policies regarding spending that have already delayed FEMA deployment to disaster areas.

The Katrina Declaration followed similar letters of dissent, also facilitated by Stand Up for Science, from the National Institutes of Health, NSF, EPA, and NASA. Not long after EPA staff sent their letter of dissent, 144 signatories were placed on administrative leave.

Only 36 individuals signed their names to the Katrina Declaration, while the rest chose to remain anonymous, likely for fear of similar retribution. (More people have signed the letter since 25 August, all anonymously.) The fears seem to have been well-founded: All those who signed their names were placed on leave.

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Machine Learning Simulates 1,000 Years of Climate

Wed, 08/27/2025 - 13:19
Source: AGU Advances

In recent years, scientists have found that machine learning–based weather models can make weather predictions more quickly using less energy than traditional models. However, many of those models are unable to accurately predict the weather more than 15 days into the future and begin to simulate unrealistic weather by day 60.

The Deep Learning Earth System Model, or DLESyM, is built on two neural networks that run in parallel: One simulates the ocean while the other simulates the atmosphere. During model runs, predictions for the state of the ocean update every four model days. Because atmospheric conditions evolve more rapidly, predictions for the atmosphere update every 12 model hours.
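
The update cadence described above can be sketched as a simple coupled loop. The code below is a conceptual illustration only, with placeholder step functions standing in for DLESyM’s actual neural networks.

```python
# Conceptual sketch of the cadence described above: the atmosphere component steps
# every 12 model hours, while the ocean component is refreshed every 4 model days.
# The step functions are placeholders, not DLESyM's actual neural networks.

ATMOS_STEP_HOURS = 12
OCEAN_STEP_HOURS = 4 * 24

def run_coupled(total_model_hours, atmos_state, ocean_state, atmos_step, ocean_step):
    for hour in range(0, total_model_hours, ATMOS_STEP_HOURS):
        # The ocean evolves slowly, so it is only updated every 4 model days.
        if hour % OCEAN_STEP_HOURS == 0:
            ocean_state = ocean_step(ocean_state, atmos_state)
        # The atmosphere evolves quickly and is updated at every 12-hour step,
        # conditioned on the most recent ocean state.
        atmos_state = atmos_step(atmos_state, ocean_state)
    return atmos_state, ocean_state

# Toy one-month run with placeholder dynamics that just count updates.
atmos, ocean = run_coupled(30 * 24, 0, 0, lambda a, o: a + 1, lambda o, a: o + 1)
print(atmos, ocean)  # 60 atmosphere updates, 8 ocean updates
```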

The model’s creators, Cresswell-Clay et al., found that DLESyM closely matches the past observed climate and creates accurate short-term forecasts. Using Earth’s current climate as a baseline, it can also accurately simulate climate and interannual variability over 1,000-year periods in less than 12 hours of computing time. It generally equals or outperforms models based on the Coupled Model Intercomparison Project Phase 6, or CMIP6, which is widely used in computational climate research today.

The DLESyM model outperformed CMIP6 models in replicating tropical cyclones and Indian summer monsoons. It captured the frequency and spatial distribution of Northern Hemisphere atmospheric “blocking” events, which can cause extreme weather, at least as well as CMIP6 models. In addition, the storms the model predicts are also highly realistic. For instance, the structure of a nor’easter generated at the end of a 1,000-year simulation (in 3016) is very similar to a nor’easter observed in 2018.

However, both the new model and CMIP6 models poorly represent Atlantic hurricane climatology. Also, DLESyM is less accurate than other machine learning models for medium-range forecasts, or those made up to about 15 days into the future. Crucially, the DLESyM model only conducts simulations of the current climate, meaning it does not account for anthropogenic climate change.

The key benefit of the DLESyM model, the authors suggest, is that it uses far less computational power than running a CMIP6 model, making it more accessible than traditional models. (AGU Advances, https://doi.org/10.1029/2025AV001706, 2025)

—Madeline Reinsel, Science Writer

Citation: Reinsel, M. (2025), Machine learning simulates 1,000 years of climate, Eos, 106, https://doi.org/10.1029/2025EO250318. Published on 27 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Scientists Track Down Fresh Boulder Falls on the Moon

Wed, 08/27/2025 - 13:17

As a boulder rolls down a cliff slope on the Moon, it kicks up lunar dust, leaving behind a telltale herringbone pattern of ejecta.

In a recent study, for the first time, scientists geolocated and dated evidence of such boulder falls. They identified 245 fresh tracks created as boulders rolled, bounced, and slid down crater walls.

“For a long time, there was this belief that the Moon is geologically dead.…Our study shows that boulders with sizes ranging [from] tens to hundreds of meters and [with] weights in tons have moved from their places over time,” said Sivaprahasam Vijayan, the study’s lead author and an associate professor at the Physical Research Laboratory in Ahmedabad, India. “It is equally important to know how recent these boulder fall events are to understand the time periods when the geological agents were active.”

Tracking Boulder Falls

As lunar boulders bounce, they scoop up bright, unweathered subsurface material and bring it to the surface. As a result, fresh boulder fall tracks appear brighter than older ones.

“One can identify a boulder fall to be a recent one considering the boulder fall ejecta,” said Senthil Kumar Perumal, principal scientist with the Planetary Sciences Group at the National Geophysical Research Institute in Hyderabad, India, who was not involved in the new study.

To identify relatively recent boulder tracks, Vijayan and his colleagues first manually searched thousands of images of the lunar surface between 40°S and 40°N. At these latitudes, the Sun makes the bright boulder tracks distinguishable from the rest of the lunar surface. Once they identified a track, the researchers studied corresponding images taken by NASA’s Lunar Reconnaissance Orbiter Narrow Angle Camera between 2009 and 2022.

Next, scientists estimated the age of the tracks by studying regions with both boulder fall ejecta (BFE) and distinct impact ejecta blankets. (Such blankets, nicknamed the “lunar equivalent of fossils,” have long been used to estimate the age of impact events.) The craters analyzed by Vijayan and his colleagues were found to be around 400,000 years old—which means the BFE tracks are more recent.

Finally, the scientists identified possible seismic faults or impact craters nearby that could have triggered the boulder falls.

Mapping the Moon

The new geological map of boulder falls, published in Icarus, highlights seismically active spots and fresh impact sites on the Moon. Researchers say these regions could be potential landing sites for future lunar missions focused on recent surface and subsurface activity.

The study authors plan to integrate artificial intelligence methods into the next iteration of their work, but ultimately, Vijayan said, “the next step is to more precisely determine whether the cause [of a fall] is endogenic or exogenic, which can be achieved by deploying additional seismometers in upcoming missions.”

Kumar concurred. “We need to have a large network of seismometers covering the entire [Moon] that monitors seismic activity continuously for several decades,” he said.

—Unnati Ashar, Science Writer

Citation: Ashar, U. (2025), Scientists track down fresh boulder falls on the Moon, Eos, 106, https://doi.org/10.1029/2025EO250314. Published on 27 August 2025. Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Equatorial Deep Ocean Response to the Madden-Julian Oscillation

Wed, 08/27/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Journal of Geophysical Research: Oceans

The Madden-Julian Oscillation (MJO) is the dominant weather system in the tropics. It lasts several weeks and changes rainfall, cloudiness, and winds across the region. The MJO is well known for triggering an extratropical and global atmospheric circulation response, and recently, several case studies have examined a deeper ocean response to the MJO.

Using 18 years of output from a high-resolution ocean reanalysis product (GLORYS12) largely constrained by Argo data, Robbins et al. [2025] discover intraseasonal (20–200 day) signals in currents, temperature, and salinity in the tropical oceans down to at least 2,000 meters. They show that these deep-penetrating structures are equatorial Kelvin waves forced by the MJO in the equatorial Pacific and Indian Oceans. This is one of the first studies to examine the impact of the MJO on the deep ocean and will be beneficial for future investigations into deep-ocean changes.

Citation: Robbins, C., Matthews, A. J., Hall, R. A., Webber, B. G. M., & Heywood, K. J. (2025). The equatorial deep ocean structure associated with the Madden-Julian Oscillation from an ocean reanalysis. Journal of Geophysical Research: Oceans, 130, e2025JC022457.  https://doi.org/10.1029/2025JC022457

—Xin Wang, Editor, JGR: Oceans

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Fatal landslides in June 2025

Wed, 08/27/2025 - 07:43

In June 2025, I recorded 51 fatal landslides worldwide, resulting in 479 fatalities. The number of fatal landslides is significantly above the long-term mean.

Yesterday, I provided an update on fatal landslides that occurred in May 2025. This post is a follow-up, providing the data for June.

As always, allow me to remind you that this is a dataset on landslides that cause loss of life, following the methodology of Froude and Petley (2018). At this point, the monthly data is provisional.

The headline is that I recorded 51 landslides over the course of the month, claiming 479 lives. Note that the landslide total is lower than for May (n=66), which is a little unusual. However, 51 landslides is still substantially higher than the 2004-2016 mean (n=40.8), whilst the number of fatalities (479) is below the mean (n=746).

So, this is the monthly total graph to the end of June 2025:-

The number of fatal landslides to the end of June 2025 by month.

Plotting the data by pentad to the end of pentad 36 (29 June), the trend looks like this (with the exceptional year of 2024 plus the 2004-2016 mean for comparison):-

The number of fatal landslides to 29 June 2025, displayed in pentads. For comparison, the long term mean (2004 to 2016) and the exceptional year of 2024 are also shown.

Through to about 10 June, the trend for 2025 very closely matched that of 2024. However, by the end of the month a significant difference had emerged, with the landslide rate this year being somewhat lower. The data for July and August will start to tell us whether this is a trend.
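
For readers who want to reproduce this kind of comparison, aggregating dated records into pentads (5-day bins) is straightforward. The sketch below is a minimal, illustrative example in Python; the event list is a placeholder and is not the dataset behind the figures above.

# Minimal sketch: counting dated fatal-landslide records into pentads for one
# year, so cumulative totals can be compared with a long-term mean. The event
# list is a hypothetical placeholder, not the blog's dataset.
from datetime import date

def pentad_counts(event_dates, year, n_pentads=73):
    """Count events per pentad (pentad 1 covers 1-5 January) and return cumulative totals."""
    counts = [0] * n_pentads
    for d in event_dates:
        if d.year != year:
            continue
        day_of_year = d.timetuple().tm_yday
        index = min((day_of_year - 1) // 5, n_pentads - 1)  # pentad 73 absorbs day 366 in leap years
        counts[index] += 1
    cumulative = []
    running = 0
    for c in counts:
        running += c
        cumulative.append(running)
    return counts, cumulative

# Example: three hypothetical events; cumulative count through pentad 36 (ends 29 June)
events = [date(2025, 1, 3), date(2025, 6, 10), date(2025, 6, 28)]
per_pentad, cumulative = pentad_counts(events, 2025)
print(cumulative[35])  # pentad 36 is index 35; prints 3 for this example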

So, what lies behind a monthly figure that is above the long-term average but below that of the exceptional year of 2024? The Copernicus surface air temperature data for June 2025 notes the following:-

“June 2025 was 0.47°C warmer than the 1991-2020 average for June with an absolute surface air temperature of 16.46°C. [It was the] third-warmest June on record, 0.20°C cooler than the warmest June in 2024, and 0.06°C cooler than 2023, the second warmest.”

Thus, if the hypothesis that landslide numbers are driven in part by atmospheric temperature is correct, the lower total than in 2024 is perhaps unsurprising.

Reference

Froude M.J. and Petley D.N. 2018. Global fatal landslide occurrence from 2004 to 2016. Natural Hazards and Earth System Sciences, 18, 2161-2181. https://doi.org/10.5194/nhess-18-2161-2018

Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Fallout from the Fires

Tue, 08/26/2025 - 13:49
From Devastation to Data

Some effects of wildfire are immediately apparent: burned vegetation, smoldering ruins, dissipating smoke. Indeed, the massive Palisades and Eaton Fires cut charred wakes through western Los Angeles County that remain long after firefighters contained the blazes earlier this year.

This month, we shadow geoscientists investigating the fires’ less tangible, if no less serious, consequences for regional air, soil, and water quality.

“Where There’s Fire, There’s Smoke,” writes Emily Dieckman in her profile of air quality following the fires—and where there’s smoke, there are particulates, including organic compounds, toxic chemicals, and hazardous dust and ash.

For Earth scientists, the liminal space between what is urban and what is wild is crucial for understanding postfire debris flows and the ground below. As profiled by Kimberly Cartier, these researchers consider the L.A. fires to be a case study of “how this urban-rural interface is changing and what…recovery looks like.”

Watersheds, those ever-changing interfaces between earth and water, are no less fraught, writes Grace van Deelen in “Scrambling to Study Smoke on the Water.” Scientists are documenting how ash-laden runoff is changing, if only ephemerally, both freshwater and marine ecosystems.

Perhaps the most elusive and powerful consequences of the fires are their effects on human health. And in places like Los Angeles, writes Dieckman, “Access to Air-Conditioning May Affect Wildfire-Related Health Outcomes.” The L.A. fires are yet another test case for extreme events augmented by a warming climate. Thoughtful, science-based policy has never been more important for the health of both our planet and ourselves.

—Caryl-Sue Micalizio, Editor in Chief

Citation: Micalizio, C.-S. (2025), Fallout from the fires, Eos, 106, https://doi.org/10.1029/2025EO250311. Published on 26 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

When Disaster Science Strikes Close to Home

Tue, 08/26/2025 - 13:48
From Devastation to Data

Over 24 days in January, the Eaton and Palisades fires burned nearly 38,000 acres of Los Angeles County. Whole neighborhoods were destroyed, 29 people died, and thousands were displaced. The conditions that led to the fires were estimated to be 35% more likely because of climate change, and damage to public and private infrastructure made the blazes among the costliest wildfire disasters in U.S. history.

In the wake of the fires, multiple local, state, and federal disaster response agencies mobilized to contain the flames, document dangers, and communicate those findings to the public. Agencies’ emergency response playbooks are tried and tested and often require interagency cooperation.

Within this massive, coordinated effort in postfire monitoring and response, where do non-agency scientists with relevant skills and a desire to help fit in?

This is a question Michael Lamb has wrestled with this year. Lamb is a geomorphologist at the California Institute of Technology (Caltech) in Pasadena who was evacuated from his home and left without power for several days when the Eaton Fire tore through Altadena.

Lamb, who researched debris flow patterns after the 2009 Station Fire in the Angeles National Forest, wondered how to apply his knowledge to help with this latest disaster, and whether he should. He worried that members of his lab group, by inserting themselves into the disaster response apparatus, might inadvertently confuse official communications or make it harder for first responders to do their jobs.

“We don’t want to take time away [from agency scientists], especially when they’re in the middle of the emergency management part of work,” Lamb said.

Lamb wasn’t alone in his concern, or in his desire to help. The areas of Los Angeles County affected by the Palisades and Eaton fires are home to a high concentration of scientists who work or study at the area’s many scientific institutions. Some of them study fires and fire impacts and realized they could help, while many outside that niche were surprised to find that their work might have new, immediate applications close to home.

Scientists Spot Need

When the Palisades Fire was still burning in early January, Adit Ghosh watched the coverage at home on television. Ghosh, an Earth science graduate student at the University of Southern California (USC) in Los Angeles, had helped evacuate his in-laws and some of his friends from at-risk areas and couldn’t go in to work because campus was closed.

“They were showing the fires nonstop,” Ghosh recalled of news reports. In one broadcast, the camera zoomed in on a house in Mandeville Canyon near Topanga State Park. “I saw it on TV catching fire and then burning to the ground.”

By the third week in January, Ghosh was back in his geochemistry class. His adviser, who works closely with the professor teaching the course, suggested a way for Ghosh and his fellow graduate students to contribute to the ongoing efforts to understand contamination in water runoff. They were eager to help in whatever ways they could.

That weekend, Ghosh went out with a team of other USC students to collect water runoff in burned areas. They hoped to analyze the samples for chemicals that might prove harmful to human and environmental health. A helpful resident showed the team around the burned area, pointing out places they might collect samples from.

A home burned by the Palisades Fire. Credit: Adit Ghosh

“He took us to this house,” Ghosh said. “Then it clicked. This is the house that I saw burning on TV.”

The postfire landscape and environmental conditions can change rapidly. Many scientists felt a sense of urgency to collect samples of ash, dust, soil, and water, as well as to study sediment and debris built up along the mountainside, because these data are considered “perishable,” Lamb explained.

In an area burned by the Palisades Fire, a University of Southern California student collects water runoff from a drainpipe. Credit: Adit Ghosh

Lamb’s team rushed to obtain flight permits and conduct drone flyovers of debris channels along the San Gabriel Mountains above Altadena. Knowing that weather reports anticipated rain soon after the fires, debris flow researchers wanted to obtain postfire, prerain lidar scans of the channels’ topographies to better understand how debris accumulates and what conditions can trigger dangerous flows.

If measurements weren’t taken quickly enough, information about immediate postfire impacts could be washed away. They shared their results with disaster response agencies and affected communities.

Serendipitous Science

In the wake of such a disaster, doing something, anything, to help others can be a powerful tool of healing and recovery.

“As soon as they were safe, people really wanted to contribute,” said Kimberley Miner, a climate scientist and representative of NASA’s Disasters Response Coordination System (DRCS) at the Jet Propulsion Laboratory (JPL) in Pasadena, Calif. NASA and JPL coordinated to fly the Airborne Visible/Infrared Imaging Spectrometer 3 (AVIRIS-3) instrument to survey damage immediately after the fire.

The first AVIRIS-3 flight on 11 January was serendipitous, explained Robert Green, principal investigator for AVIRIS-3 at JPL. The instrument had already been installed on a plane and approved to fly over a completely different area. The team was able to divert the flight path to cover the Eaton burn area instead.

“There were folks working out of hotel rooms while they were evacuated,” Miner said. On some science teams, only one or two people had not been displaced.

The AVIRIS team has been on the scene after some of the most infamous disasters in modern U.S. history. The team flew an earlier version of the instrument over the ruins of the World Trade Center after the terrorist attack on 11 September 2001 to look for asbestos and residual hot spots burning under the rubble. After the 2010 Deepwater Horizon disaster, data from an earlier AVIRIS instrument yielded the first estimates of how much oil had been released into the Gulf of Mexico.

Even in the context of those disasters, Green said that flying over the L.A. burn scars was “heartbreaking and poignant.”

“It’s especially poignant, I would say, because it is a local disaster,” Green said. “But for 9/11, the Gulf oil spill, or wherever we contribute, our team is committed to offer information via this unique spectroscopy to be helpful.”

The first AVIRIS-3 flyover provided some of the first aerial data assessing the scope of the fires. NASA’s DRCS provided those data to federal and state disaster response teams, and those data helped justify and expedite approval for subsequent flyovers.

Getting Involved but Not Being in the Way

As official emergency responders worked to contain the fires and rapidly document the damage, collecting samples from the air, ground, rivers, or ocean outside of those efforts presented logistical quandaries.

The USC team that Ghosh worked with to collect water runoff samples had been organized within his department and went out of its own volition. But getting to sample sites was a challenge.

“We’re trying to focus on whatever we can get our hands on, essentially, because access is really hard,” he said earlier this year. In some burned areas where runoff sampling would have yielded important science results, for example, the National Guard had restricted access to prevent looting.

“Even in sites that are open, the residents still didn’t really want us hanging around over there. And understandably, because their house almost burnt down,” Ghosh said. When members of his team encountered resistance from residents, he said, they respectfully moved to another location.

Lamb said that his research group considered a broad range of science that they might contribute before contacting government agencies operating in the area. “We reached out via email to people…leading debris flow hazard teams and just said, ‘We are interested in helping. These are some of the capabilities we have. We also don’t want to get in the way. Please let us know if this can be of help.’”

Lamb’s team was told it could help by monitoring the accumulation of sediment and debris in ravines on the slopes of the San Gabriel Mountains, and it gained approval to fly drones over certain landslide-prone areas. Those aerial lidar measurements will be helpful in assessing the ongoing risk of debris flows and landslides and also in monitoring for future hazards.

“Emergency managers and the federal agencies are mostly tasked with trying to deal with the immediate situation,” Lamb said. “Whereas something that we can try to help with more as research scientists is to think about real forward-looking measurements.”

Their lidar flights focused on areas of burned mountainside rather than on urban areas. “It’s sad to say, but in some of the areas that were really devastated by the fires, there aren’t homes there [anymore] to be damaged by the debris flows,” Lamb said.

Working with Their Communities

The public messaging that agencies provide is critical for residents to find out about the immediate risks they face, but non-agency scientists also have found ways to engage these communities more deeply in the scientific discoveries that are helping them stay safe.

As crews started containing the fires, scientists at the Natural History Museum of Los Angeles County (NHMLA) recognized the need to collect and analyze samples of the ash, not only for the immediate emergency response but also to curate a catalog that scientists could use for longer-term and future studies. Because the museum has a small staff, its team solicited community members for ash samples rather than going into the field themselves.

“We didn’t want to reach out right away, because that would appear as insensitive and not really caring about the people but rather more caring about the science,” said Aaron Celestian, curator of mineral sciences at NHMLA. But once it started raining, they couldn’t wait any longer.

The museum’s community science team approached their existing community partners about collecting ash and found that people were already doing it themselves. The team pivoted, instead showing people how to collect ash without risking personal health or contaminating the samples.

“We didn’t want anybody to do anything that would have any kind of health effects on them long term,” Celestian said. “We had to develop a protocol that could be understood by the community at large, and so that we get the best kind of science out of it in the end.”

Celestian analyzed his first sample on 27 January, measuring the chemical composition of forest ash. He plans to compare the results with those from urban ash.

Natural History Museum of Los Angeles County mineral sciences curator Aaron Celestian prepares one of the collected ash samples for total element analysis to reveal its chemical composition. The whole process takes about 2 hours. Credit: Aaron Celestian/Natural History Museum of Los Angeles County

Then came the question of how to communicate the results. Celestian and the museum’s communications team came up with a two-pronged approach. First and foremost, they consulted with the community member who sent in the ash sample. “They get to decide on how they want their samples to be treated and communicated with everybody else,” Celestian said.

With a resident’s permission, the ash sample was entered into a museum collection for other scientists to check out and analyze. They received 11 samples for the collection.

“Even though I’m collecting the data, it really is their property,” Celestian said. “That’s a big part of making them feel comfortable, making them feel confident in the results.”

“They just lost their homes,” he emphasized. “They want to be treated with respect,” he said, adding that the samples “are really like a family member’s ashes.”

At the same time, Celestian recognized the importance of transparency and that timely information can not only protect people but also help them feel confident in their safety. He began live-streaming his analysis on social media and his blog using anonymized samples.

“People want to know,” Celestian said.

Lamb’s group took a similar approach. They shared their lidar data directly with emergency response managers so they could be incorporated into official responses. They also communicated directly with the public. Lamb had been scheduled to give a public science talk in late January, and he decided to center it on the science of postfire debris flows.

“I was going to talk about something completely different, and I changed the topic last minute because of this very heightened community interest in understanding what’s happening in the mountains,” Lamb said. Nearly 900 people showed up to listen.

Strong, and Mixed, Emotions

Having a way to help after a disaster—whether through distributing supplies or figuring out whether playground soil has elevated lead levels—can aid community recovery and empower personal healing. For some, it can also evoke a sense of duty.

“I think we have a responsibility to use our skill sets to help the greater Los Angeles area where we live,” Ghosh said. Logically, he knew that sampling water runoff and analyzing it for harmful chemicals is an important part of postfire recovery. But sometimes, it didn’t feel like enough.

“You go up there and you’re collecting water, and people have almost lost their homes,” Ghosh said. “It feels like, ‘Why the hell are you collecting water?’ It may not seem in the moment as important a thing to do. I definitely felt that.”

Some residents questioned what the sampling team was doing and whether they were focusing on the right problems. “But we also had neighbors who were like, ‘Thank you so much for doing this, coming out and helping us understand whether we can drink our water, or whether it’s safe to be out,’” Ghosh said. “In fact, some people even let us in to their house, and [we] collected tap water from their house.” Ghosh and his colleagues shared the results of those in-home water tests directly with the homeowners when they got them.

“It’s a lot of mixed emotions,” he added.

Studying the risks from these fires “does feel more personal” for local scientists, Lamb said. “We know people that live in those areas. There’s faculty from Caltech and graduate students that live there, and postdocs and friends. It’s very close to where we live and work. It certainly adds more motivation to try to do anything that we can to help.”

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

Citation: Cartier, K. M. S. (2025), When disaster science strikes close to home, Eos, 106, https://doi.org/10.1029/2025EO250315. Published on 26 August 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
