Eos
Science News by AGU

Storm Prediction Gets 10 Times Faster Thanks to AI

Tue, 05/20/2025 - 13:23
Source: Geophysical Research Letters

Artificial intelligence (AI) algorithms can produce weather predictions more quickly than traditional algorithms for a fraction of the computational cost. But because training AI takes such large amounts of data, it has so far been most successful at producing global-scale forecasts. Until recently, researchers lacked the data needed to train algorithms to predict small-scale weather patterns such as thunderstorms.

Flora and Potvin extended AI-based weather forecasting to thunderstorm-scale events by training Google’s neural network GraphCast on data from NOAA’s Warn-on-Forecast System. The Warn-on-Forecast research project generates high-resolution forecasts for areas likely to experience extreme weather with the aim of issuing earlier warnings for tornadoes, severe thunderstorms, and flash floods.

The AI model, named WoFSCast, learned the dynamics of key thunderstorm features, including updrafts, which feed thermodynamic energy into storms, and cold air pockets that form beneath storms, which influence how storms move and grow.

The model yielded largely accurate predictions of how storms would evolve for up to 2 hours; these predictions matched 70% to 80% of those generated by the Warn-on-Forecast System. Generating a prediction took only 30–40 seconds on a single graphics processing unit, at least 10 times faster than generating forecasts with the current Warn-on-Forecast System without AI.

With additional training data, the researchers suggest that WoFSCast could become even more versatile, predicting surface winds and rainfall within landfalling tropical cyclones, as well as how wildfires will spread, for instance. By using an AI-enhanced system, the National Weather Service may be able to issue severe weather warnings more quickly and reduce the harm caused by these extreme events. (Geophysical Research Letters, https://doi.org/10.1029/2024GL112383, 2025)

—Saima May Sidik (@saimamay.bsky.social), Science Writer

Citation: Sidik, S. M. (2025), Storm prediction gets 10 times faster thanks to AI, Eos, 106, https://doi.org/10.1029/2025EO250159. Published on 20 May 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Artisanal Gold Mining Is Destroying Amazonian Peatlands

Tue, 05/20/2025 - 13:22

For decades, wounds have surfaced in the Peruvian Amazon where the Rio Inambari merges with the Rio Madre de Dios, carving thick gashes into ancient tree stands. These almost lunar scars of barren rubble were formed at the hands of a growing enterprise capitalizing on gold that the rivers deposit throughout their floodplains.

A new study published in Environmental Research Letters shows that the spread of this devastation has quickened and is increasingly affecting a unique Amazonian ecosystem: peatland swamps, which safeguard meters-deep deposits of carbon accumulated over millennia and harbor assemblages of life distinct from the surrounding rain forest.

Around 15 years ago, Ethan Householder first visited the Madre de Dios region, where 70% of Peru’s artisanal gold mining takes place. Peatlands are dispersed in small pockets throughout the enormous Madre de Dios floodplain, which stretches into Bolivia, where the river’s confluence with the Mamoré forms the Madeira, which eventually empties into the Amazon itself.

Householder, a community ecologist at Germany’s Karlsruher Institut für Technologie and a study coauthor, recalled that at that time, “there was mining in peatlands, but it was the exception.”

Now, he and his colleagues have found that peat mining is surging in the region.

To determine how artisanal gold mining (defined as subsistence or small scale and often illegal) has affected peatlands along the Rio Madre de Dios, the team searched through decades of satellite imagery for sudden drops in the greenness of the forest canopy that might signal deforestation. Then, they looked for spectral information indicative of mining activity: the buildup of gravel and sand, for example, and the presence of water-filled pits at the site of formerly forested land.

Using a machine learning algorithm trained to pick out pixels that met those criteria, the team combed through imagery from 1985 to 2023. They found that more than 11,000 hectares of forest along the river had been converted to mines, with most of the growth taking place since the mid-2010s.
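As a toy illustration of the first screening step (flagging a sudden, sustained drop in canopy greenness), consider the sketch below. This is not the authors’ algorithm: the NDVI values, window length, and drop threshold are all invented for illustration.

```python
# Toy sketch: flag time steps where a greenness index (e.g., NDVI)
# drops sharply and stays low, a rough proxy for deforestation.

def sudden_greenness_drop(ndvi, window=3, drop=0.3):
    """Return indices where the mean NDVI over the next `window` steps
    falls more than `drop` below the mean of the previous `window` steps."""
    hits = []
    for i in range(window, len(ndvi) - window + 1):
        before = sum(ndvi[i - window:i]) / window
        after = sum(ndvi[i:i + window]) / window
        if before - after > drop:
            hits.append(i)
    return hits

# A forested pixel that is cleared around step 5 (values are invented):
series = [0.82, 0.80, 0.81, 0.79, 0.80, 0.25, 0.22, 0.20, 0.21, 0.23]
print(sudden_greenness_drop(series))
```

In the study, candidate pixels flagged this way would then be checked for spectral signatures of mining (gravel, sand, water-filled pits) before classification.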

The analysis found more than 550 hectares associated with mining activity in peatlands. Though that’s just a fraction of the total area that’s been mined, the research showed that in recent years, peatland mines have expanded faster than mining in the forest at large. (Fifty-five percent of peatland loss occurred within the past 2 years.)

Already, digging up these peatlands has released anywhere from 200,000 to 700,000 metric tons of carbon stored belowground—in addition to the carbon released from the loss of trees and plant life above it. If all the peat in the Madre de Dios region is lost, some 17 million metric tons of long-sequestered carbon could be released.

Mining in the Amazonian peatlands “is basically the entire force of global capitalism on top of one of the most carbon-rich habitats on Earth,” Householder said.

The new work continues and expands the story of how gold mining is degrading the Amazon, said Greg Asner, a conservation ecologist at Arizona State University who has been studying the effects of gold mining on the Peruvian Amazon since the early 2010s. To him, studies like this “are all puzzle pieces of evidence saying this is a huge issue and still not resolved.”

—Syris Valentine (@shapersyris.bsky.social), Science Writer

Citation: Valentine, S. (2025), Artisanal gold mining is destroying Amazonian peatlands, Eos, 106, https://doi.org/10.1029/2025EO250195. Published on 20 May 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Groundwater Flooding: A Hidden Risk of Sea Level Rise

Tue, 05/20/2025 - 13:21
Source: Earth’s Future

This is an authorized translation of an Eos article.

As climate change continues to drive global sea level rise, many coastal residents are already feeling the effects: coastal erosion is accelerating, shorelines are retreating inland, and storm surges are intensifying. But hidden beneath the surface lies another serious and so far little-known consequence: rising groundwater.

Evidence suggests that in some low-lying coastal areas with shallow water tables, sea level rise will drive a corresponding rise in groundwater levels, which could pose serious risks to homes, businesses, and other infrastructure.

In a new paper focused on the coastal city of Dunedin, New Zealand, Cox et al. present a method for predicting how sea level rise will alter groundwater levels and thereby increase inland flood hazards. South Dunedin already experiences periodic flooding, which will worsen as sea levels rise. The researchers describe the city as a bellwether for how New Zealand communities respond and adapt to climate change and sea level rise.

The researchers used data collected from 2019 to 2023 by a network of 35 groundwater sensors installed in Dunedin’s low-lying coastal area, where much of the city’s infrastructure is located. They compared the sensor data with data on tides, rainfall, and other factors to project how future sea level rise will affect groundwater.

The results suggest that sea level rise will first raise the water table, reducing the land’s capacity to absorb rainfall. As seas continue to rise, groundwater levels may climb further, causing problems below ground such as inundating sewage systems, seeping into basements, and damaging building foundations. Eventually, groundwater could rise high enough to emerge as springs and trigger flooding.

The researchers conclude that flood hazards from rising groundwater may extend farther inland than many people expect. Moreover, assuming no major change in the protective topography of Dunedin’s dune barrier, these groundwater effects will arrive sooner than flooding caused directly by sea level rise.

The researchers note that their method involves key assumptions and uncertainties (for example, that groundwater and sea level will rise at the same rate and that the water table will keep roughly the same shape), but conservative projections are still valuable for planning and hazard management in Dunedin. Because the method is relatively simple and inexpensive, they say, it could also be applied to similar coastal areas around the world. (Earth’s Future, https://doi.org/10.1029/2024EF004977, 2025)

—Sarah Stanley, Science Writer

This translation was made by Wiley.

Read this article on WeChat.

Text © 2025. AGU. CC BY-NC-ND 3.0

Inferring River Discharge from Google Earth Images

Tue, 05/20/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Geophysical Research Letters 

River discharge is an important variable for a wide range of water management applications, yet many rivers (even in the United States) remain ungauged or undergauged because conventional field methods are difficult to apply, especially in regions of complex terrain. Existing remote sensing methods need gauge data for calibration and are subject to other limitations.

Legleiter et al. [2025] propose a new image-based method to infer river discharge based on critical flow theory. Specifically, slope transitions (from steep to mild) or channel constrictions can induce critical flow conditions, causing undular hydraulic jumps in the form of well-defined standing wave trains. For critical flow (with Froude number equal to 1), the spacing of the waves, the velocity of the flow, and the depth of the water are all uniquely related to one another, so the discharge can be calculated from basic measurements of wavelength and channel width.
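As a rough sketch of the kind of calculation this enables (not the paper’s exact formulation): if the waves are stationary, their phase speed equals the flow velocity; taking the deep-water limit of the linear wave dispersion relation for that speed, and imposing critical flow (Froude number 1) in a rectangular channel, discharge follows from wavelength and width alone.

```python
import math

def discharge_from_waves(wavelength_m, width_m, g=9.81):
    """Back-of-envelope discharge estimate from standing-wave spacing.

    Assumes (1) stationary waves, so wave phase speed = flow velocity;
    (2) the deep-water limit of linear wave theory, c = sqrt(g*L/(2*pi)),
    a simplification relative to the full finite-depth relation; and
    (3) critical flow, Fr = V / sqrt(g*h) = 1, so h = V**2 / g.
    """
    velocity = math.sqrt(g * wavelength_m / (2 * math.pi))  # m/s
    depth = velocity**2 / g                                 # m, from Fr = 1
    return velocity * depth * width_m                       # m^3/s

# Illustrative numbers: 2 m wave spacing in a 10-m-wide channel
q = discharge_from_waves(2.0, 10.0)
print(round(q, 2))
```

The numbers here are illustrative only; the published method presumably uses the full dispersion relation and measured channel geometry.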

Applying the new method to 82 Google Earth images yielded discharge estimates that agreed closely with gauge records, providing preliminary confirmation of the method’s reliability. Although the method applies only to rivers with standing waves (which typically occur on steep slopes or near channel constrictions), these conditions are frequently met in mountainous regions, exactly where new monitoring methods are most needed because gauge stations are lacking or sparse. This study provides a foundation for further evaluation and refinement of this theoretically grounded approach to river discharge estimation.

Citation: Legleiter, C. J., Grant, G., Bae, I., Fasth, B., Yager, E., White, D. C., et al. (2025). Remote sensing of river discharge based on critical flow theory. Geophysical Research Letters, 52, e2025GL114851. https://doi.org/10.1029/2025GL114851   

—Guiling Wang, Editor, Geophysical Research Letters

Text © 2025. The authors. CC BY-NC-ND 3.0

The incipient major rock slope failure at Blatten in Switzerland

Tue, 05/20/2025 - 08:20

The Landslide Blog is written by Dave Petley, who is widely recognized as a world leader in the study and management of landslides.

In Switzerland, a dramatic rock slope failure is developing above Blatten [46.4128, 7.7987], a village located in Valais Canton.

Blue News is providing regular updates on a dedicated website. The drama started at the weekend with a major landslide in the Petit Nesthorn area, which impacted and entrained a part of the Birch glacier. This has resulted in the evacuation of the majority of the population of Blatten.

There is little doubt that a major instability has developed. The estimated scale of the unstable mass is up to 5 million cubic metres. At least 17 metres of displacement have been recorded in the last few days.

Melaine Le Roy is providing detailed coverage of the evolution of the event on BlueSky; one of his posts shows the extraordinary scale of the mobile mass.

Ocean Current Affairs in the Gulf of Mexico

Mon, 05/19/2025 - 12:58

Over the past few years, hurricanes in the Gulf of Mexico have broken records for their intensity and the speed at which they have evolved from tropical storms into major cyclones. Hurricane Beryl, for example, strengthened quickly in early July 2024 to become the earliest category 5 hurricane on record. A few months later, in October, Hurricane Milton set a record for intensifying from a tropical depression to a category 5 hurricane in a little over 2 days.

A wealth of scientific research has implicated anomalously warm seas as the primary cause for intensifying storms in the region [e.g., Liu et al., 2025]. Ocean currents that circulate warm water, including the Loop Current, which transports water from the tropics to latitudes farther north, are also well-documented contributors to conditions around the Gulf today.

But how these currents have behaved in the past and how they are responding to climate change, which may have significant implications for coastal and inland communities adversely affected by cyclones, are not entirely clear. An interdisciplinary group of scientists from Mexico and the United States has been collaborating in recent years to find out.

Why the Loop Current Matters

The Loop Current (Figure 1), which enters the Gulf of Mexico from the Caribbean by way of the Yucatán Channel between the Yucatán Peninsula and Cuba, is a major pathway for water flowing from the tropics to the high-latitude North Atlantic. It is a key component of global thermohaline circulation (currents driven by differences in temperature and salinity), providing roughly 85% of the Gulf Stream as it flows through the Straits of Florida, up the U.S. East Coast, and across the North Atlantic. This warm, salty water substantially influences the Gulf’s hydrography, as well as North American and European climate.

Recent research has shed light on concerning trends in the Gulf, the Loop Current, and the broader system of ocean currents. For example, warming upper layer waters in the Gulf appear to be exacerbating rising sea levels there [Thirion et al., 2024], and warm-core eddies shed from the Loop Current have been shown to be an important factor in the rapid intensification of recent Gulf hurricanes [Liu et al., 2025] (Figure 1).

Fig. 1. Eddies shed by the Loop Current into the Gulf’s central basin on 21 July 2018 are evident in this depiction of water velocity measurements (U.S. Navy model). These eddies can have either warm or cool cores. They fundamentally influence environmental conditions in the Gulf, from the temperature balance to biological diversity. The presence of warm-core eddies is now being implicated as a cause of rapid hurricane intensification [Liu et al., 2025] and accelerated sea level rise [Thirion et al., 2024].

The Loop Current and Gulf Stream together also form an important part of the Atlantic Meridional Overturning Circulation (AMOC). The AMOC is a fundamental component of Earth’s climate system, circulating water north and south through the Atlantic—and from the surface to ocean depths—while also distributing heat and nutrients. With the recently documented slowing of the Gulf Stream [Piecuch and Beal, 2023], concern is growing that a similar change in AMOC, perhaps in response to a warming planet, will upset the global heat balance in the Northern Hemisphere. This sort of change could have far-reaching consequences—from cooling temperatures in northern Europe to rapidly rising sea levels along the U.S. East Coast—for the habitability and sustainability of communities all around the Atlantic.

Since 2017, researchers at the University of Texas Institute for Geophysics (UTIG) and Universidad Nacional Autónoma de México (UNAM) have been collaborating to study the paleoceanographic (i.e., deep-time) history of the Loop Current. Among its activities, this team has gone to sea to acquire high-resolution subseafloor seismic images [Lowery et al., 2024] (Figure 2), generate high-precision seafloor maps, and collect samples from the seafloor.

A broad international effort is also ongoing to understand the Loop Current’s modern complexity [National Academies of Sciences, Engineering, and Medicine (NASEM), 2018], using data from moored instruments, glider measurements across multiple transects in the Yucatán Channel, and modeling (Figure 3). This effort has focused primarily on characterizing today’s Loop Current in the region between eastern Mexico and Cuba, where historical data are limited.

Delving into the Current’s History

A current has been flowing through the Gulf of Mexico since at least the Late Cretaceous (~100 million years ago), and like ocean circulation generally, that current has probably strengthened gradually since then. However, hypotheses about when a current of roughly the size and strength of the modern Loop Current first developed are still debated. Understanding this timing is important because it will implicate either a climatic or nonclimatic (i.e., tectonic) driver for its onset and could therefore inform ideas about whether and how the current will respond to climate change. Whereas this region is now relatively stable tectonically, the state of climate is changing rapidly.

Fig. 2. High-resolution seismic profiles (top) crossing the flank of Campeche Bank/Yucatán Platform, on the west flank of the Yucatán Channel, were collected during a 2022 research cruise. Shown here is profile 1005. The associated line drawing (middle) shows the drifts (i.e., offlapping sediment wedges) that will be targeted for coring (red arrows indicate prospective coring locations), as well as other labeled geologic features [Lowery et al., 2024]. Biostratigraphic analyses of cores will help researchers deduce the history of the Loop Current. Locations of the seismic profiles collected in 2022, including line 1005, are shown on the map (bottom), along with the locations of moored instrument arrays in the Yucatán Strait used by Candela et al. [2019] and of the Deep Sea Drilling Project’s (DSDP) Site 95, where cores were collected in 1970. (H = horizon; MS = marine sequence). Click image for larger version. Credit: Adapted from Lowery et al. [2024]

Building on previous seismic and coring expeditions, the U.S.-Mexico team collected high-frequency, multichannel seismic profiles, multibeam bathymetry, and surficial seafloor sediments (i.e., grab samples) in the Yucatán Channel in 2022 and 2024 (Figure 2) while aboard the UNAM vessel Justo Sierra. The primary imaging target was a series of offlapping sediment drift deposits laid down by the interaction of the Loop Current with the seafloor over millions of years.

Drift deposits are lens-shaped accumulations elongated along the axis of prominent boundary flows like the Loop Current, and they are promising archives for precision sampling (i.e., piston coring and drilling) and dating. Their fine-grained composition and rich concentrations of the microfossil skeletal remains of benthic (bottom-dwelling) and planktonic (floating) calcareous foraminifera provide valuable chronological markers and proxy records of ocean temperature and salinity, important for reconstructing past oceanographic and climatic conditions. Preliminary observations from the collected samples confirm that these skeletal remains are diverse and excellently preserved.

The at-sea data acquisition in the Gulf led to two follow-on workshops. The first, held in Mexico City in August 2023, brought together international investigators to examine the new seismic data from the Yucatán Channel and begin to identify potential sites to propose for future drilling (Figure 2). The second, held in Austin, Texas, in September 2024, focused on integrating the paleoceanographic perspectives of the Loop Current with knowledge of its modern physical oceanography.

As illuminated during discussions at the Austin workshop, physical oceanographic measurements collected across the Yucatán Channel from 2012 to 2016 using moored instrument arrays (Figure 2) established the modern Loop Current’s temporal complexity for the first time [Candela et al., 2019]. The current varied, both spatially and in strength, across that 4-year observation period. Tides play an important role in influencing the current, with both semidiurnal and diurnal components; the strength of transport in the current varies by 5%–10%.

Fig. 3. Temperature (top; yellow is warmer, red to blue is cooler) and salinity (bottom; bluer is more saline, yellower is less saline) data were collected in the Yucatán Channel from 18 January to 20 March 2024 during MASTR, the Mini-Adaptive Sampling Test Run. Credit: Courtesy of A. Knap, Geochemical and Environmental Research Group, Texas A&M University

This work has led to a multiyear set of studies of the Yucatán Channel, coordinated by the U.S. National Academies of Sciences, Engineering, and Medicine [NASEM, 2018], to further characterize modern conditions in the Loop Current. The 2024 portion of this study, called the Mini-Adaptive Sampling Test Run (MASTR), applied enhanced observation capacities, combining near-real-time surface and subsurface data from a simultaneous deployment of instrumented gliders and drifters with background observations from Argo floats and modeling. MASTR observations confirmed the Loop Current’s complexity over short timescales, and they improved the performance of numerical models, including NOAA’s Real-Time Ocean Forecast System, in representing the current’s vertical hydrographic structure [DiMarco et al., 2024] (Figure 3).

Linking the Loop’s Past to Its Present

A key overlap between modern and ancient oceanography in this region, as revealed by recent research [Lowery et al., 2024], involves the seafloor. Current strength is central to understanding both past and present Loop Current conditions because the flow moves the grains that eventually become the current’s sedimentary archive. Seafloor topography also drives turbulent mixing of seawater in the Gulf, influencing both current flow and eddy formation. It is clear that more work and collaboration are needed to link our understanding of the long-term evolution of the Yucatán Channel seafloor with the Loop Current and its history.

An important, and thus far understudied, question is how the Loop Current responded to past warm climate events, such as the Middle Miocene Climatic Optimum (~17.5–14.5 million years ago) [e.g., Steinthorsdottir et al., 2021]. Thoroughly addressing that question will require scientific ocean drilling to sample and date key buried sediment layers (i.e., seismic reflectors) in the Yucatán Channel to build a picture of Loop Current history. Planning for this work is underway, with support potentially coming from the U.S. National Science Foundation (NSF), the Scientific Ocean Drilling Coordination Office (which NSF has just established), and the current International Ocean Discovery Programme (IODP3), a partnership among Japan, Europe, and Australia and New Zealand.

Another issue on the minds of researchers studying the Loop Current is how anthropogenically driven changes in the current might negatively affect coastal resiliency and estuarine health along the entire Gulf Coast. Emerging problems include risks from sea level rise [Thirion et al., 2024] and strengthening hurricanes, both of which are directly affected by Loop Current flow.

Community organizations such as the Galveston Bay Foundation in Texas are leading efforts to adapt to changes in coastal environments brought by storms and sea level rise, for example, by building terraces and bulkheads, developing “living shorelines,” restoring coastal prairie, and communicating with the public. As the global climate continues to warm, more effort is required to enhance coastal resilience. Scientists must partner with community organizations to build public awareness of ongoing, human-induced climate change and to train students, the future leaders in environmental mitigation efforts.

In addition to coastal ecosystems, millions of people around the Gulf region are affected by the Loop Current and its influences on weather and sea level rise. Studying these effects requires active participation and collaboration among researchers and various entities in Mexico and the United States. Indeed, the studies noted here could not have been attempted or completed without such participation—and continued collaboration is essential to continuing to collect crucial data. Unfortunately, despite ongoing efforts from all parties to involve representatives from Cuba in these initiatives, meaningful engagement has yet to be achieved.

Our long-term goal is to continue the tradition of international collaboration in the study of the Loop Current, which demands intensified, sustained scrutiny, considering the enormous stakes as human-induced climate change continues.

Acknowledgments

The August 2023 workshop was funded by the U.S. Science Support Program of the International Ocean Discovery Program. The September 2024 workshop was funded by the Jackson School of Geosciences and the Teresa Lozano Long Institute for Latin American Studies, both at the University of Texas at Austin. We also thank the officers and crew of the Justo Sierra.

References

Candela, J., et al. (2019), The flow through the Gulf of Mexico, J. Phys. Oceanogr., 49(6), 1,381–1,401, https://doi.org/10.1175/JPO-D-18-0189.1.

DiMarco, S. F., et al. (2024), Results of the Mini-Adaptive Sampling Test Run (MASTR) experiment: Autonomous vehicles, drifters, floats, ROCIS, and HF-radar, to improve Loop Current system dynamics and forecasts in the deepwater Gulf of México, paper presented at the Offshore Technology Conference, Houston, Texas, 6–9 May, https://doi.org/10.4043/35072-MS.

Liu, Y., et al. (2025), Rapid intensification of Hurricane Ian in relation to anomalously warm subsurface water on the wide continental shelf, Geophys. Res. Lett., 52(1), e2024GL113192, https://doi.org/10.1029/2024GL113192.

Lowery, C. M., et al. (2024), Seismic stratigraphy of contourite drift deposits associated with the Loop Current on the eastern Campeche Bank, Gulf of Mexico, Paleoceanogr. Paleoclimatol., 39(3), e2023PA004701, https://doi.org/10.1029/2023PA004701.

National Academies of Sciences, Engineering, and Medicine (NASEM) (2018), Understanding and Predicting the Gulf of Mexico Loop Current: Critical Gaps and Recommendations, 116 pp., Natl. Acad. Press, Washington, D.C., https://doi.org/10.17226/24823.

Piecuch, C. G., and L. M. Beal (2023), Robust weakening of the Gulf Stream during the past four decades observed in the Florida Straits, Geophys. Res. Lett., 50(18), e2023GL105170, https://doi.org/10.1029/2023GL105170.

Steinthorsdottir, M., et al. (2021), The Miocene: The future of the past, Paleoceanogr. Paleoclimatol., 36(4), e2020PA004037, https://doi.org/10.1029/2020PA004037.

Thirion, G., F. Birol, and J. Jouanno (2024), Loop Current eddies as a possible cause of the rapid sea level rise in the Gulf of Mexico, J. Geophys. Res. Oceans, 129(3), e2023JC019764, https://doi.org/10.1029/2023JC019764.

Author Information

James A. Austin Jr. (jamie@ig.utexas.edu) and Christopher Lowery, Institute for Geophysics, Jackson School of Geosciences, University of Texas at Austin; Ligia Pérez-Cruz and Jaime Urrutia-Fucugauchi, Universidad Nacional Autónoma de México, Mexico City; and Anthony H. Knap, Geochemical and Environmental Research Group, Texas A&M University, College Station

Citation: Austin, J. A., Jr., C. Lowery, L. Pérez-Cruz, J. Urrutia-Fucugauchi, and A. H. Knap (2025), Ocean current affairs in the Gulf of Mexico, Eos, 106, https://doi.org/10.1029/2025EO250190. Published on 19 May 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Deforestation Is Reducing Rainfall in the Amazon

Mon, 05/19/2025 - 12:56
Source: AGU Advances

The Amazon Basin lost about 27,000 square kilometers of forest each year from 2001 to 2016. By 2021, about 17% of the basin had been deforested.

Changes to forest cover can affect surface albedo, evapotranspiration, and other factors that can alter precipitation patterns. And, as the largest tropical forest on Earth, the Amazon plays a crucial role in regulating climate. Previous studies have modeled the effects of deforestation on precipitation, but most used hypothetical or extreme scenarios, such as complete Amazon deforestation.

In recent decades, about 30% of Brazilian Amazon deforestation occurred in the states of Mato Grosso and Rondônia. Liu et al. used the regional coupled Weather Research and Forecasting model to better understand the effects of deforestation on moisture cycles and precipitation in this area. The researchers also embedded a water vapor tracer tool, which can track sources of moisture throughout the water cycle, into the model. To ensure that the data they provided to the model realistically represented both deforestation and regrowth, they used multiple satellite datasets.

The team conducted three simulations of the period 2001–2015: two that included the changes in surface properties shown in the satellite data and one control simulation. (The first year of the simulation was used to allow the model to reach equilibrium and was excluded from the analysis.) They found that a 3.2% mean reduction in forest cover during the included 14-year period caused a 3.5% reduction in evapotranspiration and a 5.4% reduction in precipitation. The reduced evapotranspiration caused warming and drying in the lower atmosphere, which then reduced convection; this reduced atmospheric convection explained nearly 85% of the precipitation reduction seen during the dry season, they found.

The researchers point out that their study highlights the key role land cover changes play in the region’s precipitation levels, as well as the importance of forest protection and sustainable forest management practices. They note that the reduced precipitation during the dry season has negative impacts on river flow, energy generation for hydropower plants, agricultural yields, and fire hazard. (AGU Advances, https://doi.org/10.1029/2025AV001670, 2025)

—Sarah Derouin (@sarahderouin.com), Science Writer

Citation: Derouin, S. (2025), Deforestation is reducing rainfall in the Amazon, Eos, 106, https://doi.org/10.1029/2025EO250192. Published on 19 May 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Bringing Storms into Focus

Mon, 05/19/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Journal of Geophysical Research: Atmospheres

Large convective storms, known as mesoscale convective systems (MCSs), are the main drivers of extreme rainfall and severe weather. Accurately representing these storms in Earth system models is essential for predicting their variations and changes.

Feng et al. [2025] apply ten different feature tracking methods to assess MCSs in an ensemble of next-generation global kilometer-scale (storm-resolving) simulations. Although the tracking methods produced somewhat different estimates of storm frequency and rainfall in observations, consistent patterns emerged when comparing model simulations with observations. While the models generally capture storm frequency well, they tend to underestimate the rainfall amount from these storms and their contribution to total precipitation, particularly over oceans. Most models also predicted heavier MCS rainfall for a given amount of atmospheric water vapor than observed. The mesoscale convective systems tracking method intercomparison (MCSMIP) provides a framework for more robust evaluation of model performance, guiding future model development to improve predictions of storms and their attendant impacts.

Citation: Feng, Z., Prein, A. F., Kukulies, J., Fiolleau, T., Jones, W. K., Maybee, B., et al. (2025). Mesoscale convective systems tracking method intercomparison (MCSMIP): Application to DYAMOND global km-scale simulations. Journal of Geophysical Research: Atmospheres, 130, e2024JD042204. https://doi.org/10.1029/2024JD042204

—Rong Fu, Editor, JGR: Atmospheres

Text © 2025. The authors. CC BY-NC-ND 3.0

Scientists Reveal Hidden Heat and Flood Hazards Across Texas

Fri, 05/16/2025 - 13:23
Source: AGU Advances

Not all extreme weather hazards are sufficiently documented in global databases. For instance, life-threatening high-heat events that fall within climatological norms are often not included in hazard studies, and local or regional flash flooding events frequently go undetected by satellite instruments.

Texas has experienced more than its fair share of extreme weather over the past 20 years, including increasingly frequent flooding and heat events. Using widely accessible daily precipitation and temperature satellite data, Preisser and Passalacqua created a more complete picture of the flooding and heat hazards that have affected the state in recent years.

In consulting rainfall data from 2001 to 2020, the researchers designated a hazardous flood event as one that had an average recurrence interval of 2 or more years—meaning that an event of that magnitude occurred in a given area no more often than every 2 years. They compared their findings to the flooding events documented in the NOAA Storm Events Database and Dartmouth Flood Observatory (DFO) database. Their analysis captured 3 times as many flooding events as the DFO database did and identified an additional $320 million in damages.
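
The study's "average recurrence interval" cutoff can be illustrated with a standard rank-based estimate from hydrology (the Weibull plotting position). This is a generic sketch with made-up rainfall values, not the authors' actual workflow:

```python
def recurrence_intervals(annual_max):
    """Estimate each event's average recurrence interval (ARI) in years
    using the Weibull plotting position: ARI = (n + 1) / rank."""
    n = len(annual_max)
    ranked = sorted(annual_max, reverse=True)
    return {value: (n + 1) / (rank + 1) for rank, value in enumerate(ranked)}

# 20 years of hypothetical annual maximum daily rainfall (millimeters)
rain = [95, 110, 80, 132, 60, 150, 88, 101, 75, 124,
        99, 140, 68, 115, 83, 160, 92, 105, 71, 128]

ari = recurrence_intervals(rain)
# Under the study's definition, events with an ARI of 2 or more years count as hazardous:
hazardous = [value for value, years in ari.items() if years >= 2]
```

With 20 years of data, the 10 largest annual maxima have an estimated ARI of at least 2 years under this formula.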

The team also broadened the analysis of extreme heat. Many previous multihazard studies considered only heat waves, defined as periods of three or more consecutive days in which temperature exceeds a percentile threshold, such as the 90th or 95th. This study also considered heat events, or periods in which the wet-bulb globe temperature exceeds a 30°C health threshold rather than a given percentile. Using this definition, the researchers determined that between 2003 and 2020, Texas experienced 2,517 days with a heat hazard event—nearly 40% of all days. Summed across events, heat hazards affected a cumulative total of 253.2 million square kilometers.
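
The "nearly 40%" figure is consistent with simple calendar arithmetic (the day count and the 30°C threshold come from the study; the check below is ours):

```python
from datetime import date

# Inclusive span of the heat hazard analysis
start, end = date(2003, 1, 1), date(2020, 12, 31)
total_days = (end - start).days + 1  # 6,575 days, including 5 leap days

heat_hazard_days = 2517  # days on which wet-bulb globe temperature exceeded 30°C
fraction = heat_hazard_days / total_days
print(f"{fraction:.1%}")  # about 38.3%, i.e., "nearly 40%"
```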

The study defined combinations of floods and extreme heat as multihazard experiences. Using the average recurrence interval method, combined with the broader definition of hazards, the researchers found that parts of the state with large minority populations faced higher risk from multihazard events. This suggests that older methods may underestimate both the extent of multihazard risks and their disproportionate impact on marginalized communities, the researchers say. (AGU Advances, https://doi.org/10.1029/2025AV001667, 2025)

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2025), Scientists reveal hidden heat and flood hazards across Texas, Eos, 106, https://doi.org/10.1029/2025EO250191. Published on 16 May 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Scientists Map Where Orphan Wells Pose Threats to Aquifers

Fri, 05/16/2025 - 13:23

This article originally appeared on Inside Climate News, a nonprofit, non-partisan news organization that covers climate, energy, and the environment.

For the first time, scientists have mapped groundwater variables nationally to understand which aquifers are most vulnerable to contamination from orphan wells.

Oil and gas wells with no active owner that are no longer producing and have not been plugged are considered orphan wells. These unplugged wells can create pathways for contaminants like hydrocarbons and brine to migrate from the oil and gas formation into groundwater zones. Plugging a well seals off these potential pathways.

USGS scientists Joshua Woda, Karl Haase, Nicholas Gianoutsos, Kalle Jahn and Kristina Gutchess published a geospatial analysis of water-quality threats from orphan wells this month in the journal Science of the Total Environment. They found that factors including large concentrations of orphaned wells and the advanced age of wells make aquifers in Appalachia, the Gulf Coast and California susceptible to contamination.

Using a USGS dataset of 117,672 documented orphan wells nationwide, the researchers found that 54 percent of the wells are within aquifers that supply 94 percent of groundwater used nationally.

“No matter where you live across the nation, you can go look at what’s happening in your backyard, how your aquifers compare to other aquifers and what the threats are,” said Gianoutsos.

Orphan Wells Pockmark Major U.S. Aquifers

The researchers mapped the locations of orphaned wells over principal and secondary aquifers using Geographic Information Systems datasets. They then analyzed the aquifers based on factors that could contribute to vulnerability to groundwater contamination, such as the average age of the orphan wells.

Older wells were subject to less regulation and are more prone to failure. The authors found that Pennsylvanian aquifers, which span several Appalachian states including Pennsylvania, present the “maximum confluence” of risk factors. The first oil wells in the country were drilled in Pennsylvania. Orphan wells can be over 100 years old and located near coal seams and residential water wells.

The Gulf Coast aquifers, including the Coastal Lowlands aquifer system, which stretches from Texas to the Florida Panhandle, were found to be susceptible in part because wells are located in areas like wetlands and open water that are more prone to contamination.

Credit: Inside Climate News

The analysis also considered the rates of pumping from each aquifer. That led them to the California Coastal aquifers and the Central Valley, where a high density of old orphan wells overlaps with highly urbanized areas and intensive groundwater use for agriculture.

The researchers found that the Ada-Vamoosa aquifer, in central Oklahoma, has the highest concentration of orphan wells per square mile of any principal aquifer in the country.

The authors note the paper is not an analysis of the amount of groundwater contamination from orphan wells or the number of leaking orphan wells. But they suggest that policymakers and researchers could use it as a basis to target aquifers for additional investigation.

“This could be a good starting point if someone wanted to do a local investigation,” said Woda.

Gianoutsos noted that the active list of orphan wells is changing as research into orphan wells and well plugging advances. He said some 40,000 orphan wells have been added to the national list since their dataset was created. Another approximately 10,000 orphan wells have been plugged in that time.

“The threats are still there,” he said. “Just as we discover more wells, we discover additional threats.”

The research was funded through the U.S. Department of the Interior’s Orphaned Wells Program Office, established under the Bipartisan Infrastructure Law.

Parts of Pennsylvania Look Like “Swiss Cheese” from Drilling

Orphan wells have been linked to groundwater contamination in states including Pennsylvania, Ohio and Texas. A 2011 Ground Water Protection Council study found that orphan wells caused 41 groundwater contamination incidents in Ohio between 1983 and 2007. The study found orphan wells and sites caused 30 groundwater contamination incidents in Texas between 1993 and 2008.

The Pennsylvania Department of Environmental Protection (DEP) has reported several recent cases of orphan wells contaminating groundwater. An orphan well in Vowinckel in Clarion County contaminated a family’s drinking water before it was plugged last year, according to the DEP. Another orphan well in Shinglehouse, in Potter County, was plugged by DEP in 2024 with emergency funds after a homeowner reported contamination of their water well.

John Stolz, a professor of environmental microbiology at Duquesne University in Pittsburgh, has researched how fluids from oil and gas wells can migrate underground with unintended consequences.

Stolz said some of the wells in Pennsylvania are so old they were cased with wood or metal, unlike the cement that has been standard for decades. He said the wooden casings have often deteriorated completely. He said conventional drilling and more recent fracking have left much of Pennsylvania “looking like Swiss cheese.”

“It’s good to see a study that focuses on the water resources,” he said in response to the USGS study. “We are going to have greater periods of drought, and these water resources are going to become far more valuable.”

Stolz is studying a “frack-out” in the town of New Freeport in southwestern Pennsylvania. An unconventional well being fracked communicated with an orphan well over 3,000 feet away, forcing fluids to the surface. Residents of the town resorted to drinking bottled water, according to NBC News.

“The industry refuses to admit this stuff happens,” he said. “The reality is it happens on a somewhat regular basis.”

—Martha Pskowski (@psskow), Inside Climate News

Seaweed Surges May Alter Arctic Fjord Carbon Dynamics

Fri, 05/16/2025 - 13:22
Source: Journal of Geophysical Research: Oceans

In high-latitude Arctic fjords, warming seas and reduced sea ice are boosting seaweed growth. This expansion of seaweed “forests” could alter the storage and cycling of carbon in coastal Arctic ecosystems, but few studies have explored these potential effects.

Roy et al. present a snapshot of the carbon dynamics of seaweed in a fjord in Svalbard, a Norwegian archipelago in the High Arctic, highlighting key comparisons between different seaweed types and between various fjord zones. The findings suggest that warming-driven seaweed growth could lead to the expansion of oxygen-deficient areas in fjords, potentially disrupting local ecosystems.

A team from the National Centre for Polar and Ocean Research in Goa, India, led the Indian Arctic expeditions in 2017, 2022, and 2023. On these expeditions, researchers collected 20 seaweed samples and 13 sediment samples from a variety of locations across Kongsfjorden, a nearly 20-kilometer-long fjord in Svalbard. Then they analyzed the signatures of stable carbon isotopes and lipids (biomolecules made mostly of long hydrocarbon chains) in the seaweed samples.

They found that red, green, and brown seaweeds had different stable carbon isotope fingerprints, reflecting their distinct ways of obtaining carbon from their surroundings. However, the different seaweeds had similar lipid signatures, suggesting that they developed similar lipid synthesis processes in their shared Arctic fjord environment.

The researchers also detected differences in carbon isotope and lipid signatures in sediments from different parts of the fjord. These data suggested that inner-fjord sediments may contain organic matter from a variety of sources, including seaweed, fossilized carbon, and land plants imported by melting glaciers or surface runoff, whereas organic matter in outer-fjord sediments has a larger proportion of seaweed lipids.

Notably, sediment samples collected beneath areas of high seaweed growth showed chemical evidence of low-oxygen conditions, possibly because of microbes consuming oxygen while feeding on seaweed. If these microbes are the cause of the low-oxygen conditions, continued warming-driven growth of seaweed forests could lead to expansion of oxygen-starved zones in Kongsfjorden and other High Arctic fjords, potentially destabilizing these ecosystems, the researchers say. (Journal of Geophysical Research: Oceans, https://doi.org/10.1029/2024JC021900, 2025)

—Sarah Stanley, Science Writer

Citation: Stanley, S. (2025), Seaweed surges may alter arctic fjord carbon dynamics, Eos, 106, https://doi.org/10.1029/2025EO250187. Published on 16 May 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Revised Emissions Show Higher Cooling in 10th Century Eruption

Fri, 05/16/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Geophysical Research Letters

Using recent improvements in our understanding of volcanic emissions, as well as comparisons to ice core measurements of non-sea-salt sulfur, Fuglestvedt et al. [2025] developed revised estimates of the emissions of the Eldgjá eruption. These sulfur and halogen emission estimates were incorporated in an atmosphere/climate model simulation of the 10th century.

The resulting predictions show higher aerosol optical depth and more cooling during the eruption than previously predicted. In addition, the simulation shows depletion of the ozone layer related to the halogen emissions. The larger amount of cooling improves the agreement with tree-ring proxies of temperature. The work demonstrates that improved emissions estimates resolve past disagreements between the simulated cooling from an atmosphere/climate model and tree-ring-based temperature records, providing new insight into the consequences of a volcanic eruption 1,000 years ago.

Citation: Fuglestvedt, H. F., Gabriel, I., Sigl, M., Thordarson, T., & Krüger, K. (2025). Revisiting the 10th-century Eldgjá eruption: Modeling the climatic and environmental impacts. Geophysical Research Letters, 52, e2024GL110507. https://doi.org/10.1029/2024GL110507

—Lynn Russell, Editor, Geophysical Research Letters

Text © 2025. The authors. CC BY-NC-ND 3.0

New Global River Map Is the First to Include River Bifurcations and Canals

Thu, 05/15/2025 - 13:01
Source: Water Resources Research

Global river datasets represent rivers as single downstream paths that follow surface elevation, but they often miss branching river systems in floodplains and deltas, as well as canals. Forked, or bifurcated, rivers also often run through densely populated areas, so mapping them at scale is crucial as climate change makes flooding more severe.

Wortmann et al. aimed to fill the gaps in existing global river maps with their new Global River Topology (GRIT) network, the first branching global river network that includes bifurcations, multithreaded channels, river distributaries, and large canals. GRIT uses a new digital elevation model with improved horizontal resolution of 30 meters, 3 times finer than the resolution of previous datasets, and incorporates high-resolution satellite imagery.

The GRIT network focuses on waterways with drainage areas greater than 50 square kilometers and bifurcations on rivers wider than 30 meters. GRIT consists of both vector maps, which use vertices and pathways to display features such as river segments and catchment boundaries, and raster layers, which are made up of pixels and capture continuously varying information, such as flow accumulation and height above the river.

In total, the effort maps approximately 19.6 million kilometers of waterways, including 818,000 confluences, 67,000 bifurcations, and 31,000 outlets—6,500 of which flow into closed basins. Most of the mapped bifurcations are on inland rivers, with nearly 30,000 in Asia, more than 12,000 in North and Central America, nearly 10,000 in South America, and nearly 4,000 in Europe.

GRIT provides a more precise and comprehensive view of the shape and connectivity of river systems than did previous reference datasets, the authors say, offering potential to improve hydrological and riverine habitat modeling, flood forecasting, and water management efforts globally. (Water Resources Research, https://doi.org/10.1029/2024WR038308, 2025)

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2025), New global river map is the first to include river bifurcations and canals, Eos, 106, https://doi.org/10.1029/2025EO250173. Published on 15 May 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

An Ancient Warming Event May Have Lasted Longer Than We Thought

Thu, 05/15/2025 - 12:44
Source: Geophysical Research Letters

Fifty-six million years ago, during the Paleocene-Eocene Thermal Maximum (PETM), global temperatures rose by more than 5°C over 100,000 years or more. Between 3,000 and 20,000 petagrams of carbon were released into the atmosphere during this time, severely disrupting ecosystems and ocean life globally and creating a prolonged hothouse state.

Modern anthropogenic global warming is also expected to upend Earth’s carbon cycle for thousands of years. Between 1850 and 2019, approximately 2,390 petagrams of carbon dioxide (CO2) were released into the atmosphere, and the release of another 5,000 petagrams in the coming centuries is possible with continued fossil fuel consumption. However, estimates of how long the disruption will last range widely, from about 3,000 to 165,000 years.

Understanding how long the carbon cycle was disrupted during the PETM could offer researchers insights into how severe and how long-lasting disruptions stemming from anthropogenic climate change may be. Previous research used carbon isotope records to estimate that the PETM lasted 120,000–230,000 years. Piedrahita et al. now suggest that the warming event lasted almost 269,000 years.

The PETM is marked in the geological record by a substantial drop in stable carbon isotope ratios. This drop is split into three phases, each representing a different part of the carbon cycle’s disruption and recovery. Previous estimates of when the isotopic drop ended have varied widely because of noise in the data on which they’re based.

In the new research, scientists studied six sedimentary records whose ages have been reliably estimated in previous work: one terrestrial record from Wyoming’s Bighorn Basin and five marine sedimentary records from various locations. Rather than using only raw data, as in previous studies, they used a probabilistic-based detection limit to account for analytical and chronological uncertainties and constrain the time frame of the PETM.

The recovery period in particular, this new study suggests, took much longer than previous estimates indicated—more than 145,000 years. The extended recovery time during the PETM likely means that future climate change scenarios will influence the carbon cycle for longer than most carbon cycle models predict, according to the researchers. (Geophysical Research Letters, https://doi.org/10.1029/2024GL113117, 2025)

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2025), An ancient warming event may have lasted longer than we thought, Eos, 106, https://doi.org/10.1029/2025EO250188. Published on 15 May 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

A Warming Climate Is Changing Drought Conditions Across Eurasia

Thu, 05/15/2025 - 12:42
Source: AGU Advances

This is an authorized translation of an Eos article.

Determining how much of the change in global drought conditions is attributable to natural hydroclimate variability, and how much is driven by climate change, is a complex task. Scientists often use sophisticated computer models to simulate past climate variations and identify unprecedented drought conditions. These models can also help pinpoint the factors behind those conditions, such as temperature, precipitation, and land use change. However, the models can carry biases that may undermine confidence in drought estimates for some regions.

Because tree rings grow wider in warmer, wetter years and thinner in drier, colder years, they serve as a natural record of climate variability and offer a complementary approach to model-based hydroclimate reconstructions. To study drought across Europe and Asia, Marvel et al. drew on tree ring measurements from the newly published Great Eurasian Drought Atlas (GEDA), which includes records from thousands of trees that grew between 1000 and 2020 CE.

The team divided the GEDA data according to the land regions defined in the Intergovernmental Panel on Climate Change’s Sixth Assessment Report. Using tree ring measurements from 1000 to 1849, they estimated preindustrial variability in each region’s average Palmer Drought Severity Index (PDSI), a commonly used measure of drought risk. They then assessed whether that preindustrial variability could explain modern (1850–2020) PDSI values.

The researchers found that in many regions, modern PDSI variations are better explained by rising global temperatures, suggesting that 21st century drought conditions are unlikely to arise from natural variability alone. The results indicate that as the climate warms, eastern Europe, the Mediterranean, and the Russian Arctic are becoming drier, while northern Europe, eastern Central Asia, and Tibet are becoming wetter.

The researchers note that factors other than climate can also affect tree ring growth. Such factors are unlikely to significantly influence the results, however, because databases like GEDA generally include data from deliberately selected sites and tree species for which climate is the dominant control on ring growth. (AGU Advances, https://doi.org/10.1029/2024AV001289, 2025)

—Sarah Derouin (@sarahderouin.bsky.social), Science Writer

This translation was made by Wiley.

Read this article on WeChat.

Text © 2025. AGU. CC BY-NC-ND 3.0

Can Desalination Quench Agriculture’s Thirst?

Thu, 05/15/2025 - 12:42

This story was originally published by Knowable Magazine.

Ralph Loya was pretty sure he was going to lose the corn. His farm had been scorched by El Paso’s hottest-ever June and second-hottest August; the West Texas county saw 53 days soar over 100 degrees Fahrenheit in the summer of 2024. The region was also experiencing an ongoing drought, which meant that crops on Loya’s eight-plus acres of melons, okra, cucumbers and other produce had to be watered more often than normal.

Loya had been irrigating his corn with somewhat salty, or brackish, water pumped from his well, as much as the salt-sensitive crop could tolerate. It wasn’t enough, and the municipal water was expensive; he was using it in moderation and the corn ears were desiccating where they stood.

Ensuring the survival of agriculture under an increasingly erratic climate is approaching a crisis in the sere and sweltering Western and Southwestern United States, an area that supplies much of our beef and dairy, alfalfa, tree nuts and produce. Contending with too little water to support their plants and animals, farmers have tilled under crops, pulled out trees, fallowed fields and sold off herds. They’ve also used drip irrigation to inject smaller doses of water closer to a plant’s roots, and installed sensors in soil that tell more precisely when and how much to water.

In the last five years, researchers have begun to puzzle out how brackish water, pulled from underground aquifers, might be de-salted cheaply enough to offer farmers another water resilience tool. Loya’s property, which draws its slightly salty water from the Hueco Bolson aquifer, is about to become a pilot site to test how efficiently desalinated groundwater can be used to grow crops in otherwise water-scarce places.

Desalination renders salty water less so. It’s usually applied to water sucked from the ocean, generally in arid lands with few options; some Gulf, African and island countries rely heavily or entirely on desalinated seawater. Inland desalination happens away from coasts, with aquifer waters that are brackish—containing between 1,000 and 10,000 milligrams of salt per liter, versus around 35,000 milligrams per liter for seawater. Texas has more than three dozen centralized brackish groundwater desalination plants, California more than 20.

Such technology has long been considered too costly for farming. Some experts still think it’s a pipe dream. “We see it as a nice solution that’s appropriate in some contexts, but for agriculture it’s hard to justify, frankly,” says Brad Franklin, an agricultural and environmental economist at the Public Policy Institute of California. Desalting an acre-foot (almost 326,000 gallons) of brackish groundwater for crops now costs about $800, while farmers can pay a lot less—as little as $3 an acre-foot for some senior rights holders in some places—for fresh municipal water. As a result, desalination has largely been reserved to make liquid that’s fit for people to drink. In some instances, too, inland desalination can be environmentally risky, endangering nearby plants and animals and reducing stream flows.
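
For a sense of scale, those per-acre-foot prices translate into per-gallon costs. The conversion below uses the standard figure of 325,851 gallons per acre-foot; it is simple unit arithmetic, not a calculation from the article:

```python
GALLONS_PER_ACRE_FOOT = 325_851  # "almost 326,000 gallons"

desal_cost = 800   # dollars per acre-foot, desalinated brackish groundwater
municipal_low = 3  # dollars per acre-foot, cheapest senior-rights municipal water

desal_per_gallon = desal_cost / GALLONS_PER_ACRE_FOOT  # about a quarter of a cent
cost_ratio = desal_cost / municipal_low                # desalination vs. cheapest supply
print(f"${desal_per_gallon:.4f}/gallon, {cost_ratio:.0f}x the cheapest municipal rate")
```

Even at roughly a quarter of a cent per gallon, desalinated water remains hundreds of times pricier than the cheapest senior-rights supplies.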

Brackish (slightly salty) groundwater is found mostly in the Western United States. Credit: J.S. Stanton et al. / Brackish Groundwater in the United States: USGS professional paper 1833, 2017

But the US Bureau of Reclamation and the National Alliance for Water Innovation (NAWI), a research operation that has been granted $185 million from the Department of Energy, have recently invested in projects that could turn that paradigm on its head. Recognizing the urgent need for fresh water for farms—which in the US are mostly inland—combined with the ample if salty water beneath our feet, these entities have funded projects that could help advance small, decentralized desalination systems that can be placed right on farms where they’re needed. Loya’s is one of them.

US farms consume over 83 million acre-feet (more than 27 trillion gallons) of irrigation water every year, making agriculture the second most water-intensive industry in the country, after thermoelectric power. Not all aquifers are brackish, but most of those that are lie in the country’s West, and they’re usually more saline the deeper you dig. With fresh water everywhere in the world becoming saltier due to human activity, “we have to solve inland desal for ag…in order to grow as much food as we need,” says Susan Amrose, a research scientist at MIT who studies inland desalination in the Middle East and North Africa.

That means lowering energy and other operational costs; making systems simple for farmers to run; and figuring out how to slash residual brine, which requires disposal and is considered the process’s “Achilles’ heel,” according to one researcher.

The last half-decade of scientific tinkering is now yielding tangible results, says Peter Fiske, NAWI’s executive director. “We think we have a clear line of sight for agricultural-quality water.”

Swallowing the High Cost

Fiske believes farm-based mini-plants can be cost-effective for producing high-value crops like broccoli, berries and nuts, some of which need a lot of irrigation. That $800 per acre-foot has been achieved by cutting energy use, reducing brine and revolutionizing certain parts and materials. It’s still expensive but arguably worth it for a farmer growing almonds or pistachios in California—as opposed to farmers growing lesser-value commodity crops like wheat and soybeans, for whom desalination will likely never prove affordable. As a nut farmer, “I would sign up to 800 bucks per acre-foot of water till the cows come home,” Fiske says.

Loya’s pilot is being built with Bureau of Reclamation funding and will use a common process called reverse osmosis. Pressure pushes salty water through a semi-permeable membrane; fresh water comes out the other side, leaving salts behind as concentrated brine. Loya figures he can make good money using desalinated water to grow not just fussy corn, but even fussier grapes he might be able to sell at a premium to local wineries.

Such a tiny system shares some of the problems of its large-scale cousins—chiefly, brine disposal. El Paso, for example, boasts the biggest inland desalination plant in the world, which makes 27.5 million gallons of fresh drinking water a day. There, every gallon of brackish water gets split into two streams: fresh water and residual brine, at a ratio of 83 percent to 17 percent. Since there’s no ocean to dump brine into, as with seawater desalination, this plant injects it into deep, porous rock formations—a process too pricey and complicated for farmers.

But what if desalination could create 90 or 95 percent fresh water and 5 to 10 percent brine? What if you could get 100 percent fresh water, with just a bag of dry salts leftover? Handling those solids is a lot safer and easier, “because super-salty water brine is really corrosive…so you have to truck it around in stainless steel trucks,” Fiske says.
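
The recovery percentages above imply a simple mass balance: raising recovery shrinks the brine stream, but the same salt is packed into less water. A minimal sketch, assuming for illustration that essentially all the salt ends up in the brine:

```python
def brine_stats(recovery, feed_gallons=1_000_000):
    """Brine volume and salinity concentration factor for a given
    freshwater recovery fraction (idealized: all salt stays in the brine)."""
    brine_gallons = feed_gallons * (1 - recovery)
    concentration_factor = 1 / (1 - recovery)
    return brine_gallons, concentration_factor

for recovery in (0.83, 0.90, 0.95):
    brine, factor = brine_stats(recovery)
    print(f"{recovery:.0%} recovery: {brine:,.0f} gal brine per "
          f"million gallons of feed, ~{factor:.0f}x feed salinity")
```

At El Paso’s 83 percent recovery, brine leaves at roughly 6 times the feed salinity; pushing recovery to 95 percent cuts the brine volume by more than two-thirds but makes it about 20 times as salty as the feed.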

Finally, what if those salts could be broken into components—lithium, essential for batteries; magnesium, used to create alloys; gypsum, turned into drywall; as well as gold, platinum and other rare-earth elements that can be sold to manufacturers? Already, the El Paso plant participates in “mining” gypsum and hydrochloric acid for industrial customers.

Loya’s brine will be piped into an evaporation pond. Eventually, he’ll have to pay to landfill the dried-out solids, says Quantum Wei, founder and CEO of Harmony Desalting, which is building Loya’s plant. There are other expenses: drilling a well (Loya, fortuitously, already has one to serve the project); building the physical plant; and supplying the electricity to pump water up day after day. These are bitter financial pills for a farmer. “We’re not getting rich; by no means,” Loya says.

Rows of reverse osmosis membranes at the Kay Bailey Hutchison Desalination Plant in El Paso. Credit: Ada Cowan

More cost comes from the desalination itself. The energy needed for reverse osmosis is a lot, and the saltier the water, the higher the need. Additionally, the membranes that catch salt are gossamer-thin, and all that pressure destroys them; they also get gunked up and need to be treated with chemicals.

Reverse osmosis presents another problem for farmers. It doesn’t just remove salt ions from water but the ions of beneficial minerals, too, such as calcium, magnesium and sulfate. According to Amrose, this means farmers have to add fertilizer or mix in pretreated water to replace essential ions that the process took out.

To circumvent such challenges, one NAWI-funded team is experimenting with ultra-high-pressure membranes, fashioned out of stiffer plastic, that can withstand a much harder push. The results so far look “quite encouraging,” Fiske says. Another is looking into a system in which a chemical solvent dropped into water isolates the salt without a membrane, like the polymer inside a diaper absorbs urine. The solvent, in this case the common food-processing compound dimethyl ether, would be used over and over to avoid potentially toxic waste. It has proved cheap enough to be considered for agricultural use.

Amrose is testing a system that uses electrodialysis instead of reverse osmosis. This applies a steady voltage across the water to pull salt ions through an alternating stack of positively charged and negatively charged membranes. Explains Amrose, “You get the negative ions going toward their respective electrode until they can’t pass through the membranes and get stuck,” and the same happens with the positive ions. The process gets much higher fresh water recovery in small systems than reverse osmosis, and is twice as energy efficient at lower salinities. The membranes last longer, too—10 years versus three to five years, Amrose says—and can allow essential minerals to pass through.

Data-Based Design

At Loya’s farm, Wei paces the property on a sweltering summer morning with a local engineering company he’s tapped to design the brine storage pond. Loya is anxious that the pond be as small as possible to keep arable land in production; Wei is more concerned that it be big and deep enough. To size the pond, Wei will look at average weather conditions since 1954 as well as worst-case data from the last 25 years pertaining to monthly evaporation and rainfall rates. He’ll also divide the space into two sections so one can be cleaned while the other is in use. Loya’s pond will likely be one-tenth of an acre, dug three to six feet deep.

(Left to right) West Texas farmer Ralph Loya, Quantum Wei of Harmony Desalting, and engineer Johanes Makahaube discuss where a desalination plant and brine pond might be placed on Loya’s farm. Credit: Ada Cowan

The desalination plant will pair reverse osmosis membranes with a “batch” process, pushing water through multiple times instead of once and gradually amping up the pressure. Regular reverse osmosis is energy-intensive because it constantly applies the highest pressures, Wei says, but Harmony’s process saves energy by using lower pressures to start with. A backwash between cycles prevents scaling by dissolving mineral crystals and washing them away. “You really get the benefit of the farmer not having to deal with dosing chemicals or replacing membranes,” Wei says. “Our goal is to make it as painless as possible.”

Another Harmony innovation concentrates leftover brine by running it through a nanofiltration membrane in their batch system; such membranes are usually used to pretreat water to cut back on scaling or to recover minerals, but Wei believes his system is the first to combine them with batch reverse osmosis. “That’s what’s really going to slash brine volumes,” he says. The whole system will be hooked up to solar panels, keeping Loya’s energy off-grid and essentially free. If all goes to plan, the system will be operational by early 2025 and produce seven gallons of fresh water a minute during the strongest sun of the day, with a goal of 90 to 95 percent fresh water recovery. Any water not immediately used for irrigation will be stored in a tank.
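The recovery target translates directly into brine volume: the feed flow equals the fresh output divided by the recovery fraction, and whatever is left over exits as concentrated brine. A quick check of what the stated figures imply (the seven gallons per minute is from the article; the hours of strong sun per day are an assumption):

```python
def brine_rate(fresh_gpm: float, recovery: float) -> float:
    """Brine produced (gal/min) for a given fresh-water output and recovery fraction."""
    feed = fresh_gpm / recovery  # feed flow needed to yield the fresh output
    return feed - fresh_gpm      # the remainder leaves as concentrated brine

for recovery in (0.90, 0.95):
    b = brine_rate(7.0, recovery)
    # assume roughly 6 hours/day of strong sun for the solar-powered system
    print(f"{recovery:.0%} recovery -> {b:.2f} gal/min brine, ~{b * 60 * 6:.0f} gal/day")
```

Moving from 90 to 95 percent recovery roughly halves the brine stream, which is why the nanofiltration step matters so much for the size of the storage pond.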

Spreading Out the Research

Ninety-eight miles north of Loya’s farm, along a dead flat and endlessly beige expanse of road that skirts the White Sands Missile Range, more desalination projects burble away at the Brackish Groundwater National Desalination Research Facility in Alamogordo, New Mexico. The facility, run by the Bureau of Reclamation, offers scientists a lab and four wells of differing salinities to fiddle with.

On some parched acreage at the foot of the Sacramento Mountains, a longstanding farming pilot project bakes in relentless sunlight. After some preemptive words about the three brine ponds on the property—“They have an interesting smell, in between zoo and ocean”—facility manager Malynda Cappelle drives a golf cart full of visitors past solar arrays and water tanks to a fenced-in parcel of dust and plants. Here, since 2019, a team from the University of North Texas, New Mexico State University and Colorado State University has tested sunflowers, fava beans and, currently, 16 plots of pinto beans. Some plots are bare dirt; others are topped with compost that boosts nutrients, keeps soil moist and provides a salt barrier. Some plots are drip-irrigated with brackish water straight from a well; some get a desalinated/brackish water mix.

Eyeballing the plots even from a distance, the plants in the freshest-water plots look large and healthy. But those with compost are almost as vigorous, even when irrigated with brackish water. This could have significant implications for cash-conscious farmers. “Maybe we do a lesser level of desalination, more blending, and this will reduce the cost,” says Cappelle.

Pei Xu has been a co-investigator on this project since its start. She’s also the progenitor of a NAWI-funded pilot at the El Paso desalination plant. Later in the day, in a high-ceilinged space next to the plant’s treatment room, she shows off its consequential bits. Like Amrose’s system, hers uses electrodialysis. In this instance, though, Xu is aiming to squeeze a bit of additional fresh—at least freshish—water from the plant’s leftover brine. With suitably low levels of salinity, the plant could pipe it to farmers through the county’s existing canal system, turning a waste product into a valuable resource.

“I think our role now and in the future is as water stewards—to work with each farm to understand their situation and then to recommend their best path forward.”

Xu’s pinto bean and El Paso work, and Amrose’s in the Middle East, are all relevant to Harmony’s pilot and future projects. “Ideally we can improve desalination to the point where it’s an option which is seriously considered,” Wei says. “But more importantly, I think our role now and in the future is as water stewards—to work with each farm to understand their situation and then to recommend their best path forward…whether or not desalting is involved.”

Indeed, as water scarcity becomes ever more acute, desalination advances will help agriculture only so much; even researchers who’ve devoted years to solving its challenges say it’s no panacea. “What we’re trying to do is deliver as much water as cheaply as possible, but that doesn’t really encourage smart water use,” says NAWI’s Fiske. “In some cases, it encourages even the reverse. Why are we growing alfalfa in the middle of the desert?”

Franklin, of the California policy institute, highlights another extreme: Twenty-one of the state’s groundwater basins are already critically depleted, some due to agricultural overdrafting. Pumping brackish aquifers for desalination could aggravate environmental risks.

There is an array of measures, researchers say, that farmers themselves must take in order to survive, with rainwater capture and the fixing of leaky infrastructure at the top of the list. “Desalination is not the best, only or first solution,” Wei says. But he believes that when used wisely in tandem with other smart partial fixes, it could prevent some of the worst water-related catastrophes for our food system.

—Lela Nargi, Knowable Magazine

This article originally appeared in Knowable Magazine, a nonprofit publication dedicated to making scientific knowledge accessible to all. Sign up for Knowable Magazine’s newsletter. Read the original article here.

Old Forests in a New Climate

Thu, 05/15/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

The shading and evapotranspiration provided by forest vegetation buffers the understory climate, making it cooler than the surrounding non-forest. But does that buffering help prevent the forest from warming as much as its surroundings due to climate change?

Using a 45-year record in the H.J. Andrews Forest, Oregon, USA, Jones et al. [2025] compare changes in climate along a 1,000 meter elevation gradient with changes in nearby non-forested weather stations. The understory air temperature at every elevation within the forest increased at rates similar to, and in some cases greater than, those measured at meteorological stations throughout Oregon and Washington, indicating that the forest is not decoupled or protected from the effects of climate change.

Furthermore, the increase in summer maximum air temperature has been as large as 5 degrees Celsius throughout the forest. For some summer months, the temperature at the top of the elevation gradient is now about the same as it was at the lowest elevation 45 years ago. These findings are important because they indicate that, while forests confer cooler environments compared to non-forest, they are not protected from climate change.
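The comparison underlying this result comes down to fitting linear trends to long temperature records and comparing slopes between forest and non-forest sites. A generic sketch of that computation on synthetic data (the numbers below are invented for illustration, not the Andrews Forest record):

```python
import random

def trend_per_decade(years, temps):
    """Ordinary least-squares slope of temperature vs. year, scaled to degC/decade."""
    n = len(years)
    my, mt = sum(years) / n, sum(temps) / n
    cov = sum((y - my) * (t - mt) for y, t in zip(years, temps))
    var = sum((y - my) ** 2 for y in years)
    return 10 * cov / var

random.seed(0)
years = list(range(1979, 2024))
# Synthetic records: forest site warming 0.4 degC/decade, regional mean 0.35
forest = [20 + 0.04 * (y - 1979) + random.gauss(0, 0.3) for y in years]
regional = [24 + 0.035 * (y - 1979) + random.gauss(0, 0.3) for y in years]

print(f"forest trend:   {trend_per_decade(years, forest):.2f} degC/decade")
print(f"regional trend: {trend_per_decade(years, regional):.2f} degC/decade")
```

The forest record can sit several degrees cooler than the regional one (the buffering effect) while still showing an equal or steeper slope, which is exactly the pattern the study reports.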

Comparison of maximum air temperature in July from 1979 to 2023 in the Andrews Forest at 1,310 meters elevation (site RS04) and at 683 meters (site RS20) and the statewide average air temperature for Oregon. The high elevation site is consistently cooler than the low elevation site, and both are cooler than the average meteorological stations of Oregon, which includes non-forest sites. Hence, the forest vegetation does buffer (cool) the air temperature, but the slopes of the increase in temperature over time are similar, with the forest perhaps warming a bit faster than the statewide mean, indicating that the forests are not decoupled from the effects of climate change. Credit: Jones et al. [2025], Figure 4a

Citation: Jones, J. A., Daly, C., Schulze, M., & Still, C. J. (2025). Microclimate refugia are transient in stable old forests, Pacific Northwest, USA. AGU Advances, 6, e2024AV001492. https://doi.org/10.1029/2024AV001492

—Eric Davidson, Editor, AGU Advances

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Geological Complexity as a Way to Understand the Distribution of Landslides

Thu, 05/15/2025 - 06:37

The Landslide Blog is written by Dave Petley, who is widely recognized as a world leader in the study and management of landslides.

Over the course of my career, I have read many papers (and indeed, written a few) that have tried to explain the distribution of landslides based upon combinations of factors that we consider might be important in their causation (for example, slope angle and lithology). There is utility in this type of approach, and it has informed planning guidelines in some countries, for example. However, it also has severe limitations and, even with the advent of artificial intelligence, there have been few major advances in this area for a while.

However, there is a very interesting and thought-provoking paper (Zhang et al. 2025) in the Bulletin of Engineering Geology and the Environment that might stimulate considerable interest. One reason for highlighting it here is that it might drop below the radar – this is not a well-read journal in my experience, and the paper is behind a paywall. That would be a shame, but the link in this post should allow you to read the paper.

The authors argue that we tend to treat geological factors in a rather over-simplified way in susceptibility analyses:-

“The types, triggers, and spatial distribution of landslides are closely related to the spatial complexity of geological conditions, which are indispensable factors in landslide susceptibility assessment. However, geological conditions often consider only a single index, leading to under-utilisation of geological information in assessing landslide hazards.”

Instead, they propose the use of an index of “geological complexity”. This index combines four major geological components:

  • Structural complexity – capturing dip direction, dip angle, slope and aspect;
  • Lithologic complexity – this essentially uses a geological map to capture the number of lithologic types per unit area;
  • Tectonic complexity – this represents the density of mapped faults;
  • Seismicity – this captures the distribution of the probability of peak ground accelerations.
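In outline, an index of this kind is a weighted aggregation of normalized component scores computed per map unit. A schematic sketch of that aggregation (the component values and weights below are invented for illustration; Zhang et al. derive their weights analytically):

```python
def normalize(values):
    """Min-max scale a list of raw component scores to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def complexity_index(components, weights):
    """Weighted sum of normalized components, one score per map cell."""
    normed = [normalize(c) for c in components]
    return [sum(w * col[i] for w, col in zip(weights, normed))
            for i in range(len(components[0]))]

# Hypothetical raw scores for five map cells
structural = [0.2, 0.8, 0.5, 0.9, 0.1]  # dip/slope relationships
lithologic = [3, 1, 4, 5, 2]            # lithologic types per unit area
tectonic   = [0.0, 1.2, 0.4, 2.0, 0.1]  # fault density (km/km^2)
seismicity = [0.1, 0.3, 0.2, 0.5, 0.1]  # probabilistic peak ground acceleration

weights = [0.3, 0.2, 0.3, 0.2]  # illustrative only, not the paper's weights
index = complexity_index([structural, lithologic, tectonic, seismicity], weights)
print([round(v, 2) for v in index])
```

With weights summing to one, each cell's score lands in [0, 1], and cells that rank high on every component (cell 4 here) score highest, mirroring how the mapped index is then compared against landslide locations.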

Zhang et al. (2025) use an analytical approach to weight each of these factors to produce an index of geological complexity across the landscape. In this piece of work, they then compare the results with the distribution of mapped landslides in a study area in the Eastern Himalayan Syntaxis in Tibet (centred on about [29.5, 95.25]). This is the broad area studied:-

Google Earth map of the area studied by Zhang et al. (2025) to examine the role of geological complexity in landslide distribution.

Now this is a fascinating study area – the Google Earth image below shows a small part of it – note the many landslides:-

Google Earth image of a part of the area studied by Zhang et al. (2025) to examine the role of geological complexity in landslide distribution.

Zhang et al. (2025) are able to show that, for this area at least, the spatial distribution of their index of geological complexity correlates well with the mapped distribution of landslides (there are 366 mapped landslides in the 16,606 km² study area).

The authors are clear that this is not the final word on this approach. There is little doubt that this part of Tibet is a highly dynamic area in terms of both climate and tectonics, which probably favours structurally controlled landslides. To what degree would this approach work in a different setting? In addition, acquiring reliable data that represents the components could be a real challenge (e.g. structural data and reliable estimates of probability of peak ground accelerations), and of course the relative weighting of the different components of the index is an open question.

But, it introduces a fresh and really interesting approach that is worth exploring more widely. Zhang et al. (2025) note that there is the potential to combine this index with other indices that measure factors in landslide causation (e.g.  topography, climate and human activity) to produce an enhanced susceptibility assessment.

And finally, of course, this approach is providing insights into the ways in which different geological factors aggregate at a landscape scale to generate landslides. That feels like a fundamental insight that is also worth developing.

Thus, this work potentially forms the basis of a range of new studies, which is tremendously exciting.

Reference

Zhang, Y., et al. 2025. Geological complexity: a novel index for measuring the relationship between landslide occurrences and geological conditions. Bulletin of Engineering Geology and the Environment, 84, 301. https://doi.org/10.1007/s10064-025-04333-9

Text © 2023. The authors. CC BY-NC-ND 3.0

EPA to Rescind Rules on Four Forever Chemicals

Wed, 05/14/2025 - 13:51
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

The EPA plans to reconsider drinking water limits for four different PFAS chemicals and extend deadlines for public water systems to comply, according to The Washington Post.

PFAS, or per- and polyfluoroalkyl substances, are a group of chemicals that are widely used for their water- and stain-resistant properties. Exposure to PFAS is linked to higher risks of certain cancers, reproductive health issues, developmental delays and immune system problems. The so-called “forever chemicals” are ubiquitous in the environment and widely contaminate drinking water.

A rule finalized last year under President Joe Biden set drinking water limits for five common PFAS chemicals: PFOA, PFOS, PFHxS, PFNA, and GenX. Limits for PFOA and PFOS were set at 4 parts per trillion, and limits for PFHxS, PFNA, and GenX were set at 10 parts per trillion. The rule also set limits for mixtures of these chemicals and a sixth, PFBS.

Documents reviewed by The Washington Post show that the EPA plans to rescind and reconsider the limits for PFHxS, PFNA, GenX, and PFBS. Though the documents did not indicate a plan to reconsider limits for PFOA and PFOS, the agency does plan to extend the compliance deadlines for PFOA and PFOS limits from 2029 to 2031.

In the documents, Lee Zeldin, the agency’s administrator, said the plan will “protect Americans from PFOA and PFOS in their drinking water” and provide “common-sense flexibility in the form of additional time for compliance.”

 

PFOA is a known carcinogen and PFOS is classified as a possible carcinogen by the National Cancer Institute.

The EPA plan comes after multiple lawsuits against the EPA in which trade associations representing water utilities challenged the science behind Biden’s drinking water standard. 

Experts expressed concern that rescinding and reconsidering limits for the four chemicals may not be legal because the Safe Drinking Water Act requires each revision to EPA drinking water standards to be at least as strict as the former regulation. 

“The law is very clear that the EPA can’t repeal or weaken the drinking water standard. Any effort to do so will clearly violate what Congress has required for decades,” Erik Olson, the senior strategic director for health at the Natural Resources Defense Council, an advocacy group, told The Washington Post.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0

Resilient Solutions Involve Input and Data from the Community

Wed, 05/14/2025 - 13:36
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Community Science Exchange

Climate Safe Neighborhoods (CSN), a national effort by Groundwork USA, is a program that supports local communities in understanding their climate risk and providing input about vulnerabilities and solutions. Working with students, local universities, and organizations, Groundwork USA extended the CSN program (first started in Cincinnati) to northern Kentucky.

A GIS-based dashboard was created to give communities access to data on climate change and other social issues, from health to demographics, in one place. A climate vulnerability model (part of the dashboard) helped identify the Kentucky communities most in need; these neighborhoods were the focus of community workshops where residents learned about climate impacts and collaborated on potential solutions. Community partners helped plan and run the workshops, which included opportunities for residents to provide feedback through mapping activities – data that were added to the dashboard and later used to support climate solutions, such as climate advisory groups and tree plantings.

In their project report, Robles et al. [2025] outline the process and outcomes of the program which can serve as inspiration to others looking to support and collaborate with communities in becoming more resilient to climate impacts.

Citation: Robles, Z., et al. (2025), Climate Safe Neighborhoods: A community collaboration for a more climate-resilient future, Community Science Exchange, https://doi.org/10.1029/2024CSE000101. Published 7 February 2025.  

—Kathryn Semmens, Deputy Editor, Community Science Exchange

Text © 2025. The authors. CC BY-NC-ND 3.0
