Feed aggregator

Revised Emissions Show Higher Cooling in 10th Century Eruption

EOS - Fri, 05/16/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Geophysical Research Letters

Using recent improvements in our understanding of volcanic emissions, as well as comparisons to ice core measurements of non-sea-salt sulfur, Fuglestvedt et al. [2025] developed revised estimates of the emissions of the Eldgjá eruption. These sulfur and halogen emission estimates were then incorporated into an atmosphere/climate model simulation of the 10th century.

The resulting simulations show higher aerosol optical depth and more cooling during the eruption than predicted previously. In addition, the simulated effects on the ozone layer show depletions related to halogen emissions. The larger amount of cooling improves the agreement with tree-ring proxies of temperature. The work demonstrates that improved emission estimates resolve past disagreements between the simulated cooling from an atmosphere/climate model and tree-ring-based temperature records, providing new insight into the consequences of a volcanic eruption 1,000 years ago.

Citation: Fuglestvedt, H. F., Gabriel, I., Sigl, M., Thordarson, T., & Krüger, K. (2025). Revisiting the 10th-century Eldgjá eruption: Modeling the climatic and environmental impacts. Geophysical Research Letters, 52, e2024GL110507. https://doi.org/10.1029/2024GL110507

—Lynn Russell, Editor, Geophysical Research Letters

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Applicability of semiclassical theories in the strong-field plasma regime

Physical Review E (Plasma physics) - Fri, 05/16/2025 - 10:00

Author(s): Haidar Al-Naseri and Gert Brodin

For many purposes, classical plasma dynamics models can work surprisingly well, even for strong electromagnetic fields, approaching the Schwinger critical fields, and high frequencies, approaching the Compton frequency. However, the applicability of classical models tends to depend rather sensitivel…


[Phys. Rev. E 111, 055205] Published Fri May 16, 2025

NASA satellite images could provide early volcano warnings

Phys.org: Earth science - Thu, 05/15/2025 - 21:30
Scientists know that changing tree leaves can indicate when a nearby volcano is becoming more active and might erupt. In a new collaboration between NASA and the Smithsonian Institution, scientists now believe they can detect these changes from space.

A vicious cycle: How methane emissions from warming wetlands could exacerbate climate change

Phys.org: Earth science - Thu, 05/15/2025 - 21:08
Warming in the Arctic is intensifying methane emissions, contributing to a vicious feedback loop that could accelerate climate change even more, according to a new study published in Nature.

NASA-French SWOT satellite offers big view of small ocean features

Phys.org: Earth science - Thu, 05/15/2025 - 17:25
Small things matter, at least when it comes to ocean features like waves and eddies. A recent NASA-led analysis using data from the SWOT (Surface Water and Ocean Topography) satellite found that ocean features as small as a mile across potentially have a larger impact on the movement of nutrients and heat in marine ecosystems than previously thought.

Ancient amber may contain traces of tsunamis

Phys.org: Earth science - Thu, 05/15/2025 - 16:48
Amber deposits found in ancient deep-sea sediment may represent one of the oldest records to date of a tsunami, suggests research published in Scientific Reports. The study describes large amber deposits discovered on Hokkaido Island in northern Japan, and proposes that they were likely swept out from a forest to the ocean by one or more tsunamis between 116 and 114 million years ago.

An ancient warming event may have lasted longer than we thought

Phys.org: Earth science - Thu, 05/15/2025 - 16:22
Fifty-six million years ago, during the Paleocene-Eocene Thermal Maximum (PETM), global temperatures rose by more than 5°C over 100,000 or more years. Between 3,000 and 20,000 petagrams of carbon were released into the atmosphere during this time, severely disrupting ecosystems and ocean life globally and creating a prolonged hothouse state.

New Global River Map Is the First to Include River Bifurcations and Canals

EOS - Thu, 05/15/2025 - 13:01
Source: Water Resources Research

Global river datasets represent rivers that flow downstream in single paths that follow surface elevation, but they often miss branching river systems found in areas such as floodplains, canals, and deltas. Forked, or bifurcated, rivers also often exist in densely populated areas, so mapping them at scale is crucial as climate change makes flooding more severe.

Wortmann et al. aimed to fill the gaps in existing global river maps with their new Global River Topology (GRIT) network, the first branching global river network that includes bifurcations, multithreaded channels, river distributaries, and large canals. GRIT uses a new digital elevation model with improved horizontal resolution of 30 meters, 3 times finer than the resolution of previous datasets, and incorporates high-resolution satellite imagery.

The GRIT network focuses on waterways with drainage areas greater than 50 square kilometers and bifurcations on rivers wider than 30 meters. GRIT consists of both vector maps, which use vertices and pathways to display features such as river segments and catchment boundaries, and raster layers, which are made up of pixels and capture continuously varying information, such as flow accumulation and height above the river.

In total, the effort maps approximately 19.6 million kilometers of waterways, including 818,000 confluences, 67,000 bifurcations, and 31,000 outlets—6,500 of which flow into closed basins. Most of the mapped bifurcations are on inland rivers, with nearly 30,000 in Asia, more than 12,000 in North and Central America, nearly 10,000 in South America, and nearly 4,000 in Europe.

GRIT provides a more precise and comprehensive view of the shape and connectivity of river systems than did previous reference datasets, the authors say, offering potential to improve hydrological and riverine habitat modeling, flood forecasting, and water management efforts globally. (Water Resources Research, https://doi.org/10.1029/2024WR038308, 2025)

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2025), New global river map is the first to include river bifurcations and canals, Eos, 106, https://doi.org/10.1029/2025EO250173. Published on 15 May 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

GRIT remaps the world's rivers, branching into the unknown to aid global flood modeling

Phys.org: Earth science - Thu, 05/15/2025 - 13:00
A team led by researchers at the University of Oxford has created the most complete map of the world's rivers ever made, offering a major leap forward for flood prediction, climate risk planning, and water resource management in a warming world.

Sea expedition helps unravel why mercury levels are so high in the Arctic

Phys.org: Earth science - Thu, 05/15/2025 - 12:56
Mercury (Hg) is a naturally occurring element found across the globe, yet it becomes highly toxic as it accumulates up the food chain. Pollution from human activities has pumped increasing amounts of mercury into the atmosphere, and for reasons that are not well understood, the Arctic region has significantly higher levels of mercury, despite having a relatively sparse population and less pollution.

An Ancient Warming Event May Have Lasted Longer Than We Thought

EOS - Thu, 05/15/2025 - 12:44
Source: Geophysical Research Letters

Fifty-six million years ago, during the Paleocene-Eocene Thermal Maximum (PETM), global temperatures rose by more than 5°C over 100,000 or more years. Between 3,000 and 20,000 petagrams of carbon were released into the atmosphere during this time, severely disrupting ecosystems and ocean life globally and creating a prolonged hothouse state.

Modern anthropogenic global warming is also expected to upend Earth’s carbon cycle for thousands of years. Between 1850 and 2019, approximately 2,390 petagrams of carbon dioxide (CO2) were released into the atmosphere, and the release of another 5,000 petagrams in the coming centuries is possible with continued fossil fuel consumption. However, estimates of how long the disruption will last range widely, from about 3,000 to 165,000 years.

Understanding how long the carbon cycle was disrupted during the PETM could offer researchers insights into how severe and how long-lasting disruptions stemming from anthropogenic climate change may be. Previous research used carbon isotope records to estimate that the PETM lasted 120,000–230,000 years. Piedrahita et al. now suggest that the warming event lasted almost 269,000 years.

The PETM is marked in the geological record by a substantial drop in stable carbon isotope ratios. This drop is split into three phases, each representing a different part of the carbon cycle’s disruption and recovery. Previous estimates of when the isotopic drop ended have varied widely because of noise in the data on which they’re based.

In the new research, scientists studied six sedimentary records whose ages have been reliably estimated in previous work: one terrestrial record from Wyoming’s Bighorn Basin and five marine sedimentary records from various locations. Rather than using only raw data, as in previous studies, they used a probabilistic-based detection limit to account for analytical and chronological uncertainties and constrain the time frame of the PETM.

The recovery period in particular, this new study suggests, took much longer than previous estimates indicated—more than 145,000 years. The extended recovery time during the PETM likely means that future climate change scenarios will influence the carbon cycle for longer than most carbon cycle models predict, according to the researchers. (Geophysical Research Letters, https://doi.org/10.1029/2024GL113117, 2025)

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2025), An ancient warming event may have lasted longer than we thought, Eos, 106, https://doi.org/10.1029/2025EO250188. Published on 15 May 2025. Text © 2025. AGU. CC BY-NC-ND 3.0

Climate Warming Is Changing Drought Conditions in Eurasia

EOS - Thu, 05/15/2025 - 12:42
Source: AGU Advances

This is an authorized translation of an Eos article.

Determining how much of the change in global drought conditions is attributable to natural hydroclimate variability, and how much is driven by climate change, is a complex task. Scientists often use sophisticated computer models to simulate past climate variations and to identify unprecedented drought conditions. These models can also help identify the factors behind those conditions, such as temperature, precipitation, and land use change. However, the models can carry biases that may undermine the credibility of drought estimates for some regions.

Because tree rings grow wider in warmer, wetter years and thinner in drier, colder years, they serve as a record of natural climate variability and offer a complementary approach to model-based hydroclimate reconstructions. To study drought conditions in Europe and Asia, Marvel et al. used tree ring measurements from the newly published Great Eurasian Drought Atlas (GEDA), which contains records from thousands of trees that grew between 1000 and 2020 CE.

The team divided the GEDA data according to the land regions defined by the Intergovernmental Panel on Climate Change’s Sixth Assessment Report. Using tree ring measurements from 1000 to 1849, they estimated preindustrial variability in each region’s average Palmer Drought Severity Index (PDSI), a commonly used measure of drought risk. They then assessed whether this preindustrial variability could explain modern (1850–2020) PDSI values.
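The logic of that comparison can be sketched as a simple outlier test: estimate the preindustrial spread of a region’s PDSI, then ask how far a modern value falls outside it. This is only an illustration of the idea, not the authors’ statistical method, and the reconstruction values below are invented.

```python
# Simplified sketch: does a modern PDSI value fall outside the range of
# preindustrial variability? (Illustrative only; values are invented.)
from statistics import mean, stdev

def z_score(modern_value: float, preindustrial: list) -> float:
    """Standardize a modern value against the preindustrial distribution."""
    mu, sigma = mean(preindustrial), stdev(preindustrial)
    return (modern_value - mu) / sigma

# Hypothetical preindustrial PDSI reconstruction for one region
pre = [0.3, -0.5, 0.1, 0.8, -1.1, 0.4, -0.2, 0.6, -0.7, 0.2]
z = z_score(-2.4, pre)
print(f"z = {z:.1f}")  # strongly negative: drier than natural variability explains
```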

The researchers found that in many regions, modern PDSI changes are more accurately explained by rising global temperatures, suggesting that 21st-century drought conditions are unlikely to result from natural variability alone. The results show that as the climate warms, eastern Europe, the Mediterranean, and the Russian Arctic are becoming drier, while northern Europe, eastern Central Asia, and Tibet are becoming wetter.

The researchers note that, beyond climate change, other factors can influence tree rings. However, those factors are unlikely to substantially affect the results, because databases such as GEDA generally draw on selectively sampled sites and tree species for which climate is the dominant control on ring growth. (AGU Advances, https://doi.org/10.1029/2024AV001289, 2025)

—Sarah Derouin (@sarahderouin.bsky.social), Science Writer

This translation was made by Wiley.

Read this article on WeChat.

Text © 2025. AGU. CC BY-NC-ND 3.0

Can Desalination Quench Agriculture’s Thirst?

EOS - Thu, 05/15/2025 - 12:42

This story was originally published by Knowable Magazine.

Ralph Loya was pretty sure he was going to lose the corn. His farm had been scorched by El Paso’s hottest-ever June and second-hottest August; the West Texas county saw 53 days soar over 100 degrees Fahrenheit in the summer of 2024. The region was also experiencing an ongoing drought, which meant that crops on Loya’s eight-plus acres of melons, okra, cucumbers and other produce had to be watered more often than normal.

Loya had been irrigating his corn with somewhat salty, or brackish, water pumped from his well, as much as the salt-sensitive crop could tolerate. It wasn’t enough, and the municipal water was expensive; he was using it in moderation and the corn ears were desiccating where they stood.

Ensuring the survival of agriculture under an increasingly erratic climate is approaching a crisis in the sere and sweltering Western and Southwestern United States, an area that supplies much of our beef and dairy, alfalfa, tree nuts and produce. Contending with too little water to support their plants and animals, farmers have tilled under crops, pulled out trees, fallowed fields and sold off herds. They’ve also used drip irrigation to inject smaller doses of water closer to a plant’s roots, and installed sensors in soil that tell more precisely when and how much to water.

“We see it as a nice solution that’s appropriate in some contexts, but for agriculture it’s hard to justify, frankly.”

In the last five years, researchers have begun to puzzle out how brackish water, pulled from underground aquifers, might be de-salted cheaply enough to offer farmers another water resilience tool. Loya’s property, which draws its slightly salty water from the Hueco Bolson aquifer, is about to become a pilot site to test how efficiently desalinated groundwater can be used to grow crops in otherwise water-scarce places.

Desalination renders salty water less so. It’s usually applied to water sucked from the ocean, generally in arid lands with few options; some Gulf, African and island countries rely heavily or entirely on desalinated seawater. Inland desalination happens away from coasts, with aquifer waters that are brackish—containing between 1,000 and 10,000 milligrams of salt per liter, versus around 35,000 milligrams per liter for seawater. Texas has more than three dozen centralized brackish groundwater desalination plants, California more than 20.
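The salinity bands quoted above can be captured in a few lines; the thresholds (milligrams of dissolved salt per liter) are the article’s, while the function name and exact boundary handling are assumptions:

```python
# The article's salinity bands, in milligrams of dissolved salt per liter.
# (Function name and exact boundary handling are illustrative choices.)
def classify_water(mg_per_liter: float) -> str:
    if mg_per_liter < 1_000:
        return "fresh"
    if mg_per_liter <= 10_000:
        return "brackish"
    return "saline"

print(classify_water(500))     # fresh
print(classify_water(5_000))   # brackish
print(classify_water(35_000))  # saline (typical seawater)
```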

Such technology has long been considered too costly for farming. Some experts still think it’s a pipe dream. “We see it as a nice solution that’s appropriate in some contexts, but for agriculture it’s hard to justify, frankly,” says Brad Franklin, an agricultural and environmental economist at the Public Policy Institute of California. Desalting an acre-foot (almost 326,000 gallons) of brackish groundwater for crops now costs about $800, while farmers can pay a lot less—as little as $3 an acre-foot for some senior rights holders in some places—for fresh municipal water. As a result, desalination has largely been reserved to make liquid that’s fit for people to drink. In some instances, too, inland desalination can be environmentally risky, endangering nearby plants and animals and reducing stream flows.
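To see why that $800 figure is still a hard sell, it helps to put both prices on a common footing. A quick back-of-envelope conversion, using 325,851 US gallons per acre-foot:

```python
# Convert $/acre-foot prices to $/1,000 gallons for comparison.
ACRE_FOOT_GALLONS = 325_851  # US gallons in one acre-foot

def cost_per_1000_gal(dollars_per_acre_foot: float) -> float:
    return dollars_per_acre_foot / ACRE_FOOT_GALLONS * 1_000

desal = cost_per_1000_gal(800)  # desalinated brackish groundwater
senior = cost_per_1000_gal(3)   # cheapest senior-rights municipal water

print(f"desal:  ${desal:.2f} per 1,000 gal")    # ~ $2.46
print(f"senior: ${senior:.4f} per 1,000 gal")   # ~ $0.0092
print(f"desal costs about {desal / senior:.0f}x more")
```

Even at that 267-fold markup, the calculus can change for a grower whose crop is worth thousands of dollars per acre-foot of water applied.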

Brackish (slightly salty) groundwater is found mostly in the Western United States. Credit: J.S. Stanton et al. / Brackish Groundwater in the United States: USGS professional paper 1833, 2017

But the US Bureau of Reclamation, along with a research operation called the National Alliance for Water Innovation (NAWI) that’s been granted $185 million from the Department of Energy, has recently invested in projects that could turn that paradigm on its head. Recognizing the urgent need for fresh water for farms—which in the US are mostly inland—combined with the ample if salty water beneath our feet, these entities have funded projects that could help advance small, decentralized desalination systems that can be placed right on farms where they’re needed. Loya’s is one of them.

“We think we have a clear line of sight for agricultural-quality water.”

US farms consume over 83 million acre-feet (more than 27 trillion gallons) of irrigation water every year, making agriculture the second most water-intensive industry in the country, after thermoelectric power. Not all aquifers are brackish, but most that are exist in the country’s West, and they’re usually more saline the deeper you dig. With fresh water everywhere in the world becoming saltier due to human activity, “we have to solve inland desal for ag…in order to grow as much food as we need,” says Susan Amrose, a research scientist at MIT who studies inland desalination in the Middle East and North Africa.

That means lowering energy and other operational costs; making systems simple for farmers to run; and figuring out how to slash residual brine, which requires disposal and is considered the process’s “Achilles’ heel,” according to one researcher.

The last half-decade of scientific tinkering is now yielding tangible results, says Peter Fiske, NAWI’s executive director. “We think we have a clear line of sight for agricultural-quality water.”

Swallowing the High Cost

Fiske believes farm-based mini-plants can be cost-effective for producing high-value crops like broccoli, berries and nuts, some of which need a lot of irrigation. That $800 per acre-foot has been achieved by cutting energy use, reducing brine and revolutionizing certain parts and materials. It’s still expensive but arguably worth it for a farmer growing almonds or pistachios in California—as opposed to farmers growing lesser-value commodity crops like wheat and soybeans, for whom desalination will likely never prove affordable. As a nut farmer, “I would sign up to 800 bucks per acre-foot of water till the cows come home,” Fiske says.

Loya’s pilot is being built with Bureau of Reclamation funding and will use a common process called reverse osmosis. Pressure pushes salty water through a semi-permeable membrane; fresh water comes out the other side, leaving salts behind as concentrated brine. Loya figures he can make good money using desalinated water to grow not just fussy corn, but even fussier grapes he might be able to sell at a premium to local wineries.

Such a tiny system shares some of the problems of its large-scale cousins—chiefly, brine disposal. El Paso, for example, boasts the biggest inland desalination plant in the world, which makes 27.5 million gallons of fresh drinking water a day. There, every gallon of brackish water gets split into two streams: fresh water and residual brine, at a ratio of 83 percent to 17 percent. Since there’s no ocean to dump brine into, as with seawater desalination, this plant injects it into deep, porous rock formations—a process too pricey and complicated for farmers.
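Applying the quoted 83/17 split to the plant’s stated output gives a rough sense of the disposal problem. A back-of-envelope mass balance, assuming the split applies by volume:

```python
# Back-of-envelope mass balance for the El Paso plant, assuming the
# quoted 83%/17% fresh-to-brine split applies to the feed volume.
fresh_out_mgd = 27.5            # drinking water produced, Mgal/day
fresh_fraction = 0.83

feed_mgd = fresh_out_mgd / fresh_fraction  # brackish water pumped in
brine_mgd = feed_mgd - fresh_out_mgd       # concentrate needing disposal

print(f"feed:  ~{feed_mgd:.1f} Mgal/day")   # ~33.1
print(f"brine: ~{brine_mgd:.1f} Mgal/day")  # ~5.6
```

Roughly 5.6 million gallons of corrosive concentrate per day is what drives the deep-well injection described above.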

But what if desalination could create 90 or 95 percent fresh water and 5 to 10 percent brine? What if you could get 100 percent fresh water, with just a bag of dry salts leftover? Handling those solids is a lot safer and easier, “because super-salty water brine is really corrosive…so you have to truck it around in stainless steel trucks,” Fiske says.

Finally, what if those salts could be broken into components—lithium, essential for batteries; magnesium, used to create alloys; gypsum, turned into drywall; as well as gold, platinum and other rare-earth elements that can be sold to manufacturers? Already, the El Paso plant participates in “mining” gypsum and hydrochloric acid for industrial customers.

Loya’s brine will be piped into an evaporation pond. Eventually, he’ll have to pay to landfill the dried-out solids, says Quantum Wei, founder and CEO of Harmony Desalting, which is building Loya’s plant. There are other expenses: drilling a well (Loya, fortuitously, already has one to serve the project); building the physical plant; and supplying the electricity to pump water up day after day. These are bitter financial pills for a farmer. “We’re not getting rich; by no means,” Loya says.

Rows of reverse osmosis membranes at the Kay Bailey Hutchison Desalination Plant in El Paso. Credit: Ada Cowan

More cost comes from the desalination itself. The energy needed for reverse osmosis is a lot, and the saltier the water, the higher the need. Additionally, the membranes that catch salt are gossamer-thin, and all that pressure destroys them; they also get gunked up and need to be treated with chemicals.

Reverse osmosis presents another problem for farmers. It doesn’t just remove salt ions from water but the ions of beneficial minerals, too, such as calcium, magnesium and sulfate. According to Amrose, this means farmers have to add fertilizer or mix in pretreated water to replace essential ions that the process took out.

To circumvent such challenges, one NAWI-funded team is experimenting with ultra-high-pressure membranes, fashioned out of stiffer plastic, that can withstand a much harder push. The results so far look “quite encouraging,” Fiske says. Another is looking into a system in which a chemical solvent dropped into water isolates the salt without a membrane, like the polymer inside a diaper absorbs urine. The solvent, in this case the common food-processing compound dimethyl ether, would be used over and over to avoid potentially toxic waste. It has proved cheap enough to be considered for agricultural use.

Amrose is testing a system that uses electrodialysis instead of reverse osmosis. This sends a steady surge of voltage across water to pull salt ions through an alternating stack of positively charged and negatively charged membranes. Explains Amrose, “You get the negative ions going toward their respective electrode until they can’t pass through the membranes and get stuck,” and the same happens with the positive ions. The process gets much higher fresh water recovery in small systems than reverse osmosis, and is twice as energy efficient at lower salinities. The membranes last longer, too—10 years versus three to five years, Amrose says—and can allow essential minerals to pass through.

Data-Based Design

At Loya’s farm, Wei paces the property on a sweltering summer morning with a local engineering company he’s tapped to design the brine storage pond. Loya is anxious that the pond be as small as possible to keep arable land in production; Wei is more concerned that it be big and deep enough. To factor this, he’ll look at average weather conditions since 1954 as well as worst-case data from the last 25 years pertaining to monthly evaporation and rainfall rates. He’ll also divide the space into two sections so one can be cleaned while the other is in use. Loya’s pond will likely be one-tenth of an acre, dug three to six feet deep.

(Left to right) West Texas farmer Ralph Loya, Quantum Wei of Harmony Desalting, and engineer Johanes Makahaube discuss where a desalination plant and brine pond might be placed on Loya’s farm. Credit: Ada Cowan

“Our goal is to make it as painless as possible.”

The desalination plant will pair reverse osmosis membranes with a “batch” process, pushing water through multiple times instead of once and gradually amping up the pressure. Regular reverse osmosis is energy-intensive because it constantly applies the highest pressures, Wei says, but Harmony’s process saves energy by using lower pressures to start with. A backwash between cycles prevents scaling by dissolving mineral crystals and washing them away. “You really get the benefit of the farmer not having to deal with dosing chemicals or replacing membranes,” Wei says. “Our goal is to make it as painless as possible.”

Another Harmony innovation concentrates leftover brine by running it through a nanofiltration membrane in their batch system; such membranes are usually used to pretreat water to cut back on scaling or to recover minerals, but Wei believes his system is the first to combine them with batch reverse osmosis. “That’s what’s really going to slash brine volumes,” he says. The whole system will be hooked up to solar panels, keeping Loya’s energy off-grid and essentially free. If all goes to plan, the system will be operational by early 2025 and produce seven gallons of fresh water a minute during the strongest sun of the day, with a goal of 90 to 95 percent fresh water recovery. Any water not immediately used for irrigation will be stored in a tank.
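What 90 to 95 percent recovery means in practice can be sketched from the stated 7 gallons per minute of fresh output; the six-hour daily window of strong sun used below is an assumption for illustration:

```python
# Rough daily volumes for a small farm desalting unit at 7 gal/min of
# fresh output. Recovery = fresh / feed; the 6-hour solar window is assumed.
def daily_volumes(fresh_gpm: float, recovery: float, hours: float):
    """Return (fresh, feed, brine) in gallons per day."""
    fresh = fresh_gpm * 60 * hours
    feed = fresh / recovery
    return fresh, feed, feed - fresh

for recovery in (0.90, 0.95):
    fresh, feed, brine = daily_volumes(7, recovery, hours=6)
    print(f"recovery {recovery:.0%}: {fresh:.0f} gal fresh, {brine:.0f} gal to the brine pond")
```

Moving from 90 to 95 percent recovery halves the brine stream, which is why recovery is the number everyone in the field chases.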

Spreading Out the Research

Ninety-eight miles north of Loya’s farm, along a dead flat and endlessly beige expanse of road that skirts the White Sands Missile Range, more desalination projects burble away at the Brackish Groundwater National Desalination Research Facility in Alamogordo, New Mexico. The facility, run by the Bureau of Reclamation, offers scientists a lab and four wells of differing salinities to fiddle with.

On some parched acreage at the foot of the Sacramento Mountains, a longstanding farming pilot project bakes in relentless sunlight. After some preemptive words about the three brine ponds on the property—“They have an interesting smell, in between zoo and ocean”—facility manager Malynda Cappelle drives a golf cart full of visitors past solar arrays and water tanks to a fenced-in parcel of dust and plants. Here, since 2019, a team from the University of North Texas, New Mexico State University and Colorado State University has tested sunflowers, fava beans and, currently, 16 plots of pinto beans. Some plots are bare dirt; others are topped with compost that boosts nutrients, keeps soil moist and provides a salt barrier. Some plots are drip-irrigated with brackish water straight from a well; some get a desalinated/brackish water mix.

Eyeballing the plots even from a distance, the plants in the freshest-water plots look large and healthy. But those with compost are almost as vigorous, even when irrigated with brackish water. This could have significant implications for cash-conscious farmers. “Maybe we do a lesser level of desalination, more blending, and this will reduce the cost,” says Cappelle.

Pei Xu has been a co-investigator on this project since its start. She’s also the progenitor of a NAWI-funded pilot at the El Paso desalination plant. Later in the day, in a high-ceilinged space next to the plant’s treatment room, she shows off its consequential bits. Like Amrose’s system, hers uses electrodialysis. In this instance, though, Xu is aiming to squeeze a bit of additional fresh—at least freshish—water from the plant’s leftover brine. With suitably low levels of salinity, the plant could pipe it to farmers through the county’s existing canal system, turning a waste product into a valuable resource.

“I think our role now and in the future is as water stewards—to work with each farm to understand their situation and then to recommend their best path forward.”

Xu’s pinto bean and El Paso work, and Amrose’s in the Middle East, are all relevant to Harmony’s pilot and future projects. “Ideally we can improve desalination to the point where it’s an option which is seriously considered,” Wei says. “But more importantly, I think our role now and in the future is as water stewards—to work with each farm to understand their situation and then to recommend their best path forward…whether or not desalting is involved.”

Indeed, as water scarcity becomes ever more acute, desalination advances will help agriculture only so much; even researchers who’ve devoted years to solving its challenges say it’s no panacea. “What we’re trying to do is deliver as much water as cheaply as possible, but that doesn’t really encourage smart water use,” says NAWI’s Fiske. “In some cases, it encourages even the reverse. Why are we growing alfalfa in the middle of the desert?”

Franklin, of the California policy institute, highlights another extreme: Twenty-one of the state’s groundwater basins are already critically depleted, some due to agricultural overdrafting. Pumping brackish aquifers for desalination could aggravate environmental risks.

There are an array of measures, say researchers, that farmers themselves must take in order to survive, with rainwater capture and the fixing of leaky infrastructure at the top of the list. “Desalination is not the best, only or first solution,” Wei says. But he believes that when used wisely in tandem with other smart partial fixes, it could prevent some of the worst water-related catastrophes for our food system.

—Lela Nargi, Knowable Magazine

This article originally appeared in Knowable Magazine, a nonprofit publication dedicated to making scientific knowledge accessible to all. Sign up for Knowable Magazine’s newsletter. Read the original article here.

First machine learning model developed to calculate the volume of all glaciers on Earth

Phys.org: Earth science - Thu, 05/15/2025 - 12:03
A team of researchers led by Niccolò Maffezzoli, "Marie Curie" fellow at Ca' Foscari University of Venice and the University of California, Irvine, and an associate member of the Institute of Polar Sciences of the National Research Council of Italy, has developed the first global model based on artificial intelligence to calculate the ice thickness distribution of all the glaciers on Earth.

Old Forests in a New Climate

EOS - Thu, 05/15/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

The shading and evapotranspiration provided by forest vegetation buffers the understory climate, making it cooler than the surrounding non-forest. But does that buffering help prevent the forest from warming as much as its surroundings due to climate change?

Using a 45-year record in the H.J. Andrews Forest, Oregon, USA, Jones et al. [2025] compare changes in climate along a 1,000 meter elevation gradient with changes in nearby non-forested weather stations. The understory air temperature at every elevation within the forest increased at rates similar to, and in some cases greater than, those measured at meteorological stations throughout Oregon and Washington, indicating that the forest is not decoupled or protected from the effects of climate change.

Furthermore, the increase in summer maximum air temperature has been as large as 5 degrees Celsius throughout the forest. For some summer months, the temperature at the top elevation is now about the same as it was at the lowest elevation 45 years ago. These findings are important because they indicate that, while forests confer cooler environments compared to non-forest, they are not protected from climate change.

Comparison of maximum air temperature in July from 1979 to 2023 in the Andrews Forest at 1,310 meters elevation (site RS04) and at 683 meters (site RS20) and the statewide average air temperature for Oregon. The high elevation site is consistently cooler than the low elevation site, and both are cooler than the average meteorological stations of Oregon, which includes non-forest sites. Hence, the forest vegetation does buffer (cool) the air temperature, but the slopes of the increase in temperature over time are similar, with the forest perhaps warming a bit faster than the statewide mean, indicating that the forests are not decoupled from the effects of climate change. Credit: Jones et al. [2025], Figure 4a

Citation: Jones, J. A., Daly, C., Schulze, M., & Still, C. J. (2025). Microclimate refugia are transient in stable old forests, Pacific Northwest, USA. AGU Advances, 6, e2024AV001492. https://doi.org/10.1029/2024AV001492

—Eric Davidson, Editor, AGU Advances

Text © 2025. The authors. CC BY-NC-ND 3.0

Amazon could survive long-term drought but at a high cost, study suggests

Phys.org: Earth science - Thu, 05/15/2025 - 11:24
The Amazon rainforest may be able to survive long-term drought caused by climate change, but adjusting to a drier, warmer world would exact a heavy toll, a study published in Nature Ecology and Evolution suggests.

Dynamic stabilization and parametric excitation of instabilities in an ablation front by a temporally modulated laser pulse

Physical Review E (Plasma physics) - Thu, 05/15/2025 - 10:00

Author(s): K. G. Zhao, J. W. Li, L. F. Wang, Z. Y. Li, Z. H. Di, C. Xue, J. Q. Dong, H. Zhang, J. F. Wu, H. B. Zhuo, W. H. Ye, C. T. Zhou, Y. K. Ding, W. Y. Zhang, and X. T. He

We conducted a numerical study of the effects of the modulation amplitude and period of a temporally modulated laser pulse on instabilities at an ablation front. The physical features of the oscillatory acceleration and ablation velocity in unperturbed ablative flows display periodic oscillations. A…


[Phys. Rev. E 111, 055204] Published Thu May 15, 2025

Geological complexity as a way to understand the distribution of landslides

EOS - Thu, 05/15/2025 - 06:37

The Landslide Blog is written by Dave Petley, who is widely recognized as a world leader in the study and management of landslides.

Over the course of my career, I have read many papers (and indeed, written a few) that have tried to explain the distribution of landslides based upon combinations of factors that we consider might be important in their causation (for example, slope angle and lithology). There is utility in this type of approach, and it has informed planning guidelines in some countries, for example. However, it also has severe limitations and, even with the advent of artificial intelligence, there have been few major advances in this area for a while.

However, there is a very interesting and thought-provoking paper (Zhang et al. 2025) in the Bulletin of Engineering Geology and the Environment that might stimulate considerable interest. One reason for highlighting it here is that it might drop below the radar – this is not a well-read journal in my experience, and the paper is behind a paywall. That would be a shame, but the link in this post should allow you to read the paper.

The authors argue that we tend to treat geological factors in a rather over-simplified way in susceptibility analyses:-

“The types, triggers, and spatial distribution of landslides are closely related to the spatial complexity of geological conditions, which are indispensable factors in landslide susceptibility assessment. However, geological conditions often consider only a single index, leading to under-utilisation of geological information in assessing landslide hazards.”

Instead, they propose the use of an index of “geological complexity”. This index combines four major geological components:

  • Structural complexity – capturing dip direction, dip angle, slope and aspect;
  • Lithologic complexity – this essentially uses a geological map to capture the number of lithologic types per unit area;
  • Tectonic complexity – this represents the density of mapped faults;
  • Seismicity – this captures the distribution of the probability of peak ground accelerations.
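The combination of these components can be sketched as a weighted sum over normalized scores. The function below is a minimal illustration only: the component values and equal weights are hypothetical, and Zhang et al. (2025) derive their actual weights analytically, so their scheme may well differ from this simple linear combination.

```python
def complexity_index(structural, lithologic, tectonic, seismicity,
                     weights=(0.25, 0.25, 0.25, 0.25)):
    """Combine four normalized (0-1) component scores into a single
    geological-complexity score (hypothetical weighting)."""
    components = (structural, lithologic, tectonic, seismicity)
    if not all(0.0 <= c <= 1.0 for c in components):
        raise ValueError("component scores must be normalized to [0, 1]")
    return sum(w * c for w, c in zip(weights, components))

# A grid cell with dense faulting and high seismicity scores higher
# than an otherwise similar cell with low tectonic activity:
cell_a = complexity_index(0.2, 0.3, 0.9, 0.8)
cell_b = complexity_index(0.2, 0.3, 0.1, 0.1)
```

Computed per map unit, such a score could then be compared against the mapped landslide distribution, which is essentially what the authors do.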

Zhang et al. (2025) use an analytical approach to weight each of these factors, producing an index of geological complexity across the landscape. They then compare the results with the distribution of mapped landslides in a study area in the Eastern Himalayan Syntaxis in Tibet (centred on about [29.5, 95.25]). This is the broad area studied:-

Google Earth map of the area studied by Zhang et al. (2025) to examine the role of geological complexity in landslide distribution.

Now this is a fascinating study area – the Google Earth image below shows a small part of it – note the many landslides:-

Google Earth image of a part of the area studied by Zhang et al. (2025) to examine the role of geological complexity in landslide distribution.

Zhang et al. (2025) are able to show that, for this area at least, the spatial distribution of their index of geological complexity correlates well with the mapped distribution of landslides (there are 366 mapped landslides in the 16,606 km² study area).

The authors are clear that this is not the final word on this approach. There is little doubt that this part of Tibet is a highly dynamic area in terms of both climate and tectonics, which probably favours structurally controlled landslides. To what degree would this approach work in a different setting? In addition, acquiring reliable data that represents the components could be a real challenge (e.g. structural data and reliable estimates of probability of peak ground accelerations), and of course the relative weighting of the different components of the index is an open question.

But, it introduces a fresh and really interesting approach that is worth exploring more widely. Zhang et al. (2025) note that there is the potential to combine this index with other indices that measure factors in landslide causation (e.g. topography, climate and human activity) to produce an enhanced susceptibility assessment.

And finally, of course, this approach provides insights into the ways in which different geological factors aggregate at a landscape scale to generate landslides. That feels like a fundamental insight that is also worth developing.

Thus, this work potentially forms the basis of a range of new studies, which is tremendously exciting.

Reference

Zhang, Y., et al. 2025. Geological Complexity: a novel index for measuring the relationship between landslide occurrences and geological conditions. Bulletin of Engineering Geology and the Environment, 84, 301. https://doi.org/10.1007/s10064-025-04333-9

Return to The Landslide Blog homepage

Text © 2023. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
