Feed aggregator

Supercomputer modeling unlocks longstanding mystery of subducted oceanic slabs

Phys.org: Earth science - Fri, 10/03/2025 - 13:03
An international research collaboration has harnessed supercomputing power to better understand how massive slabs of ancient ocean floors are shaped as they sink hundreds of kilometers below Earth's surface.

The AI Revolution in Weather Forecasting Is Here

EOS - Fri, 10/03/2025 - 13:01

Weather forecasting has become essential in modern life, reducing weather-related losses and improving societal outcomes. Severe weather alerts provide vital early warnings that help to protect life and property. And forecasts of temperatures, precipitation, wind, humidity, and other conditions—both extreme and average—support public safety, health, and economic prosperity by giving everyone from farmers and fishers to energy and construction companies a heads-up on expected weather.

However, not all forecasts are created equal, in part because weather prediction is chaotic, meaning small uncertainties in the initial conditions (data) input into weather models can lead to vastly different predicted outcomes. The accuracy of predictions is also affected by the complexity of models, the realism with which atmospheric conditions are represented, how far into the future weather is being forecast, and—at very resolved scales—local geography.

The skill and reliability of weather forecasts have steadily improved over the past century. In recent decades, improvements have been facilitated by advances in numerical weather prediction (NWP), growth in computing power, and the availability of more and better datasets that capture Earth’s physical conditions more frequently. The application of novel artificial intelligence (AI) is providing the latest revolutionary influence on forecasting. This revolution is borne out by trends in the scientific literature and in the development of new AI-based tools with the potential to enhance predictions of conditions hours, days, or weeks in advance.

Making the Models

All weather forecasts involve inputting data in the form of observations—readings from weather balloons, buoys, satellites, and other instruments—into models that predict future states of the atmosphere. Model outputs are then transformed into useful products such as daily weather forecasts, storm warnings, and fire hazard assessments.

Current forecasting methods are based on NWP, a mathematical framework that models the future of the atmosphere by treating it as a fluid that interacts with water bodies, land, and the biosphere. Models using this approach include the European Centre for Medium-Range Weather Forecasts’ (ECMWF) Integrated Forecasting System (IFS) model (widely considered the gold standard in modern weather forecasting), the National Center for Atmospheric Research’s Weather Research and Forecasting model, and NOAA’s Global Forecast System.

NWP models solve fluid dynamics equations known as the Navier-Stokes equations that simplify the complex motions of fluids, such as air in the atmosphere, and can be used to describe relationships among their velocities, temperatures, pressures, and densities. The result is a set of predictions of what, for example, temperatures will be at given places at some point in the future. These predictions, together with estimates of other simplified physical processes not captured by fluid dynamics equations, make up a weather forecast.
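
To make the mechanics concrete, the sketch below steps a single simplified equation forward in time on a grid, which is the basic operation underlying NWP. It uses 1D advection of a temperature field by a constant wind as a toy stand-in for the full Navier-Stokes system; the grid spacing, wind speed, and upwind scheme are illustrative choices, not any operational configuration.

```python
# Toy NWP-style integration: advect a temperature field with a constant wind
# using an upwind finite-difference scheme on a periodic 1D grid.
import numpy as np

nx, dx = 100, 1.0e3    # 100 grid points spaced 1 km apart
u, dt = 10.0, 50.0     # wind speed (m/s) and time step (s); CFL = u*dt/dx = 0.5

x = np.arange(nx) * dx
temp = 15.0 + 5.0 * np.exp(-((x - 2.0e4) / 5.0e3) ** 2)  # initial warm anomaly

for _ in range(400):   # roughly 5.5 hours of model time
    # discrete form of dT/dt = -u * dT/dx; information travels with the wind
    temp = temp - (u * dt / dx) * (temp - np.roll(temp, 1))

print(f"forecast temperature range: {temp.min():.1f} to {temp.max():.1f} °C")
```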

This conceptually simple description obscures the massive scale of the work that goes into creating forecasts (Figure 1). Operating satellites, radar networks, and other necessary technology is expensive and requires substantial specialized expertise. Inputting observations from these disparate sources into models and getting them to work together harmoniously—no easy task—is a field of study unto itself.

Fig. 1. An enormous amount of work and expertise goes into producing weather forecasts. In brief, observations from multiple sources are combined and used to inform forecasting models, and the resulting model outputs are converted into forecasts that are communicated to the public. Artificial intelligence (AI) can be applied in many ways through this process.

Furthermore, forecast models are complicated and require some of the most powerful—not to mention expensive and energy-intensive—supercomputers in the world to function. Expert meteorologists are required to interpret model outputs, and communications teams are needed to translate those interpretations for the public.

The input-model-output structure for forecasting will be familiar to students of computer science. Indeed, the two fields have, in many ways, grown up together. The Navier-Stokes approach to weather forecasting first became truly useful when computing technology could produce results sufficiently quickly beginning in the 1950s and 1960s—after all, there is no point in having a forecast for 24 hours from now if it takes 36 hours to make!

The Rise of Machine Learning

As the power of computing hardware and software has increased, so too have the accuracy, resolution, and range of forecasting. The advent of early AI systems in the 1950s, which weather services adopted almost immediately, fed this advancement in the decades that followed. These early AIs were hierarchical systems that mimicked human decision-making through decision trees comprising a series of “if this, then that” logic rules.

The development of decision trees was followed by the emergence of machine learning (ML), a subdiscipline of AI involving training models to perform specific tasks without explicit programming. Instead of following coded instructions, these models learn from patterns in datasets to improve their performance over time. One method to achieve this improvement is to train a neural network, an algorithm said to be inspired by the human brain. Neural networks work by iteratively processing numerical representations of input data—image pixel brightnesses, temperatures, or wind speeds, for example—through multiple layers of mathematical operations to reorganize and refine the data until a meaningful output is obtained.
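
As a minimal illustration of that description, the sketch below passes a few toy weather readings through two layers of weighted sums and nonlinearities. The inputs, layer sizes, and random weights are invented for illustration; in practice, training would adjust the weights to reduce forecast error.

```python
# A tiny neural network forward pass: numerical inputs flow through layers of
# mathematical operations (weighted sums plus a nonlinearity) to an output.
import numpy as np

rng = np.random.default_rng(0)
x = np.array([21.5, 3.2, 0.65])    # toy inputs: temperature (°C), wind (m/s), humidity

w1, b1 = rng.normal(size=(8, 3)), np.zeros(8)   # layer 1: 3 inputs -> 8 hidden units
w2, b2 = rng.normal(size=(1, 8)), np.zeros(1)   # layer 2: 8 hidden units -> 1 output

hidden = np.maximum(0.0, w1 @ x + b1)  # ReLU nonlinearity reorganizes the data
output = w2 @ hidden + b2              # e.g., a predicted temperature an hour ahead
print(output)
```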

Even though experiments with ML have been ongoing within the wider scientific community since the 1970s, they initially failed to catch on as much more than a novelty in weather forecasting. AI systems at the time were limited by the computing power and relevant data available for use in ML. However, interest in AI among forecasters picked up starting in the late 1990s and grew steadily into the 2000s and 2010s as computing resources became more powerful and useful data became more widely available.

Model training methods also grew more efficient, and new ideas on how to adapt the original neural network concept created opportunities to tackle more complicated tasks. For example, 2009 saw the release of ImageNet, a huge database of labeled images that could be used to train AIs for 2D image recognition tasks, and 2010 the launch of its associated annual recognition challenge.

Machine Learning Moves into Weather Forecasting

Weather forecasting is feeling the impact of this innovation. The growth of AI in research on nowcasting—forecasts of conditions a couple of hours in advance—and short-range weather forecasting up to a day or two out helps to reveal how.

We informally surveyed studies published between 2011 and 2022 using the Web of Science database and found that most of this research focused on applying AI to studies of classical weather forecast variables: precipitation, clouds, solar irradiation, wind speed and direction, and temperature (Figure 2).

Fig. 2. The number of newly published scientific studies concerning the use of AI in nowcasting or short-range weather forecasting grew substantially from 2011 to 2022. In this plot, the studies are divided according to their focus on five variables of interest. Credit: Authors; data included herein are derived from Clarivate’s Web of Science database (©Clarivate 2024. All rights reserved.)

New publications related to these five forecast variables showed startling year-over-year growth averaging 375% over this period. This nearly fivefold annual increase was split about evenly across the variables: In 2010, the number of new publications addressing each variable was in the low single digits; by 2022, the numbers for each were in the hundreds.

Research in just a few nations drove most of this growth. Roughly half the papers published from 2011 to 2022 emerged from China (27.5%) and the United States (22.7%). India (~8%), Germany (~6.5%), and the United Kingdom and Australia (~5% each) also contributed significantly. Most, if not all, of this research output appears to be motivated by its relevance or applicability to economic sectors traditionally tied to weather forecasting, such as energy, transportation, and agriculture.

Fig. 3. The five most popular variables (left) are matched (by keyword association) to major economic sectors. Credit: Diagram created by authors using SankeyMATIC; data included herein are derived from Clarivate’s Web of Science database (©Clarivate 2024. All rights reserved.)

We determined links in the published studies by associating keywords from these sectors with the five forecast variables (Figure 3). This approach has limitations, including potential double counting of studies (e.g., because the same AI model may have multiple uses), not accounting for the relative sizes of the sectors (e.g., larger sectors like energy are naturally bigger motivators for research than smaller ones like fisheries), and not identifying proprietary research and models not released to the public. Nonetheless, the keyword associations reveal interesting trends.

For example, applications in the energy sector dominate AI forecasting research related to solar irradiance and wind. Comprehensive reviews have covered how AI technologies are being integrated into the energy industry at many stages in the supply chain. Choosing and planning sites (e.g., for solar or wind farms), management of solar and wind resources in day-to-day operations, predictive maintenance, energy demand matching in real time, and management of home and business consumers’ energy usage are all use cases in which AI is affecting the industry and driving research.

Meanwhile, applications in the agricultural sector are primarily driving research into temperature and precipitation forecasting. This trend likely reflects the wider movement in the sector toward precision agriculture, a data-driven approach intended to boost crop yields and sustainability. Large companies, such as BASF, have promoted “digital farming,” which combines data sources, including forecasts and historical weather patterns, into ML models to predict future temperatures and precipitation. Farmers can then use these predictions to streamline operations and optimize resource usage through decisions about, for example, the best time to fertilize or water crops.

The construction industry, a significant driver of temperature forecasting research using AI, relies on temperature forecasts to plan operations. Weather can substantially influence project durations by affecting start dates and the time required for tasks such as pouring concrete. Accurate forecasts can also improve planning for worker breaks on hot days and for anticipating work stoppages during hard freezes.

In the transportation and aviation sectors, public safety concerns are likely driving AI-aided forecasting research. Intelligent transportation systems rely on weather forecast data to predict and mitigate road transportation problems through diversions or road and bridge closures. Similarly, accurate weather data can power aviation models to improve safety and comfort by, for example, predicting issues such as turbulence and icing.

Evolving Architectures

The methods and structures, or architectures, used in AI-based forecasting research have changed and grown more sophisticated as the field has advanced, particularly over the past decade (Figure 4). And this trajectory toward improvement appears to be accelerating.

Fig. 4. Significant growth and change in the AI/machine learning architectures used in the scientific literature on nowcasting or short-range weather forecasting occurred between 2011 and 2022. RNN = recurrent neural network; CNN = convolutional neural network; GAN = generative adversarial network; SVM = support vector machine; ELM = extreme learning machine. Credit: Authors; data included herein are derived from Clarivate’s Web of Science database (©Clarivate 2024. All rights reserved.)

In 2015, roughly 40% of AI models in the literature for nowcasting and short-range weather forecasting were support vector machines, but by 2022, this figure declined to just 8%. Over the same period, the use of more sophisticated convolutional neural networks ballooned from 11% to 43%. Newer architectures have also emerged for forecasting, with generative adversarial networks, U-Net, and transformer models gaining popularity.

Transformers, with their powerful attention mechanisms that detect long-range dependencies among different variables (e.g., between atmospheric conditions and the formation of storms), may be on course to become the preferred architecture for weather forecasting. Transformers have been widely adopted in other domains and have become synonymous with AI in general because of their prominent use in generative AI tools like OpenAI’s ChatGPT.
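
A minimal sketch of the attention operation behind that capability follows: every element of a sequence (here, hypothetical grid patches of atmospheric data) scores its relevance to every other element, so long-range dependencies are captured in a single step. The shapes and random data are illustrative only.

```python
# Scaled dot-product attention, the core of the transformer architecture.
import numpy as np

def attention(q, k, v):
    """Compute softmax(Q K^T / sqrt(d)) V."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                    # pairwise relevance scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ v                               # weighted mix of values

rng = np.random.default_rng(1)
seq = rng.normal(size=(16, 32))   # 16 tokens (e.g., grid patches), 32 features each
out = attention(seq, seq, seq)    # self-attention: q, k, v from the same data
print(out.shape)                  # (16, 32)
```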

Some of today’s most advanced weather forecasting models make use of transformer models, such as those from NVIDIA (FourCastNet), Huawei (Pangu-Weather), and Google (GraphCast), each of which is data driven, rather than being based on NWP. These models boast levels of accuracy and spatial resolution similar to ECMWF’s traditional IFS model across several important weather variables. However, their major innovation is in the computing resources required to generate a forecast: On the basis of (albeit imperfect) comparisons, NVIDIA estimates, for example, that FourCastNet may be up to 45,000 times faster than IFS, which equates to using 12,000 times less energy.

A View of the Future

Combining high-resolution data from multiple sources will be core to the weather forecasting revolution, meaning the observational approaches used to gather these data will play a central role.

Sophisticated AI architectures are already being used to combine observations from different sources to create new products that are difficult to create using traditional, physics-based methods. For example, advanced air quality forecasting tools rely on combining measurements from satellites and monitoring stations with ground-level traffic and topography data to produce realistic representations of pollutant concentrations. AIs are also being used for data assimilation, the process of mapping observations to regularly spaced, gridded representations of the atmosphere for use in weather forecast models (which themselves can be AI driven).
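
As a heavily simplified illustration of that gridding step, the sketch below interpolates scattered toy observations onto a regular grid using plain linear interpolation. Real assimilation systems also blend in a prior model state and error statistics; the station locations and values here are invented.

```python
# Map scattered observations (e.g., buoys, balloons) onto the regular grid a
# forecast model expects. Plain interpolation stands in for full assimilation.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(2)
obs_xy = rng.uniform(0, 100, size=(200, 2))    # 200 stations at random locations (km)
obs_temp = 15 + 0.1 * obs_xy[:, 0] + rng.normal(scale=0.5, size=200)  # readings (°C)

gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
grid_temp = griddata(obs_xy, obs_temp, (gx, gy), method="linear")

print(grid_temp.shape)   # (50, 50) gridded field, ready as model input
```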

Another growing use case for AI is forecasting extreme weather. Extreme events can be challenging for AI models to predict because many models function by searching for patterns (i.e., averages) in data, meaning rarer events are inherently weighted less. Researchers have suggested that even state-of-the-art AI weather forecasts have significantly underperformed their traditional NWP counterparts in predicting extreme weather events, especially rare ones such as category 5 hurricanes. However, improvements are in the works. For example, compared with traditional methods, Microsoft’s Aurora model boasts improved accuracy for Pacific typhoon tracks and wind speeds during European storms.

Whether scientists are using fully data-driven AI or so-called hybrid systems, which combine AI and traditional atmospheric physics models, predictions of weather events and of likely outcomes of those events (e.g., fires, floods, evacuations) need to be combined reliably and transparently. One example of a hybrid system blending physics and AI elements is Google’s Flood Hub, which integrates traditional modeling with AI tools to deliver early extreme flood warnings freely in 80 countries. Such work is an important part of the United Nations’ Early Warnings for All initiative, which aims to ensure that all people have actionable access to warnings and information about natural hazards.

Observations will play a key role in facilitating the growing accuracy and efficiency of new forecasting products by providing the data needed to train AI models. Today, models are generally pre-trained before use with a structured dataset generated from data assimilation methods. These pre-trained systems could be tailored to new specialized tasks at high resolution, such as short-range forecasting for locations where conditions change rapidly, like in high mountain ranges.
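
One plausible shape for that tailoring step is sketched below under assumed names: a stand-in “pretrained” network is frozen, and only a small new head is trained on a stand-in local dataset. This is a generic fine-tuning pattern, not the workflow of any specific forecasting product.

```python
# Fine-tune a pretrained model for a specialized task: freeze the backbone,
# train a small new head on local high-resolution data. All data is synthetic.
import torch
import torch.nn as nn

backbone = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 64))  # "pretrained"
head = nn.Linear(64, 1)   # new task: e.g., short-range temperature for one valley

for p in backbone.parameters():
    p.requires_grad = False            # keep the general weather knowledge fixed

opt = torch.optim.Adam(head.parameters(), lr=1e-3)
x, y = torch.randn(32, 64), torch.randn(32, 1)   # stand-in local training batch
for _ in range(100):
    loss = nn.functional.mse_loss(head(backbone(x)), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```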

Satellites, such as the recently launched Geostationary Operational Environmental Satellite 19 (GOES-19) and NOAA-21 missions, have been an increasingly critical source of data for training AI. These data will soon be supplemented with even higher-resolution observations from next-generation satellite instruments such as the European Organisation for the Exploitation of Meteorological Satellites’ (EUMETSAT) recently launched Meteosat Third Generation (MTG) and EUMETSAT Polar System – Second Generation (EPS-SG) programs. NOAA’s planned Geostationary Extended Observations (GeoXO) and Near Earth Orbit Network (NEON) programs will further boost both traditional and AI modeling.

Looking farther ahead, some experiments have attempted to fully replace traditional data assimilation systems, moving directly from observations to gridded forecast model inputs. A natural end point could be a fully automated, end-to-end weather forecast system, potentially with multiple models working together in sequence. Such a system would process observations into inputs for forecast models, then run those models and process forecast outputs into useful products.
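
Such a system might be organized as a simple chain of stages, as in the schematic below. Every function here is a hypothetical placeholder standing in for a full model, not a real system’s API.

```python
# Schematic end-to-end forecast chain: observations -> gridded inputs ->
# forecast fields -> public products. Each stage could itself be an AI model.

def assimilate(obs):        # placeholder: gridding plus blending with a prior state
    return sum(obs) / len(obs)

def forecast_model(state):  # placeholder: advance the atmospheric state 24 hours
    return state + 1.0

def make_products(fields):  # placeholder: convert raw fields to a public product
    return f"Tomorrow's temperature: about {fields:.0f} °C"

def end_to_end_forecast(raw_observations):
    state = assimilate(raw_observations)
    fields = forecast_model(state)
    return make_products(fields)

print(end_to_end_forecast([14.0, 15.5, 15.1]))
```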

The effects of the AI revolution are beginning to be felt across society, including in key sectors of the economy such as energy, agriculture, and transportation. For weather forecasting, AI technology has the potential to streamline observational data processing, use computational resources more efficiently, improve forecast accuracy and range, and even create entirely new products. Ultimately, current technologies and coming innovations may save money and help better protect lives by seamlessly delivering faster and more useful predictions of future conditions.

Author Information

Justin Shenolikar (justin.shenolikar@iup.uni-heidelberg.de), European Organisation for the Exploitation of Meteorological Satellites, Darmstadt, Germany; now at Universität Heidelberg, Germany; and Paolo Ruti and Chris Yoon Sang Chung, European Organisation for the Exploitation of Meteorological Satellites, Darmstadt, Germany

Citation: Shenolikar, J., P. Ruti, and C. Y. S. Chung (2025), The AI revolution in weather forecasting is here, Eos, 106, https://doi.org/10.1029/2025EO250363. Published on 3 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Charged particle transport across magnetic field driven by magnetosonic solitons in plasma

Physical Review E (Plasma physics) - Fri, 10/03/2025 - 10:00

Author(s): N. V. Gerasimenko, F. M. Trukhachev, M. M. Vasiliev, and O. F. Petrov

This work investigates the unidirectional transport of charged particles across magnetic field lines driven by magnetosonic solitons in a collisionless plasma. Within the framework of a magnetohydrodynamic model, numerical solutions as well as analytical expressions for small-amplitude solitons are …


[Phys. Rev. E 112, 045201] Published Fri Oct 03, 2025

The evolution of the Matai’an landslide dam

EOS - Fri, 10/03/2025 - 07:10

Some excellent before and after imagery is now available showing the evolution of the Matai’an landslide dam.

The active GIS/spatial analysis community in Taiwan has produced some fascinating analysis of the Matai’an landslide. Much of this has been posted to Facebook (which is not my favourite platform, but sometimes you have to go where the information resides).

Tony Lee has produced an incredibly interesting comparison of the dam before and after the overtopping and breach event, based upon imagery captured before the event on 18 August and after the event on 25 September. Unfortunately, WordPress really doesn’t like Facebook embeds, so you’ll need to follow this link:

Tony Lee Facebook post

This is a still from the video:-

Before and after images of the Matai’an landslide dam. Video by Tony Lee, posted to Facebook.

The depth and scale of the incision are very clear – the flow clearly cut into and rapidly eroded the debris. It has left very steep slopes on both sides in weak and poorly consolidated materials.

So, a very interesting question will now pertain to the stability of these slopes. How will they perform in conditions of intense rainfall and/or earthquake shaking? Is there the potential for a substantial slope failure on either side, allowing a new (enlarged) lake to form?

This will need active monitoring (InSAR may well be ideal). The potential problems associated with the Matai’an landslide are most certainly not over yet.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0

Estimation of the depth of magnetisation from 3D elastic net sparse inversion model of geomagnetic data

Geophysical Journal International - Fri, 10/03/2025 - 00:00
Summary: Inversion of geomagnetic anomaly data poses an ill-posed problem, and extremal models such as equivalent source layers or point-source distributions can explain observations to the same degree as volumetric magnetisation distributions. However, the spectral characteristics of magnetic anomalies provide fundamental constraints for magnetic source-depth estimation. Specifically, the maximum detectable depth of crustal magnetic sources is dictated by the longest wavelengths present in the field, which correspond to the low-wavenumber bands of the spectrum. This relationship is often analysed through the log power spectrum versus wavenumber plot, using the slopes of the linear segments for depth estimation. Methods aiming at reconstructing the depth to the bottom of magnetisation from spectral field characteristics are commonly referred to as spectral methods. However, these methods are based on assumptions about the statistical properties of the source distribution and are prone to misinterpretations. Here, we apply sparsity-constrained 3D inversion of magnetic data using an elastic net regularisation to recover the susceptibility distribution and the bottom of magnetisation. We claim that the elastic net (combined ℓ2 and ℓ1 norm) regularisation, when properly tuned to balance the solution’s smoothness with sparsity, stabilises the inversion, avoiding extremal magnetisation distributions and generating a geologically plausible source depth distribution that is consistent with the expected source distribution. The ℓ1 norm brings sparsity and high resolution, while the ℓ2 norm brings inversion stability and structural continuity to the final model. From the recovered 3D elastic net sparse inversion model, we extract the depths of all the deepest non-zero susceptibility values and suggest this to be an alternative estimate of the base of magnetisation. Moreover, we suggest that the resulting 3D model has value in itself and may aid geological interpretation.
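
As a toy illustration of the elastic net regularisation the summary describes (not the authors’ 3D magnetic inversion code), the sketch below fits a sparse model to a synthetic linear problem G m = d using scikit-learn, with l1_ratio balancing the sparsity-promoting ℓ1 term against the stabilising ℓ2 term.

```python
# Elastic net inversion of a toy underdetermined linear problem: the l1 part
# promotes sparse, high-resolution sources; the l2 part stabilises the fit.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(3)
G = rng.normal(size=(80, 200))           # forward operator: 80 data, 200 model cells
m_true = np.zeros(200)
m_true[[20, 95, 150]] = [1.0, -0.7, 0.5]  # a few non-zero "sources"
d = G @ m_true + rng.normal(scale=0.01, size=80)   # noisy observations

# alpha sets overall penalty strength; l1_ratio balances sparsity vs. smoothness
inv = ElasticNet(alpha=0.01, l1_ratio=0.7, max_iter=10000).fit(G, d)
print(np.flatnonzero(np.abs(inv.coef_) > 0.1))     # recovered non-zero model cells
```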

Tracing geomagnetic field strength in South America south of 30°S: new archaeomagnetic data from well-dated pottery (San Juan, Argentina)

Geophysical Journal International - Fri, 10/03/2025 - 00:00
Summary: Gaining insight into the centennial evolution of the geomagnetic field over the past 2000 years requires the acquisition of reliable palaeomagnetic data from the study of well-dated archaeological materials or rocks. However, despite previous efforts, palaeointensity data from regions south of 30°S are still underrepresented, potentially limiting the accuracy of global geomagnetic field models and their applications. In addition, a comprehensive understanding of the geomagnetic field evolution in South America is particularly relevant, as the recent geomagnetic secular variation has been mainly characterised by the significant growth of the South Atlantic Anomaly over the past three centuries. The evolution of this low-intensity region, currently centered over central South America, is understood in detail only for the last few centuries, thanks to the availability of direct measurements. For both the geomagnetic and palaeomagnetic communities, understanding its evolution prior to this period remains a challenge. This study presents new palaeointensity estimates from San Juan Province, central western Argentina, based on the analysis of 23 pottery samples dated between the 3rd and 17th centuries CE using radiocarbon and archaeological constraints. We employed the Thellier-Thellier method, incorporating partial thermoremanent magnetisation (pTRM) checks, TRM anisotropy corrections, and cooling rate adjustments, and obtained 11 mean palaeointensity values of good technical quality for central South America. The results are consistent with the limited number of previously reported high-quality palaeointensity data within an area 900 km in radius centered on San Juan, all showing intensity values ranging from approximately 40 to 55 μT. The new data, combined with these previously published high-quality intensities, do not show anomalously low intensity values in the region between 200 and 1750 CE, suggesting no significant impact of the South Atlantic Anomaly in the region before the past three centuries. Furthermore, the findings suggest the presence of rapid multidecadal variations between 800 and 1100 CE, a behaviour also observed in other regions worldwide, which may point to a global or dipolar origin for these variations. By enhancing the dataset for this latitude range, this work provides new constraints on the geomagnetic field’s past behaviour south of 30°S over South America and contributes to improving future global geomagnetic reconstructions.

Advanced GPR Signal Reconstruction Using a Hybrid Approach of Reverse Time Migration and Projection Onto Convex Sets

Geophysical Journal International - Fri, 10/03/2025 - 00:00
Summary: The absence of traces tends to reduce the quality and reliability of Ground Penetrating Radar (GPR) data due to equipment, sensor coverage, and acquisition limitations. This is a significant limitation for advanced imaging techniques such as Full Waveform Inversion (FWI) and Reverse Time Migration (RTM), which rely on dense and continuous data. To address this challenge, we propose an effective interpolation method using the Projection onto Convex Sets (POCS) algorithm, originally developed for seismic data reconstruction. The algorithm is formulated in a compressed sensing framework, taking advantage of Fourier sparsity and iterative thresholding in the time domain to iteratively update spectral coefficients during reconstruction. We compare its performance on synthetic and real GPR data with various percentages of missing data. Results indicate that the POCS algorithm, in addition to reconstructing missing traces with high precision, significantly improves the structural resolution of subsequent RTM imaging. We also compare POCS with conventional kriging and a deep learning-based interpolation model (DL-Net) to benchmark its performance. The proposed method achieves superior reconstruction quality and stability, particularly under high-sparsity conditions. This study highlights the practical potential of POCS in enhancing GPR image fidelity and interpretation under real-world acquisition limitations.
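
The POCS loop the summary describes can be sketched compactly: alternate between a sparsity projection (thresholding in the Fourier domain) and a data-consistency projection (reinserting the observed samples). The 1D synthetic signal, 50% sampling, and fixed threshold below are illustrative simplifications of the 2D GPR setting.

```python
# Minimal POCS reconstruction: iteratively enforce Fourier-domain sparsity
# while keeping the observed samples fixed.
import numpy as np

rng = np.random.default_rng(4)
n = 256
t = np.arange(n)
signal = np.sin(2 * np.pi * 0.03 * t) + 0.5 * np.sin(2 * np.pi * 0.11 * t)
mask = rng.uniform(size=n) > 0.5      # 50% of samples ("traces") missing
observed = signal * mask

x = observed.copy()
for _ in range(100):
    spec = np.fft.fft(x)
    thresh = np.quantile(np.abs(spec), 0.90)   # keep only the strongest 10%
    spec[np.abs(spec) < thresh] = 0.0          # sparsity projection
    x = np.real(np.fft.ifft(spec))
    x[mask] = observed[mask]                   # data-consistency projection

err = np.linalg.norm(x - signal) / np.linalg.norm(signal)
print(f"relative reconstruction error: {err:.3f}")
```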

Study shows the world is far more ablaze now with damaging fires than in the 1980s

Phys.org: Earth science - Thu, 10/02/2025 - 19:16
Earth's nastiest and costliest wildfires are blazing four times more often now than they did in the 1980s because of human-caused climate change and people moving closer to wildlands, a new study found.

When it comes to storing carbon, the Arctic presents a winter surprise

Phys.org: Earth science - Thu, 10/02/2025 - 18:42
The ocean holds gigantic amounts of carbon, much more than all land-based plants and soil. Scientists previously studied these carbon stocks in spring and summer. Now, in two published studies, they have looked at what happens in winter.

Ancient plankton hint at steadier future for ocean life

Phys.org: Earth science - Thu, 10/02/2025 - 18:00
A team of scientists has uncovered a rare isotope in microscopic fossils, offering fresh evidence that ocean ecosystems may be more resilient than once feared.

Unexpected region of the Amazon is experiencing 'alarming' rapid growth in climate extremes

Phys.org: Earth science - Thu, 10/02/2025 - 17:39
An unexpected region of the Amazon is at the forefront of rapid growth in climate extremes, a new report reveals. The central north Amazon, a region with extensive areas of high forest cover, natural savannas and vast Indigenous territories, was not previously considered as being the most affected by climate change.

Microbes trapped in permafrost awake after thousands of years

Phys.org: Earth science - Thu, 10/02/2025 - 17:30
In a new study, a team of geologists and biologists led by CU Boulder resurrected ancient microbes that had been trapped in ice—in some cases for around 40,000 years.

A kinky twist: Some rock folds may strengthen Earth's crust, not weaken it

Phys.org: Earth science - Thu, 10/02/2025 - 14:50
A first thought when describing a rock formation likely isn't a mille-feuille, but there are actually certain types composed of many thin layers that bring the flaky pastry to mind. Not only that—but these rocks can quite literally fold under pressure. These formations have the interesting ability to fold under compressive forces and form sharply localized bends known as kink bands.

Volcanic ash may enhance phytoplankton growth in the ocean over 100 km away

Phys.org: Earth science - Thu, 10/02/2025 - 14:30
A research group in Japan has suggested that ash released from volcanic eruptions on Nishinoshima Island—part of Japan's Ogasawara Islands—led to a temporary surge in phytoplankton levels in the seawater around Mukojima Island, which is located 130 km northeast of Nishinoshima and is also part of the Ogasawara Islands.

Old Forests in the Tropics Are Getting Younger and Losing Carbon

EOS - Thu, 10/02/2025 - 13:10

The towering trees of old forests store massive amounts of carbon in their trunks, branches, and leaves. When these ancient giants are replaced by a younger cohort after logging, wildfire, or other disturbances, much of this carbon stock is lost.

“We’ve known for a long time that forest age is a key component of the carbon cycle,” said Simon Besnard, a remote sensing expert at the GFZ Helmholtz Centre for Geosciences in Potsdam, Germany. “We wanted to actually quantify what it means if an old forest becomes young.”

The resulting study, published in Nature Ecology and Evolution, measured the regional net aging of forests around the world across all age classes between 2010 and 2020, as well as the impact of these changes on aboveground carbon.

To do this, the team developed a new high-resolution global forest age dataset based on more than 40,000 forest inventory plots, biomass and height measurements, remote sensing observations, and climate data. They combined this information with biomass data from the European Space Agency and atmospheric carbon dioxide observations.

The results point to large regional differences. While forests in Europe, North America, and China have aged during this time, those in the Amazon, Southeast Asia, and the Congo Basin were younger in 2020 than 10 years prior.

A number of recent studies have shown that forests are getting younger, but the new analysis quantifies the impact of this shift on a global level, said Robin Chazdon, a tropical forest ecologist at the University of the Sunshine Coast in Queensland, Australia, who was not involved in the study. “That’s noteworthy and a very important concept to grasp because this has global implications, and it points out where in the world these trends are strongest.”

Carbon Impact

The study identifies the tropics, home to some of the world’s oldest forests, as a key region where younger forests are replacing older ones.

In this image from 2020, old-growth forests are most evident in tropical areas in South America, Africa, and Southeast Asia. Credit: Besnard et al., 2021, https://doi.org/10.5194/essd-13-4881-2021, CC BY 4.0

On average, forests that are at least 200 years old store 77.8 tons of carbon per hectare, compared to 23.8 tons per hectare in the case of forests younger than 20 years old.

The implications for carbon sequestration are more nuanced, however. Fast-growing young forests, for instance, can absorb carbon much more quickly than old ones, especially in the tropics, where the difference is 20-fold. But even this rate of sequestration is not enough to replace the old forests’ carbon stock.

Ultimately, said Besnard, “when it comes to a forest as a carbon sink, the stock is more important than the sink factor.”

In the study, only 1% of the total forest area transitioned from old to young, primarily in tropical regions. This tiny percentage, however, accounted for more than a third of the lost aboveground carbon documented in the research—approximately 140 million of the total 380 million tons.

“It’s usually more cost-, carbon-, and biodiversity-effective to keep the forest standing than it is to try to regrow it after the fact. I think this paper shows that well,” said Susan Cook-Patton, a reforestation scientist at the Nature Conservancy in Arlington, Va., who was not involved in the study. “But we do need to draw additional carbon from the atmosphere, and putting trees back in the landscape represents one of the most cost-effective carbon removal solutions we have.”

The increased resolution and details provided by the study can help experts better understand how to manage forests effectively as climate solutions, she said. “But forest-based solutions are not a substitute for fossil fuel emissions reductions.”

Open Questions

When the carbon stored in trees is released into the atmosphere depends on what happens after the trees are removed from the forest. The carbon can be stored in wooden products for a long time or released gradually through decomposition. Burning, whether in a forest fire, through slash-and-burn farming, or as fuel, releases the carbon almost instantly.

“I think there is a research gap here: What is the fate of the biomass being removed?” asked Besnard, pointing out that these effects have not yet been quantified on a global scale.

Differentiating between natural, managed, and planted forests, which this study lumps together, would also offer more clarity, said Chazdon: “That all forests are being put in this basket makes it a little bit more challenging to understand the consequences not only for carbon but for biodiversity.”

She would also like to see future research on forest age transitions focus on issues beyond carbon: “Biodiversity issues are really paramount, and it’s not as easy to numerically display the consequences of that as it is for carbon.”

“We are only looking at one metric, which is carbon, but a forest is more than that. It’s biodiversity, it’s water, it’s community, it’s many things,” agreed Besnard.

—Kaja Šeruga, Science Writer

Citation: Šeruga, K. (2025), Old forests in the tropics are getting younger and losing carbon, Eos, 106, https://doi.org/10.1029/2025EO250369. Published on 2 October 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Machine Learning Simulates a Millennium of Climate

EOS - Thu, 10/02/2025 - 13:10
Source: AGU Advances

This is an authorized translation of an Eos article.

In recent years, scientists have found that machine learning based weather models can make predictions faster than traditional models while consuming less energy. However, many of these models cannot accurately predict the weather more than 15 days ahead, and by day 60 they begin to simulate unrealistic weather.

The Deep Learning Earth System Model (DLESyM) is built on two neural networks running in parallel: one simulates the ocean, and the other simulates the atmosphere. During a model run, predictions of ocean conditions are updated every 4 model days. Because atmospheric conditions evolve more quickly, predictions of the atmosphere are updated every 12 model hours.
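
The update cadence described above can be sketched as a simple coupled loop (an illustrative pseudo-structure, not the DLESyM code): the atmosphere component steps every 12 model hours, and the ocean component steps every 4 model days (96 hours), with stand-in update functions to make the loop runnable.

```python
# Schematic two-component coupling: fast atmosphere updates, slow ocean updates.
def run_coupled(atmos_step, ocean_step, atmos_state, ocean_state, n_hours):
    for hour in range(0, n_hours, 12):        # atmosphere: every 12 model hours
        atmos_state = atmos_step(atmos_state, ocean_state)
        if hour % 96 == 0:                    # ocean: every 4 model days
            ocean_state = ocean_step(ocean_state, atmos_state)
    return atmos_state, ocean_state

# stand-in update rules, just to make the loop executable
final = run_coupled(lambda a, o: a + 0.1 * o,
                    lambda o, a: 0.99 * o + 0.01 * a,
                    1.0, 1.0, n_hours=24 * 365)
print(final)
```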

The model's creators, Cresswell-Clay et al., found that DLESyM closely matches past observed climate and makes accurate short-term forecasts. Taking Earth's current climate as a baseline, it can also accurately simulate climate and interannual variability over a 1,000-year period in less than 12 hours of computing time. Its performance is generally comparable to, or better than, that of models based on the Coupled Model Intercomparison Project Phase 6 (CMIP6), which are currently widely used in computational climate research.

DLESyM outperforms CMIP6 models in simulating tropical cyclones and the Indian summer monsoon. It captures the frequency and spatial distribution of Northern Hemisphere atmospheric "blocking" events, which can lead to extreme weather, at least as accurately as CMIP6 models. The storms it generates are also very realistic: For example, a nor'easter generated at the end of the 1,000-year simulation (in the year 3016) had a structure very similar to that of a nor'easter observed in 2018.

However, neither the new model nor the CMIP6 models describe the climatology of Atlantic hurricanes well. In addition, DLESyM is less accurate than other machine learning models for medium-range forecasts (those roughly 15 days out). Most importantly, the DLESyM model simulates only the current climate, meaning it does not account for human-caused climate change.

The authors argue that DLESyM's main advantage is that it is far less computationally expensive to run than CMIP6 models, making it easier to use than traditional models. (AGU Advances, https://doi.org/10.1029/2025AV001706, 2025)

—Madeline Reinsel, Science Writer

This translation was made by Wiley.

Read this article on WeChat.

Text © 2025. AGU. CC BY-NC-ND 3.0

As California glaciers disappear, people will see ice-free peaks exposed for the first time in millennia

Phys.org: Earth science - Thu, 10/02/2025 - 09:32
For as long as there have been people in what is now California, the granite peaks of the Sierra Nevada have held masses of ice, according to new research that shows the glaciers have probably existed since the last Ice Age more than 11,000 years ago.

The aftermath of the Matai’an landslide and dam breach in Taiwan

EOS - Thu, 10/02/2025 - 07:54

Good digital data is now being published that presents the scale of landscape change that occurred as a result of the Matai’an landslide hazard cascade. There is also interesting information about the root causes of the vulnerability of the town of Guangfu, where the fatalities occurred.

Some interesting information is now emerging about the Matai’an landslide and dam breach, much of it published in Taiwan in Mandarin. A very interesting post has appeared on the website of the Aerial Survey and Remote Sensing Branch that uses aerial imagery before and after the hazard cascade to analyse terrain changes. It is based upon this figure that they have published:-

Vertical elevation change before and after the Matai’an landslide and dam breach. Published by ASRS in Taiwan.

This uses LIDAR data from before and after the sequence of events, which have been turned into one-metre digital elevation models that have then been digitally compared. Note that this gives vertical change.
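
As a minimal sketch of this DEM-differencing method, assuming two co-registered one-metre DEMs already loaded as arrays (real workflows read GeoTIFFs and align the grids first):

```python
# Difference two aligned digital elevation models to map vertical change:
# negative values indicate elevation loss (source areas), positive values
# indicate accumulation (deposited debris). All data here is synthetic.
import numpy as np

rng = np.random.default_rng(5)
dem_before = 500.0 + rng.normal(scale=2.0, size=(1000, 1000))  # stand-in terrain (m)
dem_after = dem_before.copy()
dem_after[200:400, 300:500] -= 300.0   # hypothetical source-area elevation loss

change = dem_after - dem_before
print(f"max loss: {change.min():.0f} m, max gain: {change.max():.0f} m")
```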

In the source area of the landslide, where the topography is extremely steep, there is over 300 metres of elevation reduction. Downslope, in the area of the dam and lake, the elevation change is over 200 m of accumulation – this is the landslide debris, which will now be mobilised in successive rainstorm events. In the main channel, the river bed has aggraded (increased in elevation) by over ten metres, although the analysis shows that at point C this was 52 metres! This is going to cause very substantial issues in the future unless a large-scale mitigation exercise is undertaken.

The cross-section through the landslide is fascinating:-

A cross-section showing vertical elevation change before and after the Matai’an landslide and dam breach. Published by ASRS in Taiwan.

This shows extremely well the rupture surface of the failure, which clearly had a rotational element, and the infilling of the bedrock topography by the landslide debris. Meanwhile, there is a good helicopter video on Facebook that shows the aftermath of the dam breach.

On a different matter, there is a huge amount of discussion in Taiwan as to why so little effort was made to mitigate the hazard associated with a breach of the Matai'an landslide dam. Writing in the Taipei Times, Michael Turton has a great article exploring the socio-political reasons why this disaster played out as it did. The bottom line is that Guangfu was built on a floodplain – a problem in so many places, but particularly acute in the almost uniquely dynamic physical geography of Taiwan. Levees were built to protect the town, which caused the river to aggrade even before the dam breach event. And thus, the scene was set.

Hazards can be natural, disasters are not.

Return to The Landslide Blog homepage Text © 2023. The authors. CC BY-NC-ND 3.0

Antarctic sea ice emerges as key predictor of accelerated ocean warming

Phys.org: Earth science - Thu, 10/02/2025 - 06:00
A study published today in Earth System Dynamics identifies a critical and previously underestimated connection between Antarctic sea ice, cloud cover, and global warming. This research is important because it shows that a greater extent of Antarctic sea ice today, compared with climate model predictions, means we can expect more significant global warming in the coming decades.

Satellite inspection flying using a Lorentz spacecraft

Publication date: Available online 24 September 2025

Source: Advances in Space Research

Author(s): M.A. Klyushin, A.A. Tikhonov
