Feed aggregator

Rigid-flexible-thermal coupling dynamic modeling and attitude control of large spaceborne deployable parabolic truss antenna

Publication date: 15 February 2026

Source: Advances in Space Research, Volume 77, Issue 4

Author(s): Shijie Zhou, Zhen Yang, Kaicheng Zhang, Xiang Liu, Guoping Cai

Thunderstorms conjure ghostly coronae in treetops, observed outdoors for the first time

Phys.org: Earth science - Tue, 02/24/2026 - 17:00
For the first time, researchers have observed and measured weak electrical discharges, known as coronae, on trees during thunderstorms. A new study describes the near-invisible sparkles appearing similarly on branches of several tree species up and down the U.S. East Coast during the summer of 2024, implying that thunderstorms may paint entire canopies with a scintillating blue glow, albeit too faintly for human eyes to see.

How to Accelerate Advances in Ecological Forecasting

EOS - Tue, 02/24/2026 - 13:59

Just as meteorologists routinely predict temperature changes, storm trajectories, and other weather patterns, ecologists also forecast how ecosystems and environmental conditions can change in the near future. These ecological forecasts are rooted in scientific understanding of how natural systems behave and react, providing predictions of the future of ecosystems along with information about associated uncertainties.

Ecological forecasts offer tangible, practical insights. For example, they can estimate grass availability and quality for livestock and predict red tides along a coastline. They can support decisionmaking across society, guiding strategies for managing farms, forests, and fisheries, as well as for monitoring invasive or endangered species, assessing water quality, and implementing nature-based climate solutions. These forecasts can also influence everyday choices, such as when to take allergy medication during pollen season, whether to avoid the beach because of harmful algal blooms, and whether to reconsider a move to an area at risk of wildfires.

Demand for ecological forecasts is growing as more decisionmakers and natural resource managers recognize the importance of ecosystem services such as carbon storage, pollination, natural hazard mitigation, cultural benefits, and the provisioning of water, food, and other natural resources. Critically, these forecasts—produced by a community of researchers and practitioners across academia, government agencies, and industry—are increasingly vital today as we face rapid environmental changes and catastrophic biodiversity losses.

Iteratively developing forecasting models improves their predictive capabilities and scientific understanding of the systems they’re modeling. Weather forecasting models, for example, have seen tremendous improvements in accuracy and reliability over the past few decades, largely because meteorologists use them to generate and test hypotheses about atmospheric dynamics multiple times a day across millions of locations.

By comparison, ecological forecasting capabilities remain underdeveloped, partly because it is a much younger field that has received less sustained focus. Ecological forecasts also encompass a greater variety of processes and timescales. For example, some researchers model coupled physical, biogeochemical, and ecological processes across large regions to forecast forest productivity decades into the future, while others must incorporate highly localized weather conditions to predict stream dissolved oxygen levels just a day ahead.

U.S. Geological Survey scientist Jenny Briggs measures the trunk of a tree killed by mountain pine beetles. Such measurements inform ecological forecasting, which can help foresters to predict and respond to future insect outbreaks. Credit: U.S. Geological Survey, Public Domain

These complexities have contributed to the lack of a unified or standardized system for ecological forecasting. As a result, various organizations, such as federal and state agencies, industry groups, and academic institutions, have independently developed their own boutique forecasting systems.

Some diversity in approaches is essential for innovation, especially in an evolving and multidisciplinary field. But the absence of a unified system, shared infrastructure, and scalable practices often creates unnecessary duplication and inefficiencies that can hamper the scientific community’s ability to generate critical ecological predictions reliably. It may also limit our ability to deepen understanding of the environment. In brief, the current state of ecological forecasting often falls short of meeting societal needs.

Plenty of Data, but Barriers to Forecasting Remain

During a series of meetings held from 2020 to 2022 and organized by the Ecological Forecasting Initiative (EFI), representatives from U.S. federal agencies concluded that the primary bottlenecks to providing actionable ecological forecasts do not stem from technical or scientific shortcomings of current ecological models or from data availability. Instead, the challenges lie in generating routine forecasts efficiently and in effectively communicating them to end users.

A primary barrier to efficient ecological forecast generation is the limited interoperability among forecasting systems [Geller et al., 2022]. Different systems use different data and metadata formats, modeling approaches, and workflow structures. Such diversity is not unique to forecasting, but the requirements of operationalizing a model, such as real-time data access, fault-tolerant workflows, and translating output to decision-relevant metrics, amplify the difficulties posed by noninteroperable systems.

The lack of standardization slows—and in many cases prevents—the development of robust, scalable forecasts. It also limits their reuse across platforms, reducing their overall effectiveness. Adopting shared tools and standards across the ecological forecasting community would signal that the field is maturing, helping to build trust and encourage adoption by decisionmakers.

A second major barrier to efficiency is redundancy among different ecological forecasting efforts. Many agencies and institutions tackle similar forecasting problems using different tools and workflows, often without coordination. This duplication of effort wastes valuable time, labor, and computational power, and the absence of shared infrastructure and protocols leads teams to re-create processes and datasets instead of building on existing efforts. For example, organizations and research groups often maintain their own in-house workflows for downloading gridded weather forecasts, converting these data to more user-friendly formats, and ingesting them into their forecasting models and tools.

Shifting away from boutique approaches to reusable, community-developed workflows could substantially improve interoperability and reduce redundancy in ecological forecasting. Using shared tools, developed and improved by many contributors, can also lower the time, effort, and cost needed to launch new forecasts. Maintaining workflows based on these tools is often more affordable, easier to manage, and less prone to errors than sustaining separate, individually built systems [Fer et al., 2021]. This collaborative approach also fosters innovation, as improved tools and techniques are adopted by a broad community of users rather than remaining confined to specialized projects that could not justify the investment to develop such tools on their own.

Inefficiencies and the lack of interoperability in ecological forecasting often arise because many researchers work in isolation, limited by technological and institutional siloing. These silos restrict the exchange of knowledge, data, and tools. Without effective collaboration, the ecological forecasting community may miss valuable opportunities to combine the diverse expertise and resources found in academia, government, and industry.

This disconnection leads to fragmented knowledge bases and isolated advancements, making it difficult to develop cohesive and integrated approaches to ecological forecasting. By working together to improve the technical foundations, or cyberinfrastructure, of ecological forecasting, we could substantially enhance our ability to anticipate changes in ecosystems and support improved decisionmaking.

Learning from Success Stories

Examples of how shared cyberinfrastructure can enhance predictions about ecosystems come from both within and outside the ecological forecasting community. For instance, decades of sustained funding and incremental improvements for weather forecasting infrastructure, led by agencies such as NOAA’s National Weather Service, have enabled scalable, robust systems that transform vast amounts of data into reliable and actionable forecasts. These forecasts support decisionmaking across government, industry, and the public, informing choices related to safety, planning, resource management, and more.

A notable example of shared cyberinfrastructure advancing ecological science is the National Ecological Observatory Network’s (NEON) Ecological Forecasting Challenge [Thomas et al., 2023; Thomas and Boettiger, 2025]. This initiative welcomed forecasting experts and students to use large-scale environmental data from NEON and forecasting models to predict ecological changes at 81 sites across the United States.

Since the challenge launched in 2021, more than 82 million forecasts have been processed by the shared cyberinfrastructure, enabling synthesis of forecast skill across dozens of models and ecosystems. For example, air temperature emerged as a crucial predictor in lake water temperature and dissolved oxygen forecasts [Olsson et al., 2025], and forecasts of spring leaf out in deciduous forests proved harder to make where green-up occurred faster [Wheeler et al., 2024].

A migratory barn swallow (Hirundo rustica) rests on a branch in Seedskadee National Wildlife Refuge, in Wyoming. By combining traditional bird banding surveys with radar technology and machine learning, researchers can now forecast bird migrations more accurately (e.g., with BirdCast). These forecasts benefit bird conservation efforts and help enhance public safety during migration seasons. Credit: Tom Koerner/U.S. Fish and Wildlife Service, Public Domain

Numerous other examples demonstrate the value of cyberinfrastructure for ecological forecasting, as well as related services and decisionmaking [e.g., White et al., 2019; Zwart et al., 2023]. However, many of these initiatives have been one-off projects that lack sustainability or broad applicability. To reduce the community’s reliance on specialized cyberinfrastructure and methods and to ensure interoperability across systems, it is crucial that the ecological forecasting community develop and adopt standards and protocols for data management, model inputs and outputs, and workflows [Dietze et al., 2023; Geller et al., 2022]. Establishing these conventions will enhance data consistency and efficient data analysis, facilitate dissemination of forecasted data, and support creation of shared, reusable tools.

Overcoming Obstacles to Build Forecasting Infrastructure

During a 2024 EFI workshop focused on synthesizing best practices for cyberinfrastructure, participants agreed on key design principles that should be adopted, such as common metadata standards, the use of open-source technologies, and modular and scalable architecture. However, they also recognized that establishing infrastructure that adheres to these best practices faces obstacles and institutional challenges, including technical complexity, organizational silos and resource constraints, and a lack of centralized leadership.

The technical skills required to develop ecological forecasts, such as in software development, cloud architecture, and data management, can present a steep learning curve for ecologists. To bridge this skills gap, the ecological forecasting community could adopt mentoring programs in which ecologists collaborate with cyberinfrastructure and open-source technology experts to build skills needed for automated forecast systems. Integrating software development and cloud technologies into higher education curricula would introduce these concepts early in ecological training. And embedding dedicated software engineers within forecasting teams—rather than expecting domain scientists to develop technical expertise alongside their core responsibilities—would distribute the technical workload needed for creating forecast systems.

Institutional culture and siloed structures often incentivize short-term, competitive research focused on novel science, rather than development of stable, iterative, and reusable forecasting approaches. In addition, differing missions and policies among agencies and between agencies, industry, and academic institutions can unintentionally hinder collaboration.

Overcoming these barriers could involve building broad, transdisciplinary communities of practice that bring together ecologists, modelers, information technology professionals, and decisionmakers. Such communities can foster collaboration, align incentives, and promote the adoption of best practices for ecological forecasting. Grassroots efforts like the EFI and more formal structures such as the Interagency Council for Advancing Meteorological Services offer complementary models for this kind of engagement.

By connecting individuals with complementary expertise, these communities can facilitate knowledge exchange, establish shared standards, advocate for cyberinfrastructure investment, and codevelop robust forecasting tools that address real-world ecological challenges. In addition, the success of shared cyberinfrastructure ultimately relies on leaders within agencies, industry, and academia championing these efforts—leaders whom grassroots communities can help identify and support. Such leaders can emerge at any level of an organization, from graduate students to professors and from technicians to directors.

A strong community and clear leadership are especially important now, as the systems supporting ecological forecasting are rapidly transitioning to cloud computing, which offers both opportunities and challenges. Cloud platforms offer unprecedented scalability, enabling high-resolution models, real-time data assimilation, and automated forecast pipelines. Cyberinfrastructure design principles, such as modularity, align well with cloud-based architecture because modular designs allow components to scale independently based on demand, isolate failures to prevent system-wide crashes, and promote reusability across different cloud-based projects.

However, as organizations deepen their reliance on commercial cloud services, they may face higher costs and increased dependence on vendors. To mitigate these risks, institutions could collaborate on shared strategies that balance the benefits of cloud-native tools with the stability and autonomy of maintaining selected on-premises resources, particularly for predictable, long-running workloads that are more cost-efficient to host locally.

The progress seen in weather forecasting demonstrates what becomes possible when scientific communities invest in shared infrastructure, open standards, and sustained collaboration. For example, the average 3-day hurricane track error decreased from about 220 miles (354 kilometers) in 2000 to roughly 70 miles (113 kilometers) today, a testament to the power of improved models, data systems, and coordinated expertise [Ritchie, 2024].

Ecological forecasting could similarly see transformative gains, but success hinges on establishing a unified, community-driven framework of best practices to overcome barriers and develop a robust shared cyberinfrastructure. Ultimately, this collective effort will enhance the reliability and impact of ecological forecasts, empowering decisionmakers to better manage natural resources, anticipate environmental change, and safeguard public well-being.

Acknowledgments

We thank David Watkins for a helpful review of an earlier version of the manuscript. Any use of trade, firm, or product names is for descriptive purposes only and does not imply endorsement by the U.S. government.

References

Dietze, M. C., et al. (2023), A community convention for ecological forecasting: Output files and metadata version 1.0, Ecosphere, 14(11), e4686, https://doi.org/10.1002/ecs2.4686.

Fer, I., et al. (2021), Beyond ecosystem modeling: A roadmap to community cyberinfrastructure for ecological data‐model integration, Global Change Biol., 27(1), 13–26, https://doi.org/10.1111/gcb.15409.

Geller, G. N., et al. (2022), NASA Biological Diversity and Ecological Forecasting: Current state of knowledge and considerations for the next decade, p. 201, NASA, Washington, D.C., cce-signin.gsfc.nasa.gov/files/announcements/announcement_271.pdf.

Olsson, F., et al. (2025), What can we learn from 100,000 freshwater forecasts? A synthesis from the NEON Ecological Forecasting Challenge, Ecol. Appl., 35(1), e70004, https://doi.org/10.1002/eap.70004.

Ritchie, H. (2024), Weather forecasts have become much more accurate; we now need to make them available to everyone, Our World in Data, archive.ourworldindata.org/20251125-173858/weather-forecasts.html.

Thomas, R. Q., and C. Boettiger (2025), Cyberinfrastructure to support ecological forecasting challenges, ESS Open Arch., https://doi.org/10.22541/essoar.175917344.44115142/v1.

Thomas, R. Q., et al. (2023), The NEON Ecological Forecasting Challenge, Front. Ecol. Environ., 21(3), 112–113, https://doi.org/10.1002/fee.2616.

Wheeler, K. I., et al. (2024), Predicting spring phenology in deciduous broadleaf forests: NEON phenology forecasting community challenge, Agric. For. Meteorol., 345, 109810, https://doi.org/10.1016/j.agrformet.2023.109810.

White, E. P., et al. (2019), Developing an automated iterative near‐term forecasting system for an ecological study, Methods Ecol. Evol., 10(3), 332–344, https://doi.org/10.1111/2041-210X.13104.

Zwart, J. A., et al. (2023), Near‐term forecasts of stream temperature using deep learning and data assimilation in support of management decisions, J. Am. Water Resour. Assoc., 59(2), 317–337, https://doi.org/10.1111/1752-1688.13093.

Author Information

Jacob A. Zwart (jzwart@usgs.gov), U.S. Geological Survey, San Francisco, Calif.; Cameron Thompson, Northeastern Regional Association of Coastal Ocean Observing Systems, Portsmouth, N.H.; Hassan Moustahfid, U.S. Integrated Ocean Observing System, NOAA, Silver Spring, Md.; Jessica Burnett, NASA, Washington, D.C.; and Michael Dietze, Boston University, Boston, Mass.

Citation: Zwart, J. A., C. Thompson, H. Moustahfid, J. Burnett, and M. Dietze (2026), How to accelerate advances in ecological forecasting, Eos, 107, https://doi.org/10.1029/2026EO260066. Published on 24 February 2026. Text not subject to copyright.
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Direct-3D Variational Bayesian Surface Wave Inversion and Its Application to Ambient Noise Tomography beneath Great Britain

Geophysical Journal International - Tue, 02/24/2026 - 00:00
Summary: We present a new, variational, fully nonlinear, probabilistic ambient noise tomography method, which estimates subsurface structure and quantifies the corresponding uncertainties directly in three dimensions (3D) from inter-receiver seismic surface wave dispersion data. We use the method to invert for high resolution 3D seismic velocity models of the upper crust beneath Great Britain using seismic ambient noise data recorded around the region – a task that proved too high-dimensional and hence computationally demanding for Monte Carlo sampling to converge to a stable solution. We compare the inversion results from the new method to those obtained from two standard, indirect inversion methods, in which 2D (geographical) surface wave velocity maps and 1D (depth) shear velocity profiles are estimated in two separate, consecutive steps. The results show that the direct-3D scheme preserves better lateral continuity and produces better data fit than the two-step methods, and provides information about lateral correlations that is absent from the two-step solutions. The inversion results are consistent with large-scale geology of Great Britain, and for the first time provide seismologically imaged evidence of the Great Glen Fault and other major tectonic faults. We therefore propose that direct-3D inversion schemes should be used where possible for surface wave inversion as they provide improved results at little additional computational cost.

Integrated Dislocation and Strain Models for 3D Coseismic Deformation field from GNSS and InSAR

Geophysical Journal International - Tue, 02/24/2026 - 00:00
Summary: Accurate three-dimensional coseismic deformation fields are critical for fault mechanics analysis and hazard assessment, but the sparse distribution of Global Navigation Satellite System (GNSS) stations often limits reconstruction accuracy. This study proposes Integrated Dislocation and Strain Models (IDSM) that seamlessly integrate GNSS and Interferometric Synthetic Aperture Radar (InSAR) data. This is achieved by combining a surface-constrained strain model and a subsurface-constrained dislocation model, which adaptively optimizes multi-source data weights through Variance Component Estimation (VCE), challenging the traditional reliance on uniformly distributed observations. Simulation experiments demonstrate that under insufficient GNSS coverage, this method improves deformation recovery accuracy by 10 per cent to 70 per cent in the vertical, north, and east components compared to the ESISTEM-VCE (Extended Simultaneous and Integrated Strain Tensor Estimation from Geodetic and Satellite Deformation Measurements-VCE) method, with particularly significant enhancement in the north component. Applied to the 2021 Yangbi Mw 6.4 and Maduo Mw 7.4 earthquakes, the study reveals distinct deformation patterns: the Yangbi event exhibits right-lateral strike-slip rupture with a maximum east-west extensional displacement of 87 mm and vertical subsidence of 59.8 mm, showing antisymmetric horizontal deformation around the epicenter. In contrast, the Maduo earthquake is dominated by left-lateral strike-slip motion, with east-west displacement reaching 1.4 m, while north-south and vertical deformations display patchy distributions along the fault. Error analysis confirms accuracy improvements over the ESISTEM-VCE method. For the Yangbi earthquake, the Root Mean Square Error (RMSE) decreased by 50 per cent (east), 64 per cent (north), and 44 per cent (vertical) at GNSS validation points. Corresponding improvements of 6.1 per cent (east) and 53.5 per cent (north) were achieved for the Maduo earthquake.
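The adaptive weighting idea behind VCE can be illustrated in miniature. The sketch below is not the IDSM or ESISTEM-VCE algorithm; it is a toy Helmert-style variance component estimation for a single unknown measured by two observation groups with unknown noise levels. The `helmert_vce` name and all data values are invented for illustration.

```python
def helmert_vce(groups, iters=20):
    """Toy Helmert variance component estimation for a common scalar.

    `groups` is a list of observation lists that all measure the same
    unknown value x but with different (unknown) noise levels. Each
    iteration (1) computes the weighted estimate of x, then (2) updates
    each group's variance from its residuals divided by its redundancy
    r_i = n_i - p_i*n_i / sum(p*n). Noisy groups end up with large
    variance estimates and therefore small weights.
    Returns (x_hat, list of per-group variance estimates).
    """
    sigma2 = [1.0] * len(groups)
    x_hat = 0.0
    for _ in range(iters):
        p = [1.0 / s for s in sigma2]                  # weights = 1/variance
        total_w = sum(pi * len(g) for pi, g in zip(p, groups))
        x_hat = sum(pi * sum(g) for pi, g in zip(p, groups)) / total_w
        new = []
        for pi, g in zip(p, groups):
            r = len(g) - pi * len(g) / total_w         # group redundancy
            ssr = sum((x_hat - obs) ** 2 for obs in g) # sum of squared residuals
            new.append(ssr / r)
        sigma2 = new
    return x_hat, sigma2

precise = [10.1, 9.9, 10.05, 9.95]  # e.g. a tight, "GNSS-like" group
noisy = [11.0, 9.0, 12.0, 8.0]      # e.g. a scattered, noisier group
x_hat, (s2_a, s2_b) = helmert_vce([precise, noisy])
print(x_hat, s2_a, s2_b)
```

Both groups center on 10, so the estimate stays near 10 while the scattered group is assigned a much larger variance (and thus a much smaller weight), which is the behavior the abstract describes for balancing GNSS against InSAR data.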

Global greening: Study shows Earth's green wave is shifting northeast

Phys.org: Earth science - Mon, 02/23/2026 - 20:00
A team of scientists led by the German Center for Integrative Biodiversity Research (iDiv), the Helmholtz Center for Environmental Research (UFZ), and Leipzig University has developed a new method to track Earth's greenness—a key indicator of vegetation health and activity—by calculating its center of mass.
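The "center of mass" idea can be sketched simply. The following is a minimal, hypothetical illustration, not the study's method: it averages grid cells as 3D unit vectors weighted by a greenness value times the cosine of latitude (a rough proxy for cell area), so a greener northeastern cell pulls the centroid northeast. The `greenness_center` function and the toy values are invented.

```python
import math

def greenness_center(cells):
    """Weighted centroid ("center of mass") of a greenness field.

    `cells` is a list of (lat_deg, lon_deg, greenness) grid cells.
    Each cell is weighted by its greenness times cos(latitude), a
    simple proxy for the shrinking area of grid cells toward the
    poles. Positions are summed as 3D unit vectors so the result is
    not distorted near the dateline, then projected back to lat/lon.
    """
    x = y = z = 0.0
    for lat, lon, g in cells:
        w = g * math.cos(math.radians(lat))
        la, lo = math.radians(lat), math.radians(lon)
        x += w * math.cos(la) * math.cos(lo)
        y += w * math.cos(la) * math.sin(lo)
        z += w * math.sin(la)
    lat_c = math.degrees(math.atan2(z, math.hypot(x, y)))
    lon_c = math.degrees(math.atan2(y, x))
    return lat_c, lon_c

# Toy example: greening intensifies in the northeastern cell, so the
# centroid shifts northeast relative to a uniform field.
uniform = [(40, 10, 0.5), (40, 20, 0.5), (50, 10, 0.5), (50, 20, 0.5)]
shifted = [(40, 10, 0.5), (40, 20, 0.5), (50, 10, 0.5), (50, 20, 0.9)]
print(greenness_center(uniform))
print(greenness_center(shifted))
```

Tracking how this centroid moves over successive years of satellite greenness maps is the kind of summary statistic the study's "center of mass" metric provides.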

Scientists isolate climatic fingerprints of wildfires and volcanic eruptions

Phys.org: Earth science - Mon, 02/23/2026 - 20:00
Volcanoes and wildfires can inject millions of tons of gases and aerosol particles into the air, affecting temperatures on a global scale. But picking out the specific impact of individual events against a background of many contributing factors is like listening for one person's voice from across a crowded concourse. MIT scientists now have a way to quiet the noise and identify the specific signal of wildfires and volcanic eruptions, including their effects on Earth's global atmospheric temperatures.

Between flood and drought: The metric that could better explain what happens to water in the age of climate change

Phys.org: Earth science - Mon, 02/23/2026 - 18:40
A key question in any discussion about climate is "How much rain fell?" But perhaps there is an even more important one. Like any household budget, the global water economy is based on "income," that is, water entering the system as precipitation, and "expenditure"—water leaving the system through various forms of evaporation. On land, water evaporates mainly through vegetation, in a process known as evapotranspiration.

Fracking in Argentina 'linked to hundreds of tremors'

Phys.org: Earth science - Mon, 02/23/2026 - 17:00
The extraction of gas and oil by fracking—large-scale fracturing of underground rocks by injecting water, sand and additives—is generating growing concern in Argentine Patagonia. Neuquén province—home to the country's largest hydrocarbon reserves—has experienced an increase in earthquakes since fracking operations began there in 2015.

AI deep denoiser can remove clouds from satellite images

Phys.org: Earth science - Mon, 02/23/2026 - 16:20
Thick cloud cover can completely obscure the surface of the Earth from satellite view, while thinner haze and shadows distort the image of rural and urban regions. As such, many remote sensing images for monitoring climate, crops, and urban growth are only partially usable.

These South Pole Seismometers Will Detect Vibrations 1.5 Miles Under the Ice

EOS - Mon, 02/23/2026 - 14:18

Right now, more than 1.5 miles (2.46 kilometers) below the surface at the South Pole, lie two seismometers—the deepest of their kind—built to withstand the extreme pressure, cold, and magnetic interference in one of Earth’s harshest environments.

Deploying the instruments, which will be part of the U.S. Geological Survey’s (USGS) Global Seismographic Network, was a “hail Mary” expedition because of the challenges faced, said Robert Anthony, a geophysicist in the Earthquake Hazards Program at the USGS who led the National Science Foundation (NSF)–funded project.

“That they’re functioning a mile and a half deep in the ice is just incredible,” he added.

Now that the instruments have been successfully deployed, they’ll start collecting high-quality seismic information that scientists can use to measure earthquakes, detect tsunamis, and even monitor nuclear testing.

The new seismometers help “fill an enormous, continent-scale gap in our high-quality coverage of the Earth,” said Rick Aster, a seismologist at Colorado State University who was part of the technical review process for the seismometers. “Having a good distribution of stations around the world is a great thing for seismology and Earth science.”

Engineering Under Pressure

Creating seismometers that can withstand being buried in an ice sheet took years of planning, dozens of experts across many organizations, and cold, difficult work at the bottom of the world.

Each seismometer sits at the bottom of a borehole drilled as part of an NSF partnership with the USGS Albuquerque Seismological Laboratory, University of Wisconsin–Madison, and IceCube Neutrino Observatory, which had already been installing subsurface instruments to detect subatomic particles. The holes were drilled with hot water, meaning each is still filled with water that is slowly expanding as it freezes. This “violent, chaotic process,” said Anthony, is exerting extreme pressure on the seismometers, which must be capable of withstanding up to 8,500 pounds per square inch (58,605 kilopascals)—nearly 500 times the pressure of Earth’s atmosphere at sea level.

To protect them, each seismometer is held by a pressure vessel, first created for IceCube’s dark matter experiments, that can withstand about 10,000 pounds per square inch (68,948 kilopascals). The seismometers are also protected from magnetic storms, which can be particularly intense at the poles, with a metal covering that redirects the magnetic field around the instruments. 

USGS geophysicist Robert Anthony explains why the South Pole is the perfect place for these two new instruments. Credit: USGS, Public Domain

A scientific instrument company called Nanometrics helped the team determine how to mount the seismometers within the pressure vessels, while IceCube adapted their existing methods to create a system to allow the instruments to receive GPS signals far below the ice sheet’s surface.

The team finally had a fully operational product in July 2025, just 2 months before the shipping deadline to get the equipment to Antarctica. If their engineering solutions had taken just a month longer, the project may not have gone forward, Anthony said. In the 2 months before shipping, the instruments underwent extensive testing at the Albuquerque Seismological Laboratory, Michigan State University, and the University of Wisconsin. 

Anthony said he expects the seismometers, deployed during the Antarctic summer on 30 December and 9 January, to freeze fully into the ice within the next few months. Having them deployed is a “huge relief,” said David Wilson, director of the USGS Global Seismographic Network and a geophysicist involved in the project. “There’s such a high chance of failure, so many things that can go wrong, that it’s amazing that they both were installed and that they’re both functional.” 

Seismological Knowledge

The two seismometers will be able to record the movement of the planet after large earthquakes and pick up fainter signals with greater fidelity than any previously deployed instruments. The South Pole is the only place on Earth where seismometers can make such observations without distortion from Earth’s rotation. 

Also, the depth and location of the instruments mean they’re far from any surface noise, such as human activity, ocean waves, or wind. Even changes to atmospheric pressure, such as when storms roll in, can affect seismic data. The deeper seismometers are placed, the less those changes affect the instruments. Firn—dense snow in the process of compressing to glacial ice—also dampens surface noise.

Aster likens the installation of the instruments to astronomers trying to find the darkest sky to observe. “This is a vibrational sensor looking for the vibrationally quietest part of the world,” he said.

And because both seismometers will be frozen into the ice sheet, they will be extremely still and will remain so for a very long time. With such stable seismometers, “you can record minute ground motions, on the order of almost the size of an atom—very, very tiny ground motions,” Anthony said. 

The data from the seismometers could answer long-held questions about seismic activity in Antarctica, such as how its ice sheet is moving over bedrock. In places, the ice sheet could be sticking and slipping “in a way that we can observe at a new level of fidelity” using the new seismometers, Aster said. The instruments will also capture unique measurements of the seismic activity of icebergs off Antarctica’s coast and volcanoes in West Antarctica, he said.

The installation of these instruments showcases the value of having a U.S. science presence in Antarctica, Aster added. The South Pole station provides “an absolutely unique and world-class capability” for the U.S. scientific enterprise, he said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Citation: van Deelen, G. (2026), These South Pole seismometers will detect vibrations 1.5 miles under the ice, Eos, 107, https://doi.org/10.1029/2026EO260064. Published on 23 February 2026. Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Missing geomagnetic reversals: Earth's past may be incomplete

Phys.org: Earth science - Mon, 02/23/2026 - 14:00
Several studies have predicted that not all geomagnetic reversals have been discovered, but it was unknown in which periods they might be hidden. Researchers led by the National Institute of Polar Research used a statistical method called adaptive kernel density estimation to model the frequency of geomagnetic reversals at high temporal resolution. Based on the model, they proposed that undiscovered reversals may be hidden in four periods after the Cretaceous Normal Superchron.

Germany's coastal regions brace for change, fearing rising sea levels

Phys.org: Earth science - Mon, 02/23/2026 - 12:30
Standing on the coast and looking out to sea, you cannot detect the changes with the naked eye. But in northern Germany, sea levels are rising, as is the risk of flooding for the lower-lying coastal regions.

Earth's mantle may have been cooler than thought before Pangea's breakup

Phys.org: Earth science - Mon, 02/23/2026 - 12:00
When the supercontinent Pangea began to fragment around 200 million years ago during the Early Jurassic, it reshaped the face of the planet. Vast new oceans opened, continents drifted apart and the familiar geography of today slowly emerged. For decades, many geoscientists have suggested that this dramatic breakup was fueled by an accumulation of heat beneath the supercontinent, a kind of planetary "thermal insulation" effect that caused the underlying mantle (the thick layer of rock between Earth's crust and its core) to grow unusually hot.

Peatland lakes in Congo Basin release carbon that is thousands of years old

Phys.org: Earth science - Mon, 02/23/2026 - 10:00
Researchers at ETH Zurich have now discovered for the first time that large blackwater lakes in the extensive peatlands of the central Congo Basin are releasing ancient carbon. To date, climate researchers had assumed that the carbon was stored safely in the peat for millennia. How the carbon is mobilized from the peat into the lakes, where it is finally released to the atmosphere, is still unknown. Climate change and altered land use, especially the conversion of forest to cropland, could exacerbate this trend, with consequences for the global climate.

The 20 February 2026 garbage landslide at Rodriguez, Rizal in the Philippines

EOS - Mon, 02/23/2026 - 07:42

Three people were killed in a major failure at a privately owned garbage dump on Friday. Earlier reports of 50 deaths are now believed to have been erroneous.

On 20 February 2026, the Philippines suffered another major garbage landslide, following the tragic events that occurred at Binaliw in Cebu on 8 January 2026, which killed 35 people. This most recent event occurred at Rodriguez in Rizal.

The location of the 20 February 2026 landslide is reported to be Sitio 1B Harangan, Barangay San Isidro in Rodriguez. I believe that the landfill is at [14.77036°, 121.15283°], although this is unconfirmed. This is a Google Earth image of the site from April 2025:-

Google Earth image of the likely site of the 20 February 2026 garbage landslide at Rodriguez in the Philippines.

PTV has a news article about this event, which includes mobile phone footage, apparently of the aftermath of the landslide. This is a still from that footage:-

The aftermath of the 20 February 2026 garbage landslide at Rodriguez in the Philippines. Still from a video posted to Facebook by PTV.

One person has been confirmed to have been killed in this landslide, and another two are missing. Early reports of up to 50 people being buried have now been dismissed.

The provincial Governor, Nina Ricci Ynares, has written to the Department of Environment and Natural Resources to request a probe into the event. The landfill was reportedly owned and operated by International Solid Waste Integrated Management Specialist, Inc. (ISWIMS), a private company.

There is a lack of high-quality research on garbage landslides, despite their substantial impacts. However, Zhang et al. (2020) provided an interesting review of 62 examples from 22 different countries. They concluded that the following were the most common causes of garbage landslides:-

  • High landfill leachate level (40% of recorded cases);
  • Inadequate compaction (23%);
  • Insufficient bearing capacity of the foundation (19%);
  • Low shear strength of the interface between the liner and the garbage (11%);
  • Rapid release of landfill gas (6%).

It will be interesting to determine the cause of the garbage landslide at Rodriguez, but I would start with an examination of the compaction of the garbage and the management of water / leachate at the site.

Reference

Zhang, Z. et al. 2020. Global study on slope instability modes based on 62 municipal solid waste landfills. Waste Management & Research: The Journal for a Sustainable Circular Economy, 38 (12). https://doi.org/10.1177/0734242X209534.

Text © 2026. The authors. CC BY-NC-ND 3.0

Geometry, structure, and tectonic regime of oceanic transform faults revealed by teleseismic earthquake focal mechanisms

Geophysical Journal International - Mon, 02/23/2026 - 00:00
Summary: Oceanic transform faults (OTFs) have long been viewed exclusively as vertical, strike-slip structures offsetting mid-ocean ridges, yet their deep geometry and structural complexity remain poorly constrained. Key questions therefore persist, including whether OTFs are single-stranded and continuous, whether they maintain vertical dip angles, whether they accommodate mixed-mode slip, and what factors control their geometry. This study addresses these questions through a global statistical analysis of teleseismic earthquake focal mechanisms from 150 OTFs across diverse tectonic settings. We introduce stack maps, a novel method that quantifies fault dip and rake, providing a graphical representation of average focal mechanisms. Our findings reveal that while OTFs tend to conform to the standard vertical, strike-slip model, nearly half exhibit deviations, either in dip or motion, challenging the classical view of these plate boundaries. We identify four distinct OTF categories: (1) those adhering to the standard model, (2) non-vertical faults with transtensive/transpressive components, (3) non-vertical faults accommodating strike-slip motion, and (4) vertical faults with a vertical component of motion. Tectonic regime shifts emerge as a primary driver of structural changes, with non-vertical geometries persisting even after the regime reverts to pure strike-slip motion. This structural memory suggests that fault geometry, once established, remains stable over geological timescales of several tens of Myr. By reconciling previously 'unusual' focal mechanisms with fault structure and dynamics, this work demonstrates that global seismic catalogues, when analysed statistically, offer robust insights into OTF geometry and tectonic regimes.
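The four categories above reduce to two binary tests: is the average dip near vertical, and is the average rake near pure strike-slip? A minimal sketch, assuming a circular mean for rake and illustrative tolerance values of my own choosing (the paper's stack-map method is more sophisticated):

```python
import numpy as np

def mean_dip_rake(dips_deg, rakes_deg):
    """Average a set of focal-mechanism dips and rakes.
    Rake is periodic, so use a circular mean; dip (0-90 deg) is not."""
    r = np.deg2rad(rakes_deg)
    mean_rake = np.rad2deg(np.arctan2(np.sin(r).mean(), np.cos(r).mean()))
    return float(np.mean(dips_deg)), float(mean_rake)

def classify_otf(mean_dip, mean_rake, dip_tol=15.0, rake_tol=20.0):
    """Map average dip/rake to the four categories in the abstract:
    1 standard (vertical, strike-slip); 2 non-vertical, oblique;
    3 non-vertical, strike-slip; 4 vertical with a dip-slip component."""
    vertical = abs(mean_dip - 90.0) <= dip_tol
    ar = abs(mean_rake)
    strike_slip = min(ar, abs(180.0 - ar)) <= rake_tol  # rake near 0 or +/-180
    if vertical and strike_slip:
        return 1
    if not vertical and not strike_slip:
        return 2
    if not vertical:
        return 3
    return 4
```

The circular mean matters near the +/-180 degree wrap, where a naive arithmetic mean of rakes would cancel to zero and misclassify a fault.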

Data-driven magnetic anomaly data gap filling

Geophysical Journal International - Mon, 02/23/2026 - 00:00
Summary: As a critical category of geophysical data, magnetic anomalies play vital roles in geological interpretation, resource exploration and target detection. For most applications involving magnetic anomaly data, the ideal dataset should have uniformly distributed data points, high resolution and completeness without gaps. However, because of environmental constraints and measurement limitations, magnetic anomaly data obtained from real-world measurements often fail to meet these requirements. Thus, interpolation techniques present effective and cost-efficient technical approaches for processing measured magnetic anomaly data to meet the aforementioned criteria. To our knowledge, current research on magnetic anomaly data interpolation has primarily focused on gridding methods for interpolating irregularly sampled data into gridded data and super-resolution interpolation methods aimed at enhancing spatial resolution. Meanwhile, studies on interpolation methods specifically designed to fill large-area data gaps remain relatively scarce. To address the challenge of reconstructing large-area missing magnetic anomaly data, we propose a data-driven method for magnetic anomaly data gap filling. First, based on the analysis of the characteristics of magnetic anomaly data, we construct an open-source magnetic anomaly interpolation dataset (MAID) specifically designed for magnetic anomaly data interpolation tasks. Subsequently, we develop a magnetic anomaly data gap-filling generative adversarial network (MADGF-GAN) tailored for magnetic anomaly data gap filling. Upon sufficient training on the MAID training set, MADGF-GAN can directly fill gaps in given magnetic anomaly data. Finally, the effectiveness of MADGF-GAN is validated using four test samples from the MAID test set and Afghan aeromagnetic data. Compared with four existing interpolation methods, MADGF-GAN demonstrates considerable advantages in terms of interpolation accuracy, computational efficiency and practicality. This study demonstrates the potential of data-driven approaches in magnetic anomaly data processing, providing crucial technical support for related geoscientific applications.

Spectral induced polarization monitoring of chalcopyrite ore bioleaching: insights from laboratory column experiments

Geophysical Journal International - Mon, 02/23/2026 - 00:00
Summary: Bioleaching is a biologically facilitated process that helps to dissolve valuable metals in order to extract them from the mineral gangue. Applied in the field to heap ores, its efficiency mainly depends on solution flow inside the heterogeneous heaps, which is often tortuous and can remain stagnant in the pores and crevices between the particles. Methodologies that can help to monitor the bioleaching processes are therefore needed to improve operational efficiency. In this article, we present for the first time preliminary laboratory-scale investigations on spectral induced polarization (SIP) during the bioleaching of chalcopyrite (CuFeS2)-containing ore material from a mine in Chile. Two column experiments representing different stages of the bioleaching process were monitored under unsaturated and highly acidic conditions (pH ~2). Our objective was to explore the feasibility of SIP for detecting changes in electrical properties potentially associated with bioleaching-induced mineral dissolution and alteration. The results show a rapid decrease in SIP phase shift and imaginary conductivity during the early stage of bioleaching, while the real conductivity remains relatively stable. At a more advanced stage of bioleaching, the phase response is weaker and more stable. A relaxation time distribution (RTD) analysis was applied to further investigate changes in polarization mechanisms. Prior to bioleaching, the RTD exhibits a well-defined peak consistent with polarization controlled by sulfide mineral grains, whereas after one month of bioleaching the RTD broadens and shifts toward larger relaxation times, accompanied by a decrease in chargeability. This combined evolution suggests bioleaching-induced modifications of electrochemically active surfaces, potentially related to mineral dissolution and the formation of passivation layers. Estimated particle sizes derived from the RTD analysis are consistent with scanning electron microscopy observations. Although the absence of a dedicated abiotic control column prevents us from attributing these changes unambiguously to bioleaching alone, these results highlight the potential of SIP as a non-invasive, real-time and integrative tool to monitor leaching processes and to identify zones that may remain weakly affected by leaching.
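A relaxation time distribution of the kind the abstract describes can be illustrated with a simple Debye decomposition: fit the imaginary-conductivity spectrum as a non-negative sum of Debye terms on a log-spaced grid of relaxation times. This is a generic sketch under my own assumptions (function name, grid choice, plain NNLS with no regularization), not the authors' procedure:

```python
import numpy as np
from scipy.optimize import nnls

def debye_rtd(freqs_hz, sigma_imag, n_tau=30):
    """Fit sigma''(w) = sum_k a_k * (w*tau_k) / (1 + (w*tau_k)^2)
    with a_k >= 0 (non-negative least squares). The weights a_k over
    the log-spaced tau grid form a crude relaxation time distribution."""
    w = 2.0 * np.pi * np.asarray(freqs_hz, dtype=float)
    taus = np.logspace(np.log10(1.0 / w.max()), np.log10(1.0 / w.min()), n_tau)
    wt = w[:, None] * taus[None, :]
    A = wt / (1.0 + wt ** 2)          # imaginary part of unit Debye terms
    weights, _ = nnls(A, np.asarray(sigma_imag, dtype=float))
    return taus, weights
```

In monitoring terms, a broadening of `weights` and a shift of its peak toward larger `taus` between snapshots would mirror the RTD evolution the abstract reports.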

Deep sea landscapes are a new frontier of human exploration—here's what we may find

Phys.org: Earth science - Sun, 02/22/2026 - 22:30
When we dream of landscapes, we might imagine rolling valleys or rugged mountains. But there is a whole landscape hidden from human view: the secret world of the seafloor.
