Feed aggregator

Ejecta Discovered Near Site of Ancient Meteorite Impact

EOS - Tue, 08/17/2021 - 16:16

A meteorite impact is a colossal disruption—think intense ground shaking, sediments launching skyward, and enormous tsunamis. But evidence of all that mayhem can be erased by erosion over time. Scientists have now relied on clever geological sleuthing to discover impact ejecta near South Africa’s Vredefort impact structure, the site of a massive meteorite strike roughly 2 billion years ago. These ejecta might hold clues about the composition of the object that slammed into Earth during the Precambrian, the researchers suggest.

After more than 2 billion years of erosion, features of the crater created by the massive meteorite that impacted what is now Vredefort, South Africa, are barely discernible. Credit: NASA

The Vredefort impact structure, near Johannesburg, is estimated to be between 180 and 300 kilometers in diameter—it’s believed to be the largest impact structure on Earth. But it doesn’t look at all like a crater. It’s far too old—and therefore too eroded—to have preserved that characteristic signature of an impact.

What’s visible instead is an arc of uplifted sediments. That material is part of the “peak ring” that formed within the original crater. Such uplifted material is the calling card of a massive impact, said Matthew S. Huber, a geologist at the University of the Western Cape in Cape Town, South Africa. “If there’s a sufficiently large impact, there will be a rebound.”

Standing at the Roots

But even these uplifted sediments were buried far below Earth’s surface at the time of the impact, said Huber. “This [area] has experienced at least 10 kilometers of erosion. We’re at the deep structural roots.”

Because of all that erosion, there’s no hope of finding impact ejecta—sediments launched during an impact, which have often been altered by high temperatures and pressures—within the impact structure itself, said Huber. “It’s all eroded away. It’s gone.”

However, nearby sites—located within a few radii of the Vredefort impact structure—might still contain impact ejecta, Huber and his colleagues reasoned. (In previous studies, Huber and his collaborators had found millimeter-sized Vredefort ejecta much farther afield, in Greenland and Russia.)

To search for this so-called proximal ejecta, Huber and his colleagues looked a few hundred kilometers to the west. They focused on a swath of the Kaapvaal Craton, a geologic feature that, like other cratons around the world, preserves particularly ancient sediments.

A Violent Event, Told Through Rocks

The researchers collected material from a pair of sediment cores originally drilled by mining companies exploring the region for iron and manganese. Huber and his collaborators homed in on sediments dated to between 1.9 billion and 2.2 billion years old and prepared several thin sections of the rocks for analysis. The sediments exhibited telltale signs of a violent event, the team found.

To begin with, the researchers noticed bull’s-eye-looking features up to a few centimeters in diameter. These structures, called accretionary lapilli, form within clouds of ash. Much as hailstones grow via the addition of layers of ice, accretionary lapilli grow spherically as successive layers of ash are deposited on their outer surface. They’re associated with both volcanic eruptions and meteorite impacts.

Huber and his colleagues also spotted parallel lines running through grains of quartz. These lines, known as planar deformation features, represent broken atomic bonds in the quartz’s crystal lattice. Ordinary geologic processes like earthquakes or volcanic eruptions are rarely powerful enough to create these features, said Huber. “These grains were subjected to a shock wave.”

Planar deformation features are “unequivocal evidence” of impact material, said Elmar Buchner, a geologist at the Neu-Ulm University of Applied Sciences in Neu-Ulm, Germany, not involved in the research. “There’s no doubt that it is impact ejecta.”

These results were presented today at the 84th Annual Meeting of the Meteoritical Society in Chicago.

There’s a lot more to learn from these ejecta, Huber and his collaborators suggest. The team next plans to analyze their samples for “impact melt,” material preserved from the time of the impact that’s sometimes a chemical amalgam of the impactor and the surrounding target rocks. Such ejecta could help reveal the composition of the object responsible for creating the Vredefort impact structure, the researchers suggest. “We are already planning our next analyses,” said Huber. “There is a lot of work to be done.”

—Katherine Kornei (@KatherineKornei), Science Writer

Magnetic Record of Early Nebular Dynamics

EOS - Tue, 08/17/2021 - 14:00

The early solar nebula was probably subject to strong magnetic fields, which influenced its dynamics and thus the rate at which the Sun and planets grew. Fortunately, some meteorites were forming during this epoch, and thus provide the potential to characterize these ancient nebular fields. Fu et al. [2021] make careful measurements of one particular meteorite and conclude that the fields recorded are an order of magnitude larger than fields recorded by other meteorites of similar ages. This result suggests that the nebula experienced either strong temporal variations in field strength, or strong spatial variations (for instance, because of the presence of gaps cleared by growing planets). As highlighted by Nichols [2021] in a companion Viewpoint, an important next step is to understand in more detail the chemical process(es) by which magnetization was acquired; so too is removing the lingering possibility that this field was due to an internal dynamo, rather than an external nebular field.

Citation: Fu, R., Volk, M., Bilardello, D., Libourel, G., Lesur, G., & Ben Dor, O. [2021]. The fine-scale magnetic history of the Allende meteorite: Implications for the structure of the solar nebula. AGU Advances, 2, e2021AV000486. https://doi.org/10.1029/2021AV000486

—Francis Nimmo, Editor, AGU Advances

Satellite Sensor EPIC Detects Aerosols in Earth’s Atmosphere

EOS - Tue, 08/17/2021 - 13:27

Aerosols are small solid particles and liquid droplets that drift aloft in Earth’s atmosphere. These minuscule motes may be any of a number of diverse substances, such as dust, pollution, and wildfire smoke. By absorbing or scattering sunlight, aerosols influence Earth’s climate. They also affect air quality and human health.

Accurate observations of aerosols are necessary to study their impact. As demonstrated by Ahn et al., the Earth Polychromatic Imaging Camera (EPIC) sensor on board the Deep Space Climate Observatory (DSCOVR) satellite provides new opportunities for monitoring these particles.

Launched in 2015, DSCOVR orbits the first Sun–Earth Lagrange point, keeping it suspended between Earth and the Sun, so EPIC can capture images of Earth in continuous daylight—both in the visible-light range and at ultraviolet (UV) and near-infrared wavelengths. The EPIC near-UV aerosol algorithm (EPICAERUV) can then glean more specific information about aerosol properties from the images.

Like other satellite-borne aerosol sensors, EPIC enables observation of aerosols in geographic locations that are difficult to access with ground- or aircraft-based sensors. However, unlike other satellite sensors that can take measurements only once per day, EPIC’s unique orbit allows it to collect aerosol data for the entire sunlit side of Earth up to 20 times per day.

To demonstrate EPIC’s capabilities, the researchers used EPICAERUV to evaluate various properties of the aerosols it observed, including characteristics known as optical depth, single-scattering albedo, above-cloud aerosol optical depth, and ultraviolet aerosol index. These properties are key for monitoring aerosols and their impact. The analysis showed that EPIC’s observations of these properties compared favorably with those from ground- and aircraft-based sensors.
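
One of these quantities, the ultraviolet aerosol index, is essentially a residual between measured radiance ratios at two near-UV wavelengths and the ratios expected for an aerosol-free atmosphere; positive values flag absorbing aerosols such as smoke and dust. Below is a minimal Python sketch of that general calculation, assuming EPIC’s 340- and 388-nanometer channels and purely hypothetical radiance values; the exact EPICAERUV formulation is described in the paper.

import math

def uv_aerosol_index(i340_obs, i388_obs, i340_calc, i388_calc):
    # Residual between the observed radiance ratio and the ratio calculated
    # for an aerosol-free (pure molecular scattering) atmosphere.
    # Positive values indicate UV-absorbing aerosols such as smoke or dust.
    return -100.0 * (math.log10(i340_obs / i388_obs)
                     - math.log10(i340_calc / i388_calc))

# Hypothetical radiances (arbitrary units), for illustration only
print(uv_aerosol_index(0.48, 0.62, 0.52, 0.61))  # ~4.2: absorbing aerosol present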

The research team also used EPIC to evaluate the characteristics of smoke plumes produced by recent wildfires in North America, including extensive fires in British Columbia in 2017, California’s 2018 Mendocino Complex Fire, and numerous North American fires in 2020. In 2017, for example, EPIC contributed to observational proof that smoke plumes can self-loft through the tropopause via solar absorption–driven diabatic heating. EPIC observations successfully captured these huge aerosol plumes, and the derived plume characteristics aligned closely with ground-based measurements.

This research suggests that despite coarse spatial resolution and potentially large errors under certain viewing conditions, EPIC can serve as a useful tool for aerosol monitoring. Future efforts will aim to improve the EPICAERUV algorithm to boost accuracy. (Journal of Geophysical Research: Atmospheres, https://doi.org/10.1029/2020JD033651, 2021)

—Sarah Stanley, Science Writer

Steady but Slow Progress on the Long Road Towards Gender Parity

EOS - Mon, 08/16/2021 - 19:04

Diversity among scientists expands the questions our science asks, the approaches it takes, and the quality and impact of its products. Unfortunately, the geosciences have one of the worst records of diversity within their ranks. Progress is being made to include more women in the geosciences, but Ranganathan et al. [2021] show that, assuming equity in hiring and retention going forward, gender parity in the geosciences at U.S. universities will not be reached until 2028, 2035, and 2056 for assistant, associate, and full professors, respectively. Women of color and all minoritized groups face a longer road to inclusion. In an accompanying Viewpoint, Hastings [2021] shares the policies, institutional support, and community support that helped her overcome several obstacles in her career. These data and personal stories show that actions have made and will continue to make a difference, but institutions and their leaders need to pick up the pace to make the geosciences more inclusive and equitable.

Citation: Ranganathan, M., Lalk, E., Freese, L. et al. [2021]. Trends in the representation of women among geoscience faculty from 1999-2020: the long road towards gender parity. AGU Advances, 2, e2021AV000436. https://doi.org/10.1029/2021AV000436

—Eric Davidson, Editor, AGU Advances

Predictive Forensics Helps Determine Where Soil Samples Came From

EOS - Mon, 08/16/2021 - 13:10

In the very first appearance of Sherlock Holmes, 1887’s A Study in Scarlet, Dr. Watson jots down notes on the famous detective’s incredible powers of observation. Holmes “tells at a glance different soils from each other. After walks, has shown me splashes upon his trousers and told me by their colour and consistence in what part of London he had received them.” Holmes finds footprints in a claylike soil in the story and uses his knowledge of geology in several subsequent mysteries.

Today, scientists are trying to expand the envelope of forensic geology, the science of using unique characteristics of geological materials in criminal investigations, through more refined techniques.

Better Tools for Soil Sleuths

Law enforcement agencies often try to piece together the tracks of criminals by analyzing soil and dust samples left on items such as clothing and vehicles. The concept has been around at least since the time of Arthur Conan Doyle, who popularized it.

In a study published in the Journal of Forensic Sciences, however, researchers in Australia outline their efforts to develop a more powerful tool for detectives. In a nutshell, they subjected soil samples to advanced analytical methods and concluded that the “empirical soil provenancing approach can play an important role in forensic and intelligence applications.”

The researchers looked at 268 previously collected soil samples, each from its own square kilometer in an area of North Canberra measuring some 260 square kilometers, more than 4 times the size of Manhattan Island. Geochemical survey data were interpolated to map what the chemical and physical properties should be between measured points, along with the uncertainty for each grid cell. Comparative samples were analyzed using Fourier transform infrared spectroscopy, magnetic susceptibility, X-ray fluorescence, and inductively coupled plasma–mass spectrometry.

The researchers were given three samples from the surveyed area and challenged to identify the 250- × 250-meter cells they came from. Using the method, they were able to eliminate 60% of the area under consideration. In a real investigation, that would mean spending less of law enforcement’s time and money on areas that won’t yield any results.
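
Conceptually, the exclusion step amounts to asking, for every grid cell, whether the evidentiary sample is statistically consistent with that cell’s predicted properties and uncertainty. Here is a minimal sketch of the idea, assuming normally distributed cell predictions and hypothetical values for a single analyte; the study’s actual likelihood model combines many analytes and analytical methods.

import numpy as np

def exclude_cells(sample, cell_means, cell_sigmas, z_crit=2.0):
    # A cell is ruled out when the sample lies more than z_crit standard
    # deviations from the cell's interpolated prediction.
    z = np.abs(sample - cell_means) / cell_sigmas
    return z > z_crit

# Hypothetical predictions for five cells (e.g., potassium content, wt%)
means = np.array([1.2, 1.5, 2.8, 3.1, 1.4])
sigmas = np.array([0.2, 0.3, 0.2, 0.4, 0.3])
ruled_out = exclude_cells(1.45, means, sigmas)
print(ruled_out)                                    # [False False  True  True False]
print(f"{ruled_out.mean():.0%} of cells excluded")  # 40% of cells excluded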

“We can use fairly standard analytical methods and achieve degrees of exclusion generally of the order of 60% to 90% of the investigated area,” said study lead author Patrice de Caritat, principal research scientist at Geoscience Australia. “This is extremely useful in forensic investigations as it allows [investigators] to prioritize resources to the most promising parts of the area. The greatest challenge was, predictably, the natural heterogeneity of soils. Even when characterized empirically at a density of one sample per 1 square kilometer, it is always possible that the evidentiary sample is uncharacteristic of that 1 square kilometer.”

De Caritat said predictive provenancing using existing digital soil maps has never been put forward before in a forensic application and offers an “effective desktop method of ruling out vast areas of territory for forensic search” as soon as a soil analysis is available. The research builds on an earlier paper he coauthored in which the method reduced the search area for soil samples by up to 90%.

“Using soil in forensic cases is an old technique,” added de Caritat. “Originally, bulk properties such as color were used and really useful, particularly to exclude matches between soil samples. But what has changed now is the breadth and depth of techniques brought to bear on the question.”

Hunt for Suspects

Lorna Dawson, head of the Soil Forensics Group at The James Hutton Institute in Aberdeen, Scotland, said the research was well carried out and is a good proof of concept but called it unrealistic because sampling at such high resolution is not affordable.

“Often, the sample recovered from the car, tool, shoe, etc., is too small to allow elemental profile analysis to be carried out, so the methods described in Patrice’s research would not work in every country, and to test it would be prohibitively expensive,” said Dawson, who was not involved in the study.

“But if funding could be made available by some international donor, we as practitioners could work with key researchers such as Patrice, and we would be delighted to carry out the research to set up the appropriate databases and models to link with currently available soil databases,” Dawson added. “That level of detail would certainly help in many serious crime investigations such as fakes, frauds, precious metals, etc.”

De Caritat and colleagues have received a Defence Innovation Partnership grant from the South Australia government to apply their provenancing work to soil-derived dust and to include X-ray diffraction mineralogical and soil genomic information to increase specificity. The project is a collaboration involving the University of Canberra, the University of Adelaide, Flinders University, the Australian Federal Police, and Geoscience Australia. The research may be used for counterterrorism, where “environmental DNA” on the clothing or other personal effects of a suspect may prove valuable.

Jennifer Young, a lecturer in the College of Science and Engineering at Flinders University not involved in the new research, said last year that the technology could help provide “evidence of where a person of interest might have traveled based on the environmental DNA signature from dust on their belongings.”

—Tim Hornyak (@robotopia), Science Writer

Multicellular Algae Discovered in an Early Cambrian Formation

EOS - Mon, 08/16/2021 - 13:08

The Cambrian period, which spanned roughly 541–485 million years ago, is known for its explosive biological diversification. In its warm oceans, early animals and other eukaryotes thrived and diversified. A major contributing factor to the acceleration of life and the development of early metazoans is thought to be an increasingly efficient food web, created largely by algae. These newer photosynthetic creatures allowed for easier nutrient transfer between species than their more ancient equivalents, the cyanobacteria.

A new study by Zheng et al. characterizes large, multicellular algae from a formation known as Kuanchuanpu. The site, located in southern Shaanxi Province in China, contains a famous collection of metazoan fossils from the Cambrian period. Using a combination of scanning electron microscopy and X-ray tomographic analysis, the authors reveal an organism with external membranes and cell walls. The cells in the specimens are organized into large spatial patterns—specifically, an outer and an inner region, which the researchers refer to as a cortex and a medulla. These features lead the scientists to conclude that the fossil shows organized, multicellular algae enclosed in a membrane rather than a group of cyanobacteria or metazoan embryos.

The team also hypothesizes that the cortex-medulla organization seen in the specimens suggests an asexual life cycle wherein the organism grows from a single round ball of cells into a globular collection of lobes, each containing its own cortex-medulla organization. If their analysis is correct, these multicellular algae from the Kuanchuanpu Formation appear consistent with specimens found in the Ediacaran Weng’an biota of South China. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1029/2020JG006102, 2021)

—David Shultz, Science Writer

Ice Lenses May Cause Many Arctic Landslides

EOS - Fri, 08/13/2021 - 11:39

Climate change is driving periods of unusually high temperature across large swaths of the planet. These heat waves are especially detrimental in the Arctic, where they can warm soils in regions of significant permafrost past the melting point of buried ice lenses. Melting ice injects liquid water into the soil, reducing its strength and increasing the likelihood of landslides. In populated areas, these events can cause economic damage and loss of life.

Mithan et al. investigate a shallow-landslide formation mechanism called active layer detachment (ALD), in which the upper, unfrozen—or active—layer of soil separates from the underlying solid permafrost base. They analyze the topography in the vicinity of ALD landslides spread over a 100-square-kilometer region of Alaska to characterize the factors that govern such events. This region experienced many ALD landslides after a period of unusually high temperature in 2004.

The authors identified 188 events in the study area using satellite imagery and established the local topography using a U.S. Geological Survey digital elevation model. To analyze the relationship between ALD landslides and topography, they simulated such events using a set of common software tools.

The modeling finds that because many Arctic regions have relatively gentle slopes, the simple flow of water is generally unable to generate sufficient water pressure between soil grains to kick-start a landslide. Rather, a major factor in ALD events appears to be the presence of ice lenses, concentrated bodies of ice that grow underground. When a heat wave pushes the thaw front of the permafrost down to the depth of these ice accumulations, their melting strongly raises the local water pressure, creating the conditions for a landslide.
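
One standard way to express this reasoning is the infinite-slope stability model, in which pore water pressure reduces the frictional resistance holding the soil in place. The sketch below uses that textbook model with hypothetical parameter values; it is not the authors’ actual simulation setup, but it illustrates why a gentle slope that is stable under ordinary seepage can fail once a melting ice lens drives pore pressure well above hydrostatic.

import math

def factor_of_safety(c, gamma, z, beta_deg, phi_deg, u):
    # Infinite-slope model: cohesion c (Pa), soil unit weight gamma (N/m^3),
    # failure-plane depth z (m), slope angle beta, friction angle phi,
    # pore water pressure u (Pa). Failure is expected when the result < 1.
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = c + (gamma * z * math.cos(beta) ** 2 - u) * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Hypothetical gentle Arctic slope with a 1-meter-deep active layer
print(factor_of_safety(c=500, gamma=18000, z=1.0, beta_deg=8, phi_deg=30, u=0))      # ~4.3: stable when dry
print(factor_of_safety(c=500, gamma=18000, z=1.0, beta_deg=8, phi_deg=30, u=16000))  # ~0.6: fails under ice lens overpressure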

As ice lens formation is governed by local topography, the authors propose that it may be possible to construct a mechanism for predicting locations likely to be susceptible to ALD landslides using only simple surface observations. As permafrost increasingly thaws in the face of a warming planet, such predictions are likely to take on greater importance in the coming decades. (Geophysical Research Letters, https://doi.org/10.1029/2020GL092264, 2021)

—Morgan Rehnberg, Science Writer

Lava from Bali Volcanoes Offers Window into Earth’s Mantle

EOS - Fri, 08/13/2021 - 11:39

Volcanoes along the 5,600-kilometer-long Sunda Arc subduction zone in Indonesia are among the most active and explosive in the world—and given the population density on the islands of the archipelago, some of the most hazardous.

“The most dangerous volcanoes are in subduction zones,” said Frances Deegan, a researcher in the Department of Earth Sciences at Uppsala University in Sweden.

In a new study published in Nature Communications, Deegan and her colleagues shed more light on the magma systems beneath four volcanoes in the Sunda Arc: Merapi in Central Java, Kelut in East Java, and Batur and Agung on Bali. Using relatively new technology to measure oxygen isotopes in crystals in lava samples from the four volcanoes, the researchers established a baseline measurement of the oxygen isotopic signal of the mantle beneath Bali and Java. That baseline can be used to measure how much the overlying crust, or subducted sediments, influences magmas as they rise toward the surface.

Researchers tested lava samples from the volcanoes. Credit: Frances Deegan

Volcano Forensics

Researchers used the Secondary Ion Mass Spectrometer (SIMS) at the Swedish Museum of Natural History in Stockholm. Credit: Frances Deegan

In the past, volcano forensic studies have relied on technologies such as conventional fluorination or laser fluorination to measure isotopes and minerals; these methods analyze pulverized lava samples but often capture unwanted contaminants as well. In the new study, the researchers made use of the Secondary Ion Mass Spectrometer (SIMS) at the Swedish Museum of Natural History. “It allows you to do in situ isotope analysis of really small things like meteorites,” Deegan said, “things that are really precious where you can’t really mash them up and dissolve them.”

SIMS can also target portions of individual crystals as small as 10 microns, which allowed the researchers to avoid the unwanted contamination sometimes found within an individual crystal, according to Terry Plank, a volcanologist at Columbia University who was not involved in the study. “The ion probe lets you avoid that and really analyze the pristine part of the crystal,” she said, “so we can see, in this case, its original oxygen isotope composition.”

New Measurements

Researchers can use SIMS to measure oxygen isotope ratios (18O to 16O)—expressed as a δ18O value, which normalizes the ratios to a standard—in various samples. On the basis of previous measurements for mid-ocean ridge basalts, Earth’s mantle is believed to have a δ18O value of around 5.5‰, according to Deegan. “The crust is very variable and very heavy, so it can be maybe 15‰ to 20‰ to 25‰,” she said. “If you mix in even just a little bit of crust with this very heavy oxygen isotope signal, it’s going to change the 5.5‰—it’s going to go up.”
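
For reference, δ18O is defined relative to a standard—conventionally Vienna Standard Mean Ocean Water (VSMOW)—in parts per thousand (per mil, ‰). A minimal sketch of that bookkeeping, with an illustrative sample ratio:

VSMOW_RATIO = 2005.2e-6  # 18O/16O ratio of Vienna Standard Mean Ocean Water

def delta_18o(sample_ratio, standard_ratio=VSMOW_RATIO):
    # Deviation of the sample's 18O/16O ratio from the standard, in per mil
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# A ratio 0.55% above the standard yields a mantle-like value of ~5.5 per mil
print(delta_18o(VSMOW_RATIO * 1.0055))  # ~5.5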

Deegan and colleagues used SIMS to determine δ18O values from the mineral clinopyroxene in samples from the four volcanoes. In lavas from the Sunda Arc, clinopyroxene is a common mineral phase and can potentially shed light on source compositions and magmatic evolution. The results showed that the average δ18O values for each volcano decreased as the researchers moved east, with Merapi in Central Java measuring 5.8‰, Kelut in East Java measuring 5.6‰, and the Bali volcanoes Batur and Agung measuring 5.3‰ and 5.2‰, respectively.

“What really surprised me the most was finding this really pristine mantle signature under Bali,” Deegan said. Researchers already knew that the crust grows thinner as you move east from Java to Bali, but Deegan expected to find more evidence of ocean sediment in the measurements under Bali—seafloor material that melts along with the Indo-Australian plate as it slides beneath the Eurasian plate at the Sunda Arc. “We didn’t see that. We actually have a really clean mantle signature, which is unusual to find in a subduction zone,” she said.

The researchers also measured magma crystallization depths of each of the four volcanoes and found that most of the sampled clinopyroxene from the two Java volcanoes formed in the middle to upper crust, while the crystallization occurred closer to the crust–mantle boundary beneath the Bali volcanoes. “I think that we have found a special view on the Indonesian mantle at Bali,” Deegan said. “Agung volcano on Bali seems to be the best mirror of mantle compositions in the whole region.”

These findings could help scientists better understand what happens when magma leaves its sources and moves toward the surface. It’s theorized that magma interaction with volatile components in the crust could be a driver of more explosive eruptions, Deegan said, and so having a clean, contained mantle baseline for the Sunda Arc region could aid future research.

Crust or Sediment?

Although Plank was excited by the measurements of uncontaminated, unaltered oxygen isotope baselines in the paper, she wondered whether the differences in δ18O values are really explained by thicker crust under Java. “The averages for each volcano almost overlap within 1 standard deviation, so there are more high ones at Merapi than at Agung, but they all have the same baseline,” she said. “[The authors] argue that’s crustal contamination, but I wonder if there are other processes that can cause that.” It’s not always so easy, geochemically speaking, to distinguish between crustal contamination and material from subducted seafloor, Plank added. “The crust erodes and goes into the ocean, and then that material on the seafloor gets subducted and comes back up again,” she said. “It’s the same stuff.”

As more research is conducted with SIMS, Plank would like to see work similar to Deegan’s done on samples from Alaskan volcanoes, which exhibit low δ18O values—like the Bali volcanoes—as well as on shallower magma systems, like those beneath the Java volcanoes.

“Any improvement in our knowledge of these volcanoes [in subduction zones] will help us be better prepared when they erupt,” Deegan said.

—Jon Kelvey (@jonkelvey), Science Writer

Wildfires Are Threatening Municipal Water Supplies

EOS - Thu, 08/12/2021 - 11:52

In recent decades, wildfire conflagrations have increased in number, size, and intensity in many parts of the world, from the Amazon to Siberia and Australia to the western United States. The aftereffects of these fires provide windows into a future where wildfires have unprecedented deleterious effects on ecosystems and the organisms, including humans, that depend upon them—not the least of which is the potential for serious damage to municipal water supplies.

In 2013, the Rim Fire—at the time, the third-largest wildfire in California’s history—burned a large swath of Stanislaus National Forest near Hetch Hetchy Reservoir, raising concerns about the safety of drinking water provided from the reservoir to San Francisco.

The 2018 Camp Fire not only burned vegetation but also torched buildings and the water distribution system for the town of Paradise in north central California, leaving piles of charred electronics, furniture, and automobiles scattered amid the ruins. Postfire rainstorms flushed debris and dissolved toxicants from these burned materials into nearby water bodies and contaminated downstream reaches. Residents relying on these sources complained about smoke-tainted odors in their household tap water [Proctor et al., 2020]. And in some cases, water utilities had to stop using water supplies sourced from too near the wildfire and supply alternative sources of water to customers.

Climate change is expected to increase the frequency and severity of wildfires, resulting in new risks to water providers and consumers. Water exported from severely burned watersheds can have greatly altered chemistry and may contain elevated levels of contaminants and other undesirable materials that are difficult to remove. For example, excess nutrients can fuel algal blooms, and suspended soil erosion particles can clog water filters. Are water utilities in wildfire-affected areas prepared for these changes?

Our research team has conducted field studies after several severe wildfires to sample surface waters and investigate the fires’ effects on downstream water chemistry. In California, we have looked at the aftereffects of the 2007 Angora Fire, 2013 Rim Fire, 2015 Wragg Fire, 2015 Rocky-Jerusalem Fire, 2016 Cold Fire, 2018 Camp Fire, and 2020 LNU Lightning Complex Fire. We also studied the 2016 Pinnacle Mountain Fire in South Carolina and the long-term effects of the 2002 Hayman Fire in Colorado. These campaigns often involve hazardous working conditions, forcing researchers to wear personal protective gear such as respirator masks, heavy boots and gloves, and sometimes full gowns for protection from ash and dust. We also had to monitor the weather so as not to be surprised by the unpredictable and dangerous flash floods, debris flows, and landslides that can occur following fires.

The devastation of these burned landscapes is stunning and amplifies the urgency to better understand the fallout of fires on ecosystems and humans. Our field studies have provided important new insights about how surface water chemistry and quality are affected after fires—information useful in efforts to safeguard water treatment and water supplies in the future.

Wildfire Impacts on Water

Wildfires have well-documented effects on the quality of surface waters. Fires contaminate the rivers, streams, lakes, and reservoirs that supply public drinking water utilities, loading them with sediments, algae-promoting nutrients, and heavy metals [Bladon et al., 2014]. However, few researchers have addressed water treatability—the ease with which water is purified—or the quality of drinking water treated following wildfires.



Contaminants are mobilized in the environment as a result of forest fires, which volatilize biomass into gases like carbon dioxide while producing layers of loose ash on the soil surface. Dissolved organic matter (DOM) leached from this burned, or pyrogenic, material (PyDOM) has appreciably different chemical characteristics compared with DOM from the unburned parent materials [Chen et al., 2020]. Although wildfires can destroy forest ecosystems within days, changes in DOM quantity and composition can persist in burned landscapes for decades [Chow et al., 2019].

DOM itself is not a contaminant with direct impacts on human health, but it creates problems for water treatment. It can discolor water, cause off-tastes, serve as a substrate for unwanted microbial growth, and foul membranes and adsorption media. DOM also increases treatment costs and chemical demand, that is, the amount of added chemicals, like chlorine and ferric iron, required to disinfect water and remove DOM. In addition, treatment efforts can introduce unintended side effects: Disinfection processes for DOM-contaminated water can form a variety of carcinogenic disinfection by-products (DBPs), such as chloroform, some of which are regulated by the EPA.

The characteristics, treatability, and duration of PyDOM from burned watersheds are poorly understood and require more study, but it is clear that this material poses several major challenges and health concerns related to municipal water supplies in wildfire-prone areas. In particular, it negatively affects treatability while increasing the likelihood of algal blooms and toxic chemical releases (Figure 1).

Fig. 1. Threats to drinking water supplies from wildfires include releases of toxic chemicals from burned infrastructure, electronics, plastics, cars, and other artificial materials (left); releases of pyrogenic dissolved organic matter and toxic chemicals from ash deposits into source water supplies (middle); and postfire eutrophication and algal blooms in water supplies because of increased nutrient availability (right). Credit: Illustration, Wing-Yee Kelly Cheah; inset photos, Alex Tat-Shing Chow

Treatability of Pyrogenic Dissolved Organic Matter

Postfire precipitation can easily promote the leaching of chemicals from burned residues, and it can also transport lightweight ash to nearby surface waters. This deposition raises levels of DOM and total suspended solids, increases turbidity, and lowers dissolved oxygen levels in the water, potentially killing aquatic organisms [Bladon et al., 2014; Abney et al., 2019].

Our controlled laboratory and field studies demonstrated that DOM concentrations in leached water depend on fire severity. Burned residuals could yield DOM concentrations up to 6–7 times higher than those in leached water from the unburned parent biomass. DOM concentrations in stream water from a completely burned watershed were 67% higher than concentrations in water from a nonburned watershed in the year following a severe wildfire [Chen et al., 2020; Uzun et al., 2020].

High total suspended solid levels complicate drinking water treatment by increasing chemical demand and reducing filtration. We observed that PyDOM—which had a lower average molecular weight but greater aromatic and nitrogen content than nonpyrogenic DOM—was removed from water with substantially lower efficiency (20%–30% removal) than nonpyrogenic DOM (generally 50%–60% removal or more) [Chen et al., 2020].

Elevated levels of PyDOM in water mean that higher chemical dosages are needed for treatment, and higher levels of DBPs are likely to be formed during treatment. PyDOM is also more reactive, which promotes the formation of potentially harmful oxygenated DBPs. For example, chlorinating water that contains PyDOM produces haloacetic acids, whereas chloramination, another form of disinfection, produces N-nitrosodimethylamine [Uzun et al., 2020]. In addition, increased levels of bromide, another DBP precursor, released from burned vegetation and soils have been observed in postfire surface runoff, especially in coastal areas. This bromide may enhance the formation of more toxic brominated DBPs (e.g., by converting chloroform to bromoform) [Wang et al., 2015; Uzun et al., 2020].

Only a small fraction (less than 30%) of total DBPs generated from DOM have been identified in chlorinated or chloraminated waters. The unique chemical characteristics of PyDOM generated from wildfire may give rise to DBPs that do not typically occur in water treatment and have not been identified or studied.

Postfire Nutrient Releases and Algal Blooms

After wildfires, burned biomass, fire retardant, and suppression agents like ammonium sulfate and ammonium phosphate often release nutrients, including inorganic nitrogen and phosphorus, into source waters. Wildfire runoff is often alkaline (pH > 9), in part because of its interactions with wood ash and dissolved minerals. Under these conditions, high ammonia and ammonium ion concentrations can cause acute ammonia toxicity in aquatic organisms, especially in headwater streams where these contaminants are not as diluted as they become farther downstream.

Freshwater aquatic ecosystems tend to be phosphorus limited, meaning algal growth is naturally kept under control. But large phosphorus loads originating from burned watersheds, particularly phosphorus associated with sediments, can induce eutrophication (nutrient enrichment) and harmful algal blooms, particularly in lentic (still-water) ecosystems where nutrients accumulate. Blooms of cyanobacteria (blue-green algae) like Microcystis aeruginosa are especially hazardous for drinking water supplies because they produce neurotoxins and peptide hepatotoxins (liver toxins) such as microcystin and cyanopeptolin.

Algal organic matter is also nitrogen rich and contributes to the formation of a variety of carbonaceous and nitrogenous DBPs during drinking water disinfection [Tsai and Chow, 2016]. Although copper-based algicide treatments are options for controlling algal blooms, copper ions themselves catalyze DBP formation during drinking water disinfection [Tsai et al., 2019].

Releases of Toxic Chemicals

When forest vegetation burns, it can generate and directly release a variety of potentially toxic chemicals, including polycyclic aromatic hydrocarbons [Chen et al., 2018], mercury [Ku et al., 2018], and heavy metals [Bladon et al., 2014]. In addition, fires such as California’s 2017 Tubbs Fire and 2018 Camp Fire, which extended to the interfaces between wildlands and urban areas, have generated residues from burned infrastructure, electronics, plastics, cars, and other artificial materials, contributing a variety of toxic chemicals to source waters.

Fires can also burn plastic water pipelines in homes that are connected to municipal water distribution systems, potentially releasing hazardous volatile organic compounds into the larger water system. For example, up to 40 milligrams per liter of benzene, a known carcinogen, was reported in water distribution lines following the Tubbs Fire in an urban area of California [Proctor et al., 2020]. Benzene is one of many organic chemicals found in damaged water distribution networks, and experts worry there could be many other toxic chemicals released from burned pipes over time. Because these damaged pipes are downstream from the treatment facility, the best remediation option may be to replace the pipes entirely.

Climate Change and Wildfires Alter Watershed Hydrology

Our recent research demonstrates that the degree of water quality impairment increases markedly with increasing wildfire severity and with the proportion of the watershed area burned [Chow et al., 2019; Uzun et al., 2020]. Hence, as wildfires burn hotter and consume more fuel in future climates, water quality will progressively degrade.

Severe wildfires consume vegetative and soil cover and often cause soils to become more water-repellent, which greatly increases surface runoff at the expense of soil infiltration. In turn, these changes lead to enhanced soil erosion and sediment transport—carrying associated pollutants directly to downstream waters—and to reduced filtration of water in the soil profile [Abney et al., 2019].

Although water quality in riverine systems may recover quickly following successive storm-flushing events, pollutants can accumulate in lakes and reservoir systems, which flush much more slowly, degrading water quality for decades as pollutants are recycled between the water column and sediments.

Other factors are also likely to influence postfire runoff, erosion, and contamination transport amid changing climates. For example, many forested watersheds today still receive much of their precipitation as snowfall, which is much less erosive than a comparable volume of rainfall. But as the climate warms and more precipitation falls as rain, postfire surface runoff and erosion and water quality impairment will increase considerably. In addition, as extreme weather events are expected to be more prevalent in the future, more intense rainfall could greatly increase postfire pollutant transport.

Burned trees line the banks of a creek in the aftermath of the 2018 Camp Fire. A warming climate is expected to severely degrade water quality by contributing to larger burned areas and more severely burned watersheds. Credit: Alex Tat-Shing Chow

At present, portions of a watershed not burned during a wildfire serve to mitigate water pollution by providing fresh water that dilutes contaminants coming from burned areas. As wildfire sizes grow larger, this dilution will diminish. Furthermore, vegetation takes longer to recover after more severe wildfires, delaying recoveries in water quality.

Stream runoff dynamics will also be altered as increased surface runoff reduces soil water and groundwater recharge, leading to higher peak flows during storms and lower base flow conditions. The slow regeneration of vegetation, which will reduce consumption and transpiration of water by plants, will also lead to greater runoff following wildfires [Chow et al., 2019].

A warming climate is expected to severely degrade water quality by contributing to larger burned areas and more severely burned watersheds. The damage will be exacerbated by increases in rainfall compared with snow and by extreme storm events that enhance surface runoff and erosion.

Proactive and Prescribed Solutions

Wildfires can cause press (ongoing) and pulse (limited-duration) perturbations in forested watersheds, altering watershed hydrology and surface water quality and, consequently, drinking water treatability. Mitigating wildfire impacts on drinking water safety requires effective, proactive management as well as postdisaster rehabilitation strategies from the forestry and water industries.

Water quality impairment increases exponentially with increasing burn intensity and area burned, so reducing forest fuel loads is critical. From a forestry management perspective, forest thinning and prescribed fire are both well-established, effective fuel reduction techniques. However, the operational costs of thinning are usually high, and the residual foliar litter it produces can cause increased DOM concentrations in source waters.

By comparison, prescribed fire is an economical management practice for reducing loads of forest litter and wildfire hazard. These low-severity fires reduce the quantity of DOM (and DBP precursors) potentially released to waterways while not appreciably affecting its composition or treatability in source water [Majidzadeh et al., 2019]. Establishing landscape-scale firebreaks is another effective management strategy, providing defensible corridors within forests to limit the rapid spread of fire and reduce the size of burned areas.

Water utilities, particularly those in fire-prone areas, should develop risk analysis and emergency response plans that combine multiple approaches. Such approaches include identifying alternative source waters, extensive and long-term postfire source water quality monitoring, and modifications in treatment processes and operations, such as using adsorbents and alternative oxidants that reduce taste and odor problems, remove specific contaminants, and decrease the formation of regulated DBPs.

Other research and preventive efforts should be encouraged as well. Such efforts include research studying the fates and effects of fire retardants in source water and the effects of postfire rehabilitation practices (e.g., mulching) on water chemistry, and the use of different pipeline and construction materials in newly developed housing near the wildland-urban interface. Furthermore, a collaborative system and an effective communication network between the forestry and water industries linking forest management to municipal water supplies will be critical in assessing and addressing wildfire impacts on drinking water safety.

Acknowledgments

The fieldwork efforts described in this article were supported by National Science Foundation RAPID Collaborative Proposal 1917156, U.S. EPA grant R835864 (National Priorities: Water Scarcity and Drought), and National Institute of Food and Agriculture grant 2018-67019-27795.

Is Earth’s Albedo Symmetric Between the Hemispheres?

EOS - Wed, 08/11/2021 - 14:41

The planetary albedo, the portion of insolation (sunlight) reflected by the planet back to space, is fundamentally important in setting how much the planet warms or cools. Previous literature noted an intriguing feature: On average, the albedo is essentially identical in the two hemispheres despite their very different surface properties. Building upon earlier studies, Datseris & Stevens [2021] support the hemispheric albedo symmetry via advanced time series analysis techniques using the latest release of CERES datasets. Because of differences in land-sea fraction, the clear-sky albedo is greater in the Northern Hemisphere than in the Southern Hemisphere. However, this clear-sky asymmetry is compensated by an asymmetry in cloudiness, especially over the extratropical ocean. In search of a symmetry-establishing mechanism, the authors analyze the temporal variability and find substantial decadal trends in hemispheric albedo that are identical for both hemispheres. The results hint at a symmetry-enforcing mechanism that operates on large spatiotemporal scales.
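
For context, each hemisphere’s albedo is an insolation-weighted ratio of reflected to incoming shortwave flux. Here is a minimal sketch of that bookkeeping, assuming simple zonal-mean arrays on a regular latitude grid; actual CERES processing is far more involved.

import numpy as np

def hemispheric_albedos(lat_deg, reflected_sw, incoming_sw):
    # reflected_sw and incoming_sw are zonal-mean shortwave fluxes (W/m^2);
    # cos(latitude) weights each band by its surface area.
    w = np.cos(np.radians(lat_deg))
    def albedo(mask):
        return (reflected_sw[mask] * w[mask]).sum() / (incoming_sw[mask] * w[mask]).sum()
    return albedo(lat_deg > 0), albedo(lat_deg < 0)

# Toy profiles, symmetric by construction, so both hemispheres return ~0.30
lat = np.arange(-89.5, 90.0, 1.0)
incoming = 80.0 + 340.0 * np.cos(np.radians(lat))
reflected = 0.30 * incoming
print(hemispheric_albedos(lat, reflected, incoming))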

Citation: Datseris, G. & Stevens, B. [2021]. Earth’s albedo and its symmetry. AGU Advances, 2, e2021AV000440. https://doi.org/10.1029/2021AV000440

—Sarah Kang, Editor, AGU Advances

 

Specifically Tailored Action Plans Combat Heat Waves in India

EOS - Wed, 08/11/2021 - 11:56

Unprecedented heat waves swept through Canada, the United States, and northern India this year, claiming hundreds of lives. These heat waves are not new: In India, 17,000 deaths have occurred because of heat waves since the 1990s.

A recent global study found that India had the highest burden of mortality associated with high temperatures between 2000 and 2019. In the summer of 2010, temperatures rose to 46.8°C in Ahmedabad, a metropolitan city in the state of Gujarat, causing an excess of 1,300 deaths in just 1 month.

Polash Mukerjee, the lead for Air Quality and Climate Resilience at the Natural Resources Defense Council (NRDC) India Program, said, “The heat wave in 2010 was devastating. People were caught unawares; they didn’t know they were experiencing symptoms of heat. In one hospital, there were 100 neonatal deaths.”

The deaths in Ahmedabad prompted scientists from NRDC; the Indian Institute of Public Health (IIPH), Gandhinagar; the India Meteorological Department (IMD); and officials from the Amdavad Municipal Corporation to develop India’s first heat action plan specifically tailored for a city in 2013.

The plan includes early-warning systems, color-coded temperature alerts, community outreach programs, capacity-building networks among government and health professionals for preparedness and reducing exposure, and staggered or reduced timings for schools and factories. The Amdavad Municipal Corporation also appointed a nodal officer to coordinate the heat action plan with various agencies.

“The nodal officer sends out alerts: orange (very hot) if it is more than 40°C, red (extremely hot) for more than 45°C. Messages are then sent to the public through various media, to take precautions and not go out. Hospitals are made ready to receive heat stroke cases,” explained Dileep Mavalankar, director of IIPH, Gandhinagar.

A study conducted to assess the effectiveness of the Ahmedabad heat action plan found that an estimated 2,380 deaths were avoided during the summers of 2014 and 2015 as a result of the plan’s implementation.

Built to Scale

In 2015, India decided to scale up heat action plans based on the Ahmedabad model. In 2016, IMD started issuing 3-month, 3-week, and 5-day heat forecasts. India’s National Disaster Management Authority (NDMA) along with IMD and NRDC started developing heat action plans for other cities and states.

Anup Kumar Srivastava, a senior consultant at NDMA, said, “We have developed city-specific threshold assessments based on temperature and mortality data from last 30 years. We analyzed at what temperatures mortalities happened, for example, how many deaths were there at 35°C and at 40°C. Based on this data, we send out area-specific advisory for the temperature forecasted.” He added that of the 23 heat-prone states, 19 have heat action plans in place.

Action plans vary by region. The temperature thresholds for the coastal city of Bhubaneswar, for example, take into account the relative humidity there. “Yellow alert is issued at 35.9°C, orange at 41.5°C and red at 43.5°C for Bhubaneswar,” Srivastava said. In contrast, the temperature thresholds for Nagpur, an interior city with a dry, arid climate, are 43°C (orange) and 45°C (red).
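
In software terms, an alert system of this kind is a city-specific lookup that maps a forecast temperature to the highest color code whose threshold it meets. A minimal sketch using the thresholds quoted above; the real NDMA/IMD advisories also weigh factors such as humidity, duration, and departure from normal temperatures.

CITY_THRESHOLDS_C = {  # alert thresholds in degrees Celsius, from the article
    "Bhubaneswar": [("yellow", 35.9), ("orange", 41.5), ("red", 43.5)],
    "Nagpur": [("orange", 43.0), ("red", 45.0)],
}

def heat_alert(city, forecast_c):
    # Return the highest alert level whose threshold the forecast reaches
    level = "none"
    for color, threshold in CITY_THRESHOLDS_C[city]:
        if forecast_c >= threshold:
            level = color
    return level

print(heat_alert("Bhubaneswar", 42.0))  # orange
print(heat_alert("Nagpur", 42.0))       # none: drier interior city, higher thresholds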

In addition, because of higher humidity, a coastal city feels hotter than a noncoastal city at the same temperature, Mukerjee explained. “So the coastal city heat action plan may include awareness components for heat-related illnesses associated with dehydration, whereas noncoastal cities will focus more on heat exhaustion and heat stroke,” he added.

Current Plans Are Just a Start

Not all experts agree on the efficacy of the heat plans. Abinash Mohanty, program lead for the Council on Energy, Environment and Water, Delhi, who was not involved in developing the heat action plans, said that studies indicate that the frequency of heat waves in India has increased since the 1990s, with a 62% increase in mortality rate.

“The numbers are an indication of the extremities yet to come, and the current limitations of heat action plans (lack of empirical evidence, limited adaptive capacity, and impact-based early warning) make them inadequate to mitigate heat waves,” he said.

Mohanty clarified that only a few cities, like Ahmedabad, have full-fledged heat action plans that identify and characterize climate actions focused on mitigating heat wave impacts. “Many cities lack climatological-led empirical evidence on metrics such as the number of heat wave days, seasonal variability, that are imperative for effective heat wave management,” he explained.

Mohanty added that though the current heat plans are a good starting point, they need to include factors such as the heat wave index, wet-bulb temperature, and updated heat wave definitions across varied geographies.

Kamal Kishore, a member of NDMA, said, “The heat plans are based on vulnerability factors for each city—people working outside (such as farmers, [those working in] open shops, and traffic police), type of dwellings (type of walls and roofs), access to drinking water, nutritional factors, etc.”

He added that preparedness workshops are planned well in advance of the season, and they constantly revise guidelines based on previous years’ lessons.

Mitigative Approaches

Cool roofs, in which light-colored paint reflects sunlight away from the surface of a building, are one way Indian cities are combating heat waves with infrastructure strategies. Credit: Mahila Housing Trust – Natural Resources Defense Council

Mukerjee added that heat action plans are moving from reactionary to more mitigative approaches. These approaches include the cool-roof initiative, which is a low-cost method to reduce indoor temperatures and the corresponding health impact. “Cool-roof paints reflect the sunlight away from the surface of a building. This has potential to benefit the most vulnerable section of society, [such as] migrant workers, women, and children in low-income neighborhoods,” he said.

Thirty-five percent of India’s urban population lives in low-income housing known as slums. These low-rise buildings trap heat beneath tin roofs, exacerbating the urban heat island effect.

The cities of Bhopal, Surat, and Udaipur have pioneered the use of cool roofs for the past 2 years, Mukerjee said. “Ahmedabad included cool roofs in 2017; the neonatal unit has benefitted. Telangana has a state level policy now wherein all new building plans must include cool roofs.”

Mohanty said that tackling heat waves should be a national imperative and a more robust and granular picture of heat wave impact should be mapped to the productivity of citizens.

He said, “2021 will be remembered as a year of heat wave anomalies ravaging lives and livelihoods across Indian states. Tackling heat waves calls for heat wave–proofed urban planning, revival and restoration of natural ecosystems that act as natural shock absorbers against extreme heat wave events.”

—Deepa Padmanaban (@deepa_padma), Science Writer

Is Your Home at Risk of Experiencing a Natural Disaster?

EOS - Wed, 08/11/2021 - 11:54

Reports from the scenes of natural disasters—raging wildfires, unrelenting floods, violent ground shaking, and devastating tornadoes and hurricanes—fill our news feeds every day. These hazards cause deep disruptions to the health of humans and ecosystems and threaten the safety and integrity of buildings and infrastructure.

The severity and frequency of some natural hazards are increasing with climate change. But humans are contributing to the problem in another way: building structures in hazard zones. In a new study that aims to determine the role development plays in the overall risk from natural hazards, Iglesias et al. looked at how development in the contiguous United States has influenced natural hazard risks to structures. They examined the changes in the number and distribution of buildings between 1945 and 2015 and how development changed people’s exposure to natural hazards.

The researchers first made a hazard map of the United States that included earthquakes, wildfires, hurricanes, tornadoes, and floods, using data from federal agencies and Fathom (for flood data). Then they identified “hazard hot spots,” areas where the frequency or intensity of an event falls in the top 10%. Although hazards tend to congregate in certain areas—hurricane risk is high around the Gulf Coast, for example—there can be some spillover into other areas.
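
In practice, a hot spot map like this can be built by thresholding each gridded hazard layer at its 90th percentile and overlaying the resulting masks. Below is a minimal sketch on synthetic rasters; the study’s real inputs come from federal agencies and Fathom.

import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-ins for gridded hazard frequency/intensity layers
hazards = {name: rng.random((100, 100))
           for name in ("earthquake", "wildfire", "hurricane", "tornado", "flood")}

# A cell is a hot spot for a hazard if it falls in that hazard's top 10%
hot = {name: layer >= np.percentile(layer, 90) for name, layer in hazards.items()}

# Overlay the masks to locate cells exposed to two or more hazards
overlap = sum(mask.astype(int) for mask in hot.values())
print(f"{(overlap >= 1).mean():.0%} of cells are hot spots for at least one hazard")
print(f"{(overlap >= 2).mean():.0%} of cells are multihazard hot spots")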

When scientists or policymakers look at exposure risk to natural hazards, population density is often a key factor—for instance, the number of people who would be affected by a tornado. But in this study, the researchers focused more on the presence of structures, information they obtained from Zillow’s housing and property database. Their analysis included buildings like homes, stores, schools, and hospitals.

The researchers found that hazard hot spots cover about a third of the country but contain about 57% of the nation’s structures. This is especially the case in earthquake- and hurricane-prone areas, where the density of structures has increased faster than the national trend.

What’s more, there are many structures that are at risk for more than one natural hazard. In the western United States, earthquakes and wildfires could occur in the same area, and floods and tornadoes (and sometimes hurricanes) can threaten the middle and southeastern regions. The explosion of development over 7 decades has ballooned the number of structures at risk of multiple hazards from around 173,000 in 1945 to more than 1.5 million in 2015.

The authors note that development patterns should be taken into consideration to fully capture the risk from natural hazards. And they explain that as climate continues to change, monitoring the occurrence and intensity of weather-related events will help refine the hazards of the future. (Earth’s Future, https://doi.org/10.1029/2020EF001795, 2021)

—Sarah Derouin, Science Writer

Tree Rings Record a Newly Identified Extreme Solar Activity Event

EOS - Wed, 08/11/2021 - 11:53

This is an authorized translation of an Eos article.

太阳持续不断地放射出高能粒子流,其中一部分可以到达地球。这种粒子流的密度和能量构成了空间天气的基础,它会干扰卫星和其他航天器的运行。该领域一个尚未解决的关键问题是,太阳发射的高能粒子爆发达到什么频率,其强度会足以破坏或摧毁太空电子设备。

确定此类事件发生率的一个很有前景的方法是树木年代学记录。这种方法依赖于太阳能量粒子(solar energetic particle, SEP)撞击大气的过程,该过程引发连锁反应,产生碳-14原子。这种原子随后可以被整合到树木的结构中;因此,树木年轮中碳-14原子的浓度可以指示特定年份中SEP的影响率。

To date, three extreme SEP production events have been described in detail in the literature, occurring around 660 BCE, 774–775 CE, and 992–993 CE. Each was an order of magnitude more intense than anything measured during the space age. Miyake et al. describe an event that occurred between 5411 and 5410 BCE. Because of this burst, atmospheric carbon-14 in the Northern Hemisphere increased by 0.6% per year, and it took several years to fall back to normal levels.

The authors inferred the event’s existence from tree samples collected in three widely separated regions: bristlecone pines from California, Scots pines from Finland, and European larches from Switzerland. Individual tree rings were isolated from each sample, and the material in each ring was measured by accelerator mass spectrometry to determine its carbon-14 content.

Using statistical methods, the researchers identified a pattern of small carbon-14 fluctuations consistent with the Sun’s 11-year cycle; the event recorded in the rings occurred at a peak in solar activity. Notably, other evidence suggests that the Sun was also experiencing a decades-long period of heightened activity at the time.

If an extreme SEP burst was indeed responsible for the extra carbon-14, these observations could also help in predicting future events. However, tree ring measurements cannot rule out other extraterrestrial causes, such as a nearby supernova explosion. The authors argue that isotopic measurements of beryllium and chlorine extracted from ice cores will be needed to reach a definitive conclusion. (Geophysical Research Letters, https://doi.org/10.1029/2021GL093419, 2021)

—Morgan Rehnberg, Science Writer

This translation was made by Wiley.


Need for Rational Thinking for Predicting Floods and Droughts

EOS - Tue, 08/10/2021 - 14:34

Natural disasters such as floods and droughts have affected Earth and shaped human activities throughout the planet’s long geologic history and our more recent human history. The stakes in managing these risks are now higher because of the economic and social costs floods impose on densely populated centers and the drastic effects of droughts on global food security. The problems have taken on urgency because of rapid urban population growth and climate change affecting regional and local weather. Advances in scientific knowledge will help with such problems, but it is not clear that simply following the science is enough for policymaking. Di Baldassarre et al. [2021] postulate that the answers are not simple, and the team provides a range of citations to support their reasoning and for follow-up reading. They recommend that, because multiple disciplines contribute to the needed policy-relevant science, we pursue integrated interdisciplinary research and methods to cope with uncertainty. The challenge remains how to encourage politicians to make policy through rational thinking based on these ideas.

Citation: Di Baldassarre, G., Cloke, H., Lindersson, S., Mazzoleni, M., Mondino, E., Mård, J., et al. [2021]. Integrating multiple research methods to unravel the complexity of human-water systems. AGU Advances, 2, e2021AV000473. https://doi.org/10.1029/2021AV000473

—Tissa Illangasekare, Editor, AGU Advances

 

Desert Life Conjures Organic Carbon from Thin Air

EOS - Tue, 08/10/2021 - 13:12

Photosynthesis is thirsty work—it requires just as much water as it does carbon dioxide, and in deserts, it can all but shut down. Without the organic carbon photosynthesis provides, life in arid climes must either compete for scraps blown in from afar or wait for rain. But despite the twofold challenge of drought and starvation, microbes in many desert soils somehow manage not only to survive but to flourish.

“The enigma has always been: Why are deserts diverse?” said Sean Bay, a microbial ecologist at Monash University in Melbourne, Australia. “Why do we see so many rich microbial communities?”

Bay and his colleagues may have found an answer in the hyperarid soils of Israel’s Negev Desert, where microbes are pulling off a metabolic magic trick. By “burning” traces of hydrogen gas scavenged from the air, they can scrape together enough energy to survive dry spells—and some can even use hydrogen to fuel carbon fixation. The researchers announced their findings in The ISME Journal: Multidisciplinary Journal of Microbial Ecology.

The Negev Desert is a natural experiment in how microbes adapt to aridity. “Over a relatively short spatial scale—[the farthest sampled] soils were approximately 160 kilometers apart—you have got distinct climatic zones,” said Bay. Driving south from the Judea Hills, one watches the landscape of green chaparral give way to dramatic swaths of chalky brown and tan. The researchers gathered 72 soil samples along this natural climatic gradient for analysis in their Melbourne lab, where they hoped to discover genetic or chemical clues explaining the unexpected diversity of microbial communities in arid environments.

Desert Microbes Run On Hydrogen Fuel

Specifically, Bay and his colleagues wanted to find out how the Negev Desert microbes might be using hydrogen for survival.

“In these large swaths of arid ecosystems, trace gas metabolism is likely a really important part of microbial metabolism,” explained Laura Meredith, an environmental scientist at the University of Arizona who was not involved in the new research. Trace gases like hydrogen and methane are naturally present in the air, together accounting for about 0.1% of atmospheric gases. Some microbes have specialized enzymes that can capture trace gases and exploit them for energy when there are few other resources available.

Previous studies showed that microbes can use hydrogen to run their life-support systems while waiting for favorable conditions in a kind of stasis, or dormancy. And Bay suspected that hydrogen might be fueling carbon fixation in deserts, too.

“Something seemed off about the accepted model,” he said. “These [photosynthesizing] communities of cyanobacteria—which are really, really low abundance in these soils—are providing enough energy, or organic carbon?”

Bay’s bet on hydrogen appears to have been justified. He and his colleagues discovered that genes associated with hydrogen metabolism were widespread across the samples and enriched in samples from drier soils. Microbes inhabiting the driest soils consumed hydrogen 143 times faster than those in samples collected from the greener Judea Hills. The research team even found evidence that soil microbes from across the climatic gradient will “burn” hydrogen to power carbon fixation as a supplement to photosynthesis when provided with a bit of water.

“It’s like adding another ecological player, another strategy,” said Meredith.

Implications for the Carbon Cycle

Bay saw the results as evidence that trace gas metabolism is far more widespread than previously thought—not a niche process used by a handful of exotic bacteria, but something that takes place across entire ecosystems. And according to Meredith, microbes that use trace gases like hydrogen to maintain and create new biomass could also be tinkering with Earth’s carbon cycle.

“Carbon cycling in arid ecosystems, we know, is a leading contributor to overall carbon cycle variability at a global scale,” she said, “so anything that’s contributing to carbon fixation or carbon stabilization in the massive swaths of arid lands around the world is also important.”

Bay agreed. “I think that’s a really exciting part of this research…it’s not just about discovering curious new ecosystems or curious new processes. There are actually really important implications.”

—Elise Cutts (@elisecutts), Science Writer

The Chicxulub Impact Changed the Biodiversity of Tropical Rainforests Forever

EOS - Tue, 08/10/2021 - 13:10

This is an authorized translation of an Eos article.

The cradles of life in the planet’s neotropical regions remain a mystery to geologists and paleontologists. But new research provides some clues, suggesting that the Neotropics, a geographic area comprising Central and South America, the Caribbean, and the southern tropical regions of North America, were significantly different before and after the Chicxulub asteroid impact 66 million years ago.

Neotropical rainforests are home to more than 90,000 plant species, nearly 50% of the planet’s total biodiversity. These ecosystems produce oxygen at high rates and sequester carbon dioxide from the atmosphere, helping to balance the global climate.

Mónica Carvalho, a Colombian paleobiologist at the Smithsonian Tropical Research Institute in Panama, has spent the past 12 years exploring the forests and coal mines of her native country. With each trip, she has collected thousands of fossil leaves and rocks containing microscopic pollen, both evidence of ancient plant life.

Mónica Carvalho, pictured, has collected thousands of rare fossils from Colombia’s neotropical landscape. Credit: Fabiany Herrera

Carvalho and her team amassed a collection of fossils from the Cretaceous and the Paleocene, periods of geologic time separated by an asteroid impact that killed 75% of living species. The team quantified which plants disappeared and which survived the event, as well as what kinds of changes the ecosystem underwent.

Their results indicated that during the Cretaceous, neotropical forests were characterized by open canopies and were dominated by ferns, pines, and some flowering plants. But the landscape changed drastically in the Paleocene, with forests dominated mainly by flowering plants and legumes beneath closed canopies. This pattern reflects the dense, dark forests we know today.

The paper was published in Science.

Solving the Puzzle

Most of the existing data about what happened after the Chicxulub impact come from research in North America and Patagonia, mainly because it is “relatively easy” to find intact fossils there, said Viviana Barreda, a paleontologist at the Bernardino Rivadavia Argentine Museum of Natural Sciences in Buenos Aires. The landscape there is “almost a steppe without vegetation covering it, the dream of every geologist and paleontologist,” said Barreda, who was not part of the study.

But finding preserved fossils in the tropics is extremely difficult because of the abundant vegetation covering the ground. In addition, the tropics’ high levels of oxygen and humidity rapidly degrade pollen and spores, reducing the likelihood that they will be preserved.

For Paula Sucerquia, a geologist at the Federal University of Pernambuco in Brazil who was not involved in the new study, “this problem created a gap in Colombia’s paleontological history,” which is why researchers did not know what had happened to plants in the Neotropics after the impact. Carvalho’s results, however, “contribute important information about the missing pieces of the paleontological puzzle.”

The analysis included 50,000 samples of fossilized pollen from 39 Colombian localities and more than 6,000 samples of fossilized leaves from the municipalities of Guaduas and Bogotá and from the Cerrejón coal mine in La Guajira.

(Left) Fossils like this one are rare because the neotropical climate discourages their formation; (right) Mauricio Gutiérrez collects Maastrichtian fossils at a coal mine in Cundinamarca. Credit: Fabiany Herrera

Fossilized plants and pollen provide information about what happened in a specific place at a specific moment. “They are like a photograph. They stay frozen [in time],” said Sucerquia.

The researchers also found evidence of how insects changed their feeding patterns. Whereas some Cretaceous plant species were bitten selectively, Paleocene plants showed far more damage. “This shows that the forest changed not only in its structure and plant species but also in its ecological interactions,” said Carvalho.

Looking Back to Move Forward

After the asteroid impact, ash and sediments containing minerals such as nitrogen and phosphorus blanketed the ground, increasing soil fertility and allowing more adaptable species, such as flowering plants and legumes, to take over the forests. In addition, the extinction of the giant herbivorous dinosaurs allowed trees to grow closer together, forming dense patches of vegetation, the researchers said.

These changes were probably not immediate, said Barreda. Fluxes of greenhouse gases also altered the composition of forests across the planet.

For Carvalho, the study’s results show how tropical ecosystems can recover after ecological catastrophes, but at the same time, they demonstrate how slow the recovery process is.

“It took life around 6 million years to return, and although the ecological catastrophe humans are causing through deforestation is not of the same magnitude, it follows the same path,” she said. “Life will definitely return, but will we be able to wait?”

—Humberto Basilio (@humbertobasilio), Science Writer

This translation by Mauro González Vega (@MGonVe) and @Anthnyy was made possible by a partnership with Planeteando.

What Five Graphs from the U.N. Climate Report Reveal About Our Path to Halting Climate Change

EOS - Mon, 08/09/2021 - 18:03

It has been 8 years, one pandemic, and a slew of wildfires, storms, and heat waves since the last United Nations climate assessment report was released in 2013. During that time, 191 parties signed the Paris Agreement; the United States (the world’s second-largest emitter) left and reentered the agreement; renewable energy outpaced coal in the United States and all fossil fuels in Europe for the first time; and greenhouse gas emissions crashed worldwide during stay-at-home orders before springing back.

It is with this backdrop that the Intergovernmental Panel on Climate Change (IPCC) unveiled its new assessment of global climate science.

Started in 1988 by the U.N. Environment Programme and the World Meteorological Organization, the IPCC supplies policymakers with policy-neutral information about climate change. The IPCC does not conduct its own research: It summarizes the work of global experts and notes where disagreements lie. More than 200 authors from 66 countries in the organization’s Working Group I wrote the latest report. The document includes more than 14,000 cited references. All eyes are turning to October’s U.N. Climate Change Conference of the Parties (COP26) in Glasgow, Scotland, where the latest report will inform negotiations.

The report predicts that warming will reach 1.5°C by the early 2030s, exceeding the lower goal of the Paris Agreement. How much further temperatures rise will depend on emissions. Each of the world’s top three emitters—China, the United States, and the European Union—has goals to slow the rate of emissions this decade.

The IPCC report spells out what could happen if we don’t meet these targets: The Arctic could be ice free by mid- to late century. Sea level could rise by a meter by 2100, inundating cities. And extreme heat waves could become more intense and frequent.

Here are five takeaways.

1. Global Warming Thus Far

For the past 2,000 years, global surface temperatures stayed relatively constant until an unprecedented rate of warming began in the mid-20th century. Today, the planet’s temperature is 1.09°C (0.95°C to 1.20°C) above what it was in 1850–1900. Historical data came from paleoclimate archives, and recent observations are direct measurements. Shading shows 5% and 95% confidence intervals for historical measurements. Credit: Jenessa Duncombe. Source: IPCC [2021]

The takeaway: The world has warmed 1.1°C compared to preindustrial levels, and regional hot spots already feel the heat, but we have not surpassed the Paris Agreement goal of limiting warming to 1.5°C or 2°C.

In the past 100,000 years, Earth has been this warm only once. Around 6,500 years ago, the planet’s temperatures were about on par with what they are today. The difference? That warming was part of an ebbing and flowing cycle of ice sheets from natural variation of Earth’s orbit. Today’s temperatures come from pollution that will continue to grow unless we hit the brakes.

Today, some areas on Earth have already warmed beyond 2°C. The Washington Post reported in 2019 that 71 counties in the United States have already warmed past 2°C. Temperatures in the Arctic are rising at least twice as fast as the rest of the world. Islands are particularly at risk: The rallying cry for 1.5°C originated from an alliance of 44 small island states that commissioned a study in 2008 and became alarmed that 2°C warming would threaten their survival.

Previous climate agreements favored a 2°C rise, but mounting evidence suggests that keeping temperatures to a 1.5°C rise would greatly reduce extreme heat, instances of extreme precipitation and drought, sea level rise, species loss and extinction, and ocean acidification.

Global temperatures have a 20% chance of reaching 1.5°C above preindustrial levels during at least one of the next five years, according to the U.K. Met Office and the World Meteorological Organization.

2. Future Warming Pathways

The global average temperature at the end of the century will be determined by the amount of greenhouse gas emissions over the next several decades. The two shared socioeconomic pathways (SSPs) that stay below 2°C (very low emissions and low emissions) require net zero emissions by mid- to late century plus carbon removal. There are five scenarios: very low emissions (SSP1-1.9), low emissions (SSP1-2.6), midlevel emissions (SSP2-4.5), high emissions (SSP3-7.0), and very high emissions (SSP5-8.5). Shading shows the 5% and 95% confidence intervals. Credit: Jenessa Duncombe. Source: IPCC [2021]

The takeaway: Keeping warming below 2°C, and perhaps 1.5°C, is still possible; it’ll take immediate and sustained emissions cuts.

Future illustrative scenarios of warming are one of the hallmarks of IPCC reports. The scenarios include natural forcing like solar activity and volcanoes, along with social and economic forces that drive greenhouse gas emissions, land use, climate mitigation, and air pollution.

The scenarios aren’t predictions; they can’t determine the fate of global warming. Instead, they provide road maps. The scenarios often underpin international policy, research, and activism for years to come.

The new report has five scenarios: two with low emissions, one with intermediate emissions, and two with high emissions. The very low emissions scenario meets the 1.5°C Paris Agreement goal with likely warming of 1.4°C by 2100—but it overshoots the target to just above 1.5°C midcentury before decreasing to 1.4°C. The low emissions scenario reaches 1.8°C by 2100, just skirting under the high bounds of the Paris Agreement. Midlevel emissions hit 2.7°C, high emissions clock in at 3.6°C, and very high emissions extend to 4.4°C in 2100.

Climate scientist and IPCC Working Group I cochair Valérie Masson-Delmotte said that the midlevel emissions scenario most closely resembles the pledges made by countries to plateau emissions until around 2030. The highest emissions scenarios represent futures without any climate mitigation.

The last IPCC assessment in 2013 included just one low emissions scenario that kept warming under 2°C.

3. Carbon Dioxide’s Oversized Footprint

Aerosols and land use changes cool global climate, whereas greenhouse gases warm it. There are five scenarios: very low emissions (SSP1-1.9), low emissions (SSP1-2.6), midlevel emissions (SSP2-4.5), high emissions (SSP3-7.0), and very high emissions (SSP5-8.5). Shading within the total (observed) warming shows global temperature rise to date. Global surface temperature is measured relative to 1850–1900. Credit: IPCC [2021], Figure SPM.4

The takeaway: Net zero carbon dioxide (CO2) is a requirement for any long-term climate solution.

Greenhouse gases include CO2, methane, nitrous oxide, and fluorinated gases. When headlines or politicians talk about “net zero carbon” or “carbon neutral,” it may seem like they’re leaving out other greenhouse gases. But although most climate targets aim to reduce greenhouse gas emissions as a group, the essential ingredient is carbon dioxide.

The graph above illustrates why.

Warming is overwhelmingly controlled by the amount of carbon dioxide in the atmosphere. There is a nearly linear relationship between cumulative carbon dioxide emissions and the rise in global surface temperature. The latest report even has a number for it: Every 1,000 gigatonnes of cumulative CO2 emissions (GtCO2) will likely cause planetary warming of about 0.45°C.
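In symbols, the relationship is a single proportionality (a back-of-the-envelope restatement of the report’s best estimate, not an equation reproduced from it):

    \Delta T \approx \lambda \, E_{\mathrm{cum}}, \qquad \lambda \approx \frac{0.45\,^{\circ}\mathrm{C}}{1000\ \mathrm{GtCO_2}}

For example, holding additional warming to about 0.4°C would permit on the order of (0.4/0.45) × 1,000 ≈ 900 GtCO2 of further cumulative emissions.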

4. Annual Carbon Dioxide Emissions

In the very low emissions scenario, annual carbon dioxide emissions drop practically to zero by 2050. In the low scenario, CO2 emissions drop to zero between 2070 and 2080. The other scenarios never achieve zero CO2 emissions. Credit: Jenessa Duncombe. Source: IPCC [2021]

The takeaway: The most aggressive scenario to limit warming requires sharp CO2 cuts per decade, net zero CO2 by 2050, and carbon capture.

Carbon emissions come from burning oil, gas, and coal; these fossil fuels drive heating, electricity, agriculture, land use, industry, and transport.

During COVID-19, emissions fell an unprecedented 2.6 GtCO2 in 1 year, according to research published in Nature Climate Change in 2021. Because the emissions cuts during the pandemic were temporary, those reductions won’t have any detectable effect on CO2 concentrations or temperature. The researchers of the Nature Climate Change study predict that emissions cuts of about this scale (1–2 GtCO2) are necessary at least through the 2020s to meet the Paris Agreement.
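For scale, set that one-time drop against the roughly 37 GtCO2 of fossil CO2 the world emitted annually before the pandemic (a figure from the broader emissions literature, not from this article):

    \frac{2.6\ \mathrm{GtCO_2}}{\sim 37\ \mathrm{GtCO_2}\,\mathrm{yr}^{-1}} \approx 7\%

In other words, meeting the Paris targets would mean achieving a pandemic-sized cut and then repeating one of similar size every year through the decade.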

5. Carbon Extraction

Carbon dioxide typically stays in the atmosphere for centuries to millennia, but carbon removal accelerates the natural cycle to store excess carbon in soil, plants, or water. A simplified computer model shows how long Earth systems take (years to centuries) to rebound following peak CO2 emissions (vertical gray dashed lines in each plot). Credit: IPCC [2021], FAQ 5.3

The takeaway: The two scenarios in the report that limit warming below 2°C use carbon removal from the atmosphere during the latter part of the century.

Carbon naturally cycles through the soil, water, plants, and air continuously. We can draw carbon out of the atmosphere by planting trees, sequestering carbon in agricultural soil, restoring ocean ecosystems that store carbon, and applying carbon capture and storage technology.

Model simulations in the latest report suggest that removing carbon dioxide from the atmosphere drops temperatures in just a matter of years.

Although some carbon removal methods show promise, the practice remains in the research and development phase and would require deployment at massive scales, according to the report. Carbon capture could cause undesirable effects such as losses of biodiversity, water, or food production.

More to Come

The report by Working Group I on the physical science is one of four expected over the next year; reports from Working Group II in February 2022 and Working Group III in March 2022 will explore the impacts of climate change and mitigation, respectively. The synthesis report in November 2022 will combine all findings.

—Jenessa Duncombe (@jrdscience), Staff Writer

References

Intergovernmental Panel on Climate Change (IPCC) (2021), Summary for policymakers, in Climate Change 2021: The Physical Science Basis. Contribution of Working Group I to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change, edited by V. Masson-Delmotte et al., Cambridge Univ. Press, Cambridge, U.K., in press.

Examining the Intricacies of Ozone Removal by Deciduous Forests

EOS - Mon, 08/09/2021 - 13:28

Ozone plays a vital role in Earth’s climate system. In the stratosphere, which begins about 6 miles (9.7 kilometers) off the ground, ozone protects the planet from harmful ultraviolet radiation. Lower in the atmosphere, however, the molecule is an air pollutant injurious to both humans and plants, as well as a greenhouse gas.

Ozone interacts with forests through a process known as dry deposition, often with harmful consequences. In this process, turbulence in the atmospheric boundary layer brings ozone to the surface where reactions on and inside leaves and soil remove ozone from the air. Ozone injury to plants results from ozone reactions inside leaves and can alter carbon and water cycling.

The mechanics of dry deposition are not completely understood, however. While we know that turbulent eddies in the atmosphere transport ozone to surfaces onto which the gas can be deposited, one remaining question is whether the organized nature of these eddies, known as organized turbulence, influences dry deposition. Uncertainty related to the mechanics of dry deposition makes it harder to understand ozone in the lower atmosphere and ozone’s impacts on both plants and humans.

In a new study, Clifton and Patton use high-resolution computer simulations to examine the relationship between turbulent eddies and leaf ozone uptake. The authors hypothesized that organized turbulence generates local fluctuations in temperature, wind, and humidity that together with local changes in ozone might result in different rates of ozone uptake by leaves. They call this variation in leaf uptake “segregation of dry deposition.” By taking segregation of dry deposition into account, scientists can better predict ozone dry deposition, the authors say.
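One standard way to make that hypothesis quantitative (a textbook turbulence definition, not notation taken from the paper itself) is a segregation intensity between the local ozone concentration c and the local deposition velocity v_d:

    I_s = \frac{\overline{c'\,v_d'}}{\overline{c}\;\overline{v_d}}

where primes denote fluctuations about the mean and overbars denote averages. If I_s > 0, ozone-rich air motions coincide with efficient leaf uptake and canopy-scale deposition is enhanced; I_s ≈ 0 means no enhancement, which corresponds to the null result described below.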

The results showed that organized turbulence did not create more efficient areas of ozone uptake in the forest canopy. In other words, higher concentrations of ozone in some air motions together with higher leaf uptake in the same air motions did not result in more ozone uptake by the canopy. Therefore, the findings are a null result and indicate that segregation of dry deposition is likely an unimportant factor in a forest’s ozone budget. Null results are less likely to be published but play an essential role in figuring out important natural processes. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1029/2021JG006362, 2021)

—Aaron Sidder, Science Writer

Satellites Support Disaster Response to Storm-Driven Landslides

EOS - Mon, 08/09/2021 - 13:28

High winds and flooding storm surges driven by tropical cyclones cause some of the deadliest and most damaging weather-related conditions around the world. The rainfall that cyclones bring compounds these conditions and, in hilly or mountainous areas, can trigger landslides that cause even more widespread and devastating impacts. When extreme precipitation occurs over short time frames, hillslopes may become saturated and critically unstable. The most intense storms can trigger thousands of landslides in mountainous areas, as was dramatically illustrated in Puerto Rico in September 2017, when Hurricane Maria’s rains left the landscape scarred by roughly 40,000 landslides.

Before and during a major cyclone, disaster responders need information about where landslides are likely to occur. In the aftermath, locating landslides quickly helps authorities direct resources to where they are most needed to save people and critical infrastructure. However, this information is often unavailable during an event response or is presented only for small regions, constraining the effectiveness of response efforts.

Satellite data provide regional perspectives of landslide hazards from major cyclones. New satellite constellations (groups of satellites functioning together) can image most of the world every day, helping scientists to detect the locations of landslides as they occur. In addition, using satellite data to characterize extreme rainfall allows us to model where landslides might occur in near-real time.

Our research group at NASA is developing a range of tools to help prepare for and respond to landslide disasters. These tools include rapid landslide mapping as well as modeling at different timescales to help direct response activities soon after a tropical storm dissipates and inform decisionmaking throughout the disaster life cycle.

Recently, we supported the NASA Disasters Program to deliver experimental products during significant disasters. In late 2020, a pair of major hurricanes allowed us, for the first time, to combine and test all our research in response to real events in real time and to gain valuable feedback from end users about the efficacy and utility of the tools we are developing.

Eta, Iota Delivered a One-Two Punch

The busy 2020 Atlantic hurricane season gave rise to 30 named storms—the highest number on record—including 13 hurricanes. In late October, warm sea surface conditions in the Caribbean caused Tropical Storm Eta to intensify rapidly. Eta made landfall south of Puerto Cabezas, Nicaragua, as a category 4 hurricane on 3 November. The storm tracked west and then northeast over Honduras and Guatemala before it reentered the Caribbean Sea and continued northeast. The storm strengthened again before making further landfalls in Cuba and Florida. Intense rainfall led to catastrophic flooding and landslides throughout Central America, with Guatemala, Honduras, and Nicaragua facing the most severe impacts.

Less than a week after Eta dissipated, Tropical Storm Iota became the final named storm of the season on 13 November. Iota also strengthened rapidly, becoming a category 5 hurricane by 16 November. The storm weakened marginally to a category 4 storm prior to hitting Nicaragua on 17 November, but the location of landfall was only 25 kilometers south of where Eta had first landed. The combined damage from Eta and Iota has been estimated at over $8 billion, with Eta causing more than $6 billion in damage in Central America.

Landslides resulting from Hurricane Eta claimed lives in Costa Rica, Guatemala, Honduras, and Nicaragua. One particularly deadly landslide killed more than 100 people in the village of Queja, Guatemala. Hurricane Iota also triggered fatal landslides in Honduras and Nicaragua, and some of the storm’s outlying rainbands even triggered deadly landslides in Colombia. Combined, the two storms led to hundreds of landslide-related fatalities, and although not as deadly as Hurricane Mitch in 1998, Eta and Iota were among the worst disasters to strike Central America in several decades.

A Real-World Test for Models and Observations

The dire conditions from the storms prompted immediate disaster response and recovery (DRR) efforts across the affected regions by various national and international agencies. During these efforts, we provided partner agencies with the latest versions of our landslide tools, putting them into practice in real experimental settings. These tools fall into two categories: predictive models of landslide effects before and during events and rapid observations of landslide locations in the immediate aftermath.

The Landslide Hazard for Situational Awareness (LHASA) tool, first deployed in 2017, provides near-real-time nowcasts of locations where landslide hazards are elevated by comparing precipitation data from the Global Precipitation Measurement mission (GPM) from the past 7 days with long-term precipitation records and a global landslide susceptibility map [Kirschbaum and Stanley, 2018; Stanley and Kirschbaum, 2017].

Fig. 1. This map illustrates results from the Landslide Hazard for Situational Awareness model, which assesses landslide hazard (purple shades) and population exposure to these landslide hazards (teal), during Tropical Storm Eta on 5 November 2020. Credit: NASA Earth Observatory

The latest version of the LHASA model uses a machine learning approach to estimate probabilistic landslide hazards around the world every 3 hours. We combine this hazard output with data sets detailing population and infrastructure to estimate landslide exposure—a critical step forward that allows stakeholders to make actionable decisions (Figure 1). Combining geophysical hazard science with socioeconomic data mirrors the shift in the broader natural hazards research community toward more interdisciplinary disaster research and is in line with the objectives of the United Nations’ Sendai Framework for Disaster Risk Reduction 2015–2030, which recognizes the increasing complexity of disasters and diversity of impacts on human systems.
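The decision rule of the tool’s first version can be sketched in a few lines (the threshold comparison, variable names, and cutoff value here are illustrative assumptions rather than NASA’s operational code):

    import numpy as np

    def landslide_nowcast(rain_7day, rain_p95, susceptibility, susc_min=0.5):
        # Elevate the nowcast where the past week's rainfall exceeds a
        # long-term percentile AND mapped susceptibility is high.
        extreme_rain = rain_7day > rain_p95
        prone_terrain = susceptibility > susc_min
        return extreme_rain & prone_terrain  # boolean hazard mask

Overlaying the resulting mask on gridded population, as in Figure 1, then yields a first-order exposure estimate.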

During Eta and Iota, we also tested our landslide forecast product, which uses rainfall forecast data from the NASA Goddard Earth Observing System (GEOS) suite of computerized forecasts to model locations where intense rainfall is likely to trigger landslides in the coming days. Although this data product remains experimental, the landslide forecasts it produced for these hurricanes showed moderate and high levels of forecasted hazard in areas where landslides were reported or observed. We anticipate that this landslide forecast may address crucial needs of disaster response stakeholders prior to major rainfall events in the future.

Images of the Aftermath

As clouds clear after a cyclone, the full extent of the damage becomes clearer. Satellite radar data have been used to detect changes in topography due to landslides, even through clouds. However, optical imagery remains the most definitive source to determine landslide locations and effects on critical infrastructure like hospitals, power stations, and schools.

Our team recently developed a computerized method called Semi-Automatic Landslide Detection (SALaD) [Amatya et al., 2021], which uses 3-meter-resolution imagery from Planet Labs and 10-meter-resolution Sentinel satellite imagery to rapidly map landslides (Figure 2). Planet Labs’ satellites image much of the world once a day, meaning images can be obtained and landslides can be mapped almost as soon as clouds clear after an event.

Fig. 2. These photos collected by the Sentinel-2 satellite highlight a major landslide event at Queja, Guatemala, triggered by Tropical Storm Eta. The site (a) before the landslide and (b) between Hurricanes Eta and Iota. Major landslide damage can be seen in the center of the image. Credit: Contains modified Copernicus Sentinel data 2020, processed by the European Space Agency
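The vegetation-loss signal that such mapping exploits can be illustrated with a simple differencing sketch (an assumed NDVI-based toy, not the object-based SALaD pipeline itself):

    import numpy as np

    def landslide_candidates(red_pre, nir_pre, red_post, nir_post, drop=0.25):
        # Fresh landslides strip vegetation, so a sharp drop in NDVI
        # between pre- and post-event images flags candidate pixels.
        # The 0.25 threshold is illustrative.
        eps = 1e-6  # guard against division by zero
        ndvi_pre = (nir_pre - red_pre) / (nir_pre + red_pre + eps)
        ndvi_post = (nir_post - red_post) / (nir_post + red_post + eps)
        return (ndvi_pre - ndvi_post) > drop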

Using the hazard model results obtained from LHASA in conjunction with media reports, we can home in on regions of interest quickly to find areas most affected by landslides and identify where landslide mapping would be beneficial—in some cases within 2 days. This identification and mapping can help estimate impacts on rural communities that have been cut off from outside communication. In addition, landslide mapping guided by hazard estimates can help us discover landslides in unpopulated areas to provide more comprehensive regional maps of triggered landslides.

Information When It’s Needed

On the basis of informal discussions with disaster responders, the most critical time frame for disaster response information is within a week of an event. The rapid return periods of Planet Labs imagery allow us to build maps of landslide locations well within this time frame, providing information when it is most needed. Hurricanes Eta and Iota illustrated the value of this rapid response—we were able to map landslides even in the short cloud-free interval between the end of Hurricane Eta and the time before Iota made landfall, which allowed us to distinguish the effects of the two storms.

The purpose of developing the hazard and exposure estimates is to provide stakeholders with information to help support decisionmaking during major catastrophes, information that is especially valuable for remote regions and areas in which little information is available from the ground. During the two hurricane events last fall, we worked with the NASA Disasters team to help inform partner agencies assisting in DRR efforts in the region about potential landslide impacts. Our partners included the U.S. Department of Defense Southern Command (SOUTHCOM), the intergovernmental Coordination Center for the Prevention of Disasters in Central America and the Dominican Republic (CEPREDENAC), and the Pacific Disaster Center (PDC), an applied research center managed by the University of Hawaii.

This was the first time that each of the data products described above was provided to end users to support disaster response during an event. Some of the new users from CEPREDENAC and SOUTHCOM indicated that they valued having the regional perspective on landslide hazard and exposure provided by the tools. In particular, they emphasized that combining exposure information with hazard assessments helped them to prioritize the distribution of resources during their responses. In addition, users stated that the landslide forecast information may be the most useful tool for future events.

Field testing of our research is the best way to learn quickly what tools provide actionable information at relevant timescales and where additional work is needed. And with each test, we can incorporate feedback iteratively to keep improving the tools and guiding future developments.

More work is needed to ensure that all relevant hazard and exposure information reaches authorities and the public in areas at high risk of cyclone-driven landslides. However, we suggest that the combined suite of products described here can serve decisionmakers, especially those facing a dearth of detailed information from local observations, as valuable tools of triage in determining where emergency response is needed.

The Auroral E-Region Is a Source for Ionospheric Scintillation

EOS - Mon, 08/09/2021 - 11:30

Scintillations are random fluctuations of radio signal amplitudes and/or phases caused by irregularities in the ionosphere, which impact Global Positioning System (GPS) signals. Makarevich et al. [2021] used data covering a period of 166 days from the incoherent scatter radar at Poker Flat, Alaska (PFISR), and nearby GPS receivers to examine the generation mechanisms and possible source regions of ionospheric scintillations.

Scintillations have traditionally been described using the S4 (amplitude) and σϕ (phase) indices, but when these are unavailable, a proxy, the rate of change of total electron content index (ROTI), is often used. The authors find that ROTI exhibits significant correlation and an approximately linear relationship with the phase scintillation metric σϕ in the auroral region, while the amplitude scintillation index S4 shows no relationship with either ROTI or σϕ. The probability of high scintillation measured using ROTI or σϕ also increases with auroral activity. A strong connection between auroral particle precipitation into the E-region and scintillation (ROTI and σϕ) was noted, indicating that the ionospheric E-region is a key source region for phase scintillation at auroral latitudes. The authors also showed that, for one event, scintillations occurred on the trailing edge of a well-defined propagating density enhancement in the E-region, suggesting that the gradient-drift instability was a possible candidate for the plasma structuring and scintillations.
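For reference, the ROTI proxy is conventionally computed as the standard deviation of the rate of change of TEC over a short window; the 30-second sampling and 5-minute window below are common conventions, not parameters taken from the paper:

    import numpy as np

    def roti(tec, dt_seconds=30.0, window=10):
        # ROT: time derivative of total electron content, in TECU/min.
        rot = np.diff(tec) / (dt_seconds / 60.0)
        # ROTI: standard deviation of ROT over a sliding window
        # (10 samples at 30 s = 5 minutes).
        return np.array([rot[i:i + window].std()
                         for i in range(len(rot) - window + 1)])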

This paper adds to the growing body of evidence that ROTI is a useful proxy for phase scintillation and that the ionospheric E-region is an important source region for ionospheric scintillations at auroral latitudes.

Citation: Makarevich, R. A., Crowley, G., Azeem, I., Ngwira, C., & Forsythe, V. V. [2021]. Auroral E-region as a source region for ionospheric scintillation. Journal of Geophysical Research: Space Physics, 126, e2021JA029212. https://doi.org/10.1029/2021JA029212

—Michael P. Hickey, Editor, JGR: Space Physics
