Feed aggregator

How the “Best Accidental Climate Treaty” Stopped Runaway Climate Change

EOS - Thu, 09/02/2021 - 12:29

The international treaty that phased out the production of ozone-depleting chemicals has prevented between 0.65°C and 1°C of global warming, according to research.

The study also showed that carbon stored in vegetation through photosynthesis would have dropped by 30% without the treaty, which came into force in 1989.

Researchers from the United Kingdom, New Zealand, and the United States wrote in Nature that the Montreal Protocol was essential in protecting carbon stored in plants. Studies in the polar regions have shown that high-energy ultraviolet-B (UVB) radiation reduces plant biomass and damages DNA. Forests and soil currently absorb 30% of human carbon dioxide emissions.

“At the ends of our simulations, which we finished around 2100, the amount of carbon which is being taken up by plants is 15% the value of our control world where the Montreal Protocol is enacted,” said lead author and atmospheric scientist Paul Young of Lancaster University.

In the simulation, the UVB radiation is so intense that midlatitude plants stop taking up carbon on net.

Plants in the tropics fare better, but humid forests would have 60% less ozone overhead than before, a state much worse than was ever observed in the Antarctic ozone hole.

A “World Avoided”

The study used a chemistry climate model, a weather-generating tool, a land surface model, and a carbon cycling model. It is the first to link ozone loss to declines in the plant carbon sink.

Chlorofluorocarbons (CFCs), the ozone-depleting chemicals phased out by the Montreal Protocol, are potent greenhouse gases. The study estimated that CFCs would warm the planet an additional 1.7°C by 2100. Taken together, the damage from UVB radiation and the greenhouse effect of CFCs would add 2.5°C of warming by the century’s end. Today, the world has warmed, on average, 1.1°C at the surface, leading to more frequent droughts, heat waves, and extreme precipitation.

In the “World Avoided” scenario, carbon dioxide levels in the atmosphere would reach 827 parts per million by the end of the century, roughly double today’s level (~412 parts per million).

The work analyzed three different scenarios: The first assumes that ozone-depleting substances stayed below 1960 levels when massive production kicked in. The second assumes that ozone-depleting chemicals peaked in the late 1980s before tapering off. The last assumes that ozone-depleting chemicals increase in the atmosphere every year by 3% through 2100.

The last scenario, called the “World Avoided,” assumes not only that the Montreal Protocol never happened but also that humans never realized CFCs were harming ozone, even as the effects became clear in the 2040s. The models also assume one kind of UVB damage to all vegetation, when in reality, plants react differently.

“Change Is Possible”

The ozone layer over Antarctica has stabilized and is expected to recover this century. Credit: Amy Moran/NASA Goddard Space Flight Center

“The Montreal Protocol is regarded as one of the most successful global environmental treaties,” said University of Leeds atmospheric scientist Martyn Chipperfield, who was not involved in the research. “CFCs and other ozone-depleting substances are potent greenhouse gases, and the Montreal Protocol is known for having real benefits in addressing climate change by removing previous levels of high CFCs from the atmosphere.”

The Kigali Amendment to the Montreal Protocol in 2016 brought climate change to the forefront. Countries agreed to gradually phase out hydrofluorocarbons (HFCs), which are used in applications such as air conditioning and fire extinguishing systems. HFCs originally replaced hydrochlorofluorocarbons (HCFCs) and CFCs because they do not harm ozone. Yet HFCs are potent greenhouse gases.

The Montreal Protocol was the “best accidental climate treaty,” said Young. “It is an example of where science discovered there was a problem, and the world acted on that problem.”

Injecting sulfate aerosols into the stratosphere has been proposed as one geoengineering solution to slow global warming. “People are seriously talking about this because it’s one of the most plausible geoengineering mechanisms, yet that does destroy ozone,” Young said. Calculating the harm to the carbon cycle is “the obvious follow-up experiment for us.”

The research highlights the importance of the U.N. Climate Change Conference of the Parties (COP26) this fall, which will determine the success of worldwide climate targets.

Immediate and rapid reductions in greenhouse gases are necessary to stop the most damaging consequences of climate change, according to the Intergovernmental Panel on Climate Change.

—Jenessa Duncombe (@jrdscience), Staff Writer

Heat Pumps Can Lower Home Emissions, but Not Everywhere

EOS - Thu, 09/02/2021 - 12:25

In 1855, engineer Peter von Rittinger was concerned with salt production. He was building a device that could evaporate water from brine more efficiently than available methods. Later iterations of this device, the heat pump, would become tools to slow climate change. Today heat pumps aim to replace a home’s in situ oil or gas consumption with cleaner electricity use.

Researchers recently found that wider installation of residential heat pumps for space heating could lower greenhouse gas emissions. The results, published in Environmental Research Letters, showed that heat pumps would reduce emissions for two thirds of households and financially benefit a third of U.S. homeowners.

But only around 10% of homes use heat pumps, which move heat out of the house in summer and into the house during winter. “The majority of heating in buildings, as well as hot water and cooking, relies on fossil fuels burned on site,” said Michael Waite, an associate research scientist at Columbia University who was not involved in the new study. To reduce emissions, homeowners need to replace such heating systems. “The only direct way of doing that is through electrification of those uses,” said Waite.

Pros and Cons

But wide-scale heat pump adoption may have unintended, undesirable consequences. Thomas Deetjen, a research associate at the University of Texas at Austin, and his coauthors wanted to see which circumstances make heat pumps a wise choice for homeowners and society.

Using tools from the National Renewable Energy Laboratory (NREL), they simulated outcomes of widespread heat pump adoption. They modeled 400 locally representative single-family homes in each of 55 cities. To model the electric grid, the researchers assumed moderate decarbonization of the grid (a 45% decline in emissions over the 15-year lifetime of a heat pump).
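At its core, such a simulation compares a furnace’s on-site combustion emissions with the emissions embodied in a heat pump’s electricity under a declining grid emission factor. The Python sketch below illustrates only that accounting; apart from the 45% grid decline, every number is a placeholder assumption of ours, not a value from the study or the NREL tools.

```python
# Illustrative comparison of furnace vs. heat pump emissions over a 15-year
# heat pump lifetime, with the grid emission factor declining linearly by 45%
# (the study's moderate-decarbonization assumption). All other numbers are
# placeholder assumptions, not values from the paper.
HEAT_DEMAND_KWH = 12_000        # annual useful heat per home (assumed)
FURNACE_EFFICIENCY = 0.95       # condensing gas furnace (assumed)
GAS_KG_CO2_PER_KWH = 0.20       # on-site combustion factor (assumed)
HP_COP = 3.0                    # heat pump coefficient of performance (assumed)
GRID_KG_CO2_PER_KWH_0 = 0.40    # grid intensity in year 0 (assumed)
YEARS, GRID_DECLINE = 15, 0.45  # study's moderate grid decarbonization

furnace_total = hp_total = 0.0
for year in range(YEARS):
    # Grid factor falls linearly from its year-0 value to 45% below it.
    grid_factor = GRID_KG_CO2_PER_KWH_0 * (1 - GRID_DECLINE * year / (YEARS - 1))
    furnace_total += HEAT_DEMAND_KWH / FURNACE_EFFICIENCY * GAS_KG_CO2_PER_KWH
    hp_total += HEAT_DEMAND_KWH / HP_COP * grid_factor

print(f"furnace:   {furnace_total / 1000:.1f} t CO2 over {YEARS} years")
print(f"heat pump: {hp_total / 1000:.1f} t CO2 over {YEARS} years")
```

With these placeholder inputs the heat pump emits roughly half as much as the furnace, but flipping the assumed grid intensity or coefficient of performance can reverse the comparison, which is exactly the house-by-house variation the study explores.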

Researchers evaluated effects on homeowners, comparing the costs of heat pump installation to energy cost savings. They also analyzed changes in carbon dioxide emissions and air pollutants, putting a dollar amount on climate and health damages. Climate damages included costs associated with climate change–driven natural hazards such as flooding and wildfire. Health damages included premature deaths due to air pollution.

“The key finding is that for around a third of the single-family homes in the U.S., if you installed the heat pump, you would reduce environmental and health damages,” said Parth Vaishnav, an assistant professor at the School for Environment and Sustainability at the University of Michigan and a coauthor of the paper. Installing heat pumps would avoid $600 million in health damages and $1.7 billion in climate damages each year. It would also directly save homeowners money on energy costs. They also found that for all homes, assuming moderate electric grid decarbonization, heat pump use cut greenhouse gas emissions.

But heat pump installation did have other consequences. “Heat pumps are not necessarily a silver bullet for every house,” said Deetjen.

Although homeowners may trade a furnace for a heat pump, for example, the electricity for that pump could still come from a plant burning fossil fuels. The cost of generating that electricity may exceed the cost of in situ fossil fuel use. “There are some houses that if they get a heat pump, it’s actually worse for the public,” said Deetjen. “They end up creating more pollution.”

Heat pump benefits also depend on climate. Heat pumps operate less efficiently in the cold, running up electricity costs. In 24 of the studied cities, mostly in colder climates, peak residential electricity demand increased by over 100% if all houses adopted heat pumps, which would require grid upgrades.

“It could be challenging to meet that increase of winter peaking, because our system is not built that way,” said Ella Zhou, a senior modeling engineer at NREL not involved with this study. “We need to think about both the planning and operation of the grid system in a more integrated fashion with future use.”

Consequences of Widespread Electrification

The new research supported converting 32% of single-family homes to heat pumps. More widespread adoption came at much higher financial and health costs. If all U.S. houses adopted heat pumps, the study said, it would yield $6.4 billion in climate benefits. However, it would also cost homeowners $26.7 billion, and pollutants from increased electricity generation would result in $4.9 billion in health damages from illnesses and premature deaths.

There is some uncertainty surrounding these findings. The study didn’t consider the cost of potential grid upgrades or what complete decarbonization would mean for heat pump adoption. Waite pointed out that as the grid evolves, future research should also determine whether renewable energy could even meet the demands of high electricity loads.

—Jackie Rocheleau (@JackieRocheleau), Science Writer

When Deep Learning Meets Geophysics

EOS - Wed, 09/01/2021 - 14:06

As artificial intelligence (AI) continues to develop, geoscientists are interested in how new AI developments could contribute to geophysical discoveries. A new article in Reviews of Geophysics examines one popular AI technique, deep learning (DL). We asked the authors some questions about the connection between deep learning and the geosciences.

How would you describe “deep learning” in simple terms to a non-expert?

Deep learning (DL) optimizes the parameters in a system, a so-called “neural network,” by feeding it a large amount of training data. “Deep” means the system consists of a structure with multiple layers.

DL can be understood from different angles. In terms of biology, DL is a bionic approach imitating the neurons in the human brain; a computer can learn knowledge as well as draw inferences like a human. In terms of mathematics, DL is a high-dimensional nonlinear optimization problem; DL constructs a mapping from the input samples to the output labels. In terms of information science, DL extracts useful information from a large set of redundant data.
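To make the optimization view concrete, here is a minimal sketch (ours, not from the review) of a two-layer network fit to noisy data by plain gradient descent; the network supplies the nonlinear mapping from inputs to labels, and “learning” is the iterative adjustment of its parameters to reduce a loss.

```python
# A minimal sketch of "deep" learning as nonlinear optimization: a two-layer
# network is fit to noisy synthetic data by gradient descent on a
# mean-squared-error loss. Purely illustrative; not tied to any geophysical model.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: inputs x and labels y = sin(x) + noise.
x = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(x) + 0.1 * rng.standard_normal((200, 1))

# Parameters of a two-layer network with a tanh hidden layer.
W1 = rng.standard_normal((1, 16)) * 0.5
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 1)) * 0.5
b2 = np.zeros(1)

lr = 0.05
for step in range(2000):
    # Forward pass: map inputs to predictions.
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2

    # Loss: mean squared mismatch between predictions and labels.
    err = pred - y
    loss = np.mean(err ** 2)

    # Backward pass: gradients of the loss w.r.t. each parameter.
    g_pred = 2 * err / len(x)
    g_W2 = h.T @ g_pred
    g_b2 = g_pred.sum(axis=0)
    g_h = g_pred @ W2.T * (1 - h ** 2)  # tanh derivative
    g_W1 = x.T @ g_h
    g_b1 = g_h.sum(axis=0)

    # Gradient descent update: nudge parameters downhill.
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2

print(f"final training loss: {loss:.4f}")
```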

How can deep learning be used by the geophysical community?

Deep learning–based geophysical applications. Credit: Yu and Ma [2021], Figure 4a

DL has the potential to be applied to most areas of geophysics. Given a large database, you can train a DL architecture to perform geophysical inference. Take earthquake science as an example. The historical records of seismic stations contain useful information, such as earthquake waveforms and the corresponding source locations. The waveforms and locations therefore serve as the input and output of a neural network, whose parameters are optimized to minimize the mismatch between the network’s output and the true locations. The trained neural network can then predict the locations of newly detected seismic events. DL can be used in other fields in a similar manner.
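That workflow can be mocked up in a few lines. The Python sketch below is purely illustrative (our construction, not the review’s): it fabricates synthetic single-station “waveforms” whose onset times and amplitudes encode a 2-D epicenter, then trains a small off-the-shelf network to invert them. Real studies would use catalogued seismograms from many stations.

```python
# Toy waveform-to-location learning. Synthetic data only; all numbers assumed.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_events, n_samples = 1000, 120

# Random epicenters (x, y) in a 100 km x 100 km region.
locations = rng.uniform(0, 100, size=(n_events, 2))

# Toy waveform at a station at the origin: a damped sine whose onset is the
# travel time (distance / assumed 6 km/s wave speed).
t = np.linspace(0, 60, n_samples)             # seconds
dist = np.linalg.norm(locations, axis=1)      # km
onset = dist[:, None] / 6.0
wave = np.where(t > onset,
                np.sin(2 * np.pi * (t - onset)) * np.exp(-0.2 * (t - onset)),
                0.0)
# Amplitude decays with distance; add a little noise.
waveforms = wave / (1 + dist[:, None] / 10) + 0.01 * rng.standard_normal(wave.shape)

X_train, X_test, y_train, y_test = train_test_split(
    waveforms, locations, random_state=0)

# Network input: waveform samples; output: event coordinates.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

err = np.linalg.norm(model.predict(X_test) - y_test, axis=1)
print(f"median location error: {np.median(err):.1f} km")
```

The same pattern, assembling labeled pairs, minimizing the mismatch, and then predicting on new events, carries over to the other geophysical applications described here.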

What advantages does deep learning have over traditional methods in geophysical data processing and analysis?

Traditional methods suffer from inaccurate modeling and computational bottlenecks in large-scale, complex geophysical systems; DL can help address these issues. First, DL handles big data naturally, whereas big data cause a computational burden for traditional methods. Second, DL can draw on historical data and experience, which traditional methods usually do not consider. Third, DL does not require an accurate description of the physical model, which is useful when the physical model is only partially known. Fourth, DL offers high computational efficiency once training is complete, enabling characterization of Earth at high resolution. Fifth, DL can be used to discover physical concepts, such as the heliocentric structure of the solar system, and may even yield discoveries that are not yet known.

In your opinion, what are some of the most exciting opportunities for deep learning applications in geophysics?

DL has already provided some surprising results in geophysics. For instance, on the Stanford earthquake data set, the earthquake detection accuracy improved to 100 percent compared to 91 percent accuracy with the traditional method.

In our review article, we suggest a roadmap for applying DL to different geophysical tasks, divided into three levels:

1. Traditional methods are time-consuming and require intensive human labor and expert knowledge, such as in first-arrival picking and velocity selection in exploration geophysics.
2. Traditional methods have difficulties and bottlenecks. For example, geophysical inversion requires good initial values and high-accuracy modeling and suffers from local minima.
3. Traditional methods cannot handle some cases, such as multimodal data fusion and inversion.

What are some difficulties in applying deep learning in the geophysical community?

Despite the success of DL in some geophysical applications, such as earthquake detectors or pickers, its use as a tool for most practical geophysics is still in its infancy.

The main difficulties include a shortage of training samples, low signal-to-noise ratios, and strong nonlinearity. The lack of training samples in geophysical applications compared to those in other industries is the most critical of these challenges. Though the volume of geophysical data is large, available labels are scarce. Also, in certain geophysical fields, such as exploration geophysics, the data are not shared among companies. Further, geophysical tasks are usually much more difficult than those in computer vision.

What are potential future directions for research involving deep learning in geophysics?

Future trends for applying deep learning in geophysics. Credit: Yu and Ma [2021], Figure 4b

In terms of DL approaches, several advanced DL methods may overcome the difficulties of applying DL in geophysics, such as semi-supervised and unsupervised learning, transfer learning, multimodal DL, federated learning, and active learning. For example, in practical geophysical applications, obtaining labels for a large data set is time-consuming and can even be infeasible. Therefore, semi-supervised or unsupervised learning is required to relieve the dependence on labels.

We would like to see research on DL in geophysics focus on cases that traditional methods cannot handle, such as simulating the atmosphere or imaging Earth’s interior at large spatial and temporal scales with high resolution.

—Jianwei Ma (jwm@pku.edu.cn,  0000-0002-9803-0763), Peking University, China; and Siwei Yu, Harbin Institute of Technology, China

Forecast: 8 Million Energy Jobs Created by Meeting Paris Agreement

EOS - Wed, 09/01/2021 - 14:05

Opponents of climate policy say curbing fossil fuel emissions will kill jobs, but a new study showed that switching to renewables would actually create more jobs than a fossil fuel–heavy future will. The tricky part will be ensuring that laid-off fossil fuel workers have access to alternative employment.

Globally, jobs in the energy sector are projected to increase from 18 million today to 26 million in 2050 if the world cuts carbon to meet the well-below 2°C target set by the Paris Agreement, according to a model created by researchers in Canada and Europe. Renewables will make up 84% of energy jobs in 2050, primarily in wind and solar manufacturing. The new study was published earlier this summer in One Earth.

In contrast, if we don’t limit global warming to below 2°C, 5 million fewer energy jobs will be created.

The Future Is Looking Bright for Solar and Wind

The Intergovernmental Panel on Climate Change’s latest physical science assessment predicted that climate will be 1.5°C warmer than preindustrial levels by the 2030s unless there are strong, rapid cuts to greenhouse gases in the coming decades. Such cuts will necessitate a greater reliance on sustainable energy sources.

In 2020, renewables and nuclear energy supplied less than a quarter of global energy, according to BP’s 2021 report.

This number is expected to rise, however, in part because solar photovoltaics and wind are now cheaper than fossil fuels per megawatt-hour and because many countries have set aggressive emissions-cutting goals.

According to the new study, many regions will gain energy jobs in the transition, including countries throughout Asia (except for China), North Africa, and the Middle East, as well as the United States and Brazil. Although fossil fuel extraction jobs will largely disappear, “massive deployment of renewables leads to an overall rise in jobs,” wrote the authors.

But not all countries will be so lucky: Fossil fuel–rich China, Australia, Canada, Mexico, South Africa, and sub-Saharan African countries will likely lose jobs overall.

Only jobs directly associated with energy industries, such as construction or maintenance, were included in the study. Other reports have included adjacent or induced jobs in areas such as fuel transport, government oversight, and the service industry.

Previous studies estimated a larger increase in energy jobs, using numbers compiled from the Organisation for Economic Co-operation and Development.

The new study instead compiled data from primary sources by mining fossil fuel company reports, trade union documents, government reports, national databases, and other sources that cover 50 countries representing all major players in fossil fuels and renewables. Lead study author Sandeep Pai ran the numbers through an integrated assessment model housed at the European Institute on Economics and the Environment. The model calculates job growth projections under different climate policies and social and economic factors. Pai is a lead researcher at the Just Transition Initiative supported by the nonprofit policy research organization the Center for Strategic and International Studies and the Climate Investment Funds.

Calls for Just Transitions

Crucially, the study found that nearly 8 million of the 26 million jobs (31%) in 2050 are “up for grabs,” said study author Johannes Emmerling, a scientist at the European Institute on Economics and the Environment.

These jobs in renewable manufacturing aren’t tied to a particular location, unlike coal mining.

Pai concurred. “Any country with the right policies and incentives has the opportunity to attract between 3 [million and] 8 million manufacturing jobs in the future.”

Recently, countries have begun putting billions of dollars toward “just transition,” a loose framework describing initiatives that, among other things, seek to minimize harm to workers in the fossil fuel industry. Concerns include lost salaries, lost local revenue, and labor exploitation.

What could be done? Just transition projects may include employing fossil fuel workers to rehabilitate old coal mines or orphan oil wells, funding community colleges to train workers with new skills, supporting social services like substance abuse centers, and incentivizing local manufacturing.

“The just transition aspect is quite critical,” Pai said. “If [countries] don’t do that, this energy transition will be delayed.”

LUT University energy scientist Manish Thulasi Ram, who was not involved in the study, thinks the latest research underestimates the job potential of the energy transition. Using a different approach, Ram forecasts that close to 10 million jobs could be created from battery storage alone by 2050—a sector not considered in the latest analysis.

—Jenessa Duncombe (@jrdscience), Staff Writer

Does the Priming Effect Happen Underwater? It’s Complicated

EOS - Wed, 09/01/2021 - 14:03

In microbiology, the priming effect is the observation that the decomposition rate of organic material often changes when fresh organic matter is introduced. Depending on the context, the effect can be an increase or a reduction in microbial consumption and a corresponding change in emitted carbon dioxide.

Although the mechanism isn’t fully understood, several contributing processes have been proposed. They include the shift of some specialist microbes to the consumption of only fresh or only older material, as well as increased decomposition of stable (older) matter in search of specific nutrients needed to sustain growth enabled by the addition of fresh material.

The priming effect has been well established in terrestrial soils, but experimental evidence has appeared more mixed in aquatic environments. Both the magnitude and the direction (i.e., increase versus decrease) of the effect have been contradictory in a variety of studies conducted in the laboratory and the field.

Sanches et al. performed a meta-analysis of the literature in an attempt to resolve these difficulties. The authors identified 36 prior studies that published a total of 877 results matching their experimental criteria. Of the subset that directly estimated priming, about two thirds concluded that there was no priming effect, with the majority of the remainder indicating an acceleration in decomposition. However, these past studies used a wide variety of metrics and thresholds to define the priming effect. Many others did not directly calculate the magnitude of the effect.

To overcome the range of methodologies, the researchers defined a consistent priming effect metric that could be calculated from the reported data. With this metric, they found support for the existence of a positive priming effect: The addition of new organic material increases decomposition on average by 54%, with a 95% confidence interval of 23%–92%. They attribute this divergence from the aggregated conclusions described above to their significantly larger data set (they could calculate their metric even when the original authors did not), which increased statistical power.
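The paper’s exact formula is not reproduced here, but a consistent metric of this kind can be as simple as the relative change in decomposition between amended and control treatments. The Python sketch below illustrates that assumed form; the numbers are invented to match the 54% average reported above.

```python
# A minimal sketch of a standardized priming metric. The functional form is
# our assumption for illustration, not necessarily the authors' exact formula.
def priming_effect(decomp_amended, decomp_control):
    """Fractional change in decomposition after adding fresh organic matter.

    Positive values indicate accelerated decomposition (positive priming);
    negative values indicate suppression (negative priming).
    """
    return (decomp_amended - decomp_control) / decomp_control

# Invented example: CO2 efflux of 7.7 (amended) vs. 5.0 (control) mg C/g/day
# yields +54% priming, matching the meta-analysis average quoted above.
print(f"{priming_effect(7.7, 5.0):+.0%}")  # -> +54%
```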

The meta-analysis also indicated which experimental factors were most correlated with an observed priming effect. Key factors included the proxy chosen for microbial activity and the addition of other nutrients, such as nitrogen or phosphorus. Finally, the authors noted that other recent meta-analyses using differing methodologies have reported no priming effect; they concluded that the umbrella term “priming effect” may be better split into several terms describing related, but distinct, processes. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1029/2020JG006201, 2021)

—Morgan Rehnberg, Science Writer

Simulating 195 Million Years of Global Climate in the Mesozoic

EOS - Tue, 08/31/2021 - 13:44

This is an authorized translation of an Eos article.

The Mesozoic era, roughly 252 million to 66 million years ago, was a pivotal period in Earth’s history. Besides being the age of the dinosaurs, it was when the supercontinent Pangea began breaking apart into the fragmented continents familiar today. Together with rising carbon dioxide levels and increasing solar irradiance, these tectonic changes shaped the global climate, producing warm and humid greenhouse conditions. A detailed understanding of the drivers of Mesozoic climate trends offers insight not only into Earth’s history but also helps scientists study the consequences of human-caused warming.

One way to study past climates is with numerical models. In a new study, Landwehrs et al. ran ensemble climate simulations of the period from 255 million to 60 million years ago at 5-million-year intervals. They adjusted specific parameters in different runs to dissect the sensitivity of past climates to paleogeography, atmospheric carbon dioxide levels, sea level, vegetation patterns, solar energy output, and variations in Earth’s orbit.

The authors found that Mesozoic global mean temperatures were generally higher than preindustrial values. They also observed a warming trend driven by increasing solar luminosity and rising sea levels. Ocean areas typically reflect less solar radiation than land does; accordingly, the researchers found that higher sea levels and flooding of continental areas coincided with higher global mean temperatures. Superimposed on this overall trend, fluctuations in atmospheric carbon dioxide produced warm and cold anomalies in the global mean temperature. The authors note that these findings do not imply that human-caused global warming can be discounted; modern climate change is proceeding much faster than the changes over Earth’s history.

The ensemble of climate simulations also offers insight into other aspects of long-term Mesozoic climate change. Overall, the authors identified a transition from a strongly seasonal, arid Pangean climate toward a more balanced, humid climate. To support further analysis of Mesozoic climate trends, the authors have shared their model data online.

—Jack Lee, Science Writer

This translation was made by Wiley.


Lake Erie Sediments: All Dredged Up with Nowhere to Grow

EOS - Tue, 08/31/2021 - 13:09

In 2014, a Lake Erie algal bloom sent a cloud of toxins into Toledo, Ohio’s water supply, forcing the city to shut down water service to 400,000 residents. Like many lakes in agricultural areas, Lake Erie produces thick, smelly algae mats when the water gets warm. Temperatures above 60°F can trigger algal blooms, and Lake Erie—the shallowest and warmest of the Great Lakes—hit nearly 80°F in 2020. In addition, the lake is the final destination for fertilizers washing off area farms—a recipe for excess photosynthesis.

“You’re cooking a perfect soup for having a very productive lake,” said Angélica Vázquez-Ortega, an assistant professor of geochemistry at Bowling Green State University.

Whereas fertilizers are a source of Lake Erie’s annual algae issue, research from Vázquez-Ortega’s lab suggests agriculture could be a partial solution, too. Instead of applying more fertilizers upstream, farmers could remove nutrients from the lake by mixing Lake Erie sediment into their soils. The research is especially timely as a new law leaves millions of tons of sediment piling up at Ohio’s ports.

Bright green algae lights up western Lake Erie near Toledo, Ohio. Credit: Joshua Stevens, NASA Earth Observatory

Hydrologic History

The research is rooted in northeastern Ohio, a region that formerly boasted a 4,000-square-kilometer marsh dubbed the Great Black Swamp. The swamp “was the kidneys of the area, filtering the nutrients and making sure the water Lake Erie is receiving was clean,” said Vázquez-Ortega.

Colonizers sick of the knee-deep mud and clouds of mosquitoes gradually drained the area in the mid-1800s, easing navigation but increasing the export of sediments from the land. The watershed is now over 72% agricultural.

Nutrients like nitrogen and phosphorus naturally enter lakes through sediment export, but farm practices—like draining and fertilizing fields—accelerate the process. Once nutrients enter Lake Erie, they tend to stay there, eventually accumulating in sediments on the lake floor.

Those sediments require annual dredging to keep ports viable. Ohio dredges 1.5 million tons of sediment from its eight ports each year, and the Port of Toledo accounts for more than half that figure. Until recently, Toledo would dump dredged sediment into open water, a common practice that reintroduces phosphorus and nitrogen into the water column and buries benthic communities on the lake floor. Ohio banned open-water dumping of dredged sediment, effective in 2020, forcing ports to find a process for storing their sediment. For now, Toledo is building artificial islands on the lakefront.

“This is a completely new challenge for Ohio,” said Vázquez-Ortega.

More Sediment, More Soybeans

Soybeans grow in buckets in a greenhouse at Bowling Green State University. Credit: Angélica Vázquez-Ortega

Agriculture could be a possible destination for dredged sediment, according to results from Vázquez-Ortega’s lab published in the Journal of Environmental Quality. In a greenhouse experiment, sediment from the Port of Toledo increased crop growth with no significant loss of nutrients in percolated water.

The study created four soil combinations, blending material from a local farm with dredged material from the Port of Toledo at sediment ratios of 0%, 10%, 20%, and 100%. Dredged sediment introduced more organic content, giving the test soils a lower bulk density and allowing roots and water to penetrate into the less compact soil. Samples with more Lake Erie sediment grew heftier root systems and generated higher soybean yields. The study demonstrated that Lake Erie sediments can improve crop yield without the use of additional fertilizers.

Farming Out the Research

Despite promising results, there’s plenty left to research. What crops grow best, and at what sediment percentages? What if industrial contaminants are in the soil? Importantly, will this work outside the greenhouse on an actual farm?

“All that information is really necessary for convincing a farmer this is an option,” said Vázquez-Ortega.

Economics and logistics are other key concerns. With 1.5 million tons of material, Ohio can give nutrient-rich sediment away for free. But would anyone want it?

In the study, the greatest soybean yield came from the 100% dredged sediment sample. That’s not a feasible ratio for farms, though. Sediment is heavy, and transporting it is expensive. Even at 10% application, a farmer would need 100 tons of dried sediment per acre, estimated Keith Reid, a soil scientist with Canada’s Department of Agriculture and Agri-Food. In addition, he said, spreading tons of sediment would require heavy machinery, which would compact the soils and remove any benefits of lower bulk density.
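That per-acre figure can be sanity-checked with rough soil arithmetic (ours, using assumed parameters): an acre is about 4,047 square meters, a typical incorporation depth is roughly 15 centimeters, and a plausible soil bulk density is about 1,500 kilograms per cubic meter, so blending sediment at 10% by mass does indeed come to roughly 100 tons per acre:

```latex
m_{\text{soil}} \approx 4047\ \mathrm{m^2} \times 0.15\ \mathrm{m} \times 1500\ \mathrm{kg\,m^{-3}}
\approx 9.1 \times 10^{5}\ \mathrm{kg},
\qquad
0.10\, m_{\text{soil}} \approx 9.1 \times 10^{4}\ \mathrm{kg} \approx 100\ \text{short tons per acre}
```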

Soybeans with more Lake Erie sediment grew heftier root systems and generated higher soybean yields in a study at Bowling Green State University. Credit: Angélica Vázquez-Ortega

“It’s a good start at looking at the potential for uses of soil amendment,” Reid said of the study. “It’s fair to safely say there was no negative impact. It’s hard to say if there was a real large positive impact.”

Any new method for farming must demonstrate effectiveness and affordability, and Vázquez-Ortega recognizes the work left to do. “It’s a very preliminary step,” she said of the study. She’s now collaborating with the Ohio EPA and the Ohio Lake Erie Commission, among other parties, on a 2-year farm test.

The study is a step toward finding a beneficial use for sediment that preserves the ports and protects the lake. But until the process makes economic and agronomic sense, sediments will remain all dredged up with nowhere to grow.

—J. Besl (@J_Besl), Science Writer

Anticipating Climate Impacts of Major Volcanic Eruptions

EOS - Tue, 08/31/2021 - 13:09

This year marks the 30th anniversary of the most recent volcanic eruption that had a measurable effect on global climate. In addition to devastating much of the surrounding landscape and driving thousands of people to flee the area, the June 1991 eruption at Mount Pinatubo in the Philippines sent towering plumes of gas, ash, and particulates high into the atmosphere—materials that ultimately reduced average global surface temperatures by up to about 0.5°C in 1991–1993. It has also been more than 40 years since the last major explosive eruption in the conterminous United States, at Mount St. Helens in Washington in May 1980. As the institutional memory of these infrequent, but high-impact, events fades in this country and new generations of scientists assume responsibility for volcanic eruption responses, the geophysical community must remain prepared for coming eruptions, regardless of these events’ locations.

Rapid responses to major volcanic eruptions enable scientists to make timely, initial estimates of potential climate impacts (i.e., long-term effects) to assist responders in implementing mitigation efforts, including preparing for weather and climate effects in the few years following an eruption. These events also present critical opportunities to advance volcano science [National Academies of Sciences, Engineering, and Medicine (NASEM), 2017], and observations of large events with the potential to affect climate and life globally are particularly valuable.

Recognizing this value, NASA recently developed a volcanic eruption response plan to maximize the quantity and quality of observations it makes following eruptions [NASA, 2018], and it is facilitating continuing research into the drivers and behaviors of volcanic eruptions to further improve scientific eruption response efforts.

How Volcanic Eruptions Affect Climate

Major volcanic eruptions inject large amounts of gases, aerosols, and particulates into the atmosphere. Timely quantification of these emissions shortly after they erupt and as they disperse is needed to assess their potential climate effects. Scientists have a reasonable understanding of the fundamentals of how explosive volcanic eruptions influence climate and stratospheric ozone. This understanding is based on a few well-studied events in the satellite remote sensing era (e.g., Pinatubo) and on proxy records of older eruptions such as the 1815 eruption of Tambora in Indonesia [Robock, 2000]. However, the specific effects of eruptions depend on their magnitude, location, and the particular mix of materials ejected.

To affect global climate, an eruption must inject large quantities of sulfur dioxide (SO2) or other sulfur species (e.g., hydrogen sulfide, H2S) into the stratosphere, where they are converted to sulfuric acid (or sulfate) aerosols over weeks to months (Figure 1). The sulfate aerosols linger in the stratosphere for a few years, reflecting some incoming solar radiation and thus reducing global average surface temperatures by as much as about 0.5°C for 1–3 years, after which temperatures recover to preeruption levels.

Fig. 1. In the top plot, the black curve represents monthly global mean stratospheric aerosol optical depth (AOD; background is 0.004 or below) for green light (525 nanometers) from 1979 to 2018 from the Global Space-based Stratospheric Aerosol Climatology (GloSSAC) [Kovilakam et al., 2020; Thomason et al., 2018]. AOD is a measure of aerosol abundance in the atmosphere. Red dots represent annual sulfur dioxide (SO2) emissions in teragrams (Tg) from explosive volcanic eruptions as determined from satellite measurements [Carn, 2021]. The dashed horizontal line indicates the 5-Tg SO2 emission threshold for a NASA eruption response. Vertical gray bars indicate notable volcanic eruptions and their SO2 emissions. From left to right, He = 1980 Mount St. Helens (United States), Ul = 1980 Ulawun (Papua New Guinea (PNG)), Pa = 1981 Pagan (Commonwealth of the Northern Mariana Islands), El = 1982 El Chichón (Mexico), Co = 1983 Colo (Indonesia), Ne = 1985 Nevado del Ruiz (Colombia), Ba = 1988 Banda Api (Indonesia), Ke = 1990 Kelut (Indonesia), Pi = 1991 Mount Pinatubo (Philippines), Ce = 1991 Cerro Hudson (Chile), Ra = 1994 Rabaul (PNG), Ru = 2001 Ulawun, 2002 Ruang (Indonesia), Re = 2002 Reventador (Ecuador), Ma = 2005 Manam (PNG), So = 2006 Soufriere Hills (Montserrat), Ra = 2006 Rabaul (PNG), Ka = 2008 Kasatochi (USA), Sa = 2009 Sarychev Peak (Russia), Me = 2010 Merapi (Indonesia), Na = 2011 Nabro (Eritrea), Ke = 2014 Kelut (Indonesia), Ca = 2015 Calbuco (Chile), Am = 2018 Ambae (Vanuatu). In the bottom plot, circles indicate satellite-measured SO2 emissions (symbol size denotes SO2 mass) and estimated plume altitudes (symbol color denotes altitude) for volcanic eruptions since October 1978 [Carn, 2021].

Although this direct radiative effect cools the surface, the aerosol particles also promote warming in the stratosphere by absorbing outgoing longwave radiation emitted from Earth’s surface as well as some solar radiation, which affects atmospheric temperature gradients and thus circulation (an indirect advective effect). This absorption of longwave radiation also promotes chemical reactions on the aerosol particles that drive stratospheric ozone depletion [Kremser et al., 2016], which reduces absorption of ultraviolet (UV) radiation and further influences atmospheric circulation. The interplay of aerosol radiative and advective effects, which both influence surface temperatures, leads to regional and seasonal variations in surface cooling and warming. For example, because advective effects tend to dominate in winter in the northern midlatitudes, winter warming of Northern Hemisphere continents—lasting about 2 years—is expected after major tropical eruptions [Shindell et al., 2004].

Eruptions from tropical volcanoes like Pinatubo typically generate more extensive stratospheric aerosol veils because material injected into the tropical stratosphere can spread into both hemispheres. However, major high-latitude eruptions can also have significant climate impacts depending on their season and the altitude that their eruption plumes reach [Toohey et al., 2019].

The effects of volcanic ash particles are usually neglected in climate models because the particles have shorter atmospheric lifetimes than sulfate aerosols, although recent work has suggested that persistent fine ash may influence stratospheric sulfur chemistry [Zhu et al., 2020]. This finding provides further motivation for timely sampling of volcanic eruption clouds.

The threshold amount of volcanic SO2 emissions required to produce measurable climate impacts is not known exactly. On the basis of prior eruptions, NASA considers that an injection of roughly 5 teragrams (5 million metric tons) of SO2 or more into the stratosphere has sufficient potential for climate forcing of –1 Watt per square meter (that is, 1 Watt per square meter less energy is put into Earth’s climate system as a result of the stratospheric aerosols produced from the SO2) and warrants application of substantial observational assets.
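As a back-of-the-envelope illustration (our arithmetic, not NASA’s), a climate sensitivity parameter λ converts that forcing into an expected surface cooling; taking a commonly used short-term value of roughly 0.5 K per watt per square meter reproduces the approximately 0.5°C cooling cited above for Pinatubo-scale eruptions:

```latex
\Delta T \approx \lambda\,\Delta F
\approx 0.5\ \frac{\mathrm{K}}{\mathrm{W\,m^{-2}}} \times \left(-1\ \mathrm{W\,m^{-2}}\right)
= -0.5\ \mathrm{K}
```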

Pinatubo volcano erupts on 12 June 1991. Credit: K. Jackson, U.S. Air Force; accessed at NOAA National Centers for Environmental Information

Since the dawn of the satellite era for eruption observations in 1978, this threshold has been surpassed by only two eruptions: at El Chichón (Mexico) in 1982 and Pinatubo in 1991 (Figure 1), which reached 5 and 6, respectively, on the volcanic explosivity index (VEI; a logarithmic scale of eruption size from 0 to 8). Since Pinatubo, the observational tools that NASA employs have greatly improved.

In the event of future eruptions on par with or larger than those at El Chichón and Pinatubo, rapid mobilization of NASA’s observational and research assets, including satellites, balloons, ground-based instruments, aircraft, and modeling capabilities, will permit scientists to make early initial estimates of potential impacts. Capturing the transient effects of volcanic aerosols on climate would also provide critical data to inform proposed solar geoengineering strategies that involve introducing aerosols into the atmosphere to mitigate global warming [NASEM, 2021].

NASA’s Eruption Response Plan

In the United States, NASA has traditionally led investigations of eruptions involving stratospheric injection because of the agency’s global satellite-based observation capabilities for measuring atmospheric composition and chemistry and its unique suborbital assets for measuring the evolution of volcanic clouds in the stratosphere.

Under its current plan, NASA’s eruption response procedures will be triggered in the event an eruption emits at least 5 teragrams of SO2 into the stratosphere, as estimated using NASA’s or other satellite assets [e.g., Carn et al., 2016]. The first phase of the response plan involves a review of near-real-time satellite data by a combined panel of NASA Headquarters (HQ) science program managers and NASA research scientists in parallel with initial modeling of the eruption plume’s potential atmospheric evolution and impacts.

The HQ review identifies relevant measurement and modeling capabilities at the various NASA centers and among existing NASA-funded activities. HQ personnel would establish and task science leads and teams comprising relevant experts from inside and outside NASA to take responsibility for observations from the ground, from balloons, and from aircraft. The efforts of these three groups would be supplemented by satellite observations and modeling to develop key questions, priority observations, and sampling and deployment plans.

Implementing the plan developed in this phase would likely result in major diversions and re-tasking of assets, such as NASA aircraft involved in meteorological monitoring, from ongoing NASA research activities and field deployments. Ensuring that these diversions are warranted requires that the review process be thorough and that tasking assignments be carefully considered.

The second phase of NASA’s volcanic response plan—starting between 1 week and 1 month after the eruption—involves the application of its satellite platforms, ground observations from operational networks, and eruption cloud modeling. Satellites would track volcanic clouds to observe levels of SO2 and other aerosols and materials. Gathering early information on volcanic aerosol properties like density, particle composition, and particle size distribution would provide key inputs for assessing in greater detail the potential evolution and effects of the volcanic aerosols. Such assessments could provide valuable information on the amount of expected surface cooling attributable to these aerosols, as well as the lifetime of stratospheric aerosol particles—two factors that depend strongly on the aerosols’ size distribution and temporal evolution.

Meanwhile, NASA’s Aerosol Robotic Network (AERONET), Micro-Pulse Lidar Network (MPLNET), and Southern Hemisphere Additional Ozonesondes (SHADOZ) would provide real-time observations from the ground. Eruption cloud modeling would be used to calculate cloud trajectories and dispersion to optimize selection of ground stations for balloon launches and re-tasking of airborne assets.

The third phase of the response plan—starting 1–3 months after an eruption—would see the deployment of rapid response balloons and aircraft (e.g., from NASA’s Airborne Science Program). The NASA P-3 Orion, Gulfstream V, and DC-8 aircraft have ranges of more than 7,000 kilometers and can carry heavy instrumentation payloads of more than 2,500 kilograms to sample the middle to upper troposphere. A mix of in situ and remote sensing instruments would be employed to collect detailed observations of eruption plume structure, evolution, and optical properties.

NASA’s high-altitude aircraft (ER-2 and WB-57f) provide coverage into the stratosphere (above about 18 kilometers) with payloads of more than 2,200 kilograms. These high-altitude planes would carry payloads for measuring the evolving aerosol distributions along with trace gas measurements in situ to further understand the response of stratospheric ozone and climate forcing to the eruption. In particular, the high-altitude observations would include data on the particle composition and size distribution of aerosols, as well as on ozone, SO2, nitrous oxide and other stratospheric tracers, water vapor, and free radical species. Instrumented balloons capable of reaching the stratosphere could also be rapidly deployed to remote locations to supplement these data in areas not reached by the aircraft.

The third phase would be staged as several 2- to 6-week deployments over a 1- to 2-year period that would document the seasonal evolution, latitudinal dispersion, and multiyear dissipation of the plume from the stratosphere. These longer-term observations would help to constrain model simulations of the eruption’s impacts on the global atmosphere and climate.

Enhancing Eruption Response

An effective eruption response is contingent on timely recognition of the hallmarks of a major volcanic eruption, namely, stratospheric injection and substantial emissions of SO2 (and H2S) amounting to more than 5 teragrams, using satellite data. However, it may take several hours to a day after an event for satellites to confirm that emissions have reached this level. By then, time has been lost to position instruments and personnel to effectively sample the earliest stages of an eruption, and it is already too late to observe the onset of the eruption.

Hence, a key element in efforts to strengthen eruption responses is improving our recognition of distinctive geophysical or geochemical eruption precursors that may herald a high-magnitude event. Observations of large, sulfur-rich eruptions such as Pinatubo have led to scientific consensus that such eruptions emit “excess” volatiles—gas emissions (especially sulfur species, but also other gases such as water vapor and carbon dioxide) exceeding those that could be derived from the erupted magma alone. Excess volatiles, in the form of gas bubbles derived from within or below a magma reservoir that then accumulate near the top of the reservoir, may exacerbate climate impacts of eruptions and influence magmatic processes like magma differentiation, eruption triggering and magnitude, and hydrothermal ore deposition [e.g., Edmonds and Woods, 2018]. They may also produce detectable eruption precursors and influence eruption and plume dynamics, although how remains largely unknown.

With support from NASA’s Interdisciplinary Research in Earth Science program, we (the authors) have begun an integrated investigation of eruption dynamics focused on understanding the fate of excess volatiles from their origins in a magma reservoir, through underground conduits and into a volcanic plume, and, subsequently, as they are dispersed in the atmosphere. The satellite observations we use are the same or similar to those required for rapid assessment and response to future high-magnitude events (with a VEI of 6 or greater).

Our investigation is using data from previous moderate-scale eruptions (VEI of 3–5) with excellent satellite observational records that captured instances in which gases and aerosols displayed disparate atmospheric dispersion patterns. Among the main questions we are examining is whether excess volatile accumulation in magma reservoirs can drive large eruptions and produce enhanced aerosol-related climate impacts resulting from these eruptions. Using numerical model simulations of eruptions involving variable quantities of excess volatiles, we will endeavor to reproduce the specific atmospheric distributions of gases and aerosols observed by satellites after these events and thus elucidate how volatile accumulation might influence plume dispersion and climate impacts.

We are currently developing a framework to simulate a future eruption with a VEI of 6+. Over the coming year, we hope to produce benchmark simulations that track the fate of volcanic gases as they travel from a subsurface magmatic system into the atmosphere to be distributed globally. This simulation framework will comprise a coupled suite of subsystem-scale numerical models, including models of magma withdrawal from the magma reservoir, magma ascent within the volcanic conduit, stratospheric injection within the volcanic plume, and atmospheric dispersion and effects on climate.

With these tools, NASA will have gained important capabilities in simulating volcanic eruptions and understanding their potential precursors. These capabilities will complement NASA’s satellite and suborbital observations of volcanic eruptions as they unfold—an important advance for volcano science and a powerful means to assess the climate impacts of future large explosive eruptions.

Acknowledgments

Although not listed as coauthors, we acknowledge contributions to this work from the organizers of the NASA Major Volcanic Eruption Response Plan workshop in 2016, including Hal Maring, Ken Jucks, and Jack Kaye (NASA HQ), as well as the workshop participants from NASA, NOAA, the U.S. Geological Survey, and the academic community.

Making the Most of Volcanic Eruption Responses

EOS - Tue, 08/31/2021 - 13:09

Mount St. Helens, hidden away in a remote forest midway between Seattle, Wash., and Portland, Ore., had been putting out warning signals for 2 months. Still, the size and destruction of the 18 May 1980 eruption took the United States by surprise. The blast spewed ash into the air for more than 9 hours, and pyroclastic density currents and mudflows wiped out surrounding forests and downstream bridges and buildings. Fifty-seven people died as a result of the volcanic disaster, the worst known in the continental United States.

In addition to its immediate and devastating effects, the 1980 eruption spurred efforts to study volcanic processes and their impacts on surrounding landscapes more thoroughly and to advance monitoring and forecasting capabilities. It also prompted further cooperation among agencies and communities to better prepare for and respond to future volcanic eruptions.

Mount St. Helens erupts in 1980. Credit: USGS

According to a 2018 U.S. Geological Survey (USGS) report, there are 161 potentially active volcanoes in the United States and its territories, including 55 classified as high or very high threat [Ewert et al., 2018]. Over the past century, especially since 1980, integrated studies of active volcanic systems have shed light on magmatic and volcanic processes that control the initiation, duration, magnitude, and style of volcanic eruptions. However, because there have been few continuously monitored volcanic eruptions with observations that span the entire sequence before, during, and after eruption, our understanding of these processes and the hazards they pose is still limited.

This limited understanding, in turn, hampers efforts to forecast future eruptions, to help nearby communities prepare evacuation plans, and to marshal and allocate resources during and after an event. Thus, a recent consensus study about volcanic eruptions by the National Academies of Sciences, Engineering, and Medicine [2017] highlighted the need to coordinate eruption responses among the broad volcanological and natural hazard scientific community as one of three grand challenges.

The Community Network for Volcanic Eruption Response (CONVERSE) initiative, which began in 2018 as a 3-year Research Coordination Network supported by the National Science Foundation (NSF), is attempting to meet this challenge. The charge of CONVERSE is to maximize the scientific return from eruption responses at U.S. volcanoes by making the most efficient use possible of the relatively limited access and time to collect the most beneficial data and samples. This goal requires looking for ways to better organize the national volcano science community.

A critical component of this organization is to facilitate cooperation between scientists at academic institutions and the U.S. Geological Survey, which is responsible for volcano monitoring and hazard assessment at domestic volcanoes. Since 2019, CONVERSE has conducted several workshops to allow groups representing the various disciplines in volcanology to formulate specific science questions that can be addressed with data collected during an eruption response and assess their capacities for such a response. Most recently, in November 2020, we conducted a virtual response scenario exercise based on a hypothetical eruption of Mount Hood in the Oregon Cascades. A month later, Hawaii’s Kīlauea volcano erupted, allowing us to put what we learned from the simulation to use in a coordinated response.

A Virtual Eruption at Mount Hood

To work through a simulated response to an eruption scenario at Mount Hood, our CONVERSE team had planned an in-person meeting for March 2020 involving a 2-day tabletop exercise. Travel and meeting restrictions enacted in response to the COVID-19 pandemic required us to postpone the exercise until 16–17 November, when we conducted it virtually, with 80 scientists participating for one or more days. The goal of the exercise was to test the effectiveness of forming a science advisory committee (SAC) as a model for facilitating communications between responding USGS volcano observatories and the U.S. academic community.

Mount Hood, located near Portland, Ore., is relatively accessible through a network of roads and would attract a lot of scientific interest during an eruption. Thus, we based our eruption scenario loosely on a scenario developed in 2010 for Mount Hood for a Volcanic Crisis Awareness training course.

Mount Hood, seen here in 2018, is part of the Cascade Volcanic Arc and is less than 100 kilometers from Portland, Ore. Credit: Seth Moran, USGS

Because a real-life eruption can happen at any time at any active volcano, participants in the November 2020 workshop were not informed of the selected volcano until 1 week prior to the workshop. Then we sent a simulated “exercise-only” USGS information statement to all registrants noting that an earthquake swarm had started several kilometers south of Mount Hood’s summit. In the days leading up to the workshop, we sent several additional information statements containing status updates and observations of the volcano’s behavior like those that might precede an actual eruption.

During the workshop, participants communicated via videoconference for large group discussions and smaller breakout meetings. We used a business communications platform to share graphics and information resources and for rapid-fire chat-based discussions.

The workshop started with an overview of Mount Hood’s eruptive history and monitoring status, after which the scenario continued with the volcano exhibiting escalating unrest and with concomitant changes in USGS alert level. Participants were asked to meet in groups representing different disciplines, including deformation, seismicity, gas, eruption dynamics, and geochemistry, to discuss science response priorities, particularly those that required access to the volcano.

As the simulated crisis escalated at the end of the first day of the workshop, non-USGS attendees were told they could no longer communicate with USGS participants (and vice versa). This break in communication was done to mimic the difficulty that external scientists often encounter communicating with observatory staff during full-blown eruption responses, when observatory staff are fully consumed by various aspects of responding to the eruption. Instead, scientific proposals had to be submitted to a rapidly formed Hood SAC (H-SAC) consisting of a USGS liaison and several non-USGS scientists with expertise on Mount Hood.

The H-SAC’s role was to quickly evaluate proposals submitted by discipline-specific groups on the basis of scientific merit or their benefit for hazard mitigation. For example, the geodesy group was approved to install five instruments at sites outside the near-field volcanic hazard zone to capture a deep deflation signal more clearly, an activity that did not require special access to restricted areas. On the other hand, a proposal by the gas group to climb up to the summit for direct gas sampling was declined because it was deemed too hazardous. Proposals by the tephra sampling group to collect ash at specific locations were also approved, but only if the group coordinated with a petrology group that had also submitted a proposal to collect samples for characterizing the pressure-temperature and storage conditions of the magma.

The H-SAC then provided recommendations to the Cascade Volcano Observatory (CVO) scientist-in-charge, with that discussion happening in front of all participants so they could understand the considerations that went into the decision making. After the meeting, participants provided feedback that the SAC concept seemed to work well. The proposal evaluation process, which weighed scientific merit, benefit for hazard mitigation, and feasibility, was seen as a positive outcome of the exercise that would translate well into a real-world scenario. Participants emphasized, however, that it was critical that SAC members be perceived as neutral with respect to any disciplinary or institutional preferences and that the SAC have broad scientific representation.

Responding to Kīlauea’s Real Eruption

Just 1 month after the workshop, on 20 December 2020, Kīlauea volcano began erupting in real life, providing an immediate opportunity for CONVERSE to test the SAC model. The goals of CONVERSE with respect to the Kīlauea eruption were to facilitate communication and coordination of planned and ongoing scientific efforts by USGS scientists at the Hawaiian Volcano Observatory (HVO) and external scientists and to broaden participation by the academic community in the response.

Kīlauea’s lava lake is seen here at the start of the December 2020 eruption. Credit: Matthew Patrick, USGS

These goals were addressed through two types of activities. First, a Kīlauea Scientific Advisory Committee (K-SAC), consisting of four academic and three USGS scientists, was convened within a week of the start of the eruption. This committee acted as the formal point of contact between HVO and the external scientific community for the Kīlauea eruption, and it solicited and managed proposals for work requiring coordination between these groups.

The K-SAC evaluated proposals on the basis of the potential for scientific gain and contributions to mitigating hazards. For example, one proposal dealt with assessing whether new magma had entered the chamber or whether the eruption released primarily older magma already under the volcano. The K-SAC also identified likely benefits and areas of collaboration between proposing groups, and it flagged potential safety and logistical (including permitting from the National Park Service) concerns in proposals as well as resources required from HVO.

Proposals recommended by the K-SAC were then passed to HVO staff, who consulted with USGS experts about feasibility, potential collaborations, and HVO resources required before making decisions on whether to move forward with them. One proposal supported by the K-SAC involved the use of hyperspectral imaging to quantify in real time the proportion of crystalline material and melt in the active lava lake to help determine the lava’s viscosity, a critical parameter for hazard assessment.

The second major activity of CONVERSE as the Kīlauea eruption progressed was to provide a forum for communication of science information via a business communications platform open to all volcano scientists. In addition, we posted information about planned and current activities by HVO and external scientists online and updated it using “living documents” as well as through virtual information sessions. As part of this effort, the K-SAC developed a simple spreadsheet that listed the types of measurements that were being made, the groups making these measurements, and where the obtained data could be accessed. For example, rock samples collected from the eruption were documented, and a corresponding protocol on how to request such samples for analytical work was developed. We held virtual town hall meetings, open to all, to discuss these topics, as well as updates from HVO K-SAC members on the status of the eruption and HVO efforts.

The Future of CONVERSE

The recent virtual exercise and the experience with the Kīlauea eruption provided valuable knowledge in support of CONVERSE’s mandate to develop protocols for coordinating scientific responses to volcanic eruptions. These two events brought home to us the importance of conducting regular, perhaps yearly or even more frequent, tabletop exercises. Such exercises could be held in person or virtually to further calibrate expectations and develop protocols for scientific coordination during real eruptions and to create community among scientists from different institutions and fields. Currently, workshops to conduct two scenario exercises are being planned for late this year and early next year. One will focus on testing deformation models with a virtual magma injection event; the other will focus on a response to an eruption occurring in a distributed volcanic field in the southwestern United States.

Future exercises should build on lessons learned from the Hood scenario workshop and the Kīlauea eruption response. For example, although the SAC concept worked well in principle, the process required significant investments of time that delayed some decisions, possibly limiting windows of opportunity for critical data collection at the onset of the eruption. Although CONVERSE is focused on coordination for U.S. eruptions, its best practices and protocols could guide future international eruption responses coordinated among volcano monitoring agencies of multiple countries.

A critical next step will be the development of a permanent organizational framework and infrastructure for CONVERSE, which at a minimum should include the following:

- A mechanism for interested scientists to self-identify and join CONVERSE so they can participate in eruption response planning and activities, including media and communications training.
- A national-level advisory committee with equitable decisionmaking representation across scientific disciplines and career stages. The committee would be responsible for coordinating regular meetings, planning and conducting activities, liaising with efforts like the SZ4D and Modeling Collaboratory for Subduction initiatives, and convening eruption-specific SACs.
- Dedicated eruption SACs that facilitate open application processes for fieldwork efforts, including sample collection, distribution, and archiving. The SACs would establish and provide clear and consistent protocols for handling data and samples and would act as two-way liaisons between the USGS observatories and external scientists.
- A dedicated pool of rapid response instruments, including, for example, multispectral cameras, infrasound sensors, Global Navigation Satellite System receivers, uncrewed aerial vehicles, and gas measuring equipment. This pool could consist of permanent instruments belonging to CONVERSE and housed at an existing facility as well as scientist-owned distributed instruments available on demand as needed.

The SAC structure holds great promise for facilitating collaboration between U.S. observatories and external science communities during eruptions and for managing the many requests for information from scientists interested in working on an eruption. It also broadens participation in eruption responses beyond those who have preexisting points of contact with USGS observatory scientists by providing a point of contact and process to become engaged.

We are confident that when the next eruption occurs in the United States—whether it resembles the 1980 Mount St. Helens blast, the recent effusive lava flows from Kīlauea, or some other style—this structure will maximize the science that can be done during the unrest. Such efforts will ultimately help us to better understand what is happening at the volcano and to better assist communities to prepare for and respond to eruptions.

Acknowledgments

The CONVERSE RCN is funded by NSF grant 1830873. We thank all the participants of the Mount Hood Virtual Scenario Exercise and, specifically, the USGS CVO staff and the CONVERSE disciplinary leaders. We also thank USGS HVO staff for their insights and efforts during the ongoing Kīlauea eruption in making the K-SAC (and future SACs) a better vehicle for communication and collaboration. We thank Hawaii state volcanologist Bruce Houghton for developing the initial training course that served as a basis for the Mount Hood scenario workshop in collaboration with CVO scientists. Finally, we thank Tina Neal and Wes Thelen for their careful reviews of this paper. Any use of trade, firm, or product names is for descriptive purposes only and does not imply endorsement by the U.S. government.

Megaripples on Mars—How to Name Wind-Shaped Features on the Red Planet

EOS - Mon, 08/30/2021 - 13:16

Spacecraft on Mars have captured images of barren, desertlike landscapes complete with dunes of sand. But the windswept features are not identical to their terrestrial counterparts. The surface of the Red Planet is dotted by midsized sand masses not found on Earth. These features go by a variety of names: megaripples, sand ripples, sand ridges, and the less melodic transverse aeolian ridges (TARs) chief among them. But the nomenclature is inconsistent, causing confusion that hampers scientific advancement. Now, new research has proposed an official naming scheme for wind-formed features.

“Because we’re seeing new things on Mars, people have adapted what they are calling things,” said Mackenzie Day, a researcher at the University of California, Los Angeles. Day and James Zimbelman of the Smithsonian Institution coauthored the new paper, published in the journal Icarus. “People have adapted in slightly different ways.”

Broadly based, the new system classifies aeolian, or wind-created, features by size and geomorphology.

“As we’re getting new information, having a standard nomenclature makes sure everybody is on the same page,” Day said. “If we’re all talking about the same thing in the same way, it makes it easier as a scientific community to move forward in understanding what’s going on.”

Blowing in the Wind

Aeolian bed forms are piles of moving sand brushed across the planet’s surface by the wind. On Earth, the largest of these features are sand dunes, which can stretch for tens to hundreds of meters in length. Small ripples only a few tens of centimeters long can be carved on top of these dunes.

“Bed forms are really amazing interactions between the atmosphere and the surface,” said Serina Diniega, a research scientist at NASA’s Jet Propulsion Laboratory who is not associated with the new paper. “If you see one, you immediately have a whole bunch of information about the environment.”

In addition to dunes and ripples, Mars has a third type of bed form: transverse aeolian ridges. TARs appear to have been created by the wind but move on much slower timescales than their fellow bed forms and seem to be coated with a layer of fine-grained dust.

Day and Zimbelman proposed a broad frame of terminology for ripples, TARs, and dunes that relies first on the size and geomorphology of the features. As surface observations (anticipated soon from Curiosity and Perseverance) allow scientists to classify grain size and dust cover, the terminology can be further constrained.

Small ripples, for instance, are measured on centimeter scales in height and are classified as straight crested. Megaripples are measured at less than a meter in height and may be straight crested or sinuous. Unlike small ripples, megaripples may include coarse grains. TARs are classified as larger than a meter in height and straight crested. Dunes, the largest aeolian bed form on Mars, are classified as taller than 3 meters and have wildly varying geomorphologies: from straight crested or sinuous to radially symmetrical stars.
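The nested size and crest-shape criteria amount to a simple decision tree. Below is a minimal, purely illustrative Python sketch of that logic; the function name and the 0.1-meter cutoff standing in for “centimeter scales” are our assumptions, not code or thresholds published by Day and Zimbelman.

def classify_bedform(height_m, crest):
    """Classify a wind-formed bedform from its crest-to-trough height
    (meters) and crest geometry ("straight", "sinuous", or "star")."""
    if height_m > 3:
        # Dunes: the largest aeolian bedforms, any crest geometry
        return "dune"
    if height_m > 1 and crest == "straight":
        # TARs: larger than a meter in height, straight crested
        return "transverse aeolian ridge (TAR)"
    if height_m <= 0.1 and crest == "straight":
        # Small ripples: centimeter-scale heights, straight crested
        return "small ripple"
    if height_m < 1 and crest in ("straight", "sinuous"):
        # Megaripples: under a meter; may also carry coarse grains
        return "megaripple"
    return "unclassified"

print(classify_bedform(0.5, "sinuous"))   # megaripple
print(classify_bedform(2.0, "straight"))  # transverse aeolian ridge (TAR)

As the authors note, surface observations of grain size and dust cover would further subdivide these classes.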

Straight-crested transverse aeolian ridges in the lower part of the image give way to more complex star-shaped sand dunes in this terrain southwest of Schiaparelli Crater on Mars. Credit: NASA/JPL-Caltech/University of Arizona

According to Ryan Ewing, a geologist at Texas A&M University not involved in the new study, the biggest challenge of a settled nomenclature will be agreeing on the processes that created TARs. “I think as we uncover more about how sediments move on Mars by wind, that will help the community refine their definitions of these [features],” he said.

“I really like this paper because it’s attempting to apply some sort of structure around these terms,” said Diniega. “Using a classification based on looking at both Earth and Mars is better than a classification system based only on Earth.”

Sand Through the Solar System

Bed forms aren’t limited to Earth and Mars. They’ve been spotted on Venus and on Saturn’s moon Titan, and there have been signs of them on Pluto and Comet 67P.

“Every place that has an atmosphere—and even places that don’t have an atmosphere—we see an example of these bed forms,” Diniega said.

The new classification system should work on these bodies as well as on Earth and Mars, researchers said.

“As we start exploring the solar system more, like sending Dragonfly to Titan, it would be nice to have a nomenclature that could be applied independent of what planet you’re on,” Day said.

—Nola Taylor Tillman (@NolaTRedd), Science Writer

Geomojis Translate Geoscience into Any Language

EOS - Mon, 08/30/2021 - 13:16

This story is part of Covering Climate Now’s week of coverage focused on “Living Through the Climate Emergency.” Covering Climate Now is a global journalism collaboration committed to strengthening coverage of the climate story.

This is an authorized Spanish translation of an Eos article.

Emojis are pictograms used to convey particular messages. They have the same basic meaning in any language: a smile means a smile.

New Inversion Method Improves Earthquake Source Imaging

EOS - Mon, 08/30/2021 - 11:30

The increasing density and accuracy of geodetic measurements of earthquake-related movements of the Earth’s surface can improve our understanding of the physics of earthquakes, a critical requirement to better assess seismic hazard in tectonically active regions.

Modeling of such surface observations allows researchers to recover key parameters of the earthquake source, such as the geometry and spatial extent of the fault that broke during the earthquake, as well as the amount of slip on the fault during rupture (the “coseismic slip”), all of which relate to the energy released during the seismic event.

Although there has long been evidence of the geometric complexity of faults, most earthquake source models ignore this complexity, for the sake of simplicity and due to the lack of precise imaging of faults at depth. Planar fault geometries are generally assumed, which leads to biases in coseismic slip estimates.

Dutta et al. [2021] propose a method to simultaneously recover the fault geometry and the coseismic slip, allowing for non-planar faults and slip variability along the fault (described by a limited set of parameters to be estimated). The method ultimately provides not a unique fault and slip model but an ensemble of plausible models, with uncertainties on all the estimated parameters, which is also essential for a proper interpretation of the results.

The approach is validated, and its contribution discussed, using synthetic earthquake cases that mimic the main characteristics of real earthquakes in various tectonic contexts, underlining the importance of accounting for more realistic fault geometries in earthquake source modeling.

Citation: Dutta, R., Jónsson, S., & Vasyura-Bathke, H. [2021]. Simultaneous Bayesian estimation of non-planar fault geometry and spatially-variable slip. Journal of Geophysical Research: Solid Earth, 126, e2020JB020441. https://doi.org/10.1029/2020JB020441

—Cécile Lasserre, Associate Editor, JGR: Solid Earth

Amazon Deforestation and Fires are a Hazard to Public Health

EOS - Fri, 08/27/2021 - 12:51

Wildfires are increasingly common, and their smoky emissions can wreak havoc on human health. In South America, fires may cause nearly 17,000 otherwise avoidable deaths each year. Fire frequency in the Amazon basin has been linked to climate—drier conditions result in more fires—but direct human action, such as deforestation, drives up fire frequency as well.

Deforestation is often accomplished by burning vegetation, and those fires can spread out of control. Smoke from these fires also interacts with clouds and the Sun to further reduce rainfall, which creates dry, fire-prone conditions. Perhaps most subtly, deforestation breaks up the massive rain forest ecosystem, disrupting the forest’s effect on climate and creating a drier environment with greater fire risk.

The number of fires—and the amount of fire-generated air pollution—in the Brazilian Legal Amazon has closely shadowed the deforestation rate over the past 2 decades. In the early 2000s, high deforestation rates led to frequent fires and accompanying air pollution. Over time, the Brazilian government enacted policies to protect large sections of the rain forest, and the deforestation rate dropped. In the past decade or so, however, the rate of deforestation has been slowly climbing again, bringing with it increased fire and health risks.

In a new study, Butt et al. model the year 2019 under different deforestation scenarios to understand the link between these events in the rain forest and public health.

The researchers found that if 2019 had matched the year in the last 2 decades with the least deforestation, regional air pollution would have been substantially lower that year, resulting in 3,400 fewer premature deaths across South America. If, on the other hand, deforestation rates in 2019 had matched those of the early 2000s, before government regulations brought the rates down, the number of fires would have increased by 130%, and the number of deaths would have more than doubled to 7,900.

These models demonstrate the link between direct human action such as deforestation and environmental hazards and, consequently, public health. They also show how government environmental protections can have a substantial impact on human health. (GeoHealth, https://doi.org/10.1029/2021GH000429, 2021)

—Elizabeth Thompson, Science Writer

How Can Wristbands Monitor Pollution, PAHs, and Prenatal Care?

EOS - Fri, 08/27/2021 - 12:51

Wildfires, vehicle emissions, petroleum by-products, and even cooking can conjure images of climate change. Each category also produces polycyclic aromatic hydrocarbons, or PAHs, which are products of incomplete combustion. This group of hundreds of chemical species is toxic to human health, and as the world warms, more extreme weather will further exacerbate their presence in the atmosphere, said Natalie Johnson, an environmental toxicologist at Texas A&M University. Monitoring human exposure to these air pollutants, she said, is a public health issue.

In a new study published in the Journal of Exposure Science and Environmental Epidemiology, Johnson and her colleagues used silicone wristbands—like the ones worn by people supporting various causes—to track pregnant women’s exposure to PAHs. Their study took place in McAllen, Texas, which has high rates of premature births and childhood asthma—adverse health outcomes associated with poor air quality.

Highway to Poor Health

Studies show that mothers exposed to high levels of air pollutants have infants with an increased risk of developing respiratory infections, said Johnson. Moreover, if mothers live closer to sources of vehicle-related air pollution—like freeways—their children are more likely to develop asthma.

Three pathways transport PAHs into the human body, said Johnson. We can absorb them through our skin or ingest them by consuming charred foods. The third pathway is inhalation. This is a key pathway because our bloodstream can deliver PAHs throughout the body, she said. In pregnant women, the sanguineous superhighway can carry PAHs to the placenta. In this way, said Johnson, “[PAHs] can have some direct effects on the developing fetus.”

One problem PAHs can pose for people is cancer. By themselves, PAHs are typically not carcinogenic, but the pathways through which they can morph into cancer-causing molecules are known, said Pierre Herckes, an atmospheric scientist at Arizona State University who was not involved in the Texas study. Less well understood are the exact mechanisms through which PAHs might cause premature births and other adverse health outcomes in infants and children, he said.

Our bodies’ metabolisms can manage PAHs by converting them to free radicals, which are unstable, oxygen-bearing molecules that desperately want to react with anything that can give them electrons, said Johnson. Our bodies’ antioxidant systems can limit the impact of free radicals, she said. But when the scale tips toward more free radicals—more oxidants versus antioxidants—the antioxidant systems can become overwhelmed. The accumulation of free radicals can adversely affect growth and development in utero, she said, because “oxidative stress is tightly linked with inflammation.”

Too little inflammation leaves the body prone to viruses and bacteria, whereas too much results in the body overreacting to seemingly benign invaders, like dust. “Early in infancy, the prenatal exposures to these pollutants may cause immune suppression, and you may get inability to respond to important viruses like RSV,” said Johnson. The respiratory syncytial virus (RSV) can be deadly for premature and young infants. Later in life, the same children exposed to these pollutants at an early age tend to have too much inflammation, triggering asthma attacks or allergic reactions, she said. Exposures at the earliest developmental stages—in the womb or during infancy—may increase the possibility of lung disease.

Silicone Sampling

One of Johnson’s graduate students, Jairus Pulczinski, mentioned that other researchers had demonstrated the ability of silicone wristbands to passively sample pollutants. He suggested using them in an ongoing study of pregnant women in McAllen, which has poor air quality resulting from phenomena like Saharan sands blowing through the region and PAH-laden air wafting by from seasonal burning in Mexico.

Wristbands can qualitatively assess air pollution, said Johnson. “They’ve been really useful so far to say, ‘Yes or no, is there exposure?’”

However, Johnson and her colleagues were more interested in air quality in this case. “In our study, we actually placed [wristbands] on small backpacks because we were also sending out active air monitors.” Within the backpacks, two tubes actively sampled the air. One tube sampled heavier, particulate PAHs, whereas the second tube sampled lighter, volatile PAHs. Seventeen expectant mothers carried the wristband-tagged backpacks for 24 hours, sampling the ambient atmosphere. The wristband results compared well with the volatile PAH sampling tube.

In this graphical abstract of the Texas study, the center photo shows an actual active air sampler backpack, with a wristband affixed to the outside. The wristband collects data related to volatile and semivolatile PAHs: phenanthrene, biphenyl, 1-methylnaphthalene, and 2-methylnaphthalene. The backpack active air sampler also gathers data related to 2,6-dimethylnaphthalene, a particulate PAH. Credit: Natalie M. Johnson

Health care providers could use information provided by the backpack sampler to help identify whether the person, for example, lives with a smoker or has an open fireplace—both PAH sources, said Herckes.

The data might also provide important clues about the geography of health care. “More studies show that exposure is different by socioeconomic situation,” Herckes said. The “bad” part of town might be closer to the highway or near industries that produce more air pollutants, and quantifying these deleterious effects could play a role in environmental justice, he explained.

Johnson and her colleagues provided guidance for limiting PAH exposure to the expectant mothers in McAllen who were part of the study. Good air filtration in the home is paramount, and monitoring the air quality index also helps.

Wristband research is now focusing on the quantitative side: How much air pollution has someone been exposed to? If the amount of exposure is known, said Johnson, scientists can start detangling just how much exposure is detrimental to mother or child. In the future, she said, they plan to explore whether air quality regulations are stringent enough to ensure safe pregnancies. This, she said, could inform future policy. “Doing anything we can to mitigate these environmental exposures could have a potentially big impact on public health outcomes.”

—Alka Tripathy-Lang (@DrAlkaTrip), Science Writer

Tracking Sustainability Goals with Creative Data Sources

EOS - Fri, 08/27/2021 - 12:51

The United Nations has created 17 interlinked Sustainable Development Goals (SDGs) that “recognize that ending poverty and other deprivations must go hand-in-hand with strategies that improve health and education, reduce inequality, and spur economic growth—all while tackling climate change and working to preserve our oceans and forests.” The SDGs were unveiled in 2015 and are intended to be reached by 2030 in a process nicknamed Agenda 2030. Achieving the SDGs will be a challenge of scientific know-how, technical creativity, and political will.

But there’s one challenge that often slips under the radar: How do we actually track how well we’re doing? It turns out there are insufficient data for 68% of the environmental indicators needed to assess progress on the SDGs. Several areas with limited data are biodiversity, ecosystem health, and the concentration of pollution and waste in the environment.

“If we are going to be able to measure the environment in a way that allows us to make better interventions and investment, then we need better data,” said Jillian Campbell, head of monitoring, review, and reporting at the United Nations (U.N.) Convention on Biological Diversity, at a recent U.N. World Data Forum webinar.

“When you are missing data, it creates sort of a vicious cycle where you are making decisions on data that you don’t have, and you are also making a deprioritizing investment in the collection of that data,” she said.

Traditionally, data from academia, official statistical agencies, central banks, the private sector, and nonprofit organizations are gathered through surveys and censuses. To plug data gaps in these sources, experts are turning to geospatial technologies, crowdsourced science initiatives, and greater partnerships with Indigenous Knowledge holders.

Earth Observations from Ocean to Desert

Earth observations, which include space-based data, remotely sensed data, ground-based data, and in situ data, help provide spectral information that can be processed or transformed into high-level products that are useful to produce indicators and inform relevant SDG targets and goals, said Argyro Kavvada, program manager of SDGs at NASA.

For example, the GEO Blue Planet initiative works to advance the use of Earth observations to monitor coastal eutrophication and marine litter. (The Group on Earth Observations (GEO) is a global network of governments, academic and research institutions, data providers, businesses, engineers, and scientists.)

Kavvada said GEO Blue Planet has worked with the U.N. Environment Programme and Esri to develop a methodology that combines satellite information on factors such as chlorophyll concentrations with in situ and ground-based observations such as imagery and videos from uncrewed aerial vehicles and ship-based cameras. Such robust data can help scientists infer changes in water quality.

Similarly, GEO’s Land Degradation Neutrality initiative is working with the U.N. Convention to Combat Desertification to develop data quality standards, analytical tools, and remote sensing data to help support land degradation monitoring and reporting. The group is looking at how globally available Earth observation data sets can complement national data for three main SDG concerns: land cover, land productivity, and soil data.

“They are looking for key requirements for the global data sets to contribute, and for the suitability of those data sets in supporting country efforts, timeliness of the data, and spatial cover rates,” Kavvada said.

Integrating Geospatial Information

The Food and Agriculture Organization (FAO) of the United Nations is the custodian agency for 21 out of the 231 SDG indicators. Its roles include supporting countries to develop the capacity to generate, disseminate, and use national data, as well as to realign their national monitoring frameworks to SDG indicators.

At the FAO, guiding progress on the SDGs increasingly relies on integrating geospatial information provided by Earth observations. “Geospatial information and satellite Earth observations offer unprecedented opportunities to support national and global statistical systems,” said Lorenzo De Simone, a geospatial specialist in the office of the chief statistician at the FAO.

Broadening the scope of data may make monitoring environmental progress more cost-effective and efficient, experts say. Geospatial data, for instance, can be scaled and integrated with traditional sources of socioeconomic and environmental data such as surveys.

For instance, the FAO developed a new SDG indicator directly monitored with Earth observation data. SDG indicator 15.4.2, the Mountain Green Cover Index (MGCI), uses remotely sensed images to measure changes in mountain vegetation such as forests, shrubs, and individual trees.
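As a rough gloss (our formulation, based on the FAO’s published definition rather than on wording from the forum), the index expresses remotely sensed green vegetation as a share of mountain area:

\mathrm{MGCI} = \frac{\text{mountain area classified as green cover}}{\text{total mountain area}} \times 100

An increase in the index therefore signals expanding vegetation, such as forest or shrubland, within a country’s mountain regions.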

De Simone said the FAO is committed to helping member states develop Earth observation technology. EOSTAT, for example, is aimed at building capacity with Earth observations (EO) to produce national agricultural statistics and support practices that increase efficiency in the use of fertilizer and chemicals to boost production output. De Simone said four EOSTAT pilots have been implemented, in Afghanistan, Lesotho, Senegal, and Uganda.

Mapping Crowdsourced Science

There is untapped potential for crowdsourced science (described as “voluntary public participation in scientific research and knowledge production”) to plug some of the data gaps for SDG indicators, according to a study done by Dilek Fraisl at the International Institute for Applied Systems Analysis. “We should start thinking how we harness the potential,” she said.

When data are lacking for the SDGs, relevant agencies within countries can search for crowdsourced projects that may help fill some of these data gaps and reach out to them, said Fraisl.

“In cases where citizen science projects do not exist but data are lacking, relevant agencies within countries might consider working with local communities on the ground on issues that are important to them but might also help to fill data gaps,” Fraisl said.

For example, Fraisl said crowdsourced science was crucial to monitoring marine debris in Ghana, a project of the Ghana Statistical Service. As individuals and groups engaged in beach cleanups along Ghana’s 550-kilometer-long coastline, they cataloged the numbers and types of marine debris they found.

In addition to communities and individuals, the initiative involved federal agencies (such as the Ghana Statistical Service and the Ghana Environmental Protection Agency), nongovernmental organizations (such as the Ocean Conservancy), and intergovernmental organizations (such as the U.N. Convention on Biological Diversity).

“One of the most valuable lessons from this initiative is that working with existing initiatives…utilizes existing tools [and is] more resource efficient than starting an initiative from scratch,” Fraisl said.

Indigenous Knowledges

Indigenous Knowledges are not a traditional source of data for monitoring environmental progress on the SDGs. But such knowledge could provide valuable information on natural resources, public services, and population demographics.

For example, Indigenous rangers in Arnhem Land, Australia, are using science-based water monitoring techniques to test salinity, toxicity, and microbiological contaminants in freshwater streams on their ancestral homelands, according to one recent study. Such techniques “complement local Indigenous knowledge concerning the health of waterways, such as the taste, smell, and color of water in specific places, combined with knowledge of the presence or absence of key attributes that can serve as proxies for the status and condition of freshwater ecosystems.”

A more comprehensive use of Indigenous Knowledges and other nontraditional methodologies can thus help bridge data gaps in monitoring the SDGs, researchers said, as well as contributing to better stewardship of local ecosystems.

—Munyaradzi Makoni (@MunyaWaMakoni), Science Writer

Meet Jane, the Zircon Grain—Geochronology’s New Mascot

EOS - Fri, 08/27/2021 - 12:51

There is no “once upon a time” in the children’s book Jane’s Geological Adventure, but if there were, that time was 400 million years ago, in a world replete with creepy-crawly creatures threading their way through a lush verdure of unfamiliar plants. As a volcano’s magma chamber seethed, a zircon grain named Jane was born, growing until she erupted onto Earth for a full life of metamorphism, multiple mountain-building adventures, sundry erosion styles, and her most recent phase: display at a museum.

As part of his outreach efforts, author and geochronologist Matthew Fox, a lecturer at University College London, created Jane, the zircon grain, modeling her life after rocks similar to the Jura Mountains in Switzerland. “That we can actually understand this much information from a single grain of sand is really incredible,” said Fox, “and I wanted to try to describe how we can do that.”

Jane’s Geological History

As Jane metamorphoses, other minerals marking the rock’s transformation grow, including Mitesh the mica (center) and Gary the garnet. Credit: Martin Fox

“You can think about this as a children’s book,” Fox said, or “you can think about it in terms of how you could actually extract that information from a crystal, which might require different analytical methods.” As nature’s time capsules, zircons like Jane can retain evidence of multiple high-temperature events, like the timing of crystallization or metamorphism. “That’s why we use them for geochronology,” he said.

As Jane metamorphoses, she is joined by Gary the garnet and Mitesh the mica. Although the characters are anthropomorphized, the metamorphic mineral assemblage is real. “You can look at trace element concentration within different zones to see what other minerals might have been growing at these different time intervals,” explained Fox.

In this excerpt from the book Jane’s Geological Adventure, Jane the zircon bumps along the river bed as animals appropriate to the Cretaceous period swim and play. Credit: Matthew Fox and Martin Fox

In this excerpt from the book Jane’s Geological Adventure, a geologist collects Jane from an outcrop. Credit: Matthew Fox and Martin Fox

The shape of the crystal itself provides additional clues, said Fox. For example, Jane’s distinct points eroded away as she bumped along a river bottom.

After this tumultuous travel, the sediments in which she landed eventually lithified and rose skyward as mountains. From this vantage, Jane watched glaciers carve the land before being plucked from an outcrop by a geochronologist who wrings history from Jane’s lattice.

By describing the many geological processes that Jane (and, by extension, mountains like the Swiss Jura) experienced, Fox said, “you can get a sense of how much can fit into such a long period of time.”

A Family Project

Although Jane’s geological tale spans 400 million years, the book itself has a much younger provenance. After years of scribbling short geology-themed poems during field trips, Fox began to toy with writing a longer poem for children. In 2018, shortly after Fox joined University College London as a Natural Environment Research Council Independent Research Fellow, he began to compose Jane’s story on his phone, during his commute.

As Fox refined the rhyme, he reached out to several friends and colleagues, many of whom worked on zircon-related quandaries (including the author of this article). With the support of his community, Fox became convinced that a children’s book was worth pursuing. However, without funds to pay for an illustrator, he was stuck.

At this point, the project became a true family affair. Fox’s mother contributed indirectly to the story because Jane is her namesake. Fox proposed a collaboration to his father, Martin Fox, an architect and occasional painter, who agreed to help. Fox the elder created a playfully anthropomorphic, but scientifically precise, depiction of Jane’s journey, while Fox the younger ensured the details were correct—for example, that only dinosaurs from the same era feature in Jane’s story.

Connecting with Kids and Parents

As the Fox family worked to illustrate Jane’s exploits, Matthew Fox began looking forward to fatherhood himself. Fox’s daughter was born soon after he finished the book and just as the COVID-19 pandemic began in spring 2020.

Jane’s Geological Adventure was written by Matthew Fox and illustrated by Martin Fox. Credit: Alka Tripathy-Lang

The pandemic thwarted Fox’s plan to sell the book at conferences to eschew postage. He opted to sell the book via his website instead, publishing the first 200 copies during parental leave. “[Fatherhood] made me appreciate how important children’s books are and how important that time is where you actually interact with children,” said Fox. “My partner says it’s one of [my daughter’s] favorite books.”

Structural engineer Jan Moore of Salt Lake City, Utah, said that Jane’s appeal is not limited to children. “[My kids] really got the idea [that] the dinosaurs existed at one time and not another, which I thought was an advanced idea that kids don’t always grasp,” she said. “I don’t think I really grasped what the age [of a rock] really meant until there was a children’s book to explain it.”

Creative Public Outreach

When it comes to public outreach, said Fox, “everyone’s got different skills.” For example, although he’s spoken at schools around London, he acknowledges that public speaking sometimes makes him nervous.

He had a different approach to the book. “This was something that I quite enjoyed doing…and I thought I could contribute to outreach in a way that might be potentially more far-reaching,” Fox said. He plans to donate any profits made from the sales of Jane’s Geological Adventure to GeoBus, an outreach activity funded by the U.K. Natural Environment Research Council wherein a van brimming with activities designed to engage children in geology travels to different schools.

To other researchers trying to expand their outreach, Fox offered some tried-and-true advice: “Try and do outreach activities that you enjoy doing.” If the outreach you’re doing is something you’re excited about, he said, people will respond to that.

—Alka Tripathy-Lang (@DrAlkaTrip), Science Writer

Not So Hot Under the Collar

EOS - Fri, 08/27/2021 - 11:30

The InSight lander, which successfully touched down on Mars on 26 November 2018, carried a heat flow and physical properties (HP3) package. One component of this package was a heating experiment designed to measure the Martian soil’s ability to transport heat. Grott et al. [2021] report on results of this experiment. Although HP3 was designed to be deployed in a vertical configuration up to 5 meters (16.4 feet) below the ground via a self-hammering penetrator dubbed the “mole,” a depth of only 30 centimeters was achieved because of issues with the deployment. However, that depth was sufficient to conduct the heating experiment (see image above).

Numerical modeling of the heating data yielded a soil thermal conductivity of 0.039 W/(m·K). This value is more than a factor of two smaller than that determined during the Phoenix Mars mission, the only direct measurement of this quantity prior to InSight (Zent et al., 2010). The presence of cementing agents at the Phoenix site, such as shallow subsurface water ice and perchlorate salts, likely explains this difference. Overall, both values show that the Martian soil is a poor thermal conductor.
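For intuition about how a conductivity falls out of an active heating experiment, consider the classical line-heat-source approximation used in needle-probe measurements; this is a textbook idealization offered only for illustration, as Grott et al. fit their data with a full numerical model rather than this closed form. At long times, the temperature rise of a heated cylindrical probe grows with the logarithm of time, and the conductivity follows from the slope:

\Delta T(t) \approx \frac{q}{4\pi k}\ln t + C \quad\Longrightarrow\quad k \approx \frac{q}{4\pi}\left(\frac{d(\Delta T)}{d\ln t}\right)^{-1}

where q is the heating power per unit length of the probe and k is the thermal conductivity. A small k, such as the 0.039 W/(m·K) reported here, means the mole warms rapidly for a given heating power because little heat leaks away into the surrounding soil.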

Comparisons with results from laboratory experiments were used to estimate the soil grain size, suggesting that the overwhelming majority of the soil particles are less than 200 microns (0.2 millimeter, or about 1/128 inch) across, corresponding to a fine sand. This is consistent with the surface geology of the landing site, named Homestead hollow, being a depression filled with eolian, or wind-blown, deposits (Grant et al., 2020; Weitz et al., 2020). The heat flow data also suggested that porosity is high (>60%) and that the degree of cementation is low. The latter is a somewhat surprising result, since visual observations strongly suggest the presence of a duricrust, or partially cemented layer (Golombek et al., 2020; Marteau et al., 2021). The surface of Mars continues to yield mysteries, even when we dig beneath the surface.

Citation: Grott, M., Spohn, T., Knollenberg, J., Krause, C., Hudson, T. L., Piqueux, S., et al. [2021]. Thermal conductivity of the Martian soil at the InSight landing site from HP3 active heating experiments. Journal of Geophysical Research: Planets, 126, e2021JE006861. https://doi.org/10.1029/2021JE006861

—Germán Martinez, Associate Editor, and Bradley J. Thomson, Editor, JGR: Planets

New View of Expanding Perspectives in the Geosciences

EOS - Thu, 08/26/2021 - 13:25

Geology and other geosciences, physical geography, and environmental sciences remain “disproportionately white” in the United Kingdom and the United States, according to a new study. Addressing reasons for this disparity and increasing minority representation in science, technology, engineering, and mathematics (STEM) careers, the authors argue, are crucial for creating a stronger academic field that is more fully capable of meeting challenges that cross disciplines and policy.

According to the new research, published in Nature Geoscience, geoscience students represent a narrow subset of the population in the United Kingdom. Between 2018 and 2019, just 5.2% of physical geography, 6.9% of environmental science, and 10.4% of geology postgraduate students identified as Black, Asian, or minority ethnic (BAME), even though these groups represent 18.5% of the 18- to 24-year-old population. In the past 5 years, there have been two years when no Black women took up full-time postgraduate research in geology or physical geography programs.

The U.S. panorama doesn’t look better. In the past 40 years, about 85% of people earning doctorates in the geosciences have come from white, non-Hispanic backgrounds. According to National Science Foundation data, of the 610 geoscience doctoral degrees awarded to U.S. citizens in 2016, white students received 480 (79%), Asian students 28 (5%), Hispanic or Latino students 27 (4%), Black or African American students 11 (2%), and Native American students 5 (less than 1%).

The researchers agree that inequitable access to geoscientific education has one of its roots in how geosciences were historically defined: with colonialism, white supremacy, and resource exploitation. This legacy prevents early-career Black, Asian, Indigenous, LGBTQ+, and disabled researchers from identifying with the classic image of a geoscientist.

Natasha Dowey, a lecturer in physical geography at Sheffield Hallam University in the United Kingdom and lead author of the study, said that the first step to bridging the inclusion gap is decolonizing early geoscience education by supporting it with history and sociology classes for “telling the whole truth of the subject from different experiences and viewpoints.”

Big Changes Require Diverse Perspectives

To support these perspectives, professors have to understand that students from underrepresented backgrounds can identify with more than one cultural or social identity. Recognizing such intersectional identities will help in “making sure that no student has been left behind,” said Ann-Marie Núñez from the Department of Educational Studies of The Ohio State University. Núñez was not involved in the Nature Geoscience study.

Greater inclusion in geoscience will help communities outside academia, the report authors contend: From studying air, soil, and groundwater pollution to making risk assessments and finding new minerals, Earth and environmental studies have everything to do with addressing the climate crisis. “If we don’t have a diverse workforce working on those problems, how are we going to be genuinely tackling them equitably across all communities?” asked Dowey.

Mitzy Cortés was selected by the U.K. embassy in Mexico City as an “Ambassador for a Day” to discuss the need for including the perspectives of women from Indigenous communities in the framework of the 26th U.N. Climate Change Conference of the Parties (COP26). Cortés is a Mixteca student at the National Autonomous University of Mexico’s Mexican Indigenous Languages Promotion and Advocacy project.

From Cortés’s perspective, research and decisionmaking on climate change issues are still approached from a privileged, exclusionary perspective. Establishing clean energy projects such as wind farms or hydroelectric plants, for instance, has serious consequences in Native and Indigenous territories, where land dispossession is a familiar practice. Those industries are often accompanied by territorial militarization, she said, which increases violence, especially for women. Including the perspectives of those who have experienced such a situation, she explained, is a necessary practice to find solutions to climate issues that don’t perpetuate violence against women from rural and Indigenous communities.

Dealing with climate change is much more than diminishing carbon emissions, Cortés said. It’s understanding how the climate crisis differently affects all communities around the world, especially those that have been historically marginalized.

“Our academic research is fully determined by our life experiences. And we’ve been historically excluded for where we’ve [been] born, the color of our skin and our gender….There’s no way to save the planet if those oppression systems are not disarticulated,” she concluded.

Academic, Institutional, and Government Reform

To bridge the inclusion gap, not just in geoscience but in STEM in general, much more effort from academia, government, and institutions is needed, said the paper authors.

Changing the elitist logic of high-tariff universities’ recruitment processes is a good starting point, explained Christopher Jackson, chair in sustainable geoscience at the University of Manchester in the United Kingdom and a coauthor of the new study. The tariff classification system describes “a university’s reputation in the international marketplace” on the basis of established entry standards data. High-tariff universities in the United Kingdom include Oxford, Cambridge, and University College London.

The admission standards used by high-tariff universities “have nothing to do with smartness,” Jackson said. “It’s the fact that some [applicants] grow up in a better socioeconomic circumstance that allows them to get a high grade, that then allows them to get accepted into a specific university.”

From the government side, many U.S. researchers say it’s necessary to support public minority-serving institutions (MSIs). MSIs enroll 16% of all African American students, 40% of all Hispanic American students, and a rapidly growing number of Asian American students in higher education, according to U.S. Department of the Interior data.

Institutions are also responding. In 2019, AGU started the first geoscience-focused inclusion initiative, the Bridge Program, as part of the Inclusive Graduate Education Network (IGEN). IGEN now has 31 partner institutions all over the United States.

The Bridge Program is focused on giving a second chance to students from underrepresented backgrounds—focusing on their personal stories, passions, and interests in science; mentoring them; and monitoring their academic development to ensure they graduate. These steps are needed to bridge the inclusion gap, said Pranoti Asher, assistant director for grants and education programs at AGU.

Like Cortés, Asher emphasized the importance of recognizing diverse perspectives when addressing the climate crisis and other instances of global change. “That’s what is needed to help future geoscientists to become the broad diverse thinkers that we need to solve all the incoming issues for the next decades,” Asher said.

—Humberto Basilio (@humbertobasilio), Science Writer

Indigenous Peoples Harness Space Technology to Stop Deforestation

EOS - Thu, 08/26/2021 - 13:25

In the Peruvian Amazon, deforestation is being driven by illegal gold mining, logging, and clear-cutting for cultivation of crops like palm oil and coca. Between 2001 and 2016, the Peruvian Amazon lost nearly 2 million hectares of forest.

More than one third of the Amazon rain forest falls within the territory of more than 3,000 formally acknowledged Indigenous groups, but the size and inaccessibility of Indigenous Peoples’ territory in the Amazon mean that timely alerts from satellite data can make a big difference in their existing antideforestation patrol efforts. For example, alerts can allow communities to take preventive actions, such as blocking the rivers where loggers entered.

To determine the effectiveness of timely deforestation alerts derived from recent satellite data, Indigenous Peoples in the Peruvian Amazon teamed up with scientists and conservation organizations. They analyzed deforestation rates in Indigenous communities with access to alerts about deforestation in their territory and compared them with rates from groups using other patrol methods.

The findings, published in July in the Proceedings of the National Academy of Sciences of the United States of America (PNAS), showed that from 2018 to 2020, there was a notable reduction in tree cover loss among communities with access to satellite data.

The study suggested that governments should provide Indigenous communities greater access to satellite data. “As a policymaker, you want to know: If a monitoring method works on this site, it might work somewhere else,” said Tara Slough, assistant professor of politics at New York University and lead author of the paper.

Training Locals to Monitor Forests

Kichwa forest monitors fill out a deforestation report in Sunullacta, Peru. Credit: Melvin Shipa Sihuango/ORPIO/RFUS

In 36 out of 73 participating Indigenous communities, researchers trained local people to use a combination of two smartphone mapping applications (Locus Map and Global Forest Watcher), with monitors receiving monthly deforestation alerts from Peru’s national GeoBosques deforestation-monitoring platform, which uses NASA’s Landsat data. They could then head out with the phone, document the problem area, see what activities were going on, and make a report to the community council.

Wendy Pineda, a project coordinator for Rainforest Foundation US—the rights-based forest protection organization that funded the research project—has been working for more than a decade to bring more high-tech monitoring tools to Indigenous communities.

For this study, each of the noncontrol communities designed its own monitoring plan, tailored to existing and potential threats in its area. For example, a Ticuna community in Buen Jardin de Callaru that was heavily threatened by land invasion from coca farmers was encouraged to send its monitoring data of 7 hectares of deforestation to Peru’s Environmental Prosecutor’s Office. As a result, the invaders left, deforestation halted, and the community is now the beneficiary of a reforestation project.

“Indigenous Peoples have done [forest monitoring] for their entire existence and will continue to do so, only now they can be more decisive, thanks to technology,” Pineda said. “Satellite imagery and technology…only complemented and enhanced the effectiveness of their plans.”

Jorge Perez is president of the Indigenous People’s Organization of the Eastern Amazon (ORPIO), which has long fought for land rights and preventing deforestation. ORPIO’s member communities participated in the study, and according to Perez, they are the ones who know the territory, know its problems, and feel the impacts of deforestation.

According to Perez, the satellite information aided Indigenous monitors in responding more quickly to sites where illegal deforestation was taking place. More immediate notification also allowed authorities, like the Ministry of Environment and the Environmental Prosecutor’s Office, to build the case against those engaged in illegal activity.

“Communities are experiencing the positive impacts of the intervention, so many continued to monitor even when funding ended and the pandemic began,” Pineda said.

Empirical Evidence

Ane Alencar, director of science for the Amazon Environmental Research Institute who wasn’t involved in the PNAS study, said empirical evidence of deforestation reduction helps generate strong arguments and ideas for policymakers.

“In this case, the availability of real-time information on deforestation…seemed to end up empowering the communities to do peer enforcement,” Alencar said. Consistency is key, she warned: Over time, the effect of community empowerment may fade away if offenders perceive that there are no consequences.

One criticism of the PNAS paper is that none of the authors are from Peru, raising the specter of colonial science, in which local collaborators contribute to a major paper in a prestigious journal published by scientists from the Global North but don’t receive the academic benefits of being named as authors.

“I think it is very important to engage local actors or experts in scientific studies, since they are aware of the context and they are able to redirect and enrich any discussion or conclusion, while avoiding any possibility of misinterpretation of the results,” Alencar said.

Back in the Peruvian Amazon, Perez said he wants more climate funding, including a recent commitment from Germany, Norway, the United Kingdom, and the United States, to arrive directly to Indigenous communities to help them to continue to defend their territory.

“Even if funds run out, we are able to continue to use this knowledge,” he said.

—Andrew J. Wight (@ligaze), Science Writer

Explaining Thermal Tides in the Upper Atmosphere During the 2015 El Niño

EOS - Thu, 08/26/2021 - 13:25

Much like the oceans, the atmosphere on Earth oscillates on a global scale. These so-called atmospheric tides depend on the Sun’s heat and gravity, as well as the pull of the Moon and Earth’s own rotation. In the troposphere, scientists have identified a regular tide, which they call DW1, that has a 24-hour period and a zonal wave number of 1. Zonal wave number refers to the number of troughs and peaks that can be observed simultaneously in a wave as it circles the entire globe, meaning in this case that there is only one of each.
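In the standard tidal decomposition (our notation, added for readers unfamiliar with the naming convention), the temperature perturbation from a migrating tide at a fixed latitude and altitude can be written as a traveling wave, and DW1 is the diurnal (D), westward-propagating (W) mode with zonal wave number 1:

T'(\lambda, t) = A\cos\left(\Omega t + s\lambda - \phi\right), \qquad \Omega = \frac{2\pi}{24\ \mathrm{hours}}, \quad s = 1

where \lambda is longitude, A is the amplitude, and \phi is a phase constant. With the plus sign on s\lambda, the wave crest moves westward with the Sun, so a fixed observer sees exactly one oscillation per day.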

Researchers have known for several decades that DW1 exists in large part because of heating related to tropospheric water vapor, which then propagates up into the mesosphere and lower thermosphere. During the 2015 El Niño, scientists saw an anomalously large enhancement of DW1.

In a new study, Kogure and Liu sought to explain the cause of that enhancement by investigating two potential drivers. First, they looked at the effect of enhanced tropospheric tidal heating caused by El Niño. However, the team reported that the 2015 El Niño event increased heating by only 0.4 milliwatt per kilogram, which equates to an extra 5%. In turn, the team says, 5% more tropospheric heating could explain only 7% of the thermal tide enhancement.
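For scale, those two figures imply a baseline tidal heating rate of roughly

\frac{0.4\ \mathrm{mW/kg}}{0.05} = 8\ \mathrm{mW/kg}

a back-of-the-envelope inference from the quoted numbers, not a value stated by the authors. The mismatch between a 5% increase in forcing and the much larger observed tide enhancement is what points to a second driver.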

The other 93% comes from a reduction in dissipation in the atmosphere. As soon as DW1 begins to propagate, it also begins to dissipate as the air it’s pushing into drags against it, which is part of the natural life cycle of the tide. However, during the 2015 El Niño, wind impacts on the thermal tide were greatly reduced, causing less dissipation and a net enhancement of DW1. Specifically, the researchers suggest that the quasi-biennial oscillation (QBO) in the lower stratosphere can account for the reduced dissipation by suppressing the vertical wavelength and the wind shear in the northward direction. The QBO describes how equatorial zonal winds in the stratosphere shift between easterlies and westerlies roughly every 2 years. The 2015 El Niño corresponded to an eastward QBO phase, which the researchers say created favorable conditions for the enhanced thermal tide. (Journal of Geophysical Research: Space Physics, https://doi.org/10.1029/2021JA029342, 2021)

—David Shultz, Science Writer
