Eos

Science News by AGU

Hear Ye! Hear Ye! A Declaration of the Rights of the Moon

Tue, 07/20/2021 - 12:16

Sometime this decade, humans will probably stand on the Moon for the first time since 1972. U.S. President Joe Biden recently committed to NASA’s Artemis program, which aims to land the first woman and the first person of color on the lunar surface by 2024. Other countries and private companies want to send people, too.

This time, they might take more than photographs and a few rocks.

Mining on the Moon is becoming increasingly likely, as growing numbers of countries and corporations hope to exploit its minerals and molecules to enable further exploration and commercial gain. The discovery of water on the lunar surface has raised the possibility of permanent human settlement, as well as making the Moon a potential pit stop on the way to Mars: Water can be split into hydrogen and oxygen and used to make rocket fuel.

An infographic shows how Moon mining could work. Credit: David Michaud, 911metallurgist.com

In 2015, the U.S. Congress and President Barack Obama passed legislation that unilaterally gave American companies the right to own and sell natural resources they mine from celestial bodies, including the Moon. In 2020, President Donald Trump issued an executive order proclaiming that “Americans should have the right to engage in commercial exploration, recovery, and use of resources in outer space…and the United States does not view it as a global commons.”

Other countries are also interested in exploring our nearest celestial neighbor. In 2019, China landed a probe on the farside of the Moon. Russia is resurrecting its Moon program, planning a series of missions starting in 2021 to drill into the surface of the lunar south pole and prospect for water ice, helium-3, carbon, nitrogen, and precious metals.

Corporations have been plotting out their own ways to claim resources on the Moon, including U.S.-based SpaceX and Blue Origin, and the Japanese lunar exploration company ispace—which, according to its website, aims to mine water and “spearhead a space-based economy.” The company also anticipates that by 2040 “the Moon will support a [permanent] population of 1,000 people with 10,000 visiting every year.”

But what effects might these activities have on Earth’s only natural satellite? Who gets to decide what happens on the Moon?

We, the People of Earth

In a bid to get more people thinking about these questions, and to start a conversation about the ethics of exploiting the lunar landscape for profit, a group of mainly Australian academics has come up with a draft Declaration of the Rights of the Moon, which they hope members of the global public will sign and discuss.

“We the people of Earth,” the declaration begins, and goes on to assert that the Moon is “a sovereign natural entity in its own right and…possesses fundamental rights, which arise from its existence in the universe.” These rights include “the right to exist, persist and continue its vital cycles unaltered, unharmed and unpolluted by human beings; the right to maintain ecological integrity…and the right to remain a forever peaceful celestial entity, unmarred by human conflict or warfare.”

Given the acceleration of planned missions and ongoing legal uncertainty over what private companies are allowed to do in space, the authors said, “it is timely to question the instrumental approach which subordinates this ancient celestial body to human interests.” Now is the time, they said, to have a clear-eyed global debate about the consequences of human activity in a landscape that has remained largely unchanged for billions of years.

The declaration was penned after a series of public fora organized by Thomas Gooch, a Melbourne-based landscape architect. The discipline of landscape architecture is well suited to having a voice in Moon exploration, he said: “We walk the line of science, art, creativity, nature, and human habitation.”

Existing international space agreements address safety, conflict reduction, heritage preservation, sharing knowledge, and offering assistance in emergencies. These are all people-centric concerns; the aim of the declaration is to give the Moon a voice of its own, as a celestial body with an ancient existence separate from human perceptions, Gooch said.

The Moon might not have inhabitants or biological ecosystems—or, at least, we haven’t found any yet—but that doesn’t mean it is a “dead rock,” as it is sometimes described. “Once you see something as dead, then it limits the way you engage with it,” said Gooch.

The declaration, as coauthor Alice Gorman sees it, is a position statement to which companies and countries operating on the Moon could be held accountable. Gorman is a space archaeologist studying the heritage of space exploration (and the junk humans leave behind) at Flinders University in Adelaide, Australia.

“Have they respected the Moon’s own processes?” she asked. “Have they respected the Moon’s environment? Some of the time, the answer to that is going to be no, because you can’t dig up huge chunks of a landscape and expect there to be no impact.

“But if that’s the guiding principle, if that’s something that they’re attempting to achieve from the beginning, then that’s surely got to give us a better outcome than if we turn around in 10 years’ time and realize that if you look at the Moon with the naked eye you can see the scars of mining activities.”

The Dusty, Living Moon

Recent discoveries suggest the Moon is a much more complex and dynamic place than was previously thought, said Gorman.

It has seismic activity, including moonquakes and fault lines. Ancient water ice was directly observed at both lunar poles in 2018, hiding in shadowy areas that haven’t seen sunlight in 2 billion years. “Surely that’s environmentally significant,” said Gorman. “Even in completely human terms, 2-billion-year-old shadows are aesthetically significant.”

Individual water molecules have also recently been identified on the Moon’s sunlit surface, and there may even be a water cycle happening, with the molecules bouncing around over the course of a lunar day.

Gorman is vice chair of an expert group affiliated with the Moon Village Association, an international organization that hopes to establish a permanent human presence on the Moon. “I’m as motivated by the excitement of space science as the most hardcore space nut,” she said.

As such, she recognizes it’s inevitable that human activities—building a village, conducting scientific experiments, or extracting minerals—will have some kind of environmental impact on the Moon. Mining will require extraction machinery, processing facilities, transportation infrastructure, storage, and power sources, Gorman said. “It’s not just, ‘Let’s dig a hole on the Moon.’”

Lunar dust coats the boots of astronaut Edgar Mitchell in 1971. Credit: NASA

Lunar dust, for instance, is an important concern. Sticky, abrasive, and full of sharp fragments of obsidian, it eroded the seals on Apollo astronauts’ spacesuits and coated their instruments, making data hard to read. It smelled of “spent gunpowder,” gave Apollo 17’s Harrison Schmitt a kind of hay fever, and turned out to be extremely hazardous to respiratory health—the grains are so sharp they can slice holes in astronauts’ lungs and cause damage to their DNA.

Machinery designed to operate on the Moon will need to be resistant to abrasion by the lunar dust. And some research suggests that too many rockets landing on and taking off from the Moon could lift significant quantities of dust into the exosphere. “There’s the potential to create a little dust cloud around the Moon,” said Gorman, “and we don’t yet know enough about how the Moon operates in order to properly assess those impacts.”

A Space for Capitalism

In theory, existing space law should already protect the Moon from commercial exploitation, said Gbenga Oduntan, a reader in international commercial law at the University of Kent in the United Kingdom. Originally from Nigeria, Oduntan was inspired to study law by the fact that nations got together to agree on and create the Outer Space Treaty—a “beautiful” idea that made him “proud of mankind.”

In the treaty, which came into effect in 1967, nations agreed that space (including the Moon) “is not subject to national appropriation by claim of sovereignty” and that “exploration and use of outer space shall be carried out for the benefit and in the interests of all countries and shall be the province of all mankind.” For Oduntan, the meaning is clear: Mining on the Moon would be legal if the resources were used for further exploration and scientific research on behalf of all humanity, “but appropriation for sale is a vastly new territory which we cannot allow countries, not to mention companies, to run along with on their own,” he said.

Successive U.S. administrations have had a different interpretation: that outer space is a space for capitalism. In 1979, the United States refused to sign the Moon Agreement, another United Nations treaty that specifically declared that lunar resources were the “common heritage of mankind” and committed signatories to establishing an international regime of oversight when resource extraction was “about to become feasible.” (Lack of support from the major space powers led to only 18 countries signing it, and it remains one of the most unpopular multilateral treaties.)

Instead, in 2015, once extraction actually was about to become feasible, the Space Act explicitly gave U.S. companies the right to own and sell resources they mine from space, as well as 8 more years mostly free of government oversight. (In a 2015 article, Oduntan called it “the most significant salvo that has been fired in the ideological battle over ownership of the cosmos.”)

Scott Pace, a professor of international affairs at George Washington University and director of the U.S. Space Policy Institute, said that legally speaking, space is not a global commons. (In his former role as head of the National Space Council, Pace worked on the 2020 Trump executive order—which also explicitly repudiated the Moon Agreement.)

“Just because an area is beyond sovereignty doesn’t make it a global commons,” he said. “Commons implies common ownership and common responsibility, which means…[other countries get] a say in what the United States does out there.”

Instead, the official American view is that “rules on frontiers and shared domains are made by those who show up, not by those who stay behind,” as Pace put it. To that end, the United States has signed nonbinding bilateral agreements—the Artemis Accords—with, so far, 11 other countries that hope to work with the United States on upcoming lunar missions. The accords aim to set norms of behavior for activity on the Moon, Pace said, although some experts have pointed out that they might also be designed to reinforce the U.S. interpretation of the Outer Space Treaty on resource exploitation.

Oduntan believes that all countries should get a say in what happens in space and on the Moon, even countries that are not yet capable of or interested in going there. Such a perspective is not about “exporting communism into outer space,” he said. Instead, the point is to recognize that conflict over resources is inevitable. “Commercialization of outer space in a Wild West mode is going to lead faster to disputes. There will be turf wars. And experience shows us that lack of regulation leads to tears.”

Rock Rights

So could giving the Moon its own rights be one way to provide that kind of oversight and help ensure that countries and companies act in ways that minimize harm to its environment?

The Declaration of the Rights of the Moon was inspired by the growing Rights for Nature movement and uses some of its language. In the past 5 years, some natural entities—like New Zealand’s Whanganui River and Urewera forest, India’s Ganges River, and Colombia’s Atrato River—have been granted legal rights as part of efforts to protect and restore them. (Similarly, some astronomers have been investigating legal action to stop constellations of satellites, like SpaceX’s Starlink, from ruining their observations and altering the night sky.)

New Zealand’s Whanganui River is one of a growing number of natural entities that have been granted legal rights. Credit: James Shook/Wikimedia, CC BY 2.5

Pace was skeptical of the concept and said the Declaration of the Rights of the Moon has no legal standing.

“The idea that the Moon as an inanimate object possesses fundamental rights as a result of its existence in the universe doesn’t make any sense. Rights are something which attach to human persons. We can have an argument about animal rights, but this is saying that there should be something called rock rights—that a lunar rock has a right. It’s an interesting metaphor, but it doesn’t have any legal foundation, and it’s politically meaningless.”

New Zealand’s Whanganui River might now have legal rights, Pace explained, but that’s because those rights were granted by the sovereign government of New Zealand. Countries agreed in the Outer Space Treaty that the Moon was beyond any nation’s sovereignty. That means there is no sovereign power that could legally grant the Moon rights, Pace reasoned—and efforts to have the Moon declared a national park or a World Heritage Site have failed for the same reason.

Erin O’Donnell, an expert on water law and the Rights for Nature movement at the University of Melbourne, foresees a different problem. Her research has shown that granting rights to rivers has frequently had unintended consequences for environmental protection.

Depending on the exact legal instrument used, some rivers now have the right to sue, enter into contracts, or own property. “But,” she said, “none of them have rights to water.”

“This is the real tension at the heart of the rights of nature advocacy movement: If something’s not legally enforceable, then it may not necessarily lead to a lot of change, because you can’t rely on it then in situations of conflict.”

Emphasizing legal rights can set up an adversarial atmosphere that can actually make conflict more likely, she said, and even weaken community support for protecting an environment, because people assume that if something has rights, it can look after itself. “If you emphasize the legal rights to the exclusion of all else, you can end up fracturing the relationship between people and nature, and that can be very hard to recover from.”

Where rights of nature movements have had success, she said, is in “reframing and resetting the human relationship with nature,” often by elevating Indigenous worldviews.

Our Beloved Moon

For Pace, the declaration is premature. Norms of behavior will evolve over time, he said, once we actually get to the Moon and figure out what we can possibly achieve there.

“What you don’t do is have a group of lawyers, no matter how smart, sit down in a room and try to draft up rules for things that are totally hypothetical. Environmental ethics considerations are rather speculative and not really necessary right now.”

If people really want to have an influence on space policy, Pace said, they should lobby their governments to get involved in the new space race. “Make sure you’re at the table. It sounds blunt, but the rules are made by the people who show up. Find a way to get in the game, and then you have a say.”

But Oduntan, O’Donnell, and Gorman disagreed. “By the time there’s a problem, it’s massively too late,” said O’Donnell. “We see that in the case of the rivers every day. All of the rivers around the world that have received legal rights are beloved, but heavily impacted.” The Moon is beloved, too, she said, but is as yet undamaged. “It would be nice if in this case we could act preventatively.”

The Declaration of the Rights of the Moon may not result in any legal outcomes, O’Donnell said, but it’s “a really important conversation starter.”

Most of us will never walk on the Moon’s surface, but all human cultures tell stories about it. It lights our nights, is a presence in our myths and legends, powers the tides, triggers animal (and, in limited ways, human) behavior, and marks the passing of time.

“The more of us who talk about these kinds of things,” said O’Donnell, “the more we’re likely to normalize seeing the Moon as something other than a piece of territory to be fought over by nation states and corporate investors.”

Supporters of the declaration want to democratize that conversation and give everyone a chance to take part.

“Every single person on Earth has a right to have a say in what happens to the Moon,” said Gorman. “It’s important for the environments in which we live, and for our cultural and scientific worldviews. It really does not belong to anyone.”

Author Information

Kate Evans (@kate_g_evans), Science Writer

A New Method Produces Improved Surface Strain Rate Maps

Mon, 07/19/2021 - 13:04

Earthquakes occur when tectonic strain that has gradually accumulated along a fault is suddenly released. Measurements of how much Earth’s surface deforms over time, or the strain rate, can be used in seismic hazard models to predict where earthquakes might occur. One way that scientists estimate strain rate is via orbiting satellites and detailed measurements of how much GPS stations on Earth’s surface move.

There are challenges, however, to using such geodetic data. The stations provide measurements only at specific locations and aren’t evenly distributed—constructing a continuous strain rate map requires that scientists make estimates to fill in data gaps. These interpolated data add uncertainty to resulting mathematical models.

To tackle these issues, Pagani et al. developed a transdimensional Bayesian method to estimate surface strain rates in the southwestern United States, with a focus on the San Andreas Fault. Their method essentially divided the study area into nonoverlapping triangles and calculated velocities within each triangle by incorporating measurements from the GPS stations located inside.

The team didn’t rely on just one such model. They used a reversible-jump Markov chain Monte Carlo algorithm to produce up to hundreds of thousands of such models, each with slightly tweaked coordinates for the 2D triangles. Across these models, even the number of triangles could change—because the method is transdimensional, the authors didn’t fix the number of model parameters in advance. Finally, they stacked all of these models together to generate a final continuous strain rate map.
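The stacking step is easy to picture in code. What follows is a minimal Python sketch of the ensemble-stacking idea only, not the authors' reversible-jump algorithm, using synthetic station data: many triangulated interpolations built from different station subsets are averaged into a mean velocity map with a per-pixel spread.

import numpy as np
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(42)

# Synthetic "GPS stations": positions (km) and east velocities (mm/yr)
stations = rng.uniform(0.0, 100.0, size=(80, 2))
velocities = 0.3 * stations[:, 0] + rng.normal(0.0, 0.5, size=80)

# Regular grid on which each triangulated model is evaluated
gx, gy = np.meshgrid(np.linspace(0, 100, 101), np.linspace(0, 100, 101))

ensemble = []
for _ in range(200):  # the real algorithm draws far more models via MCMC
    # Random station subsets yield different triangulations, and even a
    # different number of triangles, loosely mimicking transdimensionality.
    idx = rng.choice(len(stations), size=int(rng.integers(30, 80)), replace=False)
    interp = LinearNDInterpolator(stations[idx], velocities[idx])
    ensemble.append(interp(gx, gy))  # NaN outside each model's convex hull

stack = np.array(ensemble)
mean_map = np.nanmean(stack, axis=0)          # stacked continuous velocity field
spread = np.nanstd(stack, axis=0)             # per-pixel uncertainty estimate
strain_proxy = np.gradient(mean_map, axis=1)  # strain rate follows from spatial gradients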

Using test data, the authors found that their approach handled data errors and uneven data distribution better than a standard B-spline interpolation scheme. In addition, because the approach included information from many models, it produced a range of strain rate estimates at each point, along with probabilities for those values.

When the team used the new approach to calculate strain rates around the San Andreas Fault system, they found that their map agreed with past studies. It even successfully distinguished creeping sections of the fault system from locked segments. The newly described technique could be used by researchers to develop other strain rate maps and may have general application to other interpolation problems in the geosciences. (Journal of Geophysical Research: Solid Earth, https://doi.org/10.1029/2021JB021905, 2021)

—Jack Lee, Science Writer

Tiny Kinks Record Ancient Quakes

Mon, 07/19/2021 - 13:04

Every so often, somewhere beneath our feet, rocks rupture, and an earthquake begins. With big enough ruptures, we might feel an earthquake as seismic waves radiate to or along the surface. However, a mere 15% to 20% of the energy needed to break rocks in the first place translates into seismicity, scientists suspect.

The remaining energy can dissipate as frictional heat, leaving behind melted planes of glassy rock called pseudotachylyte. The leftover energy may also fracture, pulverize, or deform rocks that surround the rupture as it rushes through the crust, said Erik Anderson, a doctoral student at the University of Maine. Because these processes occur kilometers below Earth’s surface, scientists cannot directly observe them when modern earthquakes strike. Shear zones millions of years old that now reside at the surface can provide windows into the rocks around ancient ruptures. However, although seismogenically altered rocks remain at depth, heat and pressure can erase clues of past quakes, said Anderson. “We need some other proxy,” he said, “when we’re looking for evidence of earthquakes in the rock record.”

Micas—sheetlike minerals that can stack together in individual crystals that often provide the sparkle in kitchen countertops—can preserve deformation features that look like microscopic chevrons. On geology’s macroscale, chevrons form in layered strata. In minuscule sheaves of mica, petrologists observe similar pointy folds because the structure of the mica leaves it prone to kinking, rather than buckling or folding, said Frans Aben, a rock physicist at University College London.

In a new article in Earth and Planetary Science Letters, Anderson and his colleagues argue that these microstructures—called kink bands—often mark bygone earthquake ruptures and might outlast other indicators of seismicity.

Ancient Kink Bands, Explosive Explanation

To observe kinked micas, scientists must carefully cut rocks into slivers thinner than the typical width of a human hair and affix each rock slice to a piece of glass. By using high-powered microscopes to examine this rock and glass combination (aptly called a thin section), Anderson and his colleagues compared kink bands from two locations in Maine, both more than 300 million years old. The first location is rife with telltale signs of a dynamically deformed former seismogenic zone, like shattered garnets and pseudotachylyte. The second location exposes rocks that changed slowly, under relatively static conditions.

Comparing the geometry of the kink bands from these sites, the researchers observed differences in the thicknesses and symmetries of the microstructures. In particular, samples from the dynamically deformed location display thin-sided, asymmetric kinks. The more statically deformed samples showcase equally proportioned points with thicker limbs.

Kink bands, said Aben, can be added to a growing list of indicators of seismic activity in otherwise cryptic shear zones. The data, he said, “speak for themselves.” Aben was not involved in this study.

To further cement the link between earthquakes and kink band geometry, Anderson and colleagues analyzed 1960s-era studies largely driven by the development of nuclear weapons. During that time, scientists strove to understand how shock waves emanated from sites of sudden, rapid, massive perturbations like those produced at nuclear test sites or meteor impact craters. Micas developed kink bands at such sites, as well as in complementary laboratory experiments, said Anderson, and these kinks mimic the geometric patterns produced by dynamic strain rate events—like earthquakes. “[Kink band] geometry,” Anderson said, “is directly linked to the mode of deformation.”

Stressing Rocks, Kinking Micas

In addition to exploring whether kinked mica geometry could fingerprint relics of earthquake ruptures, Anderson and his colleagues estimated the magnitude of localized, transient stress their samples experienced as an earthquake’s rupture front propagated through the rocks, he said. In other words, he asked, might the geometry of kinked micas scale with the magnitude of momentary stress that kinked the micas in the first place?

By extrapolating data from previously published laboratory experiments, Anderson estimated that pulverizing rocks at the deepest depths at which earthquakes can nucleate requires up to 2 gigapascals of stress. Although stress doesn’t directly correspond to pressure, 2 gigapascals are equivalent to more than 7,200 times the pressure inside a car tire inflated to 40 pounds per square inch. For reference, the unimaginably crushing pressure in the deepest part of the ocean—the Mariana Trench—is only about 400 times the pressure in that same tire.
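The comparison is easy to verify with rounded values (assumed here for illustration): a tire at 40 pounds per square inch holds about 2.76 x 10^5 pascals, and the bottom of the Mariana Trench sits at roughly 1.1 x 10^8 pascals, so

\frac{2 \times 10^{9}\,\mathrm{Pa}}{2.76 \times 10^{5}\,\mathrm{Pa}} \approx 7{,}250
\qquad \text{and} \qquad
\frac{1.1 \times 10^{8}\,\mathrm{Pa}}{2.76 \times 10^{5}\,\mathrm{Pa}} \approx 400.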

By the same conversion, kinking micas requires stresses 8–30 times the water pressure in the deepest ocean. Because Anderson found pulverized garnets proximal to kinked micas at the fault-filled field site, he and his colleagues inferred that the stresses momentarily experienced by these rocks as an earthquake’s rupture tore through the shear zone were about 1 gigapascal, or 9 times the pressure at the Mariana Trench.

Aben described this transient stress estimate for earthquakes as speculative, but he said the new study’s focus on earthquake-induced deformation fills a gap in research between very slow rock deformation that builds mountains and extremely rapid deformation that occurs during nuclear weapons testing and meteor impacts. And with micas, he said, “once they’re kinked, they will remain kinked,” preserving records of ancient earthquakes in the hearts of mountains.

—Alka Tripathy-Lang (@DrAlkaTrip), Science Writer

Half of U.S. Tidal Marsh Areas Vulnerable to Rising Seas

Fri, 07/16/2021 - 12:27

Sea level is rising worldwide, thanks in large part to climate change. Rising seas threaten coastal communities and ecosystems, including marshes that lie at the interface between salt water and freshwater. Tidal marsh ecosystems feature distinct plants and play key ecological roles, such as serving as nurseries for fish. It is known that some tidal marshes can avoid destruction by migrating inland or through formation of new soil that raises their elevation, but a better understanding of how they are affected by rising seas could inform efforts to plan for and mitigate the effects.

New research by Holmquist et al. investigates the vulnerability of tidal marshes to sea level rise across the contiguous United States. The findings show, for the first time, that opportunities for resilience differ between more northerly and more southerly marshes across the country.

To help clarify the fate of tidal marshes in the contiguous United States, the researchers combined tide gauge data on sea level rise rates with soil formation rates reported in previous studies. They also incorporated information from local maps of water level, elevation, and land cover. Using these data, they calculated the potential for the marshes to adapt to rising seas by 2100 under several climate change scenarios.

The analysis revealed that different tidal marshes have different pathways for resilience to sea level rise. Specifically, more northerly marshes are more likely to lack opportunities to migrate inland, whereas more southerly marshes are more likely to lack the capacity to form and build up enough soil to sufficiently keep pace with sea level rise.

The researchers also found that depending on the degree of climate change, 43%–48% of tidal marsh area in the contiguous United States is vulnerable to destruction by sea level rise. These vulnerable areas tend to occur along the Gulf of Mexico and mid-Atlantic coasts, at sites where opportunities for both inland migration and vertical soil buildup are limited.
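The vulnerability logic reduces to a simple decision: a marsh counts as vulnerable only when both resilience pathways fail. Here is a toy Python sketch of that classification (the function and parameter names are invented for illustration; the actual study worked from mapped water levels, elevation, and land cover):

def marsh_vulnerable(slr_rate_mm_yr, accretion_rate_mm_yr, can_migrate_inland):
    # Pathway 1: vertical soil buildup keeps pace with sea level rise
    keeps_pace = accretion_rate_mm_yr >= slr_rate_mm_yr
    # Pathway 2: adjacent low-lying land is available for inland migration.
    # A marsh is vulnerable only if both pathways are unavailable.
    return not (keeps_pace or can_migrate_inland)

# Example: a Gulf Coast-like case with fast sea level rise, slow soil
# buildup, and no room to migrate inland
print(marsh_vulnerable(9.0, 4.0, False))  # True, i.e., vulnerable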

This study highlights the importance of considering local conditions when gauging the vulnerability of tidal marshes to rising seas. The findings could aid future research and planning efforts. (Earth’s Future, https://doi.org/10.1029/2020EF001804, 2021)

—Sarah Stanley, Science Writer

Astronomers for Planet Earth

Fri, 07/16/2021 - 12:25

There is no escaping the reality of the climate crisis: There is no Planet B. A group of astronomers, united under the name Astronomers for Planet Earth (A4E), stands ready to use its unique astronomical perspective to reinforce that important message.

A good number of exoplanets may potentially be habitable, but humans cannot simply cross the vast distances required to get there. And other planets of the solar system, although accessible, are all inhospitable. “Like it or not, for the moment the Earth is where we make our stand,” Carl Sagan famously wrote in his 1994 book Pale Blue Dot. The book’s title is based on the eponymous image showing Earth as small, fragile, and isolated. Sagan’s reflection on the image shows that astronomers can have a powerful voice in the climate debate.

Astronomy, a Field with a Reach

The Voyager 1 spacecraft took a picture of Earth from a distance of more than 6.4 billion kilometers. Credit: NASA/JPL-Caltech

The beginnings of A4E go back to 2019 when two groups of astronomers, one from the United States and the other from Europe, decided to join forces. Today the network numbers over a thousand astronomers, students, and astronomy educators from 78 countries. “We’re still trying to get ourselves together,” said Adrienne Cool, a professor at San Francisco State University. “It’s a volunteer organization that’s grown rapidly.”

Astronomy, its practitioners note, has a surprisingly wide earthly reach. “We teach astronomy courses that are taken by, just in the U.S., a quarter million students every year,” said Cool. “That’s a lot of students that we reach.”

And their influence goes way beyond students. About 150 million people visit planetariums around the world each year. Astronomers also organize countless stargazing nights and public lectures. Perhaps more than any other discipline, some researchers think, astronomy has the opportunity to address masses of people of all ages and occupations.

Toward Sustainable Science

There is no guide for how best to incorporate climate science into an astronomy lecture. A4E works as a hub of knowledge and experience where astronomers can exchange teaching and outreach material. Members also learn about climate science and sustainability from regularly organized webinars.

However, although astronomers are spreading their message, they also acknowledge the need to address the elephant in the room: Astronomy can leave a significant carbon footprint. “I don’t feel comfortable telling the public, ‘Look, we really need to make a change,’ and the next moment I’m jumping on a plane for Chile [to use the telescopes],” said Leonard Burtscher, a staff astronomer at Leiden University in the Netherlands. “That’s a recipe for disaster in terms of communication.”

On average, an astronomer’s work-related greenhouse gas emissions are about twice those of an average citizen in a developed country. These per-person emissions are many times above the level consistent with the Paris Agreement goal of limiting the global increase in average temperature to less than 1.5°C relative to preindustrial levels.

At a recent virtual conference of the European Astronomical Society, hosted by Leiden University, A4E organized a session in which astronomers and climate crisis experts discussed the measures that would help reduce the carbon footprint of astronomy. Observatories and institutes are moving toward a greater reliance on renewable energy, and plans for future facilities take carbon assessment into account.

Perhaps the most contentious topic of discussion in academia is air travel. One solution is to hold fewer in-person conferences, as studies have shown that moving conferences to a virtual setting dramatically reduces the carbon footprint. “Good things [come] out of virtual meetings,” said Burtscher. “Better inclusivity, lower costs, often a higher legacy value, recordings of talks and discussions.” On the other hand, proponents of face-to-face meetings argue that a virtual setting impedes fruitful collaborations and networking that are especially important for young scientists. In the end, the community will likely have to make a compromise.

The impetus for change is strong. More than 2,700 astronomers signed an open letter released on Earth Day 2021 in which they recognized the urgency of the climate crisis and called for all astronomical institutions to adopt sustainability as a primary goal. But this is just the beginning, and the time for action is ticking away. “So let’s get real, and let’s figure out how to make sustainability the key part of what our institutions do in addition to astronomy,” said Cool.

—Jure Japelj (@JureJapelj), Science Writer

Call for Papers on Machine Learning and Earth System Modeling

Thu, 07/15/2021 - 11:56

Machine learning techniques are becoming important tools to solve questions in the Earth system sciences. In recent years, the volume of Earth system data has grown dramatically. These data come from a variety of sources, such as remote sensors, in situ observations, citizen science, and increasingly high-quality computer simulations of Earth. Scientists use these data to make better predictions (such as weather forecasts or epidemic projections), test new hypotheses, improve existing models, and develop new theories. However, scientists are currently unable to benefit fully from these large volumes of data because our ability to collect and generate data surpasses our ability to process, understand, and use them.

This sharp increase in data volumes has been accompanied by recent advances in machine learning algorithms from within the computer science community. “Machine learning” is a generic term for a variety of emerging data science algorithms that use data to learn to perform tasks without being explicitly programmed to. Such machine learning models can perform a variety of tasks with near-human-level skill, including image recognition, classification, prediction, and pattern recognition. These algorithms have already revolutionized several domains, such as computer vision (He et al., 2015), natural language processing (Devlin et al., 2018), and video games (Justesen et al., 2019).

The success of machine learning has inspired atmospheric scientists, oceanographers, and climate scientists, who are now using machine learning algorithms to better predict, process, analyze, and learn from large volumes of Earth system data.

For example, machine learning has become a common approach in pattern identification and prediction of weather and climate phenomena such as synoptic fronts (Lagerquist et al., 2019) and El Niño (Ham et al., 2019). It is possible that, in the near future, a machine learning algorithm could provide a more accurate weather forecast than traditional numerical weather models. The idea that a purely data-driven algorithm could outperform state-of-the-art numerical weather models, which are based on our physical knowledge, is remarkable and could change our approach to the simulation of weather and climate (Balaji, 2021). Community benchmark datasets that formalize intercomparisons of data-driven numerical weather prediction are emerging to this end (Rasp et al., 2020).

For longer-term climate projection, another modeling approach that is gaining popularity is combining machine learning with traditional climate models to create hybrid climate models. This approach relies on (a) mathematical equations derived from physical laws for the simulation of processes that are accurately captured in climate models and (b) machine learning for the simulation of processes, such as clouds, that are not well captured by current climate models but for which high-fidelity training datasets can be generated (Gentine et al., 2018; Brenowitz and Bretherton, 2018; Rasp et al., 2018; Yuval and O’Gorman, 2020). This hybrid approach has been attracting scientists because it might answer some pressing questions about the climate sensitivity of Earth. However, there are still nontrivial issues with this approach. For example, using out-of-the-box machine learning algorithms as part of a climate model can lead to physically inconsistent results, such as the violation of conservation laws (Brenowitz and Bretherton, 2019).
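As a rough illustration of what “hybrid” means here, the Python sketch below uses entirely made-up stand-in functions (not any published model): a physics-based tendency advances the resolved state while an ML emulator supplies the subgrid tendency.

import numpy as np

def dynamics_tendency(state):
    # Stand-in for the physics-based part (advection, radiation, ...)
    return -0.1 * state

def ml_subgrid_tendency(state, weights):
    # Stand-in for a trained neural network emulating, e.g., cloud processes
    return np.tanh(state @ weights)

def hybrid_step(state, weights, dt=0.01):
    # One explicit (Euler) time step combining both tendencies
    return state + dt * (dynamics_tendency(state) + ml_subgrid_tendency(state, weights))

state = np.ones(10)
weights = 0.1 * np.random.default_rng(0).normal(size=(10, 10))
for _ in range(100):
    state = hybrid_step(state, weights)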

Failure of machine learning algorithms to produce physically consistent results is one reason why there is a growing recognition that machine learning approaches should be used in nuanced ways that incorporate ideas from existing scientific knowledge.

Such approaches are commonly referred to as knowledge-guided machine learning. The idea is to find ways to integrate scientific knowledge into machine learning algorithms, for example, by designing algorithms that can enforce physical constraints (Beucler et al., 2021) or by tailoring training data to strategically emulate subprocesses in ways that also enable constraints to be satisfied (Yuval et al., 2021).
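One simple version of enforcing a physical constraint, offered here as a generic sketch in the spirit of these papers rather than a reproduction of any of them, is to project a network's raw output onto the set of outputs that exactly satisfy a linear conservation law:

import numpy as np

def enforce_linear_constraint(y_raw, c, b):
    # Smallest (least-squares) correction to y_raw so that c . y = b holds
    residual = c @ y_raw - b
    return y_raw - residual * c / (c @ c)

y_raw = np.array([1.2, -0.4, 0.7])  # hypothetical ML-predicted tendencies
c = np.ones(3)                      # e.g., a column-integrated quantity to conserve
y = enforce_linear_constraint(y_raw, c, b=0.0)
assert abs(c @ y) < 1e-12           # conservation now holds to machine precision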

Much remains to be explored in this new subfield, including what combination of algorithmic versus knowledge-guided approaches will lead to reliably robust operational process emulators. Because machine learning is empirical, and because decisions about how to optimize neural networks can have a major effect on their performance in hybrid climate models (Ott et al., 2020), a healthy technical debate is emerging in the literature on machine learning–assisted climate simulation.

One promising thread is the advent of explainable artificial intelligence. Most machine learning applications lead to improvements in our predictive abilities but provide little information about how these algorithms arrive at accurate predictions, and many scientists perceive machine learning algorithms as uninterpretable “black boxes.” In recent years, however, atmospheric scientists, oceanographers, and climate scientists have been adapting methods that help interpret machine learning algorithms. For example, these interpretability methods can help us understand when and why algorithms can provide reliable subseasonal forecasts (Mayer and Barnes, 2021) and can assist in discovering unknown equations for ocean turbulence (Zanna and Bolton, 2020). These ideas give us hope that machine learning algorithms, together with the abundance of Earth system data, could lead to scientific breakthroughs.
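Many interpretability methods boil down to asking how sensitive a prediction is to each input. A generic finite-difference version of that question (illustrative only; the cited studies use more sophisticated techniques) looks like this:

import numpy as np

def model(x, w):
    # Stand-in for a trained predictor (e.g., a forecast quantity)
    return float(np.tanh(x @ w).sum())

def saliency(x, w, eps=1e-6):
    # Finite-difference gradient of the prediction with respect to each input
    grads = np.zeros_like(x)
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        grads[i] = (model(x + dx, w) - model(x - dx, w)) / (2 * eps)
    return grads

x = np.random.default_rng(2).normal(size=8)    # hypothetical input features
w = np.random.default_rng(3).normal(size=(8, 4))
ranked = np.argsort(-np.abs(saliency(x, w)))   # features ranked by influence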

To accelerate this important application and communication within the fields of atmosphere, ocean, and land, a new special collection in Journal of Advances in Modeling Earth Systems (JAMES), entitled “Machine Learning Application to Earth System Modeling,” aims to bring together new research that uses machine learning to advance Earth system modeling.

The collection is open to manuscripts covering the use of new machine learning methodologies developed for advancing Earth system science (for example, interpretability of machine learning algorithms, physics-guided algorithms, causal inference, and hybrid modeling) and applications of machine learning to Earth system modeling (for example, predictability of weather and climate, machine learning parameterizations, and uncertainty quantification). Manuscripts should be submitted via the GEMS website for JAMES.

—Janni Yuval (janniy@mit.edu; 0000-0001-7519-0118), Massachusetts Institute of Technology, USA; Mike Pritchard (0000-0002-0340-6327), University of California, Irvine, USA; Pierre Gentine (0000-0002-0845-8345), Columbia University, USA; Laure Zanna (0000-0002-8472-4828), New York University, USA; and Jiwen Fan (0000-0001-5280-4391), Pacific Northwest National Laboratory, USA

Have You Seen Ball Lightning? Scientists Want to Know About It

Thu, 07/15/2021 - 11:48

Graduate student Christopher Sterpka remembers the first time he saw ball lightning, as a 9- or 10-year-old staying at his grandparents’ house in West Hartford, Conn. He was home alone and watching a thunderstorm from a window one summer night.

“I remember this blue, kind of fuzzy ball just sort of descended diagonally out of the clouds,” said Sterpka, who conducts research in lightning physics at the University of New Hampshire’s Space Science Center. He watched the ball of light float down to the ground in the distance and disappear out of sight in a matter of 5–10 seconds. “I was terrified.”

Sterpka told his grandfather, a science teacher, when he returned home. His grandfather had no idea what it was but asked around. That’s when they heard what the strange sighting may have been: ball lightning. “It’s actually one of the incidents that probably got me interested in lightning in the first place.” Sterpka saw ball lightning again in his twenties while driving near a thunderstorm in Massachusetts.

Ball lightning has been reported for centuries but hasn’t been reliably observed by scientific instruments. A new website hosted by New Mexico Tech physicist Richard Sonnenfeld and Texas State University engineer Karl Stephan is collecting eyewitness accounts to improve the basic understanding of the phenomenon. They’ll compare the accounts with weather radar systems to characterize the factors that could lead to ball lightning.

“This is one thing that hasn’t been done,” said Martin Uman, a lightning scientist at the University of Florida who is not involved in the research. When it comes to ball lightning, there are lots of observations, but “nobody has correlated any of the observations with any other measurements.”

If ball lightning turns out to be explainable by science, the findings could revolutionize our understanding of physics. As it stands, nothing can explain a glowing ball with no fuel source that can last up to a minute, said Sonnenfeld. “That’s fascinating physics,” he said. “Revolutionary physics even, if you can believe it.”

A Scientific Riddle

In 1886, Dr. G. Hartwig illustrated ball lightning arriving through a chimney. Credit: Dr. G. Hartwig/NOAA, CC BY 2.0

Eyewitness accounts describe hovering balls of light typically about 20 centimeters in diameter, roughly the size of a bowling ball. The balls appear white, yellow, orange, blue, or (rarely) green and can last from mere seconds to up to a minute before fading, flashing, or exploding into nothing.

People have seen ball lightning outside their windows from a distance, mere meters away in their kitchens, roving down the aisles of airplanes, and coming down their chimneys. Other reports differ widely, like a luminous ring the size of a truck that lasted 10 minutes, as described by an Austrian woman. The balls of light typically appear during thunderstorms and may have led, in very rare cases, to burn injuries or deaths.

“In my opinion, there are probably multiple causes for what’s described as ball lightning,” Uman said.

Despite hundreds of eyewitness accounts from across the world and going back centuries, scientists can’t explain ball lightning or re-create it in the lab. Theories abound: Ball lightning is the result of a failed lightning bolt, of hot ionized silica, or of chains of charged particles. One paper even proposed that ball lightning came from magnetic stimulation in the brain. “I think that a lot of the theories have a piece of the truth,” Stephan said, “but none of them have the whole truth.”

“There are literally dozens of ball lightning theories because it’s an unconstrained situation,” Stephan said. “Since there [are] virtually no data, anybody can come up with a theory, and you can’t prove them wrong.”

A self-described “wet blanket” in the niche world of ball lightning scholarship, Stephan wants to “clear the underbrush” of scientific theories and see what’s left.

The public reporting website launched last year might help with that. Eyewitnesses fill out a Google form with the location, date, time, and description of the ball lightning sighting and send photos or videos to an email address.

Stephan and Sonnenfeld have received written reports of sightings from decades ago, like Sterpka’s account. Those are helpful, Sonnenfeld said, but he hopes for more recent sightings with more exact information.

Thunderstorms are tracked across the globe using national and global lightning networks that record every flash (with some error). With precise location and time information of a ball lightning sighting, the team could check how close a thunderstorm was to the event. The team could also investigate the charge of the nearby cloud-to-ground lightning (positive or negative) and the current of the lightning to learn more about the conditions. This kind of technology was not around 20 years ago, said Sonnenfeld.
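A cross-check like that can be as simple as filtering archived flash records by time and distance. The Python sketch below is hypothetical; the record format, thresholds, and values are invented rather than drawn from any real lightning network.

from datetime import datetime
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth (radius ~6,371 km)
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def nearby_flashes(sighting_time, lat, lon, flashes, window_min=10, radius_km=30):
    # Keep flashes within a time window and ground distance of the sighting
    return [f for f in flashes
            if abs((f["time"] - sighting_time).total_seconds()) <= window_min * 60
            and haversine_km(lat, lon, f["lat"], f["lon"]) <= radius_km]

flashes = [{"time": datetime(2021, 7, 1, 21, 4), "lat": 41.76, "lon": -72.74, "peak_kA": -23.0}]
hits = nearby_flashes(datetime(2021, 7, 1, 21, 0), 41.75, -72.75, flashes)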

Identifying Sprites

It turns out there is a precedent for eyewitness accounts of strange lightning occurrences that turn out to be real.

For decades, pilots spoke of strange pink flashes of light above thunderclouds, but scientists had no explanation for the phenomenon. It wasn’t until a physicist accidentally captured a similar event on a low-light television camera while testing equipment for a rocket launch that the phenomenon came to light, wrote Matthew Cappucci for the Washington Post. This special type of lightning is called a sprite, and it occurs 50–90 kilometers above the ground, in the mesosphere. (For comparison, airplanes fly about 9–11 kilometers up.) Sprites flash reddish light over massive patches of sky for less than a tenth of a second. They’re said to happen after a positive bolt of lightning hits the ground from a thunderstorm below.

The perfect sighting of ball lightning would include multiple witnesses with accounts that agree, videos from more than one angle, the object casting a shadow, and enough detail to compare with radar. “No ball lightning sighting has had all that good stuff happen yet,” Stephan said. “But maybe it will someday.”

—Jenessa Duncombe (@jrdscience), Staff Writer

Where Do the Metals Go?

Thu, 07/15/2021 - 11:48

Hawaii’s Kīlauea volcano is very large, very active, and very disruptive. Its recent activity belched tons of sulfur dioxide into the air every day. But aside from gases, eruptions from basaltic volcanoes like Kīlauea release metals and metalloids, including ones considered pollutants, like copper, zinc, arsenic, and lead. These metal pollutants have been found in the ground, water, rain, snow, and plants near vents posteruption, as well as in the air downwind.

But how these volcanic metals are transported from active eruptions, how long they persist in the environment, and how much of them settles where were open questions until recently. “We know that volcanoes are a huge natural source of these metals, which are environmentally very important,” said Evgenia Ilyinskaya, an associate professor at the University of Leeds in the United Kingdom. “But there’s just not very much known about what happens to them after emission—how long do they stay in the atmosphere, and where do they go?”

Sampling the Wind

To better understand how concentrations of metals change as the plume travels downwind during an active ongoing eruption, Ilyinskaya and fellow researcher and University of Cambridge doctoral student Emily Mason set up sampling stations around the Big Island of Hawaii. Intermittently over the course of almost a year, samples were collected as close as possible to Kīlauea’s eruptive vent and at another six sites around the island. The farthest site was more than 200 kilometers distant, and all were in the path of the trade wind. Samples were also collected 300 meters above the plume using a drone.

“Kīlauea is a wonderful natural laboratory for studying volcanism and particularly that type of basaltic volcanism,” said Mason. “It’s a well-understood system, and that makes it a very appealing target.”

The research, published in Communications Earth & Environment, is the biggest study of volcanic metal emissions yet conducted.

Ilyinskaya, Mason, and their colleagues found an enormous difference between pollutant levels during and after the eruption—up to 3 times higher than periods without volcanic activity. They discovered that different pollutants fall out at different rates: Some pollutants, like cadmium, remain in the plume for only a few hours, whereas others, like cerium, remain for much longer. “It was quite striking to see that there is such a large difference,” said Ilyinskaya. “That’s something we didn’t really expect.”

The researchers think that metal deposition may be very sensitive to atmospheric conditions like winds, rain, and humidity. Different environments could mean different patterns of volcanic metal dispersal and pollution. For example, drier and colder environments, like Iceland, may have different patterns than hot and humid environments like Hawaii.

Laze Plumes and Copper

Mason also studied laze plumes, created when the heat of the lava very quickly evaporates seawater, by taking samples where the lava entered the ocean. The phenomenon is relatively rare, as there aren’t that many basaltic volcanoes near sea level where lava can reach the ocean. But laze plumes are worth studying, said Mason, because there have historically been much larger basaltic eruptions that created large igneous provinces, like the Deccan Traps. These eruptions may have released tons of metals and metalloids into the surrounding environment. “It’s possible that laze plumes are a slightly underestimated force in those events,” said Mason.

The amount of copper being released by laze plumes is surprising, said Mason. Seawater is rich in chlorine, and she thinks it enables more copper to degas. Laze plumes could even release more copper into the environment than large magmatic plumes, she said. This copper would also be released directly into the ocean and could affect marine environments, either by worsening ocean acidification or adding nutrients. “The fact that copper emissions could be comparable between the laze plume and the magmatic plume is definitely surprising to me,” said Mason.

Volcanologist and geochemist Tobias Fischer of the University of New Mexico, who was not involved in either study, said this research is “a really nice approach and really advances our understanding of not only the quantity of metal emissions but also their life cycle in a volcanic plume like this one.”

Health Risks

At some point during the 3 hours the plume took to reach the closest sampling station, its metals were radically depleted. The researchers hypothesize that the heavy-metal pollutants may have formed a highly water-soluble chemical species that fell out in rain close to the eruption site. Ilyinskaya is collecting samples from Iceland’s Fagradalsfjall volcano to learn more about what happens in those first 3 hours of a plume’s lifetime.

“If this process is really happening, then it could be disproportionately impacting people living close to the volcano,” she said. “On the other hand, it may be lessening the impact on the communities further away.”

A researcher wearing protective gear walks toward a laze plume created by Kīlauea’s lava entering the ocean. Credit: Evgenia Ilyinskaya/USGS

Volcanic pollutants have been linked to health problems like thyroid cancer, multiple sclerosis, and respiratory diseases. One goal of studies like Ilyinskaya’s and Mason’s is to create a pollution map that models where the plume will go, the concentration of metals, and atmospheric conditions to help communities avoid exposure. Fischer said a pollution map would be a wonderful contribution. “Then you can probably make some pretty good predictions of where you get high concentrations of metal deposition and what kind of metals,” he said.

More research needs to be done on how metals are stratified within a plume and also their long-term accumulations in water and plants, said Mason. “Volcanic metals are an insidious threat in terms of the way that they build up in the environment,” said Mason.

—Danielle Beurteaux (@daniellebeurt), Science Writer

 

Correction, 19 July 2021: This article was updated to correct a quote from Evgenia Ilyinskaya: “But there’s just not very much known about what happens to them after emission.”

A Global Look at Surface Soil Organic Carbon

Wed, 07/14/2021 - 12:26

This is an authorized translation of an Eos article.

Healthy soil is fundamental to life on Earth. Beyond its importance in agriculture, soil is the foundation of nearly all of Earth’s terrestrial ecosystems. Soil organic carbon (SOC) is frequently used as an indicator of soil health, plays an important role in the terrestrial carbon cycle, and has enormous implications for climate change adaptation. Understanding these dynamics at a planetary scale will be vital as humanity attempts to feed a growing population under the mounting stress of a warming planet.

In a new study, Endsley et al. used remote sensing to study surface SOC dynamics globally, drawing on data from NASA’s Soil Moisture Active Passive (SMAP) satellite, which combines radiometric measurements of low-frequency microwave emissions at Earth’s surface with modeling to estimate soil moisture and freeze-thaw state. In particular, data from SMAP’s microwave radiometers can be combined with a physical model of plant carbon uptake and soil decomposition to estimate the global terrestrial carbon budget in the SMAP Level 4 Carbon (L4C) product. The team used SMAP L4C in combination with other satellite data, such as vegetation observations from the Moderate Resolution Imaging Spectroradiometer instruments, to create a model that specifically characterizes SOC.

The result is a global estimate of SOC down to a depth of 5 centimeters at 9-kilometer horizontal resolution. The scientists compared their estimates with previous measurements and SOC soil inventory records and found that their model was generally consistent with them. The researchers say the new model will allow them to monitor seasonal and annual changes in SOC and will also offer insight into how ecosystems and the planet at large respond to floods, droughts, and other short-lived events. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1029/2020JG006100, 2020)

—David Shultz, Science Writer

This translation by Monica Alejandra Gomez Correa (@Mokasaurus) of @GeoLatinas and @Anthnyy was made possible by a partnership with Planeteando.

Heating Up the Hot Spots

Wed, 07/14/2021 - 12:25

The U.S. Navy opened its first national training institution in Annapolis, Md., in 1845, with 50 midshipmen and seven professors. In the 176 years since, the U.S. Naval Academy has trained thousands of officers. By the end of this century, though, the academy might have to abandon ship. Climate models suggest rising sea levels and subsiding land could flood the site, forcing the Navy to find a drier spot for educating its future leaders.

Climate change could introduce more serious security challenges to the American military in the coming decades, experts say. The U.S. military already faces repairs and upgrades to facilities across the country, along with reductions in training operations. And climate change could function as a “threat multiplier” in already touchy regions of the globe, perhaps triggering armed conflicts over water, arable land, or other resources.

“Personally, I put climate change below a lot of other threats—a lot of other things are more immediate and more pressing—but it deserves a place on the list,” said Col. Mark Read, head of the Department of Geography and Environmental Engineering at the U.S. Military Academy at West Point, N.Y. “Twenty years ago, it wasn’t even on the list.”

“The problem is certainly cascading,” said Sherri Goodman, secretary general of the International Military Council on Climate and Security (IMCCS; a think tank composed of military and security leaders) and a former U.S. deputy undersecretary of defense. “It’s converging in many ways at the top of the global agenda.”

Hurricanes, Wildfires Are Major Infrastructure Threats

For the American military, perhaps the most immediate threats are infrastructure damage and training restrictions. Hurricanes, inland storm systems, and wildfires have caused extensive damage in the past few years.

In 2018, for example, Hurricane Florence caused $3.6 billion worth of damage to Camp Lejeune, a Marine Corps base in North Carolina that supports a population of more than 130,000 marines, sailors, retirees, their families, and civilian employees. The following year, Offutt Air Force Base in Nebraska suffered $1 billion in damages when major flooding hit the Midwest. Wildfires in 2016 burned 10,000 acres (4,047 hectares) at Vandenberg Air Force Base (now Vandenberg Space Force Base) in California, threatening two of its rocket launch pads.

A 2019 Department of Defense (DOD) study of 79 bases around the country concluded that two thirds of them are vulnerable to flooding and about half are vulnerable to drought and wildfires. Bases in California, New Mexico, and Nevada could be threatened by desertification, whereas facilities in Alaska could be damaged by thawing permafrost.

Flooding is increasing at some coastal bases even without hurricanes. Several facilities in the Hampton Roads region of Virginia and around Chesapeake Bay, for example, face frequent tidal flooding of roads and low-lying areas caused by higher sea level and some ground subsidence.

“If you add rain, the flooding can be pretty significant,” said Read, who emphasized that he was expressing his own views, not those of the Army or West Point. “That’s damaged some infrastructure and limited base access….That has readiness implications. It’s nothing glamorous. It seems mundane, but it’s profound.”

Higher temperatures also present problems. Bases in the Southwest have faced more “black flag” days, when it’s too hot or the air quality is too low to safely conduct training operations—a problem that is likely to grow worse as the climate continues to warm. And live-fire exercises have a greater potential to spark wildfires that could damage not only military facilities but civilian ones as well. In 2018, for example, two wildfires in Colorado were triggered by training exercises for an upcoming deployment, burning 3,300 acres (1,335 hectares) and forcing the evacuation of 250 households.

“DOD should ensure that extreme weather and climate change are considered during facility design and investment decisions,” the Defense Department’s inspector general’s office wrote in a 2020 report. “As the frequency of extreme weather events has increased, the DOD must consider the related risks and make wise investment decisions to mitigate the impacts of extreme weather on the DOD’s mission.”

Not a Big International Concern—Yet

That mission often includes responding to climate disasters around the world, which are forecast to become more common and more severe as the climate continues to change. In parts of the world, it’s possible that such disasters could help trigger armed conflicts.

A 2019 study found that climate-assisted conflicts are rare today but could become more common later in the century. “Does a flood lead to a civil war?” asked lead author Katharine Mach, an assistant professor of marine and atmospheric science at the University of Miami. “If you’re in Norway, the answer is totally no. But if you’re in a place that’s on the brink of civil war anyway, that’s where you start to see greater effects of climate shocks.”

“Climate acts as a threat multiplier,” said Erin Sikorsky, deputy director of the Center for Climate and Security and director of IMCCS. “In places that are already experiencing strains due to poor governance or a lack of social cohesion, when you add climate change on top of that, it makes it a more combustible mix.”

Sub-Saharan Africa, the Middle East, southern Asia, and parts of the Indo-Pacific lead the list of regions that could be most vulnerable to climate-triggered violence, Sikorsky said, but they aren’t alone. “I always say that you could spin a globe and just pick a spot, and you could find some kind of climate security risk there.”

Some experts say they are concerned that reduced snowfall in the Himalayas could produce water shortages that could lead to armed conflict between countries in Asia, for example, particularly in regions where one country can limit other nations’ access to water. Others suggest that the Arctic could become a climate security hot spot, as reduced ice coverage in summer makes it easier to extract mineral resources from the ocean floor. “We’ve seen the great powers posturing and competing for resources, and whenever you have that, there are security implications,” said Read.

The United States and other nations therefore must take climate change into consideration as they plan their foreign policy, said Sikorsky. “When you talk about security risks, you need to add climate change to the mix. It’s not a matter of, is climate change more important or risky than China, for example. Instead, it’s a question of how does climate change shape the risk from China? How does it shape competition? How does it shape our China foreign policy? Climate change will help set the parameters of the world stage.”

—Damond Benningfield (damond5916@att.net), Science Writer

America’s Natural Gas Pipeline Routes and Environmental Justice

Tue, 07/13/2021 - 11:38

Most research into the environmental and social impacts of the oil and natural gas industries focuses on the beginning and end of the process: where resources are extracted and where they are refined and consumed. Very little attention, however, is paid to middle infrastructure—the enormous vascular system of pipelines crisscrossing the United States. In a new study, Emanuel et al. address this continent-wide gap by comparing natural gas pipeline density to social vulnerability at the county level.

The Centers for Disease Control and Prevention has created a social vulnerability index that measures how well a community can prepare for, handle, and recover from hazards and disasters, either natural or human-caused. A county with high social vulnerability would be poorly equipped to handle a potential pipeline disaster. The researchers found that more socially vulnerable counties in the United States tended to have higher pipeline densities, while less socially vulnerable counties had lower pipeline densities. The correlation is stronger for counties with the highest pipeline densities.
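As a concrete illustration of this kind of county-level comparison, here is a minimal sketch of how such an association could be checked with a rank correlation. It is not the authors’ code, and the input file and column names are hypothetical.

# Sketch: county-level association between pipeline density and social
# vulnerability, in the spirit of Emanuel et al. (not their actual code).
# The input file and column names are hypothetical.
import pandas as pd
from scipy.stats import spearmanr

counties = pd.read_csv("county_pipelines.csv")  # one row per county

# A rank correlation is robust to the strongly skewed distribution of
# pipeline densities across counties.
rho, p_value = spearmanr(counties["svi"], counties["pipeline_km_per_km2"])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3g})")

# The study reports a stronger association at the top of the distribution;
# one way to probe that is to repeat the test within the densest decile.
top = counties[counties["pipeline_km_per_km2"]
               >= counties["pipeline_km_per_km2"].quantile(0.9)]
print(spearmanr(top["svi"], top["pipeline_km_per_km2"]))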

The authors point to the policy implications of the inequitable distribution of environmental harms connected with the construction and operation of this vast network of infrastructure. The burdens of pipelines—including noise, reduced property values and land use options, risk of leak or explosion, and cultural harm—fall disproportionately on the communities least capable of handling them.

Pipelines are frequently located in rural areas rather than urban ones. Although rural areas have lower population densities and are often presumed to carry “lower risks,” rural routes do not diffuse risks; they present a different set of risks, the authors say. Plus, the scientists highlight that Indigenous people rooted in rural areas have deep cultural ties to specific landscapes and waterways that are increasingly affected by pipeline construction and operation, and their cultures and communities may be harmed if the land is marred. Rural emergency response systems have fewer resources to handle large disasters. Further, local conflict over fossil fuel infrastructure can quickly tear rural communities apart and lead to mass relocations, converting rural communities to industrial landscapes within only a few years.

The scientists suggest that future projects undergo more rigorous environmental justice assessments that incorporate culture- and community-focused research and local perspectives. They call upon other scientists to partner with marginalized communities to identify and quantify impacts that may be overlooked or ignored by the powerful forces behind pipeline projects. Finally, they remind decisionmakers to consider the cumulative risks of existing oil and natural gas industry infrastructure, including the issues that follow climate change, which also tend to affect those most vulnerable. (GeoHealth, https://doi.org/10.1029/2021GH000442, 2021)

—Elizabeth Thompson, Science Writer

A Remarkably Constant History of Meteorite Strikes

Tue, 07/13/2021 - 11:38

Thousands of tons of extraterrestrial material pummel Earth’s surface each year. The vast majority of it is too small to see with the naked eye, but even bits of cosmic dust have secrets to reveal.

By poring over more than 2,800 grains from micrometeorites, researchers have found that the amount of extraterrestrial material falling to Earth has remained remarkably stable over millions of years. That’s a surprise, the team suggested, because it’s long been believed that random collisions of asteroids in the asteroid belt periodically send showers of meteoroids toward Earth.

Astronomy by Looking Down

Birger Schmitz, a geologist at Lund University in Sweden, remembers the first time he looked at sediments to trace something that had come from space. It was the 1980s, and he was studying the Chicxulub impact crater. “It was the first insight that we could get astronomical information by looking down instead of looking up,” said Schmitz.

Inspired by that experience, Schmitz and his Lund University colleague Fredrik Terfelt, a research engineer, have spent the past 8 years collecting over 8,000 kilograms of sedimentary limestone. They’re not interested in the rock itself, which was once part of the ancient seafloor, but rather in what it contains: micrometeorites that fell to Earth over the past 500 million years.

Dissolving Rocks

Schmitz and Terfelt used a series of strong chemicals in a specially designed laboratory to isolate the extraterrestrial material. They immersed their samples of limestone—representing 15 different time windows spanning from the Late Cambrian to the early Paleogene—in successive baths of hydrochloric acid, hydrofluoric acid, sulfuric acid, and nitric acid to dissolve the rock. Some of the reactions that ensued were impressive, said Terfelt, who recalled black smoke filling their laboratory’s fume hood. “The reaction between pyrite and nitric acid is quite spectacular.”

The chemical barrage left behind grains of chromite, an extremely hardy mineral that composes about 0.25% of some meteorites by weight. These grains are like a corpse’s gold tooth, said Schmitz. “They survive.”

Schmitz and Terfelt found that over 99% of the chromite grains they recovered came from a stony meteorite known as an ordinary chondrite. That’s perplexing, the researchers suggested, because asteroids of this type are rare in the asteroid belt, the source of most meteorites. “Ordinary chondritic asteroids don’t even appear to be common in the asteroid belt,” Schmitz told Eos.

An implication of this finding is that most of Earth’s roughly 200 known impact structures were likely formed from ordinary chondrites striking the planet. “The general view has been that comets and all types of asteroids were responsible,” said Schmitz.

When Schmitz and Terfelt sorted the 2,828 chromite grains they recovered by age, the mystery deepened. The distribution they found was remarkably flat except for one peak roughly 460 million years ago. “We were surprised,” said Schmitz. “Everyone was telling us [we would] find several peaks.”

Making It to Earth

Sporadic collisions between asteroids in the asteroid belt produce a plethora of debris, and it’s logical to assume that some of that cosmic shrapnel will reach Earth in the form of meteorites. But of the 15 such titanic tussles involving chromite-bearing asteroids that occurred over the past 500 million years, Schmitz and Terfelt showed that was the case only once. “Only one appears to have led to an increase in the flux of meteorites to Earth.”

Perhaps asteroid collisions need to occur in a specific place for their refuse to actually make it to our planet, the researchers proposed in the Proceedings of the National Academy of Sciences of the United States of America. So-called “Kirkwood gaps”—areas within the asteroid belt where the orbital periods of an asteroid and the planet Jupiter constitute a ratio of integers (e.g., 3:1 or 5:2)—are conspicuously empty. Thanks to gravitational interactions that asteroids experience in these regions of space, they tend to get flung out of those orbits, said Philipp Heck, a meteoriticist at the Field Museum of Natural History in Chicago who was not involved in the research. “Those objects tend to become Earth-crossing relatively quickly.”
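Where the Kirkwood gaps sit follows directly from Kepler’s third law, which in units of astronomical units and years reduces to a = T^(2/3). The snippet below is purely illustrative (it is not from the paper) and locates a few of the gaps:

# Locating Kirkwood gaps from Kepler's third law (a^3 = T^2 in AU-year units).
# An asteroid in a p:q resonance completes p orbits for every q orbits of
# Jupiter, so its period is T = (q/p) * T_Jupiter.
T_JUPITER = 11.862  # Jupiter's orbital period, in years

def resonance_semimajor_axis(p: int, q: int) -> float:
    """Semimajor axis (AU) of the p:q mean-motion resonance with Jupiter."""
    period_years = (q / p) * T_JUPITER
    return period_years ** (2.0 / 3.0)

for p, q in [(3, 1), (5, 2), (7, 3), (2, 1)]:
    print(f"{p}:{q} resonance -> a = {resonance_semimajor_axis(p, q):.2f} AU")
# Prints ~2.50 AU for 3:1 and ~2.82 AU for 5:2, matching the observed gaps.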

We’re gaining a better understanding of the solar system by studying the relics of asteroids, its oldest constituents, said Heck. But this analysis should be extended to other types of meteorites that don’t contain chromite grains, he said. “This method only looks at certain types of meteorites. It’s far from a complete picture.”

—Katherine Kornei (@KatherineKornei), Science Writer

U.S. Data Centers Rely on Water from Stressed Basins

Mon, 07/12/2021 - 13:42

Thanks to our ever increasing reliance on the Internet, the amount of data online is skyrocketing. The global data volume is expected to grow sixfold from 2018 to 2025. It might seem like that information is swirling in the cloudy sky, but it’s stored in physical data centers.

Landon Marston, an assistant professor at Virginia Tech, recently noticed news articles addressing the growing energy requirements of the data center industry. As an expert in water resources engineering, he wondered how those energy requirements translated into water consumption. “We know data centers use a lot of energy, and energy uses a lot of water. So how much water is being used?” said Marston. “We suspected that there could be large impacts at a very local scale, but there hadn’t really been a spatially detailed analysis looking at the environmental impact of data centers.”

In a study recently published in Environmental Research Letters, Marston and colleagues attempted to map how and where data centers consume energy and water in the United States. The results showed that it takes a large amount of water to support the cloud and that the water often comes from water-stressed basins.

Connecting Water Consumption to Data Centers

The researchers identified over 100,000 data centers using previous data from the Lawrence Berkeley National Laboratory and the websites of commercial data centers. While most of the data centers are small operations run by individual companies, the majority of servers in the United States are housed in fewer than 2,500 “colocation” and “hyperscale” data centers, which store data for many companies and the public simultaneously. Hyperscale data centers are the biggest type of data center, typically housing over 5,000 servers, but are designed to be more energy efficient by using cutting-edge cooling methods and servers.

All data centers consume water directly (to cool the electronics at the site) and indirectly (through electricity generation at the power plants that serve the sites). Using records from the U.S. Environmental Protection Agency and the U.S. Energy Information Administration, along with data from previous academic studies, the researchers matched the data centers with their most likely sources of electricity and water. They then estimated the data centers’ annual energy, direct water, and indirect water consumption on the basis of their energy and cooling requirements. By piecing all this information together, “we can have a spatially explicit representation of the environmental footprints associated with each of the data centers,” said Marston.

They mapped the U.S. data center industry’s carbon footprint, water footprint, and water scarcity footprint. The last calculation accounts for the pressure that water consumption will put on a region based on local water availability and needs.
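In outline, that accounting reduces to a simple sum. The sketch below illustrates the bookkeeping with placeholder names and numbers; it is not the authors’ model, which traces each facility to specific power plants and watersheds.

from dataclasses import dataclass

@dataclass
class DataCenter:
    energy_mwh: float             # annual electricity consumption
    direct_water_m3: float        # on-site cooling water use, m^3/yr
    grid_water_m3_per_mwh: float  # water intensity of its electricity mix
    scarcity_weight: float        # local water stress factor (>1 = stressed)

    def water_footprint_m3(self) -> float:
        # Direct (cooling) plus indirect (power generation) consumption.
        return self.direct_water_m3 + self.energy_mwh * self.grid_water_m3_per_mwh

    def scarcity_footprint_m3(self) -> float:
        # Weighting consumption by local water stress is what allows the
        # scarcity footprint to exceed the raw water footprint, as the
        # study found for the industry as a whole.
        return self.water_footprint_m3() * self.scarcity_weight

# Placeholder inputs, not values from the study:
dc = DataCenter(energy_mwh=50_000, direct_water_m3=120_000,
                grid_water_m3_per_mwh=2.0, scarcity_weight=2.4)
print(dc.water_footprint_m3(), dc.scarcity_footprint_m3())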

Hot, Dry, and Hydroelectric

The results revealed that data centers use water from 90% of watersheds in the United States. The water consumption of individual data centers varies dramatically depending on where they are located and their electricity source. For example, data centers in the Southwest rely on water-heavy hydroelectric power, and the hot climate there leads to more evaporation compared with other regions in the country. Data centers in the cooler, wetter climates of the East Coast also tend to use more solar and wind energy, which require less water.

Of the total water footprint attributed to data centers, 75% was from indirect water use at power plants and 25% was from on-site water use. “This is important because most [data center] operators don’t really look at their power consumption as part of the overall water footprint,” said David Mytton, a researcher at Imperial College London and the Data Center Sustainability Research Team at the Uptime Institute. Mytton was not involved in the new study.

A. B. Siddik, a graduate student at Virginia Tech and the study’s lead author, explained that on-site water consumption has a bigger impact on the water scarcity footprint, indicating that many data centers are in water-stressed regions. “Most often they are in the middle of a desert, or in the Southwest, like California, Nevada, and Arizona,” said Siddik. “Those are hubs of data centers.” The overall water scarcity footprint was more than double the water footprint, suggesting that data centers in the United States disproportionately consume water from water-stressed regions.

Planning for the Digital Future

As the demand for data storage grows, so will the need for hyperscale data centers. Although these buildings are more efficient than smaller data centers, concentrating the energy and water demands in fewer locations could tax the local environment.

Further innovations in energy-efficient technology and investments in renewable energy will help curb energy and water usage, but Marston also recommended building new data centers in regions with smaller carbon and water-scarcity footprints. “Simple real estate decisions could potentially be the solution here,” he said.

Technology companies have already tried out extreme locations for data centers. For example, Google converted an old mill in frigid northern Finland into a data center, and Microsoft experimented with putting data centers in the ocean. But according to the study, locations such as New York and southern Florida that have an abundance of water and renewable energy sources would have a lower environmental impact.

Mytton agreed that it’s important to consider the locations of future data centers, adding that climate change complicates these decisions because places that are not water stressed now might become drier and hotter over time. Plus, there are many other factors that contribute to where data centers are built, such as the local taxes, regulations, and workforce. Strategically placing data centers based on water resources is also an important economic consideration for the industry, Marston said, because water-stressed regions are prone to electricity blackouts and brownouts, which are detrimental to the operation of data centers.

“Data [are] so critical to the way our society functions, and data centers underpin all that,” Marston said. “It’s not just about the environmental footprint. It’s also a potential risk for these data centers.”

—Andrew Chapman (@andrew7chapman), Science Writer

Tree Rings Show Record of Newly Identified Extreme Solar Activity Event

Mon, 07/12/2021 - 13:42

The Sun constantly emits a stream of energetic particles, some of which reach Earth. The density and energy of this stream form the basis of space weather, which can interfere with the operation of satellites and other spacecraft. A key unresolved question in the field is the frequency with which the Sun emits bursts of energetic particles strong enough to disable or destroy space-based electronics.

One promising avenue for determining the rate of such events is the dendrochronological record. This approach relies on the process by which a solar energetic particle (SEP) strikes the atmosphere, causing a chain reaction that results in the production of an atom of carbon-14. This atom subsequently can be incorporated into the structure of a tree; thus, the concentration of carbon-14 atoms in a tree ring can indicate the impact rate of SEPs in a given year.
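In standard notation, and using textbook values rather than anything specific to this study, the chain looks like this: a secondary neutron from the SEP-induced cascade converts atmospheric nitrogen to radiocarbon, which is fixed in wood and decays over millennia,

\[
  n + {}^{14}\mathrm{N} \;\rightarrow\; {}^{14}\mathrm{C} + p,
  \qquad
  N(t) = N_0\, e^{-\lambda t},
  \quad
  \lambda = \frac{\ln 2}{t_{1/2}},
  \quad
  t_{1/2} \approx 5{,}730\ \mathrm{yr},
\]

so the carbon-14 excess in a ring dated to a given year can be corrected for decay and read as a snapshot of production in that year.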

To date, three events of extreme SEP production are well described in the literature, occurring approximately in the years 660 BCE, 774–775 CE, and 992–993 CE. Each event was roughly an order of magnitude stronger than any measured in the space exploration era. Miyake et al. describe a fourth such event, which occurred between 5411 BCE and 5410 BCE. Because of this burst, atmospheric carbon-14 increased 0.6% year over year in the Northern Hemisphere, and the elevated level was sustained for several years before dropping back to typical levels.

The authors deduced the presence of this event by using samples collected from trees in three widely dispersed locales: a bristlecone pine in California, a Scotch pine in Finland, and a European larch in Switzerland. Each sample had its individual tree rings separated, and material from each ring underwent accelerator mass spectrometry to determine its carbon-14 content.

Using statistical methods, the researchers identified a pattern of small carbon-14 fluctuations consistent with the Sun’s 11-year solar cycle; the event recorded in the tree ring occurred during a time of solar maximum. Notably, other evidence suggests that the Sun was also undergoing a decades-long period of increasing activity.

If an extreme SEP burst is indeed the cause of the additional carbon-14, then these observations could aid in forecasting future events. However, tree ring measurements cannot rule out other extraterrestrial causes, such as a nearby supernova explosion. Confirmation will require isotopic measurements of beryllium and chlorine taken from ice cores, according to the authors. (Geophysical Research Letters, https://doi.org/10.1029/2021GL093419, 2021)

—Morgan Rehnberg, Science Writer

Reconstructing Rocks with Machine Learning

Mon, 07/12/2021 - 11:30

Digital rock physics follows a paradigm of first taking a digital image of a rock and then performing a computer simulation using that image. The approach has many applications, in fields such as hydrogeology and geologic carbon dioxide sequestration. The imaging portion of the task can be costly because high-resolution images of 3D rocks often must be pieced together from many images of 2D rock slices.

You et al. [2021] utilize a machine learning technique called a “progressive growing generative adversarial network” (or PG-GAN) to reduce the cost of producing high-resolution 3D rock images. The PG-GAN learns to generate realistic, high-dimensional rock images from noise in a low-dimensional space, and a given rock image can be reconstructed by finding an optimal point in that space. Interpolating the rock images directly yields a low-quality reconstruction, but interpolating in the low-dimensional space and passing the result through the PG-GAN produces a high-quality one. This enables accurate digital reconstruction of a rock from fewer 2D slices, reducing the cost of the process.
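The core trick is inverting the generator and interpolating in its latent space rather than in image space. A minimal sketch of that procedure follows; it is illustrative code, not the authors’ implementation, and it assumes a pretrained generator G that maps a latent vector to a rock image:

import torch

def invert(G, target, z_dim=512, steps=500, lr=0.01):
    """Find a latent code whose generated image matches `target`."""
    z = torch.randn(1, z_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.mean((G(z) - target) ** 2)  # pixelwise reconstruction error
        loss.backward()
        opt.step()
    return z.detach()

def interpolate(G, z_a, z_b, n=8):
    """Generate in-between images from the latent codes of two slices."""
    return [G((1 - a) * z_a + a * z_b) for a in torch.linspace(0, 1, n)]

Interpolating the latent codes of two measured slices yields realistic intermediate slices, whereas interpolating the raw images would simply blur them together.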

Citation: You, N., Li, Y. E., & Cheng, A. [2021]. 3D carbonate digital rock reconstruction using progressive growing GAN. Journal of Geophysical Research: Solid Earth, 126, e2021JB021687. https://doi.org/10.1029/2021JB021687

This research article is part of a cross-journal special collection on “Machine Learning for Solid Earth Observation, Modeling, and Understanding”. Find out more and read other articles.

—Daniel O’Malley, Associate Editor, JGR: Solid Earth

Getting to the Bottom of Trawling’s Carbon Emissions

Fri, 07/09/2021 - 12:50

Bottom trawling, a controversial fishing practice in which industrial boats drag weighted nets through the water and along the ocean floor, can unintentionally dig up seafloor ecosystems and release sequestered carbon within the sediments. For the first time, researchers have attempted to estimate globally how this fishing technique may be remineralizing stored carbon that, as the seabed is tilled, ends up back in the water column and possibly the atmosphere, where it would contribute to climate change.

“The ocean is one of our biggest carbon sinks,” said Trisha Atwood, who researches aquatic ecology at Utah State University. “So when we put in more human-induced CO2 emissions, whether that’s directly dumping CO2 into deep waters or whether that’s trawling and enhancing remineralization of this carbon, we’re weakening that sink.”

Atwood helped build a model that shows that bottom trawling may be releasing as much as 1.5 billion metric tons of aqueous carbon dioxide (CO2) annually, equal to what is released on land through farming. Her work was part of a paper recently published in Nature that presents a framework for prioritizing the creation of marine protected areas to restore ocean biodiversity and maximize carbon storage and ecosystem services.

Estimating Carbon Loss from the Ocean Floor

To create the model, Atwood and her coauthors first needed to figure out how much of the ocean floor is dredged by trawlers. They turned to data from the nonprofit Global Fishing Watch, which recently began tracking fishing activity around the world and compiled data on industrial trawlers and dredgers from 2016 to 2019.

The next step was to find data on how much carbon is stored in the world’s ocean sediments. Because that information was not readily available, Atwood and colleagues built a data set by analyzing thousands of sediment cores that had been collected over the decades.

Last, they dug through the scientific literature, looking at studies that examined whether disturbances to the soil in coastal ecosystems, such as seagrasses, mangroves, and salt marshes, exposed carbon that was once deep in marine sediments and enhanced carbon production in the ocean.

A group of twin-rigged shrimp trawlers in the northern Gulf of Mexico off the coast of Louisiana. The trawlers are trailed by a plume of sediment, suggesting that their nets are scraping against the seafloor. Credit: SkyTruth Galleries, CC BY-NC-SA 2.0

“We lean really heavily on that literature,” said Atwood. “We used a lot of the equations [in previous papers] to build our model and extend it into the seabeds in these more open ocean locations. And from there, we were able to come up with this first estimate.”
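For a sense of how such an estimate scales, here is a deliberately crude, back-of-the-envelope version of the accounting. Every input is a placeholder, and the published model is far more detailed:

def trawled_co2_tonnes(area_km2, mixing_depth_m, carbon_kg_per_m3,
                       labile_fraction, remineralized_fraction):
    """Rough annual CO2 release (metric tons) from trawled sediment."""
    volume_m3 = area_km2 * 1e6 * mixing_depth_m          # sediment disturbed
    labile_carbon_kg = volume_m3 * carbon_kg_per_m3 * labile_fraction
    # The 44/12 factor converts a mass of carbon to a mass of CO2.
    co2_kg = labile_carbon_kg * remineralized_fraction * (44.0 / 12.0)
    return co2_kg / 1000.0

# Placeholder inputs, not values from the paper:
print(f"{trawled_co2_tonnes(4.9e6, 0.025, 15.0, 0.7, 0.3):.2g} t CO2/yr")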

Their investigation did not attempt to determine whether sequestered carbon that has been released by bottom trawling remains in the water column or is released into the atmosphere, although the authors pointed to potential problems either way: The released carbon is likely to increase ocean acidification, limit the ocean’s buffering capacity, and even add to the buildup of atmospheric CO2.

Atwood and the lead author of the paper, Enric Sala, a conservation ecologist who is also a National Geographic Explorer-in-Residence, are working with Tim DeVries, who studies ocean biogeochemistry at the University of California, Santa Barbara, and scientists at NASA’s Goddard Space Flight Center to build atmospheric models to try to figure out where the released carbon goes.

Existing Trawling Data May Be Too Scant

Not everyone, however, is convinced that Atwood and Sala’s model on bottom trawling and loss of carbon sequestration in marine sediments is accurate. Sarah Paradis, who is studying the effects of bottom trawling on the seafloor for her Ph.D. at the Institute of Environmental Science and Technology in Barcelona, is skeptical.

In an email to Eos, Paradis noted that since the 1980s, there have been fewer than 40 studies that address the impacts that bottom trawling has on sedimentary organic carbon. These few studies are not enough to build a model on, she said, and in addition, the studies reach different conclusions. Some studies have observed that bottom trawling decreases organic carbon content of the seafloor, whereas others show it increases organic carbon.

In addition, Paradis wrote that lower organic carbon on the seafloor does not necessarily signal remineralization to CO2. Rather, it could simply reflect loss of organic carbon through erosion, in which the carbon moves to another area of the seabed and very little is remineralized into CO2. She pointed to several studies, including one she was part of, that showed loss of organic carbon through erosion.

“I want to emphasize that [the authors] address a very important issue regarding how bottom trawling, a ubiquitous and very poorly-regulated anthropogenic activity, is affecting the seafloor,” she wrote. “But the values they propose are far from being credible.”

Atwood disagreed. “We don’t need lots of studies on the effects of trawling because we built our model using decades of carbon cycling research,” she wrote in an email to Eos. “Trawling is simply a perturbation that mixes and re-suspends sediments, leading to increases in carbon availability. All we needed to know about trawling to apply a carbon model to it is where trawling occurs and how deep in the sediment the trawls go.”

In addition, Atwood said, “We in no way intended our model to be the end-all in the trawling conversation. We hope that many more studies will come along that help produce more localized results.”

—Nancy Averett (@nancyaverett), Science Writer

Virtual Tours Through the Ice Using Everyday Tools

Fri, 07/09/2021 - 12:48

You know you are on to something special when researchers who have traveled to and experienced the wonder of some of the most remote places on Earth are captivated by a tool that takes them there virtually.

Earth’s cryosphere, including its ice sheets, ice caps, glaciers, and sea ice, is undergoing stark changes as air temperatures continue to rise. Scientists who study these regions understand viscerally the scale and scope of these changes, but they encounter limitations in communicating their experiences and observations to the public. Digital learning tools and online scientific data repositories have greatly expanded over the past decade, but there are still few ways for the public to explore rapidly changing icy environments through a realistic platform that provides contextual information, supplemental media, and connections to data sets.

Create Your Own Virtual Field Experience

Byrd Center’s instructional guide:
Record: GoPro MAX 360° waterproof VR camera
Edit footage: Adobe Premiere Pro
Add location data: Dashware
Host videos to access in tour: Vimeo
Add 3D objects: Sketchfab
Add the virtual tour overlay: 3DVista

The Virtual Ice Explorer (VIE) aims to bring the public closer to these important places. Developed by the Education and Outreach (E&O) team at the Byrd Polar and Climate Research Center, VIE encourages informal learning about icy environments by immersing visitors in “choose your own adventure” tours. Click on the globe on the home page and head to, for example, the Multidisciplinary Drifting Observatory for the Study of Arctic Climate (MOSAiC) expedition that intentionally froze its ship into Arctic sea ice for a year of observations last year. You’ll land on the deck of the icebreaker R/V Polarstern overlooking the ice camp—no long voyage required. Next, you can visit scientists in action, sampling Arctic waters up to 1,000 meters below the ocean surface through a hole drilled in the ice. Or maybe you’d like to see how researchers spend their off hours with a little snow soccer. These options offer visitors a glimpse into the daily lives of scientists in the field as they fill in the blanks about what researchers study in these extraordinary locations and why it matters to our understanding of the planet.

DIY-ing the First VIE

VIE was originally conceived as a platform to display immersive tours for about a dozen glacial sites around the world, generated from digital elevation models draped with satellite imagery. Following setbacks caused by the poor quality of the virtual landscapes created from satellite data and a challenging user experience, two opportunities allowed us to reenvision VIE: (1) the acquisition of rugged, easy-to-use field cameras and (2) our discovery of existing commercial software with which we could more easily create tours that previously had been painstakingly built with custom code. We also began involving researchers who had visited these sites firsthand; their experiences turned out to be essential for our tour development.

Our team purchased a GoPro Fusion 360° camera by way of a generous donation to the Byrd Center. At the same time, Michael Loso, a geologist with the U.S. National Park Service, was planning to spend a field season at Kennicott Glacier in Alaska. Loso agreed to take the camera and capture footage. We shared his footage using a virtual reality (VR) headset during Byrd Center outreach events and with park visitors, and also collected feedback. We were particularly moved by a visitor who appreciated that the tour allowed them to explore a site that was otherwise inaccessible due to a physical disability.

These rugged, inexpensive, and relatively easy-to-use cameras come with their own software and have a multitude of third-party programs available. Researchers can set them up, hit record, and walk away to focus on their work. This ease of use in the field was an essential criterion if we were to ask scientists to carry the cameras along on expeditions. After capturing and rendering video using GoPro’s software, we use tools like Adobe Premiere Pro for additional editing, Dashware for accessing location data, and Plotly’s Chart Studio for graphing and hosting interactive data sets.

A workshop run by Ryan Hollister, a high school educator, during the 2019 Earth Educators’ Rendezvous also led to tremendous advances in our team’s ability to create VIE tours. Hollister showed off the immersive virtual field experience he created for the Columns of the Giants in California and walked attendees through designing their own experiences. After collecting 24 panoramic images with a Nikon D750 camera, Hollister stitched them together to create a 360° image using Autopano Giga software. He then used 3DVista software to add an interactive overlay to the images that allowed users to move to different locations within a site, click on marked points of interest and read about them, and embed 3D models of rock samples. This software was originally designed for architects and real estate professionals to create virtual tours of buildings, so it seamlessly constructs the computer code underpinning the tours with landscapes. Today 3DVista caters to wider audiences, including educators, and it provides services such as live guided tours and hosting capabilities.

The 3DVista software allowed us to create glacial landscape tours that we had been building with customized computer code, but in far less time. Use of off-the-shelf software allowed us to spend more time collecting footage, creating compelling narratives, and testing a wider range of scene designs. In the future, we plan to use 3DVista’s user-tested interface to train educators and researchers to create their own tours.

Getting Scientists Camera-Ready

The E&O team now trains Byrd Center researchers with the cameras on basic photography techniques and more specific 360° filming techniques to capture high-quality video for VIE and other outreach programs. We want researchers to illustrate the vast, unique landscapes in which they’re working as well as showcase engaging scenes from their day-to-day experiences. We train them to create compositions to fill a scene, such as the inclusion of people to provide scale and demonstrate the research process, and we encourage them to film all parts of the expedition, including the journey, their living conditions, and interactions with collaborators and local partners.

We also have conversations with expedition members on the nature of their research, the field site itself, the equipment that will be on-site, and the desired impact of our outreach so that we can coproduce a narrative that guides what they film. These training sessions help the E&O team consider unique content for each tour, such as maps of study sites, time-lapse series, information on samples and equipment, biographies of researchers, links to publications, and prominent messages that properly identify and give respect to the people and places shown.

A benefit of having researchers explore virtual tours of other sites before they embark on their journey is that it generates genuine enthusiasm to share their own experiences. Chris Gardner, a Byrd Center researcher, viewed a tour of ice core drilling on Mount Huascarán in Peru while preparing to lead an expedition to the Dry Valleys of Antarctica during the 2019–2020 field season. Once he could see what was possible, he met with the E&O team to develop a video capture plan. Importantly, Gardner involved his entire team in selecting shots, recording video, and contributing to the tour narrative.

This photo of scientists (left to right) Chris Gardner, Adolfo Calero, and Melisa Diaz on a 2019 expedition to Dry Valleys in Antarctica welcomes visitors on the Virtual Ice Tour by the Byrd Polar and Climate Research Center called “McMurdo Dry Valleys, Victoria Land, Antarctica.” Credit: Chris Gardner (photo); Byrd Polar and Climate Research Center (screen capture)

Authors Kira Harris and Kasey Krok have participated in many of these training sessions as undergraduate interns on the E&O team. They found that these sessions offered opportunities for pivotal interpersonal interactions among group members, including undergraduate and graduate students, postdocs, and investigators. Students gained a better understanding of the science that researchers were carrying out, while getting an opportunity to share their sometimes more finely honed technical experience in video and photography.

High-Quality Tours With a Low Lift

As of this writing, the Byrd Center has created virtual field experiences for eight sites, thanks to collaboration with the National Park Service, the Ohio Department of Natural Resources, and the many scientists who filmed their field campaigns. Additional examples of virtual field experiences by other groups include VR Glaciers and Glaciated Landscapes by Des McDougall at the University of Worcester; The Hidden Worlds of the National Parks by the National Park Service; and these immersive virtual field trips by Arizona State University. More are being developed all the time. At AGU’s Fall Meeting 2020, for example, there were numerous oral sessions and posters highlighting the applications of virtual field experiences.

Our E&O team has published an instructional guide for educators and scientists to use to build their own virtual field experiences tailored to their initiatives, using the same workflow that we use. Ryan Hollister has several resources, including guides on the technical requirements for high-resolution tours, on creating 3D models of rock samples, and on using the Next Generation Science Standards to best adapt immersive virtual field experiences for all learners. Our team also continues to test new tour features that will increase user engagement, knowledge retention, and options for user interaction. Last year, while closed to the public due to the COVID-19 pandemic, we even created a virtual tour of the Byrd Center to continue reaching out to the thousands of individuals who typically visit our facility each year.

What’s most exciting is that these virtual explorations allow individuals almost anywhere in the world—regardless of their wealth, abilities, or learning preferences—to experience new landscapes and engage with Earth science lessons. While you can get the best view of the tours on a VR headset, all you need is a modest Internet connection and a laptop, tablet, or smartphone. These tours can be developed to specifically put visitors into the role of scientist to observe the terrain and use the provided data to make evidence-based claims.

This virtual field access enables individuals of all ages to get a taste of field research, appreciate the daily work of scientists, and gain a deeper understanding of our rapidly altering natural world. Although nothing can truly replicate an in-person field experience, virtual tours can be used to enhance educational opportunities in cases where people would otherwise not have access to those experiences, such as in large introductory courses, socially distanced laboratory exercises, or locations that need protection from oversampling or ecotourism. We can’t always bring people to the far reaches of the world, but we now have many tools to bring the vast world to each of us.

Acknowledgments

Funding for this project was provided by the National Science Foundation under award 1713537 and a contribution from ENGIE/Ohio State Energy Partners. We thank the MOSAiC expedition and the NSF Office of Polar Programs for their continued collaboration on this project.

Author Information

Kira Harris (kiraharris@email.arizona.edu), University of Arizona, Tucson; Kasey Krok, Byrd Polar and Climate Research Center, Ohio State University, Columbus; Ryan Hollister, Turlock Unified School District, Turlock, Calif.; and Jason Cervenec, Byrd Polar and Climate Research Center, Ohio State University, Columbus

Previous Intra-oceanic Subduction Found Beneath South America?

Fri, 07/09/2021 - 11:30

High-velocity slabs deeper than 1,000 kilometers have been imaged beneath the Amazon by various tomographic studies and have been interpreted as a continuation of the present Nazca slab. Mohammadzaheri et al. [2021] propose a new interpretation of the slab pieces deeper than about 900 kilometers. Geodynamic and plate reconstruction analyses of a new global P-wave tomography model (DETOX-P1, based on both travel time data and multifrequency waveform picks) suggest that these 900- to 1,800-kilometer-deep high-velocity anomalies are actually remnants of a west-dipping intra-oceanic subduction zone active during Late Jurassic and Early Cretaceous times, when South America’s paleoposition was near Africa, before the start of the present east-dipping Andean subduction around 85 million years ago. This supports the hypothesis that slabs in the lower mantle sink vertically, with implications for models of plate motion reconstructions.

Citation: Mohammadzaheri, A., Sigloch, K., Hosseini, K., & Mihalynuk, M. G. [2021]. Subducted lithosphere under South America from multifrequency P wave tomography. Journal of Geophysical Research: Solid Earth, 126, e2020JB020704. https://doi.org/10.1029/2020JB020704

—Marcelo Assumpção, Associate Editor, JGR: Solid Earth

Good, Soon, and Cheap – Earthquake Early Warning by Smartphone

Thu, 07/08/2021 - 13:50

Even short warning of an earthquake can be crucial in protecting lives and infrastructure, so there is great interest in developing systems for earthquake early warning. Any such system must be reliable, balancing sensitivity to events against factors such as user tolerance for false alarms in which no shaking is felt. The task is complicated by the need for relatively dense sensor coverage not only where people reside but also in adjacent seismogenic regions. Achieving that coverage with typical scientific-grade instruments is expensive, and such costs are prohibitive in many countries where resources are limited.

Brooks et al. [2021] describe very encouraging results from Costa Rica, where the ASTUTI network (Alerta Sismica Temprana Utilizando Teléfonos Inteligentes, or Earthquake Early Warning Utilizing Smartphones) uses a fixed network of smartphones. Their data indicate that such low-cost networks can be highly effective and can be installed and operated at relatively low cost, bringing the benefits of early warning to a broader portion of the world’s population.

Citation: Brooks, B., Protti, M., Ericksen, T. et al. [2021]. Robust Earthquake Early Warning at a Fraction of the Cost: ASTUTI Costa Rica. AGU Advances, 2, e2021AV000407. https://doi.org/10.1029/2021AV000407

—Peter Zeitler, Editor, AGU Advances

Realizing Machine Learning’s Promise in Geoscience Remote Sensing

Thu, 07/08/2021 - 12:25

In recent years, machine learning and pattern recognition methods have become common in Earth and space sciences. This is especially true for remote sensing applications, which often rely on massive archives of noisy data and so are well suited to such artificial intelligence (AI) techniques.

As the data science revolution matures, we can assess its impact on specific research disciplines. We focus here on imaging spectroscopy, also known as hyperspectral imaging, as a data-centric remote sensing discipline expected to benefit from machine learning. Imaging spectroscopy involves collecting spectral data from airborne and satellite sensors at hundreds of electromagnetic wavelengths for each pixel in the sensors’ viewing area.

Since the introduction of imaging spectrometers in the early 1980s, their numbers and sophistication have grown dramatically, and their application has expanded across diverse topics in Earth, space, and laboratory sciences. They have, for example, surveyed greenhouse gas emitters across California [Duren et al., 2019], found water on the moon [Pieters et al., 2009], and mapped the tree chemistry of the Peruvian Amazon [Asner et al., 2017]. The data sets involved are large and complex. And a new generation of orbital instruments, slated for launch in coming years, will provide global coverage with far larger archives. Missions featuring these instruments include NASA’s Earth Surface Mineral Dust Source Investigation (EMIT) [Green et al., 2020] and Surface Biology and Geology investigation [National Academies of Sciences, Engineering, and Medicine, 2019].

Researchers have introduced modern signal processing and machine learning concepts to imaging spectroscopy analysis, with potential benefits for numerous areas of geoscience research. But to what extent has this potential been realized? To help answer this question, we assessed whether the growth in signal processing and pattern recognition research, indicated by an increasing number of peer-reviewed technical articles, has produced a commensurate impact on science investigations using imaging spectroscopy.

Mining for Data

Following an established method, we surveyed all articles cataloged in the Web of Science [Harzing and Alakangas, 2016] since 1976 with titles or abstracts containing the term “imaging spectroscopy” or “hyperspectral.” Then, using a modular clustering approach [Waltman et al., 2010], we identified clustered bibliographic communities among the 13,850 connected articles within the citation network.

We found that these articles fall into several independent and self-citing groups (Figure 1): optics and medicine, food and agriculture, machine learning, signal processing, terrestrial Earth science, aquatic Earth science, astrophysics, heliophysics, and planetary science. The articles in two of these nine groups (signal processing and machine learning) make up a distinct cluster of methodological research investigating how signal processing and machine learning can be used with imaging spectroscopy, and those in the other seven involve research using imaging spectroscopy to address questions in applied sciences. The volume of research has increased recently in all of these groups, especially those in the methods cluster (Figure 2). Nevertheless, these methods articles have seldom been cited by the applied sciences papers, drawing more than 96% of their citations internally but no more than 2% from any applied science group.
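For readers curious about the mechanics, the toy sketch below clusters a citation network with generic modularity-based community detection and computes each cluster’s internal citation share. It is illustrative only; the survey used the method of Waltman et al. (2010), not this function, and the node names are invented.

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Nodes are papers, edges are citation links (toy, hypothetical data).
G = nx.Graph()
G.add_edges_from([
    ("ml_paper_1", "ml_paper_2"), ("ml_paper_2", "ml_paper_3"),
    ("eo_paper_1", "eo_paper_2"), ("eo_paper_2", "eo_paper_3"),
    ("ml_paper_3", "eo_paper_1"),  # a rare cross-community citation
])

communities = greedy_modularity_communities(G)
for i, members in enumerate(communities):
    # Internal share: the fraction of a cluster's citation links that stay
    # inside it (the survey reports more than 96% for the methods cluster).
    internal = G.subgraph(members).number_of_edges()
    touching = sum(1 for u, v in G.edges() if u in members or v in members)
    print(f"cluster {i}: {sorted(members)}, internal share {internal/touching:.2f}")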

Fig. 1. Research communities tend to sort themselves into self-citing clusters. Circles in this figure represent scientific journal publications, with the size proportional to the number of citations. Map distance indicates similarity in the citation network. Seven of nine total clusters are shown; the other two (astrophysics and heliophysics) were predominantly isolated from the others. Annotations indicate keywords from representative publications. Image produced using VOSviewer.

The siloing is even stronger among published research in high-ranked scholarly journals, defined as having h-indices among the 20 highest in the 2020 public Google Scholar ranking. Fewer than 40% of the articles in our survey came from the clinical, Earth, and space science fields noted above, yet these fields produced all of the publications in top-ranked journals. We did not find a single instance in which one of those papers in a high-impact journal cited a paper from the methods cluster.

Fig. 2. The number of publications per year in each of the nine research communities considered is shown here.

A Dramatic Disconnect

From our analysis, we conclude that the recent boom in machine learning and signal processing research has not yet made a commensurate impact on the use of imaging spectroscopy in applied sciences.

A lack of citations does not necessarily imply a lack of influence. For instance, an Earth science paper that borrows techniques published in a machine learning paper may cite that manuscript once, whereas later studies applying the techniques may cite the science paper rather than the progenitor. Nonetheless, it is clear that despite constituting a large fraction of the research volume having to do with imaging spectroscopy for more than half a decade, research focused on machine learning and signal processing methods is nearly absent from high-impact science discoveries. This absence suggests a dramatic disconnect between science investigations and pure methodological research.

Research communities focused on improving the use of signal processing and machine learning with imaging spectroscopy have produced thousands of manuscripts through person-centuries of effort. How can we improve the science impact of these efforts?

Lowering Barriers to Entry

We have two main recommendations. The first is technical. The methodology-science disconnect is symptomatic of high barriers to entry for data science researchers to engage applied science questions.

Imaging spectroscopy data are still expensive to acquire, challenging to use, and regional in scale. Most top-ranked journal publications are written by career experts who plan and conduct specific acquisition campaigns and then perform each stage of the collection and analysis. This effort requires a chain of specialized steps involving instrument calibration, removal of atmospheric interference, and interpretation of reflectance spectra, all of which are challenging for nonexperts. These analyses often require expensive and complex software, raising obstacles for nonexpert researchers to engage cutting-edge geoscience problems.

In contrast, a large fraction of methodological research related to hyperspectral imaging focuses on packaged, publicly available benchmark scenes such as the Indian Pines [Baumgardner et al., 2015] or the University of Pavia [Dell’Acqua et al., 2004]. These benchmark scenes reduce multifaceted real-world measurement challenges to simplified classification tasks, creating well-defined problems with debatable relevance to pressing science questions.

Not all remote sensing disciplines have this disconnect. Hyperspectral imaging, involving hundreds of spectral channels, contrasts with multiband remote sensing, which generally involves only 3 to 10 channels and is far more commonly used. Multiband remote sensing instruments have regular global coverage, producing familiar image-like reflectance data. Although multiband instruments cannot measure the same wide range of phenomena as hyperspectral imagers, the maturity and extent of their data products democratize their use to address novel science questions.

We support efforts to similarly democratize imaging spectrometer data by improving and disseminating core data products, making pertinent science data more accessible to machine learning researchers. Open spectral libraries like SPECCHIO and EcoSIS exemplify this trend, as do the commitments by missions such as PRISMA, EnMAP, and EMIT to distribute reflectance data for each acquisition.

In the longer term, global imaging spectroscopy missions can increase data usage by providing data in a format that is user-friendly and ready to analyze. We also support open-source visualization and high-quality corrections for atmospheric effects to make existing hyperspectral data sets more accessible to nonexperts, thereby strengthening connections among methodological and application-based research communities. Recent efforts in this area include open source packages like the EnMAP-Box, HyTools, ISOFIT, and ImgSPEC.

Expanding the Envelope

Our second recommendation is cultural. Many of today’s most compelling science questions live at the limits of detectability—for example, in the first data acquisition over a new target, in a signal close to the noise, or in a relationship struggling for statistical significance. The papers in the planetary science cluster from our survey are exemplary in this respect, with many focusing on first observations of novel environments and achieving the best high-impact publication rate of any group. In contrast, a lot of methodological work makes use of standardized, well-understood benchmark data sets. Although benchmarks can help to coordinate research around key challenge areas, they should be connected to pertinent science questions.

Journal editors should encourage submission of manuscripts reporting research about specific, new, and compelling science problems of interest while also being more skeptical of incremental improvements in generic classification, regression, or unmixing algorithms. Science investigators in turn should partner with data scientists to pursue challenging (bio)geophysical investigations, thus broadening their technical tool kits and pushing the limits of what can be measured remotely.

Machine learning will play a central role in the next decade of imaging spectroscopy research, but its potential in the geosciences will be realized only through engagement with specific and pressing investigations. There is reason for optimism: The next generation of orbiting imaging spectrometer missions promises global coverage commensurate with existing imagers. We foresee a future in which, with judicious help from data science, imaging spectroscopy becomes as pervasive as multiband remote sensing is today.

Acknowledgments

The research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with NASA (80NM0018D0004). Copyright 2021. California Institute of Technology. Government sponsorship acknowledged.
