EOS

Earth & Space Science News

Former NOAA Head Calls for Renewed Social Contract for Science

Fri, 12/13/2019 - 15:02

Stating that “this is a moment of truth” about the climate crisis and related issues like biodiversity loss and ecosystem disruptions, renowned environmental scientist Jane Lubchenco has called for “a renewed social contract for science.”

Lubchenco, a former administrator of the National Oceanic and Atmospheric Administration (NOAA) and U.S. undersecretary of commerce for oceans and atmosphere, is challenging the scientific community to think about “our obligations as scientists [and] our responsibilities to society, to each other, to future generations.”

Lubchenco issued her renewed challenge at a 10 December session at AGU’s Fall Meeting 2019 in San Francisco, Calif. It came 22 years after she initially issued a call for a new social contract for science. That earlier call urged scientists “to devote their energies and talents to the most pressing problems of the day”—including urgent and unprecedented environmental and social changes—“in proportion to their importance, in exchange for public funding.”

“I believe that we have an obligation to be helpful to society,” said Lubchenco, a professor at Oregon State University. “To me, it means that the responsibilities go beyond doing really cool science and publishing it to share it with other scientists. To me, that obligation means doing science and sharing it widely, but also working on issues that in times of serious need deserve particular attention.”

Lubchenco said that for science to inform understanding and action, it needs to be accessible, understandable, relevant, credible, salient, and useful. However, she said that all too often, science does not meet those benchmarks.

Making the situation even more complicated is “this post-truth world that has emerged, especially in the 2016 election, not just in the U.S. but around the world. And that makes some of these challenges even more problematic,” she said.

“One of the reasons that there is antagonism to science is the orchestrated campaign that has existed to sow misinformation [and] disinformation to distract people,” Lubchenco said. “There are active attempts by vested interests to create this nonsense that we don’t know enough about climate change, for example, to act.”

Lubchenco said that marches in defense of science have been among a series of positive measures by scientists. She added, “More and more environmental scientists are actively sharing their science broadly. They’re conducting use-inspired science. They’re engaging with society. They’re crafting solutions.”

However, Lubchenco said that although these efforts show real progress, they are insufficient to fulfill the science community’s responsibilities to society. “It’s not enough to overcome the serious impediments that exist in society, in science, and in academia,” she said.

A major impediment in society, she said, is that “the power that vested interests have to control the narrative and to buy elections is quite problematic.”

Impediments Within Academia

Lubchenco also focused on impediments in the culture of academia. “It doesn’t reward engagement with society, it doesn’t reward communication, it doesn’t reward creating solutions.”

She said that academia needs to embrace an incentives system that rewards these efforts rather than just rewarding successes in metrics such as grant and publication numbers. She also called for academia to provide more training and mentoring to improve communication efforts and active engagement with the broader community and to enable partnerships whether with local communities, nongovernmental organizations, or other groups.

“I think it’s time for us collectively to make a quantum leap in our engagement with society,” Lubchenco said. “It’s time to change the culture of academia and to mobilize enabling conditions for science to serve society more effectively.”

—Randy Showstack (@RandyShowstack), Staff Writer

Are Beavers Nature’s “Little Firefighters”?

Fri, 12/13/2019 - 15:00

When a wildfire tears through a landscape, there can be little left behind.

A new study, though, suggests that beavers may be protecting life around streams, thanks to their signature dams. Satellite images from five major wildfires in the United States revealed that corridors around beaver habitat stayed green even after a wildfire.

Millions of beavers live in forests across North America, and they make their homes in a particular way: By stacking piles of branches and rocks in a river’s path, they slow its flow and create a pool of calm water to call home. They even dig little channels radiating out from their pools to create “little water highways,” said Emily Fairfax, an assistant professor at California State University Channel Islands who led the study.

Fairfax wondered whether beaver dams would insulate riparian vegetation, as well as the fish and amphibians that live there, from wildfire damage. Wildfires course through landscapes naturally, but blazes will become more frequent as climate change dries out forests.

Fairfax sifted through records of past fires in the U.S. Geological Survey’s database and chose five recent fires that occurred in beaver habitat. She then analyzed the “greenness” of vegetation before, during, and after the fires. She used measurements from NASA’s Landsat satellites, which use red and near-infrared light to detect the lushness of vegetation.

Fairfax found that vegetation along sections of a river without dams burned straight to the river’s edge. But for sections with a resident beaver, “essentially, the plants don’t know a fire is happening.” The channels dug by beavers acted like irrigation channels, said Fairfax, keeping vegetation too wet to burn, even during drought. In all, stretches of river without beavers lost 51% of their vegetation greenness, compared with a 19% reduction for sections with beavers.
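The “greenness” measured from Landsat’s red and near-infrared bands is characteristically the normalized difference vegetation index (NDVI), although the article does not name the exact metric. A minimal sketch of such a before-and-after comparison, using made-up reflectance values rather than the study’s data:

```python
# Sketch of the greenness comparison described above, assuming the standard
# normalized difference vegetation index (NDVI) computed from Landsat red
# and near-infrared (NIR) reflectance. All numbers here are illustrative.

def ndvi(nir, red):
    """NDVI ranges from -1 to 1; lush vegetation is typically above 0.3."""
    return (nir - red) / (nir + red)

def greenness_loss(ndvi_before, ndvi_after):
    """Fractional loss of greenness between two observations."""
    return (ndvi_before - ndvi_after) / ndvi_before

# Hypothetical riparian pixels: (nir, red) reflectance before and after a fire.
undammed_before = ndvi(nir=0.45, red=0.05)   # lush vegetation
undammed_after  = ndvi(nir=0.25, red=0.15)   # burned to the river's edge
dammed_before   = ndvi(nir=0.45, red=0.05)
dammed_after    = ndvi(nir=0.40, red=0.06)   # kept wet by beaver channels

print(f"loss without beavers: {greenness_loss(undammed_before, undammed_after):.0%}")
print(f"loss with beavers:    {greenness_loss(dammed_before, dammed_after):.0%}")
```

In the study, this kind of comparison across five fires yielded the 51% versus 19% figures quoted above.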

Joseph Wagenbrenner, a research hydrologist at the U.S. Forest Service who was not involved with the research, said that protecting the vegetation around rivers can help prevent problems downstream. Contaminants and sediment can clog rivers right after a fire, degrading water quality and threatening life. He said the work could be important for scientists’ efforts to reduce wildfire’s negative impacts.

Fairfax presented the research at AGU’s Fall Meeting 2019 in San Francisco, Calif. She also created a stop-motion animation of one little beaver’s influence during a burn.


—Jenessa Duncombe (@jrdscience), News Writing and Production Fellow

Atmospheric Rivers Trigger Heavy Snowmelt in Western USA

Fri, 12/13/2019 - 12:30

A changing climate poses multiple questions concerning the future fate of snowpack, its contribution to runoff and extreme flooding, and the seasonal distribution of water resources. Chen et al. [2019] scrutinize historical interactions of precipitation with snowpack in the western United States, specifically exploring how they affect regional runoff generation.

Atmospheric rivers, episodes of strong atmospheric water vapor transport into coastal areas, are found to be important contributors to snowmelt events and to significant runoff and flooding in the Pacific Northwest. Because atmospheric rivers are projected to change in the future, this study contributes to the understanding of their potential role in a warmer climate and their likely impacts on water resources and flooding in the western United States.

Citation: Chen, X., Duan, Z., Leung, L. R., & Wigmosta, M. (2019). A framework to delineate precipitation‐runoff regimes: Precipitation versus snowpack in the western United States. Geophysical Research Letters, 46. https://doi.org/10.1029/2019GL085184

—Valeriy Ivanov, Editor, Geophysical Research Letters

Hurricanes Hit Puerto Rico’s Mangroves Harder Than Florida’s

Thu, 12/12/2019 - 22:29

Puerto Rico’s mangrove forests sustained an average of 3 times more damage during Hurricane Maria than southern Florida’s mangrove forests did during Hurricane Irma. Both 2017 storms sheared the tops off mangrove trees in the regions’ coastal ecosystems, but Puerto Rico’s mangroves experienced a wider range of damage, possibly because of the island’s mountainous terrain.

The forests “both saw higher damages as wind speed increased, but the magnitudes of those damages between Florida and Puerto Rico differ greatly,” said Vivian Griffey, lead researcher and a master’s student at the University of Washington in Seattle. “The degree to which those damages occur within the given wind speed classes differs between the different areas.”

Not only did Puerto Rican mangroves lose more height on average, but there was also more variation in height loss in Puerto Rico than in Florida, she said. This variation got the researchers thinking that hurricane wind speed was not the only factor at play. Griffey will present this research on 13 December at AGU’s Fall Meeting 2019 in San Francisco, Calif.

From the Top

“Mangroves provide a number of different ecosystem services,” Griffey said. “One of the big ones that especially is important in Puerto Rico is their ability to buffer against hurricane storm surges.” Mangroves also store a lot of carbon per area and are a nursery habitat for tropical fish species. “They’re doing a lot of work for us,” she said.

In 2017, three major hurricanes struck the Caribbean and southeastern United States, causing damage from which residents are still recovering. The storms, particularly Maria and Irma, also damaged coastal ecosystems including saltwater mangrove forests.

The Jobos Bay mangrove forest in Puerto Rico covered nearly 3,000 acres (12 square kilometers) before Hurricane Maria (left). The upper layers of that forest were sheared off by the storm (right), but the damage to this forest was relatively minor compared with the damage to mangroves on the eastern side of the island. Credit: Vivian Griffey

The researchers wanted to map the damage done to the mangrove forests in each area: The two regions have the same species of mangroves and were struck by strong hurricanes around the same time. The team used a NASA satellite, Goddard’s Lidar, Hyperspectral and Thermal Imager (G-LiHT), to measure the heights of the mangrove trees in southern Florida and in several locations in Puerto Rico before and after the hurricanes.

The team used lidar to measure the heights of mangrove trees before (top) and after (bottom) Hurricane Maria. These two slices come from Humacao in eastern Puerto Rico, just north of where the hurricane made landfall. Credit: Vivian Griffey

“On average, there are far more damages in Puerto Rico than in Florida,” Griffey said. “In Florida we saw, on average, an 11% height loss. Whereas in Puerto Rico we saw, on average, a 33% height loss, which extended up to about 65% in some of the sites. Whereas in Florida, the largest percentage loss that we saw was only 17%.”
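The percentages Griffey cites are mean fractional canopy height losses between pre- and post-storm lidar measurements. A sketch of that calculation, using hypothetical transect heights chosen only to illustrate (they roughly reproduce the 11% and 33% averages):

```python
# How the quoted height-loss percentages can be computed: mean fractional
# canopy height change between paired pre- and post-storm lidar measurements.
# The transects below are hypothetical, not the study's G-LiHT data.

def mean_height_loss(before, after):
    """Mean fractional height loss across paired canopy height measurements."""
    losses = [(b - a) / b for b, a in zip(before, after)]
    return sum(losses) / len(losses)

# Canopy heights in meters along a transect, before and after a hurricane.
florida_before = [10.0, 12.0, 11.0, 9.0]
florida_after  = [ 9.0, 10.5, 10.0, 8.0]
pr_before      = [14.0, 15.0, 13.0, 12.0]
pr_after       = [ 9.0, 10.0,  9.0,  8.0]

print(f"Florida:     {mean_height_loss(florida_before, florida_after):.0%} height loss")
print(f"Puerto Rico: {mean_height_loss(pr_before, pr_after):.0%} height loss")
```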

However, one thing that stood out as strange in the lidar data was that Irma and Maria had similar ranges and maximum wind speeds, yet the mangrove damage differed greatly between the two locations. The team also noticed a pattern to the damage in Puerto Rico.

“We saw the highest losses in Humacao, which is just north of where the storm made landfall,” Griffey said. “And then Jobos Bay saw the smallest amount of loss and the smallest range of loss as well.” Humacao is on the eastern coast of the island, and Jobos Bay is on the southwestern coast.

The researchers suspect that a combination of factors led to this pattern: Maria’s southeast-to-northwest trajectory, the counterclockwise rotation of northern hurricanes, and Puerto Rico’s mountainous central terrain.

“There’s very complex mountain topography in Puerto Rico,” Griffey said. “The way the winds interacted with that could potentially cause these patterns of damage we’re seeing on the eastern side.”

Mountains, Storm Surge, and Mangroves

Doug Morton, principal investigator on the project and chief of the Biospheric Sciences Laboratory at NASA’s Goddard Space Flight Center in Greenbelt, Md., said that storm history might be another factor that contributed to the different mangrove damages.

“Florida historically has seen more regular storm action, and Puerto Rico has had infrequent but very strong hurricanes that have impacted the island,” he said. “And so one of the curiosities—not finding the similar relationship between wind speed, tree height, and storm damage—may be that legacy effect or lack of recent storms making mangrove forests more vulnerable to damages from 2017 storms.”

There’s also the impact of storm surges to consider, Griffey said. Puerto Rico’s mountains mean that mangroves form a ring around the island, whereas Florida’s mangroves extend farther inland. As a result, storm surges can’t propagate inland in Puerto Rico like they can in Florida. Although the storm surge buffer is beneficial for Puerto Ricans and inland infrastructure, it leaves the coastal mangroves exposed to damaging winds.

“One of the patterns that pops right out in the poststorm data is this green ring around the smaller islands of mangroves in southwest Florida,” Morton said. “And those are shorter trees that were submerged during the storm. They were not only protected from damage, they actually retained their leaves, where many other parts of the coast of Florida and Puerto Rico had their leaves completely stripped.”

The team is continuing to explore the extent to which the regions’ geomorphologies, storm surge patterns, and hurricane histories may have factored into mangrove damage. Those causes and the patterns of damage they left behind will be key to figuring out how the mangroves will recover.

“You see [some] mangroves recovering in 3-4 years,” Griffey said. “So it’s not all doom and gloom. They may be bad at resisting it, but they may be good at coming back from it. And that we don’t really know yet.”

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

Location, Location, Location: The How-to’s of Asteroid Sampling

Thu, 12/12/2019 - 19:03

Update 12 December 2019: OSIRIS-REx’s primary sample site will be Nightingale and its secondary site will be Osprey. The team announced this at a press conference at AGU’s Fall Meeting 2019.

 

How do you pick the right spot on an asteroid to snag a handful of rocks and dust to bring back home? The mission team for NASA’s Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) spacecraft has been hard at work since July weighing the limited options on asteroid 101955 Bennu.

“It’s quite a bit more rugged than what we anticipated,” said OSIRIS-REx deputy principal investigator Heather Enos. “When we first started to get our images back, we realized we were going to be hard-pressed to find real estate that provided a big enough site that met the criteria for us to be able to safely TAG.” (TAG is the team’s shorthand for touch-and-go.) Enos is a planetary scientist at the University of Arizona’s Lunar and Planetary Laboratory in Tucson and will discuss the selection process on 13 December during AGU’s Fall Meeting 2019.

How Do You Solve a Problem Like Bennu?

OSIRIS-REx arrived at Bennu on 3 December 2018. Since then, the craft has completed multiple detailed scans, mapping out the plethora of boulders and craters that litter the surface. The mapping campaign, aided in large part by a citizen science project, counted every rock on the surface larger than about 10 centimeters in size.

There turned out to be many more “large” rocks than initially expected, which presented a twofold problem. First, the boulders are a navigational hazard for the spacecraft’s approach and departure during the touch-and-go maneuver. The team needed to adjust its target landing radius from 25 meters across to just 10 meters, akin to striking the bull’s-eye of a dartboard.
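To put the dartboard comparison in numbers: shrinking the target radius from 25 to 10 meters cuts the allowable touchdown area by the square of the radius ratio, a factor of 6.25.

```python
import math

# The shrinking target described above: reducing the landing radius
# cuts the allowable touchdown area quadratically.
r_planned, r_actual = 25.0, 10.0  # meters

area_planned = math.pi * r_planned**2   # roughly 1,963 square meters
area_actual  = math.pi * r_actual**2    # roughly 314 square meters

print(f"area shrank by a factor of {(r_planned / r_actual) ** 2:.2f}")  # 6.25
```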

Second, the craft’s arm is designed to pick up material 2 centimeters or smaller in size. That material seems to be in short supply. Using the detailed map and model of the surface, the researchers digitally removed every boulder larger than 15 centimeters to locate spots that might have smaller material.

The slope of the land factored into the safety of approach and touchdown, and the reflectivity of the material factored into the likelihood that the yet-unresolved material is grabbable. The researchers considered all of these aspects when they calculated the probability that a particular site will be the right spot on which to land.

“This is a very challenging mission, and it requires incredible integrative effort between the flight dynamics team, the spacecraft team, the science planning team, and the science team [itself],” Enos said. “It’s a more integrated process tactically than any other mission I’ve ever participated on and most that I am even aware of.”

The team announced its final four candidates on 12 August: Nightingale, Kingfisher, Osprey, and Sandpiper, named after native Egyptian birds.



Room for Adjustment

OSIRIS-REx will conduct more reconnaissance campaigns in 2020, Enos said, which will help the team resolve the smaller-grained material in each of the sites and refine the landing coordinates.

“We have our TAG coordinates that we specify, but we will have the ability to tweak those a little bit by perhaps a meter,” Enos said. “Once we get the higher-resolution data, we can…slightly adjust where within our TAG site we’re going to target to an area that we think is the best and has particles that are 2 centimeters or less in the most abundance possible.”

OSIRIS-REx is scheduled to attempt sample retrieval from Bennu in late 2020 and to return to Earth with its cache in 2023. The mission aims to bring home 60 grams or more of asteroid material for extensive laboratory study.

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

Celebrating the 2019 Class of Fellows

Thu, 12/12/2019 - 15:40

AGU president-elect Susan Lozier presented the newly elected class of Fellows at AGU’s Fall Meeting 2019 Honors Ceremony, held 11 December in San Francisco, Calif. These individuals were recognized for their exceptional contributions to Earth and space science through a breakthrough, discovery, or innovation in their field. Please join us in congratulating our 62 colleagues who have joined the AGU College of Fellows!

A brief statement of the achievements for which each of the 62 fellows was elected is provided below.

 

Zuheir Altamimi

For developing the International Terrestrial Reference Frame, the foundation for measuring motions of Earth’s surface, sea level, and ice sheets.

 

Ronald Amundson

For pioneering the use of isotopes in the study of soils for interpreting land surface biogeochemistry and paleoclimate.

 

Jonathan L. Bamber

For pioneering satellite remote sensing in glaciology and building bridges to other disciplines of the geoscience community.

 

Barbara A. Bekins

For groundbreaking contributions in subsurface contaminant hydrology, the effects of fluids on plate boundary faults, and induced earthquakes.

 

Jayne Belnap

For outstanding research in desert soil systems and their response to environmental and anthropogenic stresses.

 

Thomas S. Bianchi

For providing molecular-level detail and underlying mechanisms of the burial, transformation, and flux of carbon in dynamic coastal ecosystems.

 

Jean Braun

For his unselfish spirit and seminal contributions to our understanding of the complex coupling between Earth’s topography, tectonics, and climate.

 

Ximing Cai

For forging a new science of hydrologic change accounting for human interaction and using it to advance water resources management.

 

Ken Carslaw

For outstanding creativity in aerosol climate modeling.

 

Benjamin Fong Chao

For outstanding contributions to the field of global geodesy with applications to hydrology, oceanography, and dynamics of Earth’s interior.

 

Patrick Cordier

For groundbreaking work using microscopy and simulation to understand mineral plasticity and its applications to seismology and geodynamics.

 

Rosanne D’Arrigo

For insightful, rigorous, and original contributions to the development of high-resolution paleoclimatology, particularly dendroclimatology.

 

Eric A. Davidson

For advancing scientific understanding of soil nitrogen and carbon cycles that improves predictions of how they are altered by global environmental change.

 

Gert J. de Lange

For elegant contributions elucidating nonsteady state diagenetic processes that improve the interpretation of marine sedimentary records.

 

Andrew E. Dessler

For creative and incisive studies of the influences of water and clouds in the climate system.

 

Michele K. Dougherty

For the study of outer planet systems.

 

Joseph R. Dwyer

For key contributions to understanding energetic radiation processes in our atmosphere and establishing the field of high-energy atmospheric physics.

 

James Farquhar

For innovations in isotope geochemistry that transformed our understanding of the evolution of Earth and life.

 

Mei-Ching Hannah Fok

For profound advancements in understanding the coupled geospace system during magnetic storms.

 

Piers Forster

For outstanding contributions to the development of knowledge on radiative forcing, Earth’s energy balance, and climate sensitivity.

 

Christian France-Lanord

For developing and implementing geochemical tools to resolve tectonic controversies and to constrain rates of organic carbon burial and of erosion.

 

Antoinette B. Galvin

For exceptional contributions to our understanding of the properties of the solar wind, its solar sources, and its structure in the heliosphere.

 

Peter R. Gent

For fundamental contributions to the understanding of the role of the ocean in the climate system and to its representation in Earth system models.

 

Taras Gerya

For fundamental contributions to our understanding of lithospheric and mantle dynamics from a planetary evolution perspective.

 

Dennis Arthur Hansell

For transformative insights into the biogeochemistry of marine dissolved organic matter and assessment of ocean carbon cycling.

 

Ruth A. Harris

For outstanding contributions to earthquake rupture dynamics, stress transfer, and triggering.

 

Robert M. Hazen

For impactful, sustained, and creative data science discoveries in mineral science and mineral evolution and for launching a new era to study Earth’s history.

 

Kosuke Heki

For breakthrough discoveries and original research in geodetic science that have led to fundamental advances in our understanding of geodynamics.

 

Karen J. Heywood

For world-leading, innovative research on ocean physics, bottom water formation and export, and their impact on global climate.

 

Russell A. Howard

For fundamental contributions to understanding solar coronal mass ejections and remote sensing observations of the heliosphere.

 

Alan Jones

For fundamental studies of the solid Earth using electromagnetic methods and relating them to the broader Earth sciences.

 

Kurt O. Konhauser

For pioneering research at the intersection of biology and geology, giving us vital new ways to ponder Earth’s past relationships with life.

 

Sonia M. Kreidenweis

For elucidating aerosols’ role in climate and visibility by quantifying their hygroscopic growth and cloud condensation/ice nuclei activity.

 

Kitack Lee

For transformational discoveries of the impacts of anthropogenic carbon and nitrogen inputs to the ocean.

 

Zheng-Xiang Li

For insights into restoring pre-Pangean supercontinents and their connections to mantle superswells, true polar wander, and snowball Earth.

 

Jean Lynch-Stieglitz

For developing new methods for reconstructing past ocean circulation and for advancing understanding of late Quaternary deepwater and climate variability.

 

Kuo-Fong Ma

For fundamental advances in earthquake source physics using geophysical and geological data.

 

Reed Maxwell

For outstanding contributions toward the advancement of integrated hydrological simulation across scales.

 

John W. Meriwether

For fundamental contributions to understanding the thermal and dynamical structures in Earth’s upper atmosphere.

 

Son Van Nghiem

For remote sensing innovations leading to breakthroughs in Earth science research and applications to hazard mitigation ranging from fire to ice.

 

Yaoling Niu

For stimulating a new understanding of the relationships between mantle evolution and melt generation at oceanic plate boundaries.

 

Thomas Howell Painter

For breakthrough contributions to the understanding of snow-related runoff generation processes and their measurement in mountainous environments.

 

Beth L. Parker

For fundamental advancement in characterizing contaminant mobility in fractured sedimentary rocks.

 

Ann Pearson

For pioneering and transformative contributions concerning the origins and paleoceanographic significance of microbial biomarkers.

 

Graham Pearson

For sustained contributions on the age, origin, and evolution of the continental upper mantle.

 

Lorenzo Polvani

For fundamental contributions to the understanding of the dynamics of tropospheric-stratospheric interactions and their role in climate change.

 

Peter W. Reiners

For validating the U-Th/He thermochronology technique and using it creatively to solve key geological problems.

 

Yair Rosenthal

For fundamental contributions to the development of deep-ocean paleothermometry and understanding of Pleistocene and Cenozoic climate changes.

 

Osvaldo Sala

For integrative research on biodiversity and ecosystem functioning with sustained impact to science and society.

 

Edward “Ted” Schuur

For being a global leader in research that has fundamentally contributed to understanding the vulnerability of permafrost carbon to climate change.

 

Sybil Putnam Seitzinger

For fundamental research on the human impacts on the biogeochemistry of the Earth system and for inspiring policy solutions.

 

Toshihiko Shimamoto

For outstanding contributions to fault and earthquake mechanics, in particular to mechanics of faulting at seismic slip rates.

 

Adam Showman

For groundbreaking work on the dynamics of planetary atmospheres, inside and outside the solar system, and the geophysics of icy satellites.

 

Alex Sobolev

For groundbreaking work on magmatic melt inclusions and phenocrysts to unravel the nature and source of compositions of mantle-derived melts.

 

Carl I. Steefel

For pioneering and cross-disciplinary work on fluid-rock systems through innovative reactive transport model development and application.

 

John Suppe

For seminal contributions in structural geology and tectonics, including fold-fault kinematics and thrust belt and strike-slip fault mechanics.

 

Karl E. Taylor

For improving our ability to evaluate and intercompare climate models and for advancing understanding of climate forcings, responses, and feedbacks.

 

Meenakshi Wadhwa

For outstanding contributions to the understanding of solar system chronology and the chemistry of Mars.

 

Michael J. Walter

For advances in understanding the formation of Earth and its core, the petrology of the mantle, and the phase relationships of the deep Earth.

 

John S. Wettlaufer

For fundamental contributions to understanding the physics of ice from molecular to geophysical, climatic, and planetary scales.

 

Chunmiao Zheng

For elucidating solute transport mechanisms in heterogeneous porous media and developing codes for analysis of groundwater solute transport.

 

Tong Zhu

For exceptional contributions to advancing fundamental atmospheric chemistry and to assessing impacts of megacity air pollution on human health and climate.

 

Scientists and Activists Examine Need for Climate Action

Thu, 12/12/2019 - 15:39

For Varshini Prakash, the climate crisis “is obviously very depressing” and “terrifying with the timeline that we’re working on” to curb greenhouse gas emissions.

However, Prakash isn’t letting that stop her as she works to organize and mobilize youth and others to stop climate change. She is the cofounder of the Sunrise Movement, an organization that advocates for climate action and supports the Green New Deal initiative.

She spoke at a 9 December session at AGU’s Fall Meeting 2019 in San Francisco, Calif., on aligning U.S. energy policy with a 1.5°C climate limit above preindustrial levels. The session included climate scientists and activists.

A Role for Scientists in Climate Action

“We need a large, vocal, active base of support, and scientists are a critical part of that constituency,” said Prakash, who first became involved in climate politics as an undergraduate studying environmental science at the University of Massachusetts Amherst. “I have seen so many badass scientists over the last few years stepping up into real leadership, calling on action, refusing to be quiet.”

Her message for the scientific community is that “we have no more time,” Prakash said. “Putting a 20-, 30-, or 40-year career ahead of the future of human civilization: I understand why people do it. At the same time, I want people to really grapple with what it is that we are up against right now and the sheer demise that we could fall into if we don’t take adequate action in the next 5 years. That does not feel arbitrary or sort of off in the distance to me. It feels like right now.”

Prakash added that she understands that people need to make a living, and she isn’t saying to give that up. “I’m just saying, take the necessary risks, take the appropriate risks. And don’t be afraid because you’re worried that someone will be mean to you on Twitter or someone might criticize you publicly.”

Climate scientist Michael Mann, a professor of atmospheric science at Pennsylvania State University in University Park, said that scientists shouldn’t have to apologize for being advocates “for a fact-based, objective discourse over what is arguably the greatest threat that we face as a civilization.”

Fossil Fuel Interests

Mann also cautioned about the shifting strategies of fossil fuel interests as the science and impacts of climate change become undeniable.

“We’re seeing an evolution from denial to what I would call deflection, division, doomism, and delay” by fossil fuel interests, Mann said.

“We have to recognize the evolving nature of the campaign to block progress on climate change. Just because outright denial of the evidence seems to be waning doesn’t mean that there isn’t still a concerted campaign to ensure the one thing that fossil fuel interests care about: that we don’t act on climate, that we do not decarbonize our economy.”

Other speakers at the session also warned of the tactics of fossil fuel interests and the need to move on from fossil fuels. Prakash, for example, said that a misinformation and denialism campaign “has stalled progress on this issue for 40 years, and not because people are genuinely confused about the science, [but] because there is money to be made off of that confusion.”

Georgia Piggot, a staff scientist for the Stockholm Environment Institute (SEI), presented findings from the institute’s recent report “Closing the Fossil Fuel Production Gap” that show that countries are on track to produce much more coal, oil, and gas by 2030 than is consistent with the goals of the Paris climate agreement.

In line with the SEI report, Kelly Trout, a senior research analyst at Oil Change International, said a report by her group and others about U.S. oil and gas expansion indicates that “over exactly the period in which the world needs to rapidly decarbonize, what we see is [that] the U.S. would be unleashing the world’s largest burst of new carbon from oil and gas.”

A Just Transition

Kassie Siegel, senior counsel and Climate Law Institute director at the Center for Biological Diversity, said the United States doesn’t need to continue along this path. Citing a report by the center, Siegel said the next president could declare a national climate emergency and halt fossil fuel lease sales and permits, among other measures.

Siegel and the others also emphasized the need for a “just transition” to protect extraction communities and workers when mines and fossil fuel infrastructure shut down.

Emily Grubert, an assistant professor of civil and environmental engineering at the Georgia Institute of Technology, said miners and other fossil fuel workers need help in the transition and should not be vilified for their hard work. “People are starting to come to terms with the actual effects of climate change,” Grubert said, “and it’s very easy to look around for a villain.”

Mann said that fossil fuel workers should not be conflated with coal barons and other vested interests “who have profited greatly off of essentially the suffering of the people who have worked for their industry.”

—Randy Showstack (@RandyShowstack), Staff Writer

Human Brains Have Tiny Bits of Magnetic Material

Thu, 12/12/2019 - 15:38

Scientists have mapped magnetic materials in human brains for the first time, revealing that our brains may selectively contain more magnetic material in their lower and more ancient regions.

Researchers measured brain tissue from seven specimens donated in Germany for signs of magnetite, Earth’s most magnetic mineral. Scientists have known that other types of life, such as special kinds of bacteria, contain magnetite. But the distribution of magnetite in human brains has been unclear because no systematic study had mapped the mineral in human tissue before.

The results could shine a light on why humans have magnetite in their brains to begin with, which remains an open question. Stuart Gilder, the lead author of the study and a scientist at Munich University, said that their results show that magnetic particles exist in the “more ancient” part of the brain. “We thought from an evolutionary standpoint, that was important,” Gilder said.

Magnetic Minds

Scientists discovered the first hints of magnets in human brains in 1992: A paper reported that tiny crystal grains, some barely wider than a DNA strand, were found in human brain tissue from seven patients in California. The crystals looked just like the tiny magnets in magnetotactic bacteria that help them navigate along geomagnetic field lines in lakes and saltwater environments.

Scientists are not sure why or how magnetite gets into human brains. Magnetite could serve some physiological function, such as signal transmission in the brain, but scientists are only able to speculate. A further mystery is how magnetite arrives in the brain in the first place: One study of the frontal cortex of 37 human brains suggests that we breathe in magnetite from the environment. But other researchers, like Gilder, think magnetite comes from internal sources.

From Rocks to Brains

To find out some answers, Gilder and his team dissected seven brains and measured their magnetic strength and orientation. The brains had been preserved in formaldehyde since the 1990s, when relatives and guardians of the deceased donated them to science. The brains came from four men and three women between the ages of 54 and 87.

Gilder typically studies rocks in his lab to ascertain their geologic history, but his latest study was not so different, he says. “I could essentially apply everything that I do to rock to brains,” Gilder said. The scientists cut the preserved brains into 822 pieces and ran each sample through a magnetometer, a machine used to measure records of Earth’s magnetic field in rocks.

When Gilder studies a rock, he measures its magnetism in two steps: First, he tests the rock’s natural magnetic strength, which will typically be low because rocks are bad at creating orderly magnets. (Even if the rock contains magnetic particles, their dipoles point in random directions, potentially canceling each other out.)

Second, Gilder uses an electromagnet to apply a strong magnetic field to the sample, and this aligns the tiny magnetic particles so that they all face the same direction. When he tests their magnetic strength a second time, he sees the full strength of the magnetic signal from the rock. “If I measure something that is more magnetic after I’ve applied a very big magnetic field, that’s proof that this material contains magnetic recording particles,” Gilder said.
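The logic of this two-step comparison can be illustrated with a minimal sketch (not the study's actual analysis code; the grain count and the random-orientation model are illustrative assumptions): when many identical dipoles point in random directions, their vector sum grows only like the square root of their number, whereas a saturating field aligns them so the net moment equals the full count.

```python
import math
import random

random.seed(42)

# Hypothetical illustration: N identical magnetic grains, each a unit dipole.
# In the natural state the dipoles point in random directions, so their
# vector sum largely cancels. A strong applied field aligns them all.
N = 10_000

natural = [0.0, 0.0, 0.0]
for _ in range(N):
    # Draw a uniformly random unit vector in 3-D.
    z = random.uniform(-1.0, 1.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    natural[0] += r * math.cos(phi)
    natural[1] += r * math.sin(phi)
    natural[2] += z

natural_moment = math.sqrt(sum(c * c for c in natural))  # random-walk sum, ~sqrt(N)
saturated_moment = float(N)                              # all dipoles aligned

print(f"natural remanence   ~ {natural_moment:.0f}")
print(f"saturated remanence = {saturated_moment:.0f}")
print(f"amplification       ~ {saturated_moment / natural_moment:.0f}x")
```

The jump from the first measurement to the second is what signals that magnetic recording particles are present in the sample.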

Gilder applied the same two-step technique to the brain samples. The comparison revealed that the human brain had a detectable magnetism after a magnetic field had been applied to the samples. The results showed that magnetite was in “almost every piece” of the specimens, said Gilder.

“The Exact Same Pattern”

The lower in the brain you go, the stronger the magnetic signal grows. Levels are particularly high in the brain stem. The study found some difference in magnetite in the left and right hemispheres of the brain. Credit: Gilder et al., 2018, https://doi.org/10.1038/s41598-018-29766-z

The latest study reveals that the lower regions of the human brain, including the cerebellum and the brain stem, had 2 or more times the magnetic remanence of the upper regions of the brain. The upper regions of the brain compose the cerebrum, which is responsible for reasoning, speech, and other tasks, whereas the lower regions handle muscle movement and automatic functions like heart rate and breathing.

Gilder said that the pattern emerged in each of the seven brains, and it showed no difference depending on the person’s age or sex. The brain stem had consistently higher magnetization than any other region, although only five of the seven brains had brain stems intact.

Joseph Kirschvink, a professor at the California Institute of Technology in Pasadena not involved in the study, said that the work “confirms the biological origin of the brain magnetite.” Kirschvink said that the results in the study closely matched research he had performed in his lab, but the latest research has “100 times more data.”

The scientists took pains to limit contamination, cutting the samples with a ceramic knife and staging the experiment inside a magnetically shielded room in a forest far from urban pollution. They removed samples with high levels of natural magnetic strength that could have been polluted with fragments of the saw cutting into the donors’ skulls many years ago. Even with the potentially contaminated samples removed, the data still showed an anatomical pattern.

Gilder presented the research this month at AGU’s Fall Meeting 2019 in San Francisco, Calif.

—Jenessa Duncombe (@jrdscience), News Writing and Production Fellow

The Science and Policy of Climate Action

Thu, 12/12/2019 - 01:45

Former New York City mayor Michael Bloomberg and former California governor Jerry Brown engaged in a discussion about the future of climate action at AGU’s Fall Meeting 2019 on Wednesday, 11 December. AGU executive director and CEO Chris McEntee joined them on stage to talk about what challenges and solutions are addressed by America’s Pledge, the climate plan Bloomberg and Brown first announced in 2017.

“When the AGU headquarters building was built 25 years ago, net zero buildings were just a dream,” said Bloomberg. “But the years since then have shown that when scientific research meets smart public policy, anything is possible.”

Earlier this week, the two leaders presented a new report from their initiative, titled Accelerating America’s Pledge, at the United Nations’ 25th Conference of the Parties (COP25) in Madrid, Spain. The report provides analysis on several strategies to significantly reduce carbon emissions that could potentially lead to full decarbonization in the United States by 2050. The analysis was led by the University of Maryland’s Center for Global Sustainability and the Rocky Mountain Institute, along with the World Resources Institute and CDP.

“America’s Pledge is not just a statement of support; it’s a way to uphold our promise to the world” that the United States made when it signed on to the Paris Agreement, said Bloomberg. In June 2017, President Donald Trump announced his intention to withdraw the United States from the Paris Agreement on 1 June 2020. (Bloomberg announced his Democratic candidacy for president on 24 November.) Brown added that around 4,000 governments and businesses have pledged to their initiative to take climate action.

The report analyzes the potential of a “bottom-up” strategy focusing on climate action led by cities, states, and businesses. It suggests that ambitious action by local leaders could reduce U.S. greenhouse gas emissions by up to 37% below 2005 levels by 2030. The idea is to combine that movement with “aggressive new federal engagement,” per the report, that “would lay the foundation for a net zero emissions economy by mid-century” and meet the goals of the Paris Agreement and the recent Intergovernmental Panel on Climate Change report, Global Warming of 1.5 °C.

Bloomberg called on scientists to convince the public to call their lawmakers and “hold their feet to the fire.” Both leaders insisted that the climate crisis is much more a political problem than a scientific one, and scientists should use their voices while providing their expertise. Bloomberg added that organizations like AGU have power “to lead this country” through the great credibility of their scientific membership. On Monday, AGU released its updated position statement on climate, calling for scientists to engage with policy makers to take immediate and coordinated action on the climate crisis.

McEntee noted that “climate doesn’t know geography,” asking the two leaders whether their pledge for America would address global challenges. Brown suggested that more partnerships like the California–China Climate Institute, which he founded at the University of California, Berkeley, would create partnerships and enable better research. The United States also needs to provide better scientific funding in federal budgets for research and to raise the next generation through proper scientific education. “Scientists have to get the word to the politicians that the lifeblood of the future is new knowledge,” said Brown.

When asked by McEntee whether they were optimistic about the future, Brown responded, “The future is uncertain. The trends don’t look so good. But we have enormous capacity. And if we look to the past, the stuff that’s happening now was unimaginable.”

Bloomberg added, “We have to be opportunistic and aggressive. Everybody here should be calling their congressman.”

—Heather Goss (@heathermg), Editor in Chief, Eos

A Modern Manual for Marsquake Monitoring

Wed, 12/11/2019 - 19:38

At 5:54 p.m. Greenwich Mean Time in Potsdam, Germany, a horizontal pendulum suddenly started shaking. The pendulum scratched out spikes in ground movement for a few hours before settling down.

All of this sounds perfectly typical for a seismograph, but in fact, the event was quite extraordinary: The date was 17 April 1889, and this was the first recorded teleseismic event, a magnitude 5.8 earthquake that had shaken Tokyo, Japan, a little more than an hour prior.

Now, 130 years after that Tokyo quake kicked off the modern science of terrestrial seismology, a more sophisticated seismometer is retracing those first shaky steps 225 million kilometers and a planet away.

“We are at the birth of seismology on Mars,” said Philippe Lognonné, a planetary seismologist at Institut de Physique du Globe de Paris in France. Lognonné is the principal investigator of the Seismic Experiment for Interior Structure (SEIS) instrument on NASA’s Interior Exploration using Seismic Investigations, Geodesy and Heat Transport (InSight) lander.

“The goal of InSight is really to do the seismology from the beginning of last century,” Lognonné said. “We do this first step of seismology, but of course with a much better instrument than one century ago, with much better tools, theory, [and] computers. But the data, in some ways, are like those from one century ago.”

Building on the Past

On Mars, as on Earth, seismometers can use the shaking of the ground to probe the interior structure of the planet.

“Since the dawn of the age of planetary exploration, seismometers have been considered among the key instruments that you need to really understand how a planet formed and evolved,” explained Renee Weber, a lunar and planetary scientist with the InSight team at NASA’s Marshall Space Flight Center in Huntsville, Ala. “We do that by analyzing energy recorded at the station from naturally occurring seismic events and from meteorite impacts.”

InSight landed just north of where the Curiosity rover is exploring and south of the Viking 2 lander, which carried the only other working seismometer on Mars. White text identifies successful missions, gray text identifies failed missions, and blue text identifies future missions as of February 2019. Credit: image, NASA/JPL/USGS; map, Emily Lakdawalla, CC BY-NC-SA 3.0

SEIS is not the first seismometer launched into space or even the first seismometer on Mars. The Apollo 12, 14, 15, and 16 missions carried active seismometers to the Moon in the early 1970s. NASA’s Viking 1 and Viking 2 landers brought seismometers to Mars a few years later. And the Soviet Venera 13 and Venera 14 landers placed short-lived seismometers on Venus in the early 1980s.

The seismic network installed by Apollo astronauts discovered a variety of moonquakes, and the Venera instruments detected possible Venusian microquakes. But the Viking seismometers, unfortunately, sent back only one possible quake and a lot of noise.

“One of them didn’t work. The other one, for a variety of reasons, didn’t really detect anything that we could definitively say was a marsquake,” Weber said. “That was primarily because the instrument itself never came in contact with the ground, but rather it was just mounted to the deck of the lander. And so all that it ever recorded was the wind blowing.”

The Viking seismometers put an upper limit on the level of Mars’s seismic activity, InSight principal investigator Bruce Banerdt told Eos. Before InSight launched, “all we knew was that Mars was almost certainly less active than the Earth, which is not a very strong constraint. And we knew how active the Moon was. Given their relative sizes and relative levels of geologic and volcano-tectonic activity over the last few billion years, we would expect Mars to be more active than the Moon,” Banerdt said.

More active than the Moon and less active than Earth is a very wide seismic range, Lognonné added, and is not very informative. Using measurements of tides caused by the Sun and Mars’s moon Phobos, “we know that [Mars] has a core, but everything else inside the planet is very much unknown.”

“Honestly, we knew very little about the Martian interior,” Lognonné said.

Designing the Right Tool for the Job

As they designed, built, and tested SEIS, the instrument team knew it had to overcome the shortcomings of Viking seismometers. “We needed to be able to convince ourselves, and certainly convince NASA, that we were actually going to see something when we got there,” Banerdt said.

InSight’s robotic arm placed SEIS on the Martian surface on 19 December 2018 (left) and then covered it with a thermal and wind shield (right). Credit: NASA/JPL-Caltech

“The first challenge is to have only one seismometer, compared with an Earth station where we actually have a network of seismometers,” said Charles Yana, the SEIS project manager and an engineer at the Centre National d’Etudes Spatiales in Paris, France. A seismic network measures the same quake event from multiple locations and can provide a more precise and accurate quake profile than any single seismometer can. “That’s why [SEIS] has to be very precise and working very well and has to be tuned very carefully. It is very sensitive, actually, in comparison to seismometers on Earth.”

To detect likely weak marsquake signals, placing the instrument on the ground was a must, as was shielding it from the wind and protecting it from Mars’s 60°C daily temperature swings, Yana explained. In fact, protecting SEIS from any atmosphere at all was key to mission success. InSight’s initial 2016 launch date was pushed back to 2018 because there was a leak in SEIS’s vacuum chamber. “Just air inside that chamber can interfere with the measurement,” Weber said.

SEIS “is almost comparable to the best broadband seismometer used on the Earth for seismic networks,” Lognonné said. “We have been able to detect at about 10 hertz displacement of the ground of the order of less than 5 picometers…which is a fraction of the size of an atom.”

“The engineering scope of what was done is just remarkable,” said Keith Koper, a terrestrial seismologist at the University of Utah in Salt Lake City. Koper directs a seismic network in Utah and is not involved with the InSight mission. “I know how hard it is, even on Earth, to get these really sensitive instruments out and get them properly working….They’re kind of finicky, sensitive things.”

Detecting Marsquakes for the First Time

InSight’s robotic arm placed a seismometer on Mars on 19 December 2018. Credit: NASA/JPL-Caltech

InSight launched to Mars in May 2018 with a fully functional seismometer aboard. It landed that November at an equatorial spot called Elysium Planitia, placed SEIS on the ground, and covered the instrument with a wind and thermal shield. That process took about a month. Then: calibration.

“The first 6 months of the mission were dedicated to perfectly determining the ambient noise around our seismometer,” Yana said. “Everything that is around our sensor creates noise and variations of seismic signals.”

Some of the “noise” SEIS measures is tiny ground shifts due to pressure and wind changes and weather phenomena like dust devils. Those signals, detectable by ultrasensitive SEIS, would be drowned out in the turbulence of Earth’s oceans and atmosphere, Banerdt said.

“We’re looking into a frequency band that’s really invisible on the Earth,” he said. “And on Mars during the quietest part of the evening we’re seeing signals that are a thousand times lower than anything that’s detectable on the Earth.”

Finally, 128 Martian days after landing, SEIS detected its first verifiable marsquake. The tremor was tiny, too small to pinpoint its origin or cause, but was distinctly different from tremors caused by wind. The recording of the sol 128 event (below) wowed the planetary science community and, of course, the project team.



“I have it on my iTunes. I pull it up every once in a while,” Banerdt joked. “We had a full room for the plenary session talking about it” at the Seismological Society of America conference in Seattle, Wash. “People just started to wrap their minds around the fact that, you know, here we have a whole new planet to explore with this incredible tool.”

Koper, who attended the 27th International Union of Geodesy and Geophysics General Assembly in Montreal, Canada, said that the session on Mars seismology was also standing room only. “They had this one beautiful seismogram that looked just like what we’d see on Earth.”

A second marsquake on sol 173, and a third on sol 235, revealed the variety of quakes that Mars has to offer.

“We are currently observing two families of quakes on Mars,” Simon Stähler, a seismologist at ETH Zurich, said in a statement about the second quake. “The first quake was a high frequency event more similar to a moonquake than we expected. The second quake was a much lower frequency, and we think this may be due to the distance.”

As a seismic wave travels through solid rock, that rock will attenuate more and more of the higher frequencies within the signal, Banerdt explained. That can help determine the distance between the quake epicenter and the seismometer.
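This frequency-dependent decay can be sketched with the standard attenuation relation, in which amplitude falls as exp(−πft/Q) for a wave of frequency f traveling for time t through rock with quality factor Q. The Q value and travel times below are illustrative assumptions, not InSight-derived numbers:

```python
import math

# Amplitude ratio A/A0 = exp(-pi * f * t / Q) for a wave of frequency f (Hz)
# after travel time t (s) through rock with quality factor Q.
# Q = 300 and the travel times are illustrative, not measured values.
def attenuation(f_hz: float, t_s: float, q: float = 300.0) -> float:
    return math.exp(-math.pi * f_hz * t_s / q)

for f in (0.5, 2.0, 8.0):
    near = attenuation(f, t_s=60.0)    # a nearby quake
    far = attenuation(f, t_s=600.0)    # a distant quake
    print(f"{f:>4} Hz: near {near:.3f}, far {far:.6f}")
```

Because the high-frequency content of a distant quake is almost entirely absorbed while low frequencies survive, the spectral shape of the arriving signal carries information about how far the wave has traveled.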

“It looks like there’s going to be enough meat there in the data to make, honestly, without any sort of hyperbole, some important first-order discoveries about the interior structure of Mars,” Koper added.

Trying to Understand Marsquakes

During the first 3 months of data collection, SEIS detected 21 seismic events that the team is certain were not caused by atmospheric phenomena. More than 100 other events are classified as “maybes.” The team will discuss these results in several sessions at AGU’s Fall Meeting 2019 on 12 and 13 December.

As the quakes started pouring in, the team realized they needed to develop a new ranking system for the events. Now “there’s a magnitude scale that’s been developed for Mars analogous to the magnitude scales we use on the Earth,” Banerdt said.

The new scale incorporates how a seismic wave changes as it travels from the epicenter to a seismometer station, changes that depend on the properties of the material that the wave travels through. “It’s really tied to the transmission properties of the planet that you’re looking at it on,” Banerdt said, which is why Earth’s magnitude scale doesn’t work for marsquakes.

The newly published data include three “Quality A” events with strong, clear signals well above the noise level. Data from two of the events were detailed enough to pinpoint their epicenters.

“They’re both in the same area,” Banerdt said, “a region called Cerberus Fossae.…There are recent lava flows in this area. There are recent water flooding events in this region. There’s faulting in this region. So this is an area that we had set aside as a place that we wanted to watch as we started accumulating events. And the first two locatable events that we’ve picked up are located in this region.”

Cerberus Fossae, seen here from orbit, was predicted to be a seismically active region on Mars on the basis of its crust fractures and evidence of recent surface flows. It has been pinpointed as the epicenter of at least two recent marsquakes. Credit: ESA/DLR/FU Berlin, CC BY-SA 3.0 IGO

The marsquakes also revealed a curious phenomenon: a resonance near a 2.4-hertz frequency that is not caused by the lander. The resonance appears as an unexpected increase in wave energy near that vibrational frequency but only for some quakes.

“We don’t know exactly where it comes from or why it’s there,” Banerdt added. “We don’t know why it gets excited or it doesn’t get excited, but we think that it has something to do with some kind of a resonance in the crustal layer that’s sensitive to being excited by seismic events. And this is something that’s kind of puzzling.”

Are marsquakes happening as often as expected? The InSight team is finding that the frequency of small seismic events looks surprisingly familiar.

“We’re seeing almost as many events as would be generated on the Earth if you took away all the plate boundaries and all the hot spot zones, just looking at what we call interplate seismicity,” Banerdt said. That’s a higher activity level than the team expected for small events.

Meanwhile, the instrument has detected fewer strong events than models predicted. “This is either telling us that Mars distributes its release of energy over different size quakes in a different way…or that we have some learning to do in identifying what’s a marsquake and what’s not,” he said.

Revealing Mars’s Seismicity

Two other instruments share a home with SEIS on board the InSight lander.

The Rotation and Interior Structure Experiment (RISE) is a precision tracker. RISE uses radio transmissions to measure the lander’s location relative to the Sun, trying to determine how fast Mars wobbles as it orbits the Sun. (Earth completes one “wobble” every 2 years.)

“This information will add to our knowledge of the size of Mars’ core, and helps us determine whether it is liquid or solid,” RISE principal investigator William Folkner says on the instrument’s website.

A simulation displays seismic waves from a marsquake as they move through different layers of Mars’s interior. Credit: NASA/JPL-Caltech/ETH Zurich/Van Driel

InSight’s heat probe, the Heat Flow and Physical Properties Package (HP3), is meant to measure the heat radiating out from Mars and determine the planet’s cooling rate. To do this, InSight has to drill a hole about 5 meters deep and place the probe within Mars’s crust. The drill, or mole, has experienced significant difficulties in getting down into the Martian soil.

“The hammer hammers it down, but it just bounces right back up again because there’s not enough friction on the sides of the mole to damp out that rebound,” Banerdt explained. The team hopes that InSight’s robotic arm can help the drill get a grip on the soil and keep digging. Weber called this “a lesson learned” for any future planetary drilling missions.

In the meantime, SEIS will continue to monitor Mars for seismic events and progressively build up a holistic view of the interior of Mars. “SEIS is working, and SEIS is working well,” Yana said. There are always small engineering tweaks that could be made using the most up-to-date technology, he said, but “if we needed to send the same instrument again on the next mission [to Mars], everyone would be very happy about it.”

It’s also possible that orbiters might contribute to Mars seismology, Weber said. She explained that imaging satellites orbiting Mars can spot impact events that might cause a quake. If SEIS simultaneously detected that quake, knowing precisely where it came from could, theoretically speaking, “help constrain the models that dictate how the seismic energy of that impact propagates through the planet.”

One hundred thirty years after the dawn of modern terrestrial seismology and 50 years after the advent of lunar seismology, Martian seismology has kicked off with a bang. “I think InSight is more of a grandchild of the evolution of terrestrial seismology than it is of the evolution of lunar seismology,” Weber said. “Because, unfortunately, since Apollo, we haven’t been back to the Moon.”

Terrestrial seismology has advanced in leaps and bounds since lunar seismology concluded in 1977, and SEIS was designed on the basis of those advancements. “And so really,” she said, “our next deployment to the Moon will learn from InSight more than it will learn from Apollo.”

—Kimberly M. S. Cartier, Staff Writer

Scientists Scramble to Collect Data After Ridgecrest Earthquakes

Wed, 12/11/2019 - 19:26

At 10:33 a.m. local time on 4 July 2019, a magnitude 6.4 earthquake struck California’s Mojave Desert near Ridgecrest (population 28,000). The next day, a magnitude 7.1 earthquake—roughly 11 times more powerful—occurred at 8:19 p.m. on a different fault in the same area.
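The "roughly 11 times more powerful" figure follows from the standard magnitude–energy relation, in which radiated seismic energy scales as 10^(1.5·M), so the ratio between two quakes depends only on their magnitude difference. A quick check (a sketch of the textbook scaling, not the USGS's own calculation):

```python
# Radiated seismic energy scales as 10**(1.5 * M) with magnitude M, so the
# energy ratio between two quakes depends only on the magnitude difference.
def energy_ratio(m_large: float, m_small: float) -> float:
    return 10.0 ** (1.5 * (m_large - m_small))

ratio = energy_ratio(7.1, 6.4)
print(f"M7.1 vs M6.4: ~{ratio:.0f}x more energy")  # ~11x
```

The same relation explains why one full magnitude step corresponds to about a 32-fold jump in released energy.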

“This was a unique set of events,” said Abhijit Ghosh, a seismologist at the University of California, Riverside. Usually, there’s one large earthquake followed by smaller aftershocks, he said. “This time we had a large, damaging earthquake immediately followed by an even larger, damaging earthquake.”

Ground shaking was felt across Southern California. The Did You Feel It? website from the U.S. Geological Survey (USGS) recorded over 40,000 reports of shaking for each of the large earthquakes.

While the Ridgecrest area was still ringing with aftershocks—over 3,500 were detected within a week and a half—researchers working in Southern California and beyond rushed to the epicenters of the earthquakes.

Over the next days, weeks, and months, personnel associated with academic institutions, the USGS, the California Geological Survey, the U.S. Navy, and other organizations would collect a wide range of seismological, geological, and geodetic data. “It was an all-hands-on-deck effort,” said Elizabeth Cochran, a seismologist at the USGS in Pasadena, Calif.

The measurements and observations, which are currently being analyzed, will shed light on earthquake-triggering mechanisms, the structure of seismic faults, and how surface rupture affects buried infrastructure such as gas pipes and sewer lines, scientists anticipate.

Credit: USGS

A Hurried Trip

Ghosh and five students arrived in Ridgecrest on the morning of 6 July with five seismometers stuffed in Ghosh’s white Chevy Suburban. Their hurried trip was guided by Omori’s law, which states that the rate of aftershocks falls off rapidly, roughly as the inverse of the time elapsed since the mainshock. “If you miss those first few days, you’re missing the lion’s share of the data,” said Ghosh.
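Omori's decay can be sketched numerically with the modified Omori law, n(t) = K/(c + t)^p. The parameter values below are illustrative choices, not fits to the Ridgecrest sequence:

```python
# Modified Omori law: aftershock rate n(t) = K / (c + t)**p at time t (days)
# after the mainshock. K, c, and p are illustrative, not Ridgecrest fits.
def omori_rate(t_days: float, K: float = 100.0, c: float = 0.1, p: float = 1.1) -> float:
    return K / (c + t_days) ** p

# Midpoint-rule integration of the rate gives expected aftershock counts.
def count(t0: float, t1: float, steps: int = 100_000) -> float:
    dt = (t1 - t0) / steps
    return sum(omori_rate(t0 + (i + 0.5) * dt) * dt for i in range(steps))

first_3_days = count(0.0, 3.0)
first_month = count(0.0, 30.0)
print(f"share of month-1 aftershocks in days 0-3: {first_3_days / first_month:.0%}")
```

With these illustrative parameters, roughly two thirds of the first month's aftershocks fall in the first 3 days, which is why field teams race to deploy instruments immediately.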

Over the next 2 weeks, Ghosh and his students installed 25 seismic stations around the Mojave Desert. Data from the stations, which will remain in place for 6 months, will shed light on the complex fault structure near the boundary between the Pacific and North American plates, said Ghosh.

Abhijit Ghosh sits in the Mojave Desert near Ridgecrest, Calif., with one of the 25 seismic stations he and his students installed immediately following the Ridgecrest earthquakes. Credit: Baoning Wu/UC Riverside

This area, part of the Eastern California Shear Zone, is well known for having scads of crosscutting faults. It’s fiendishly complicated compared with models that focus on one isolated fault, researchers agree.

“The traditional way of simulating ruptures essentially involves far simpler structures,” said Yehuda Ben-Zion, a geophysicist at the University of Southern California in Los Angeles. Because earthquakes like those in the Ridgecrest sequence require models that go beyond what the community is currently using, said Ben-Zion, “it’s an impetus to move forward.”

The Southern California Earthquake Center (SCEC), a National Science Foundation– and USGS-funded organization led by Ben-Zion, seized the opportunity to learn from the Ridgecrest earthquakes. By 12 July, SCEC researchers had installed over 460 seismometers around Ridgecrest. The instrumentation, collected by Ben-Zion, was contributed by Sandia National Laboratories, USGS, the Incorporated Research Institutions for Seismology Portable Array Seismic Studies of the Continental Lithosphere Instrument Center, and other groups.

Scientists from the Pasadena and Moffett Field offices of the USGS also rushed to Ridgecrest soon after the initial large earthquake. “They wanted to get on the ground quickly and see if there was surface rupture,” said the USGS’s Cochran, who helped coordinate the response. And they found it. “Just as daylight was fading, they found the break across the highway.”

Over the coming days and weeks, USGS scientists installed over 200 seismometers, mostly along the fault that produced the magnitude 7.1 temblor. The goal, said Cochran, was to watch how a fault evolves through time after an earthquake.

Faulty Questions

By blanketing the landscape with seismometers, researchers hoped to carefully study this earthquake sequence, which included unusually large earthquakes. “It terminated a hiatus of large earthquakes in Southern California that’s lasted for almost 20 years,” said Ben-Zion.

For starters, scientists want to better understand the structure of the complicated Eastern California Shear Zone and how earthquakes on one fault potentially trigger ground movement on other faults. The two largest earthquakes of the Ridgecrest sequence occurred close in space and time but on different faults.

“Did the magnitude 6.4 earthquake somehow trigger the magnitude 7.1 earthquake?” asked Ghosh. “If it was indeed triggered, what’s the mechanism?”

Other faults in the area that didn’t rupture should also be watched, researchers agree.

One is the roughly 250-kilometer-long Garlock fault that skirts the large Southern California city of Bakersfield. There’s already evidence that the Garlock fault is more active than it was before the Ridgecrest sequence, said Ghosh. That’s potentially bad news given the size of the Garlock fault.

“It’s a larger fault, meaning that it can produce larger earthquakes,” said Ghosh.

Cochran and her USGS colleagues also analyzed how quickly seismic waves from aftershocks traveled through the ground. Using measurements of increased velocities over time after the magnitude 7.1 earthquake, Cochran and her collaborators estimated that the subsurface landscape was knitting itself back together. There are a lot of ideas to explain this “healing,” said Cochran, such as cracks closing because of confining pressure and mineral deposition essentially cementing cracks back together. But the details remain elusive. “We don’t actually know what the physical mechanism is,” said Cochran.
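Monitoring this kind of healing rests on comparing repeated seismic waveforms. Below is a minimal synthetic sketch of the "stretching" approach commonly used to estimate relative velocity changes (an illustration of the general technique, not the USGS team's actual workflow): a velocity increase compresses the coda in time, and a grid search over trial stretch factors recovers dv/v.

```python
import numpy as np

def stretch_dvv(reference, current, t, trials):
    """Grid-search the relative velocity change dv/v that best maps the
    `current` trace back onto the `reference` trace (stretching method).
    A velocity increase (dv/v > 0) makes arrivals come earlier, so the
    current trace is compressed; resampling it at t/(1 + dv/v) undoes that."""
    best_cc, best_dvv = -np.inf, 0.0
    for dvv in trials:
        stretched = np.interp(t / (1.0 + dvv), t, current)
        cc = np.corrcoef(reference, stretched)[0, 1]
        if cc > best_cc:
            best_cc, best_dvv = cc, dvv
    return best_dvv

# Synthetic coda-like wavelet and a copy recorded after 1% "healing"
t = np.linspace(0.0, 20.0, 4000)
wavelet = lambda tau: np.exp(-0.2 * tau) * np.sin(2 * np.pi * 2.0 * tau)
reference = wavelet(t)
true_dvv = 0.01                          # assumed 1% velocity increase
current = wavelet(t * (1 + true_dvv))    # same signal, arrivals earlier

trials = np.linspace(-0.03, 0.03, 121)   # trial dv/v values, 0.05% steps
estimated = stretch_dvv(reference, current, t, trials)
```

With clean synthetic data the grid search recovers the imposed 1% change; with real aftershock records, noise and waveform changes make the estimate far less certain.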

Cracked Pavement and Broken Pipes

Scientists traveled to Ridgecrest armed with more than just seismometers: They also came ready to collect geological and geodetic data. Scott Brandenberg, a geotechnical engineer at the University of California, Los Angeles, arrived in Ridgecrest around 3:00 p.m. local time on 5 July. (He experienced the magnitude 7.1 earthquake that evening from the parking lot of Ridgecrest’s Super 8 hotel.)

Brandenberg had come as part of Geotechnical Extreme Events Reconnaissance (GEER), a volunteer organization of geotechnical engineers, engineering geologists, and Earth scientists that, since the 1980s, has been conducting geotechnical engineering reconnaissance after disasters. The point of GEER is to provide coordination, said Brandenberg, because there’s the risk of having “a whole bunch of disjointed efforts” after an earthquake.

Using cameras, tape measures, and GPS, Brandenberg and his colleagues mapped roughly 2 kilometers of surface rupture manifested as fissures in the ground, scarps, and cracked pavement. This on-the-ground fieldwork provided an in-depth view of the aftereffects of ground shaking but covered only about 3% of the total surface rupture: Together, the magnitude 6.4 and 7.1 earthquakes produced about 70 kilometers of rupture. “Our ground-based mapping efforts focused on very detailed measurements over a short length of the fault rupture,” said Brandenberg.

The opportunity to analyze any amount of surface rupture is rare, said Brandenberg, because even the largest earthquakes don’t always produce it. “Ridgecrest was the first California earthquake since Hector Mine in 1999 that ruptured the ground surface.”

Surface rupture is of interest because it’s liable to affect underground infrastructure like water pipes, gas pipes, electric utilities, and sewer lines, said Brandenberg. And indeed, broken water pipes, power outages, and fires were all reported following the Ridgecrest earthquakes.

“There’s just so much in the ground,” Brandenberg said.

Collecting these observations early on was critical because surface rupture data are “perishable,” said Brandenberg. People walk and drive over these surface features, and aftershocks alter them, he said. “Just during the time that we were there, the surface rupture features had started changing.”

A significant fraction of the fault zone researchers wanted to study fell within the Naval Air Weapons Station China Lake. Larger than the state of Rhode Island, the secure facility required coordination with the Navy. Military personnel were very cooperative, Brandenberg said, and regularly escorted scientists around the facility.

On 19 July, just 2 weeks after the magnitude 7.1 earthquake, Brandenberg and his colleagues published a summary of their reconnaissance work.

Going to the Sky

Other research groups opted for a bird’s-eye view of how the earthquakes changed the landscape. Mike Oskin, a geologist at the University of California, Davis, is part of a team that, starting in late July, flew a small aircraft to collect lidar observations. These remote sensing data, which cover 600 square kilometers of the Mojave Desert near Ridgecrest, trace features as small as a few centimeters. They’ll reveal surface features such as cracking and scarps potentially missed—or simply not surveyed—by fieldwork, said Oskin, and provide an important permanent record of transient features.

Oskin and his colleagues collected lidar observations that traced all roughly 70 kilometers of surface rupture. The point clouds generated from these measurements will accordingly be enormous, Oskin said. “It’s going to be trillions of points.”
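That "trillions of points" estimate is easy to sanity-check with back-of-the-envelope arithmetic. A sketch, using assumed values (the 2.5-centimeter point spacing and roughly 20 bytes per stored point are illustrative guesses, not figures from the survey):

```python
def lidar_point_budget(area_km2, point_spacing_m, bytes_per_point=20):
    """Rough point count and raw storage for a uniform-density lidar survey."""
    points_per_m2 = 1.0 / point_spacing_m**2
    total_points = area_km2 * 1e6 * points_per_m2
    storage_tb = total_points * bytes_per_point / 1e12
    return total_points, storage_tb

# 600 square kilometers at ~2.5 cm point spacing, fine enough to resolve
# centimeter-scale cracks and scarps
points, terabytes = lidar_point_budget(600, 0.025)
# points comes out on the order of 1e12, consistent with "trillions"
```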

The researchers also collected aerial observations of fragile geological features known as the Trona Pinnacles. These calcium carbonate spires, which are up to 40 meters high, formed thousands of years ago in hot springs that dotted the area. Pieces of these spires toppled during the Ridgecrest earthquakes, said Oskin, and there’s interest in using these geological features as paleoseismometers.

NASA research scientist Andrea Donnellan flies a drone with a 21-megapixel camera over the site of a rupture from the Ridgecrest earthquakes. Credit: NASA GeoGateway Team

Another view from the sky is being provided by Andrea Donnellan, a geophysicist at NASA’s Jet Propulsion Laboratory in Pasadena, and her collaborators. Donnellan and her colleagues are using drones to repeatedly survey two approximately 500- by 500-meter regions centered on surface rupture.

The quadcopter drones that Donnellan and her collaborators fly are equipped with 21-megapixel cameras that capture optical images. Since 9 July, the researchers have collected thousands of images with a spatial resolution of 2 centimeters.
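The scale of such an imaging campaign can be sketched with simple photogrammetry arithmetic. The frame size (5568 × 3712 pixels, about 21 megapixels) and the 75% forward / 60% side overlap below are assumptions for illustration, not the team's published flight parameters:

```python
import math

def survey_image_count(area_m, gsd_m, px_w, px_h, side_overlap, fwd_overlap):
    """Rough count of photos needed to cover a square area at a given
    ground sample distance (GSD), assuming nadir images and constant overlap."""
    foot_w = px_w * gsd_m                      # across-track footprint, m
    foot_h = px_h * gsd_m                      # along-track footprint, m
    line_spacing = foot_w * (1 - side_overlap)
    photo_spacing = foot_h * (1 - fwd_overlap)
    lines = math.ceil(area_m / line_spacing)
    per_line = math.ceil(area_m / photo_spacing)
    return lines * per_line

# A 500 m x 500 m survey region imaged at 2 cm GSD
n = survey_image_count(500, 0.02, 5568, 3712, 0.60, 0.75)
# a few hundred frames per survey pass; repeated surveys over months
# would readily accumulate the thousands of images described above
```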

There’s a big advantage to repeated looks at the same landscape after earthquakes, said Donnellan, because ground deformation can continue for years: After the El Mayor–Cucapah earthquake in Baja California in 2010, surface deformation persisted for 7 years.

These images captured by Donnellan and her colleagues complement lidar observations, which, although generally covering a wider area, are only one snapshot in time.

Sharing the Science

Researchers are already looking forward to sharing what they’ve learned about these earthquakes. Seismological Research Letters, a publication of the Seismological Society of America, plans to publish a series of papers focused on the Ridgecrest sequence.

“We were aware that a huge amount of data was being collected for the Ridgecrest sequence and that many seismologists need access to the data to conduct in-depth research to better understand the earthquake sequence and its implications,” said Allison Bent, a seismologist with Natural Resources Canada in Ottawa and the editor in chief of Seismological Research Letters. “Papers will be collated and published in a single print issue of Seismological Research Letters, but they will be published online as soon as possible after acceptance.”

Several conferences have also included results about the earthquakes and discussions of the mobilization effort. The Geological Society of America’s Annual Meeting, held in September in Phoenix, featured a special session about the Ridgecrest earthquakes. Researchers gathered to discuss earthquake early warning, the age of the Eastern California Shear Zone, and the types of slip that occurred during the Ridgecrest sequence, among other results. Also in September, the 2019 Southern California Earthquake Center Annual Meeting included two workshops, a plenary session, and over 65 posters about the Ridgecrest earthquakes.

This week, AGU’s Fall Meeting will feature more than 100 sessions on the Ridgecrest earthquakes.

—Katherine Kornei (@katherinekornei), Freelance Science Journalist

Keeping Indigenous Science Knowledge out of a Colonial Mold

Wed, 12/11/2019 - 15:52

During her doctoral research, Dominique David-Chavez was studying her indigenous community’s climate knowledge. As she reviewed the scientific literature on the subject, she noticed a disturbing pattern.

“Whichever kind of study it was, whether it was about ecological indicators of seasonal change or agricultural practices, I mostly was reading similar types of studies where [nonindigenous scientists] would go and document that knowledge and report it back in a scientific journal,” said David-Chavez, who is a postdoctoral fellow working jointly with the University of Arizona’s Native Nations Institute in Tucson and Colorado State University in Fort Collins. She is a member of the Arawak Taíno community.

“It was very difficult to find who from the community was contributing that knowledge, how those findings were returned to that community, or what questions and concerns that indigenous community held in terms of the research,” she said.

This kind of knowledge extraction is one of many aspects of colonialism that plague modern research practice when it comes to indigenous scientific knowledge.

“I felt concerned about doing research that way. It didn’t seem respectful,” David-Chavez said. “I really had to look elsewhere to try to find a model that I felt aligned with my cultural values and the scientific standards that I needed to uphold in my work.”

David-Chavez and her coresearchers developed and field-tested a working model to guide scientists in meeting those standards. With that model as a framework, the researchers, along with members of the Cidra and Comerío rural communities in central Puerto Rico (Borikén), designed and facilitated a youth-led climate research project in 2016–2017.

“The model is really about being intentional about all aspects of the research during every stage of the research, [starting with] the design stage and even before that,” she said. David-Chavez will present this model at AGU’s Fall Meeting 2019 on 12 December.

Colonialism in Scientific Research and Education

“We’re at a time right now where there’s really a push for engaging…diverse perspectives in the sciences,” said David-Chavez. “However, in doing so we’re not always understanding or acknowledging the historical context that has inhibited that kind of engagement for, in the U.S. for example, the past 5 centuries.”

That context, she continued, includes a “history of colonialism, of genocide and oppression and assimilation, where knowledge systems that communities held and languages that those knowledge systems were held within, for example, were sometimes illegal and often oppressed.”

David-Chavez recalls many instances in which indigenous peoples were concerned about how scientists were using the community’s knowledge, whether the research results would be returned to the community, or whether the community had been consulted in the research at all.

“I also heard from tribal leaders, for example, that would say, ‘Yes we were consulted,’ but their version of consultation was sending us a letter about the research they were doing. And that was it,” she said.

Later, when the research is finished and published, a colonial mindset often shapes how that science is taught in schools. Indigenous students might learn from elders and knowledge holders how their communities withstood strong hurricanes and years of drought in generations past. However, “if you go to the big cities like San Juan, Ponce, or Mayagüez, they don’t know anything about that because they don’t have the experience and they don’t have this information at school,” said coauthor Norma Ortiz, a member of the indigenous community in Cidra who worked in the school system for more than 20 years.

“The school [system] is not too interested in teaching this. Right now, at school we have a class that is teaching something about the change of climate, but nothing about how to be sustainable. Because we are an island, we need this.”

“One of the biggest threats to sustaining indigenous knowledge that has been documented is this generational gap and the influence of the colonial school system,” David-Chavez said. “So that’s one really important aspect that’s both included in the model and something that we centered in our research study—making sure that the youth have access to that knowledge.”

Centering Research in Values

To design their youth-led climate study, David-Chavez and Ortiz first turned to community elders and farmers in Cidra and Comerío, who served as a community advisory group.

A community advisory group in Cidra, seen here, codesigned the youth-led climate study. Its members identified what results would be most valuable to the young people in their community and ensured that the knowledge shared by elders was applied in a way that respected its history. Credit: Dominique David-Chavez

“At the very beginning, we identified folks in the community that already had an interest in wanting to get involved with a study like this and just [talked] with them informally,” David-Chavez said. “We asked them specifically what indigenous environmental knowledge they felt was most important for the youth and future generations to learn about.”

“They identified that they wanted [students] to learn about our indigenous understanding of seasonal cycles for planting and harvesting indigenous plants, and especially indigenous food plants. We did end up having that be the focus and the theme of our study,” she said.

“By shifting the research to not just focus on goals and objectives and broader impacts, to shift that language to first center values…scientific and cultural protocols align with each other throughout that whole process,” David-Chavez said.

Indigenous Knowledge of Climate Resilience

Next, “we went to the schools, one in Cidra and one in Comerío,” Ortiz said. “We had a lot of students that wanted to participate, but we made a selection at random.” After introducing the students to the project theme, Ortiz said, “they learned how to use a lot of technology that they didn’t know how to use, like a GPS [receiver], like a voice recorder. They interviewed the elders,” documenting the traditional environmental knowledge and observing connections to climate science concepts.

Elders and knowledge holders in Cidra and Comerío told the researchers that the youth in their communities needed to learn about which food plants sustained the communities during past hurricanes. Students learned about edible native roots (left) and documented their findings during a field camp (right). Credit: Dominique David-Chavez

The elders “talked to us a lot about the indigenous knowledge [of] how they survived in hurricanes, in dry seasons, in rainy seasons,” Ortiz said.

For example, “my family plants a lot of plants like yautía,” Ortiz said. (Yautía is a kind of starchy root vegetable.) Hurricane-force winds might take down towering fruit trees, “but we have the roots, and then it doesn’t matter how strong the hurricane is. The roots always stay down [in] the soil, so we have food.”

Following Hurricane Maria in 2017, “the port right here in Puerto Rico was still unused about 2 weeks” later, said Ortiz. “So a lot of people didn’t have anything to eat. But we [in Cidra] are in the center of the island. We always have plants. We always have the farmers, always have food, so we didn’t suffer a lot.”

At the end of the field camp, the students presented their research to scientists at the International Institute of Tropical Forestry in San Juan. Ortiz presented the results of this youth research program at AGU’s Fall Meeting 2018.

A Responsibility to Future Generations

Sometimes indigenous science knowledge is heavily stigmatized in schools, David-Chavez said, and students from indigenous communities will learn of it only if they happen upon it in a scientific journal in college or later. By actively participating in the research project, the students learned indigenous environmental knowledge from the source rather than through a colonial lens.

Norma Ortiz interviews an elder in Cidra. Credit: Dominique David-Chavez

“We had a pre- and post-test as part of this study where we looked at the impact of teaching science in this way, on their attitudes towards science, towards potentially seeing themselves as scientists engaging in science,” David-Chavez said. “We also looked at their attitudes towards indigenous knowledge and knowledge of science in their community and how they valued that, how they saw that.”

The surveys revealed that student interest in climate and environmental science increased when viewed within a culturally relevant context. “One of the most impactful outcomes we identified early on in this study was the renewed sense of pride and value towards indigenous knowledge expressed by youth researchers, their families, schools, and community members,” David-Chavez and Ortiz wrote in a blog about the study.

The researchers hope that youth-led intergenerational research studies like this will be used to bridge the generational knowledge gap in other indigenous communities. The team is putting together a report for the Puerto Rico Department of Education about the impact of this type of learning in schools and is also working with a local artist on an indigenous agricultural calendar to bring back to communities.

“We have a responsibility to the next generation [because] they will have to face the climate impacts. They need all of the resources they can have,” David-Chavez said. “And that includes the indigenous knowledge people have held about how to adapt, how to observe seasonal change indicators, what foods are going to grow well and be resistant.”

“It’s part of that resilience that we can ensure that they’ll have that, too.”

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

Dominique David-Chavez and Norma Ortiz would like to acknowledge members of the Cidra and Comerío indigenous communities for their contributions to this research. AGU’s Fall Meeting 2019 is held on the traditional territory of the Ohlone people, and the Muwekma Ohlone Tribe continues to live in their traditional lands, which include the present-day city of San Francisco.

Ocean Science Decade Calls Attention to a Wave of Concerns

Wed, 12/11/2019 - 15:49

With the global ocean under a barrage of assaults, including climate change, pollution, and overfishing, scientists hope that the upcoming United Nations Decade of Ocean Science for Sustainable Development will bring needed worldwide attention to these issues and encourage advances in research, monitoring, and mapping the ocean.

“We’ve had too little attention as a global society to the science of understanding the impacts of what we humans are doing to the ocean,” said Craig McLean, chief scientist and assistant administrator for Oceanic and Atmospheric Research at the National Oceanic and Atmospheric Administration.

The ocean science decade “is a wake-up call,” McLean said at a 9 December town hall session at the AGU Fall Meeting. The ocean science decade, which is being coordinated by the Intergovernmental Oceanographic Commission (IOC) of the United Nations Educational, Scientific and Cultural Organization, will stretch from 2021 to 2030. “We see the world and the climate changing around us. Those impacts to society are easily and clearly documented, and we need to be paying attention to that.”

McLean, who serves on the initiative’s executive planning group, added, “The best thing for us to do now is ask ourselves those questions of what do we need to solve scientifically to provide the best pathway for how society redirects itself to bring a more sustainable future.”

Ocean Decade Goals

The main motivation for the ocean decade “is to support efforts to reverse the cycle of decline in ocean health and create improved conditions for sustainable development of the ocean, seas and coasts,” according to a road map for the initiative.

Within the road map, research and development priority areas include creating a comprehensive digital atlas of the ocean; implementing a comprehensive ocean observing system; developing a quantitative understanding of ocean ecosystems and their functioning; improving an ocean-related multihazard warning system; and expanded programs in capacity building, education, and ocean literacy.

The road map outlines some societal themes and potential outcomes of the decade. These outcomes include a clean ocean with sources of pollution identified, quantified, and reduced; a healthy and resilient ocean with marine ecosystems mapped and protected and with climate change and other impacts measured and reduced; a predicted ocean with society having the capacity to better understand current and future ocean conditions; a safe ocean with human communities protected from ocean hazards; a “transparent and accessible” ocean in terms of widespread access to ocean data and technologies; and a sustainably harvested and productive ocean.

Closely Aligned U.S. Themes

At the town hall, Margaret Leinen, director of the Scripps Institution of Oceanography and vice chancellor for marine science at the University of California, San Diego, said that the White House appears to be largely in line with the ocean decade themes.

Leinen noted that a November White House Summit on Partnerships in Ocean Science and Technology included discussions on, for instance, exploring the ocean, conserving living marine resources, protecting coastal health and safety, sustaining ocean observations, promoting food security, and leveraging big data.

“This is a really welcome development for all of us in the U.S. I think it should be a welcome development for the UN decade as well, that the U.S. is in, we’re talking about it, and that the ideas are so closely aligned with the decade,” Leinen said.

Leinen, who is a member of the decade’s executive planning group, said that a first draft of the science plan for the decade should be ready in early 2020.

Looking to the Future

At the town hall, Mykaela Barnes, a senior at the University of California, Berkeley, majoring in conservation and resource studies, said that the ocean decade also is important for youth around the world.

“The decade is laying this foundation for young people. Within the next 10 years, people my age and younger are going to be entering their careers [with] the potential for them to be the next decision-makers with this issue and the next scientific researchers,” she said.

Vladimir Ryabinin, executive secretary of the IOC, said that with the ocean decade, “we are trying to engage everyone into understanding that the situation is so dramatic with the environment, climate change, and health of the ocean that everyone needs to engage and change their attitudes toward action.”

“This is the decade that is going to create the science we need for the ocean we want, and the ocean we need for the future we want,” Ryabinin added.

—Randy Showstack (@RandyShowstack), Staff Writer

What Controls How Quickly Faults Heal?

Wed, 12/11/2019 - 12:30

The growth and coalescence of fractures in fault zones influence the mechanical properties of fault rock, including stiffness and strength. Following earthquake ruptures, a network of fractures increases local permeability, which leads to fluid flow, mineral growth within fractures, and the return of some strength to the fault zone during interseismic periods. The rates of such mineralization, or “healing,” of faults have been elusive, however, and could be limited by rates of fluid flow or mineral precipitation.

Williams et al. [2019] used U-Th dating of syntectonic calcite veins in the Loma Blanca fault, New Mexico, USA, to establish the rates of fracture cementation and healing. For meter- to decimeter-scale fracture networks, calcite growth rates vary by an order of magnitude and correlate strongly with fracture width, with the fastest growth rates, about 1 millimeter per thousand years, observed in fractures about 1 centimeter wide.

This correlation suggests calcite growth and fracture healing are likely fluid transport limited in faults, and that fracture healing rates are highly variable both within and between individual fractures through time. These variabilities suggest that fracture widths and associated fluid flow pathways control the spatial distribution of fault-zone healing following earthquakes.
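A correlation like this is typically quantified as a power law fitted in log-log space. The sketch below uses synthetic, illustrative numbers (not the Williams et al. data) just to show the fitting step:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative synthetic data: growth rate scaling roughly linearly with
# fracture width, with the fastest rates (~1 mm/kyr) in ~1 cm fractures.
width_mm = np.logspace(0, 1, 25)          # fracture widths, 1 mm to 10 mm
true_exponent = 1.0                        # assumed scaling exponent
rate_mm_per_kyr = 0.1 * width_mm**true_exponent * rng.lognormal(0.0, 0.1, 25)

# Fit log10(rate) = exponent * log10(width) + const; the slope is the
# power-law exponent relating growth rate to fracture width.
exponent, const = np.polyfit(np.log10(width_mm), np.log10(rate_mm_per_kyr), 1)
```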

Citation: Williams, R. T., Mozley, P. S., Sharp, W. D., & Goodwin, L. B. (2019). U-Th dating of syntectonic calcite veins reveals the dynamic nature of fracture cementation and healing in faults. Geophysical Research Letters, 46. https://doi.org/10.1029/2019GL085403

Steven D. Jacobsen, Editor, Geophysical Research Letters

Drugs in Our Water Can Leave Even More Toxic By-Products

Wed, 12/11/2019 - 01:46

People flush medicines, cosmetics, and hygiene products down the toilet or throw them in the trash every day. The waste ends up in water treatment plants and landfills and then makes its way into water supplies around the world.

But what happens after that?

New research examined the ways that common pharmaceuticals and personal care products degrade when they enter water systems. The researchers found that in some cases, the chemical by-products were more toxic to humans than the original products were.

Degradation products “are completely different chemical species, so they can have different toxic effects than their parent compounds,” said lead researcher Gayan Rubasinghege, an assistant professor of chemistry at the New Mexico Institute of Mining and Technology in Socorro.

“Sometimes they could be more toxic; sometimes they could be less toxic,” he said. “To get a complete picture of these pharmaceuticals and personal care products in the environment [and] their impact on the aquatic life or human health, we need to look at all of those things. Only then can we decide whether they are actually hazardous or not.”

Rubasinghege will present this work on 11 December at AGU’s Fall Meeting 2019.

Dissolved and Degraded, but Not Gone

“We have been producing more and more of these pharmaceuticals and personal care products to support the needs of a growing population,” Rubasinghege said. “When we say pharmaceuticals, we are talking about hundreds of different chemicals that are being produced and used every day,” including medicines, cosmetics, shampoos and soaps, and different types of plastics.

Scientists have quantified in a number of places just how much of our pharmaceutical waste ends up in water systems. The reported concentrations are often relatively low, “so people feel like, ‘Oh, it’s not so much. There shouldn’t be much impact on human health or impact on the aquatic life,’” Rubasinghege said.

However, “these are chemicals,” he added, “so they’re not going to last as they are in the environment. They’re going to change. They’re going to transform into something else.” Sunlight, microbes, and soil catalyze chemical reactions that turn the original compounds into new ones.

In laboratory tests, Rubasinghege’s team observed this degradation by dissolving ibuprofen in water and exposing it to sediments and simulated sunlight. “Within 5 or 10 days, 1,000 times of what has been reported [in the environment] completely disappeared or transformed into something else,” he said.

“We saw the same thing with amoxicillin,” he added. “We saw the same thing with clofibric acid, which is used to reduce plasma cholesterol. And we are working on diclofenac now, which is a nonsteroidal anti-inflammatory drug.”

The team then tested the effects of the sunlight-degraded ibuprofen on human liver and kidney cells. “In both the cases, we saw that a couple of degradation products from ibuprofen [are] 5–8 times more toxic compared [with] ibuprofen” at the same concentration, Rubasinghege said. The team also observed some toxic effects on a species of human gut bacteria (Lactobacillus acidophilus) and on an aquatic bioluminescent bacterium (Vibrio fischeri), but less so for the degradation products than for ibuprofen itself.

The bottom line is, even if reported concentrations of a pharmaceutical in our water are low, a lot more of that drug might have entered the water system originally and quickly transformed into something else, Rubasinghege said.
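The rapid disappearance described above can be translated into an implied half-life under a simple first-order kinetics assumption (a common approximation for photodegradation; reading "completely disappeared" as 99.9% gone in 10 days is an illustrative choice, not a reported measurement):

```python
import math

def fraction_remaining(t_days, half_life_days):
    """First-order decay: C/C0 = exp(-k t), with k = ln(2) / t_half."""
    k = math.log(2) / half_life_days
    return math.exp(-k * t_days)

# If 99.9% of the spiked compound is gone within 10 days, the implied
# half-life is 10 * ln(2) / ln(1000), about 1 day:
t_half = 10 * math.log(2) / math.log(1000)
# fraction_remaining(10, t_half) returns 0.001 (up to rounding)
```

A half-life that short means environmental sampling days after release would see only a small residue of the original input, which is the point Rubasinghege makes about reported concentrations understating the total load.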

Unknown Degradation Paths and Impacts

“Ibuprofen is one of the most consumed pharmaceuticals around the world. It, and probably its transformation products, is found in considerable amounts in wastewater effluents,” said Adeyemi Adeleye, a civil and environmental engineer at the University of California, Irvine, who was not involved with this research.

“This study emphasizes the need to consider the by-products formed in the environment when assessing the risks of pharmaceuticals and other chemicals,” he said. “My hope is that the study will encourage other researchers in the field to expand their focus beyond the primary chemicals” and include the toxic effects of secondary chemicals too.

Rubasinghege’s group ultimately plans to put together a database of the most abundant pharmaceuticals and personal care products that enter the environment, how those compounds transform, and at what concentrations the degradation products are toxic to humans.

“We’re not saying that the drinking water is not safe,” he said. But people should take these results as an “eye-opener” that the reported pharmaceutical concentrations in our water are not the end of the story.

“Let’s do more studies about these degradation products,” Rubasinghege said. “Let’s do more studies to understand the complete picture of these compounds out there. Because the [environmental] impact of these could surface maybe in about 5 years, maybe in about 10 years. But if we understand it now and take actions now, we can avoid it in the future.”

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

A New Source of Sea Level Rise from Greenland: Ice Slabs

Wed, 12/11/2019 - 01:26

The interior of Greenland’s ice sheet doesn’t usually make headlines: It’s a layer of compact snow and glacial ice at high elevations that typically doesn’t contribute to runoff that drives sea level rise.

But a new study suggests that this may change: More runoff may come from Greenland’s interior because of a newly discovered phenomenon called ice slabs. The slabs are layers of ice that exist just below the snow’s surface, where porous snow usually sits. Ice slabs can extend for tens of kilometers and grow to over 16 meters thick.

The latest results reveal that 5% of Greenland’s ice sheet has ice slabs under the surface, and using modeling data, scientists project that ice slabs will double in area by midcentury. The news could mean that Greenland will contribute more to sea level rise because the majority of mass loss from the ice sheet comes from meltwater runoff.

“Core After Core of Solid Ice”

Scientists discovered ice slabs only in the past decade. Michael MacFerrin, a lecturer at the University of Colorado Boulder, first encountered ice slabs while drilling ice cores in southwestern Greenland in 2012. He noticed something unusual about the ice cores: Instead of finding cores made of snow, MacFerrin found “core after core after core of solid ice.” It seemed that higher temperatures from climate change were melting and refreezing the snow into solid ice.

MacFerrin knew this could spell trouble for the ice sheet. At midelevations where the researchers were drilling, the snow beneath their feet should have been a granular type of snow called firn. Firn is so porous that it allows meltwater at the top of the ice sheet to percolate downward, helping compact the snow into glacial ice over time.

But ice slabs would throw a wrench into this process. Meltwater wouldn’t be able to infiltrate the ice sheet, leaving it no choice but to flow downhill toward the ocean. Later that year, Greenland saw one of the largest melt years on record, and MacFerrin saw satellite images of meltwater streaming down the same region he’d pulled his cores from.

A map of ice slabs on Greenland’s ice sheet shows their far-reaching extent, particularly along the western edge. The map also includes areas of firn aquifers, where pools of water stay hidden under the ice year-round. Taken together, ice slabs and aquifers cover one tenth of the ice sheet. Credit: Michael MacFerrin

“The Size of West Virginia”

In the years following, MacFerrin and his colleagues mapped the extent of ice slabs in Greenland using ice cores and radar. Their map is the first to show the extent of ice slabs across Greenland.

The team used two types of radar data: one from ground-penetrating radar from an antenna towed behind a snowmobile along a 45-kilometer transect and the other from freely available accumulation radar data from a NASA IceBridge mission that measured the near-surface ice over much of Greenland. The radar data revealed ice slabs beneath the surface of the ice, and the team members checked their results against 17 ice cores drilled in southwestern Greenland.

The results showed that southwestern Greenland is ground zero for ice slabs, along with a big swath in northeastern Greenland. The slabs covered an area roughly the “size of West Virginia,” said MacFerrin.

From Ice to Sea

MacFerrin worries that the icy layers could have consequences for global sea level rise.

The team used three regional climate models forced by two greenhouse gas emission scenarios: one with moderate warming and the other with little action taken to curtail emissions. Trends in Greenland’s melting have been following the highest greenhouse gas emission scenario so far.

According to model projections, the ice slabs could double the amount of sea level rise from Greenland’s interior in the coming century.

Adam Schneider, a postdoctoral scholar at the University of California, Irvine who was not involved in the study, called MacFerrin’s team the “foremost experts” investigating the question of future climate impacts on runoff patterns and sea level rise. Schneider plans to improve his ice sheet model using insights from MacFerrin’s work.

MacFerrin said that future monitoring is necessary to track the extension of ice slabs throughout Greenland and pointed to microwave data from satellites as the best way to keep a keen eye. In terms of what more runoff streaming off Greenland’s middle- to high-elevation regions could mean for the ice sheet, “we don’t yet know what all the impacts on ice dynamics are going to be,” MacFerrin said. “That’s a big question.”

MacFerrin presented the research at AGU’s Fall Meeting 2019 in San Francisco, Calif.

—Jenessa Duncombe (@jrdscience), News Writing and Production Fellow

A Dirty Truth: Humans Began Accelerating Soil Erosion 4,000 Years Ago

Tue, 12/10/2019 - 15:37

In a way, human history is etched in the soil.

Recently, an international team of researchers found evidence that we humans have been leaving our mark on this planet since long before the Industrial Revolution. Around 4,000 years ago, human activities had already significantly accelerated soil erosion around lake beds on a global scale.

“We have been imprinting our presence [on] the landscape and in the natural world further back than we thought,” said Nuno Carvalhais, a research group leader at the Max Planck Institute for Biogeochemistry and the senior researcher of the study published in Proceedings of the National Academy of Sciences of the United States of America on 28 October.

The findings required an interdisciplinary approach, with different types of analyses allowing a more comprehensive picture of how human activities could be behind the accelerating erosion, Carvalhais said.

Jean-Philippe Jenny, a French geoscientist affiliated with the Max Planck Institute for Biogeochemistry and the Alpine Center for Research on Trophic Networks and Limnic Ecosystems and the lead author of the study, analyzed core samples of sediments from 632 lake beds collected from around the world. Because sediments accumulate in lakes at continuous rates, lake sediment cores can be used as a natural archive of fluctuations in soil erosion over time.

Combining sediment rates with radioactive carbon dating data from each site, Jenny and his collaborators inferred the changes in lake sedimentation accumulation rates and found that 35% of the sampled lakes had accelerated erosion over the past 10,000 years.
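The depth-age arithmetic behind that inference can be sketched in a few lines. The function and the core depths and ages below are hypothetical illustrations, not data from the 632 cores analyzed in the study:

```python
# Sketch: inferring sedimentation accumulation rates from
# radiocarbon-dated horizons in a lake sediment core.

def accumulation_rates(depths_cm, ages_yr):
    """Mean sedimentation rate (cm/yr) between successive dated horizons."""
    rates = []
    for i in range(1, len(depths_cm)):
        d_depth = depths_cm[i] - depths_cm[i - 1]
        d_age = ages_yr[i] - ages_yr[i - 1]
        rates.append(d_depth / d_age)
    return rates

# Hypothetical core: depth of each dated horizon and its calibrated age.
depths = [0, 50, 100, 150]      # cm below the lake bed
ages = [0, 2000, 6000, 10000]   # years before present

rates = accumulation_rates(depths, ages)
print(rates)  # [0.025, 0.0125, 0.0125]
```

In this toy core the youngest interval accumulated twice as fast as the older ones; a sustained step up in rates like this, appearing across many lakes at roughly the same time, is the kind of signal the team interpreted as accelerated catchment erosion.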

Crucially, the acceleration in erosion began around 4,000 years ago, and the researchers sought out the mechanisms that could explain this trend. “We built up our hypotheses, and based on these hypotheses, we [collected] the data that would either destroy or support the different hypotheses that were behind the trends,” Carvalhais explained.

In the end, humans were the most likely culprit.

Changes in erosion were less related to fluctuations in precipitation and temperature, researchers found, whereas trends in deforestation coincided with the rise in erosion. Jenny and his collaborators analyzed pollen samples at each lake bed site to produce a proxy for tree coverage of the surrounding land; they found that decreases in tree cover were tightly coupled with accelerated erosion. “Deforestation at the time was caused by the human beings, because at that time they were starting to develop agriculture,” said Jenny.

Humanity’s Past and Future Written in the Dirt

Although soil erosion accelerated 4,000 years ago in Europe, similar trends occurred only recently in North America, probably following European immigration and importation of agricultural practices.

The research team also found that 23% of lake sites had a decrease in erosion rates, which may be the result of human-driven river management, such as the construction of dams.

“It means that we as human beings are now living in a time period where we have a huge effect on everything on the Earth, and all our activities will be recorded in the natural archives,” said Jenny.

“These guys have done a really remarkably ambitious job putting the story together,” said David Montgomery, a professor of Earth and space sciences at the University of Washington and author of Dirt: The Erosion of Civilizations. The results of the paper “put into perspective just how powerful a force people are on the planet today,” he said.

Montgomery, who was not involved in the study, suggests that it was not merely deforestation that accelerated soil erosion, but subsequent agricultural activities as well. Though deforestation is a necessary first step for widespread farming, increased soil erosion is mainly driven by “the plow that followed,” he said. “It wasn’t simply cutting down the trees that caused the erosion; it was keeping them off the landscape through farming practices.”

The erosion rates produced by conventional agricultural practices are not sustainable, and they sap crucial nutrients from the soil. “What you come away with is the lesson that societies that don’t take care of their soil don’t last,” Montgomery said.

And there are broader environmental implications too. As with many types of large-scale human activities, increased soil erosion “can impact the climate in the long term,” said Jenny.

The results of this study provide more data about “the sensitivities of the Earth system to climate and environmental factors, including humans,” said Carvalhais. “And this can help us improve our ability to understand and also to predict or forecast future scenarios.”

“To go into the future, we also need to understand our history,” he added.

—Richard J. Sima (@richardsima), Science Writer

Direct Air Capture Offers Some Promise in Reducing Emissions

Tue, 12/10/2019 - 15:36

When Jennifer Wilcox began researching the direct air capture of carbon dioxide (CO₂) about a decade or so ago, she was skeptical about how useful the technique could be. That was before climate models started including negative emissions to meet climate goals, she said.

“I thought, why would we ever consider doing this [direct air capture] when we should just really focus all of our efforts on avoiding carbon [emissions] in the first place?” said Wilcox, a professor of chemical engineering at Worcester Polytechnic Institute in Massachusetts. “We should be focusing all of our efforts on fuel switching, transitioning [to clean energy], increasing renewables, carbon capture and storage, [and] all kinds of things that are associated with avoiding carbon emissions in the first place.”

Now, however, Wilcox, the author of Carbon Capture, the first textbook on the subject, thinks that direct air capture should be considered as part of the overall mix to reduce emissions.

“We traveled along the business-as-usual trajectory for far too long to get to choose what we want to do at this point,” Wilcox said about insufficient efforts to curb CO₂ and the need now to look into many options. “Beggars can’t be choosers.”

Wilcox will present a talk at AGU Fall Meeting 2019 on “The Role of Direct Air Capture in Meeting Our Climate Goals” on Wednesday, 11 December, at 11:35 a.m.

Hopeful for Technological Advances and Lower Costs

A recent report, Negative Emissions Technologies and Reliable Sequestration: A Research Agenda, issued by the National Academies of Sciences, Engineering, and Medicine, describes direct air capture as “chemical processes that capture CO₂ from ambient air and concentrate it, so that it can be injected into a storage reservoir.”

Wilcox recognizes that direct air capture currently is a difficult, expensive, and energy-intensive process. CO₂ in the atmosphere, for instance, is about 300 times less concentrated than the CO₂ contained in the exhaust from a coal-fired power plant, which means that you need 300 times the contact area for direct air capture to sock away the equivalent amount of emissions, according to Wilcox.
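That dilution factor is easy to verify from typical concentrations. The flue gas value below is a common textbook figure for coal plants, assumed here for illustration rather than taken from Wilcox’s talk:

```python
# Back-of-the-envelope check on the ~300x dilution factor for
# direct air capture versus point-source capture.

ambient_co2 = 410e-6   # ~410 ppm CO2 in ambient air (mole fraction)
flue_gas_co2 = 0.12    # ~12% CO2 in coal-fired flue gas (typical value)

dilution = flue_gas_co2 / ambient_co2
print(round(dilution))  # 293
```

Capturing a given mass of CO₂ from air therefore means moving roughly 300 times more gas past the sorbent than at a coal plant stack, which is where much of the extra cost and energy comes from.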

“Energetically, cost-wise, it’s easier to go after the point sources,” Wilcox acknowledges. But she added that there is a need for direct air capture as well, and she and others are hopeful that technological advances could significantly lower the costs and the energy currently required for direct air capture.

Direct air capture has a “high potential capacity for removing carbon” but “is currently limited by high cost,” according to the report. Wilcox served on the committee that produced the report.

The report suggests a research plan that would include scaling up and testing air capture materials and components, as well as designing, building, and testing an air capture demonstration system. In addition, the report notes that some companies, including Carbon Engineering and Climeworks, are working on commercializing direct air capture systems.

The September 2019 Clearing the Air report by the Energy Futures Initiative, which is chaired by former U.S. energy secretary Ernest Moniz, calls for a 10-year, $10.7-billion U.S. federal research and development initiative for carbon dioxide removal technologies, including direct air capture, terrestrial and biological CO₂ capture, and other capture methods.

“Net-zero carbon dioxide (CO₂) emissions is not credibly achievable by midcentury without major contributions from negative-carbon technologies. Such technologies will also make possible, in the long term, a reversal of ever increasing greenhouse gas (GHG) concentrations in the atmosphere, thereby reducing the impact of past actions,” the report notes. Direct air capture, or DAC, “has a very large potential scale for CDR [carbon dioxide removal]. The overarching [research, development, and demonstration] objective for DAC is to reduce the cost and energy use and improve the performance and durability of DAC technologies to be a viable option for CDR.”

Not a Replacement for Avoiding Carbon in the First Place

Although Wilcox shied away from saying she is confident that direct air capture will be a real part of the solution to climate change, she said she is hopeful.

“I’m hopeful that it will be, because I don’t think we have any other choice,” Wilcox said, adding that other carbon capture methods also are important. “I’m hopeful that the subsidies will be put into place, and the policy framework will be put into place that will allow us to move forward with the deployment of negative emissions on a scale that will positively impact climate.”

She added, however, that negative emissions are not a solution by themselves. She said that “climate math is hard” and there needs to be an understanding that “we really need to do everything” to deal with climate change. Wilcox cautioned, “Negative emissions will not be a replacement for avoiding carbon in the first place.”

—Randy Showstack (@RandyShowstack), Staff Writer

Timing Matters for Rockfall Estimates

Mon, 12/09/2019 - 16:08

Rockfalls, in which gravel- to boulder-sized rocks break away from slopes, skew toward the small end on the scale of landslides. Nevertheless, they remain hazardous. Rockfalls have blocked roads, crushed infrastructure, and taken lives. Reports from around the world suggest that they may be becoming more common in response to rising temperatures in some mountainous regions, so scientists are justifiably interested in better understanding the frequency with which these geologic hazards occur.

Advances in computing power and remote sensing tools, such as lidar, have allowed researchers to detect and monitor rockfalls more closely in recent years. A new study shows, however, that the monitoring interval—the amount of time between successive data collections at a site—can significantly influence estimates of rockfall rates. Researchers typically monitor for rockfalls on a monthly basis, but such long periods between surveys can lead to underestimations of small rockfalls and/or overestimations of large ones. This is because multiple small events can be mistaken for a single, larger event when the time between surveys is longer than the return period of small rockfalls.
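The amalgamation mechanism can be demonstrated with a toy simulation. The hourly event rate and the Pareto volume distribution below are assumptions chosen for illustration, not parameters from the Whitby study:

```python
# Sketch of the amalgamation effect: small rockfalls that occur
# between surveys get lumped together into one apparent large event.
import random

random.seed(0)

HOURS = 720  # one 30-day month of hourly time steps
# Roughly one small rockfall per hour, with heavy-tailed volumes (m^3).
events = [random.paretovariate(1.5) * 0.01 for _ in range(HOURS)]

def survey(events, interval_hours):
    """Volumes detected if the cliff is scanned every `interval_hours`."""
    return [sum(events[i:i + interval_hours])
            for i in range(0, len(events), interval_hours)]

hourly = survey(events, 1)     # 720 detections, each a true event
monthly = survey(events, 720)  # 1 detection lumping all 720 events

print(len(hourly), len(monthly))  # 720 1
# Total volume is conserved, but the monthly survey reports one large
# detachment instead of hundreds of small ones, skewing the measured
# size distribution toward large volumes and underestimating the rate.
```

The same total erosion is recorded either way; what changes with the survey interval is the apparent frequency and size of events, which is exactly the bias the study quantifies.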

Williams et al. monitored a coastal cliff in the seaside town of Whitby in the United Kingdom, where Jurassic shales, sandstones, and mudstones are eroding into the sea. Here, as in most locations prone to rockfalls, the erosion follows a power law, with small rockfalls occurring more frequently than larger ones. The researchers used terrestrial lidar surveys, collected hourly over 10 months in 2015, and an algorithm to detect changes in the rock face due to rockfalls.

To find out how the monitoring interval influences rockfall rate estimates, the team modeled the frequency distribution of rockfall volumes at various hour- and day-scale intervals, ranging from 1 hour to 30 days. The researchers found that changing the time interval significantly influenced the rockfall frequency and volume measured. Overall, increasing rockfall rates were seen for monitoring intervals shorter than about 12 hours, whereas for a range of intervals greater than 12 hours, observed rockfall rates were nearly identical. Specifically, the mean rockfall rate under hourly monitoring was 61 events per day—an order of magnitude larger than the six rockfalls per day seen with monthly surveys. And monitoring using the shortest time interval of 1 hour led to a threefold decrease in the average recorded rockfall volume compared with using a 30-day interval.

Given the risks posed by rockfalls of all sizes, this large increase in rockfall rate has major implications for rockfall hazard models, according to the authors. (Journal of Geophysical Research: Earth Surface, https://doi.org/10.1029/2019JF005225, 2019)

—Kate Wheeling, Freelance Writer

Momentum Grows for Mapping the Seafloor

Mon, 12/09/2019 - 16:06

This is a “superexciting” time for seafloor mapping, according to Vicki Ferrini, a marine geophysicist at Columbia University’s Lamont-Doherty Earth Observatory in Palisades, N.Y.

More than 80% of the seafloor remains unmapped at a resolution of 100 meters or better, but there is growing momentum to close that gap, according to Ferrini.

This momentum includes an increasing recognition that these data are vital to better understanding our planet, the mapping community working more closely together, and “a technology push that has put us at this edge of a new era in ocean mapping,” she said.

In addition, Ferrini pointed to several major initiatives, including the United Nations Decade of Ocean Science for Sustainable Development, which will stretch from 2021 to 2030.

Another related initiative is the Nippon Foundation-GEBCO Seabed 2030 Project, started in 2016. This project, between the Nippon Foundation and the General Bathymetric Chart of the Oceans (GEBCO), which is itself a joint project of the International Hydrographic Organization and the Intergovernmental Oceanographic Commission, has an aspirational goal: the entire accessible part of the ocean floor mapped to a resolution of 100 meters or better by 2030.

With so much momentum for mapping the seafloor, several sessions at AGU’s Fall Meeting 2019 in San Francisco, Calif., focus on the topic, including a poster session on Monday afternoon, 9 December, “Beyond Hydrography: Seafloor Mapping as Critical Data for Understanding Our Oceans II.” The session includes a number of posters related to the Seabed 2030 Project. A related oral session, “Beyond Hydrography: Seafloor Mapping as Critical Data for Understanding Our Oceans I,” takes place on Monday morning.

So Much Unmapped, Unexplored, and Unknown

With smartphones, “we are all very much accustomed to having detailed maps in the palm of our hands,” said Ferrini, who is a coconvener and cochair of both Fall Meeting sessions. She also serves as the head of GEBCO’s Atlantic and Indian Oceans Regional Center and chair of its Sub-Committee on Regional Undersea Mapping. “To think that the majority of our planet is not known with even the coarsest detail of 100-meter resolution is pretty astounding.”

“If we really want to understand the planet, if we want to understand the ocean, if we want to manage resources in a sustainable way, we have to have at least a first-order map to help guide what we’re doing,” Ferrini said. “There is so much of our planet and our ocean that is not just unmapped but really unexplored and unknown. So there is a huge amount of excitement and wonder about what we’re going to find.”

Seabed 2030 will bring together all of the available data that exist and synthesize them into a publicly available GEBCO map, Ferrini said. The project relies on regional projects and coalitions as “the building blocks” of the map.

Mapping the U.S. Exclusive Economic Zone

Ferrini also mentioned a 19 November White House memorandum that calls for mapping the exclusive economic zone (EEZ) of the United States and the near shore of Alaska.

Elizabeth Lobecker, a physical scientist with the National Oceanic and Atmospheric Administration’s (NOAA) Office of Ocean Exploration and Research (OER), said that the memorandum recognizes the importance of ocean exploration and “is right in line with what we do: ocean mapping for exploration [and for] identification of important resources and habitat.” In a poster, Lobecker will focus on NOAA’s ocean exploration and research mapping contributions to Seabed 2030, including OER’s efforts to assess mapping data holdings and identify gaps in bathymetric coverage within the United States’ EEZ.

Within NOAA, Lobecker noted, the Okeanos Explorer research vessel is very close to reaching a milestone of having mapped 2 million square kilometers of the seabed. Still, “the fact that so much of the seafloor is not mapped is actually very exciting,” she said. “When sonars go over a new area, what was once just a blurry smudge of data where you couldn’t see any details” transforms into a “remarkable level of resolution, and you can pick up interesting features.”

Despite the current momentum for mapping the seafloor, Columbia University’s Ferrini doesn’t want to speculate about whether Seabed 2030 will reach its goal by 2030, though she is hopeful. “To me, it almost doesn’t matter if we do, because we are building a global community that is learning to work together in ways that we have not done before,” she said. “That is going to be one of the biggest and most long-lasting impacts of this initiative. I think that there is the potential to make huge progress.”

—Randy Showstack (@RandyShowstack), Staff Writer
