Feed aggregator

Heating Up the Hot Spots

EOS - Wed, 07/14/2021 - 12:25

The U.S. Navy opened its first national training institution in Annapolis, Md., in 1845, with 50 midshipmen and seven professors. In the 176 years since, the U.S. Naval Academy has trained thousands of officers. By the end of this century, though, the academy might have to abandon ship. Climate models suggest rising sea levels and subsiding land could flood the site, forcing the Navy to find a drier spot for educating its future leaders.

Climate change could introduce more serious security challenges to the American military in the coming decades, experts say. The U.S. military already faces repairs and upgrades to facilities across the country, along with reductions in training operations. And climate change could function as a “threat multiplier” in already touchy regions of the globe, perhaps triggering armed conflicts over water, arable land, or other resources.

“Personally, I put climate change below a lot of other threats—a lot of other things are more immediate and more pressing—but it deserves a place on the list,” said Col. Mark Read, head of the Department of Geography and Environmental Engineering at the U.S. Military Academy at West Point, N.Y. “Twenty years ago, it wasn’t even on the list.”

“The problem is certainly cascading,” said Sherri Goodman, secretary general of the International Military Council on Climate and Security (IMCCS; a think tank composed of military and security leaders) and a former U.S. deputy undersecretary of defense. “It’s converging in many ways at the top of the global agenda.”

Hurricanes, Wildfires Are Major Infrastructure Threats

For the American military, perhaps the most immediate threats are infrastructure damage and training restrictions. Hurricanes, inland storm systems, and wildfires have caused extensive damage in the past few years.

In 2018, for example, Hurricane Florence caused $3.6 billion worth of damage to Camp Lejeune, a Marine Corps base in North Carolina that supports a population of more than 130,000 marines, sailors, retirees, their families, and civilian employees. The following year, Offutt Air Force Base in Nebraska suffered $1 billion in damages when major flooding hit the Midwest. Wildfires in 2016 burned 10,000 acres (4,047 hectares) at Vandenberg Air Force Base (now Vandenberg Space Force Base) in California, threatening two of its rocket launch pads.

A 2019 Department of Defense (DOD) study of 79 bases around the country concluded that two-thirds of them are vulnerable to flooding and about half are vulnerable to drought and wildfires. Bases in California, New Mexico, and Nevada could be threatened by desertification, whereas facilities in Alaska could be damaged by thawing permafrost.

Flooding is increasing at some coastal bases even without hurricanes. Several facilities in the Hampton Roads region of Virginia and around Chesapeake Bay, for example, face frequent tidal flooding of roads and low-lying areas caused by higher sea level and some ground subsidence.

“If you add rain, the flooding can be pretty significant,” said Read, who emphasized that he was expressing his own views, not those of the Army or West Point. “That’s damaged some infrastructure and limited base access….That has readiness implications. It’s nothing glamorous. It seems mundane, but it’s profound.”

Higher temperatures also present problems. Bases in the Southwest have faced more “black flag” days, when it’s too hot or the air quality is too low to safely conduct training operations—a problem that is likely to grow worse as the climate continues to warm. And live-fire exercises have a greater potential to spark wildfires that could damage not only military facilities but civilian ones as well. In 2018, for example, two wildfires in Colorado were triggered by training exercises for an upcoming deployment, burning 3,300 acres (1,335 hectares) and forcing the evacuation of 250 households.

“DOD should ensure that extreme weather and climate change are considered during facility design and investment decisions,” the Defense Department’s inspector general’s office wrote in a 2020 report. “As the frequency of extreme weather events has increased, the DOD must consider the related risks and make wise investment decisions to mitigate the impacts of extreme weather on the DOD’s mission.”

Not a Big International Concern—Yet

That mission often includes responding to climate disasters around the world, which are forecast to become more common and more severe as the climate continues to change. In parts of the world, it’s possible that such disasters could help trigger armed conflicts.

A 2019 study found that climate-assisted conflicts are rare today but could become more common later in the century. “Does a flood lead to a civil war?” asked lead author Katharine Mach, an assistant professor of marine and atmospheric science at the University of Miami. “If you’re in Norway, the answer is totally no. But if you’re in a place that’s on the brink of civil war anyway, that’s where you start to see greater effects of climate shocks.”

“Climate acts as a threat multiplier,” said Erin Sikorsky, deputy director of the Center for Climate and Security and director of IMCCS. “In places that are already experiencing strains due to poor governance or a lack of social cohesion, when you add climate change on top of that, it makes it a more combustible mix.”

Sub-Saharan Africa, the Middle East, southern Asia, and parts of the Indo-Pacific lead the list of regions that could be most vulnerable to climate-triggered violence, Sikorsky said, but they aren’t alone. “I always say that you could spin a globe and just pick a spot, and you could find some kind of climate security risk there.”

Some experts say they are concerned that reduced snowfall in the Himalayas could produce water shortages that could lead to armed conflict between countries in Asia, for example, particularly in regions where one country can limit other nations’ access to water. Others suggest that the Arctic could become a climate security hot spot, as reduced ice coverage in summer makes it easier to extract mineral resources from the ocean floor. “We’ve seen the great powers posturing and competing for resources, and whenever you have that, there are security implications,” said Read.

The United States and other nations therefore must take climate change into consideration as they plan their foreign policy, said Sikorsky. “When you talk about security risks, you need to add climate change to the mix. It’s not a matter of, is climate change more important or risky than China, for example. Instead, it’s a question of how does climate change shape the risk from China? How does it shape competition? How does it shape our China foreign policy? Climate change will help set the parameters of the world stage.”

—Damond Benningfield (damond5916@att.net), Science Writer

America’s Natural Gas Pipeline Routes and Environmental Justice

EOS - Tue, 07/13/2021 - 11:38

Most research into the environmental and social impacts of the oil and natural gas industries focuses on the beginning and end of the process: where resources are extracted and where they are refined and consumed. Very little attention, however, is paid to middle infrastructure—the enormous vascular system of pipelines crisscrossing the United States. In a new study, Emanuel et al. address this continent-wide gap by comparing natural gas pipeline density to social vulnerability at the county level.

The Centers for Disease Control and Prevention has created a social vulnerability index that measures how well a community can prepare for, handle, and recover from hazards and disasters, either natural or human-caused. A county with high social vulnerability would be poorly equipped to handle a potential pipeline disaster. The researchers found that more socially vulnerable counties in the United States tended to have higher pipeline densities, while less socially vulnerable counties had lower pipeline densities. The correlation is stronger for counties with the highest pipeline densities.
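As a concrete illustration of that comparison, the sketch below rank-correlates county pipeline density against a vulnerability index. The sample size, the values, and the choice of Spearman’s rank correlation are all assumptions for illustration, not details taken from Emanuel et al.

    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical county-level data: one social vulnerability index (SVI)
    # value and one pipeline density value per county; all numbers invented.
    rng = np.random.default_rng(42)
    svi = rng.uniform(0, 1, size=3000)  # 0 = least vulnerable, 1 = most
    pipeline_km_per_km2 = 0.05 + 0.04 * svi + rng.normal(0, 0.02, size=3000)

    # A rank correlation asks whether more vulnerable counties tend to host
    # denser pipeline networks, without assuming the relationship is linear.
    rho, p_value = spearmanr(svi, pipeline_km_per_km2)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.2g}")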

The authors point to the policy implications of the inequitable distribution of environmental harms connected with the construction and operation of this vast network of infrastructure. The burdens of pipelines—including noise, reduced property values and land use options, risk of leak or explosion, and cultural harm—fall disproportionately on the communities least capable of handling them.

Pipelines are frequently located in rural areas rather than urban ones. Although rural areas have lower population densities and are often presumed to carry “lower risks,” rural routes do not diffuse risks; they present a different set of risks, the authors say. Plus, the scientists highlight that Indigenous people rooted in rural areas have deep cultural ties to specific landscapes and waterways that are increasingly affected by pipeline construction and operation, and their cultures and communities may be harmed if the land is marred. Rural emergency response systems have fewer resources to handle large disasters. Further, local conflict over fossil fuel infrastructure can quickly tear rural communities apart and lead to mass relocations, converting rural communities to industrial landscapes within only a few years.

The scientists suggest that future projects undergo more rigorous environmental justice assessments that incorporate culture- and community-focused research and local perspectives. They call upon other scientists to partner with marginalized communities to identify and quantify impacts that may be overlooked or ignored by the powerful forces behind pipeline projects. Finally, they remind decisionmakers to consider the cumulative risks of existing oil and natural gas industry infrastructure, including the issues that follow climate change, which also tend to affect those most vulnerable. (GeoHealth, https://doi.org/10.1029/2021GH000442, 2021)

—Elizabeth Thompson, Science Writer

A Remarkably Constant History of Meteorite Strikes

EOS - Tue, 07/13/2021 - 11:38

Thousands of tons of extraterrestrial material pummel Earth’s surface each year. The vast majority of it is too small to see with the naked eye, but even bits of cosmic dust have secrets to reveal.

By poring over more than 2,800 grains from micrometeorites, researchers have found that the amount of extraterrestrial material falling to Earth has remained remarkably stable over millions of years. That’s a surprise, the team suggested, because it’s long been believed that random collisions of asteroids in the asteroid belt periodically send showers of meteoroids toward Earth.

Astronomy by Looking Down

Birger Schmitz, a geologist at Lund University in Sweden, remembers the first time he looked at sediments to trace something that had come from space. It was the 1980s, and he was studying the Chicxulub impact crater. “It was the first insight that we could get astronomical information by looking down instead of looking up,” said Schmitz.

Inspired by that experience, Schmitz and his Lund University colleague Fredrik Terfelt, a research engineer, have spent the past 8 years collecting over 8,000 kilograms of sedimentary limestone. They’re not interested in the rock itself, which was once part of the ancient seafloor, but rather in what it contains: micrometeorites that fell to Earth over the past 500 million years.

Dissolving Rocks

Schmitz and Terfelt used a series of strong chemicals in a specially designed laboratory to isolate the extraterrestrial material. They immersed their samples of limestone—representing 15 different time windows spanning from the Late Cambrian to the early Paleogene—in successive baths of hydrochloric acid, hydrofluoric acid, sulfuric acid, and nitric acid to dissolve the rock. Some of the reactions that ensued were impressive, said Terfelt, who recalled black smoke filling their laboratory’s fume hood. “The reaction between pyrite and nitric acid is quite spectacular.”

The chemical barrage left behind grains of chromite, an extremely hardy mineral that composes about 0.25% of some meteorites by weight. These grains are like a corpse’s gold tooth, said Schmitz. “They survive.”

Schmitz and Terfelt found that over 99% of the chromite grains they recovered came from a stony meteorite known as an ordinary chondrite. That’s perplexing, the researchers suggested, because asteroids of this type are rare in the asteroid belt, the source of most meteorites. “Ordinary chondritic asteroids don’t even appear to be common in the asteroid belt,” Schmitz told Eos.

An implication of this finding is that most of Earth’s roughly 200 known impact structures were likely formed from ordinary chondrites striking the planet. “The general view has been that comets and all types of asteroids were responsible,” said Schmitz.

When Schmitz and Terfelt sorted the 2,828 chromite grains they recovered by age, the mystery deepened. The distribution they found was remarkably flat except for one peak roughly 460 million years ago. “We were surprised,” said Schmitz. “Everyone was telling us [we would] find several peaks.”

Making It to Earth

Sporadic collisions between asteroids in the asteroid belt produce a plethora of debris, and it’s logical to assume that some of that cosmic shrapnel will reach Earth in the form of meteorites. But Schmitz and Terfelt showed that of the 15 such titanic tussles involving chromite-bearing asteroids over the past 500 million years, that was the case only once. “Only one appears to have led to an increase in the flux of meteorites to Earth.”

Perhaps asteroid collisions need to occur in a specific place for their refuse to actually make it to our planet, the researchers proposed in the Proceedings of the National Academy of Sciences of the United States of America. So-called “Kirkwood gaps”—areas within the asteroid belt where the orbital periods of an asteroid and the planet Jupiter constitute a ratio of integers (e.g., 3:1 or 5:2)—are conspicuously empty. Thanks to gravitational interactions that asteroids experience in these regions of space, they tend to get flung out of those orbits, said Philipp Heck, a meteoriticist at the Field Museum of Natural History in Chicago not involved in the research. “Those objects tend to become Earth-crossing relatively quickly.”
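Kepler’s third law pins down where those resonances sit: an asteroid completing p orbits for every q orbits of Jupiter has a semi-major axis a = a_J × (q/p)^(2/3). A minimal sketch, using only Jupiter’s semi-major axis as input:

    A_JUPITER_AU = 5.204  # Jupiter's semi-major axis in astronomical units

    def resonance_semimajor_axis(p, q, a_jupiter=A_JUPITER_AU):
        """Semi-major axis (AU) where an asteroid completes p orbits
        for every q orbits of Jupiter (Kepler's third law)."""
        return a_jupiter * (q / p) ** (2 / 3)

    for p, q in [(3, 1), (5, 2), (7, 3), (2, 1)]:
        print(f"{p}:{q} resonance -> {resonance_semimajor_axis(p, q):.2f} AU")
    # Prints 2.50, 2.82, 2.96, and 3.28 AU: the classic Kirkwood gaps.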

We’re gaining a better understanding of the solar system by studying the relics of asteroids, its oldest constituents, said Heck. But this analysis should be extended to other types of meteorites that don’t contain chromite grains, he said. “This method only looks at certain types of meteorites. It’s far from a complete picture.”

—Katherine Kornei (@KatherineKornei), Science Writer

U.S. Data Centers Rely on Water from Stressed Basins

EOS - Mon, 07/12/2021 - 13:42

Thanks to our ever-increasing reliance on the Internet, the amount of data online is skyrocketing. The global data volume is expected to grow sixfold from 2018 to 2025. It might seem like that information is swirling in the cloudy sky, but it’s stored in physical data centers.

Landon Marston, an assistant professor at Virginia Tech, recently noticed news articles addressing the growing energy requirements of the data center industry. As an expert in water resources engineering, he wondered how those energy requirements translated into water consumption. “We know data centers use a lot of energy, and energy uses a lot of water. So how much water is being used?” said Marston. “We suspected that there could be large impacts at a very local scale, but there hadn’t really been a spatially detailed analysis looking at the environmental impact of data centers.”

In a study recently published in Environmental Research Letters, Marston and colleagues attempted to map how and where data centers consume energy and water in the United States. The results showed that it takes a large amount of water to support the cloud and that the water often comes from water-stressed basins.

Connecting Water Consumption to Data Centers

The researchers identified over 100,000 data centers using previous data from the Lawrence Berkeley National Laboratory and the websites of commercial data centers. While most of the data centers are small operations run by individual companies, the majority of servers in the United States are housed in fewer than 2,500 “colocation” and “hyperscale” data centers, which store data for many companies and the public simultaneously. Hyperscale data centers are the biggest type of data center, typically housing over 5,000 servers, but are designed to be more energy efficient by using cutting-edge cooling methods and servers.

All data centers consume water directly (to cool the electronics at the site) and indirectly (through electricity generation at the power plants that service the sites). Using records from the U.S. Environmental Protection Agency and the U.S. Energy Information Administration, and data from previous academic studies, the researchers matched the data centers with their most likely sources of electricity and water. Then they estimated the data centers’ annual energy, direct water, and indirect water consumption based on their energy and cooling requirements. By piecing all this information together, “we can have a spatially explicit representation of the environmental footprints associated with each of the data centers,” said Marston.

They mapped the U.S. data center industry’s carbon footprint, water footprint, and water scarcity footprint. The last calculation accounts for the pressure that water consumption will put on a region based on local water availability and needs.
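As a rough sketch of that bookkeeping (not the study’s actual equations; every value below is invented), direct cooling water and indirect power plant water can be tallied per facility and then weighted by basin-level scarcity factors:

    def water_footprints(energy_mwh, cooling_water_m3,
                         water_intensity_m3_per_mwh,
                         site_scarcity, plant_scarcity):
        """Return (water footprint, water scarcity footprint) in cubic meters.

        Scarcity weights > 1 mark basins more stressed than the global
        average (an AWARE-style factor, assumed here for illustration)."""
        indirect = energy_mwh * water_intensity_m3_per_mwh  # at power plants
        footprint = cooling_water_m3 + indirect
        scarcity = site_scarcity * cooling_water_m3 + plant_scarcity * indirect
        return footprint, scarcity

    # A hypothetical facility in a stressed basin served by hydro-heavy power:
    wf, wsf = water_footprints(energy_mwh=120_000, cooling_water_m3=250_000,
                               water_intensity_m3_per_mwh=5.0,
                               site_scarcity=3.0, plant_scarcity=1.5)
    print(f"water footprint: {wf:,.0f} m3; scarcity footprint: {wsf:,.0f} m3")

Weighting the on-site and power plant terms separately is what lets a facility’s scarcity footprint exceed its raw water footprint when either basin is stressed.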

Hot, Dry, and Hydroelectric

The results revealed that data centers use water from 90% of watersheds in the United States. The water consumption of individual data centers varies dramatically depending on where they are located and their electricity source. For example, data centers in the Southwest rely on water-heavy hydroelectric power, and the hot climate there leads to more evaporation compared with other regions in the country. Data centers in the cooler, wetter climates of the East Coast also tend to use more solar and wind energy, which require less water.

Of the total water footprint attributed to data centers, 75% was from indirect water use at power plants and 25% was from on-site water use. “This is important because most [data center] operators don’t really look at their power consumption as part of the overall water footprint,” said David Mytton, a researcher at Imperial College London and the Data Center Sustainability Research Team at the Uptime Institute. Mytton was not involved in the new study.

A. B. Siddik, a graduate student at Virginia Tech and the study’s lead author, explained that on-site water consumption has a bigger impact on the water scarcity footprint, indicating that many data centers are in water-stressed regions. “Most often they are in the middle of a desert, or in the Southwest, like California, Nevada, and Arizona,” said Siddik. “Those are hubs of data centers.” The overall water scarcity footprint was more than double the water footprint, suggesting that data centers in the United States disproportionately consume water from water-stressed regions.

Planning for the Digital Future

As the demand for data storage grows, so will the need for hyperscale data centers. Although these buildings are more efficient than smaller data centers, concentrating the energy and water demands in fewer locations could tax the local environment.

Further innovations in energy-efficient technology and investments in renewable energy will help curb energy and water usage, but Marston also recommended building new data centers in regions with smaller carbon and water scarcity footprints. “Simple real estate decisions could potentially be the solution here,” he said.

Technology companies have already tried out extreme locations for data centers. For example, Google converted an old mill in frigid northern Finland into a data center, and Microsoft experimented with putting data centers in the ocean. But according to the study, locations such as New York and southern Florida that have an abundance of water and renewable energy sources would have a lower environmental impact.

Mytton agreed that it’s important to consider the locations of future data centers, adding that climate change complicates these decisions because places that are not water stressed now might become drier and hotter over time. Plus, there are many other factors that contribute to where data centers are built, such as the local taxes, regulations, and workforce. Strategically placing data centers based on water resources is also an important economic consideration for the industry, Marston said, because water-stressed regions are prone to electricity blackouts and brownouts, which are detrimental to the operation of data centers.

“Data [are] so critical to the way our society functions, and data centers underpin all that,” Marston said. “It’s not just about the environmental footprint. It’s also a potential risk for these data centers.”

—Andrew Chapman (@andrew7chapman), Science Writer

Tree Rings Show Record of Newly Identified Extreme Solar Activity Event

EOS - Mon, 07/12/2021 - 13:42

The Sun constantly emits a stream of energetic particles, some of which reach Earth. The density and energy of this stream form the basis of space weather, which can interfere with the operation of satellites and other spacecraft. A key unresolved question in the field is the frequency with which the Sun emits bursts of energetic particles strong enough to disable or destroy space-based electronics.

One promising avenue for determining the rate of such events is the dendrochronological record. This approach relies on the process by which a solar energetic particle (SEP) strikes the atmosphere, causing a chain reaction that results in the production of an atom of carbon-14. This atom subsequently can be incorporated into the structure of a tree; thus, the concentration of carbon-14 atoms in a tree ring can indicate the impact rate of SEPs in a given year.

To date, three events of extreme SEP production are well described in the literature, occurring approximately in the years 660 BCE, 774–775 CE, and 992–993 CE. Each event was roughly an order of magnitude stronger than any measured in the space exploration era. Miyake et al. describe such an event, which occurred between 5411 BCE and 5410 BCE. Because of this burst, atmospheric carbon-14 increased 0.6% year over year in the Northern Hemisphere, and the elevated level was sustained for several years before dropping back to typical values.
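In principle the detection is simple: scan an annual radiocarbon series for a year-over-year jump far above the record’s normal variability. A toy sketch (the values are invented; a jump of about 6 per mil in Delta-14C corresponds to the roughly 0.6% rise reported):

    # Hypothetical per-mil Delta-14C values by year (negative = BCE).
    delta14c = {
        -5415: 60.1, -5414: 60.3, -5413: 59.9, -5412: 60.2,
        -5411: 60.0, -5410: 66.2, -5409: 66.0, -5408: 65.8,
    }

    years = sorted(delta14c)
    for prev, year in zip(years, years[1:]):
        # Delta-14C is in per mil, so a ~6 per mil jump is a ~0.6% increase
        # in the atmosphere's 14C/12C ratio.
        rise = (delta14c[year] - delta14c[prev]) / 1000.0
        if rise > 0.003:  # arbitrary detection threshold of 0.3% per year
            print(f"possible SEP event between {abs(prev)} and {abs(year)} BCE:"
                  f" +{rise:.1%}")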

The authors deduced the presence of this event by using samples collected from trees in three widely dispersed locales: a bristlecone pine in California, a Scotch pine in Finland, and a European larch in Switzerland. Each sample had its individual tree rings separated, and material from each ring underwent accelerator mass spectrometry to determine its carbon-14 content.

Using statistical methods, the researchers identified a pattern of small carbon-14 fluctuations consistent with the Sun’s 11-year solar cycle; the event recorded in the tree ring occurred during a time of solar maximum. Notably, other evidence suggests that the Sun was also undergoing a decades-long period of increasing activity.

If an extreme SEP burst is indeed the cause of the additional carbon-14, then these observations could aid in forecasting future events. However, tree ring measurements cannot rule out other extraterrestrial causes, such as a nearby supernova explosion. Confirmation will require isotopic measurements of beryllium and chlorine taken from ice cores, according to the authors. (Geophysical Research Letters, https://doi.org/10.1029/2021GL093419, 2021)

—Morgan Rehnberg, Science Writer

Reconstructing Rocks with Machine Learning

EOS - Mon, 07/12/2021 - 11:30

Digital rock physics follows a two-step paradigm: first take a digital image of a rock, then perform computer simulations using the digital image. The approach has many applications, including hydrogeology and geologic carbon dioxide sequestration. The imaging portion of the task can be costly because high-resolution images of 3D rocks often must be pieced together from many images of 2D rock slices.

You et al. [2021] utilize a machine learning technique called a “progressive growing generative adversarial network” (or PG-GAN) to reduce the cost of producing high-resolution 3D rock images. The PG-GAN learns to generate realistic, high-dimensional rock images from noise in a low-dimensional space. A given rock image can be reconstructed by finding an optimal point in the low-dimensional space. Performing interpolation of the rock images directly results in a low-quality reconstruction, but the PG-GAN produces a high-quality result after interpolation in the low-dimensional space. Using the PG-GAN to interpolate in the low-dimensional space enables the accurate digital reconstruction of a rock using fewer 2D slices, which reduces the cost of the process.
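The core trick, interpolating in latent space rather than pixel space, can be illustrated with a toy stand-in generator (training a real PG-GAN is far beyond a few lines; everything below is schematic):

    import numpy as np

    # G stands in for a trained PG-GAN generator: any fixed map from a
    # low-dimensional latent code to a pore/grain image will do here.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((64 * 64, 32))  # weights of a toy "generator"

    def G(z):
        """Toy generator: latent code (32,) -> binary rock slice (64, 64)."""
        return (W @ z > 0).astype(float).reshape(64, 64)

    z_top, z_bottom = rng.standard_normal(32), rng.standard_normal(32)

    # Pixel-space interpolation blends two slices into unphysical gray values;
    # latent-space interpolation keeps the intermediate slice binary.
    t = 0.5
    pixel_interp = t * G(z_top) + (1 - t) * G(z_bottom)
    latent_interp = G(t * z_top + (1 - t) * z_bottom)
    print("pixel-space values:", np.unique(pixel_interp))    # [0.  0.5 1. ]
    print("latent-space values:", np.unique(latent_interp))  # [0. 1.]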

Citation: You, N., Li, Y. E., & Cheng, A. [2021]. 3D carbonate digital rock reconstruction using progressive growing GAN. Journal of Geophysical Research: Solid Earth, 126, e2021JB021687. https://doi.org/10.1029/2021JB021687

This research article is part of a cross-journal special collection on “Machine Learning for Solid Earth Observation, Modeling, and Understanding”.

—Daniel O’Malley, Associate Editor, JGR: Solid Earth

Getting to the Bottom of Trawling’s Carbon Emissions

EOS - Fri, 07/09/2021 - 12:50

Bottom trawling, a controversial fishing practice in which industrial boats drag weighted nets through the water and along the ocean floor, can unintentionally dig up seafloor ecosystems and release sequestered carbon within the sediments. For the first time, researchers have attempted to estimate globally how this fishing technique may be remineralizing stored carbon that, as the seabed is tilled, ends up back in the water column and possibly the atmosphere, where it would contribute to climate change.

“The ocean is one of our biggest carbon sinks,” said Trisha Atwood, who researches aquatic ecology at Utah State University. “So when we put in more human-induced CO2 emissions, whether that’s directly dumping CO2 into deep waters or whether that’s trawling and enhancing remineralization of this carbon, we’re weakening that sink.”

Atwood helped build a model that shows that bottom trawling may be releasing as much as 1.5 billion metric tons of aqueous carbon dioxide (CO2) annually, equal to what is released on land through farming. Her work was part of a paper recently published in Nature that presents a framework for prioritizing the creation of marine protected areas to restore ocean biodiversity and maximize carbon storage and ecosystem services.

Estimating Carbon Loss from the Ocean Floor

To create the model, Atwood and her coauthors first needed to figure out how much of the ocean floor is dredged by trawlers. They turned to data from the nonprofit Global Fishing Watch, which recently began tracking fishing activity around the world and compiled data on industrial trawlers and dredgers from 2016 to 2019.

The next step was to find data on how much carbon is stored in the world’s ocean sediments. Because that information was not readily available, Atwood and colleagues built a data set by analyzing thousands of sediment cores that had been collected over the decades.

Last, they dug through the scientific literature, looking at studies that examined whether disturbances to the soil in coastal ecosystems, such as seagrasses, mangroves, and salt marshes, exposed carbon that was once deep in marine sediments and enhanced carbon production in the ocean.

A group of twin-rigged shrimp trawlers in the northern Gulf of Mexico off the coast of Louisiana. The trawlers are trailed by a plume of sediment, suggesting that their nets are scraping against the seafloor. Credit: SkyTruth Galleries, CC BY-NC-SA 2.0

“We lean really heavily on that literature,” said Atwood. “We used a lot of the equations [in previous papers] to build our model and extend it into the seabeds in these more open ocean locations. And from there, we were able to come up with this first estimate.”
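For intuition only, a back-of-the-envelope version of such an estimate multiplies the trawled area by the carbon disturbed and the fraction remineralized. Every number below is an assumption chosen for illustration, not a value from the paper:

    # Back-of-the-envelope only; all values are assumptions for illustration.
    TRAWLED_AREA_KM2 = 4.9e6       # assumed annual global trawl footprint
    DISTURBANCE_DEPTH_M = 0.02     # assumed sediment depth mixed by trawl gear
    CARBON_DENSITY_KG_M3 = 30.0    # assumed organic carbon in surface sediment
    REMINERALIZED_FRACTION = 0.15  # assumed share converted to aqueous CO2

    carbon_kg = (TRAWLED_AREA_KM2 * 1e6            # km2 -> m2
                 * DISTURBANCE_DEPTH_M
                 * CARBON_DENSITY_KG_M3
                 * REMINERALIZED_FRACTION)
    co2_gt = carbon_kg * (44 / 12) / 1e12          # kg C -> Gt CO2
    print(f"~{co2_gt:.1f} Gt CO2 per year")        # same order as the study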

Their investigation did not attempt to determine whether sequestered carbon that has been released by bottom trawling remains in the water column or is released into the atmosphere, although the authors noted potential problems either way: In the paper, they wrote that the released carbon is likely to increase ocean acidification, limit the ocean’s buffering capacity, and even add to the buildup of atmospheric CO2.

Atwood and the lead author of the paper, Enric Sala, a conservation ecologist who is also a National Geographic Explorer-in-Residence, are working with Tim DeVries, who studies ocean biogeochemistry at the University of California, Santa Barbara, and scientists at NASA’s Goddard Space Flight Center to build atmospheric models to try to figure out where the released carbon goes.

Existing Trawling Data May Be Too Scant

Not everyone, however, is convinced that Atwood and Sala’s model on bottom trawling and loss of carbon sequestration in marine sediments is accurate. Sarah Paradis, who is studying the effects of bottom trawling on the seafloor for her Ph.D. at the Institute of Environmental Science and Technology in Barcelona, is skeptical.

In an email to Eos, Paradis noted that since the 1980s, there have been fewer than 40 studies that address the impacts that bottom trawling has on sedimentary organic carbon. These few studies are not enough to build a model on, she said, and in addition, the studies reach different conclusions. Some studies have observed that bottom trawling decreases organic carbon content of the seafloor, whereas others show it increases organic carbon.

In addition, Paradis wrote that lower organic carbon on the seafloor does not necessarily mean the carbon was remineralized to CO2. Rather, it could simply mean that organic carbon was lost through erosion, in which case the carbon moves to another area of the seabed but very little is remineralized into CO2. She pointed to several studies, including one that she was a part of, that showed loss of organic carbon through erosion.

“I want to emphasize that [the authors] address a very important issue regarding how bottom trawling, a ubiquitous and very poorly-regulated anthropogenic activity, is affecting the seafloor,” she wrote. “But the values they propose are far from being credible.”

Atwood disagreed. “We don’t need lots of studies on the effects of trawling because we built our model using decades of carbon cycling research,” she wrote in an email to Eos. “Trawling is simply a perturbation that mixes and re-suspends sediments, leading to increases in carbon availability. All we needed to know about trawling to apply a carbon model to it is where trawling occurs and how deep in the sediment the trawls go.”

In addition, Atwood said, “We in no way intended our model to be the end-all in the trawling conversation. We hope that many more studies will come along that help produce more localized results.”

—Nancy Averett (@nancyaverett), Science Writer

Virtual Tours Through the Ice Using Everyday Tools

EOS - Fri, 07/09/2021 - 12:48

You know you are on to something special when researchers who have traveled to and experienced the wonder of some of the most remote places on Earth are captivated by a tool that takes them there virtually.

Earth’s cryosphere, including its ice sheets, ice caps, glaciers, and sea ice, is undergoing stark changes as air temperatures continue to rise. Scientists who study these regions understand viscerally the scale and scope of these changes, but they encounter limitations in communicating their experiences and observations to the public. Digital learning tools and online scientific data repositories have greatly expanded over the past decade, but there are still few ways for the public to explore rapidly changing icy environments through a realistic platform that provides contextual information, supplemental media, and connections to data sets.

Create Your Own Virtual Field Experience

Byrd Center’s instructional guide
Record: GoPro MAX 360° waterproof VR camera
Edit footage: Adobe Premiere Pro
Add location data: Dashware
Host videos to access in tour: Vimeo
Add 3D objects: Sketchfab
Add the virtual tour overlay: 3DVista

The Virtual Ice Explorer (VIE) aims to bring the public closer to these important places. Developed by the Education and Outreach (E&O) team at the Byrd Polar and Climate Research Center, VIE encourages informal learning about icy environments by immersing visitors in “choose your own adventure” tours. Click on the globe on the home page and head to, for example, the Multidisciplinary Drifting Observatory for the Study of Arctic Climate (MOSAiC) expedition that intentionally froze its ship into Arctic sea ice for a year of observations last year. You’ll land on the deck of the icebreaker R/V Polarstern overlooking the ice camp—no long voyage required. Next, you can visit scientists in action, sampling Arctic waters up to 1,000 meters below the ocean surface through a hole drilled in the ice. Or maybe you’d like to see how researchers spend their off hours with a little snow soccer. These options offer visitors a glimpse into the daily lives of scientists in the field as they fill in the blanks about what researchers study in these extraordinary locations and why it matters to our understanding of the planet.

DIY-ing the First VIE

VIE was originally conceived as a platform to display immersive tours of about a dozen glacial sites around the world, generated from digital elevation models draped with satellite imagery. Following setbacks caused by the poor quality of the virtual landscapes created from satellite data and by a challenging user experience, two opportunities allowed us to reenvision VIE: (1) the acquisition of rugged, easy-to-use field cameras and (2) our discovery of existing commercial software with which we could more easily create the tours that had previously been painstakingly built with custom code. We also began involving researchers who had visited these sites firsthand; their experiences turned out to be essential for our tour development.

Our team purchased a GoPro Fusion 360° camera by way of a generous donation to the Byrd Center. At the same time, Michael Loso, a geologist with the U.S. National Park Service, was planning to spend a field season at Kennicott Glacier in Alaska. Loso agreed to take the camera and capture footage. We shared his footage using a virtual reality (VR) headset during Byrd Center outreach events and with park visitors, and also collected feedback. We were particularly moved by a visitor who appreciated that the tour allowed them to explore a site that was otherwise inaccessible due to a physical disability.

These rugged, inexpensive, and relatively easy-to-use cameras come with their own software and have a multitude of third-party programs available. Researchers can set them up, hit record, and walk away to focus on their work. This ease of use in the field was an essential criterion if we were to ask scientists to carry the cameras along on expeditions. After capturing and rendering video using GoPro’s software, we use tools like Adobe Premiere Pro for additional editing, Dashware for accessing location data, and Plotly’s Chart Studio for graphing and hosting interactive data sets.

A workshop run by Ryan Hollister, a high school educator, during the 2019 Earth Educators’ Rendezvous also led to tremendous advances in our team’s ability to create VIE tours. Hollister showed off the immersive virtual field experience he created for the Columns of the Giants in California and walked attendees through designing their own experiences. After collecting 24 panoramic images with a Nikon D750 camera, Hollister stitched them together to create a 360° image using Autopano Giga software. He then used 3DVista software to add an interactive overlay to the images that allowed users to move to different locations within a site, click on marked points of interest and read about them, and view embedded 3D models of rock samples. The software was originally designed for architects and real estate professionals to create virtual tours of buildings, and it constructs the computer code underpinning tours of landscapes just as seamlessly. Today 3DVista caters to wider audiences, including educators, and it provides services such as live guided tours and hosting capabilities.

The 3DVista software allowed us to create glacial landscape tours that we had been building with customized computer code, but in far less time. Use of off-the-shelf software allowed us to spend more time collecting footage, creating compelling narratives, and testing a wider range of scene designs. In the future, we plan to use 3DVista’s user-tested interface to train educators and researchers to create their own tours.

Getting Scientists Camera-Ready

The E&O team now trains Byrd Center researchers with the cameras on basic photography techniques and more specific 360° filming techniques to capture high-quality video for VIE and other outreach programs. We want researchers to illustrate the vast, unique landscapes in which they’re working as well as showcase engaging scenes from their day-to-day experiences. We train them to create compositions to fill a scene, such as the inclusion of people to provide scale and demonstrate the research process, and we encourage them to film all parts of the expedition, including the journey, their living conditions, and interactions with collaborators and local partners.

We also have conversations with expedition members on the nature of their research, the field site itself, the equipment that will be on-site, and the desired impact of our outreach so that we can coproduce a narrative that guides what they film. These training sessions help the E&O team consider unique content for each tour, such as maps of study sites, time-lapse series, information on samples and equipment, biographies of researchers, links to publications, and prominent messages that properly identify and give respect to the people and places shown.

A benefit of having researchers explore virtual tours of other sites before they embark on their journey is that it generates genuine enthusiasm to share their own experiences. Chris Gardner, a Byrd Center researcher, viewed a tour of ice core drilling on Mount Huascarán in Peru while preparing to lead an expedition to the Dry Valleys of Antarctica during the 2019–2020 field season. Once he could see what was possible, he met with the E&O team to develop a video capture plan. Importantly, Gardner involved his entire team in selecting shots, recording video, and contributing to the tour narrative.

This photo of scientists (left to right) Chris Gardner, Adolfo Calero, and Melisa Diaz on a 2019 expedition to Dry Valleys in Antarctica welcomes visitors on the Virtual Ice Tour by the Byrd Polar and Climate Research Center called “McMurdo Dry Valleys, Victoria Land, Antarctica.” Credit: Chris Gardner (photo); Byrd Polar and Climate Research Center (screen capture)

Authors Kira Harris and Kasey Krok have participated in many of these training sessions as undergraduate interns on the E&O team. They found that these sessions offered opportunities for pivotal interpersonal interactions among group members, including undergraduate and graduate students, postdocs, and investigators. Students gained a better understanding of the science that researchers were carrying out, while getting an opportunity to share their sometimes more finely honed technical experience in video and photography.

High-Quality Tours With a Low Lift

As of this writing, the Byrd Center has created virtual field experiences for eight sites, thanks to collaboration with the National Park Service, the Ohio Department of Natural Resources, and the many scientists who filmed their field campaigns. Additional examples of virtual field experiences by other groups include VR Glaciers and Glaciated Landscapes by Des McDougall at the University of Worcester; The Hidden Worlds of the National Parks by the National Park Service; and these immersive virtual field trips by Arizona State University. More are being developed all the time. At AGU’s Fall Meeting 2020, for example, there were numerous oral sessions and posters highlighting the applications of virtual field experiences.

Our E&O team has published an instructional guide for educators and scientists to use to build their own virtual field experiences tailored to their initiatives, using the same workflow that we use. Ryan Hollister has several resources, including guides on the technical requirements for high-resolution tours, creating 3D models of rock samples, and how to use the Next Generation Science Standards to best adapt immersive virtual field experiences for all learners. Our team also continues to test new tour features that will increase user engagement, knowledge retention, and options for user interaction. Last year, while closed to the public due to the COVID-19 pandemic, we even created a virtual tour of the Byrd Center to continue reaching out to the thousands of individuals who typically visit our facility each year.

What’s most exciting is that these virtual explorations allow individuals almost anywhere in the world—regardless of their wealth, abilities, or learning preferences—to experience new landscapes and engage with Earth science lessons. While you can get the best view of the tours on a VR headset, all you need is a modest Internet connection and a laptop, tablet, or smartphone. These tours can be developed to specifically put visitors into the role of scientist to observe the terrain and use the provided data to make evidence-based claims.

This virtual field access enables individuals of all ages to get a taste of field research, appreciate the daily work of scientists, and gain a deeper understanding of our rapidly altering natural world. Although nothing can truly replicate an in-person field experience, virtual tours can be used to enhance educational opportunities in cases where people would otherwise not have access to those experiences, such as in large introductory courses, socially distanced laboratory exercises, or locations that need protection from oversampling or ecotourism. We can’t always bring people to the far reaches of the world, but we now have many tools to bring the vast world to each of us.

Acknowledgments

Funding for this project was provided by the National Science Foundation under award 1713537 and a contribution from ENGIE/Ohio State Energy Partners. We thank the MOSAiC expedition and the NSF Office of Polar Programs for their continued collaboration on this project.

Author Information

Kira Harris (kiraharris@email.arizona.edu), University of Arizona, Tucson; Kasey Krok, Byrd Polar and Climate Research Center, Ohio State University, Columbus; Ryan Hollister, Turlock Unified School District, Turlock, Calif.; and Jason Cervenec, Byrd Polar and Climate Research Center, Ohio State University, Columbus

Previous Intra-oceanic Subduction Found Beneath South America?

EOS - Fri, 07/09/2021 - 11:30

High-velocity slabs deeper than 1,000 kilometers have been imaged beneath the Amazon by various tomographic studies and have been interpreted as a continuation of the present Nazca slab. Mohammadzaheri et al. [2021] propose a new interpretation of the slab pieces deeper than about 900 kilometers. Geodynamic and plate reconstruction analyses of a new global P wave tomography model (DETOX-P1, based on both travel time data and multifrequency waveform picks) suggest that these 900- to 1,800-kilometer-deep high-velocity anomalies are actually remnants of a west-dipping intra-oceanic subduction zone active during Late Jurassic and Early Cretaceous times, when South America’s paleo-position was near Africa, before the start of the present, east-dipping Andean subduction around 85 million years ago. This supports the hypothesis that slabs in the lower mantle sink vertically, with implications for models of plate motion reconstructions.

Citation: Mohammadzaheri, A., Sigloch, K., Hosseini, K., & Mihalynuk, M. G. [2021]. Subducted lithosphere under South America from multifrequency P wave tomography. Journal of Geophysical Research: Solid Earth, 126, e2020JB020704. https://doi.org/10.1029/2020JB020704

—Marcelo Assumpção, Associate Editor, JGR: Solid Earth

Good, Soon, and Cheap – Earthquake Early Warning by Smartphone

EOS - Thu, 07/08/2021 - 13:50

Even a short warning of an earthquake can be crucial in protecting lives and infrastructure, so there is great interest in developing systems for earthquake early warning. Any such system must be reliable, balancing sensitivity for events against factors such as user tolerance for false alarms in which no shaking is felt. The task is complicated by the need for relatively dense sensor coverage not only where people reside but also in adjacent seismogenic regions. Achieving that coverage with typical scientific-grade instruments is expensive, and such costs are prohibitive in many countries where resources are limited.

Brooks et al. [2021] describe very encouraging results from Costa Rica, where the ASTUTI network (Alerta Sismica Temprana Utilizando Teléfonos Inteligentes, or Earthquake Early Warning Utilizing Smartphones) uses a fixed network of smartphones. Their data indicate that such low-cost networks can be highly effective and can be installed and operated at relatively low cost, bringing the benefits of early warning to a broader portion of the world’s population.
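The paper’s detection scheme isn’t detailed here, but the classic trigger for low-cost accelerometer networks is the short-term-average/long-term-average (STA/LTA) ratio. A sketch on synthetic data, offered as an illustration rather than ASTUTI’s actual algorithm:

    import numpy as np

    def sta_lta_trigger(accel, fs, sta_s=1.0, lta_s=30.0, threshold=4.0):
        """Flag samples where mean |acceleration| over the last sta_s seconds
        exceeds threshold times the mean over the preceding lta_s seconds."""
        energy = np.abs(accel)
        sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
        csum = np.concatenate(([0.0], np.cumsum(energy)))
        idx = np.arange(sta_n + lta_n, energy.size)
        sta = (csum[idx + 1] - csum[idx + 1 - sta_n]) / sta_n
        lta = (csum[idx + 1 - sta_n] - csum[idx + 1 - sta_n - lta_n]) / lta_n
        return idx[sta > threshold * lta]

    # Synthetic record: background noise, then shaking starting at t = 60 s.
    fs = 100  # samples per second
    t = np.arange(0, 120, 1 / fs)
    accel = 0.01 * np.random.default_rng(7).standard_normal(t.size)
    accel[t >= 60] += 0.5 * np.sin(2 * np.pi * 3 * t[t >= 60])  # 3 Hz shaking
    hits = sta_lta_trigger(accel, fs)
    print(f"first trigger at t = {t[hits[0]]:.2f} s" if hits.size else "none")

A real network would also corroborate triggers across many phones before issuing an alert, suppressing false alarms from local bumps and handling.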

Citation: Brooks, B., Protti, M., Ericksen, T. et al. [2021]. Robust Earthquake Early Warning at a Fraction of the Cost: ASTUTI Costa Rica. AGU Advances, 2, e2021AV000407. https://doi.org/10.1029/2021AV000407

 —Peter Zeitler, Editor, AGU Advances

Realizing Machine Learning’s Promise in Geoscience Remote Sensing

EOS - Thu, 07/08/2021 - 12:25

In recent years, machine learning and pattern recognition methods have become common in Earth and space sciences. This is especially true for remote sensing applications, which often rely on massive archives of noisy data and so are well suited to such artificial intelligence (AI) techniques.

As the data science revolution matures, we can assess its impact on specific research disciplines. We focus here on imaging spectroscopy, also known as hyperspectral imaging, as a data-centric remote sensing discipline expected to benefit from machine learning. Imaging spectroscopy involves collecting spectral data from airborne and satellite sensors at hundreds of electromagnetic wavelengths for each pixel in the sensors’ viewing area.
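In data terms, each acquisition is a three-dimensional cube: two spatial axes and one spectral axis, with a full spectrum behind every pixel. A minimal sketch (the dimensions echo an AVIRIS-style sensor; the values are random placeholders):

    import numpy as np

    lines, samples, bands = 512, 614, 224        # an AVIRIS-like scene shape
    wavelengths = np.linspace(380, 2510, bands)  # nanometers, visible to SWIR
    cube = np.random.default_rng(2).random((lines, samples, bands))

    spectrum = cube[100, 200, :]  # the full reflectance spectrum of one pixel
    nir_image = cube[:, :, np.argmin(np.abs(wavelengths - 860))]  # one band
    print(spectrum.shape, nir_image.shape)  # (224,) (512, 614)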

Since the introduction of imaging spectrometers in the early 1980s, their numbers and sophistication have grown dramatically, and their application has expanded across diverse topics in Earth, space, and laboratory sciences. They have, for example, surveyed greenhouse gas emitters across California [Duren et al., 2019], found water on the moon [Pieters et al., 2009], and mapped the tree chemistry of the Peruvian Amazon [Asner et al., 2017]. The data sets involved are large and complex. And a new generation of orbital instruments, slated for launch in coming years, will provide global coverage with far larger archives. Missions featuring these instruments include NASA’s Earth Surface Mineral Dust Source Investigation (EMIT) [Green et al., 2020] and Surface Biology and Geology investigation [National Academies of Sciences, Engineering, and Medicine, 2019].

Researchers have introduced modern signal processing and machine learning concepts to imaging spectroscopy analysis, with potential benefits for numerous areas of geoscience research. But to what extent has this potential been realized? To help answer this question, we assessed whether the growth in signal processing and pattern recognition research, indicated by an increasing number of peer-reviewed technical articles, has produced a commensurate impact on science investigations using imaging spectroscopy.

Mining for Data

Following an established method, we surveyed all articles cataloged in the Web of Science [Harzing and Alakangas, 2016] since 1976 with titles or abstracts containing the term “imaging spectroscopy” or “hyperspectral.” Then, using a modular clustering approach [Waltman et al., 2010], we identified clustered bibliographic communities among the 13,850 connected articles within the citation network.
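As a toy version of that clustering step: the authors used the modularity-based method of Waltman et al. [2010] as implemented in VOSviewer; networkx’s greedy modularity routine stands in for it below, and the paper IDs are invented.

    import networkx as nx

    # Edges link papers that cite one another (toy citation network).
    G = nx.Graph()
    G.add_edges_from([
        ("ml_1", "ml_2"), ("ml_2", "ml_3"), ("ml_1", "ml_3"),        # methods
        ("eco_1", "eco_2"), ("eco_2", "eco_3"), ("eco_1", "eco_3"),  # Earth sci
        ("ml_3", "eco_1"),                                # rare cross-citation
    ])
    communities = nx.algorithms.community.greedy_modularity_communities(G)
    for i, members in enumerate(communities):
        print(f"cluster {i}: {sorted(members)}")
    # Two clusters emerge, mirroring the self-citing silos of Figure 1.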

We found that these articles fall into several independent and self-citing groups (Figure 1): optics and medicine, food and agriculture, machine learning, signal processing, terrestrial Earth science, aquatic Earth science, astrophysics, heliophysics, and planetary science. The articles in two of these nine groups (signal processing and machine learning) make up a distinct cluster of methodological research investigating how signal processing and machine learning can be used with imaging spectroscopy, and those in the other seven involve research using imaging spectroscopy to address questions in applied sciences. The volume of research has increased recently in all of these groups, especially those in the methods cluster (Figure 2). Nevertheless, these methods articles have seldom been cited by the applied sciences papers, drawing more than 96% of their citations internally but no more than 2% from any applied science group.

Fig. 1. Research communities tend to sort themselves into self-citing clusters. Circles in this figure represent scientific journal publications, with the size proportional to the number of citations. Map distance indicates similarity in the citation network. Seven of nine total clusters are shown; the other two (astrophysics and heliophysics) were predominantly isolated from the others. Annotations indicate keywords from representative publications. Image produced using VOSviewer.

The siloing is even stronger among published research in high-ranked scholarly journals, defined as having h-indices among the 20 highest in the 2020 public Google Scholar ranking. Fewer than 40% of the articles in our survey came from the clinical, Earth, and space science fields noted above, yet these fields produced all of the publications in top-ranked journals. We did not find a single instance in which one of those papers in a high-impact journal cited a paper from the methods cluster.

Fig. 2. The number of publications per year in each of the nine research communities considered is shown here.

A Dramatic Disconnect

From our analysis, we conclude that the recent boom in machine learning and signal processing research has not yet made a commensurate impact on the use of imaging spectroscopy in applied sciences.

A lack of citations does not necessarily imply a lack of influence. For instance, an Earth science paper that borrows techniques published in a machine learning paper may cite that manuscript once, whereas later studies applying the techniques may cite the science paper rather than the progenitor. Nonetheless, it is clear that despite constituting a large fraction of the research volume having to do with imaging spectroscopy for more than half a decade, research focused on machine learning and signal processing methods is nearly absent from high-impact science discoveries. This absence suggests a dramatic disconnect between science investigations and pure methodological research.

Research communities focused on improving the use of signal processing and machine learning with imaging spectroscopy have produced thousands of manuscripts through person-centuries of effort. How can we improve the science impact of these efforts?

Lowering Barriers to Entry

We have two main recommendations. The first is technical. The methodology-science disconnect is symptomatic of high barriers to entry for data science researchers to engage applied science questions.

Imaging spectroscopy data are still expensive to acquire, challenging to use, and regional in scale. Most top-ranked journal publications are written by career experts who plan and conduct specific acquisition campaigns and then perform each stage of the collection and analysis. This effort requires a chain of specialized steps involving instrument calibration, removal of atmospheric interference, and interpretation of reflectance spectra, all of which are challenging for nonexperts. These analyses often require expensive and complex software, raising obstacles for nonexpert researchers to engage cutting-edge geoscience problems.

In contrast, a large fraction of methodological research related to hyperspectral imaging focuses on packaged, publicly available benchmark scenes such as the Indian Pines [Baumgardner et al., 2015] or the University of Pavia [Dell’Acqua et al., 2004]. These benchmark scenes reduce multifaceted real-world measurement challenges to simplified classification tasks, creating well-defined problems with debatable relevance to pressing science questions.

Not all remote sensing disciplines have this disconnect. Hyperspectral imaging, involving hundreds of spectral channels, contrasts with multiband remote sensing, which generally involves only 3 to 10 channels and is far more commonly used. Multiband remote sensing instruments have regular global coverage, producing familiar image-like reflectance data. Although multiband instruments cannot measure the same wide range of phenomena as hyperspectral imagers, the maturity and extent of their data products democratize their use to address novel science questions.

We support efforts to similarly democratize imaging spectrometer data by improving and disseminating core data products, making pertinent science data more accessible to machine learning researchers. Open spectral libraries like SPECCHIO and EcoSIS exemplify this trend, as do the commitments by missions such as PRISMA, EnMAP, and EMIT to distribute reflectance data for each acquisition.

In the longer term, global imaging spectroscopy missions can increase data usage by providing data in a format that is user-friendly and ready to analyze. We also support open-source visualization and high-quality corrections for atmospheric effects to make existing hyperspectral data sets more accessible to nonexperts, thereby strengthening connections among methodological and application-based research communities. Recent efforts in this area include open source packages like the EnMAP-Box, HyTools, ISOFIT, and ImgSPEC.

Expanding the Envelope

Science investigators should partner with data scientists to pursue challenging (bio)geophysical investigations, thus broadening their technical tool kits and pushing the limits of what can be measured remotely.Our second recommendation is cultural. Many of today’s most compelling science questions live at the limits of detectability—for example, in the first data acquisition over a new target, in a signal close to the noise, or in a relationship struggling for statistical significance. The papers in the planetary science cluster from our survey are exemplary in this respect, with many focusing on first observations of novel environments and achieving the best high-impact publication rate of any group. In contrast, a lot of methodological work makes use of standardized, well-understood benchmark data sets. Although benchmarks can help to coordinate research around key challenge areas, they should be connected to pertinent science questions.

Journal editors should encourage submission of manuscripts reporting research about specific, new, and compelling science problems of interest while also being more skeptical of incremental improvements in generic classification, regression, or unmixing algorithms. Science investigators in turn should partner with data scientists to pursue challenging (bio)geophysical investigations, thus broadening their technical tool kits and pushing the limits of what can be measured remotely.

Machine learning will play a central role in the next decade of imaging spectroscopy research, but its potential in the geosciences will be realized only through engagement with specific and pressing investigations. There is reason for optimism: The next generation of orbiting imaging spectrometer missions promises global coverage commensurate with existing imagers. We foresee a future in which, with judicious help from data science, imaging spectroscopy becomes as pervasive as multiband remote sensing is today.

Acknowledgments

The research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with NASA (80NM0018D0004). Copyright 2021. California Institute of Technology. Government sponsorship acknowledged.

Sowing 1,000 Trees into Shanghai’s Urban Fabric

EOS - Thu, 07/08/2021 - 12:23

Towering over Suzhou Creek in Shanghai’s Putuo District, hundreds of pedestals double as giant planters holding trees and shrubs that shoot up from artificial mountains. Visitors will soon be able to look up at this aerial forest, with trees embedded in the building’s supports, as they walk, shop, eat, and work below in the new development.

The 1000 Trees project from Heatherwick Studio combines greenery with integral pieces of the structure. Expanding natural spaces has become a priority in Shanghai, which ranks among the most densely populated cities in the world. This density has come at the cost of green space: In Shanghai’s city center, green spaces shrank from 30.9 square kilometers to just 2.6 between 1980 and 2005. Efforts have slowly reversed that trend, and greenery has increased downtown as part of a push to make Shanghai an “ecological city” replete with forests and greenways. This undertaking is due in part to the ecosystem services that urban forests, especially trees, provide.

This composite image, assembled from satellite pixels captured between 2013 and 2017, shows the vast expanse of Shanghai at the expense of green space. Credit: NASA Earth Observatory

“They produce a huge cover that can alter the environment in terms of temperatures, in terms of pollution, and in terms of water flows because they intercept water and evaporate water. They store carbon as they grow, which affects climate change,” said David Nowak, a senior scientist with the U.S. Department of Agriculture Forest Service. Urban forests also provide habitats for wildlife and promote well-being, providing mental health benefits for city residents.

However, if not selected and planted carefully, trees planted in a city can have drawbacks, like adding more pollen or sometimes trapping air pollutants beneath their canopy rather than removing them. But, Nowak said, “generally the benefits outweigh the negatives.”

Designing for Nature

In 2012, Heatherwick Studio, headquartered in London, was invited to design a development between Shanghai’s M50 art district and a park bordering Suzhou Creek for Tian An China Investments Company Limited. Studio founder Thomas Heatherwick’s designs are famous for weaving trees and plants into structures—like bridges and buildings—that have historically replaced nature rather than integrated it.

Trying to incorporate the artistic flair of the M50 and the riverine environment, the Heatherwick team designed a massive commercial development covered in deciduous and evergreen trees. Some of the trees were grown in advance for the project in Chongming, a Shanghai island district in the Yangtze River.

Likened to the Hanging Gardens of Babylon, the large building looks like it was carved from two mountains, covered in an organized display of trees. At the site’s western end, phase 1 of the project, a 60-meter-tall mountain, houses a shopping mall and connects to phase 2, a separate mountain that will likely include office space and restaurants, among other amenities. Phase 1 will tentatively be ready for the public toward the beginning of 2022, but phase 2 is still under construction.

The 1000 Trees project shows that development doesn’t have to exist in place of nature. Usually, “when you put buildings everywhere, it’s either vegetation or buildings. They’re exclusive,” said Nowak. “In this case, they’re designing the vegetation space not as an afterthought; they put it right into the design.”

Seeing the Forest Through the Trees

Green buildings have value, but sustainable cities need more than environmentally conscious construction.Qicheng Zhong, a senior engineer at Shanghai Academy of Landscape Architecture Science and Planning, monitors ecosystem services provided by Shanghai’s green areas. From his perspective, green buildings are helpful for promoting nature, but 1000 Trees provides limited benefits to the urban environment. “They’re just little trees and shrubs, single trees, not an ecosystem,” said Zhong.

Green buildings have value, but experts like Zhong argue that sustainable cities need more than environmentally conscious construction. “We have to build or conserve more green areas, but we have to do that with wisdom,” he said. Supporting interconnected park, forest, and wetland systems throughout the city, for example, could advance Shanghai toward becoming a more sustainable, resilient city in a cost-effective way. Unlike structures that are built over natural landscapes, a network of natural areas in the city can provide vital ecosystem services such as absorbing excess rainwater through soil and supplying migration corridors for wildlife.

As a development, 1000 Trees has a lot of impervious surfaces and won’t provide a flood buffer that wetlands or even parks could provide. But according to a Tian An representative who asked not to be identified, they’re conscious of these concerns, and their development preserves 90% of riverside trees and continuity with a greenway near Suzhou Creek.

“We created a new green area in that space. That area has more possibility to be a green community, a creative community.”The 1000 Trees development also introduces unique challenges to engineers and architects. For example, trees need to be securely anchored to planters dozens of meters above the ground. Those planters restrict tree roots, limiting their growth, though the Tian An representative specified that this limitation keeps the trees at a safe, manageable height. Urban forestry experts suggested that tree replacement could present problems, as securing new trees high above the ground could be costly. According to the representative from Tian An, however, the trees have been planted for 3 years with minimal maintenance required, and they don’t anticipate replacing any for at least 5 years.

But urban forestry encompasses more than the natural environment, and Zhong said that 1000 Trees could provide a convenient place for people to socialize in nature within city limits. The representative from Tian An said 1000 Trees could be a better alternative to a conventional development project. “We created a new green area in that space. That area has more possibility to be a green community, a creative community.”

—Jackie Rocheleau (@JackieRocheleau), Science Writer

How Will Climate Change Affect the United States in Decades to Come?

EOS - Thu, 07/08/2021 - 12:21

This is an authorized translation of an Eos article.

(2017) Scientists have released a new report detailing how climate change is affecting weather and climate in the United States and how future climatic changes could unfold across the country.

The Climate Science Special Report (CSSR), produced by a U.S. government organization that coordinates and integrates federal research on changes in the global environment and their implications for society, also lays out the current state of the science on climate change and its physical effects.

“It is extremely likely that human influence has been the dominant cause of the observed warming since the mid-20th century.”“It is extremely likely that human influence has been the dominant cause of the observed warming since the mid-20th century,” the report concludes. “For the warming over the last century, there is no convincing alternative explanation supported by the extent of the observational evidence.”

And the observational evidence is manifold. Thousands of studies described in the report document rising surface, atmospheric, and oceanic temperatures; melting glaciers; diminishing snow cover; shrinking sea ice; rising sea level; ocean acidification; and increasing intensity and frequency of rainfall, hurricanes, heat waves, wildfires, and droughts. The report meticulously describes how these effects can be traced largely to human activities and the associated emissions of radiatively important gases and particles.

The cover of a recently released U.S. government report on climate science. Credit: Jesse Allen, NASA Earth Observatory/VIIRS/Suomi-NPP

Behind the report lies a broad scientific consensus: The further and faster the Earth system is pushed toward change, the greater the risk of unanticipated effects, some of which are potentially large and irreversible.

For example, without major reductions in emissions, the increase in annual average global temperature relative to preindustrial times could reach 9°F (5°C) or more by the end of this century. Although emission rates have slowed as economic growth has become less carbon intensive, this slowdown is not yet at a rate that would limit the global average temperature change to 3.6°F (2°C) above preindustrial levels by the end of the century.

And there is more. Sea level is likely to continue rising, and many severe weather events are likely to become more intense. Be prepared for more record high temperatures, including multiday heat waves, and for heavier precipitation when it rains or snows. Drought could grip the western United States over the coming decades. Atlantic and Pacific hurricanes are expected to become even more intense.

In other words, the report shows that our current emissions trajectories will carry our planet into a climate state very different from today’s, with profound effects for the United States.

An Authoritative Voice on the Climate Future of the United States

“The report is designed to serve as a foundation for efforts to assess climate-related risks and inform decisionmaking about responses.”The CSSR was created by the U.S. Global Change Research Program (USGCRP) as volume 1 of the Fourth National Climate Assessment (NCA4) [Wuebbles et al., 2017]. USGCRP oversaw the production of this independent report on the state of the science relating to climate change and its physical impacts. The CSSR is designed to be an authoritative assessment of climate change science, with a focus on the United States, to serve as a foundation for efforts to assess climate-related risks and inform decisionmaking about responses.

The CSSR serves several purposes, including providing (1) an updated and detailed analysis of findings on how climate change is affecting weather and climate in the United States, (2) an executive summary and 15 chapters that provide the basis for discussion of climate science, and (3) foundational information and projections for climate change, including extremes, to improve “end-to-end” consistency in sectoral, regional, and resilience analyses.

The CSSR integrates and assesses findings on climate science and discusses the uncertainties associated with these findings. It analyzes current trends in climate change, both human induced and natural, and projects major trends to the end of this century.

The National Oceanic and Atmospheric Administration (NOAA) is the administrative lead agency for the current report. Other agencies involved include NASA and the Department of Energy; representatives from national laboratories, universities, and the private sector also helped write the report.

The report went through several drafts and multiple rounds of review, including one by the public and expert reviews by the 13 USGCRP agencies and the National Academies of Sciences, Engineering, and Medicine. The result is a comprehensive document on the state of climate science, with assessments of statistically likely climate scenarios in the United States through the end of the century.

Advances in the Science Since the Last Assessment

The CSSR represents the most comprehensive assessment of the science carried out for a National Climate Assessment. As such, the report reflects a number of advances in climate science since the Third U.S. National Climate Assessment (NCA3) was published in 2014.

“Researchers can now more closely identify human influences on individual extreme weather and climate events.”For example, since NCA3, stronger evidence has emerged of continuing, rapid, human-caused warming of the global atmosphere and oceans. Researchers can now more closely identify human influences on individual extreme weather and climate events.

In addition, significant advances have been made in understanding extreme weather events in the United States and how they relate to rising global temperatures and associated climate changes. The new report also discusses the extent to which atmospheric circulation in the midlatitudes is changing or is projected to change, possibly in ways not captured by current climate models.

For the first time in the NCA process, sea level rise projections incorporate geographic variations based on factors such as local land subsidence, ocean currents, and changes in Earth’s gravitational field. In an analysis of potential risks, the CSSR found that both large-scale state changes in the climate system (sometimes called “tipping points”) and compound extremes have the potential to generate unanticipated climate surprises.

Report Highlights: The Global Perspective

At the heart of the report are some indisputable facts. The global atmospheric carbon dioxide (CO2) concentration is now everywhere above 400 parts per million (ppm), a level that last occurred about 3 million years ago, when both global mean temperature and sea level were significantly higher than today. Continued growth of human-produced CO2 emissions over this century and beyond would lead to an atmospheric concentration not experienced in tens to hundreds of millions of years.

What is more, the past 115 years are now the warmest period in at least the past 1,700 years. Globally averaged annual surface air temperature has risen by about 1.8°F (1.0°C) since 1901 (Figure 1).

Fig. 1. (left) Global annual average temperature has increased by more than 0.7°C (1.2°F) for the period 1986–2016 relative to 1901–1960. Red bars show temperatures above the 1901–1960 average, and blue bars indicate temperatures below that average. (right) Surface temperature change (in °F) for the period 1986–2016 relative to 1901–1960. Gray indicates missing data. Credit: CSSR, chapter 1, USGCRP

Many other aspects of global climate are changing. For example, global average sea level has risen by 7–8 inches since 1900, and almost half of that rise (about 3 inches) has occurred since 1993. Human-caused climate change has contributed substantially to this rise, driving a rate of rise greater than that of any preceding century in at least 2,800 years.

Global average sea level is expected to continue rising, by at least several inches over the next 15 years and by 1–4 feet by 2100. A rise of as much as 8 feet by 2100 cannot be ruled out.

What Does This Mean for the United States?

The annual average temperature in the contiguous United States has increased by 1.8°F (1.0°C) over the period from 1901 to 2016; over the next few decades (2021–2050), annual average temperatures are expected to rise by about 2.5°F for the United States relative to the recent past (the 1976–2005 average) under all plausible future climate scenarios.

The report documents how the higher temperatures projected for the United States and the world are generally expected to increase the intensity and frequency of extreme events. Changes in the characteristics of extreme events are particularly important for human safety, infrastructure, agriculture, water quality and quantity, and natural ecosystems.

Below are some of the areas in which the United States is expected to face profound changes. What is striking here is that events we now consider extreme may become the new normal by the end of the century.

Coastal flooding. Global sea level rise has already affected the United States; the incidence of daily tidal flooding is accelerating in more than 25 Atlantic and Gulf Coast cities. Sea level rise is expected to be higher than the global average in some parts of the United States, especially along the East and Gulf Coasts. This is due, in part, to changes in Earth’s gravitational field from melting land ice, changes in ocean circulation, and local subsidence.

Heavier precipitation events. Heavy precipitation, whether rain or snowfall, is increasing in intensity and frequency across the United States (Figure 2) and the world. These trends are expected to continue. The largest observed changes in extreme precipitation in the United States have occurred in the Northeast and the Midwest.

Fig. 2. Percentage changes in the amount of precipitation falling in very heavy events (the heaviest 1%) from 1958 to 2016 for the United States on a regional basis. There is a clear national trend toward a greater amount of precipitation being concentrated in very heavy events, particularly in the Northeast and Midwest. Credit: updated from NCA3; CSSR, chapter 7, USGCRP

Heat waves. Heat waves have become more frequent in the United States since the 1960s, whereas extreme cold temperatures and cold waves have become less frequent. Recent record-setting warm years are projected to become common in the near future for the United States as annual average temperatures continue to rise.

Wildfires. The incidence of large forest fires in the western contiguous United States and Alaska has increased since the early 1980s and is projected to increase further in those regions as the climate warms, with profound changes to regional ecosystems. The frequency of large wildfires is influenced by a complex combination of natural and human factors.

Drought. Annual trends toward earlier spring snowmelt and reduced snowpack are already affecting water resources in the western United States, with adverse effects for fisheries and electricity generation. These trends are expected to continue. Under higher emissions scenarios, and assuming no change in current water resource management, chronic, long-duration hydrological drought is increasingly possible before the end of this century.

Recent droughts and associated heat waves have reached record intensity in some U.S. regions. So far, however, the report notes that assessing the human effect on recent major U.S. droughts is complicated. Little evidence is found for a human influence on observed precipitation deficits, but much evidence is found for a human influence on surface soil moisture deficits, owing to increased evapotranspiration caused by higher temperatures.

Hurricanes. Physical processes suggest, and numerical model simulations generally confirm, an increase in tropical cyclone intensity in a warmer world, and Earth system models generally show an increase in the number of very intense tropical cyclones. For Atlantic and eastern North Pacific hurricanes, increases in precipitation rates and intensity are projected. The frequency of the most intense of these storms is projected to increase in the Atlantic, the western North Pacific, and the eastern North Pacific.

Atmospheric rivers. These narrow streams of moisture account for 30%–40% of typical annual snowpack and precipitation along the U.S. West Coast. They are also associated with severe flooding events when they release their moisture. The frequency and severity of landfalling atmospheric rivers will increase because rising temperatures increase evaporation, resulting in higher concentrations of atmospheric water vapor.

A Fate That Depends on Emissions

The magnitude of climate change beyond the next few decades will depend primarily on the amount of greenhouse gases (especially carbon dioxide) emitted globally. And without significant emissions cuts, annual average global temperatures are almost certain to rise more than 2°C above preindustrial levels by the end of the century.

“Choices made today will determine the magnitude of climate change risks beyond the next few decades.”In other words, the frequently stated goal of keeping globally averaged temperature change at or below this level to minimize potential impacts on humans and ecosystems can be achieved only through substantial emissions reductions before 2040: Choices made today will determine the magnitude of climate change risks beyond the next few decades.

With significant reductions in emissions, the increase in annual average global temperature could be limited to 3.6°F (2°C) or less. Figure 3 shows projected changes in U.S. temperature for two possible future scenarios.

The science is in, and the CSSR documents it in a manner as thorough as it is revealing. It also provides important input for the development of other parts of NCA4, which will focus primarily on human welfare and the societal, economic, and environmental dimensions of climate change. Volume II of NCA4, with an emphasis on climate change impacts, is slated for release in late 2018. (Editor: Volume II was published in 2018, and the Fifth National Climate Assessment (NCA5) is currently underway, with anticipated delivery in 2023.)

Fig. 3. Projected changes in annual average temperatures (°F) for North America under two representative concentration pathways (RCPs) identified in the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. RCPs are greenhouse gas concentration trajectories, so named because they represent the change in radiative forcing values (e.g., +4.5 watts per square meter) modeled for 2100 relative to preindustrial times. Shown here is the difference between average temperatures for the middle of the century (top) (2036–2065) or the end of the century (bottom) (2071–2100) and average temperatures for the near present (1976–2005). Each map depicts the weighted multimodel mean. Increases are statistically significant in all areas (i.e., more than 50% of the models show a statistically significant change, and more than 67% agree on the sign of the change). The analyses are based on downscaled (LOCA) analyses of models from the Coupled Model Intercomparison Project Phase 5. Credit: CSSR, chapter 6

Acknowledgments

Writing the CSSR required the concerted effort of a large, diverse, and experienced team of climate scientists from across the United States working over many months. USGCRP provided organization and guidance for the overall process, NOAA provided oversight as the lead agency, and NOAA’s National Centers for Environmental Information provided technical, editorial, and production support for the draft documents and the final product. We are grateful for the independent reviews by the public and those conducted by the National Academy of Sciences.

Hydrothermal Vents May Add Ancient Carbon to Ocean Waters

EOS - Wed, 07/07/2021 - 11:49

Earth’s oceans play a pivotal role in the global carbon cycle. As seawater moves and mixes, it stores and transports huge amounts of carbon in the form of dissolved organic and inorganic carbon molecules. However, the various sources and fates of marine dissolved organic carbon (DOC) are complex, and much remains to be learned about its dynamics—especially as climate change progresses.

Carbon isotope ratios can help determine the age of DOC, which gives clues to its source and journey through the carbon cycle. Photosynthetic organisms in surface waters are thought to produce most marine DOC, but radiocarbon dating shows that marine DOC is thousands of years old, so more information is needed to clarify how it mixes and lingers in the ocean.
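
The dating step itself is compact: the standard conversion from a measured fraction of modern carbon (F14C) to a conventional radiocarbon age uses the Libby mean life of 8,033 years. Below is a minimal sketch with invented values rather than the study’s measurements.

import math

def radiocarbon_age(f14c):
    """Conventional radiocarbon age (years BP) from fraction modern carbon."""
    return -8033.0 * math.log(f14c)  # 8033 yr is the Libby mean life

# Hypothetical fractions of modern carbon, not values from the study.
for f14c in (0.95, 0.70, 0.50):
    print(f"F14C = {f14c:.2f}  ->  apparent age ~ {radiocarbon_age(f14c):,.0f} years BP")

Smaller fractions of modern carbon map to older apparent ages; strongly depleted deep-water DOC thus signals long residence times, inputs of ancient carbon such as that from vents, or both.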

Relying on radiocarbon dating of seawater samples collected during a research cruise in 2016–2017, Druffel et al. provide new insights into DOC dynamics in the eastern Pacific and Southern Oceans. Their investigation lends support to a hypothesis that hydrothermal vents could be an important source of DOC in this region.

While traveling south aboard NOAA’s R/V Ronald H. Brown, the researchers collected seawater samples at multiple sites, ranging from a station near Antarctica to a site off the Pacific Northwest. Parts of their path followed the East Pacific Rise, a key area of hydrothermal activity off the west coast of South America.

Radiocarbon dating of the samples enabled construction of a profile of isotopic ratios found in both DOC and dissolved inorganic carbon (DIC) at various depths for each site studied. Analysis of these profiles showed that both forms of dissolved carbon age similarly as they are transported northward in deep waters. According to the authors, this suggests that northward transport is the main factor controlling the isotopic composition of both DOC and DIC in these deep waters.

Meanwhile, the radiocarbon data indicate that hydrothermal vents associated with the East Pacific Rise may contribute ancient DOC to ocean waters. In line with earlier research, the findings suggest the possibility that chemoautotrophic microbes at these vents may “eat” DIC from ancient sources, converting it into DOC that is released into the ocean.

Further research will be needed to confirm whether hydrothermal vents indeed contribute significant amounts of ancient DOC to seawater, affecting its isotopic composition. If so, models of global ocean circulation may need to be adjusted to account for that contribution. (Geophysical Research Letters, https://doi.org/10.1029/2021GL092904, 2021)

—Sarah Stanley, Science Writer

Half of the IPCC Scenarios to Limit Warming Don’t Work

EOS - Wed, 07/07/2021 - 11:47

Through the 2015 Paris Agreement, nearly 200 state parties collectively aspired to limit global warming to 1.5°C above preindustrial levels. Even compared to 2°C of warming, meeting this goal would significantly curtail the extent of heat waves and other extremes induced by rising temperatures. But by 2017, the world had already reached 1°C above preindustrial levels, and at the current pace of warming, it is projected to hit 1.5°C by 2040.

How can we meet such an ambitious target?

The United Nations formally turned to the Intergovernmental Panel on Climate Change (IPCC) to get input from scientists. In response, the IPCC presented in 2018 a special report that included about 50 scenarios that could limit warming to 1.5°C. These mitigation scenarios modify variables including population, consumption of goods and services (including food), economic growth, behavior, technology, policies, and institutions. A new study, published in Environmental Research Letters, finds a problem with these scenarios: Only half can be realistically achieved, and all require the world to take a wide array of very bold actions.

We Need All Options

The study assessed how reasonable the IPCC scenarios were based on the extent to which they include five actions: reducing fossil fuel use, reducing energy use, planting more trees, reducing greenhouse gas emissions besides carbon dioxide (CO2), and removing CO2 from the air to store deep underground.

Shifting away from fossil fuels is vital, but “we have to do something more than that in terms of structural changes, behavior, energy demand, and land demand.”Through their appraisal, the authors found that the only realistic scenarios to limit warming to 1.5°C are ones in which all five options are pursued at full throttle. “We do not have the luxury to discard a few and just rely on the others,” said Elmar Kriegler, Professor for Integrated Assessment of Climate Change at the University of Potsdam, Germany, and a coauthor of the study.

The work is “highlighting some points that people are missing” from the IPCC special report, said Natalie Mahowald, a professor of atmospheric science at Cornell University in Ithaca, N.Y. Mahowald was a lead author of the special report but was not involved in the new study. Shifting away from fossil fuels is vital, she said, but “we have to do something more than that in terms of structural changes, behavior, energy demand, and land demand. I feel like people didn’t really understand that” when the special report was published.

For example, the required reductions in global energy use “are profound,” Mahowald said. “We did not do them under COVID,” she pointed out, when global CO2 emissions dropped by only around 6%.

“Questionably Optimistic” Assumptions

Many modeled scenarios rely too much on bioenergy with carbon capture and storage.In the paper, the authors identify several scenarios that include “questionably optimistic” technology deployments and behavioral shifts. “The underlying assumptions which have been made in the report are not always realistic or feasible,” said Daniela Jacob, director of the Climate Service Center Germany, a coauthor of the study, and a coordinating lead author of the special report. Most prominently, she said, many modeled scenarios rely too much on bioenergy with carbon capture and storage, or BECCS. This strategy entails growing crops (or harvesting scraps) to burn for energy, trapping the resulting CO2, and strategically storing it deep underground. Because the plants pulled CO2 from the air as they grew, the net effect is long-term CO2 removal.

At the time of the special report, the scenarios—most of which were created using integrated assessment models—“had difficulty limiting energy demand and dropping CO2 emissions quickly enough,” said Heleen de Coninck, a professor at Eindhoven University of Technology, Netherlands, and a special report coordinating lead author who was not involved in the new study. “There was this option available: BECCS that was producing electricity and it was giving you negative emissions. The models were overusing it, some models more than others,” she said. Alternative technologies like direct air capture of CO2 will likely take some of the spotlight from BECCS in future IPCC reports as the science advances.

More to Consider

Although the paper has value in “narrowing down the options,” said de Coninck, it sidesteps some important considerations, which limits its scope. Notably, competition between strategies like reforestation and BECCS for land resources is left out, a factor that would likely further reduce the feasibility of the scenarios.

Still, the paper makes bold, yet warranted, conclusions that the IPCC, with its stance to be “policy-neutral, never policy-prescriptive,” cannot. Jacob agrees that the IPCC should not be policy prescriptive, but “it’s very clear that we have to act now,” she said. Outside the IPCC, “we cannot shy away from absolute statements on feasibility and on urgent needs.”

These absolute statements aim to motivate substantive debate about how to meet the Paris Agreement’s target. But that debate needs to happen quickly if the world hopes to limit warming to 1.5°C. After a couple more years of current emission rates, Kriegler said, “our attainability statement that it’s still possible is going to disappear.”

—Jordan Wilkerson (@JordanPWilks), Science Writer

Elliott Receives 2020 John Wahr Early Career Award

EOS - Wed, 07/07/2021 - 11:45
Citation

In his research, John Elliott focuses on using interferometric synthetic aperture radar (InSAR) and other geodetic methods for advancing knowledge of earthquake cycle deformation and tectonics. His papers range from studying tiny ground displacements associated with stress buildup between large earthquakes to mapping and modeling meter-scale ground ruptures in major seismic events. What sets John’s studies apart is his insightful integration of geodetic results with good knowledge of local geology and tectonics, resulting in papers that are more complete and go further than usual. Some of these studies are well known in the community for this reason, such as his papers on the Christchurch, Gorkha, and Van earthquakes. In addition to his outstanding contributions to the field, he also addresses societally important questions in his work related to hazard and risk.

John possesses excellent communication skills, and like many others, I always look forward to his inspiring presentations at conferences. They are packed with interesting information, and still he uniquely manages to clearly explain complex topics, hypothesis testing, alternative ideas, and in-depth implications of the results. He is clearly passionate about his work and keeps his audiences easily engaged with his humor, enthusiasm, and high-octane presentation style. His interest in geodesy, earthquakes, and tectonics also translates into private discussions and meetings, and surely into the classroom as well.

John has been very active in the geodetic community, both in England and beyond, working tirelessly in committees, at conferences, and as a lecturer at workshops. When I was associate editor for the Journal of Geophysical Research, John was my favorite reviewer. He finished his reviews early, was exceptionally thorough and detailed, yet fair, and provided excellent comments and constructive suggestions for improvement.

Given his many achievements and contributions to our community, I am glad that the AGU Geodesy section has recognized John with the 2020 John Wahr Early Career Award.

—Sigurjón Jónsson, King Abdullah University of Science and Technology, Thuwal, Saudi Arabia

 

Response

I am very grateful to have been selected for the John Wahr Early Career Award and for the kind words and the time Sjonni has taken in providing his citation. I also thank the Geodesy section award panel for their support and recognition in choosing my nomination. I appreciate that this is often a difficult decision, and I know there are many suitable candidates in the field for this award, so I am humbled to have been chosen this year.

Scientific inquiry is increasingly collaborative and team based. My research has been possible only through the skill and strength of the many collaborators I have worked with across the world. I therefore thank the many excellent partners I have had the pleasure to collaborate with and learn from over the past 15 years, as well as my research group more recently.

Being able to succeed in science requires the trinity of aptitude, hard work, and luck, with the greatest of these, I feel, being luck. And by luck, I often mean opportunity. I have been fortunate to be given great opportunities to pursue my lines of research. In particular, I must thank the mentors, supervisors, and advisers who enabled me to develop as a scientist and who provided those key opportunities. These favorable circumstances have enabled me to stay within science, despite almost taking a different fork in the road on at least three occasions. My greatest thanks go to Susanna Ebmeier for her support over more than a decade with advice, acting as a sounding board and providing ideas.

—John R. Elliott, University of Leeds, Leeds, U.K.

Mostafavi Receives 2020 Fred L. Scarf Award

EOS - Wed, 07/07/2021 - 11:44
Citation

Parisa Mostafavi completed an outstanding Ph.D. thesis titled “Shock waves and nonlinear plasma waves mediated by pickup ions and energetic particles” in the Department of Space Science at the University of Alabama in Huntsville. A question of long-standing interest is how energetic particles, whether solar energetic particles, pickup ions, or anomalous and galactic cosmic rays, mediate the structure of shock waves as they are energized via diffusive shock acceleration. The problem is important to space weather because some very fast, strong interplanetary shocks are completely dominated by the energy of the accelerated particles, often rendering the character of the shock quite different from classical magnetohydrodynamic shocks. Parisa examined the foundations of shock structure, developing a theoretical description that accounted for the energetic particles and their coupled feedbacks to the background thermal plasma and fields. Parisa’s model accurately described the structure of the TS-3 heliospheric termination shock crossing observed by Voyager 2, including the preferential heating of pickup ions. In another major contribution, Parisa pointed out, much to the surprise of many in the community, that the very local interstellar medium (VLISM) is collisional on scales of interest to the Voyager observations, unlike the collisionless interplanetary medium. Hence, collisional dissipative processes are important for weak VLISM shocks. The puzzling observations of unusually broad weak shocks transmitted from the solar wind into the VLISM observed by Voyager 1 were explained by Parisa as a natural consequence of collisional dissipation (primarily the collisional heat flux associated with proton–proton collisions). Parisa was able to describe the structure, scalings, and properties of the interstellar shocks observed by Voyager 1. This work is establishing a new paradigm for the physics of the VLISM. Parisa’s thesis resulted in nine refereed journal papers, four as first author, and seven refereed conference papers.

—Gary P. Zank, Department of Space Science and Center for Space Plasma and Aeronomic Research, University of Alabama in Huntsville

 

Response

I would like to thank the award committee and the AGU Space Physics and Aeronomy section for selecting me for this year’s Fred L. Scarf Award. I am deeply honored to receive it. I am grateful to many people in my life. Specifically, I would like to express my deepest gratitude to my Ph.D. adviser, Dr. Gary Zank, for his continuous support. He always generously found time in his busy schedule to help me whenever I needed it. He gave me excellent guidance on my research and taught me to work hard. During the last year of my Ph.D., I had the privilege of working with the space physics group at Princeton University. I am thankful to Dr. Dave McComas for this opportunity. He taught me how to work through research challenges and use my time wisely. I would like to extend my thanks to Dr. Len Burlaga for supporting my work and giving me the opportunity to collaborate with him. I want to thank Dr. Peter Hunana, Dr. Eric Zirnstein, and Dr. Laxman Adhikari for their valuable discussions. I also owe many thanks to my instructors and colleagues in the space physics department of the University of Alabama in Huntsville. Finally, I wish to thank my family for their unconditional love and support. Now I am a postdoc working at the Johns Hopkins University Applied Physics Lab and looking forward to many more new research experiences.

—Parisa Mostafavi, Johns Hopkins University Applied Physics Laboratory, Laurel, Md.

Improved Seismic Imaging Via Optimal Transport Theory

EOS - Wed, 07/07/2021 - 11:30

Wide-angle seismic refraction profiles are commonly acquired to image crustal and uppermost mantle structure. Seismic waveform data like those shown (black phases in the panels above) encode variations in subsurface velocity and density but are typically band limited. As a consequence, traditional inversion approaches are highly susceptible to cycle skipping, a manifestation of nonlinearity in the inverse problem.
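
Cycle skipping is easy to reproduce. The sketch below is our illustration rather than the authors’ method: it compares a band-limited Ricker wavelet against time-shifted copies of itself and prints the least-squares misfit, which is informative only while the trial shift stays within roughly half a dominant period.

import numpy as np

def ricker(t, f=5.0):
    """Ricker wavelet with peak frequency f (Hz)."""
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

t = np.linspace(-1.0, 1.0, 2001)
observed = ricker(t)

# Trial time shifts (s); the dominant period of a 5 Hz wavelet is 0.2 s.
# Past about half a period the misfit flattens instead of pointing back
# toward the true alignment at zero shift: the cycle-skipping trap.
for shift in (0.0, 0.05, 0.10, 0.20, 0.30):
    predicted = ricker(t - shift)
    misfit = np.sum((predicted - observed) ** 2)
    print(f"shift = {shift:.2f} s   L2 misfit = {misfit:7.2f}")

Optimal transport distances instead measure how much “mass” must move to morph one waveform into the other, so the misfit keeps growing with the shift and retains a usable gradient even from a poor starting model.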

Górszczyk et al. [2021] employ a new inverse formulation based on optimal transport theory to mitigate this nonlinearity and apply it to observations from the Nankai Trough in Japan. They demonstrate successful solution recovery (blue phases are predicted data from solution in lower panel) even when the initial model is far from the solution (red/blue phases are predicted data from initial model).

Citation: Górszczyk, A., Brossier, R., & Métivier, L. [2021]. Graph-space optimal transport concept for time-domain full-waveform inversion of ocean-bottom seismometer data: Nankai Trough velocity structure reconstructed from a 1D model. Journal of Geophysical Research: Solid Earth, 126, e2020JB021504. https://doi.org/10.1029/2020JB021504

—Michael Bostock, Editor, JGR: Solid Earth

Turner and Willis Receive 2020 James R. Holton Award

EOS - Tue, 07/06/2021 - 12:20
Citation for Alexander Turner

Alexander Turner receives the James R. Holton Award for his groundbreaking contributions to atmospheric sciences, including advances in atmospheric chemistry, climate, and the carbon cycle.

Atmospheric methane has proven a challenge to interpret, and definitive answers to why methane has gone through periods of growth, stabilization, and renewed growth in the past 30 years have proven elusive. Historically, the primary atmospheric loss mechanism, the hydroxyl radical (OH), has been treated as effectively constant in time (supported by inferred stability in OH from methyl chloroform observations), and thus changes in atmospheric growth have been attributed to changes in sources. Alex demonstrated that OH levels could have shifted in conjunction with changes in atmospheric methane while still being consistent with methyl chloroform observations. This changes perspectives on recent atmospheric variability, as it elegantly illustrates that subtle changes in OH could explain some changes in atmospheric methane and that variable OH must be considered as we move forward.
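
A back-of-the-envelope illustration of that degeneracy (our round numbers, not Turner’s analysis): in a one-box budget, the methane growth rate is the source minus a first-order loss whose rate constant scales with OH, so a perturbation of a few percent in OH is nearly indistinguishable from a perturbation of several teragrams per year in emissions.

# One-box CH4 budget: dB/dt = S - k * B, with k ~ 1/lifetime set largely by OH.
# Burden, source, and lifetime are round, assumed values for illustration.
B = 4900.0   # atmospheric CH4 burden, Tg (assumed)
S = 550.0    # total CH4 source, Tg/yr (assumed)
tau = 9.1    # CH4 lifetime, yr (assumed)
k = 1.0 / tau

def growth_rate(source, loss_rate, burden):
    """Net CH4 growth rate (Tg/yr) in a one-box budget."""
    return source - loss_rate * burden

print(f"baseline growth:      {growth_rate(S, k, B):5.1f} Tg/yr")
print(f"OH increased by 2%:   {growth_rate(S, 1.02 * k, B):5.1f} Tg/yr")
print(f"sources cut 10 Tg/yr: {growth_rate(S - 10.0, k, B):5.1f} Tg/yr")

Both perturbations flatten the growth rate by a comparable amount, which is why inferred stability in OH had been so central to attributing past methane trends to sources.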

While Alex has built on this work in expected directions (continuing analysis of recent changes in methane), he has also demonstrated powerful lateral thinking that has led to significant insights. He has considered how OH may vary on different timescales and in correlation with different climate/weather features. This culminated in a model study covering a 6,000-year period linking variability in OH to the El Niño–Southern Oscillation (ENSO). This work linked climate and chemistry in a manner not previously considered and has implications for how we consider and interpret contemporary methane.

Alex’s body of work is extensive, and beyond methane he has made contributions to mathematical methods for inverse modeling as well as developing new approaches combining multiple data streams to infer photosynthesis from space-based observations.

It is a pleasure to present the James R. Holton Award to Dr. Alexander Turner.

—Eric Kort, University of Michigan, Ann Arbor

 

Response

I am deeply honored to receive this award. Speaking frankly, I was shocked when I received an email about it from AGU. It would have been flattering just to know that I was nominated, let alone to receive this award. I never had the honor of meeting Jim Holton, but his impact on the field is obvious to all. I distinctly remember getting a copy of his textbook as a first-year graduate student; it felt like the first step toward becoming an atmospheric scientist. Receiving an award bearing his name is truly flattering.

This award is particularly special to me because among the many prominent past recipients, my former undergraduate research adviser, Arlene Fiore, was the second to receive it. She is someone whom I deeply admire and one of the people who inspired me to pursue a career in atmospheric science. Being included on a short list with so many luminaries in the field is simply humbling.

There is a long list of brilliant and passionate scientists who have influenced me along the way. A few who stand out are Daven Henze for introducing me to research—I would not be an atmospheric scientist had I not met him as an undergraduate; Daniel Jacob for his unwavering support as I stumbled and grew through my dissertation; Ron Cohen for the freedom to explore an eclectic set of topics and invaluable feedback; and Inez Fung for continually pushing me to ask interesting questions. As I prepare to start my career, I hope I can emulate a fraction of those great scientists I learned from and support young scientists as they pursue their studies in atmospheric science.

—Alexander Turner, University of Washington, Seattle

 

Citation for Megan D. Willis

Megan Willis receives the James R. Holton Award for her groundbreaking contributions to atmospheric science, in particular, the importance of aerosol composition in remote and polluted environments.

Megan’s doctoral research made extensive use of an aerosol mass spectrometer, and early in her degree she led experiments in which a new version of the instrument, which included the ability to measure soot-containing particles, was characterized. Using measurements collected from a roadside site, Megan was able to quantify the mixing state of black carbon (soot) from traffic, with important implications for air quality and climate. She also collaborated on a number of studies in which the instrument was used in a laboratory, to probe the impacts of oxidation on soot particles, and in the field, to characterize carbonaceous particles in an industrially influenced boreal region.

Megan’s most significant contributions were made during her participation in NETCARE (Network on Climate and Aerosols: Addressing Key Uncertainties in Remote Canadian Environments). Megan published two first-author articles demonstrating the importance of biogenic compounds for the growth of small particles in the Arctic atmosphere during the summer. Those studies reported the first observations of the growth of newly formed particles in the Arctic marine environment and identified secondary chemistry as a likely contributor. Her work provides new information about how local ocean–atmosphere feedbacks via atmospheric chemistry and aerosol–cloud interactions may change as the Arctic climate system warms. A subsequent paper focused on disentangling the many factors contributing to aerosol loading during springtime in the Arctic—when the impact of lower latitudes on the region is more significant. Given her extensive work in this area, Megan is now one of the community’s experts on the processes affecting Arctic aerosol composition, and she published a review article on the subject in 2018.

—Jennifer Murphy, University of Toronto, Toronto, Ont., Canada

 

Response

I’d like to begin by expressing my gratitude to the AGU Atmospheric Sciences section for this award. It is both humbling and inspiring to receive an award bearing Jim Holton’s name, and I aspire to live up to his legacy as a scientist, mentor, teacher, and community member. I’m also humbled to receive this award alongside Alex Turner, whose work I admire.

I am fortunate to be supported by an incredible community of scientists and mentors. I’ve had the opportunity to both learn from and work with many kind, smart, and passionate people around the world in my short career. Our field is becoming increasingly interdisciplinary, and so I can’t imagine working without the collaboration and generosity inherent to this community.

While many people have supported and guided me along the way, I’d like to express my gratitude to a few people explicitly: Erik Krogh and Chris Gill for igniting my enthusiasm for environmental chemistry; my Ph.D. adviser, Jon Abbatt, for opening my awareness to a wide world of questions in atmospheric chemistry, for his tireless support, and for always gently pushing me farther than I thought I could go; and Kevin Wilson for giving me the opportunity to think about atmospheric chemistry from a different perspective and for putting up with me, a fieldwork person, while I tried to learn laboratory physical chemistry.

Finally, I am deeply grateful to Kevin Worthington and the rest of my family for their unconditional support. I would be nowhere without them.

—Megan Willis, Colorado State University, Fort Collins

Klotzsche Receives 2020 Near-Surface Geophysics Early Career Achievement Award

EOS - Tue, 07/06/2021 - 12:19
Citation

It is a great pleasure to cite Anja Klotzsche as the inaugural winner of the AGU Near-Surface Geophysics Early Career Achievement Award. Dr. Klotzsche’s contributions are remarkable because they combine theoretical methods development with meticulous and creative applications to a range of geological, hydrogeological, and biogeological problems. She brought cross-borehole ground-penetrating radar (GPR) data analysis from ray tracing into full-waveform inversion. Her work overcame both theoretical challenges and significant practical hurdles in dealing with real borehole data. Full-waveform inversion offers significantly higher resolution, facilitating decimeter-scale imaging of the subsurface that opens the door to a range of problems waiting to be solved. The value of full-waveform inversion was quickly recognized internationally. Through a series of collaborations, Dr. Klotzsche has demonstrated the impact of the method on questions related to flow in porous media, peatland processes, agricultural monitoring, Mars analogue soils, and more, through both borehole and surface GPR. Remarkably for an early-career investigator, Dr. Klotzsche has cosupervised the work of 11 Ph.D. students and nine M.S. students. Many of her recent papers share student coauthorship. On top of her exceptional collaborations and mentoring, she has been a steady and active contributor to the near-surface geophysics community, within both AGU and the Society of Exploration Geophysicists. Her impact is a testament to her remarkable ability to solve both theoretical and practical problems and to collaborate productively with investigators from around the globe.

—Sarah Kruse, University of South Florida, Tampa

 

Response

Thank you, Sarah, for the very kind citation and nomination for the Near-Surface Geophysics Early Career Achievement Award. I am truly honored to receive this award and deeply grateful to Sarah, AGU, and the near-surface geophysics community. Throughout my scientific career, I have had the great chance to be inspired by and to work with great scientists, mentors, collaborators, and friends who guided me and shaped my working life. I would not have received this award without many of them, and I am sorry that I can name here only a few.

Already during my master’s studies, I was blessed with a great supervisor and mentor, Jan van der Kruk. While working with him on my master’s thesis, I got introduced into the concepts of hydrogeophysics, GPR, and the full-waveform inversion. I was so fascinated by these topics that I never left sight of them, and they are now the fundaments of my career. During my Ph.D. and postdoc time at the Forschungszentrum Jülich, Jan, and also Harry Vereecken, always encouraged me to think big, link disparate fields, and understand processes, but also to maintain a work–life balance. At the Agrosphere Institute, I always found great colleagues and an inspiring environment to broaden my understanding of different fields. Furthermore, I had the chance to visit other labs and universities as a visiting scientist. These visits allowed me to extend and strengthen my research and cooperation and to broaden the field of applications for the GPR full-waveform inversion. It was especially Peter Annan, Andrew Binley, John Bradford, Tony Endres, Antonios Giannopoulos, Susan Hubbard, Rosemary Knight, Sarah Kruse, Niklas Linde, Majken Looms, Lars Nielsen, and Craig Warren who tremendously inspired me in my career. To everyone I have been fortunate enough to work with, to interact with, to exchange ideas with, and to the entire near-surface community: Thank you!

—Anja Klotzsche, Agrosphere IBG-3, Forschungszentrum Jülich, Jülich, Germany
