Eos: Earth & Space Science News

Integrating Landscape Terrestrial and Aquatic Carbon Fluxes

Fri, 10/11/2019 - 11:53

Developing effective strategies to mitigate global warming creates an urgent need to understand and accurately quantify carbon exchange among the atmosphere, continents, and oceans. Last May, 15 scientists from different disciplines gathered in Montreal to discuss how best to integrate carbon fluxes in aquatic and terrestrial ecosystems into regional and global carbon budgets.

Fig. 1. The workshop gathered a complementary mix of 15 terrestrial and freshwater ecologists, biogeochemists, and physical scientists with expertise in forest, grassland, wetland, river, and lake ecosystems, including scientists who work at the interface between terrestrial and aquatic systems (e.g., hydrological connectivity between soils and streams). The mix of terrestrial scientists featured a methodological balance between scientists who use top-down (e.g., eddy covariance flux towers) and bottom-up (e.g., mechanistic models) approaches.

Land-atmosphere carbon exchange is among the most uncertain components of the global carbon cycle. The reason for this is that continental landscapes are made up of a heterogeneous mosaic of elements like forests, wetlands, inland waters, and other environments that each have their own ecosystem properties and processes. This complexity has led to conceptual compartmentalization in landscape carbon budgets, where each landscape element is treated independently of the others.

In nature, however, terrestrial and aquatic elements of the landscape are interconnected and dependent on each other. Thus, the current compartmentalized perspective misses integral aspects of the structure and functioning of the landscape and therefore poses a risk of biased estimates of land-atmosphere carbon exchange when such estimates are based on terrestrial biosphere models or carbon inventory changes.

Acknowledging Ecosystems’ Interconnectivity

Workshop discussions revolved around a central question: Do existing carbon exchange frameworks effectively capture the heterogeneous nature of the landscape and the hydrological connectivity between landscape elements? The participants reached the conclusion that current frameworks do well at representing the different landscape elements that contribute to carbon exchange, yet the frameworks mostly neglect the elements’ interdependence.

In particular, whereas regional assessments of land-atmosphere carbon exchange are starting to represent the role of inland waters as net emitters of carbon to the atmosphere, inland waters’ role as ecosystem connectors that reallocate carbon across the landscape is widely overlooked.

For instance, terrestrial models track carbon loss in forests and wetlands but may not distinguish between direct losses to the atmosphere and losses to inland waters. Thus, some of the carbon emissions that models assume to come from terrestrial ecosystems are actually emitted from inland waters. These factors imply that if modeled terrestrial carbon emissions are simply added to measured emissions from inland waters, we run the risk of accounting twice for the same carbon emissions in the landscape.
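
To make the double-counting risk concrete, here is a minimal, illustrative calculation; the flux values are invented for the example and are not from the workshop or any published budget.

```python
# Hypothetical landscape carbon budget, in Pg C per year (values invented for illustration).
modeled_terrestrial_emission = 60.0    # carbon a terrestrial model treats as respired to the atmosphere
measured_inland_water_emission = 2.0   # CO2 evasion measured from rivers and lakes
lateral_export_to_waters = 1.5         # terrestrial carbon actually exported to inland waters before emission

# Naive total counts the exported carbon twice: once in the model, once in the aquatic measurement.
naive_total = modeled_terrestrial_emission + measured_inland_water_emission

# A connectivity-aware total removes the exported fraction from the terrestrial term first.
corrected_total = (modeled_terrestrial_emission - lateral_export_to_waters) + measured_inland_water_emission

print(f"naive: {naive_total:.1f} Pg C/yr, corrected: {corrected_total:.1f} Pg C/yr")
# The difference (1.5 Pg C/yr here) is the laterally exported carbon counted twice.
```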

Furthermore, ignoring the interconnectivity and dependence of different landscape elements limits our ability to understand and predict potential future feedbacks, such as the impact of land use change or wildfires on the terrestrial carbon that is lost to inland waters.

Thinking Holistically

Workshop attendees agreed that future research should move away from compartmentalized approaches that treat landscape elements independently and work toward a more holistic framework that effectively integrates not only all landscape elements but also how they interact with each other. This requires increased synthesis between different terrestrial and aquatic research disciplines (Figure 1). Attendees also recommended using a hydrologically defined landscape (e.g., watershed) to facilitate the representation of the lateral movement of carbon from terrestrial through aquatic ecosystems.

The workshop offered a unique opportunity for aquatic and terrestrial scientists to work on a common terminology, identify data needs, and define research challenges and opportunities across disciplines.

The workshop was partly financed by the Natural Sciences and Engineering Research Council of Canada and Hydro-Québec’s Carbon Biogeochemistry of Boreal Aquatic Systems Industrial Research Chair, with further support from several units of the Université du Québec à Montréal. This is a collaboration of the International Boreal Forest Research Association and the International Federation of Boreal Aquatic Research.

Author Information

Pascal Bodmer, Joan P. Casas-Ruiz (jpcasasruiz@gmail.com), and Paul A. del Giorgio, Groupe de Recherche Interuniversitaire en Limnologie, Département des Sciences Biologiques, Université du Québec à Montréal, Canada

Deforestation Could Exacerbate Drought in the Amazon

Thu, 10/10/2019 - 11:56

In August, the skies over São Paulo turned black with the smoke of tens of thousands of fires burning through the Brazilian Amazon thousands of kilometers away. Experts quickly linked the fires to deforestation practices in which rain forest is razed and set ablaze to clear land for crops or livestock. Indeed, a September study showed significant overlap between the 125,000 hectares of forest that were cleared in early 2019 and where fire “hot spots” appeared in the summer.

Much of the international outcry focused on the impact the fires would have on the Amazon’s role as a carbon sink. As the largest rain forest in the world, the Amazon accounts for a quarter of all greenhouse gases absorbed by the world’s forests every year.

But the Amazon also has a critical role in Earth’s water cycle, releasing water vapor into the atmosphere that can travel hundreds or thousands of kilometers before falling to the ground.

A new study finds that converted land is much less efficient at supplying this atmospheric river than intact rain forest. This reduced efficiency is most evident during droughts, which are expected to become longer and more frequent as climate change progresses.

The study, published in Ecohydrology, considered what impact land use changes might have on the local energy balance—the processes by which trees and other vegetation either reflect energy from the Sun or turn it into heat or water vapor. Land use changes might include the conversion of rain forest to agricultural land or pasture.
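
The “energy balance” here can be illustrated with a toy partition of net radiation into latent heat (the energy that drives evapotranspiration) and sensible heat; every number and evaporative fraction below is hypothetical and is not a value from the study.

```python
# Illustrative surface energy balance partitioning (all numbers hypothetical, W m^-2).
def available_energy(sw_in, albedo, lw_net):
    """Net radiation: absorbed shortwave plus net longwave."""
    return sw_in * (1.0 - albedo) + lw_net

def partition(net_radiation, evaporative_fraction, ground_flux):
    """Split net radiation into latent heat (water vapor) and sensible heat (warming the air)."""
    turbulent = net_radiation - ground_flux
    latent = evaporative_fraction * turbulent
    sensible = turbulent - latent
    return latent, sensible

# Hypothetical contrast: intact forest keeps a high evaporative fraction,
# while pasture or cropland converts more of the same radiation into sensible heat.
rn_forest = available_energy(sw_in=800.0, albedo=0.12, lw_net=-80.0)
rn_pasture = available_energy(sw_in=800.0, albedo=0.20, lw_net=-80.0)

le_f, h_f = partition(rn_forest, evaporative_fraction=0.75, ground_flux=20.0)
le_p, h_p = partition(rn_pasture, evaporative_fraction=0.35, ground_flux=40.0)

print(f"forest : LE={le_f:.0f}, H={h_f:.0f} W/m2")
print(f"pasture: LE={le_p:.0f}, H={h_p:.0f} W/m2")
```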

“It’s a really nice contribution that lets us understand how ecology matters for these energy balance processes and the sensitivity of the forest to climate change,” said Scott Stark, an assistant professor at Michigan State University who was not involved in the study.

A Closer Look

The researchers used satellite observations of both intact and disturbed rain forest areas in Rondônia, a state in western Brazil. Rondônia falls within the “arc of deforestation,” where human settlements and activities are putting increased pressure on natural ecosystems.

The fish bone pattern of small clearings along new roads is the beginning of one of the common deforestation trajectories in the Amazon. Credit: NASA

Though researchers have used satellite imagery to study the Amazon in the past, most of the images produced have a very coarse resolution—on the scale of 500 meters up to 1 kilometer. What sets this study apart is the high spatial resolution, according to Stark.

Here the researchers used data collected by a satellite-based sensor, the Advanced Spaceborne Thermal Emission and Reflection Radiometer, which allowed the team to zoom in on the landscape down to 15 meters.

Even with the increased resolution, getting clear images of the forest in the region is a challenge. “It’s very hard to get data from this sensor in the Amazon rain forest because of cloud cover,” said Gabriel de Oliveira, a postdoctoral researcher at the University of Kansas and lead author on the new study. “We were lucky to find images.”

The team compared the satellite data to measurements from a flux tower on the ground. A flux tower is a structure built up through the forest canopy that allows researchers to track meteorological conditions and the exchange of both water and carbon dioxide between the forest and the atmosphere.

The team found that cropland and pastures tended to have higher soil and air temperatures, which could exacerbate drought conditions, according to de Oliveira. The results also showed that forested areas had roughly 3 times higher rates of evapotranspiration—the combined process by which water evaporates from soil and is transpired by plant leaves.

These findings could have serious implications for precipitation around the world. “All the water that the forest is pumping back into the atmosphere will go to the equator first, the tropics, and then will be released all over the world,” de Oliveira said. “So if you have this decrease in evaporation, or in the water that comes back to the atmosphere, you have a problem in the flux of water in the tropics, the equator, and maybe reaching the polar region.”

How Deforestation Exacerbates Drought

The team also looked at how drought might affect these energy balance processes in both intact forest and converted land, comparing a relatively wet year to “one of the worst droughts that’s ever been observed in the Amazon,” according to Stark. The results showed that evapotranspiration rates in primary forests were higher during the drought than the wet year, suggesting that old-growth forests in this region may be quite resistant to drought.

So even during dry spells, the rain forest continues pumping water into the atmosphere. But the same cannot be said of croplands and pastures. Deforested land converted more of the Sun’s energy to heat, effectively aggravating the drought conditions.

That’s in line with previous research linking deforestation and drought in the region. “There have been these megadroughts in the agricultural regions in southern Brazil and northern Argentina, and people have already linked that with some reasonable certainty to deforestation,” Stark said, “because you just lose this ability to have that conveyor belt of water that can take rainfall and redistribute it.”

Now researchers have a better sense of the connection between deforestation and drought on an ecological scale. That’s a critical insight for researchers wondering what’s in store for the Amazon as global warming progresses and droughts in the region likely become longer, more intense, and more frequent.

The main takeaway for de Oliveira is that the study could help improve regional and global models of water fluxes in the Amazon. “Some models consider all of the Amazon as forest, but it’s not,” he said. “The biome nowadays is very degraded. You have a lot of pasture areas, agriculture areas, and this has to be taken into consideration in these models; otherwise, our representations are wrong.”

Ultimately, the effects of water flux in the Amazon will ripple beyond the rain forest and around the world. It’s a “major concern,” Stark said, “in terms of the ability for the basin to continue its key function as a hydrological pump that cools the global atmosphere.”

—Kate Wheeling (@katewheeling), Freelance Writer

Celebrating Scientists and Other News of the Week

Thu, 10/10/2019 - 11:53

51 Peg b Wins the Nobel.

Congratulations to Michel Mayor & Didier Queloz being jointly awarded the 2019 #NobelPrize in Physics!

51 Pegasi b for the win! https://t.co/0M0xY46IXE

— AGU’s Eos (@AGU_Eos) October 8, 2019

At long last! This year’s Nobel Prize in Physics is partly awarded to the two researchers who discovered the first exoplanet around a Sun-like star, 51 Pegasi b. How exciting to see the field that I studied in grad school win a Nobel Prize! This informative thread (below) goes into the background of the “first” exoplanet and gives a bit more context.

A hearty congratulations to Michel Mayor & Didier Queloz, for kickstarting the field that I’ve built my career in! Their discovery of 51 Peg b happened in my senior year of high school, and I started working in exoplanets in 2000, when ~20 were known.

A thread:

— Jason Wright (@Astro_Wright) October 8, 2019

—Kimberly Cartier, Staff Writer

 

One of the best social media genres is sharing the joy of a scientist having their life’s work validated. (My favorite remains Andrei Linde being surprised at his door by a colleague with champagne to tell him they’d found evidence for his theory of cosmic inflation.) This year, let’s enjoy Michel Mayor taking a break from a speaking tour to find messages flooding in about his Nobel Prize for his exoplanet discovery with Didier Queloz.

New laureate Michel Mayor was on a lecture tour in Spain when he heard the news about his #NobelPrize in Physics.

Here Mayor is in the cafeteria of San Sebastian airport, looking at all the messages flooding in! pic.twitter.com/NCYcgZYUXx

— The Nobel Prize (@NobelPrize) October 8, 2019

—Heather Goss, Editor in Chief

 

Scientists Pour onto the Arctic Ice (So Does Their Whiskey). I’ve been happily following along with the MOSAiC Expedition in the Arctic, and I like this reporter’s on-the-scene description of sea ice cocktails and scientists trying to beat the bad weather. Eagerly awaiting future dispatches! —Jenessa Duncombe, Staff Writer

 

Radical warming in Siberia leaves millions on unstable ground. Siberia has warmed substantially more than the global average through the 20th and 21st centuries. This stunning multimedia piece mixes science and storytelling to illustrate the sobering changes—to the landscape, economy, and communities—under way in parts of Siberia as a result of climate change. —Timothy Oleson, Science Editor

 

The Biggest Lie Tech People Tell Themselves—and the Rest of Us. There is no such thing as the natural evolution of technology. —Caryl-Sue, Managing Editor

 

Octopus Dreaming.

This octopus changed colors while she was asleep, enchanting us all with the question, “What could she be dreaming about?!” —Jenessa Duncombe, Staff Writer

Exposing Los Angeles’s Shaky Geologic Underbelly

Thu, 10/10/2019 - 11:51

Los Angeles, Calif., is one of the 10 largest cities in the world that have been historically shaken by damaging earthquakes [Bilham, 2009]. The 1994 magnitude 6.7 Northridge earthquake, for example, sparked fires and collapsed roadways and buildings across the region. And although it caused no significant damage in Los Angeles, shaking from the Ridgecrest earthquake sequence that struck to the city’s north this past July served as a recent reminder of the city’s seismic vulnerability. Little doubt remains whether a future large earthquake will strike this region: The question is only when. Los Angeles therefore holds a special place in our existing understanding of—as well as in efforts to further illuminate—how best to mitigate natural hazards and their impacts on large populations.

The greater Los Angeles area—a megacity by the United Nations’ definition—is the second largest urban area in the United States, one of its fastest growing regions, and the third largest city in the world based on combined statistical area. Here the seismic hazard is driven by the potential proximity of large earthquakes and complicated local structure. Sources of potentially damaging earthquakes in the LA area include the southern San Andreas Fault, located roughly 60 kilometers northeast of the city, as well as the series of faults that lies below the area and just offshore. Meanwhile, the collection of complex sedimentary basins underlying the area is known to amplify the motions from seismic waves [e.g., Graves et al., 2011; Lovely et al., 2006].

Through the ShakeOut scenario, CyberShake, and other similar efforts, scientists are working to improve estimates of the ground shaking that would result from a large earthquake in this region. A plausible event detailed in the original ShakeOut scenario [Jones et al., 2008] is a magnitude 7–8 earthquake on the southern San Andreas Fault that causes large ground motions in downtown Los Angeles. One estimate of the ground motions in such a scenario, based on studying ambient noise correlations (correlations in background seismic signals) between seismic stations located on the San Andreas Fault and in downtown Los Angeles, suggests that these motions could be approximately 4 times larger than those predicted by current numerical simulations [Denolle et al., 2014]. This indicates that our assessments of risk could underestimate the potential damage due to this type of earthquake.

The discrepancy between the different methods appears to stem from the fact that the northern basins in the Los Angeles area are not well characterized by the current 3-D seismic velocity models used in the computer simulations. Instead of allowing seismic energy to disperse into the surrounding region, the low seismic velocities and concave shapes of these basins—the San Gabriel and the San Bernardino—tend to trap energy and channel it toward the downtown Los Angeles area, which leads to larger ground motions [e.g., Olsen et al., 2006].

Borehole and seismic reflection data in these basins are sparse, however, in part because oil companies—which have historically collected much of this sort of data—have not explored these basins as extensively as they have the Los Angeles Basin itself. This data shortage makes it difficult to determine precisely the shapes and seismic velocities of the basins, which hampers accurate earthquake hazard assessments. Additional data and improvements in the 3-D seismic velocity model used to simulate ground motions are thus of fundamental importance.

Volunteers Deploy Seismic Sensors

Starting in 2017, we set out to better determine the shapes and seismic velocities of the northern basins. We deployed dense 2-D seismic arrays across the San Gabriel and San Bernardino Basins (Figure 1), along with 20 additional seismic broadband instruments.

Fig. 1. The areas covered by the Basin Amplification Seismic Investigation (BASIN) surveys in the greater Los Angeles region are shown in this map. Sensor array lines are labeled SB for San Bernardino and SG for San Gabriel. Blue solid lines are completed 2017–2019 BASIN surveys, and the blue dotted line is scheduled for completion in late 2019. Open black circles are permanent seismic stations in the Southern California Seismic Network. Red dots are additional broadband stations temporarily deployed in 2018. The black outline marks the San Gabriel and San Bernardino basins. Pink lines are the major faults in the area. The small red polygon shows the location of the tall buildings shown in the photo of downtown Los Angeles above.

The data from these Basin Amplification Seismic Investigation (BASIN) surveys will be used to construct a 3-D model that should better predict the strong ground motions in downtown Los Angeles from events on the San Andreas Fault. These surveys are possible only because of a new type of seismic instrumentation—a compact, autonomous unit containing a standard 5-hertz three-component geophone, a battery, and a GPS clock—that was first used by oil companies in Los Angeles in 2011 [Lin et al., 2013].

The BASIN surveys described here represent a new type of deployment that might be called “urban seismology.” The strategy we pursued involved installing instruments in linear arrays, with two-person teams deploying 16–20 stations each along portions of instrument lines. The teams were given maps marked with assigned points and were instructed to place a single instrument within a half-block radius of each point. They looked to site instruments in viable private residences or businesses or, if none were available, in median strips along roads or open fields. Whenever possible, sensors were completely buried in a 20-centimeter hole to minimize noise and to keep the instruments hidden.

The deployments involve considerable interaction with the public in seeking permission to place the sensors on private property. We generally have a high success rate if residents are at home and answer the door, and for this reason we usually deploy on weekends. Our deployment teams have included approximately 60 volunteers who span a diverse range of ethnicities, genders, and careers and range in age from high school students to retirees.

There is a certain level of risk that these instruments will be lost or damaged—to date we have lost seven sensors, presumably because of theft. This is about 1% of the instruments we’ve deployed, which we consider an acceptable rate and inevitable with this type of survey. We also suspect that a couple of instruments were disturbed by coyotes.

We have completed 9 sensor lines of a planned total of 10 lines. These 9 lines comprise 482 sensor sites, which have generated some 4 terabytes of data at 250 samples per second. The average station spacing in the in-line direction is approximately 250 meters, and the stations remained operational for ~35 days on the basis of battery life.
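
Those numbers are roughly self-consistent. A quick check, assuming 4 bytes per sample (an assumption; the three components follow from the geophone described above):

```python
# Rough check of the quoted ~4 TB: 482 sites, 250 samples/s, ~35 days (all stated),
# 3 components per geophone (stated) and 4-byte samples (assumed).
sites = 482
components = 3
sample_rate = 250       # samples per second
days = 35
bytes_per_sample = 4    # assumed 32-bit samples

samples = sites * components * sample_rate * days * 86_400
print(f"{samples * bytes_per_sample / 1e12:.1f} TB")   # ~4.4 TB, consistent with "some 4 terabytes"
```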

Structure from Data

We plan to use receiver functions—a technique to enhance seismic waves reflected off interfaces between layers in the subsurface—determined from moderate and large teleseismic earthquakes as far away as Fiji to determine the crustal structure beneath the sensor lines. Initially, we thought that the noisy environment of the basins would preclude effective recording of distant events, but that is not the case (Figure 2). The sensors along lines SG1, SG2, and SB4 clearly show the structure, including the Moho (the boundary between Earth’s crust and mantle) and interfaces above this boundary [Liu et al., 2018]. The sediment-basement interface (the bottom of the basins) is also well defined.
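
For readers unfamiliar with the technique, below is a textbook-style sketch of a frequency-domain receiver function (water-level deconvolution with a Gaussian filter) applied to synthetic spikes. The parameters and data are illustrative only; this is not the processing used by Liu et al. [2018].

```python
import numpy as np

def receiver_function(radial, vertical, dt, water_level=0.01, gauss_width=2.5):
    """Frequency-domain receiver function: deconvolve the vertical component from
    the radial component, stabilized by a water level and smoothed by a Gaussian."""
    n = len(radial)
    R = np.fft.rfft(radial, n)
    Z = np.fft.rfft(vertical, n)
    freqs = np.fft.rfftfreq(n, dt)
    power = (Z * np.conj(Z)).real
    denom = np.maximum(power, water_level * power.max())                 # water-level stabilization
    gauss = np.exp(-(2 * np.pi * freqs) ** 2 / (4 * gauss_width ** 2))   # low-pass Gaussian
    return np.fft.irfft(R * np.conj(Z) / denom * gauss, n)

# Synthetic example: a direct P arrival plus one converted phase 3 s later.
dt = 0.01
vertical = np.zeros(2048); vertical[100] = 1.0
radial = np.zeros(2048); radial[100] = 0.2; radial[400] = 0.1
rf = receiver_function(radial, vertical, dt)
direct = np.argmax(rf) * dt                      # ~0 s: direct P
converted = (np.argmax(rf[200:]) + 200) * dt     # ~3 s: converted phase from an interface
print(f"direct P at {direct:.2f} s, converted phase at {converted:.2f} s")
```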

Fig. 2. Receiver functions, computed from three-component seismograms, show the relative response of Earth structure near a seismic sensor. The vertical component recordings at left were measured by sensors along the SG1 line of the BASIN surveys during a 2017 earthquake in Bolivia. The right panel shows receiver functions (RF) computed from this earthquake as recorded along SG1, adapted from Liu et al. [2018]. They reveal the 2-D structure beneath SG1, including the sediment-basement interface and the Moho, as well as a possible fault.

Determining a basin’s effectiveness in channeling seismic energy is contingent upon determining the basin’s shape and its shear wave velocity. To measure shear wave velocity, we plan to use the analysis of surface waves determined from ambient noise correlations, which has been shown to be effective in the Los Angeles region [Lin et al., 2013].

Figure 3 shows an example where Rayleigh and Love surface waves can be seen in the correlations. These types of waves have the largest amplitudes and thus produce the strongest ground motion. Also, Rayleigh and Love waves are most easily seen in correlations and thus are very useful for determining subsurface structure. We will do our initial analysis in a 2-D sense along the sensor lines and then extend the analysis to 3-D by including correlations between our instrument arrays and the instruments of the Southern California Seismic Network (SCSN), which has approximately 20 permanent stations within and surrounding the basins. We have determined that the correlations can be done over distances up to 40 kilometers and for a frequency range (passband) from 1- to 10-second periods. During the deployment of our SB2, SB3, and SB6 lines, we also installed temporary broadband stations (shown in Figure 1) to supplement the SCSN stations.
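
The core of that analysis is a cross-correlation of long noise records between station pairs. The sketch below shows only that step, on synthetic noise with a known delay; real ambient-noise processing also whitens spectra, normalizes amplitudes, and stacks many daily windows, and this is not the BASIN team’s code.

```python
import numpy as np

def noise_correlation(trace_a, trace_b, dt, max_lag_s):
    """Cross-correlate two continuous noise records via FFT; the stacked result
    approximates the surface wave traveling between the two stations."""
    n = len(trace_a)
    fa = np.fft.rfft(trace_a, 2 * n)
    fb = np.fft.rfft(trace_b, 2 * n)
    cc = np.fft.irfft(fa * np.conj(fb), 2 * n)
    cc = np.concatenate([cc[-n:], cc[:n]])          # reorder to lags from -n to n-1
    lags = np.arange(-n, n) * dt
    keep = np.abs(lags) <= max_lag_s
    return lags[keep], cc[keep]

# Synthetic example: the same noise arrives at station B 4 seconds later.
rng = np.random.default_rng(0)
dt = 0.02                                           # 50 samples per second
noise = rng.standard_normal(60_000)
lags, cc = noise_correlation(noise, np.roll(noise, 200), dt, max_lag_s=10)
print(f"correlation peak at {abs(lags[np.argmax(cc)]):.2f} s lag")   # ~4 s station-to-station delay
```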

Fig. 3. These examples of ambient noise correlations show surface waves amplified by the structure of the San Gabriel Basin and detected along the SG1 sensor line. Distance on the vertical axis denotes distance along the line, and the horizontal axis represents the travel time of the wave to the sensors. The tangential-tangential correlations (TT) show Love waves traveling in both directions with respect to the sensor line. The radial (RR) and vertical (ZZ) panels show the fundamental Rayleigh wave and its first overtone (a wave with twice the frequency of the fundamental wave).

We also plan to try a variety of other techniques to better determine the near-surface structure of the basins as well as the deeper structure beneath them, with the primary goal of producing better ground motion predictions for Los Angeles. These techniques include body wave tomography and full-waveform inversion using data from both earthquakes and correlations, horizontal-to-vertical spectral ratios, and autocorrelation imaging. We plan to incorporate the models we determine into the Southern California Earthquake Center’s community velocity models. The data will be available shortly after the final sensor line is completed.

Planning for Resilient Cities

Large earthquakes occurring close to vulnerable metropolitan areas present considerable risk, as demonstrated by the 2010 magnitude 7.0 Haiti earthquake, which devastated the Port-au-Prince metropolitan area [DesRoches et al., 2011]. In California alone, earthquakes cause average annual losses of about $3.7 billion, according to a 2017 report produced by the Federal Emergency Management Agency, the U.S. Geological Survey, and the Pacific Disaster Center. Losses from the next large earthquake in the state, if it affects a major urban area, are predicted to be even larger [Branum et al., 2016].

Urban, high–seismic hazard settings such as Seattle, Vancouver, Dhaka, and Mexico City [Pagani et al., 2018] that are also, like Los Angeles, underlain by sedimentary basins face additional hidden threats from the loose ground beneath them. In these cases, BASIN-type surveys are essential for obtaining realistic ground motions and accurately assessing seismic hazards and risks.

Urban growth will continue to be the largest contributor to global population increase for the foreseeable future. Thus, efforts to improve the resilience of cities, so that they are capable of withstanding large earthquakes, are an increasingly essential component of disaster mitigation and urban planning [Godschalk, 2003].

Acknowledgments

We are grateful to the deployment and pickup crews from the California State Polytechnic University, Pomona; local California high schools; the Jet Propulsion Laboratory; the California Institute of Technology; and Louisiana State University who helped with the fieldwork; the Los Angeles area homeowners for their willingness to host the nodes; Portable Array Seismic Studies of the Continental Lithosphere (PASSCAL), Louisiana State University, University of Utah, and University of Oklahoma for providing nodes; and the PASSCAL engineers for their help coordinating their nodes. P.P. thanks the Department of Geology and Geophysics at Louisiana State University for supporting this project. This research was partially supported by U.S. Geological Survey awards GS17AP00002 and G19AP00015 and Southern California Earthquake Center awards 18029 and 19033.

Equity Concerns Raised in Federal Flood Property Buyouts

Wed, 10/09/2019 - 18:02

The first national-level assessment of property buyouts in flood-prone areas, published today in Science Advances, reveals that buyouts funded by the Federal Emergency Management Agency (FEMA) have taken place mostly in high-income, densely populated areas. Projections suggest that it is poor and rural areas, however, that would benefit most from a managed retreat from rising waters.

The study highlights a number of program management issues, including those concerning equity, that need to be addressed before scaling up property buyouts as part of a national strategy for managed retreat, the study’s authors said.

Where Buyouts Happen

“Here in the U.S. and virtually anywhere you look globally…development has placed people and assets in hazardous places,” lead researcher Katharine Mach told reporters at an 8 October press conference. A changing climate has led to more dangerous storms, more property damage, and more loss of life. “When it comes to weather and climate events we are now experiencing, we are unambiguously behind the eight ball.” One way for society to adapt to climate change is through managed retreats from flood-prone areas.

“Retreat is moving people and assets out of hazardous places and taking that land and restoring it to open space,” said Mach, who researches climate change risk management at the University of Miami in Florida.

A floodplain will absorb more water than a parcel of developed land and can help manage flood risk. FEMA has administered the majority of U.S. funds for managed retreat for the past 30 years through its property buyout program.

The team gathered publicly available data on FEMA buyouts, flood maps, declarations of disaster, and the U.S. Census Bureau’s American Community Survey. FEMA funded more than 43,000 buyouts of flood-prone properties between 1989 and 2017. Over the 30 years of the FEMA program, city- and county-level governments administered 94% of the buyouts across 49 states, Puerto Rico, Guam, and the U.S. Virgin Islands. (No buyouts took place in Hawaii.)

“We found out buyouts overwhelmingly take place in counties with higher flood risks,” said coauthor Carolien Kraan, a graduate student studying climate change adaptation at the University of Miami. “At the same time, however, not all counties that have high flood risk do buyouts.”

The three states with the highest property damage costs—Florida, Louisiana, and Mississippi—had middling buyout numbers. This disparity might be caused by local choices about flood mitigation strategies and the prioritization of flood insurance subsidies over buyouts, the team speculated.

These maps of the 50 U.S. states and Puerto Rico show areas that (a) are at high risk of floods, (b) had flood-related property damage, and (c) had flood-related property buyouts between 1989 and 2017. Blue lines in the 1989–2017 map trace out major river systems. Credit: Mach et al., 2019, https://doi.org/10.1126/sciadv.aax8995, CC BY 4.0

Inequities in Administering Buyouts

What made a flood-prone county more likely to administer buyouts?

“We found the counties that have administered buyouts, on average, have higher income and population density,” Kraan said. Population density and income level were used as proxies for a county’s administrative capacity.

“However, when we looked within those counties to where the buyouts took place, the neighborhoods where the buyouts were actually located, on average, have lower household income, lower population density, and more racial diversity,” Kraan added. High-buyout areas also corresponded to areas with lower levels of education and English language proficiency.
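
The two-level comparison the study describes (county level versus neighborhood level) can be sketched in a few lines of pandas; the table layout, column names, and values below are invented for illustration and are not the study’s actual data.

```python
import pandas as pd

# Hypothetical census-tract table; columns and values are illustrative only.
tracts = pd.DataFrame({
    "county":        ["A", "A", "A", "B", "B", "C"],
    "median_income": [62_000, 41_000, 75_000, 38_000, 55_000, 47_000],
    "pop_density":   [3200, 1100, 4800, 600, 2100, 900],
    "buyouts":       [0, 12, 0, 0, 3, 0],
})

# County level: do counties that administer any buyouts look wealthier and denser?
county = tracts.groupby("county").agg(
    buyouts=("buyouts", "sum"),
    income=("median_income", "mean"),
    density=("pop_density", "mean"),
)
print(county.assign(any_buyouts=county["buyouts"] > 0)
            .groupby("any_buyouts")[["income", "density"]].mean())

# Neighborhood (tract) level, within buyout counties only.
buyout_counties = county.index[county["buyouts"] > 0]
within = tracts[tracts["county"].isin(buyout_counties)]
print(within.assign(any_buyouts=within["buyouts"] > 0)
            .groupby("any_buyouts")[["median_income", "pop_density"]].mean())
```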

These trends raise concerns about equity. Do the programs enable white people to move into less diverse areas? Are people of color, those with less formal education, or those with a lower proficiency in English relocated in the same numbers and in the same ways as less marginalized groups?

Case-based analyses and an independent investigation conducted by NPR earlier this year have shown that “after a disaster, rich people get richer and poor people get poorer. And federal disaster spending appears to exacerbate that wealth inequality.”

Cases have also shown that vulnerable populations have been pressured into buyouts, lied to about flood risk, and relocated to equally flood-prone areas.

The FEMA data do not include demographic or personal information about buyout participants, so researchers could not assess whether the property buyouts augmented large-scale systemic inequalities, widened the wealth gap, or increased neighborhood segregation.

Property buyouts are long, arduous, and complicated processes, so the trend might simply reflect county-level resources. “Buyouts require resources and capacity to administer, and not all governments may be equally able to access the program,” Kraan said. “While it isn’t clear why buyouts are taking place in more vulnerable neighborhoods, the study points to the importance of evaluating equity in buyout practices and outcomes.”

Concerns for Scaling Up

The data also revealed that more and more counties each year buy out only a couple of properties. Small buyouts are less cost-effective than larger buyouts, can have a higher administrative burden, and could miss opportunities for more strategic planning of floodplains. Understanding why buyouts have been small can help us scale up for future retreat efforts, the team wrote.

Regardless, if buyouts are left to local governments to administer, “future retreat in poorer and more rural communities may be less likely to be supported and managed by government. These populations could therefore be at increased risk of becoming trapped in areas of high flood risk,” the team wrote.

Another concern is the gaps in FEMA records of property buyouts. “Fifty percent of the fields are blank when it comes to what types of structures and what types of residences they are,” Mach said. “We know they are mostly single-family homes that are primary residences. But then if you go to the address-level data…it seems like, for example, mobile home park residences may particularly be falling through the cracks.”

The data gaps add a new layer to the equity questions raised here, she added, and hinder our ability to learn from the buyouts to date and to inform future large-scale deployments.

“It’s not a ‘retreat or else’ type of question,” Mach said. “It’s about how retreat fits into our broader portfolio for climate change adaptation.”

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

Putting a Price on the Costs of Climate Related Health Impacts

Wed, 10/09/2019 - 12:14

Estimating the price tag of climate change on human health is critical but challenging. Scientists at the Natural Resources Defense Council (NRDC), a United States-based international nonprofit environmental organization, have been working closely with academic partners to paint a clearer picture of these missing health costs.

Their 2011 study was a first attempt to shed light on the scale of health costs related to climate change. It predicted that climate-related extreme events that can harm health were only going to get more frequent and more intense—which is to say, more harmful—in years to come. And they were right: globally, the five warmest years on record all occurred after the paper’s 2011 publication.

But while climate change is a global issue, its impacts are localized and personal, and there is growing demand for specific information on how climate change will impact health in different places. NRDC has been active in translating global warming into a local issue, with a human face.

The latest work is presented in a research article recently published in GeoHealth. It describes ten events in 2012 in the United States, and offers more specific, localized information on the health impacts of climate change, and their costs. I had an opportunity to ask Vijay Limaye and Kim Knowlton, part of the article’s author team, some questions about their latest study and the importance of Earth sciences in offering insights into public health concerns.

Why is it important to estimate the costs associated with climate-sensitive diseases?

Climate-driven health impacts are serious, widespread, and costly—but these damages are largely absent from the policy debate around the costs of inaction and delay on climate change.

Figuring out these costs is challenging because of the array of coordinated data sources — from environmental monitoring and health tracking to healthcare use — that are needed to connect environmental exposures to health impacts to costs.

Our study highlights the importance of supporting coordinated, climate-health monitoring and tracking, to gain a more complete, better-articulated picture of the whole fabric of climate-sensitive events in the United States and their associated costs.

By understanding the local health costs of climate change, there can be more targeted local efforts to avoid the root causes of climate change, and to enhance climate resilience and avoid health harms and their costs, before they occur.

What are some of the challenges to estimating climate-sensitive health costs?

Taylor Bridge Fire, one of many wildfires in Washington state in 2012. Credit: Washington Department of Natural Resources (CC BY-NC-ND 2.0)

We identified three key hurdles. First, tracking of the range of climate-sensitive health impacts is limited. Many U.S. states are tracking the health impacts of extreme heat exposures, for example, but other climate-driven impacts that are not as obvious — such as the mental health impacts of stronger hurricanes and wildfires — are not monitored comprehensively.

Second, estimating illness costs requires information about specific patient diagnoses, which can be hard to access.

Third, it’s difficult to capture healthcare costs that occur well after a disaster occurs, such as the need for ongoing outpatient care or prescribed medications. Our attention often shifts from one climate change-fueled event to the next, but we really need sustained attention on the ground to better understand the cumulative impacts of increasingly common climate-driven health problems.

What was the focus of your study?

We looked at ten different climate-sensitive events across the United States that occurred in 2012, when the country experienced record-breaking heat. These case study events were diverse in type, intensity, and location: wildfires in Colorado and Washington, harmful algal blooms in Florida, tick-borne Lyme disease in Michigan, Hurricane Sandy in New Jersey and New York, ozone air pollution in Nevada, allergenic oak pollen in North Carolina, extreme weather in Ohio, a mosquito-borne West Nile Virus outbreak in Texas, and extreme heat in Wisconsin.

The ten climate-sensitive events from 2012 included in the study. Credit: Limaye et al. [2019], Figure 1

Climate change is expected to worsen each of those problems in the future, to differing degrees. We were interested in estimating how much these climate-driven events cost us, as a society, in terms of certain health-related expenses: emergency room visits, hospitalizations, lost work days, medications, outpatient care, and premature deaths.

For each case study, our team used state-collected health surveillance data, federal reports, and other published data on health impacts to estimate health-related costs.

In total, those 10 climate-sensitive events during 2012 led to an estimated 917 deaths, 20,568 hospitalizations, and 17,785 emergency department visits, along with other health-related expenses, totaling nearly $10 billion (in 2018 dollars) in health-related costs. That’s an astounding price tag, and it only scratches the surface in terms of documented climate-health impacts in recent years.
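
The arithmetic behind such totals is essentially a cost-of-illness tally: outcome counts multiplied by per-outcome dollar values and summed. The counts and unit costs below are placeholders chosen to show the structure, not numbers from the study.

```python
# Sketch of a cost-of-illness style tally for one hypothetical event.
# All counts and unit costs are placeholders; the study valued documented outcomes in 2018 dollars.
unit_costs = {
    "premature_death": 9_000_000,   # a value-of-statistical-life style figure (hypothetical)
    "hospitalization": 30_000,
    "ed_visit": 1_500,
    "lost_work_day": 200,
}
event_counts = {
    "premature_death": 12,
    "hospitalization": 400,
    "ed_visit": 900,
    "lost_work_day": 5_000,
}
total = sum(event_counts[k] * unit_costs[k] for k in event_counts)
print(f"estimated health-related cost: ${total:,.0f}")
```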

Were there any findings that surprised you?

Destruction of property on Staten Island caused by Hurricane Sandy in 2012. Credit: John de Guzmán (CC BY-ND 2.0)

We were struck by the many types of health problems linked to climate-sensitive events that extended way beyond what typically comes to mind when we think about the health risks posed by increasing global temperatures.

Our research identified a whole host of illnesses, including pregnancy complications, carbon monoxide poisonings, and kidney disease complications, all linked to the aftermath of Hurricane Sandy in New Jersey and New York.

We were also surprised by the deadly toll of wildfire-driven air pollution, which is estimated to have caused hundreds of premature deaths in Colorado and Washington.

And we doubt whether many people outside Florida realize the number of hospital admissions and emergency room visits there due to harmful algal blooms.

Based on what we know about the data, these examples also signal to us that there’s a whole host of other health problems that are likely hitting people right now, so we’ve got to urgently address the root problem of climate change to prevent widespread suffering on a level that we’ve never seen before.

How does this study relate to our broader understanding of the health impacts of climate change?

While there’s a growing scientific evidence base on the health impacts of climate change across the country and around the world, there’s been less of an effort to stitch together individual events to help us better understand the scale of the overall challenge.

Our work tries to bridge this gap by synthesizing evidence of the profound and growing burden of climate change on individuals, families, and communities—especially the most vulnerable among us. Our study was particularly novel in applying recent advances in the climate-health science arena when it comes to identifying the health impacts of wildfire smoke, allergenic oak pollen, and harmful algal blooms.

We hope that this work will stimulate future efforts to help us better understand the full picture of what’s happening around the world, and ultimately strengthen the case for more ambitious action on climate change mitigation and adaptation.

Earlier you mentioned that the ten events in 2012 generated around $10 billion in health-related costs. Is that the cost of climate change?

No. It’s important to note that our study was not a climate change attribution analysis, and we did not apply statistical modeling techniques in order to definitively link climate change to any particular health harm. Rather, our study is an effort to identify the range of climate-sensitive health problems, i.e. those expected to worsen in frequency, intensity, duration, and/or areal extent in the future due to climate change.

Blooms of the alga Karenia brevis, known as Florida red tide, can cause human respiratory and digestive illnesses. Credit: NOAA (public domain)

Attributing specific environmental events to climate change is an increasingly precise and urgent science; our endeavor was to synthesize the evidence on climate-sensitive exposures and make the connection to health costs.

We carefully document how each of the case studies in our analysis is linked to climate change and make the case that we’ve already got enough strong evidence about those risks right now in order to act and in order to protect public health.

We also know that far more than ten climate change-fueled events occurred in the United States in 2012, at different scales – regional, statewide, and local. But only about 20 or so states have networks that track any climate-health outcomes. We have a huge opportunity to expand tracking networks and disseminate valuation methods, which would empower local health departments to estimate costs for the most serious local climate-health events. If we knew the whole national health cost burden, there’d be yet another compelling reason to put the brakes on runaway climate change.

What role do the Earth and environmental sciences play in studies like yours that focus on human health?

Advances in the geosciences are crucial for helping to make the connections between climate change, environmental exposures, and human health.

As we point out in our study, the data linkages necessary to conduct climate-health impact analyses and cost estimates rely on robust environmental monitoring and health surveillance. Public health protections are only as strong as the environmental data that underlies risk assessments.

We’re living in a brave new world of unprecedented environmental change, and we need interdisciplinary efforts that can effectively connect the dots between those changes and real consequences for human health.

How can the AGU journal GeoHealth help enhance scientific collaboration between environmental and health scientists?

The forum offered by the GeoHealth journal provides a unique opportunity to better illuminate the links between environmental change and human well-being, a topic that’s of vital importance. By bringing together scientists from a range of disciplines, approaches, and geographic areas, the journal can inform urgent policy debates in a robust and timely manner.

Important national and international synthesis reports on climate change, like the U.S. National Climate Assessment, are directly informed by peer-reviewed interdisciplinary science on climate impacts on health. The GeoHealth journal can also help to address important research gaps in this area, by deploying and supporting a growing network of Earth and health scientists.

—John Balbus, National Institutes of Health, USA; Vijay Limaye (vlimaye@nrdc.org; 0000-0003-3118-6912) and Kim Knowlton (kknowlton@nrdc.org; 0000-0002-8075-7817), Natural Resources Defense Council, USA

Gravel Gives Clues to the Strength of Paleotsunamis

Wed, 10/09/2019 - 12:13

Spending a day on the shores of the Pacific Ocean means traveling past at least one blue, rectangular warning sign: tsunami hazard zone. The sign warns that if the earth rumbles, the sea suddenly retreats from the beach, or strange noises come from the ocean, beachgoers should hustle for higher ground.

Tsunami risk areas are usually determined on the basis of estimates of how big local tsunamis were in the past. But understanding the reach of ancient tsunamis—how big, how fast, and how far inland—can be tricky.

In a new paper in Scientific Reports, researchers reconstructed the characteristics of past tsunamis using gravel. Specifically, scientists looked at how round the sediment was in paleotsunami deposits to infer how far inland the wave reached. They then used that inundation level to estimate the size of ancient tsunamis.

Tsunami Deposits

The study site—Koyadori, a V-shaped valley along the Sanriku coast in eastern Japan—experienced tsunami run-up during the 2011 Tohoku earthquake. In addition to the modern event, Koyadori has historical records of past tsunamis spanning 400 years.

The team looked at sediments in trenches, outcrops, and drilling cores to identify tsunami deposits. The team ultimately collected gravel from 164 tsunami deposits at 56 sites in the valley.

Researchers collected sediments from 164 tsunami deposits at 56 sites in the Koyadori valley, Japan. Credit: Daisuke Ishimura

“Event deposits are distinguishable layers within normal sediments,” said Daisuke Ishimura, an author on the study and an assistant professor at Tokyo Metropolitan University. He added that the sediment structures, macro- and microfossils, and mineralogy helped the team distinguish paleotsunami deposits from storms and high tides.

But it’s not just geological evidence that pointed the researchers to tsunami deposits.

“The local resident interviews were also useful,” said Ishimura. “They told me inundation history at Koyadori during the [most] recent 50 years.” Ishimura added that the eyewitness reports also helped his team confirm whether the deposits were a result of a storm or high tide versus a tsunami.

Clues in the Gravel

After collecting and identifying tsunami deposits, the team examined their shape. “Each of the source sediments at Koyadori has a unique distribution of gravel roundness,” said Ishimura.

Instead of eyeballing how round the sediment was, the team used image analysis. This machine-led task gave the team “10 to 100 times more data than existing, manual methods,” said Ishimura. “Therefore, we could statistically calculate the mixture ratio of fluvial and beach sediments.”
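
One way to automate that kind of measurement is to segment grains in an image and compute a shape statistic for each. The circularity metric below is a common proxy and may differ from the roundness measure the authors actually used; the synthetic image only illustrates the workflow.

```python
import numpy as np
from skimage import measure

def grain_circularity(binary_image):
    """Compute a simple roundness proxy, 4*pi*area / perimeter^2 (1.0 for a circle),
    for every grain in a binary image of separated gravel particles."""
    labels = measure.label(binary_image)
    values = []
    for region in measure.regionprops(labels):
        if region.perimeter > 0:
            values.append(4 * np.pi * region.area / region.perimeter ** 2)
    return np.array(values)

# Synthetic usage: one round grain (disk) and one angular grain (square).
img = np.zeros((200, 400), dtype=bool)
yy, xx = np.ogrid[:200, :400]
img[(yy - 100) ** 2 + (xx - 100) ** 2 <= 50 ** 2] = True   # disk
img[60:140, 260:340] = True                                # square
print(grain_circularity(img).round(2))                     # disk close to 1.0, square lower
```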

Ishimura said because they had recorded inundation distances from three modern and historical tsunami deposits, researchers could normalize each sample distance from the coastline.

In each paleotsunami deposit, “we found a common, abrupt change in the ratio of beach sediments to fluvial sediments at approximately 40% of the inundation distance from the coastline, regardless of tsunami magnitude,” said Ishimura. They named this change in ratio the tsunami gravel inflection point (TGIP). Ishimura said that by using TGIP, he and his colleague were able to estimate paleotsunami sizes at Koyadori.
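
Conceptually, locating the TGIP amounts to finding where the beach-derived fraction of gravel drops sharply along the normalized inundation distance. The profile below is invented to mimic that pattern, and the single-largest-drop rule is a crude stand-in for the study’s statistical treatment.

```python
import numpy as np

# Hypothetical profile: fraction of beach-derived gravel in each deposit versus
# distance from the coast, normalized by the total inundation distance.
norm_distance = np.array([0.05, 0.15, 0.25, 0.35, 0.45, 0.60, 0.80, 0.95])
beach_fraction = np.array([0.90, 0.85, 0.80, 0.75, 0.30, 0.20, 0.15, 0.10])

drops = np.diff(beach_fraction)
i = np.argmin(drops)                                   # largest single drop between samples
tgip = 0.5 * (norm_distance[i] + norm_distance[i + 1])
print(f"inflection near {tgip:.2f} of the inundation distance")  # ~0.40 here by construction
```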

This estimation was an important contribution of the new research, said Bruce Jaffe, a research oceanographer at the U.S. Geological Survey not involved in the new study.

“Historically, most of the focus amongst tsunami geologists has been on establishing occurrence: how often the tsunamis hit and the time between two tsunamis,” said Jaffe.

Jaffe added that to understand the tsunami hazard in an area, you have to know “how big, how often.”

Jaffe said he’d rather know if an area was hit by a small or huge tsunami in the past rather than how often an area was inundated. “Of course, knowing both how big and how often is even better and allows, with enough data points, a probabilistic approach to tsunami hazard assessment.”

Site-Specific Tsunami Study

Ishimura said although their research was a successful case study of estimating paleotsunami sizes from the roundness of deposits, it is site specific.

“Specific values and patterns may be different in other places,” said Ishimura, “but we believe that some relationship may be found site by site.”

Jaffe agreed, noting that there are places where this approach may not work—a coastal plain region, for example, where water is not as concentrated as the narrow valley of Koyadori. But he said that the team’s “logic is sound and I would like to see the approach applied to other places in the world.”

“There has to be an additional step where the science gets back to the people who are planning the evacuation routes, determining building codes, and educating the public about the risk of tsunami,” said Jaffe. “This is one step of the journey to decrease loss of life and destruction from tsunamis.”

—Sarah Derouin (@Sarah_Derouin), Freelance Journalist

Groups Oppose Cuts to Federal Advisory Committees

Tue, 10/08/2019 - 19:45

Dozens of organizations—including scientific, environmental, and public health groups—are calling on President Donald Trump to rescind his 14 June 2019 executive order that limits the number of Federal Advisory Committee Act (FACA) advisory committees.

“By requiring elimination of one-third of existing advisory committees and capping the total number of committees at 350, the order would arbitrarily eliminate essential advice that informs government decisionmaking,” states a 4 October letter signed by 77 groups, including the Consortium for Ocean Leadership (COL), Natural Resources Defense Council, and Union of Concerned Scientists (UCS).

FACA committees provide important outside expertise to federal agencies on a broad range of issues, including science, energy, medicine, and the arts. The executive order states that criteria for potential termination include whether the committees’ stated objectives have been accomplished and whether operation costs for committees are considered excessive in relation to their benefits to the federal government.

Committees Have Been a “Bargain for Taxpayers”

The convening of experts “to deliberate on pressing matters is a bargain for taxpayers,” the letter states. “The removal of advisory committees across the government without a compelling rationale is a threat to a vital independent source of information and deliberation.”

So far, two committees have been confirmed as cut, according to Genna Reed, a lead science and policy analyst at the UCS Center for Science and Democracy: the Invasive Species Advisory Committee at the Department of the Interior and the Marine Protected Areas Federal Advisory Committee at the Department of Commerce. Reed determined that those committees had been cut because they appear in a 29 September 2017 executive order continuing their functions but not in a similar 27 September 2019 order. The 14 June executive order, which requires the termination of at least a third of federal advisory committees, set a 30 September deadline for federal agencies to comply.
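
The comparison Reed describes boils down to a set difference between the committees continued by the 2017 order and those continued by the 2019 order. Aside from the two confirmed cuts named above, the committee names below are placeholders.

```python
# Committees listed in the 2017 continuation order but absent from the 2019 order
# were presumably terminated. Names other than the two confirmed cuts are hypothetical.
continued_2017 = {
    "Invasive Species Advisory Committee",
    "Marine Protected Areas Federal Advisory Committee",
    "Example Continuing Committee",          # hypothetical placeholder
}
continued_2019 = {
    "Example Continuing Committee",          # hypothetical placeholder
}
print(sorted(continued_2017 - continued_2019))
```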

Reed thinks other cuts have been made and committee members have been or will be informed. She says she thinks the public will find out about further FACA cuts “in a piecemeal way.”

“The fact that the agencies haven’t put out a public list of advisory committees that they cut is telling. It shows that they understand that there is a public backlash and that these cuts that they are engaging in are inappropriate and unwarranted,” Reed said.

The executive order “is part of this administration’s track record with failing to listen to its own scientists and failing to listen to its external experts who have different opinions perhaps, or inconvenient information that would be disruptive to the administration’s agenda,” she said.

Jon White, president and CEO of COL, said that FACA committees are the best means to ensure that the brightest scientists in the world “are being used to make the best possible decisions by our government. If we don’t have them, then we are losing an incredibly valuable part of our [capability] to inform our future and help us to navigate effectively and safely and not run aground.”

White said he doesn’t accept the rationale that these committees should be cut because of their costs. “I don’t know what the reasoning is, but I don’t buy that,” he said.

He added, “I don’t know of any [committees] that I would offer up to actually cut. It’s a huge concern for the scientific community and, I think, for our nation as well.”

Congressional Concerns

Some members of Congress have raised concerns about the executive order. Rep. Eddie Bernice Johnson (D-Texas), chair of the House Committee on Science, Space, and Technology, questioned the executive order in 12 July letters to a number of federal science agencies. Rep. Sean Casten (D-Ill.) introduced the Preserve Science in Policymaking Act of 2019 on 27 September. The bill would “prohibit the termination of advisory committees before the end of their charter unless authorized by law, and for other purposes.”

The elimination of FACA committees “is going to continue to be an area that Congress is interested in doing oversight on,” Reed, from UCS, said. “There really should be a concerted federal effort to look at how and why agencies are making decisions to cut these advisory committees and what kind of impact these cuts are going to have on the quality of government decisions.”

—Randy Showstack (@RandyShowstack), Staff Writer

AGU CEO Announces Leadership Transition

Tue, 10/08/2019 - 16:48

It is with mixed emotions that I am announcing that Chris McEntee, our CEO and executive director, has informed us that she will be departing AGU at the end of the first quarter of 2020. During her tenure, her remarkable leadership has positioned AGU as the leading voice and convener of Earth and space science globally. She has led key initiatives that have increased the relevance and value of Earth and space science to humanity. As a result of Chris’s efforts, AGU is stronger than ever, is well positioned for a vibrant and successful future, is celebrating its centennial year, and is creating its next strategic plan. The organization is poised for continued success.

Although we had hoped Chris would stay with us longer, she has decided that now is the time for her to enter the next phase of her career and life and pursue a balance of professional and personal interests. We will miss her energy, commitment, and friendship, and we are sure she will continue as a strong advocate for our work in whatever future path she pursues. This is an opportunity to have a new CEO as an integral partner in guiding the priorities of the next strategic plan that will enable another century of discovery science and connecting our science with the needs of society.

AGU’s Board of Directors is now organizing the search process, which will be overseen by a committee that draws broadly from our community. As President, I will chair this search committee. More information about the process will be forthcoming in the next several weeks. In the meantime, Chris will continue to focus on the society’s business and strategic priorities and work with the board and staff to ensure a smooth transition.

Questions for now should be directed to me at president@agu.org. Thank you for your ongoing support of AGU and our global community of Earth and space scientists.

—Robin E. Bell, AGU President

Resilient Peatlands Keep Carbon Bogged Down

Tue, 10/08/2019 - 12:05

Some boreal peatlands may prove more resilient to climate change than previously thought and could thrive as major carbon sinks even as northern regions become drier and less hospitable, new research suggests.

By their nature, peat bogs and other mossy wetlands require some level of moisture to stay intact. If they dry up, through either prolonged drought or drainage for land development, the peat mosses die, and other plants can infiltrate and morph the landscape.

Climate projections suggest that some boreal regions will become warmer and drier with climate change and that this transformation will jeopardize peatland health and capacity to store carbon. But new research from a team based in France shows that two common peat moss species have considerably different tolerances for warming and drying, meaning that some species could take the place of others as conditions shift. Maintaining a diversity of moss species in northern zones could therefore help these landscapes withstand some effects of climate change, the team reports in the journal Global Change Biology.

“It was quite surprising to see that,” said Vincent Jassey, an ecologist at Université Toulouse III–Paul Sabatier in France and lead author on the paper.

These findings may have important implications for the future of global carbon storage, he added. All soils store some carbon, but peatlands hold roughly 30% of the planet’s soil carbon even though they cover less than 5% of Earth’s surface.

Carbon Sinks

Peatlands capture carbon so effectively because unlike most other plants, peat mosses don’t readily break down when they die. Instead, they build up underground, sometimes accumulating meters in depth over the course of thousands of years. This accumulation means that the carbon they suck out of the atmosphere during photosynthesis gets trapped underground rather than cycling back into the atmosphere.

Microbes struggle to decompose peat because the mosses contain complex compounds that are difficult to degrade and because the mosses create highly acidic, low-oxygen environments—conditions that many microbes can’t withstand.

But once these peatlands lose water, oxygen creeps in, and microbial activity can rev up. The rootless plants may also struggle to gather enough water for photosynthesis during dry times.

Still, although prolonged drought certainly stresses bogs, a bit of warming actually promotes photosynthesis and growth in some peat moss species for the same reason that some plants grow better in a greenhouse.

Moss Species

To try to tease apart these compounding effects of warmer, drier conditions on peat mosses, Jassey and his team studied two common species, Sphagnum medium and Sphagnum fallax, under varying levels of temperature increase and moisture loss in the Jura Mountains of France.

To simulate warming, the team built Plexiglas frames around their study plots in the Jura Mountains of France. Credit: Vincent Jassey

They found that S. medium can hold 3 times more water than S. fallax, thanks to larger water-storing tufts called capitula. S. fallax, however, grew more than S. medium did under warming. These results suggest that S. medium may fare better in regions that become more drought prone as the climate changes, whereas S. fallax may succeed more readily in areas that experience more warming.

Matthias Peichl, a biogeochemist at the Swedish University of Agricultural Sciences who studies the role of peatlands in the global carbon cycle, finds these results interesting but not entirely surprising.

“It seems to make sense that given such a diverse species presence [of peat mosses], there would always be some that respond positively or negatively to any change,” said Peichl, who was not involved in the study. More than 350 species of peat moss exist around the world, including in some tropical regions.

But, Peichl added, understanding the true, lasting effects of climate change on these species would require a longer-term study because a plant’s response to short-term disturbance may differ from its response to more sustained change.

Hongjun Wang, a biogeochemist at Duke University who also studies the effects of drought on peatlands and wasn’t involved in the new study, agreed that these findings call for follow-up research. He also noted that peatlands can withstand only a certain threshold of change before they inevitably break down. A shift past that threshold, he said, “will change the peatland from a sink to a carbon source.”

Still, in the short term, Jassey said that his team’s findings offer a compelling argument to preserve peatlands and help these landscapes continue to stand as major carbon sinks.

“It shows that peatlands could be resistant to future climate change,” he said, “if we don’t disturb the ecosystem and we keep the diversity as it is.”

—Laura Poppick (@laurapoppick), Freelance Science Journalist

Ancient Maya Farms Revealed by Laser Scanning

Mon, 10/07/2019 - 19:23

Ancient Maya civilization thrived for thousands of years beneath the cover of tropical forest in Central America, but when that civilization declined, much of the evidence of it disappeared as well. For a century, researchers slogged through dense tropical brush to study Maya sites, a laborious and slow-going process.

That all changed when scientists began using lidar, a remote sensing technique that shoots lasers from low-flying planes. By measuring a laser’s travel time, scientists can determine the shape of the ground within a few centimeters and create a picture of the landscape stripped bare of vegetation. In Central America, the technique has exposed thousands of structures previously concealed by the forest canopy.
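The underlying measurement is simple time-of-flight ranging: the pulse’s round-trip travel time, multiplied by the speed of light and halved, gives the distance to whatever reflected it. The minimal sketch below illustrates only that relation; the function name and example delay are ours, not values or code from the survey.

# Illustrative only: convert a lidar pulse's two-way travel time to range.
C = 299_792_458.0  # speed of light, m/s

def pulse_range(two_way_travel_time_s):
    """One-way distance (m) to the reflecting surface."""
    return C * two_way_travel_time_s / 2.0

# A return delay of ~3.8 microseconds corresponds to a surface roughly 570 m
# below the aircraft, the flight altitude reported in the study.
print(round(pulse_range(3.8e-6), 1))  # ~569.6 m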

The latest study used lidar to uncover a network of ancient canals and farming fields in the low-lying wetlands of northwestern Belize. According to previous research, these fields may have held maize, arrowroot, avocado, and other crops, and new dates from the study show that the fields were heavily used between 1,800 and 900 years ago. The researchers discovered four distinct farming networks in the area, one of which was much larger than earlier estimates suggested and another that the scientists hadn’t known existed.

The findings suggest “early and extensive human impacts on the global tropics,” according to Tim Beach, a professor at the University of Texas at Austin and lead author of the paper, published in the Proceedings of the National Academy of Sciences of the United States of America. The researchers hypothesize that cultivating the wetlands for farming could have caused carbon dioxide and methane emissions and may be one source of early greenhouse gas emissions from humans.

Fields of Plenty

Researchers had known for decades about the suspiciously straight-running channels east of two ancient Maya settlements, Akab Muclil and Gran Cacao, but couldn’t rule out natural causes. For the latest study, they flew 570 meters above the suspected sites and sent out more than 6.5 billion lidar pulses.

The best-studied area in the paper was the Birds of Paradise wetland network, which stretched across 5 square kilometers and contained a maze of canals running for 71 kilometers. Beach and his colleagues had been studying the area for 2 decades but hadn’t realized that the site was 5 times larger than they had suspected.

Lidar “is able to pick up features even the most experienced researchers can miss on the ground,” said Christopher Carr, a research assistant professor at the University of Cincinnati who was not involved in the study. “This paper is another reminder of how lidar is revolutionizing archaeology in the tropics.”

The authors emphasized that ground truthing lidar findings is key, and the latest study used multiple lines of evidence to rule out natural processes, including excavation of ancient canals, chemical analysis, and radiocarbon dating of the soil.

Digging into the fields at 23 sites, researchers uncovered layers of ash left behind after the Maya burned the fields before planting. The scientists also tested the ratio of the stable carbon isotopes 13C and 12C in the soil and found that the ratio rose during Maya farming. The ratio reflects the types of plants growing in the area, and higher values indicate maize and other species associated with human activities.
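For readers unfamiliar with how such isotope shifts are quantified, they are conventionally expressed in delta notation relative to a reference standard (a general definition, not a detail drawn from this study):

\delta^{13}\mathrm{C} = \left( \frac{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{sample}}}{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{standard}}} - 1 \right) \times 1000\ \text{‰}

Maize is a C4 plant, and C4 photosynthesis discriminates less strongly against 13C than the C3 pathway used by most native forest vegetation, so soils that supported sustained maize cultivation drift toward higher (less negative) δ13C values.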

The authors claim that the results show a “widely distributed agroecosystem” for the Maya living in northwest Belize and suggest that so much farming could have led to an increase of carbon dioxide and methane in the early days of human civilization.

“We now are beginning to understand the full human imprint of the Anthropocene in tropical forests,” said Beach in a press release. “These large and complex wetland networks may have changed climate long before industrialization, and these may be the answer to the long-standing question of how a great rainforest civilization fed itself.”

—Jenessa Duncombe (@jrdscience), News Writing and Production Fellow

Foretelling Forest Death from Above

Mon, 10/07/2019 - 15:36

The speed at which a forest recovers from disturbances can foretell that forest’s untimely demise. In a paper published today in Nature Climate Change, researchers tracked via satellite the vitality of California’s forests during the recent prolonged droughts and developed an early-warning signal for forest death. The new signal can detect a forest’s death spiral 6–19 months ahead of time.

Statistical and empirical formulas for predicting forest mortality “can change over time, especially as climate in the future will be outside the regime of historical climate,” said lead researcher Yanlan Liu, an environmental scientist at Stanford University in California. “This method…directly monitors the dynamics of vegetation from remote sensing, meaning that it’s bridging the gap between climate and vegetation.”

Complexities of Modeling Mortal Forests

“Every year, generally, [a forest’s] biomass increases during the green season and reduces in the dormant season,” explained coauthor Mukesh Kumar, a hydrologist at the University of Alabama in Tuscaloosa. “When a tree is stressed, its physiological functions are impaired. The rate of the recovery of the vegetation with respect to its normal cycle gets slower.”

“Forest managers are constantly trying to predict the fate of forest stands in the face of increasing stressors so that they can prescribe management tools to try and avert forest loss and transition to other vegetation types,” said Heather Alexander, a professor of forest biology at Mississippi State University in Mississippi State who was not involved with this research. “However, by the time forests show obvious signs of failing, like browning leaves or leaf loss, it’s usually too late.”

Predicting a forest’s time of death is challenging for models of vegetation dynamics, Kumar said, because mortality is a complicated process at tree and ecosystem scales: A forest can sometimes adapt to challenging conditions for a time, estimates of carbon and water budgets can be off, and data on forest dynamics can be too coarse in spatial resolution.

Identifying the Start of a Death Spiral

To address these challenges, the researchers analyzed high-resolution images of California’s forests taken by the Landsat 7 satellite from 1999 to 2015. That time span encompasses two periods of intense drought in California, one during 2007–2009 and the other in 2012–2015. Forest managers estimate that the latter drought, which lasted until 2017, killed nearly 150 million trees.

With the images, “we’re using the normalized difference vegetation index, NDVI, which is a measurement of greenness from satellite data,” Liu said. “It’s proportional to biomass.”

The researchers used NDVI to track the ebb and flow of the forests’ life signs in different regions and to spot whether and when recovery began to slow down. By comparing these calculations to a map of forests that eventually died, they found that the loss of a forest’s resilience was an early-warning signal for its death.
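As a concrete illustration of the ingredients, the sketch below computes NDVI from red and near-infrared reflectance (the standard definition Liu refers to) and then a generic "slowing recovery" indicator, the lag-1 autocorrelation of NDVI anomalies in a moving window. The autocorrelation indicator is a common proxy for loss of resilience in time series analysis, not necessarily the exact metric the authors used, and the window length and synthetic data are purely illustrative.

import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

def lag1_autocorrelation(series, window=24):
    """Moving-window lag-1 autocorrelation; a sustained rise suggests slower recovery."""
    series = np.asarray(series, dtype=float)
    out = np.full(series.size, np.nan)
    for i in range(window, series.size + 1):
        chunk = series[i - window:i]
        out[i - 1] = np.corrcoef(chunk[:-1], chunk[1:])[0, 1]
    return out

# Example values: healthy vegetation typically has NDVI around 0.6 or higher.
print(ndvi([0.45, 0.50], [0.10, 0.12]))

# Example with synthetic deseasonalized NDVI anomalies for one pixel.
rng = np.random.default_rng(0)
anomalies = rng.normal(0.0, 0.02, size=200)
indicator = lag1_autocorrelation(anomalies)
print(np.nanmax(indicator))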

“We found that 75% of the cases exhibited an early-warning signal more than 6 months before mortality,” Kumar said. “In 25% of the cases, it showed more than 19 months before mortality.” In some regions, this signal appeared before a loss of greenness indicated a forest’s imminent demise. But in some areas, the signal for the forest as a whole was very muddled.

“The breakthrough point came when I separated out the early-warning signal by species distribution, meaning that I applied the relationship for pines and oaks separately,” Liu said. “And suddenly, the relationship became very clear. So that means that the resilience signal is species specific.”

The species dependence was initially surprising, both Liu and Kumar said, but it made sense in hindsight. “Each species of tree responds differently,” Kumar explained. “Some trees can handle stress for a long time before shedding their leaves or dying. Others are more susceptible or shed their leaves much earlier.” They found that oaks were more likely to survive under low-resilience conditions than were spruces and pines.

“The key point is that we need to separate the species rather than put them together, because they all have different translations, so to say, between low resilience and mortality,” Liu added.

Saving a Forest’s Life

The researchers emphasized that there’s still a lot of work to be done before this method can be used to predict forest death in other areas of the world. “Caution needs to be taken, because the relationships between low resilience and mortality can change between species and also across climate regions,” Liu said.

Satellite data availability over tropical forests can be spotty because of clouds, and “another thing is that we need accurate [tree] species distribution maps, which I’m not currently aware of for many regions,” Liu added.

Alexander thinks this method has promise. “The early-warning signal tool offered by Liu et al. could provide an exciting new way to predict forest resiliency to stressors like drought many months before the forests show obvious signs of decline,” she said. “This would give managers more time to implement management strategies to alleviate stressors and hopefully restore forest health.”

Doug Miller, an environmental informatics researcher at Pennsylvania State University in University Park who was not involved with this study, added that this research provides “the type of tools that land management agencies are really going to need to manage resources under a changing climate.”

“We do think that this lead time will allow us to do something like prescribed burnings or removal of infested trees, or maybe even do variable density thinning,” Kumar said. Other factors like the economics and logistics of forest management, he added, also need to line up to determine whether this signal gives enough warning to keep a forest alive.

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

CAT Pictures of Internal Solitary Waves in Indonesian Strait

Mon, 10/07/2019 - 15:24

Internal solitary waves are generated when strong tides interact with steep ocean bathymetry to produce rapid, deep-reaching and intense changes in the ocean temperature, along with enhanced turbulence and mixing of the water column that can influence biological productivity.

The relatively narrow Lombok Strait in Indonesia is a known generation area of internal solitary waves that are visible as surface slicks in remotely sensed synthetic aperture radar (SAR) images. However, the SAR snapshots provide little detail about the abrupt subsurface changes associated with internal waves.

Syamsudin et al. [2019] observed the subsurface structure of the sequential passage of internal solitary waves in Lombok Strait for the first time using a moored coastal acoustic tomography (CAT) array. The inferred temperature changes with depth associated with the waves show alternating warm and cold peaks of 3°C to 6°C appearing over periods of hours that were coherent over the upper 600 meters of the water column. While the internal waves are linked to the interaction of the diurnal tidal cycle with the strait’s sill, they are superimposed on, and potentially influenced by, the large-scale background flow of the southward-flowing Indonesian Throughflow in Lombok Strait.
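The principle behind the method is that sound travels faster through warmer seawater, so small changes in acoustic travel time along a fixed path between moorings can be inverted for path-averaged temperature changes. The back-of-the-envelope sketch below uses Medwin's simplified empirical sound speed formula; the salinity, depth, temperatures, and station separation are illustrative values, not numbers taken from the study.

# Rough illustration of the physics behind coastal acoustic tomography.
def sound_speed(T, S=34.0, z=300.0):
    """Medwin's (1975) approximate sound speed in seawater (m/s).
    T: temperature (deg C), S: salinity (psu), z: depth (m)."""
    return (1449.2 + 4.6 * T - 0.055 * T**2 + 0.00029 * T**3
            + (1.34 - 0.010 * T) * (S - 35.0) + 0.016 * z)

# A 3 deg C warming raises sound speed by roughly 10 m/s under these conditions,
# which shifts the travel time over an assumed 35 km path by about 0.15 s.
c_cold, c_warm = sound_speed(12.0), sound_speed(15.0)
path_m = 35_000.0
dt = path_m / c_cold - path_m / c_warm
print(round(c_warm - c_cold, 1), "m/s;", round(dt, 3), "s")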

Citation: Syamsudin, F., Taniguchi, N., Zhang, C., Hanifa, A. D., Li, G., Chen, M., et al. [2019]. Observing internal solitary waves in the Lombok Strait by coastal acoustic tomography. Geophysical Research Letters, 46. https://doi.org/10.1029/2019GL084595

—Janet Sprintall, Editor, Geophysical Research Letters

Wildfires Affect Water Resources Long After the Smoke Clears

Mon, 10/07/2019 - 15:18

The number of wildfires burning across the western United States over the past 6 decades has been steadily increasing, and those fires are growing larger and more severe, especially in mountain areas where more than 65% of clean water resources for the West’s 75 million people originate. What happens when fires intersect water resources is the subject of two new papers in Hydrological Processes.

Large-Scale Modeling

The watersheds of the Sierra Nevada deliver water to more than 25 million people, primarily via snowmelt, and the conifers of the Sierra are where many of the most severe fires are burning.

As the trees burn, those land cover changes affect the hydrologic cycle. Previous studies examining changes in runoff and streamflow, evapotranspiration, soil moisture and infiltration, and snow dynamics have indicated that all of these factors would be somewhat affected after a fire. But until now, scientists hadn’t put it all together, said Fadji Zaouna Maina, a hydrologist at Lawrence Berkeley National Laboratory (LBNL) and lead author of one of the new papers.

Maina and her LBNL colleague Erica R. Siirila-Woodburn devised a large-scale modeling effort to understand how “postfire perturbations” affect hydrologic dynamics in the Cosumnes watershed, a vast and complex watershed that spans the Sierra Nevada and the Central Valley. The watershed includes 2,000 meters in elevation change from the headwaters to the valley, irrigated areas as well as forestland (more than half the watershed is conifer forests), and variegated geology, from low-permeability volcanic rocks to highly permeable sands and gravels in the valley. Most precipitation falls as snow. It’s “highly representative” of most watersheds in California, Maina said.

Maina and Siirila-Woodburn ran simulations based on fires occurring in the upper mountainous part of the watershed, in the intermediate area of the watershed, or in the Central Valley downstream. They modeled hydrologic changes based on one of the driest years on record (2015) and the wettest year on record (2017).

Land Cover Changes

Maina and Siirila-Woodburn found that land cover changes were the primary factor controlling hydrodynamics in the watershed.

It’s counterintuitive, Maina said, but snow accumulations increase, and evapotranspiration decreases regardless of whether the fire is followed by a wet or dry year.

Whether there’s a lot of precipitation or a little, snowpack is larger after a fire, which then means that runoff is larger, explained research hydrologist Dennis Hallema, who wasn’t involved in either of the new studies. That’s because as snow falls on an unburned tree canopy, the canopy intercepts much of the snow, which is then lost to sublimation rather than falling to the ground, melting, and recharging aquifers or running off into streams.

The research presents “an interesting pattern,” Hallema said. Burned mountainous watersheds, which the researchers found were the most affected, produce higher streamflows downstream than expected, even in a drought. However, he said, “the extra water that comes downstream after a fire is not necessarily beneficial for municipal water supplies because of water quality issues” such as higher phosphorus levels and more sediment.

Despite the increasing streamflows downstream, he added, “I would not recommend burning down your watershed to have a bit more water.”

Soil Property Changes

In the other Hydrological Processes paper, Jingjing Chen of the Virginia Polytechnic Institute and State University and colleagues looked specifically at the issue of soil repellency and infiltration after fires. They compared soils in burned and unburned areas in Virginia and North Carolina and confirmed that water repellency is increased in soils after fires. The depth of the most water repellent soil varied across the sites.

Chen said the factors of fire-induced soil water repellency include fire temperature, duration, and intensity; soil water content; and organic matter content and its composition derived from plants and microorganisms. Soil texture (including compaction), clay content, and even clay mineralogy all influence the soil water repellency degree, she said, and “the rainfall amount, frequency, and intensity may also influence the persistence of fire-induced soil water repellency.”

“The effect of depth is interesting,” Hallema said. But the fact that Chen and her colleagues found these changes were still significant more than a year after the fires lends credence to the idea that differences in water repellency might be related more to physical properties of the soils than to the fires themselves, Hallema said.

Chen, however, says that the findings indicate that hydrologic processes take longer to recover than previously thought. “If the fire-induced repellency disappears,” she said, that would mean the hydrologic processes reverted to normal, which would have a positive influence on the recovery of plants and ecosystems.

It will be important to see, she added, whether soils in the West respond to fires like the soils her team studied—and that is their next step, along with identifying the mechanisms driving the repellencies.

Fires are hard to prepare for. About the best scientists can do, Hallema said, is develop better models so water and forest managers can make better decisions.

—Megan Sever (@MeganSever4), Science Journalist

Ocean-Based Actions Provide Big Opportunities to Curb Emissions

Mon, 10/07/2019 - 12:44

The global ocean is not only “a victim of climate change” but also “a major part of the climate solution,” according to a new report commissioned by the High Level Panel for a Sustainable Ocean Economy.

“The ocean is on the front lines of the battle against climate change,” the report says, with the ocean already absorbing 93% of the heat trapped by human-generated carbon dioxide emissions.

However, ocean-based mitigation options could reduce greenhouse gas emissions by nearly 4 billion tonnes of carbon dioxide equivalent per year in 2030, and by more than 11 billion tonnes in 2050, relative to projected business-as-usual emissions, states the report, “The Ocean as a Solution for Climate Change: 5 Opportunities for Action,” issued on 23 September. While the report stresses that deep cuts in greenhouse gas emissions from terrestrial sources also are needed, these projected reductions “are larger than the emissions from all current coal-fired plants worldwide.”

“Ocean-based climate action can play a much bigger role in shrinking the world’s carbon footprint than was previously thought,” Peter Haugan, one of the expert authors of the report, said at a 3 October briefing at the World Resources Institute (WRI) in Washington, D.C.

Opportunities and Ancillary Benefits

The five areas that the report analyzes as opportunities for ocean-based action to mitigate greenhouse gas emissions are renewable energy; ocean-based transport; coastal and marine ecosystems; fisheries, aquaculture, and dietary shifts; and carbon storage in the seabed.

The opportunities recognized by the report are practical and largely familiar: investing in offshore energy sources (including wind and tidal); increasing energy efficiency for oceangoing vessels; conserving blue carbon ecosystems; shifting to low-carbon marine food sources; and investing in more research into seabed carbon storage.

At the briefing, speakers said that these opportunities to curb emissions come with many potential side benefits. “What was exciting to me was the fact that [the report] looked beyond climate-related benefits. It looked at nonclimate benefits, which is an important decision factor for many governments,” said Manaswita Konar, an expert author of the report and an ocean economist at WRI.

For example, more energy efficient shipping could be a win–win by helping governments and industry save money on fuel. Dietary shifts could dramatically lower freshwater use and land conversion associated with food production. Preserving blue carbon ecosystems could help mitigate cyclone damage for coastal communities.

The ancillary benefits of mitigation strategies were echoed by Alfonso Silva Navarro, Chile’s ambassador to the United States, at the briefing. The mitigation potentials of the report’s five focus areas “are critical to fight climate change,” he said, while also providing potential economic, social, and environmental benefits, including employment opportunities, enhanced global food security, and the reduction of ocean acidification.

“This is not easy,” Navarro said. “It will require greater political will, clear policy signals to support private-sector engagement, finance mechanisms, and development of new technologies.”

Haugan, who is a program director at the Institute of Marine Research in Bergen, Norway, told Eos that to achieve the goals of the report, there are many hurdles to overcome, including inertia and strong economic forces that resist change and want to maintain the status quo. However, Haugan said that he is fairly optimistic that the changes envisioned in the report can happen. “There are so many opportunities for wealth and prosperity by using the ocean in a better way than we’re doing,” Haugan said, as an example of why he is hopeful.

Achieving the report’s goals “is doable,” Haugan said. “It’s just a question of getting the right conditions in place and realizing the potential and going full speed forward.”

—Randy Showstack (@RandyShowstack), Staff Writer

Interstellar Interloper Borisov Looks Like a Regular Comet, for Now

Fri, 10/04/2019 - 11:39

The solar system’s newest known visitor may help fill in some of the gaps in our understanding of how planets form.

At the beginning of September, astronomers realized that a newly discovered comet, 2I/2019 Borisov, originated outside our solar system, making it the second known interstellar object. Borisov looks more like a comet than did its predecessor, 1I/2017 U1, informally known as ‘Oumuamua, which appeared more like a lump of rock. Astronomers are beginning to probe the new interstellar visitor’s composition and are finding it surprisingly similar to comets from our own solar system.

“It looks just like another random comet belonging to our Sun,” said Alan Fitzsimmons, a cometary scientist at Queen’s University Belfast. Fitzsimmons led an international team that used the William Herschel Telescope in the Canary Islands to study the cyanogen gas in Borisov. A molecule of carbon and nitrogen bound together, cyanogen is one of the first things studied in comets orbiting the Sun, in part because it is so easy to spot.

The prevalence of cyanogen gas in our solar system didn’t mean that its presence in Borisov was a sure thing. “This is an object from another solar system,” Fitzsimmons said. “Although in almost every single comet we’ve ever studied closely enough in the solar system we’ve seen cyanogen gas, there was no guarantee that it would be in [Borisov]—and yet, there it was.”

Fitzsimmons cautions that these results are preliminary, noting that other teams of astronomers are already probing for different types of gases in the visitor. As scientists learn more about the chemistry of Borisov, Fitzsimmons expects that it will look less and less like our friendly neighborhood comet.

“I would be really shocked if months down the line, we didn’t find significant differences between Borisov and solar system comets,” Fitzsimmons said.

Karen Meech, a cometary astronomer who was part of Fitzsimmons’s team as well as on the team that discovered ‘Oumuamua, agrees. “We wouldn’t have any reason to believe that the exact same chemical mixture should be identical from one solar system to another,” she said. “If it is, it is telling us about some fundamental process [in planetary evolution] that is the same everywhere.”

The new paper will be published in Astrophysical Journal Letters and is available on the preprint server arXiv.

Planetary Leftover

Gas giants can toss debris into other worlds, toward their stars, or out of the system completely, where such debris could one day wander into the solar system. Credit: NASA, ESA, and A. Feild and G. Bacon (STScI)

Throughout the galaxy, gravity pulls clouds of gas and dust together to make stars. The leftover debris around a newborn star may form a disk that can birth planets. After a few tens of millions of years—an eyeblink in astronomical terms—some of the dust and rocks get thrown about, crashing into newly formed planets, tossed into the star, or hurled from the system completely.

“Being left over from planet formation is kind of the simplest explanation [for Borisov],” said Sean Raymond, who models planet formation at the Université de Bordeaux in France.

According to Raymond, most of the stuff hurled out of a young planetary system is icy because icy material tends to form farther out, where the star’s gravity has a weaker hold. Giant planets are also expected to form farther from their stars, and they are often the most culpable when it comes to tossing out young comets.

“We expect most of the things floating out there in interstellar space should be kind of like comets,” Raymond said.

Like Winning the Lottery

Although the asteroid-like ‘Oumuamua left astronomers scratching their heads, Borisov looks more like what astronomers have expected to spot since they began hunting for interstellar visitors decades ago. But even Borisov has its own surprises.

“This object is just too bright,” said Robert Jedicke, a comet researcher at the University of Hawai‘i. Jedicke uses his knowledge of how asteroid surveys find asteroids to calculate how often astronomers should be able to discover new interstellar objects. Because no interstellar objects had been found in the roughly decade and a half of observations before ‘Oumuamua, astronomers expected that it would be the last one seen for a few years.

But Borisov is shining even brighter than expected. At its brightest, around the week of Christmas, the extraterrestrial comet should shine only slightly dimmer than Pluto. That’s still too faint for most amateurs to spot, although the high-end telescopes sported by some enthusiasts could catch it. That’s already been demonstrated because the visitor was discovered by an amateur astronomer, Gennadiy Borisov, using a homemade 0.65-meter telescope.

“This is like winning the lottery the first time we ever played it,” Jedicke said. For every interstellar interloper as bright as Borisov, he estimates there should be many, many fainter objects—but other than ‘Oumuamua, none has been discovered. Jedicke is confident that they haven’t been overlooked.

“This object is either a complete fluke or it’s telling us something fundamental about the behavior of these kinds of objects,” Jedicke said. Exactly what that might be will take time to learn and will probably require observing more interstellar objects.

If Borisov suddenly sheds a bulk of material in a cometary flare, it could grow even brighter than expected, allowing astronomers the opportunity to search for even more hard-to-find chemical constituents. Even without a flare, the comet will remain in sight for almost a year, although it will be visible only from the Southern Hemisphere after December. Astronomers will have far more time to study it than the scant 2 weeks they had with ‘Oumuamua.

“We’re only just beginning to explore this new population of objects,” Fitzsimmons said. “It’s the birth of a new field of scientific investigation.”

—Nola Taylor Redd (@NolaTRedd), Freelance Science Journalist

Freshwater Pools Show Antarctica Is More Vulnerable Than We Thought

Fri, 10/04/2019 - 11:37

There are more lakes of melted ice on the East Antarctic Ice Sheet than previously thought, according to the most comprehensive survey to date of the world’s largest ice sheet. The research, published in Scientific Reports on 25 September, found that the meltwater lakes predominantly form at low elevation, on shallowly sloped ground, and near the border where ice meets land.

“This dataset should help us better understand why lakes are forming where they are and that will help us predict how the distribution of lakes will change in the future, especially if air temperatures warm,” lead researcher Chris Stokes said in a statement. Stokes is a glaciologist at Durham University in the United Kingdom.

Dotting the Edges of Ice

The researchers gathered high-resolution images of the East Antarctic Ice Sheet from Landsat 8 and Sentinel-2A satellites taken during the January 2017 melt season. The team searched roughly 5 million square kilometers of the ice sheet.

From those images, the team identified nearly 65,500 supraglacial meltwater lakes that cover a total area of 1,400 square kilometers. Many of the lakes were about the size of a standard swimming pool, but the largest was about 70 square kilometers, bigger than the country of San Marino. The researchers said that there likely were many more lakes smaller than their detection limit.

Most of the lakes formed at 300-meter elevation or lower, in areas with surface slopes shallower than 1°, on ice flowing slower than 120 meters per year, and within 5–10 kilometers down ice of the grounding line. Around 39,000 (60%) of the lakes were on floating ice shelves along the periphery of the ice sheet. These patterns are similar to those discovered for meltwater lakes on the Greenland Ice Sheet, the researchers say.

The results “clearly indicate that some regions of the [ice sheet] may be closer to the threshold of instability than previously thought,” the researchers wrote. They cautioned that the 2016–2017 austral summer was probably an above-average summer for melting. Future surveys are needed to understand how the distribution of meltwater lakes changes over time and in different climate conditions.

This research will give glaciologists a baseline against which to measure rapid changes to the cryosphere from global warming, the team says. “Whilst there is no imminent threat to the stability of the ice sheet,” Stokes said, “our study has shown which areas we should be keeping an eye on over the next few years and beyond.”

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

What Makes for Ethical Citizen Science Research?

Thu, 10/03/2019 - 11:37

It’s common for consumers to wonder how social media sites use their data or what methods banks use to protect their financial information online. But what about data collected for citizen science projects?

On the surface, participating in citizen science projects can seem like an innocuous way for interested individuals, even children, to have fun, experience science enrichment, and meaningfully contribute to research. Such projects don’t appear to be cyberspace lurking spots where data predators lie in wait to commit nefarious acts. Yet if care isn’t taken, real harms can occur to citizen scientists, researchers note in “Coercion, Consent, and Participation in Citizen Science,” a study posted to the arXiv preprint server.

The paper discusses ethical questions, issues, and considerations surrounding citizen science projects. The research has been accepted for publication with major revisions to restructure the ordering of the content and condense some sections, said Pamela Gay, a senior scientist at the Planetary Science Institute in Tucson, Ariz., and a coauthor of the paper. Alison Reiheld, a philosopher studying ethics issues at Southern Illinois University in Edwardsville, is the other coauthor.

Is This Citizen Science or a Game?

Gay said that the idea for the project was sparked by the results of user design testing conducted with first-time citizen scientists for an online CosmoQuest project. The original goal was to evaluate whether the website design made sense to new citizen scientists and if it conveyed the information needed for them to be able to accomplish scientific tasks for the project, she said.

A CosmoQuest informed consent diagram created to explain to prospective citizen scientists how their data might be used. Gay and Reiheld recommend that such diagrams contain illustrations and minimal text. Credit: Courtesy of Pamela Gay

Part of this process entailed asking users how they thought data collected through the website would be used. Alarmingly, even though the website was designed to align with the best practices of the time and included statements such as “You are contributing data to help NASA,” some citizen scientists misunderstood the intended uses for project data, Gay said. Many participants said they thought the project tasks were part of a game or training exercise and didn’t realize the collected data were intended for research.

This confusion left Gay wondering: How often have citizen scientists misunderstood the nature of the projects they have participated in? How often have they not realized their data were being collected and used for research purposes?

Informed Consent and Ethical Citizen Science

In the paper, Gay and Reiheld make it clear that ethical citizen science research must ensure that participants have given truly informed consent.

One piece of this puzzle requires providing digestible information detailing what participant data will be collected, how it will be used, and the potential impacts of those uses. For instance, participants in some studies may not realize that in addition to the research use of data they submit, they may be agreeing to the use of their demographic data in future studies assessing certain characteristics of people who take part in citizen science, Gay said.

In addition, Gay and Reiheld maintain, researchers must explain to prospective citizen scientists what, if any, formal recognition they might receive for their efforts. For instance, will their name be listed on the project website? Will they be named as coauthors on studies that are published? The team also said that scientists need to explain how their own careers might benefit from the citizen science project (such as receiving awards, funding, promotions, or tenure).

The use of easy-to-understand documents with minimal text can help ensure informed consent, the team noted, especially when contributors might not be fluent in the same language as the researchers. For online projects, logins must be used to verify that participants have agreed to the project terms, Gay said.

Another critical component to ethical citizen science research is ensuring that participants can provide uncoerced consent to the terms of the project.

If children are too young to provide consent to, say, having social media accounts or other online uses of their data, they also can’t consent to having their data collected through online citizen science projects, Gay said. Moreover, she said, using data from child citizen scientists can create harm if their main takeaway from the exchange is that adults benefited from using their information but the children didn’t gain much in return. This perception of imbalance can be particularly damaging if it happens when kids don’t have much experience with science; it can potentially encourage them to become science averse, Gay added.

Instructors and others interested in helping youngsters learn more about citizen science can use alternative means, such as discussing the field of citizen science or showing examples of projects without actually submitting data to them, Gay said.

Researchers found that older children and even adults can be coerced into citizen science project participation. Often the coercion is accidental. An egregious example is when instructors require project participation for a classroom grade and don’t offer any alternative means for achieving the assignment credit, Gay noted. Even offering citizen science projects as one of several possible ways of earning a grade can be coercive because that project might seem easier or less time-consuming than the alternatives, she said.

“This project challenges us to think about the many relevant paradigms that might inform judgments about ethical obligations in citizen science,” wrote Ana Iltis, director of the Center for Bioethics, Health and Society at Wake Forest University in Winston-Salem, N.C., in an email to Eos. Iltis was not involved with the study.

“For example, collecting and using data about citizen scientists for research on citizen science might be more like doing an observational study or using patient records to do research than enrolling patients into a randomized clinical trial. There are possible paradigms beyond what we typically see as human research and health care, such as marketing research in which virtually all of us unwittingly participate when we shop online or go to the grocery store. Looking at the questions citizen science raises might encourage us to rethink some existing entrenched paradigms,” Iltis added.

—Rachel Crowell (@writesRCrowell), Freelance Science Journalist

Red Skies, Black Holes, Green Lakes, and Other Colorful Things

Thu, 10/03/2019 - 11:34

A Blood-Red Sky: Fires Leave a Million Indonesians Gasping. Wildfires in Indonesia, many begun deliberately to clear land to produce palm oil and wood pulp, are the worst since 2015 and threaten endangered species and human health. The article and its many disturbing but beautiful photographs remind me to be thoughtful about my choices as a consumer. —Faith Ishii, Production Manager

 

Google Searches for “Climate Change” Finally Beat Out Game of Thrones. September was a big month for climate coverage in the media. Looks like it finally beat out Game of Thrones on Google searches!

—Jenessa Duncombe, Staff Writer

 

NASA Visualization Shows a Black Hole’s Warped World.



I find NASA’s new animation of a black hole mesmerizing. Check it out for yourself, but be careful—you just may get sucked in.

—Tshawna Byerly, Copy Editor

 

The Water Is Rising in Kīlauea’s Halema‘uma‘u.

Going up! (…the lake level in #Kilauea‘s #Halemaumau Crater, that is.)

This animation shows telephoto images of water accumulation at the #volcano‘s summit #caldera #lake from Aug 7 – Sep 24, 2019. #USGS #HawaiianVolcanoObservatory #HVO https://t.co/k7r7VfaPIQ pic.twitter.com/rK8lt2B1Tn

— USGS Volcanoes (@USGSVolcanoes) September 30, 2019

I could watch this animation of the water level rising in Halema‘uma‘u for hours, contemplating what it says about the geology and hydrology beneath the volcano. The greenish-bluish-yellowish palette of the water surface, with steam wafting above, is at once eerie and beautiful.

—Timothy Oleson, Science Editor

 

Nuclear Winter May Bring a Decade of Destruction.

Nuclear winter would follow a prolonged series of nuclear explosions in urban areas. The more than 2,000 nuclear explosions that have already been detonated have largely been in unpopulated areas, such as Bikini Atoll, above. Credit: U.S. Department of Energy

A fascinating study models a war between the United States and Russia, but left me thinking about literal and figurative fallout from other state and nonstate actors.

—Caryl-Sue, Managing Editor

 

Mars Is Heaven.

Credit: ESA/DLR/FU Berlin, CC BY-SA 3.0 IGO

Phil Plait said it best on Twitter: “What you need today is a staggeringly huge, high-res, and drop-dead GORGEOUS pic of Mars from pole-to-pole. Yeah, trust me here. Wow.” Yep, he did not oversell that one.

—Kimberly Cartier, Staff Writer

Seafood Farming: A Key to Future Global Food Security

Thu, 10/03/2019 - 11:30

One of humanity’s greatest challenges is to increase food production by 50% or more to meet the needs of Earth’s burgeoning human population while simultaneously taking major steps to reduce agriculture’s environmental toll on the planet. Schubel and Thompson [2019] summarize many of the issues involved, including limitations of freshwater and arable land, as well as other ecological impacts associated with modern food production.

The sea has provided significant protein resources for the human population over millennia. Although marine aquaculture development has lagged behind terrestrial food production, the authors point out that it has great potential not only for food production but also for reducing environmental impacts, since it requires relatively little land and freshwater.

They conclude that substantially greater development of aquaculture, particularly in the ocean, if responsibly managed and integrated with more sustainable terrestrial agricultural practices, could contribute significantly to meeting the world’s future food requirements.

Citation: Schubel, J. R., & Thompson, K. [2019]. Farming the Sea: The only way to meet humanity’s future food needs. GeoHealth, 3. https://doi.org/10.1029/2019GH000204

—Paul A. Sandifer, Editor, GeoHealth
