EOS

Earth & Space Science News

Report Examines New Tools to Protect Coral Reefs

Fri, 06/14/2019 - 18:38

With coral reefs under threat worldwide, a new report provides a framework for assessing novel intervention options that could offer a way forward to protect them.

In the face of threats including habitat destruction, pollution, and climate change, the aim of these interventions is “to increase the ability of these coral reefs to persist in these rapidly degrading environmental conditions,” according to the report, A Decision Framework for Interventions to Increase the Persistence and Resilience of Coral Reefs, which was released on 12 June by the National Academies of Sciences, Engineering, and Medicine (NASEM).

The tools themselves, which also were detailed by NASEM in 2018, include genetic and reproductive interventions such as managed selection and breeding; physiological interventions including pre-exposure of corals to increase their tolerance to stress factors; environmental interventions such as marine and atmospheric shading; and managed relocations of coral populations.

Seven of the 23 examined tools already have been field-tested in specific locations, and all of the interventions have potential risks that need to be carefully weighed against perceived benefits, according to the report, which was requested and funded by the National Oceanic and Atmospheric Administration (NOAA), with additional support provided by the Paul G. Allen Family Foundation. For instance, different types of managed selection present the risk of a decrease in genetic variation. Genetic manipulation could alter the wrong genes and result in unknown risks. Another tool, shading, would alter light regimes.

“These new tools are needed because established approaches for managing coral reefs are neither sufficient, nor designed, to preserve corals in a changing climate,” the report states. “Coral interventions that address the impacts of ocean warming and ocean acidification are part of a three-pronged approach for coral reef management that crucially also includes the mitigation of greenhouse gas emissions and the alleviation of local stressors.”

Managers and decision makers “are faced with the task of evaluating the benefits and risks of a growing number of interventions, separately and in combination,” the report continues. “The interventions have different risks, benefits, and feasibilities in different regions.”

Because there is “no single generalizable approach” for coral reef interventions, the report recommends a structured and adaptive management framework that engages a wide range of stakeholders and that is tailored to local environmental and ecological settings, management objectives, and preferred intervention options.

A Bridge to the Future

“Mitigating [greenhouse gas] emissions is the only way that corals are going to be able to thrive into the far, far future,” Stephen Palumbi, chair of the NASEM committee that produced the report, said at a 12 June briefing. Palumbi, a coral scientist, is a professor of marine sciences and a senior fellow with the Woods Institute for the Environment at Stanford University. “But in this century, when we are hopefully getting a handle on mitigation of the emissions, and things will eventually be getting better by the century, it will take coral interventions now in order for those coral systems to bridge between now and the end of the century.”

In an interview with Eos, Palumbi summed up the report: “Coral reefs are in trouble, there are some things we can do about them, and we now have the tools to begin to be able to make that work in the future.”

“Corals are not just pretty things that we’d like to have around,” Palumbi said. “They support hundreds of millions of people.”

Shallow-water coral reefs, which cover less than 1% of the Earth’s surface, conservatively provide an estimated $172 billion per year in benefits to people in the form of food production, property protection, and tourism, according to NOAA’s Coral Reef Conservation Program Strategic Plan.

Moment of Opportunity

Marissa Baskett, a member of the NASEM committee that produced the report, told Eos that there is “a moment of opportunity” to help protect coral reefs if these interventions are managed properly and are ready for deployment when needed.

“We have a variety of potential interventions that can increase coral persistence in the future,” said Baskett, an associate professor of environmental science and policy at the University of California, Davis. “All come with uncertainties and risks. But if we leverage that uncertainty, we can learn from the process to mitigate risk, maximize learning, and improve the future.”

Baskett said that there needs to be a stakeholder-driven and scientifically driven process for understanding the potential risks and benefits of these interventions. “We have an extraordinary opportunity right now to be ready to deploy them when they are necessary” and to be proactive rather than reactive, she said.

Committee members also have briefed Congress, the White House, and NOAA about the report. NOAA, which received the report about a week ago, is currently developing its response, according to Tali Vardi, a coral scientist with ECS, a federal contracting company, who is NOAA’s point person for the NASEM study.

“There is a lot of work to do” to protect coral reefs, she told Eos. “Reefs are disappearing while we sit here and chat.”

Palumbi told Eos that he wants this report to make a real difference for the future of coral reefs.

“Everybody I know who has worked on reefs for the last 20 or 30 years knows places that were fabulous and are virtually dead now. How do we turn that around? How do we stop just cataloging those declines and start putting our energy into doing things?” Palumbi said. “That’s what I want this report to be. I want it to be the foundation on which people say, there are things to do, there’s energy to do it. There’s a goal. We have to wrap it into climate mitigation. We have to wrap it into the local stressors thing. But there is a way forward. Let’s take it, because what are we going to do if we don’t take it?”

—Randy Showstack (@RandyShowstack), Staff Writer

Shallow Low Frequency Tremors in Japan Trench

Fri, 06/14/2019 - 12:12

Low-frequency tremor is a newly discovered type of seismic activity indicative of slow slip on a fault, rather than the typical “fast” slip that occurs in regular earthquakes. Although such activity has been found in many subduction zones around the world, until now it had not been identified in northern Japan.

Tanaka et al. [2019] report the discovery of shallow low-frequency tremors near the Japan Trench by using the newly installed seafloor seismic observation network (S-Net). They show that tremor activity in this region, co-located with low-frequency earthquakes, is distributed in two main clusters, separated by a gap where large earthquakes have nucleated and where aftershock activity of the 1994 Sanriku-Oki earthquake was also located. This indicates fairly rapid along-strike variations in the frictional properties of the shallow plate interface and helps us to better understand the driving mechanisms for both slow and regular earthquakes.

Citation: Tanaka, S., Matsuzawa, T., & Asano, Y. [2019]. Shallow low‐frequency tremor in the northern Japan Trench subduction zone. Geophysical Research Letters, 46. https://doi.org/10.1029/2019GL082817

—Gavin P. Hayes, Editor, Geophysical Research Letters

Arctic Glacial Retreat Alters Downstream Fjord Currents

Fri, 06/14/2019 - 12:10

As climate change progresses, glaciers continue to retreat worldwide. In the Arctic, glacial meltwater delivers sediments and nutrients to fjords and, ultimately, to the ocean. New research by Normandeau et al. reveals how glacial retreat impacts downstream sediment delivery and associated currents in Arctic fjords.

Heavier flows resulting from glacial retreat can boost the amount of sediment delivered to the mouths of rivers that empty into fjords, rapidly increasing the size of the river deltas. However, the links between glacial retreat and underwater delta dynamics are complex and have been poorly understood, limiting predictions of how, exactly, glacier retreat will reshape Arctic nearshore fjords.

To get a better picture of this system, the authors of the new study mapped the underwater features of 31 river mouths in fjords along the eastern coast of Canada’s Baffin Island. The maps incorporated high-resolution bathymetric data collected over several years from aboard the R/V Nuliajuk and the CCGS Amundsen as part of the ArcticNet program.

These mapping efforts revealed which deltas contained sediment waves, large-scale patterns in deposited sediment that are formed by the fast, downhill flow of sediment-laden water. These fast flows are known as turbidity currents, and their presence or absence depends on upstream glacial and watershed dynamics.

Statistical analysis of links between the mapping data and watershed data compiled for each river mouth showed that the presence of turbidity currents depends on the presence and size of upstream glaciers, which erode material that becomes transported as sediment. However, if lakes form upstream from fjords during glacial retreat, they may trap sediment, keeping it from flowing downstream and halting turbidity currents.

The researchers used these findings to create a model of evolving delta dynamics over the course of upstream glacial retreat. The model accounts for the formation of lakes that halt downstream turbidity currents, as well as reactivation of turbidity currents that may occur if lakes later fill up with sediment. The scientists applied their model to 644 rivers emptying into fjords along Baffin Island, predicting which are likely to contain turbidity currents.

This work could help improve predictions of future coastal changes worldwide, including effects on marine ecosystems that rely on nutrients transported in sediments. It could also help refine understanding of past glacial retreat. (Journal of Geophysical Research: Earth Surface, https://doi.org/10.1029/2018JF004970, 2019)

—Sarah Stanley, Freelance Writer

Oldest Meteorite Collection Found in World’s Oldest Desert

Fri, 06/14/2019 - 12:08

Each year, millions of meteors intersect with Earth. Most of these burn up on entering our atmosphere, but some larger space rocks survive the journey and land on Earth’s surface.

A new study looking at a sampling of more than 300 meteorites collected in Chile’s Atacama Desert is shedding some light on the rate and variety of meteor strikes over the past 2 million years.

Meteorites can land anywhere on Earth, but those that fall in deserts and on ice sheets are more likely to be preserved and recovered, says Alexis Drouard, an astrophysicist at Aix-Marseille University in France and lead author of the new study, published in Geology.

But both locations have drawbacks: Most deserts on Earth are only a few thousand years old, and meteorites that land on ice sheets are often transported and concentrated by glacial processes, making it difficult to determine how many meteors might have fallen in a given time period, a statistic known as the meteorite flux.

“This confirms the long-term, multi-million-year stability of the Atacama Desert surfaces and offers a unique opportunity to study the meteorite flux to Earth.”“We wanted to see how the meteorite flux to Earth changed over longer timescales, over millions of years,” says Drouard.

To find evidence of older meteorites in a stable environment, Drouard and his colleagues turned to a collection of over 300 meteorites found in Chile’s Atacama Desert. “The Atacama is the oldest desert on Earth,” Drouard says. “The Sahara was green 5,000 years ago, but the Atacama has been arid for at least 7 million years and maybe as long as 20 million years.”

The team subjected a sample of 54 rocky meteorites to cosmogenic age dating using the chlorine-36 isotope and found that the oldest samples fell to Earth between 1 and 2 million years ago, with a mean age of 710,000 years, making this the oldest meteorite collection found to date on Earth’s surface.
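
The dating approach rests on a standard cosmogenic-nuclide relationship (sketched here with textbook values and a made-up measurement, not the paper’s calibration): cosmic rays hold a meteoroid’s chlorine-36 at a saturation level in space, and once the rock lands and is shielded, the isotope simply decays, so the measured-to-saturation ratio yields a terrestrial age.

```python
import math

# Terrestrial age from cosmogenic 36Cl decay (textbook relationship;
# the measured ratio below is a made-up value for illustration).
HALF_LIFE_YR = 3.01e5                  # 36Cl half-life, about 301,000 years
decay_const = math.log(2) / HALF_LIFE_YR

measured_over_saturation = 0.2         # hypothetical 36Cl ratio
age_yr = -math.log(measured_over_saturation) / decay_const
print(f"terrestrial age ~ {age_yr:,.0f} years")  # ~700,000 years
```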

“This confirms the long-term, multi-million-year stability of the Atacama Desert surfaces and offers a unique opportunity to study the meteorite flux to Earth and meteorite weathering over the million-year time scale,” the team wrote in Geology.

Being able to study the meteorite flux sheds some light on cosmic processes and events, such as collisions, that may produce more meteorites or change the type of debris. The team found that the flux of meteorites remained constant over a 2-million-year time span, with 222 meteorites more massive than 10 grams falling per square kilometer every million years.
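
The flux number itself comes from simple bookkeeping: count the recovered falls above the mass cutoff and divide by the searched area and the accumulation time. The values below are illustrative placeholders, not the study’s actual survey parameters.

```python
# Toy meteorite-flux estimate; all inputs are illustrative placeholders.
n_falls = 50            # meteorites heavier than 10 g recovered
area_km2 = 0.5          # surveyed surface area, square kilometers
window_myr = 0.5        # accumulation window, millions of years

flux = n_falls / (area_km2 * window_myr)
print(f"{flux:.0f} falls > 10 g per km^2 per Myr")  # 200 with these inputs
```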

One of the larger chondrites found in the Atacama Desert sits among smaller, lighter rocks and a rock hammer for scale. Credit: Jérôme Gattacceca (CEREGE)

“It’s extremely rare to find a record like this that spans such a long, continuous chunk of time,” says Philipp Heck, a meteoriticist at the Field Museum in Chicago who was not involved in the new study.

The team also found that the type of meteorite that fell on the Atacama changed over the time period studied. All 54 meteorites studied were ordinary chondrites, the most common type of rocky meteorite, but the collection falls into three groups: high iron (H type; 25 meteorites), low iron (L type; 26 meteorites), and low iron, low metal (LL type; 3 meteorites). The team detected a sharp increase in the proportion of H chondrites over L chondrites between 1 and 0.5 million years ago.

“It’s an interesting and important result that they found an overabundance of H chondrites between 1 and 0.5 million years ago,” Heck says. “When one type of meteorite dominates, it’s most likely related to an event such as a collision that released those objects from the parent body.”

For a follow-up study, Drouard’s team could use cosmic ray exposure dating to determine how long the meteors traveled through space before entering Earth’s atmosphere, Heck says. “This can tell us something about where they came from and the trajectory they were traveling before they intersected with the Earth.”

—Mary Caperton Morton (@theblondecoyote), Science Writer

Diagnosing Soil Moisture Impacts on Model Energy Fluxes

Thu, 06/13/2019 - 11:30

The linkage between soil moisture and evapotranspiration is still poorly represented in most Earth System Models (ESMs). Gallego-Elvira et al. [2019] apply a new method to extensive long-term satellite observations and reanalysis data of land surface and air temperature across the globe. The study highlights world regions where ESMs capture the coupling well, such as arid areas, and regions where models are unable to capture a realistic relationship between soil moisture and evapotranspiration, specifically continental areas. The results are important for further model development because soil moisture is known to be a strong contributor to temperature extremes. This research thus provides a useful platform for improving the description of land surface energy budgets in both global and regional climate models.

Citation: Gallego‐Elvira, B., Taylor, C. M., Harris, P. P., & Ghent, D. [2019]. Evaluation of regional‐scale soil moisture‐surface flux dynamics in Earth system models based on satellite observations of land surface temperature. Geophysical Research Letters, 46. https://doi.org/10.1029/2019GL082962

—Valeriy Ivanov, Editor, Geophysical Research Letters

Understanding the Turbulent Nature of the Solar Wind

Thu, 06/13/2019 - 11:18

The phrase “solar wind” may conjure up images of streams of protons wafting off the Sun and floating into space like a gentle breeze. But these particles, traveling at upward of 400 kilometers per second, more often resemble a raging, turbulent current, with swirls and eddies.

Some of the most dramatic features of the solar wind are discontinuities, where the magnetic field inside the stream abruptly changes direction.

These discontinuities are analogous to wind shear that aircraft encounter in Earth’s atmosphere, and similarly, their presence usually means there’s more turbulence nearby. Strong electric currents flow near a discontinuity, and these are an important generator of turbulence throughout the solar wind.

But investigating these dynamics in detail is not easy and requires data from multiple spacecraft that straddle such a boundary.

Now Artemyev et al. have used data from NASA’s Acceleration, Reconnection, Turbulence and Electrodynamics of the Moon’s Interaction with the Sun (ARTEMIS) mission, a pair of satellites that orbit the Moon and have a unique vantage point in the pristine solar wind. By mining data from the satellites’ instruments, which measure the solar wind’s plasma and magnetic field, the team identified roughly 300 discontinuities and analyzed their structure.

They found that the currents that accompany solar wind discontinuities are actually two currents in one: They have a dual-layer structure, with an intense, but thin, layer of current flowing within a thicker one. The thin, embedded layer is typically on the order of a few thousand kilometers thick, whereas the weaker outer layer can span hundreds of thousands of kilometers.

Intriguingly, the team also found that the behavior of solar wind discontinuities doesn’t fit neatly into theoretical categories.

In fluid dynamics theory, discontinuities like those in the solar wind come in two forms. In some, plasma flows in the same direction but at different speeds, so that no plasma flows across the boundary—a so-called tangential discontinuity. In others, the discontinuity is a kind of shock wave, so that plasma can cross the boundary but is sent off in another direction as it does—a rotational discontinuity.

But the discontinuities the team observed look like a combination of both categories. The density and temperature of the solar wind’s plasma change dramatically from one side of the discontinuity to another, suggesting a stark, tangential discontinuity where no particles could cross. Yet the team also observed that some electrons—those with energies of hundreds of electron volts or higher—could freely cross over the boundary, as in a rotational discontinuity.

The key to resolving this contradiction may lie in the motion of individual plasma particles as they gyrate through space under the influence of electric and magnetic fields, the authors write. A drop in electric potential could create conditions that appear to create separate groups of plasma—like one layer embedded in another—yet still allow some particles to cross over the discontinuity. Investigating this possibility will require theorists to branch out from treating the solar wind as a pure fluid and to use models that consider the motions of individual particles, the team writes. (Journal of Geophysical Research: Space Physics, https://doi.org/10.1029/2019JA026597, 2019)

—Mark Zastrow, Freelance Writer

Building a One-Stop Shop for Soil Moisture Information

Thu, 06/13/2019 - 11:17

Across the United States, networks of instruments in the ground and satellites overhead keep a constant watch on soil moisture conditions. Aside from the obvious applications, like helping farmers know where to irrigate, data from these networks also help government agencies assess the risk of floods, wildfires, or landslides. Soil moisture data are also valuable for predicting runoff and for constructing models of the heat and carbon exchanges between the soil and the atmosphere.

The coordinated National Soil Moisture Network (NSMN) aims to integrate soil moisture (SM) data from several existing in situ monitoring networks throughout the United States. NSMN also aims to synergistically merge these data with remotely sensed and modeled SM products to generate near-real-time, high-resolution, gridded national SM maps and other products. By doing so, the project coordinators expect to reduce societal risks from such hazards as drought, flood, and fire, as well as to improve characterization of national water budgets.

President Barack Obama’s 2013 Climate Action Plan highlighted the need for the NSMN. This plan included a call for the development of a National Drought Resilience Partnership (NDRP) to reduce societal risk from drought by connecting information related to drought preparedness. The plan articulated the need for improved collaboration between SM data providers and users of SM information. The NSMN is envisaged to meet this need, and it will assist the NDRP in reducing societal vulnerability to drought by standardizing data from multiple networks, providing an accessible delivery platform, and ultimately yielding trustworthy integrated SM information.

Integrating Data from Many Sources

The data for the NSMN come from a variety of sources. This improves the coverage and reliability of the network, but it complicates the data integration effort. In situ sensor data are assembled from various federal networks (e.g., SCAN, ARS, USCRN, SNOTEL, RAWS), state mesonets (mesoscale networks), academic and research networks, citizen science networks (e.g., CoCoRaHS), and the Cosmic-ray Soil Moisture Observing System (COSMOS) probe effort, among other sources.

SM values from these in situ locations will be merged with remote sensing and satellite data from Soil Moisture Active Passive (SMAP), Soil Moisture and Ocean Salinity (SMOS), and other satellites for upscaling to larger geospatial areas. Additional guidance will come from land surface and hydrologic models, such as the North American Land Data Assimilation System (NLDAS-2), National Water Model, and others, for stable percentile distribution generation or gap filling when satellite observations are unavailable.

Integrating in situ measurements from diverse networks presents several challenges:

- the unequal spatial distribution of sensors
- the variety of sensor types
- different sensor installation depths
- differing reporting frequencies and periods of record [Dorigo et al., 2011; Shrestha and Boyer, 2019]
- the high spatial heterogeneity of soil characteristics, land use, and meteorological factors that influence SM measurements at a site and limit the spatial representativeness of any given in situ sensor [Robock et al., 2000]

Similarly, merging point measurements with remotely sensed data and simulation results presents a significant challenge. Remote sensing data and large-scale hydrologic models often have comparatively coarse spatial resolution, and remote sensing data typically capture only the near-surface soil depths [Ochsner et al., 2013]. NSMN will seek to develop products that are useful to the widest possible audience and will provide data that will serve diverse user communities. For example, near-surface SM data may be most beneficial for integration with remotely sensed data [Bolten et al., 2010], but deeper (root zone) measurements may be most useful for drought monitoring [Bell et al., 2013] or streamflow forecasting [Harpold et al., 2017].

Laying the Foundations

Early steps to lay the foundation for the NSMN began in 2010 when the U.S. Department of Agriculture, Oklahoma State University, the University of Oklahoma, and NASA partnered to create the Marena, Oklahoma In Situ Sensor Testbed (MOISST). This test bed was designed to provide a long-term field site for intercomparison of the different SM sensor types being used in networks across the United States.

Subsequently, researchers at Texas A&M University began assembling the North American Soil Moisture Database (NASMD), which was funded by the National Science Foundation in 2011. NASMD was built to provide an archive of historical SM conditions to support a land-atmosphere interaction project rather than to monitor and assess SM information in real time. However, the database has since been used as a proof of concept for the integration of in situ data, and it has provided a comprehensive list of stations and station metadata [Quiring et al., 2016].

In 2014, the National Integrated Drought Information System (NIDIS) funded a pilot project for integrating in situ SM sensor data from distributed sources in real time, building on lessons learned from the NASMD. This project, which was completed in 2015, successfully merged real-time in situ SM data from disparate formats into a common end point and developed a reference architecture for NSMN.

Many of these data-related challenges are being addressed by ongoing research projects, such as developing an integrated SM drought product that integrates NSMN data with satellites and land surface models [Ford and Quiring, 2019]. This project assesses the feasibility of SM data acquisition, processing, integration, and delivery at sufficiently short latency for effective operational drought monitoring. Results from these projects have been presented in an ongoing series of workshops dedicated to advancing the NSMN effort.

Moving Forward

Once established, maintaining an NSMN will require substantial resources for database management, quality control of current and historic SM data, upkeep of the data interface for end users, data storage, and NSMN group coordination [Ochsner et al., 2013].

Standardization protocols will also have to be developed, covering SM terminology, guidance for future sensor installations [e.g., Dorigo et al., 2011], and strategic planning and design for near-real-time applications using NSMN products. Sufficient flexibility must be built into the data retrieval process to allow for variable data output formats and quality levels from different networks, relatively short periods of record for soil moisture data compared with other climate variables [Ford et al., 2016], and variable sensor depths in the soil column. Finally, a coordinated NSMN requires resolution of data ownership and personnel resource issues.

Once these hurdles have been overcome, NSMN data will have the potential to generate several key products intended to benefit a wide range of user groups. Some of these products include national maps of in situ SM sensor locations across the United States from the networks identified above (Figure 1). Other products will process high-resolution, gridded data derived from in situ sensors using regression kriging (Gaussian process) interpolation to generate regional SM percentiles [Quiring et al., 2018] (Figure 2). Still other products will provide users with soil volumetric water content (VWC) anomalies, SM percent of normal for watersheds for nonsurficial portions of the soil column (Figure 3), and VWC percentile maps designed to correspond to U.S. Drought Monitor categories.
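
To make the “regression kriging (Gaussian process)” step concrete, here is a minimal sketch of the general idea, with made-up inputs and a scikit-learn Gaussian process standing in for the spatial interpolator; it is not the NSMN production code. The two-step recipe: regress soil moisture on covariates, then spatially interpolate the residuals.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Made-up station data: locations (km), covariates (e.g., precipitation,
# soil texture), and observed soil moisture percentiles at each station.
coords = rng.uniform(0, 100, size=(200, 2))
covariates = rng.uniform(size=(200, 3))
sm_pct = rng.uniform(0, 100, size=200)

# Step 1 ("regression"): model the large-scale trend from covariates.
trend = LinearRegression().fit(covariates, sm_pct)
residuals = sm_pct - trend.predict(covariates)

# Step 2 ("kriging"): interpolate the residuals spatially with a GP.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=20.0)).fit(coords, residuals)

# Gridded estimate = trend at the grid cell + interpolated residual.
grid_xy = rng.uniform(0, 100, size=(10, 2))
grid_cov = rng.uniform(size=(10, 3))
estimate = trend.predict(grid_cov) + gp.predict(grid_xy)
```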

Fig. 1. Locations of in situ soil moisture sensor networks across the United States from federal- and state-level networks. Credit: nationalsoilmoisture.com


Fig. 2. Five-centimeter soil moisture percentiles on 25 March 2018 (regression kriging interpolation). Credit: nationalsoilmoisture.com; Quiring et al., 2018, https://doi.org/10.1175/BAMS-D-13-00263.1


Fig. 3. Example of basinized, percent-of-normal, 50-centimeter (20-inch) depth soil moisture for watersheds in the western United States based on in situ volumetric water content observations from the SCAN and SNOTEL networks. Credit: USDA National Water and Climate Center


Other products include downloadable data, plain-language summaries of current conditions, and contextual information such as soil physical and hydraulic properties for interpreting current SM levels. Users can get information on standards and specifications for future sensor installations and on data output and quality control procedures for evaluating current and period-of-record data. Also planned for development is an open-source code repository for working with existing SM data sources, similar to the one provided by the Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI).

Two bills that were recently signed into law have added national support for SM efforts in the United States. In late 2018, Congress passed the NIDIS Reauthorization Act. This act stipulates that the undersecretary of commerce for oceans and atmosphere shall “develop a strategy for a national coordinated soil moisture monitoring network.”

In addition, the 2018 Farm Bill authorizes an appropriation of $5 million annually for fiscal years 2019 through 2023 for “improved soil moisture and precipitation monitoring” to improve the accuracy of the U.S. Drought Monitor. These funds may be used to expand the number of SM stations. NSMN will play an integral role in bridging these two programs to ensure that an infrastructure emerges that can provide scientifically defensible and socially beneficial information about SM in the United States.

Acknowledgments

We thank Jessica Lucido, whose manuscript (in preparation for publication) was helpful in writing this article. Financial support for this effort was provided in part by NIDIS. All data are available from the authors.

Many Water Cycle Diagrams Promote Misconceptions

Thu, 06/13/2019 - 11:15

Quick—name a scientific diagram you studied as a grade-schooler.

A depiction of the water cycle probably tops that list. But now researchers have shown that the majority of familiar water cycle diagrams are flawed in fundamental ways. That’s bad news because students, educators, and policy makers alike often accept water cycle diagrams as accurately representing the movement of our planet’s most basic resource.

Where Are the People?

This investigation of water cycle diagrams started as an academic question over lunch one day, said Benjamin Abbott, an ecosystem ecologist at Brigham Young University in Provo, Utah. He and other researchers wondered: How are people thinking about Earth’s big cycles, like the water cycle? When Abbott and his colleagues started looking at a few water cycle diagrams, they realized something.

“It kind of dawned on us,” said Abbott. “People are missing from almost all of these.”

Abbott and his collaborators began systematically collecting water cycle diagrams. They did Internet image searches for terms such as “water cycle” and “hydro cycle,” among others, and pulled up 350 diagrams from 12 countries. The researchers also mined textbooks, scientific literature, and government-published documents, and recovered 114 English language diagrams of the water cycle.

In total, they amassed over 450 unique diagrams published as early as the 1940s.

The researchers then analyzed these depictions of water cycling through, on, and above the terrestrial Earth. They found that the vast majority of the diagrams—85%—failed to show any effect of humans on the water cycle.

That’s unfortunate and inaccurate, said Abbott, because people have had a pronounced impact on the water cycle. Humans have changed the distribution of vegetation, for example, he said. “That’s actually affecting the land-to-atmosphere flux of water and also what happens to precipitation after it falls.”

Visual Bias

Researchers found that many of the diagrams were biased because of how they visually presented water resources, said Abbott. For instance, groundwater was often shown as extending down to the bottom of the page.

“That’s kind of implying there’s unlimited groundwater,” Abbott said.

Over 95% of the water cycle diagrams showed temperate, forested regions despite most of the world’s population living in drier areas.

Finally, climate change and water pollution, both major contributors to water crises, were noted in only 2% of the diagrams.

“You’re missing all of these realities,” said Abbott. These results were published in Nature Geoscience.

A Few Suggestions

The scientists suggest that water cycle diagrams can be improved in three ways: by portraying different types of biomes, by conveying temporal changes in water cycling in different seasons, and by showing humans interacting with water.

“There’s no reason why we can’t integrate that into our water cycle diagrams, especially when we’re using animated or interactive diagrams,” said Abbott.

This is a unique study that melds human and physical aspects, said Paul Durack, an oceanographer at Lawrence Livermore National Laboratory in Livermore, Calif., not involved in the research. “How we communicate human influence and interference into the water cycle is something we could do a far better job of.”

Abbott and his colleagues are currently applying for funding to put together an open-source suite of static and interactive water cycle diagrams. They plan to work with teachers and hydrologists to produce diagrams that are accurate, visually appealing, and useful to audiences ranging from students to policy makers.

“We want to effectively communicate what’s going on with water and what we could be doing better to solve the global water crisis,” said Abbott.

—Katherine Kornei (@katherinekornei), Freelance Science Journalist

Fading Air Pollution Reduces Fog in Central Valley

Wed, 06/12/2019 - 12:16

Bounded to the east by the Sierra Nevada mountains and to the west by the Coastal Range, California’s Central Valley is one of the most productive agricultural regions in the world. Some estimates indicate the region supplies more than half of the fruits, vegetables, and nuts grown in the United States.

The valley is so hospitable to agriculture in part because of a unique meteorological phenomenon known as tule fog. The dense ground fog enshrouds the valley during the winter and provides a necessary winter chill that increases the productivity of fruit and nut trees. However, it also drastically reduces visibility around the region, particularly at night and in the early mornings. The bleary episodes rank as one of the primary causes of weather-related accidents in the state.

Over the past 90 years, the frequency of tule fog has boomeranged dramatically. From 1930 to 1970, the fog’s frequency consistently climbed—cities like Fresno experienced an 85% increase in fog events. Yet starting around 1980, the fog began to dwindle. Weather observations indicate a 76% reduction in fog over the past 36 winters.

Gray et al. investigated tule fog to determine the drivers behind its upward then downward trend and its year-to-year variability. The authors used the National Oceanic and Atmospheric Administration’s archives to craft a history of fog frequency dating to 1909. They used records from the National Climatic Data Center to stitch together a fog climatology that included data on temperature, dew point, precipitation, wind speed, and other climate variables. Additionally, they used Environmental Protection Agency data and local city inventories to evaluate how air pollution, specifically particulate matter (PM) and oxides of nitrogen (NOx), affects the fog. NOx is a critical precursor to wintertime nitrate formation with a long observational record, making it an ideal proxy for PM.

The authors discovered that annual fluctuations in fog occurrence are driven by local weather; of the climatic variables analyzed, dew point depression appeared to be the most critical measurement. The findings suggest that there has not been a decrease in the number of days with optimal fog conditions since 1930, but now nearly all tule fog events occur under optimal conditions, which was not the case 35 years ago.

Long-term tule fog trends are instead driven by air pollution: Pollution and fog increased until 1970 and decreased after 1980 when effective controls and regulations were implemented in California. The results showed that under low dew point depression conditions, every 10 parts per billion decrease in NOx resulted in five fewer fog days per year.

Changes in tule fog frequency affect the transportation, agriculture, drought resistance, and climate of California’s Central Valley, and these findings offer insights for managers and planners. Furthermore, the study reveals an unintended consequence of reducing air pollution. (Journal of Geophysical Research: Atmospheres, https://doi.org/10.1029/2018JD029419, 2019)

—Aaron Sidder, Freelance Writer

Space Is Polluted by Junk…and It’s Getting Worse

Wed, 06/12/2019 - 12:15

In 2016, the European Space Agency’s (ESA) Copernicus Sentinel-1A satellite experienced a sudden dip in power and a physical jolt while orbiting 700 kilometers above Earth. The culprit turned out to be a millimeter-sized speck of space debris that hit one of the solar panels. It left a 40-centimeter dent.

Right now, “there are already 20,000 objects that are tracked from the ground out there, and there are close to 1 million objects larger than 1 centimeter, the majority of which are too small to be tracked,” Holger Krag, head of ESA’s Space Debris Office, told Eos. “All of them pose a significant risk to space objects.”

Space debris is defined as anything artificial that clogs up space near Earth, including defunct satellites, used rocket stages, and payload components jettisoned after use. Each year, ESA documents the current state of the space debris environment and how it has changed since Sputnik launched in 1957. This year’s report, which was released in May, makes two things clear: Near-Earth space is increasingly polluted, and humanity must work harder to clean it up.

“It is definitely a bit of a pessimistic outlook that we have,” Krag said. “At the moment, we see…a pretty much unchallenged degradation of the environment as compared to what we’ve seen in the past.”

“In the past 10 years, the problem of debris has become very real,” Stijn Lemmens, a space debris mitigation analyst at ESA and lead author of the report, told Eos. “At least once a week, [ESA] will have to maneuver one of our spacecraft, interrupt our normal science operations to move out of the way, and move back because there is a piece of debris coming by.”

The Scope of the Problem

As of January 2019, there were around 34,000 objects larger than 10 centimeters orbiting Earth (Figure 1). Only about 2,000 of those are active satellites. Statistical models suggest that there are about 900,000 debris objects larger than 1 centimeter and about 130 million larger than 1 millimeter.

Fig. 1. The number of objects in any geocentric orbit from 1957 to the present. “Rocket” refers to anything related to the launch or the launch vehicle. “Payload” refers to anything launched into space that has a job not related to the launch. “Fragmentation debris” is anything resulting from the breakup of another object or a collision. Credit: ESA Space Debris Office

“The majority of space debris objects in orbit are due to breakups, due to literal explosions of what used to be intact objects like upper stages [of rockets] or satellites, or due to the residual energy sources on board—fuel, pressure in tanks, charged batteries, and so on,” Krag explained. “Over time, they tend to fall apart, or some of them literally explode.”

Other debris comes from collisions. “Two objects colliding in space at an average velocity of 36,000 kilometers per hour can have more severe effects than an explosion of an object,” Krag said. “Several fragments would be left behind by such a collision.” Antisatellite operations also add to the fragment count, he added.
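
For a sense of the energies involved, here is a back-of-the-envelope illustration using nothing but the kinetic energy formula; the fragment mass is a made-up example, not a figure from the ESA report.

```python
# Kinetic energy of a small debris fragment at the quoted closing speed.
mass_kg = 0.001               # a made-up 1-gram fragment
speed_m_s = 36_000 / 3.6      # 36,000 km/h expressed in m/s (10,000 m/s)

energy_j = 0.5 * mass_kg * speed_m_s**2
print(f"{energy_j / 1e3:.0f} kJ")  # ~50 kJ, about a 1-tonne car at 36 km/h
```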

Every new fragment can lead to more fragments later on, exacerbating the problem. “These breakups need to stop,” Krag said.

Protected Regions

A video from ESA shows the positions of tracked and simulated debris objects: red dots represent active or defunct satellites, yellow dots represent rocket bodies, green dots represent miscellaneous mission-related objects, and blue dots represent fragments.



Two regions stand out as especially congested: the space immediately above Earth’s surface, called low-Earth orbit (LEO), and a band farther out that’s aligned with the equator, called geostationary orbit (GEO). LEO and GEO encompass sets of orbits that are particularly useful for communications and for observing Earth and space.

The Inter-Agency Space Debris Coordination Committee, an international governmental forum for coordinating space debris activities, defined LEO and GEO as protected regions. It proposed a set of guidelines to mitigate the creation of space debris there. Those guidelines, however, only work when spacecraft comply with them.

“In terms of GEO, the number of missions which do not respect the guidelines—essentially, when your mission is over, clear the GEO region by reorbiting yourself to a higher altitude—we see that we are asymptotically converging on a compliance rate of around 90% in this region,” Lemmens said. “This is pretty good, especially coming from below 40% 2 decades ago.”

In low-Earth orbit, “the guideline says that any object, when its mission is over, [should] have a remaining orbital lifetime lower than 25 years or be maneuvered out of LEO,” he said. Only about 5%–15% of satellites that don’t follow the first recommendation actually comply with the second, “which is very low.”

“Overall, unfortunately, we still have quite a bit to do in terms of the way we use our spacecraft in low-Earth orbit,” Lemmens added. “But it’s clear from the examples of rocket bodies and satellites in GEO that if we want to change something, we can.”

Who Goes to Space?

“We have a completely new era of spaceflight,” Krag said. “Spaceflight of the past was basically state owned, public, so it was the taxpayers’ money funding the missions. And now we see academics, amateurs, and, in particular, commercial companies starting to explore space.”

The pace of launches has sharply spiked in recent years and will continue to grow (Figure 2). “The number of launches we are expecting in the next 3 years will outnumber the number of launches the whole of humanity has done in the history of spaceflight,” Krag said.

Fig. 2. The number of launches into low-Earth orbit (altitudes of 200–1,750 kilometers) between 1957 and the present, separated by type of funding source. Commercial launches have dominated in the past 2 years, and the trend is expected to continue. Credit: ESA Space Debris Office

Lemmens explained that making space more accessible is, overall, a good thing, especially when it brings new people into the spaceflight community. But it is also contributing to the space junk problem.

“At the moment, it’s a bit of a Wild West out there,” he said. “As long as you can pay for yourself to get into orbit, regulations about what you can and cannot do are quite light. [Regulations] are changing, but they need to put the notion of responsible behavior onto the person who is launching and on the person who is developing.”

Making Space Junk-Free

Stopping the spread of space debris and being respectful of orbital space will require a collective change in mentality, according to Lemmens.

“We really should start thinking…, ‘Where can I go with my satellite?’ ‘What mission makes sense?’ and ‘How am I affecting the operators around me that are already there?’” he said. Satellites that can’t maneuver to avoid a collision, as with most CubeSats and ChipSats, should avoid already crowded orbits, Lemmens added.

ESA and others have proposed so-called active removal missions to sweep up or deorbit space debris, Krag said, but these missions are costly and won’t be able to get it all. More promising, he said, are onboard technologies to automatically shut down or control satellite systems if fragmentation seems imminent.

“It’s definitely cheaper to equip your spacecraft right from the start with the means to make sure that it can dispose of itself rather than asking and paying for someone to remove it,” he said.

But first and foremost, Lemmens said, “is to implement the recommendations that are already out there. Before any talk of remediation, it’s about making sure we mitigate and stop polluting.”

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

Waxman Maintains Hope for Climate Change Legislation

Tue, 06/11/2019 - 20:25

“The biggest problem we face on climate is not that we need to come up with better ideas,” according to former U.S. Rep. Henry Waxman. “What we need is Republicans who will negotiate in good faith.”

Waxman, a California Democrat who served in Congress for 4 decades as a leader on environmental and other issues, said at a 10 June forum in Washington, D.C., that although congressional Republicans and the Trump administration present significant obstacles, he is hopeful about action on climate change.

If some Republicans would be willing to seriously negotiate about climate change, “it wouldn’t be hard to work out the problems,” said Waxman. “It would take time. It would take trade-offs. But we could get somewhere.”

Waxman, who retired from Congress in 2015, had a prolific legislative career that included leadership on improving the Clean Air Act and on health care, among other issues. The 10 June forum about lessons from the Clean Air Act was sponsored by Resources for the Future and the American Academy of Arts and Sciences and also included other experts who focused on the successes and failures of the Clean Air Act.

The former congressman, who currently chairs Waxman Strategies, a Washington, D.C.–based public affairs and strategic communications firm, recalled that the successful 1990 reauthorization of the Clean Air Act benefited from a lot of horse trading with Republicans as well as with Democrats who represented states dependent on coal.

Lack of Republican Support for Earlier Climate Legislation

However, at the forum, Waxman also recalled his disappointment about the American Clean Energy and Security Act, commonly referred to as the Waxman-Markey bill. That legislation, which was introduced 10 years ago when Waxman was chair of the House Energy and Commerce Committee and Sen. Ed Markey (D-Mass.) was chairman of its Subcommittee on Energy and Environment, passed in the House but died in the Senate. The bill would have established a nationwide greenhouse gas cap-and-trade system with goals to reduce those emissions while also addressing energy efficiency and other issues.

“We said to a lot of our colleagues that there is Republican support because so many of these industry groups were supporting it,” Waxman said. However, he added, that didn’t make any difference because Republicans had decided that they didn’t want to give then-President Barack Obama a victory on anything.

“They united against [Obama’s] stimulus bill even though [former] President George W. Bush had a stimulus bill when the economy went down. They wouldn’t support the Affordable Care Act [ACA]. They wouldn’t support the ability to deal with climate change. They were just against everything,” Waxman said. “When Republicans were against everything, we had to do it by Democratic votes, which made it impossible when we lost one vote, the 60th vote in the Senate.”

Waxman recalled hoping that the Senate would just pass something so that a bill could go into conference to work out any differences between a House and Senate bill. However, the legislation never made it through the Senate.

“The circumstances were that while we passed the climate change bill in the House, we thought the Senate would pass the ACA early and then we’d have the rest of the time to do the climate bill,” Waxman said. “But the Senate couldn’t pass the ACA until Christmas Eve, 2009, and it left us with no time to do anything on the climate bill.”

White House Hurdle

Waxman said that President Donald Trump, whom he called “so strange,” is a major hurdle for enacting new climate legislation.

“The Republicans in Congress are refusing to stand up to him. They’re doing it very, very reluctantly,” Waxman said. “I would be hard put to imagine that if he were reelected and if the Republicans decide that he’s still their leader because he got reelected, that we are going to get much progress. But I’m hopeful that he will get replaced.”

However, even if Trump is defeated in the 2020 presidential election, Waxman cautioned that it won’t be smooth sailing to pass strong legislation. “We’ll have a hard job, but it will tear down the biggest obstacle that we have right now,” he said. Trump “has articulated, and nobody has corrected him, that science doesn’t matter.”

Waxman said that Trump believes that “this is just a plot from China when they talk about climate change.”

However, Waxman said that he sees changes on the horizon in public opinion and politics. “Now when you watch the nightly news, almost every night they talk about a climate disaster, and they talk about climate change as leading to these disasters,” he said, comparing current television newscasts to earlier newscasts that didn’t necessarily make that connection.

“The public opinion is clearly changing, and that is going to have an impact on their representatives of both political parties and on the courts,” he said. “The key is changing the political dynamic, and until that happens all the good and creative ideas in the world won’t make much of a difference.”

—Randy Showstack (@RandyShowstack), Staff Writer

Extreme Precipitation Expected to Increase with Warming Planet

Tue, 06/11/2019 - 11:18

The simplest thermodynamic equations make it clear that warmer air can hold more moisture than colder air: The Clausius-Clapeyron equation shows that for every 1°C temperature increase, Earth’s atmosphere can hold 7% more water.
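
For readers who want to check that figure, here is a minimal back-of-the-envelope sketch using standard textbook constants (not values from the study discussed below): the Clausius-Clapeyron relation gives the fractional change in saturation vapor pressure per degree of warming as L/(R_v T²).

```python
# Back-of-the-envelope Clausius-Clapeyron scaling of saturation vapor
# pressure, using standard textbook constants.
L_V = 2.5e6   # latent heat of vaporization of water, J/kg
R_V = 461.5   # specific gas constant of water vapor, J/(kg*K)
T = 288.0     # a typical near-surface air temperature (~15 C), in kelvin

# Fractional increase in saturation vapor pressure per 1 K of warming:
# d(ln e_s)/dT = L_V / (R_V * T**2)
scaling = L_V / (R_V * T**2)
print(f"~{scaling * 100:.1f}% more water vapor per degree of warming")  # ~6.5%
```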

The reality of global climate science, however, is often more complicated than the simplest thermodynamic equations.

Earth’s atmosphere is not uniform. Its composition is constantly changing, and it’s certainly not heating evenly everywhere—some places are even getting colder. Forecasting the likelihood of extreme precipitation events is therefore more challenging than simply plugging numbers into a model. Still, the historical record, especially since anthropogenic warming took off in the 1900s, can provide insight into how Earth’s atmosphere responds to rapid warming.

In a new study, Papalexiou and Montanari use a novel technique to analyze historical data and investigate whether global warming is driving changes in the frequency and magnitude of extreme precipitation events. The scientists collected their data from the Global Historical Climatology Network–Daily database, which includes measurements from approximately 100,000 precipitation stations across the world.

For their analysis, the researchers focused on the 1964–2013 period when global warming accelerated. They looked at how many complete years of data were recorded for a given station; then they chose to analyze that number of extreme precipitation events. So if a station provided 45 complete years of data, they analyzed the top 45 most extreme events. The authors argue that this analysis technique represents extreme rainfall events more accurately than simply looking at a series of annual maximum precipitation numbers because in the absence of some external force (such as rising temperatures) it should result in an even distribution. Stations with fewer than 5 complete years of data in each one of the 5 decades studied were excluded from the analysis, and after screening for a variety of other criteria, the researchers were left with a record from 8,730 stations from around the world, mostly clustering in North America, Europe, Russia, China, and Australia.
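
As a concrete illustration of that selection rule (a hypothetical sketch with synthetic data; the station record and variable names are mine, not the authors’), one might pull the top-N daily totals from a station with N complete years of record and tally them by decade:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical station: 50 complete years (1964-2013) of daily totals (mm).
n_years = 50
daily_mm = rng.gamma(shape=0.3, scale=8.0, size=n_years * 365)
year = np.repeat(np.arange(1964, 1964 + n_years), 365)

# Analyze exactly as many extreme events as there are complete years.
top = np.argsort(daily_mm)[-n_years:]

# With no external forcing, the top events should spread roughly evenly
# across the five decades (~10 each); a surplus in later decades would
# suggest a trend like the warming signal the authors test for.
decade_index = (year[top] - 1964) // 10
print(np.bincount(decade_index, minlength=5))
```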

The researchers then constructed a time series for both annual frequency and average magnitude of the extreme rain events for each weather station. For the frequency data, the results were especially pronounced, with the occurrence of extreme precipitation events increasing significantly as time went on. In the last decade of data (2004–2013) the scientists found 7% more extreme precipitation events than they’d expect if no external force were skewing the distribution. The data related to magnitude were less pronounced but also indicated a slight uptick. Additionally, the researchers report they found no strong correlation between increasing frequency and increasing magnitude.

Finally, because each weather station is also tied to a geographical location, the researchers were able to analyze where the extra rain was falling, with Eurasia, northern Australia, and the midwestern United States absorbing the bulk of the new moisture.

The study suggests that as the planet continues to warm, extreme rainfall events will continue to become an increasingly common part of life for many heavily populated parts of the world. As land managers and policy makers fight to stay ahead of climate change, this type of data will become ever more informative and necessary. (Water Resources Research, https://doi.org/10.1029/2018WR024067, 2019)

—David Shultz, Freelance Writer

North Carolina Bald Cypress Tree Is at Least 2,624 Years Old

Tue, 06/11/2019 - 11:17

Wired to survive in dry, wet, or even swampy soil conditions, bald cypresses are hardy, tough, and adaptable. Yet even the hardiest of these rugged, magnificent conifers can’t guard against tree-killing humans.

Less than 1% of bald cypress forests have survived periods of heavy logging, according to David Stahle, a geoscientist at the University of Arkansas who uses dendrochronology, radiocarbon dating of tree samples, and other information (such as rainfall data) to reconstruct ancient climate conditions.

Like rhino populations decimated out of desire for their horns, humans have ruthlessly “hunted” almost all of the otherwise resilient bald cypress trees for their timber.

However, many bald cypresses 1,000 or more years old along southeastern North Carolina’s Black River (a tributary of the Cape Fear River) have managed to escape this fate. Stahle and his collaborators now report their discovery of the oldest-known tree among those at Black River in a study published in Environmental Research Communications on 9 May 2019. The tree is at least 2,624 years old, according to their analysis, which included tree ring chronology and radiocarbon dating of nondestructive core samples.

The bald cypress is also the oldest-known living tree species in eastern North America and the oldest-known wetland species of tree in the world, the researchers wrote.

Surprise and Gratification at an Ancient Tree

“We were surprised and gratified” to discover the minimum age of this ancient tree, said Stahle, who has studied Black River bald cypresses since 1985.

A bald cypress in the Black River, above, is the oldest-known living tree in eastern North America. Credit: Dan Griffin

When he first arrived at the Black River, Stahle was stunned by the sheer number of ancient trees found there. To find that many trees that appear to be at least 1,000 years old in one place is “pretty rare,” he said.

“Dave is a pioneer in our field, particularly in the eastern U.S. and with bald cypress. His discovery of a truly ancient bald cypress is a natural trajectory of his career and his lab’s efforts over the decades,” Neil Pederson, a senior forest ecologist at Harvard University’s Harvard Forest in Petersham, Mass., wrote in an email to Eos. Pederson wasn’t involved with this study.

“I was pretty sure Dave and his lab would find a 2,000-year-old bald cypress. With this discovery, I [now] kind of expect that they [will] find one that is over or very close to 3,000 years old. (No pressure, Dave),” wrote Pederson.

The study also revealed that the area of old-growth Black River bald cypress is about 10 times larger than Stahle previously thought. Pederson actually considers that the most surprising finding of this study.

“It gives hope that there is more forest out there that has mostly escaped the heavy logging of the last four centuries,” he wrote.

Preserving Ancient Wonders

In a 1988 study published in Science, Stahle shared his discovery of a Black River bald cypress that was at least 1,700 years old. This work inspired The Nature Conservancy to act.

“We began protecting land on the Black River because of Dr. Stahle’s original research, which found trees dating from Roman times,” Katherine Skinner, executive director of the organization’s North Carolina chapter, wrote in an email to Eos. (The Nature Conservancy provided funding for Stahle’s recent research.)

“There really is no other place like it in the world,” Skinner added.

The Nature Conservancy has protected 19,000 acres (77 square kilometers) in the Black River basin. “The ancient forest is totally protected at this time, which was our top priority,” Skinner wrote.

A River Region Bursting with Natural Treasures

In the 1980s, Stahle primarily studied the Black River bald cypresses to learn more about the area’s climate record. He continues to focus on this aspect of his research, and his team’s paper extends the paleoclimate record in the southeastern United States by 900 years.

However, Stahle’s work with these trees has evolved over time to include dating individual Black River trees, especially older ones, and the conservation of the river and its floodplain.

That ecosystem is home to black bears, bobcats, river otters, rare fish species (including the Santee chub and broadtail madtom), the Cape Fear spike and other rare mussels, and neotropical songbirds, including the prothonotary warbler and the yellow-throated vireo.

Protected from Harvesting

Before The Nature Conservancy protected the ancient Black River bald cypresses, these trees were probably spared because they were too worn and weathered to attract loggers, Stahle said.

However, it is now common to harvest trees for their biomass. “With nations labeling biomass as carbon neutral, there has been renewed logging that has been pretty destructive of the productive and rich ecosystems in the southeastern U.S.,” Pederson wrote. “Ironically, one part of the green economy movement is a real threat to forests. Do not misunderstand. We need to change our economy to battle human induced climatic change [but] one early approach has become a threat to forests,” he added.

Another unexpected threat? The garden mulch industry. “Recently, other cypress forests in North Carolina were logged for cypress mulch. What a tragedy it would have been to lose these trees to mulch in someone’s yard,” Skinner wrote.

—Rachel Crowell (@writesRCrowell), Science Writer

Seeing the Light

Tue, 06/11/2019 - 11:15

When Neil Armstrong and Edwin “Buzz” Aldrin blasted off the Moon on 21 July 1969, they left a couple of packages at Tranquility Base. One was a solar-powered seismometer that collected 21 days of observations before expiring in late August. The other was an aluminum frame filled with chunks of fused-silica glass that looked a bit like a high-tech egg crate.

The Apollo 15 retroreflector contains 3 times as many corner cubes as the Apollo 11 and 14 devices. Credit: NASA

Along with similar devices left on the Moon by Apollo 14 and 15, the instrument is still working—the only Apollo surface experiment that continues to provide data.

Known as a lunar laser ranging retroreflector, it bounces pulses of laser light back to their sources on Earth. Scientists measure the round-trip travel time of each pulse, allowing them to determine the Earth-Moon distance to within a millimeter. A half century of these observations has provided precise measurements of the shape of the Moon’s orbit, wobbles in the Moon’s rotation, and other parameters. Those, in turn, have helped scientists determine the Moon’s recession rate, probe its interior structure, and test gravitational theory to some of the highest levels of precision yet obtained.
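The distance calculation itself is simple, half the round-trip time multiplied by the speed of light, and it shows why millimeter accuracy demands picosecond timing. A quick sketch with an illustrative round-trip time:

```python
C = 299_792_458.0  # speed of light, m/s

def one_way_distance_m(round_trip_s):
    """Earth-Moon distance from a measured round-trip travel time."""
    return C * round_trip_s / 2.0

print(f"{one_way_distance_m(2.562):,.0f} m")  # about 384 million meters
# One millimeter of one-way distance corresponds to ~6.7 picoseconds
# of round-trip time, which sets the required timing precision.
print(2 * 1e-3 / C)  # ~6.7e-12 s per millimeter
```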

“This is a venerable technique that’s provided some of our best science about how gravity works,” says Tom Murphy, a professor of physics at the University of California, San Diego, who has headed a lunar laser-ranging project since the early 2000s.

Peculiar Prisms on the Moon

The devices left on the Moon by Apollo astronauts (as well as two others aboard Soviet Lunokhod rovers) consist of arrays of corner cube reflectors.

McDonald Observatory’s 2.7-meter telescope beams a laser toward the Moon. The telescope, part of the University of Texas at Austin, conducted laser observations from 1969 to the mid-1980s, when laser ranging was moved to a smaller telescope. Credit: Frank Armstrong/UT Austin

“These are like peculiar prisms—they’re shaped like the upper corner of a room,” says Doug Currie, a professor of physics at the University of Maryland in College Park who has worked in the field since the 1960s. “You could throw a tennis ball in the corner, and it would hit all three sides and bounce back to you. The lunar reflectors do the same thing. The difference is, you can send up to 10²³ photons at a time, and you’re happy when one comes back.”
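Currie’s tennis ball analogy has a tidy vector form: each of the three mutually perpendicular faces flips one component of the incoming ray’s direction, so three bounces reverse it exactly. A minimal sketch:

```python
import numpy as np

def corner_cube_reflect(direction):
    """Reflect a ray off three mutually perpendicular mirrors."""
    d = np.asarray(direction, dtype=float)
    for normal in np.eye(3):                       # face normals along x, y, z
        d = d - 2.0 * np.dot(d, normal) * normal   # mirror reflection formula
    return d

incoming = np.array([0.3, -0.5, 0.81])
print(corner_cube_reflect(incoming))  # [-0.3, 0.5, -0.81]: exactly reversed
```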

The Apollo 11 and 14 retroreflectors each contain one hundred 3.8-centimeter corner cubes, whereas the Apollo 15 array contains 300, so it produces the strongest return signal.

Photons are beamed toward the Moon through a telescope, such as the 3.5-meter telescope at Apache Point Observatory in New Mexico, the largest instrument ever to conduct lunar laser ranging. The laser is fired in 100-picosecond pulses—“bullets of light” just 2 centimeters thick, says Murphy, who heads the Apache Point Observatory Lunar Laser-ranging Operation (APOLLO).

No more than a few photons from each pulse return to the telescope, but the telescope fires thousands of laser bullets during each ranging session, allowing it to collect thousands of photons per session. Statistical analysis smooths out the differences in ranges between individual photons, producing a distance to the Moon with an accuracy of about 1 millimeter.
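The averaging obeys familiar statistics: the uncertainty of the mean shrinks as the square root of the number of photons. The numbers below are illustrative, not the experiment’s actual error budget.

```python
import numpy as np

# Assumed ~1 cm of scatter in each photon's one-way range estimate.
photon_sigma_mm = 10.0

for n_photons in (100, 1_000, 10_000):
    # The standard error of the mean falls as 1/sqrt(N).
    sem = photon_sigma_mm / np.sqrt(n_photons)
    print(f"{n_photons:>6} photons -> {sem:.2f} mm")
# 10,000 photons: 10 mm / 100 = 0.10 mm, at the millimeter level and below
```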

APOLLO ranges to the Moon about six times per month and targets all five of the retroreflectors during each session. France’s Observatoire de la Côte d’Azur, the other major lunar-ranging station, uses a smaller telescope but has begun ranging with an infrared laser, which is about 8 times more efficient than the standard green laser.

An Array of Scientific Contributions

Lunar laser ranging’s first scientific contribution was to produce an accurate measurement of how quickly the Moon is moving away from Earth: 3.8 centimeters per year. The retreat is the result of the ocean tides on Earth, which cause our planet’s rotation rate to slowly decrease. To balance the books on the angular momentum of the Earth-Moon system, the Moon’s orbit gains the angular momentum that Earth loses, causing the Moon to move away from Earth.
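That rate adds up over the experiment’s lifetime. A one-line check shows that the Moon has retreated roughly 2 meters since the first reflector was deployed, a drift thousands of times larger than the millimeter ranging precision:

```python
recession_mm_per_year = 38.0         # 3.8 cm/yr, measured by laser ranging
years_since_apollo_11 = 2019 - 1969
print(recession_mm_per_year * years_since_apollo_11 / 1000.0, "m")  # 1.9 m
```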

All five of the current lunar retroreflectors are located near or north of the Moon’s equator, leaving the southern hemisphere uncovered. Credit: NASA

Collecting data from the network of five retroreflectors over the course of several decades also has allowed planetary scientists to probe the Moon’s interior by measuring how the Moon “wobbles” on its axis.

Some of those wobbles are caused by the Moon’s elliptical orbit, but others are produced by motions within the Moon itself. Measurements of that interior “sloshing” revealed that the Moon has a liquid outer core that’s about 700 kilometers in diameter, roughly 20% of the Moon’s overall diameter.

“Everybody came in thinking, ‘we really know the Moon,’ but we didn’t,” says Peter Shelus, a research scientist at the University of Texas at Austin, which conducted lunar laser-ranging operations for more than 40 years. “We didn’t know the lunar rotation as well as we thought. As we got more data, though, everything fell into place, and the rotation rate allowed us to probe the interior.”

When the lunar laser-ranging experiment was conceived in the early 1960s, however, learning about the Moon itself was a secondary goal. The primary goal was to study gravity. So far, laser ranging has confirmed that Isaac Newton’s gravitational constant is stable over time to the highest precision yet seen and has confirmed other tenets of gravitational theory, including the equivalence principle, which says that gravitational energy should behave like other forms of energy and mass.

“What we’re after, the flagship science, is the strong equivalence principle,” says Murphy. “By, quote, dropping Earth and the Moon toward the Sun, we can use the Earth-Moon separation as a way to explore whether two bodies are pulled toward the Sun differently. That’s a foundational tenet of general relativity, and it would be very important if we saw a violation there.”

So far, the lunar laser-ranging experiment has confirmed relativity’s predictions about the equivalence principle to the highest precision yet seen—within the experiment’s margin of error, Earth and the Moon “fall” toward the Sun at the same rate.

“There’s Still Work to Do”

Despite the experiment’s success, Murphy says he’s “disappointed” in the results to date.

“We’ve managed to produce measurements we’re all confident in at the millimeter level of accuracy, but the model that it takes to extract science from this result has been slow to catch up. So we haven’t yet seen the order-of-magnitude level of improvement that we hoped for in those tests. We’ve seen maybe a factor-of-2 level of improvement, but that’s not very satisfying.”

James Williams, a senior research scientist at NASA’s Jet Propulsion Laboratory and another pioneer in the lunar-ranging field, agrees that there’s work to do to improve our understanding of the results.

“We’ve measured the Earth-Moon acceleration toward the Sun to 1.5 parts in 10¹³, which is a very, very sensitive test. It limits certain gravitational theories,” Williams says. “But there’s stuff in the model and in the data that we still don’t understand. There’s still work to do.”

While the models catch up, the observational side of the project could stand some improvement as well, scientists say.

The Lunokhod reflectors, for example, can be used only around sunrise and sunset; thermal problems scuttle observations at other points in the lunar cycle. The Apollo reflectors are degrading, probably because micrometeorite impacts on the surface are splashing dust onto the corner cubes. All of the current retroreflectors are placed near or north of the equator, leaving the southern half of the lunar globe uncovered. And current ranging is so precise that the orientation of the retroreflectors can cause a problem: As the laser bounces off opposite corners of an array, it can increase uncertainty in the measurements by a few centimeters.
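The orientation effect is plain geometry: a tilted array puts its near and far corners at slightly different ranges. The width and tilt below are assumed values, purely for illustration.

```python
import numpy as np

array_width_m = 0.6   # assumed scale of a reflector array
tilt_deg = 7.0        # assumed libration-driven tilt toward Earth
spread = array_width_m * np.sin(np.radians(tilt_deg))
print(f"{spread * 100:.0f} cm")  # ~7 cm of range spread across the array
```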

Currie has proposed sending new reflectors to the Moon using a new corner cube design.

“We’ve been working on a 100-millimeter glass reflector that’s basically a scaled-up version of the Apollo reflectors,” he says. “You don’t have to worry whether a returned photon came from the near corner or the far corner of an array. We think that’ll improve the accuracy of a shot by a factor of a hundred. We’ve had to solve some thermal issues with the reflectors and the frame, but we can put together a package that can fly.”

Currie’s group has submitted proposals to NASA to strap one of the new modules on an upcoming lunar mission and has signed an agreement with Moon Express, a company vying to launch a lander.

“If you’re going to the Moon, these are almost no-brainer accompaniments,” says Murphy. “Their success is almost guaranteed; they require no power, they’ll work for decades and decades….It’s a low-cost, high-reward investment, which is why it was included on the initial Apollo mission.”

It’s an investment that’s still paying dividends 50 years later.

—Damond Benningfield (damonddb@aol.com), Science Writer

National Academy Can Now Expel Scientists in Cases of Misconduct

Mon, 06/10/2019 - 17:42

The National Academy of Sciences, established during Abraham Lincoln’s presidency, has long been an exclusive circle of distinguished scientists. But membership in the institution, previously conferred for life, can now be rescinded.

Scientists who violate the organization’s Code of Conduct can be stripped of their membership, the National Academy of Sciences (NAS) announced on 3 June. The change to the organization’s bylaws was approved after its thousands of members were polled, and the result was overwhelmingly in support of the amendment.

Membership as a “Major Award”

Scientists are elected to the National Academy of Sciences by invitation; fewer than 100 researchers are inducted annually. Members are quick to note the advantages of being part of the select group.

“It’s absolutely a benefit for people who get in,” said Donald Turcotte, a geophysicist at the University of California, Davis, who was elected as a member in 1986. “Short of [a Nobel Prize], it’s the major award that somebody can get.” In Turcotte’s case, he says the honor helped him secure a faculty position.

Recently, there’s been increased scrutiny over how scientific prizes and honors—like membership in the National Academy of Sciences—are awarded. That’s because of growing concerns over misconduct in the sciences.

Scientific prize–granting organizations are being faced with important questions: Should a scientist’s ethical conduct be considered in addition to his or her scientific prowess? Who decides the severity of the misconduct? Is there a statute of limitations?

The answers to these questions and others aren’t clear. What is clear is that the effects of misconduct, including various forms of harassment, can have far-reaching, long-lasting consequences: Scientists who have been harassed have switched research fields to avoid their harassers and even left academia altogether.

“In the Past, There Was No Way of Doing This”

Some scientific organizations have already taken a stance on this complicated issue.

In September 2017, AGU updated its ethics policy to take a much stronger position against harassment. The organization also requires that candidates for an AGU award, honor, or governance position complete a Professional Conduct Disclosure Form in which individuals must disclose if they have been the “subject of a filed allegation, complaint, investigation, sanction or other legal, civil or institutional proceeding.” Last year, AGU rescinded an award after receiving a formal ethics complaint about the prize winner, Nature reported.

The National Academy of Sciences, however, hasn’t had any policies in place to strip scientists of their membership. “In the past, there was no way of doing this,” said Turcotte.

But in late April, scientists attending the 156th Annual Meeting of the National Academy of Sciences in Washington, D.C., began setting changes in motion.

At a business session on 30 April, NAS members voted to amend the organization’s bylaws. The vote allowed the 17-member NAS Council to revoke the membership privileges of scientists who violated the Code of Conduct.

Citing the “substantive” nature of this amendment, however, the National Academy of Sciences decided the vote would need to be ratified by its full membership. An email was sent to all of the organization’s roughly 2,300 members asking them to cast their ballot through the NAS website.

Cathy Whitlock, an Earth scientist at Montana State University in Bozeman, voted in favor of amending the bylaws. “I’m completely supportive of the effort,” said Whitlock, who was elected to the National Academy of Sciences in 2018 and is also a member of AGU. “It’s bringing the NAS up to the issues that are being faced today.”

Voting closed on 31 May, with a resounding 84% in favor of the amendment.

“The amendment passed by a large margin,” Susan Wessler, home secretary of the National Academy of Sciences, announced to members on 3 June.

Adhering to the Highest Standards of Professional Conduct

This change will potentially affect only a “very, very small number” of NAS members, but it sends a strong message, said Turcotte.

Marcia McNutt, the president of the National Academy of Sciences, echoed this sentiment. “This vote is less about cleaning house and more about sending the message that the members of the National Academy of Sciences adhere to the highest standards of professional conduct and are serious about expecting that their colleagues abide by our code,” McNutt told Science. McNutt, a marine geophysicist, was president of AGU from 2000 to 2002.

The National Academy of Sciences’ decision is an important one, said Chris McEntee, AGU’s chief executive officer and executive director. “We are pleased to see organizations like the National Academy of Sciences…look at updating their own codes of ethics to address serious issues of harassment, bullying, and discrimination in science.”

—Katherine Kornei (@katherinekornei), Freelance Science Journalist

Ordinary Security Cameras Could Keep an Eye on Rainfall

Mon, 06/10/2019 - 11:21

The same security cameras used on seemingly every busy city block could also capture instantaneous measurements of rainfall intensity: the depth of rain that falls over a given time period. Developed by Jiang et al., this low-cost approach could help inform flood warnings, climate change research, water resource management, and other hydrologic pursuits.

Rain gauges traditionally provide intensity measurements but are often too sparsely spaced for high-resolution data, especially in topographically varied areas like cities. Remote sensing methods such as weather radar are too “big picture” and too indirect to aid real-time flood warnings. Instruments called disdrometers capture instantaneous rainfall intensity but are too pricey for widespread use.

The new, alternative strategy uses “opportunistic sensing,” in which novel insights are gleaned from unrelated sources. Recognizing the ubiquity of closed-circuit television (CCTV) cameras, the researchers developed an algorithm that separates a CCTV video still into one layer capturing the streaky shapes of falling raindrops and another layer of the raindrop-free background. Image analysis then reveals instantaneous rainfall intensity.
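As a rough illustration of what layer separation means, and emphatically not the authors’ algorithm, one crude stand-in is to take the per-pixel temporal median of a short frame stack as the static background and flag bright transient residuals as candidate rain streaks:

```python
import numpy as np

def split_rain_layer(frames):
    """Crude rain/background split: frames is (n_frames, height, width)."""
    frames = np.asarray(frames, dtype=float)
    background = np.median(frames, axis=0)       # static-scene estimate
    residual = frames - background               # raindrops are transient
    rain_mask = residual > 3.0 * residual.std()  # simple brightness threshold
    return background, rain_mask

# Tiny demo: one bright transient pixel in a stack of dark frames.
demo = np.zeros((5, 4, 4)); demo[2, 1, 1] = 50.0
_, mask = split_rain_layer(demo)
print(mask[2, 1, 1], mask[0, 0, 0])  # True False
# A calibration step could then map the fraction of flagged pixels per
# frame to a rainfall intensity in millimeters per hour.
```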

The researchers tested their new raindrop identification algorithm in a series of virtual analyses. They found that it outperforms previously developed algorithms in separating raindrops from backgrounds with visual disturbances, such as moving cars and swaying trees.

They also tested their overall approach to rainfall intensity measurement in real-world settings during five different rainfall events and found that the approach has satisfactory accuracy over widely varying rainfall intensities. It also has a lower error rate than other camera-based strategies, despite its reliance on lower-quality cameras and testing with real-world scenes that are more complex.

The new approach highlights the possibility of using existing CCTV networks to opportunistically measure rainfall intensity at high resolution and low cost. Such observations could help researchers validate climate models and improve understanding of floods caused by intense storms, especially in urban settings.

The authors suggest several paths for future research, including fine-tuning the raindrop identification algorithm to capture a wider range of raindrop phenomena, such as splashing. Application of artificial intelligence techniques could also enhance the new approach. The research team is now working with the local meteorological department to implement this technology in Shenzhen, China’s “tech megacity.” (Water Resources Research, https://doi.org/10.1029/2018WR024480, 2019)

—Sarah Stanley, Freelance Writer

Role of Humans in Past Hurricane Potential Intensity Is Unclear

Mon, 06/10/2019 - 11:21

The last three Atlantic hurricane seasons (2016–2018) were very active. In 2018 the season started early and ultimately produced more named storms and more hurricanes than an average year. The 2017 season was the costliest on record, causing more than $200 billion in storm damage from June to November (Hurricane Katrina’s season cost an estimated $159 billion). The 2016 hurricane season lasted almost 11 months and included a category 5 storm.

Active hurricane seasons are associated with warm sea surface temperatures, which fuel tropical cyclones. Scientists have observed stronger storms in the North Atlantic since the 1980s. But how much have human activities contributed to this trend, and how much is from natural variability?

Trenary et al. examined several climate models to evaluate how human-caused factors like greenhouse gases and aerosols might have affected a hurricane’s potential intensity, or the theoretical maximum intensity a storm could reach based on the surrounding environmental conditions.

The team first found that greenhouse gas warming and aerosol cooling produced changes in potential intensity that had very similar spatial patterns, making them difficult to separate in observations. They then evaluated the impacts of the combined responses and could not find a coherent response across the models—some models simulated an increase in potential intensity while others simulated a decrease. This discrepancy reveals that potential intensity is sensitive to how an individual model responds to human-caused factors. Given this sensitivity and the large natural swings in potential intensity in the Atlantic, it is not surprising that a human influence on potential intensity has not been detected yet. This result is consistent with mainstream science: Although models disagree about past changes in hurricane potential intensity, they all agree that Earth is warming because of human activities. Moreover, although human impacts on potential intensity are not currently detectable, observations could potentially reveal detectable changes in other hurricane-related characteristics.

In the future, aerosol emissions are expected to decrease as a result of more aggressive air pollution policies. Meanwhile, greenhouse gases will undoubtedly continue to accumulate in the atmosphere, leading to future increases in potential intensity. (Geophysical Research Letters, https://doi.org/10.1029/2018GL081725, 2019)

—Elizabeth Thompson, Freelance Writer

Role of Continental Arcs in Global Carbon Dioxide Emissions

Mon, 06/10/2019 - 11:17

We live in an age where changing levels of atmospheric carbon dioxide are driven by the actions of humans.

However, this phenomenon is recent, covering only the last few hundred years of Earth’s long history. Prior to our influence on the atmosphere, changes in carbon dioxide were governed by two main forces: life and magma.

Organisms can release and capture carbon dioxide, and magmatism in Earth’s crust releases carbon dioxide and other volatiles at both mid-ocean ridges and arcs. Carbon dioxide is a major player in global climate, so if we want to understand why climate has varied, we need to unravel how the gas’s atmospheric concentration has changed over the planet’s history.

In a recent paper in Geochemistry, Geophysics, Geosystems, Barbara Ratschbacher and others tackled the potential importance of continental arcs in carbon dioxide emissions over the past 750 million years. They dissected representative examples of continental arcs from the Andes, Cascades, and Sierra Nevada, quantifying the volumes of plutonic and volcanic rocks in each to estimate the total volume of magma needed for these sections. These magma volume estimates allowed the team to take a time slice of arc productivity during periods of heightened activity called flare-ups.

To scale from these cross sections to a global estimate, Ratschbacher and her colleagues assumed a carbon dioxide content for arc magma, a global length of continental arc through time, and a ratio of flare-up to lull activity along the arcs, yielding the total carbon dioxide released by continental arcs over the past 750 million years.
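In spirit, that scaling is a chain of multiplications. The sketch below runs the arithmetic with hypothetical placeholder values rather than the paper’s calibrated inputs, and omits the flare-up/lull weighting, just to show how the pieces combine:

```python
magma_flux_km3_per_km_yr = 30e-6   # assumed magma addition per km of arc
arc_length_km = 20_000.0           # assumed global continental arc length
magma_density_kg_m3 = 2_700.0      # typical density of silicate magma
co2_weight_fraction = 0.005        # assumed 0.5 wt% CO2 in arc magma

magma_m3_per_yr = magma_flux_km3_per_km_yr * 1e9 * arc_length_km
co2_kg_per_yr = magma_m3_per_yr * magma_density_kg_m3 * co2_weight_fraction
print(f"{co2_kg_per_yr / 1e9:.1f} Mt CO2 per year")  # ~8.1 Mt/yr here
```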

Results revealed that the warm climate of the Mesozoic correlated with the highest estimates of carbon dioxide emissions from continental arcs, about 15–18 megatons per year. Modern estimates of continental arc carbon dioxide emissions are closer to 7 megatons per year. The values in the new research represent carbon dioxide emitted from only continental arcs and not mid-ocean ridges or flood basalt provinces, so this is only a piece of the global carbon dioxide balance.

These calculations were built off some of the best examples of exposed continental arc crust, where both plutonic and volcanic components could be assessed.

“I have been trained to be hesitant to draw conclusions if data is missing or inaccessible. However, I do believe that the assumptions we made are well founded and the areas we chose for the calculations are the best-studied continental arc sections worldwide, making these calculations as accurate as was possible. It will be interesting to see whether our results change as more data is collected in the future,” Ratschbacher wrote to Eos in an email.

Ratschbacher said that the idea to use magma addition rates in continental arcs to assess carbon dioxide emissions over time arose from conversations at the 2017 Cooperative Institute for Dynamic Earth Research summer program.

“Instead of trying to get at the carbon dioxide output via direct measurement at volcanic centers,” she said, “we used the input of magma to the crustal column to determine the output of carbon dioxide.”

Next Steps

Gary Michelfelder, an assistant professor at Missouri State University in Springfield who was not involved in the research, said in an email that the study was a great first step in understanding changing rates of magmatic addition over time and carbon dioxide emissions from continental arcs.

Anita Grunder, an emeritus professor at Oregon State University in Corvallis who also wasn’t involved in the study, agreed that the study was “a creative attempt to quantify carbon dioxide contributions” and wondered what other lines of evidence might exist for high carbon dioxide emissions during flare-ups: “More weathering? More limestone? Hotter climate?”

Ratschbacher thinks that the next avenue to pursue is examining arcs that are less productive and magma rich, such as the Appalachians. This comparison between more and less productive arcs could help answer a key question of why arcs can change in character from lulls to flare-ups. Putting all these pieces together will paint a clearer picture of the relationship between changing carbon dioxide in the atmosphere and global climate.

—Erik Klemetti (@eruptionsblog), Denison University, Granville, Ohio

Can Bold U.S. Federal Climate Legislation Be Enacted Now?

Fri, 06/07/2019 - 18:59

This time could be different.

That’s what former Rep. Henry Waxman hopes.

Ten years ago this month, he, then a U.S. representative from southern California, and Sen. Ed Markey (D-Mass.), then a member of the House himself, introduced far-reaching legislation to curb climate change. The American Clean Energy and Security Act (ACES), commonly referred to as the Waxman-Markey bill, would have established a nationwide greenhouse gas cap-and-trade system with goals to reduce those emissions while also addressing energy efficiency and other issues.

The bill passed in the House with a vote of 219–212 on 26 June 2009, but it died in the Senate on 22 July 2010, when Democratic leaders abandoned it after concluding it could not win enough votes, in part because of an effective lobbying effort by special interests.

But the landscape has changed dramatically since then, and the chance for significant climate action in Congress may be better soon, perhaps after the upcoming presidential election if a climate-friendly president is elected, Waxman told Eos.

“The public is going to demand action,” said Waxman, who in 2009 was chair of the House Energy and Commerce Committee. “The public has been seeing constant examples of the harm that is coming from climate change.”

“The public is more attuned to what’s happening and is not going to accept the view that there’s no problem,” added Waxman, who currently chairs Waxman Strategies, a Washington, D.C.–based public affairs and strategic communications firm. “They are going to see the reality in connecting the scientific statements about climate change with the experience of climate change. It used to be that Republicans and others thought we wouldn’t see any problems for many, many decades, but we’re seeing the effects of climate change right now.”

A Riper Time for Action?

Other politicians and experts also say that the time is riper now for action, because of the increased urgency about climate change that has been highlighted in recent scientific reports and because of increased awareness and activism about the issue.

“The difference between 2009 and ’10 and today is the movement that has now been built,” Markey said at the 7 February 2019 press briefing outside the U.S. Capitol Building, where he and Rep. Alexandria Ocasio-Cortez (D-N.Y.) introduced the sweeping Green New Deal resolution, an ambitious proposal to achieve net-zero greenhouse gas emissions by 2050, promote climate justice, and create jobs, among other goals. In 2009, Markey was chairman of the House Energy and Commerce Subcommittee on Energy and Environment.

Climate change “is now a voting issue across the country,” Markey said. “The green generation has risen up, and they are saying that they want this issue solved, and they want the people who work in this building and occupy the White House to solve this problem. So this is going to enter the 2020 election cycle as one of the top two or three issues for every candidate on both sides for them to have to answer.”

He and others highlighted another distinction between then and now. The year 2009 was when the libertarian Koch brothers and others “started to pour their millions into trying to create a climate where people did not believe in climate science,” Markey said. “We now have the troops. We now have the money. We’re ready to fight. OK. And so the difference between 2009 and ’10 and today is we now have our army as well.”

That army of activists and concerned citizens, along with scientific reports by the Intergovernmental Panel on Climate Change and others, and the increasing evidence of climate change have encouraged many Democratic presidential candidates to support the Green New Deal or other strong actions on climate change.

These candidates include current frontrunners former vice president Joe Biden; Sen. Bernie Sanders (I-Vt.); Sen. Elizabeth Warren (D-Mass.); South Bend, Ind., mayor Pete Buttigieg; and Sen. Kamala Harris (D-Calif.). Another candidate, Washington governor Jay Inslee, calls climate change the country’s number one issue.

The Democratic National Committee, however, reportedly has rejected a call for a presidential primary debate to focus on climate change.

Any Possibilities in Congress This Session?

Rep. Jared Huffman (D-Calif.) told Eos that the Waxman-Markey bill stands as “the high water mark for climate leadership in the Congress,” but he is hopeful that even in the current political climate, there could be action in Congress.

“Just catching up to where we were then feels like a Herculean task given the fossil fuel industries’ efforts and the way they have punished Republicans who did support climate leadership in the Waxman-Markey bill,” he said. However, Huffman, chair of the House Natural Resources Subcommittee on Water, Oceans, and Wildlife, added, “I have faith that even the most shriveled heart of my Republican colleagues at some point will have to think about future generations.”

Some analysts won’t totally write off the possibility of minor legislation related to climate change reaching the president’s desk for approval this Congress, despite current resistance by the Senate Republican leadership and the Trump administration’s attacks on climate science.

Indeed, a lot of climate-related legislation has been introduced in the Democratically controlled House, which has held numerous hearings about climate change. And some legislation even has passed the House, including the Climate Action Now Act, which directs President Donald Trump to honor the nation’s commitments under the 2015 Paris climate agreement, and four bills dealing with ocean acidification. However, the Climate Action Now bill likely has little chance for passage in the Senate, and the same fate may await some of the other legislation as well.

Sara Chieffo, vice president of government affairs with the League of Conservation Voters (LCV), told Eos that she has not given up on the possibility of there being moderate progress on climate change legislation during the current Congress, but she thinks that passing major legislation will need to wait for a more environment-friendly Senate and White House. “Given the scope and scale of the crisis and the support from pro-environment members of Congress, we are going to try to make progress where we can,” she said, whether it is, for instance, within appropriations legislation, a potential tax vehicle, or infrastructure or defense bills.

“There is no reason we shouldn’t be making small progress through those vehicles,” added Chieffo, who joined LCV just 3 weeks before the Waxman-Markey bill was marked up in the House Energy and Commerce Committee. “But if you are asking, ‘Is anything transformative possible with President Trump in the White House?’ our answer is no.”

David Doniger, senior strategic director for the Natural Resources Defense Council’s (NRDC) Climate and Clean Energy Program, largely agreed with that assessment. “Nothing constructive can occur during this term with this administration not playing any positive role, playing only negative roles,” he told Eos.

However, Doniger, who followed the Waxman-Markey bill closely for NRDC, said that there could be substantive and constructive steps forward, even now, in appropriations bills and perhaps in transportation and disaster relief, planning, and preparation legislation. In addition, he said that now is a time “to continue to build and refine public support” for action on climate change and to elaborate on specific legislative proposals that could be ready to move forward perhaps in a more favorable next congressional session.

Doniger cautioned, though, that a so-called innovation agenda that some Republicans are pushing, with a focus on potential technological fixes, “is just not enough by itself” to solve the climate change problem. “We need to see a commitment and understanding [from them] that we really do have to reach net-zero emissions by the middle of the century,” Doniger said. “That has to be the yardstick for measuring the effectiveness of policies that a political office holder advocates.”

Going Big?

“We need to think differently about how to get climate action through Congress,” said Jeremy Symons, who was senior vice president for programs at the National Wildlife Federation during the Waxman-Markey effort.

“Climate measures big and modest should be incorporated into all parts of Congress’s legislative agenda,” Symons, now the principal at Symons Public Affairs, a consulting firm supporting climate action, told Eos. “After all, the entire climate is changing, and tackling climate will require investment and updated regulations across all parts of government operations. We need to go big in Congress, but not put all our eggs in one legislative basket.”

The Green New Deal is one attempt to go big. Waxman commented that although a lot of the details of that resolution, including the funding for it, have not been fleshed out, he is encouraged by the effort to bring greater urgency to the climate change issue. Speaking about the resolution’s sponsors, Waxman said, “I’m proud of the fact that they have taken up this fight and they are pushing hard. I find that encouraging because it would be a shame just to withdraw from this battle, when we’ve got to prepare ideas that we need to turn to, if not now, [then] when we have the opportunity to pass legislation and use existing laws.”

Another attempt at going big, and perhaps a successful next major step after the Waxman-Markey bill, is the Energy Innovation and Carbon Dividend Act of 2019. It would impose a fee on the carbon content of fuels and on products “derived from those fuels that will be used so as to emit greenhouse gases into the atmosphere,” with the fee providing dividend payments to U.S. citizens or lawful residents. Mark Reynolds, executive director of the Citizens’ Climate Lobby (CCL), which supports the legislation, told Eos that although the bill may not pass this Congress, it sets a marker.

The current version “lays the groundwork for a simple transparent bill to be the next major climate bill introduced,” he said, noting that CCL tries to work in a bipartisan manner on climate issues. “If we can establish that this is the bipartisan approach to the problem, then I think that’s a huge win for everybody working on this issue.”

Reynolds told Eos that a related bipartisan effort, the congressional Climate Solutions Caucus, is in the process of being reconstituted following the 2018 elections. He said that Rep. Francis Rooney (R-Fla.) will be the new Republican cochair of the caucus, with Rep. Ted Deutch (D-Fla.) continuing on as the Democratic cochair.

New Changes and Familiar Opposition

The political landscape has changed in other ways since Waxman-Markey. Antonia Herzog, who was a federal climate policy analyst with NRDC when the bill was being considered, told Eos that, for instance, there has been “a lot of great stuff happening in the states and at the local levels” over the past decade to deal with climate change. However, she said she wishes that more action had already taken place.

“Every single year is a year that has been lost since it became obvious that [climate change] is a problem,” said Herzog, who currently is the climate and energy program manager for Physicians for Social Responsibility.

LCV’s Chieffo added that the clean energy economy is far more mature now than it was a decade ago. “Would we be better off if we had already been implementing climate solutions at the federal level that Congress had passed 10 years ago? Certainly. But are we starting out from zero? The answer is no,” she said.

In addition, there has been significant activity in the private sector. For instance, the We Are Still In coalition counts about 3,800 organizations in the United States—including businesses as well as states and cities—committed to meeting the Paris goals. In addition, former New York Mayor Michael Bloomberg announced today, 7 June, a $500 million Beyond Carbon campaign to tackle climate change.

One thing that hasn’t changed much in the past decade is the opposition, say a number of experts.

“The number one barrier to action in 2009 is the same barrier we face in 2019. The fossil lobby has teamed up with some deep-pocketed conservative donors in order to paralyze government action from the beltway to state ballot initiatives,” said Symons. “It’s far easier to scare people and block government action than it is to bring about real change. I call this the climate obstruction lobby. The climate obstruction lobby has a dollar-driven death grip on Republican politicians.”

That lobbying “lowered the probability of enacting the Waxman-Markey bill by 13 percentage points,” according to a paper, “The Social Cost of Lobbying over Climate Policy,” published in the June issue of Nature Climate Change.

“There is going to be a ton of oil and gas money fighting us, that’s for sure. But I don’t think it is guaranteed to have the same influence as 10 years ago,” said Doniger. He said that’s in part because the coal industry is not the economic or political force that it had been and because other energy industries, including wind, solar, and even nuclear, are asserting themselves in the economy and in politics “in ways you didn’t see 10 years or more ago.”

Theda Skocpol, professor of government and sociology at Harvard University, told Eos that although the fossil fuel industry is a significant driver of climate denialism, “the real force on the right is the Koch network.”

That network “is not just the brothers. It’s 400 to 500 ultraconservatives who organize to take over the policy making of the Republican Party, and they are very successful,” said Skocpol, author of a 2013 analysis of the defeat of the Waxman-Markey bill, “Naming the Problem: What It Will Take to Counter Extremism and Engage Americans in the Fight Against Global Warming.” “This is just one issue of a number where they’ve weighed in and changed the Republican policy making on the economy, the role of government in the economy in the states and the federal government. But this is absolutely a core area. So they both reward Republicans who hew the line they want, and they punish those who do not. You don’t find very many Republicans who are actually in office or who are running for office who are prepared to go along with much of anything.”

She added, “Republican officeholders and elites are just as locked into blocking action as they have been for a decade. And Donald Trump is definitely the denier in chief.”

Hope for a Better Outcome

Despite the defeat of the Waxman-Markey bill and the continued opposition to advance and enact climate change legislation at the federal level, experts say that there is renewed hope for incremental climate change steps now and major measures perhaps as soon as the next Congress. They also are encouraged by nonfederal progress over the past decade, the focus on climate change in the current Congress, and the heightened awareness and advocacy about climate change in light of the increasing urgency of the issue.

“The problem is going to be more and more on people’s minds as they see tornadoes and storms and droughts and all the consequences of what the scientists have been predicting from climate change,” Waxman told Eos. “They are going to demand action, and they are going to, in a democracy, turn away from those who say that there is no problem, because the truth is being driven home by people’s experiences.”

—Randy Showstack (@RandyShowstack), Staff Writer

Eight Ways to Support Women in Science

Fri, 06/07/2019 - 12:49

To succeed in science, women must overcome subtle biases that favor men as well as a culture of overt sexual harassment. These barriers to success result in the underrepresentation of women in science: In the United States, women receive 50% of geoscience Ph.D.’s but represent only 20% of geoscience faculty positions. The systemic underrepresentation of women in science fosters a culture of sexual harassment, which in turn discourages women from careers in science and perpetuates underrepresentation.

Although calls have been made for senior scientists to address sexual harassment and break the cycle, a broader cultural shift is required to reach gender equity in science. Ultimately, the majority group (i.e., men, specifically, white men) must step up and share the burden of diversity, equity, and inclusion efforts if we are to succeed in making science diverse, equitable, and inclusive.

We focus on binary gender here, but other underrepresented groups in the geosciences face barriers on the basis of their ethnicity, nonbinary gender, sexual orientation, economic status, disability, geography, or religion, among other factors. For people who claim more than one marginalized identity, the multiplicative effects of these barriers can be especially severe. Many of the strategies outlined here can be generalized to improve the diversity, equity, and inclusion of all underrepresented groups in science.

Here we present an eight-point subset of the many available resources and concrete strategies for effecting a cultural transformation. The AGU Ethics and Equity Center, which launched earlier this year, provides further resources to educate; to promote and ensure responsible scientific conduct; and to establish tools, practices, and data for organizations to foster a positive work climate in science.

1. Stop Harassing Women

Most women in science experience sexual harassment at some point during their career, most of it perpetrated by men. In the geosciences, field research environments, which can isolate victims from reporting systems and support networks, amplify the frequency and severity of sexual harassment. One reason for the prevalence of sexual harassment in science may be the harasser’s ignorance of which behaviors are merely inappropriate and which ones constitute sexual harassment, as defined by a recent report on sexual harassment of women by the National Academies of Sciences, Engineering, and Medicine (NASEM).

It is widely recognized that sexual harassment includes unwanted sexual touching (unwelcome physical sexual advances, which can include assault) and sexual coercion (favorable professional or educational treatment that is conditioned on sexual activity). However, the vast majority of sexual harassment consists of verbal, unwanted sexual attention or gender harassment (verbal and nonverbal behaviors that convey hostility, objectification, exclusion, or second-class status toward women).

All forms of sexual harassment have quantifiable negative consequences for victims. These consequences include eroding their sense of security in the workplace, slowing their productivity, and causing them to skip professional meetings where they do not feel safe. Men in the scientific community must confront the reality that many of us have sexually harassed women and that the harassment must stop.

2. Listen to Women

Listening to the scientific and personal experiences of women in science is paramount to achieving gender diversity and equity in science. Effective listening requires paying attention to, understanding, not interrupting, believing, responding to, and remembering what is being said.

The simple act of listening to women’s science promotes their work, while acknowledging the barriers they face validates their experiences and improves the institutional climate.

3. Be an Active Bystander

In addition to not harassing women, it is our responsibility to be active bystanders. When active bystanders suspect or witness potential or ongoing sexual harassment, they step in to defuse the situation and support the targeted party. Active bystanders always prioritize the safety of the targeted party over punishing the harasser.

Active bystanders should know the resources relevant to victims of sexual harassment. For example, they should have the number 1-800-656-HOPE (4673), the National Sexual Assault Telephone Hotline, in their phones. They should also be able to provide information about their institution’s ombuds office, an independent and confidential party that helps victims of harassment resolve disputes within their institution.

Leaders of scientific institutions should require face-to-face active bystander training because online sexual harassment trainings have been shown to backfire and may actually lead to increased workplace harassment. Men should participate in these active bystander trainings when offered, such as at the recent AGU Fall Meeting 2018.

How might an active bystander respond to an incident at a scientific meeting? Here are two hypothetical examples:

A male conference attendee aggressively questions a woman speaker and repeatedly dismisses her answers. An active bystander on the session panel might interrupt the questioner and suggest moving on to questions from other attendees.

A conference attendee is holding the arm of a visibly uncomfortable woman who is giving a poster presentation. An active bystander might defuse the situation by introducing themselves to the poster presenter with a handshake—giving the woman presenting an opportunity to free her arm from the harasser—and standing by to listen to the remainder of her presentation.

4. Implement Policies That Support Victims of Sexual Harassment

The responsibility of implementing policies that support women lies with those who hold most of the power, namely, male institutional leaders. Following the recommendations of the NASEM report Sexual Harassment of Women, leaders in science should implement the following concrete policies (the report contains a more complete list):

Leaders of scientific departments, institutions, and organizations must make it clear that sexual harassment is a form of scientific misconduct that carries clear and appropriate negative consequences for proven harassers. When a victim files a harassment claim, the priority of the institution should be to ensure that the victim can safely continue their work.

Institutions need to consider the confidentiality of the target while also directing that person toward systems of support for victims of harassment. Sexual harassment policies should be clear, accessible, and consistent. They should address all forms of sexual harassment, including gender discrimination. Anonymized annual reports should be available to the entire community, detailing statistics of recent and ongoing sexual harassment investigations, including any disciplinary actions taken.

Academic institutions have a poor track record when it comes to punishing sexual harassers, especially when the harassers are faculty members. Disciplinary consequences should be progressive: They should correspond to the frequency and severity of the harassment. For example, disciplinary consequences might escalate from requiring counseling to reductions in pay to dismissal. Progressive consequences have the cobenefits of appropriately punishing harassers and reducing the fear of retaliation for victims. Funding agencies and professional organizations should rescind existing funding and awards from proven harassers.

5. Evaluate Your Personal Biases

Women in science are disenfranchised not only by sexual harassment but also by structural and implicit biases. For example, science faculty (irrespective of gender) view male students as more competent than equally qualified female students. Similarly, recommendation letters for postdoctoral fellowships in geoscience display significant gender differences that favor male applicants.

The first step to eliminating implicit biases is to recognize and quantify them. Women already count how well women are represented in conference sessions, panels, papers, and committees. Men should also evaluate the gender balance of their collaborators and departments and strive for equal representation. Men should consult existing resources for avoiding bias.

6. Promote Women Scientists and Their Work

Combating implicit biases against women in science requires an explicit effort to promote women scientists and their work. When writing papers, cite women. If you can think of only a few women authors to cite, look at those papers and consider the women authors they cite—you may discover relevant papers you overlooked.

When planning invited departmental talks, consider the gender balance of invited speakers and strive for equal representation. If you are struggling to adequately represent women when organizing a panel, searching for a keynote speaker, or covering a recent paper for a media outlet, consider consulting women colleagues or resources such as the Request a Woman Scientist list, compiled by the group 500 Women Scientists. This list contains thousands of women scientists and is sortable by scientific field and level of expertise.

7. Incentivize and Support Inclusion Efforts

Women take on the majority of diversity, equity, and inclusion efforts in science at the expense of teaching and research, often without reward. Institutions should recognize, reward, and incentivize diversity, equity, and inclusion efforts. Such efforts include creating a departmental award for diversity, equity, and inclusion efforts; requiring a diversity statement in faculty applications; and recognizing diversity, equity, and inclusion efforts as positive contributions in promotion packages.

Institutions should encourage men to participate in diversity, equity, and inclusion efforts. Men can be trained as equity advisers to combat implicit bias and advocate for underrepresented groups, for example. Programs like STEM Equity Achievement (SEA) Change provide metrics to evaluate institutional efforts to improve diversity, equity, and inclusion.

8. Hire Women Faculty and Nominate Women for Awards and Leadership Positions

Women are particularly underrepresented in leadership positions in science. They represent disproportionately few geoscience faculty members; science, technology, engineering, and mathematics (STEM) department heads; and AGU Fellows and awardees. Institutions should implement policies that encourage the nomination and hiring of women. Such policies include explicit reminders to nominate women for awards; support for existing efforts to nominate scientists from underrepresented groups; and hiring clusters of scientists from underrepresented groups at the same time, a practice that can dramatically improve faculty diversity and institutional climate.

As you can see, promoting diversity in the sciences requires all kinds of efforts, large and small. Individuals can make some of these changes by being observant of their own attitudes and actions and by stepping in to help when they see an opportunity to do so. Other changes require institutional leaders to enact policies and offer training and resources that promote fair treatment. Individuals can influence these larger efforts by advocating for change and by stepping forward to assist with institutional-level efforts.

—Henri Drake (henrifdrake@gmail.com), Massachusetts Institute of Technology/Woods Hole Oceanographic Institution Joint Program in Oceanography, Cambridge, Mass.

