EOS

Earth & Space Science News

Understanding Mountain Lakes in a Changing World

Wed, 09/20/2017 - 12:20

Alpine zones in many mountainous regions are warming faster than the global average. Recently observed changes in mountain lakes, including warming and increased algal growth, make it more urgent for scientists to understand the fundamental processes and properties of these lakes and to anticipate their future responses to global change.

Alpine and subalpine mountain lakes of the Northern Hemisphere are important witnesses to global change because of their rapid response to climate and atmospheric deposition and their relatively undisturbed catchments compared with lakes at lower elevations. These water bodies naturally contain low levels of organic matter and nutrients, making them ideal sites to better understand the effects of climate warming on lakes in general.

However, the combined changes in the amount and type of precipitation, along with atmospheric deposition of nitrogen, phosphorus, and contaminants, complicate our efforts to understand the immediate and long-term consequences of global climate change for these sensitive water bodies.

To discuss these changes, researchers and resource managers from North America and Europe held a workshop earlier this year. Attendees discussed how extreme climate events and changes in precipitation influence biogeochemical cycling, especially the cycling of key nutrients essential for life. They considered whether mountainous regions are experiencing increases in extreme events. If so, how do local watershed characteristics moderate climate-driven influences on lakes?

A warmer climate could increase the probability of rain-on-snow events, exacerbate glacier retreat, and induce permafrost thaw, all of which alter the timing and quantity of runoff and the quality of nutrient delivery to lakes. Climate change may also bring shorter winters and warmer summers, so workshop attendees wondered what the ramifications would be for lake dynamics, particularly algal productivity and photosynthesis. Changing climatic regimes may interact with atmospheric pollution and nonnative species introductions to cause profound biological changes in mountain lakes. These stressors could have implications for downstream water quality and energy movement through food webs.

Workshop participants developed a conceptual framework unique to mountain ecosystems. This framework takes advantage of elevation gradients, long-term measurements, and experiments to develop a comprehensive understanding of ecosystem change.

The group developed a template for a growing, comprehensive database that will address many other research questions raised at the workshop, such as whether the species assemblages and food webs of mountain lakes, given the highly variable environments in which they reside, are resilient to climate perturbations. Additionally, are winter conditions changing in mountain ecosystems? Efforts are already underway to evaluate factors that determine the sensitivity of mountain lakes to species turnover, using data from this growing database of dozens of lakes in the Northern Hemisphere. Attendees also reported that several national parks in the western United States will coordinate targeted sampling efforts to better understand the interactive effects of temperature and nutrients in regulating algal growth.

Although the workshop organizers and participants were primarily limnologists, we welcome the participation of other disciplines, including hydrology, atmospheric science, and climatology. We aim to continue building a comprehensive network of data that increases the probability of understanding lake processes and anticipating system changes to mountain lakes worldwide.

Readers who would like to help document the extent of algal growth in mountain lakes can download WATR2016, a citizen science app for iPhones available from iTunes. Register for the program through the app, and follow the instructions for taking and uploading pictures of mountain lake algal growth. All data will be compiled in the Alpine Algal Bloom Monitoring database on CitSci.org.

—Isabella Oleksy (email: isabella.oleksy@colostate.edu), Natural Resource Ecology Laboratory, Colorado State University, Fort Collins; and Joshua Culpepper, Department of Biology, University of Nevada, Reno

Lightning Strikes May Leave Traces Like Those of Meteorites

Wed, 09/20/2017 - 12:19

When a meteorite strikes a rock, it triggers rapid changes in pressure and temperature that alter the rock’s structure. Traditionally, scientists have treated microscopic planar deformation features in quartz crystals as a telltale sign of past meteorite impacts. However, recent research has demonstrated that lightning strikes can also leave similar signatures of shock.

In a new paper, Chen et al. mathematically simulate a lightning strike on a granite surface. They demonstrate that the resulting changes in the rock are a fingerprint of the energy and intensity levels of the lightning that caused them. More specifically, they demonstrate that shock features in quartz are created by the intense shock wave associated with the lightning strike. The results suggest that shocked quartz should not be interpreted as certain evidence of past meteorite impacts.

Scientists have known for decades that lightning can rapidly heat rock to over 2,000 kelvins near the strike point. Organic material on the surface burns off, and part of the rock itself melts almost instantaneously, later cooling to form a glassy surface layer called a fulgurite. It wasn’t until 2015 that researchers discovered shocked quartz in the granite substrate of a fulgurite.

In the new study, the research team developed a mathematical model to estimate the pressure exerted by a lightning strike on a granite surface, as well as the rapid heating and cooling of the rock. The model incorporated physical characteristics of lightning and granite, such as the typical temperature of lightning, the melting temperature of granite, and the temperature at which organic material on the granite surface would likely burn.

The simulations showed that a lightning strike can impart more than 7 gigapascals of pressure on the granite surface, enough to trigger the formation of shocked quartz. The strike creates a roughly circular layer of fulgurite about 18 centimeters across within a slightly wider region of burned organic material about 22 centimeters across.

These results are consistent with observations of fulgurite samples collected from field sites. For example, fulgurites collected from Mount Mottarone in Italy have regions of burned organic matter that are of similar size, roughly 20 centimeters across. Fulgurites from Les Pradals in France feature shocked quartz in a surficial layer less than 3 micrometers thick, consistent with the pressure calculations in the lightning strike model.

With this discovery, additional evidence will likely now be needed to convince impact geologists that shocked quartz indicates a past meteorite impact. Furthermore, these findings could help explain why evidence of shocked quartz suggests higher than expected impact rates in some regions. (Geophysical Research Letters, https://doi.org/10.1002/2017GL073843, 2017)

—Sarah Stanley, Freelance Writer

Methane Leaks May Make Natural Gas Worse Than Coal for Climate

Wed, 09/20/2017 - 12:16

Because burning natural gas produces about half as much carbon dioxide (CO2) per unit of energy as coal, many consider it an important transition fuel to carbon-neutral, climate-friendly sources of energy. However, the colorless, odorless gas has a downside: It contains more than 80% methane (CH4), a potent greenhouse gas that absorbs and retains 86 times more energy than CO2 over a 20-year period.

If too much methane leaks out as natural gas is drilled and pumped from the ground, it could negate any climate benefits derived from switching fuels.

Now, a new survey of methane leaks in a major natural gas- and coal-producing region in the United States suggests that without more stringent regulations to capture CH4 emissions, natural gas extraction could be worse for the climate than coal within 20 years.

Natural gas production in the United States has skyrocketed over the past decade, thanks largely to improvements to hydraulic fracturing and horizontal drilling—extraction techniques in which water, sand, and chemicals are pumped into the ground to break apart gas-holding rock formations. Previous studies have found that this nationwide production system isn’t leaking enough methane to make coal a better choice, but discrepancies between the methods used in different studies and the wide range of results have made the findings controversial.

In their new analysis, Ren et al. used a Cessna 402B research aircraft to quantify methane emissions from a 4,235-square-kilometer region of the Marcellus Formation in Pennsylvania and West Virginia, which accounts for about 40% of total U.S. shale gas production. Over the course of six flights in the summers of 2015 and 2016, they took air samples upwind and downwind of the natural gas operations, pulling air through a tube installed at the nose of the plane and analyzing its chemical composition. Then, they compared the levels of methane to background measurements collected in the region, taking into account other known sources of methane pollution such as cattle farms, coal mines, and landfills.

Previous studies have established a threshold at which natural gas becomes worse for global warming than coal: a leak rate of 2.4% of total natural gas production per year over 20 years. The leak rate from natural gas extraction operations in the sampled area of the Marcellus Formation was around 3.9% of the total production, exceeding that threshold, the team found. To ensure that natural gas extraction and combustion are a net benefit to the climate compared with coal, more stringent regulations for capturing those fugitive emissions are necessary, they conclude. (Journal of Geophysical Research: Atmospheres, https://doi.org/10.1002/2016JD026070, 2017)
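The leak rate threshold arithmetic can be sketched in a few lines. The emission factors below are rough, assumed values chosen for illustration, not the figures used by Ren et al.: burning gas emits less CO2 per unit of energy than coal, but every leaked kilogram of methane counts 86-fold over a 20-year horizon.

```python
# Back-of-the-envelope check of the leak rate threshold idea.
# All emission factors are assumed ballpark values, not those of Ren et al.
GWP20 = 86             # 20-year warming potential of CH4, relative to CO2 by mass
COAL_CO2 = 0.090       # kg CO2 emitted per MJ of energy from coal (assumed)
GAS_CO2 = 0.050        # kg CO2 emitted per MJ of energy from natural gas (assumed)
CH4_PER_MJ = 1 / 55.5  # kg of CH4 burned per MJ (energy density ~55.5 MJ/kg)

def gas_footprint(leak_rate):
    """CO2-equivalent (kg/MJ) for gas, counting leaked CH4 at its 20-year GWP."""
    return GAS_CO2 + leak_rate * CH4_PER_MJ * GWP20

# Leak rate at which gas and coal break even on a 20-year horizon.
break_even = (COAL_CO2 - GAS_CO2) / (CH4_PER_MJ * GWP20)
print(f"break-even leak rate ≈ {break_even:.1%}")
print(f"coal: {COAL_CO2:.3f} kg CO2e/MJ")
print(f"gas at 3.9% leakage: {gas_footprint(0.039):.3f} kg CO2e/MJ")
```

With these assumed factors, the break-even leak rate lands in the same few-percent range as the published 2.4% threshold, and a 3.9% leak rate puts gas well above coal's 20-year footprint.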

—Emily Underwood, Freelance Writer

Cassini’s Legacy in Print

Wed, 09/20/2017 - 12:15

After two decades of incredible exploration, the Cassini mission to Saturn is now over. The Cassini spacecraft beamed back images and vast amounts of data, first from its flybys of Earth, Venus, and Jupiter, then from 13 years spent circling the ringed planet and its moons, along with insights from landing the Huygens probe on the surface of Titan, Saturn’s largest moon.

According to NASA, 3,948 science papers have been published as a result of the mission. A search for papers in AGU journals with Cassini mentioned in the abstract, published since the mission started in 1997, generated more than 750 results across six journals. We are very proud that AGU has played a significant role in publishing some of the important findings from the mission. We invited some of the editors to reflect on papers published in their journals and how they have contributed to our scientific understanding. A special collection of all the papers highlighted below can be found here.

Mike Liemohn, Editor-in-Chief, Journal of Geophysical Research: Space Physics

An ultraviolet image captures an active aurora dancing around Saturn’s north pole. Credit: NASA/JPL-Caltech/University of Colorado/Central Arizona College and NASA/ESA/University of Leicester and NASA/JPL-Caltech/University of Arizona/Lancaster University

There are over 400 papers in JGR: Space Physics with Cassini mentioned in the abstract, so it is a rather difficult task to pick out just a few to highlight. Since one measure of impact is citations, here are a few reflections on the four most-cited Cassini papers.

We knew so little about Saturn’s magnetosphere before Cassini that the initial papers published soon after orbit insertion were all “discovery studies” of never-before-seen phenomena. Only one of those initial papers makes the list, though; the other three came a few years later, when the first statistical analyses could be conducted.

In pole position, the most-cited paper is Schippers et al. [2008], who presented an analysis of the electrons seen by several instruments. A complementary study of the ion populations in Saturn’s magnetosphere, second on the most-cited list, was conducted by Thomsen et al. [2010]. In addition to survey results describing the charged particle environment around Saturn, these studies also include detailed methodologies for handling tricky instrumentation obstacles: sensor intercalibration in the first case and moment calculations in the second.

The third most-cited paper, also related to the plasma environment of Saturn, is Cowley et al. [2005], who examined Cassini measurements in the magnetosphere in relation to Hubble Space Telescope observations of Saturn’s aurora. It is an elegant explanation of spiral auroral structures being related to magnetic reconnection behind the planet in the magnetotail and the subsequent convection due to Saturn’s fast rotation. It also demonstrated that solar wind dynamic pressure controls this magnetotail reconnection, rather than the direction of the interplanetary magnetic field, as is the case at Earth.

The fourth most-cited Cassini paper in the journal is about the famous question of the planetary rotational period [Kurth et al., 2008]. Much of the sphere that we call Saturn is a fluid, not a solid, and the “surface” that we see is really just the top of a particular cloud layer. This allows for something called differential rotation, in which different latitudes rotate at different rates, making it very difficult to determine a definitive and single planetary rotation rate.

Plasma swirling around Saturn is correlated to bursts of radio waves emanating from the planet. The image on the left was obtained by the ion and neutral camera, part of the magnetospheric imaging instrument, and the data on the right from Cassini’s radio and plasma wave subsystem. Credit: NASA/JPL/JHUAPL/University of Iowa

One manifestation of this is in radio emissions from the auroral regions of Saturn, which exhibit a large-scale amplitude modulation that is very close to the nominal rotation period of 10.7 hours. This “daily” wax and wane of the radio emission slowly varies with time, and even more mysteriously, has different periods in the northern and southern auroral regions. This paper defines a Saturn longitude system timed with the radio emission modulations.

Mark Moldwin, Editor-in-Chief, Reviews of Geophysics

As mentioned above, one of the most interesting aspects of Saturn’s magnetosphere that the Cassini mission observed is that there are modulations in charged particles, magnetic fields, energetic neutral atoms, radio emissions, motions of the plasma sheet and magnetopause, and even the rings, with periodicities near the planet’s rotation period of about 10.7 hours. However, these periodicities change by about 1 percent over timescales of a year or longer and differ between the northern and southern hemispheres. The highest-cited paper in Reviews of Geophysics related to Cassini is Carbary and Mitchell [2013], who reviewed the observations and the many models that have struggled to explain these puzzling periodicities.

Elizabeth P. Turtle, Associate Editor, Journal of Geophysical Research: Planets

In over 13 years of exploring the Saturnian system, the Cassini-Huygens mission has completely revolutionized our understanding of Saturn, its magnetosphere, rings, and moons large and small.  Over 70 manuscripts published in JGR-Planets to date illustrate the remarkable breadth and depth of the scientific discoveries that Cassini-Huygens has made possible, covering topics from the winds and vortices of giant Saturn itself [Vasavada et al. 2006] to diminutive Enceladus’ powerful cryovolcanic eruptions [Howett et al. 2011] and interior dynamics [Barr 2008].

Developments in our understanding of Titan have had particular impact. In fact, eighty percent of the twenty most-cited JGR-Planets papers related to Cassini-Huygens present Titan results. Some of these wide-ranging papers address the behavior of the atmosphere [Teanby et al. 2008, Yelle et al. 2008, Mueller-Wodarg et al. 2008, Cui et al. 2012].

Radar images capture Ligeia Mare, the second largest known body of liquid on Titan, one of the many seas and lakes in the moon’s north polar region. Credit: NASA/JPL-Caltech/ASI/Cornell

Others examine the tremendous level of complexity in the chemistry occurring on Titan revealed by in situ compositional measurements by Huygens [Niemann et al. 2010] and Cassini [Mandt et al. 2012, Westlake et al. 2012], combined with modeling [Vuitton et al. 2008, Horst et al. 2008].

They also trace the long-awaited unveiling of Titan’s surface, documenting the diversity of its geological structures and processes [Barnes et al. 2007, Le Mouelic et al. 2008, Mitri et al. 2010], materials [Clark et al. 2010], methane lakes and seas [Hayes et al. 2010], and potential subsurface exchange via cryovolcanism [Lopes et al. 2013].

Although the Cassini-Huygens mission is at an end, the wealth of data it has gathered will continue to fuel research on the Saturnian system and broader comparative planetology studies for decades to come.

—Jenny Lunn, Director of Publications, American Geophysical Union; email: jlunn@agu.org; Mike Liemohn and Mark Moldwin, Department of Climate and Space Sciences and Engineering, University of Michigan; Elizabeth P. Turtle, Johns Hopkins Applied Physics Laboratory

Pluto’s Features Receive First Official Names

Wed, 09/20/2017 - 12:14

A little more than 2 years after New Horizons’ intrepid flyby of the Pluto system, a committee that speaks for the field of astronomy has approved authoritative names for 14 features that the spacecraft’s images showed on Pluto’s surface.

“The approved designations honor many people and space missions who paved the way for the historic exploration of Pluto and the Kuiper Belt, the farthest worlds ever explored,” said New Horizons’ principal investigator Alan Stern in a statement of support to the International Astronomical Union (IAU) for the new names. The official monikers apply to specific craters, mountains, and plains, including the now renowned heart-shaped region of Pluto that New Horizons discovered in 2015.

In February 2017, the IAU’s Working Group for Planetary System Nomenclature (WGPSN) approved naming themes for features on Pluto and its largest moon, Charon. Later, the committee began soliciting suggestions for names of specific topographies.

On 7 September, the IAU announced the names of the 14 features, which are the first to be officially designated. “These names highlight the importance of pushing to the frontiers of discovery,” WGPSN chair Rita Schulz of the European Space Agency said in the announcement. The IAU stated that it expects to assign names to more features in the coming year.

Discoverers, Explorers, Pioneers, and Underworld Mythos

People who played a part in Pluto’s discovery are commemorated with prominent features on its surface. Tombaugh Regio, Pluto’s famous heart-shaped region, honors Clyde Tombaugh (1906–1997), who discovered the now dwarf planet in 1930. Burney crater recognizes Venetia Burney (1918–2009), who suggested “Pluto” as a name for Tombaugh’s newly discovered planet when she was 11 years old, and Elliott crater acknowledges astronomer James Elliott (1943–2011), who made the first detection of Pluto’s atmosphere.

A map of Pluto’s surface near the heart, with the new official names overlaid. Click image for larger version. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/SwRI/Ross Beyer

Mountain ranges received their names from human pioneers. The names of New Zealander Sir Edmund Hillary and Indian/Nepali Sherpa Tenzing Norgay, the first two people to reach the top of Mount Everest and return safely, now adorn Hillary Montes and Tenzing Montes. A third mountain range, Al-Idrisi Montes, is named for noted medieval Arab mapmaker and geographer Ash-Sharīf al-Idrīsī (1100–1165/66).

Even the heart’s curiously smooth left half, known during the 2015 close approach to Pluto as Sputnik Planum, has received a for-the-books name adjustment, to Sputnik Planitia. This new name and those of Hayabusa Terra and Voyager Terra each recognize spacecraft that made significant firsts in our history of solar system exploration. Soviet-launched Sputnik was the first artificial satellite; NASA’s Voyager 1 and Voyager 2 were the first to visit Jupiter, Saturn, Uranus, Neptune, and the edge of the solar system; and the Japanese spacecraft Hayabusa was the first to return samples from an asteroid.

A close-up image of the Tenzing Montes mountain range (formerly known as Norgay Montes) taken by New Horizons’ Long Range Reconnaissance Imager on 14 July 2015. Click image for larger version. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/SwRI

Returning to a darker naming scheme that was the original source of names for Pluto, Charon, and their smaller moons Styx, Nix, Kerberos, and Hydra, WGPSN chose names of underworld places and creatures from cultures around the globe for five more of Pluto’s features. Djanggawul Fossae, a series of thin and narrow valleys, invokes three ancestral beings in Indigenous Australian mythology who traveled to the island of the dead. Adlivun Cavus, after the Inuit underworld of Adlivun, marks a deep depression near the bottom of Pluto’s heart. Sleipnir Fossa, a valley off the left edge of the heart, recalls the eight-legged horse that the Norse god Odin rode into the underworld. Virgil Fossae, a thin scar to the right of Elliott crater, honors the Roman poet Virgil, who was also Dante’s fictional guide through hell in the Divine Comedy. Last, Tartarus Dorsa, a ragged terrain next to Tombaugh Regio, revisits the deepest, darkest pit in the Greek underworld.

What’s in a Name?

Past IAU nomenclature initiatives have included naming minor planets in the outer solar system, a large batch of stars, and famous exoplanets. The IAU’s 2015 NameExoWorlds campaign, which assigned names to 31 exoplanets, received mixed reviews from exoplanet astronomers. These scientists, accustomed to sometimes clunky designations like 51 Pegasi b and PSR 1257+12 bcd, have balked at using the IAU-approved names.

By contrast, astronomers appear to have embraced this newest IAU naming initiative. Pluto scientists have already written more than a dozen research articles in 2017 with the updated moniker “Sputnik Planitia,” compared to only a single mention of 51 Pegasi b’s new name, “Dimidium,” out of more than 100 related papers since the change in late 2015. It may help that all but two of the newly designated names for Pluto’s features, those of Adlivun Cavus and Tenzing Montes, had already populated the unofficial map of Pluto.

—Kimberly M. S. Cartier (@AstroKimCartier), News Writing and Production Intern

Quiet Volcanic Activity Changes Speed of Ambient Seismic Waves

Tue, 09/19/2017 - 12:30

Volcanic eruptions, earthquakes, landslides, and other events can trigger seismic waves that travel through Earth. Smaller triggers, such as road traffic or rivers, can produce quieter, ambient waves. No matter the wave’s source, changes in the velocity of seismic waves can reveal key mechanical characteristics of the shallow geological structures they pass through.

New research by Takano et al. explores changes in ambient seismic velocity at a nonerupting volcano. Past research has focused on velocity changes caused by large earthquakes or volcanic eruptions, but the sparseness of such events makes it difficult to monitor shallow structures continuously. The new study shows that other, more reliable sources of seismic velocity change can prove useful.

From 2012 through 2015, the research team monitored ambient seismic velocity changes at Izu Oshima, a volcanic island off the coast of Japan that has not erupted since 1990. They gathered continuous measurements from four sensor stations deployed on the island and mathematically cross-correlated the data to determine daily changes in the velocity of ambient seismic waves of different frequencies.

Without any major earthquakes or eruptions during the study period, the scientists were able to investigate how relatively gentle deformation of the ground due to relatively quiet volcanic activity affected seismic wave velocity. Such volcano deformation is known to change seismic velocity by generating internal pressure, or stress, within rock, as well as by changing the rock’s shape (strain).

The researchers found a strong correlation between ambient seismic velocity changes and observed strain caused by volcano deformation. This suggests that volcanic pressure is the main driver of such changes at Izu Oshima, as opposed to pressure caused by heavy seasonal precipitation or ocean tides, which they also analyzed.

Further analysis revealed that most of the observed changes in seismic wave velocity occurred in the upper 1 kilometer of the ground beneath the volcano. By incorporating the results of previous studies, the researchers also showed that seismic wave velocity is more sensitive to stress at shallower depths.
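A common way to extract a velocity change from such day-to-day comparisons is the "stretching" technique: a uniform velocity change dilates or compresses the correlation waveform in time, so one searches for the stretch factor that best matches the current trace to a long-term reference. The sketch below is our illustration of that idea on synthetic data; it is not the authors' actual processing code, and all names and parameters are ours.

```python
import math

def interp(x, xp, fp):
    """Linear interpolation of samples (xp, fp) at point x; xp must be ascending."""
    if x <= xp[0]:
        return fp[0]
    if x >= xp[-1]:
        return fp[-1]
    lo, hi = 0, len(xp) - 1
    while hi - lo > 1:  # binary search for the bracketing interval
        mid = (lo + hi) // 2
        if xp[mid] <= x:
            lo = mid
        else:
            hi = mid
    w = (x - xp[lo]) / (xp[hi] - xp[lo])
    return fp[lo] * (1 - w) + fp[hi] * w

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def stretching_dvv(reference, current, t, trials=None):
    """Grid search over stretch factors; the best-fitting factor estimates dv/v."""
    trials = trials or [-0.05 + 0.0005 * i for i in range(201)]
    best_eps, best_cc = 0.0, -2.0
    for eps in trials:
        # Resample the current trace onto a stretched time axis.
        warped_axis = [ti * (1.0 + eps) for ti in t]
        stretched = [interp(ti, warped_axis, current) for ti in t]
        cc = pearson(reference, stretched)
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc

# Synthetic demo: a decaying "coda" waveform and the same waveform dilated by
# 1% in time, as a ~1% velocity decrease would do (slower waves arrive later).
t = [10.0 * i / 999 for i in range(1000)]
reference = [math.exp(-0.3 * ti) * math.sin(2 * math.pi * 3 * ti) for ti in t]
current = [math.exp(-0.3 * ti / 1.01) * math.sin(2 * math.pi * 3 * ti / 1.01) for ti in t]

dvv, cc = stretching_dvv(reference, current, t)
print(f"estimated dv/v ≈ {dvv:+.2%} (waveform correlation {cc:.3f})")
```

The recovered stretch factor comes out near the imposed −1%, which is the essence of tracking daily velocity changes from noise correlations.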

These findings could enable improved understanding of Izu Oshima’s underlying geology, and similar techniques could aid investigation of subsurface geological structures in other locations. (Journal of Geophysical Research: Solid Earth, https://doi.org/10.1002/2017JB014340, 2017)

—Sarah Stanley, Freelance Writer

Putting Satellite Maps of Surface Water to Practical Use

Tue, 09/19/2017 - 12:27

In 2021, an international consortium of space agencies plans to launch the Surface Water and Ocean Topography (SWOT) research satellite mission, which will make the first global survey of Earth’s surface water to serve the hydrology and oceanography communities.

NASA and the French space agency Centre National d’Études Spatiales (CNES) are leading the mission, with participation from the Canadian and U.K. space agencies.

Last April, SWOT leaders invited more than 50 participants, representing stakeholders who deal with water issues in a decision-making capacity, to attend a workshop exploring how best to maximize SWOT’s user readiness after its planned 2021 launch.

Attendees included representatives from the U.S. Army Corps of Engineers, U.S. Bureau of Reclamation, U.S. Geological Survey, U.S. Navy, Radiance Technologies (an engineering and technical support contracting company), Mercator Ocean (a not-for-profit company), satellite data services company Collecte Localisation Satellites, Environmental Systems Research Institute, World Wildlife Fund of the United States, SERVIR (a joint venture between NASA and the U.S. Agency for International Development), Indian Institute of Technology, U.S. Naval Research Laboratory, and FM Global (a commercial and industrial property insurance company).

Workshop participants made key observations and recommendations for the SWOT mission:

- There is a need for a near-real-time/short-time-critical product from the SWOT mission that could provide data anywhere from less than 1 day to 5 days after observation. Although several societal applications of these data are not latency critical (i.e., the data are not needed during or immediately after an event), participants preferred a latency period of less than 2 days.
- SWOT data could help with flood mapping and modeling, such as for river and coastal flooding and for storm surges. The data could also improve flood inundation and hydrodynamic models where data latency is not an issue.
- SWOT data would enable reservoir level and water storage measurements: a key product for water security.
- Using SWOT data to develop better global river models would also help researchers understand water resources.
- Although many aspects of water management at seasonal or annual planning scales are not latency sensitive, availability of near-real-time products could spur innovative water management efforts for many large stakeholder agencies.
- SWOT data would be useful for marine safety, transport, and pollution management in coastal environments and river or coastal interfaces. Sea ice forecast models, ocean acoustics programs, and derived bathymetry studies could also use this information.
- Because today’s data-processing tools (e.g., NetCDF, GeoTIFF, vector, and gridded raster formats) are so versatile, a wide range of data formats is acceptable.
- Tutorials involving example data sets and real-world case studies are needed so that the application community can understand how SWOT data fit into their scheme of business. Such education and training should be aimed at users with various levels of expertise and should be available in multiple languages.

SWOT is a research mission, but the workshop deliberations also provided specific details, from the stakeholders’ viewpoints, on how the availability of high-frequency, high-resolution elevation maps of surface water bodies and oceans could help solve numerous societally relevant challenges around the globe.

The SWOT mission applications workshop was planned by the SWOT Application Working Group, CNES, and NASA’s Applied Sciences Program Water Resources program. For more information on the SWOT mission and its applications, see NASA’s slide show and the full workshop report.

—Faisal Hossain (email: fhossain@uw.edu), University of Washington, Seattle; Alice Andral, Centre National d’Études Spatiales, Paris, France; and Margaret Srinivasan, Jet Propulsion Laboratory, Pasadena, Calif.

Caribbean Sediment Traced to 1755 Portuguese Quake and Tsunami

Tue, 09/19/2017 - 12:25

Portuguese citizens were celebrating All Saints Day on 1 November 1755 when a massive earthquake—now estimated to be between magnitude 8.0 and 9.0—struck off the coast of the country. Buildings collapsed in the intense shaking, and tsunami waves radiated outward across the Atlantic Ocean.

Written records report large waves washing ashore in Spain, Portugal, Morocco, England, the Azores, and Newfoundland. But smoking gun evidence of the tsunami—actual deposits of sand or other materials, which allow scientists to estimate tsunami intensity—has proven to be elusive in the New World.

That’s all changed now.

Alerted by a phone call from an archeologist saying, “There’s probably something of interest to you in [the Martinique city of] Fort-de-France,” marine geophysicist Jean Roger and his colleagues have now shown that layers of white and black sand in an archaeological dig in the historical city center are relics of the powerful 18th century cataclysm.

Because Fort-de-France is located in a very sheltered area of Martinique, the team ruled out storms as a source of the unusually thick sand deposits, roughly 8 centimeters in total. “The only possible way to inundate this city is from a tsunami,” Roger told Eos in a recent interview.

An Unusual Stripe

Roger, an independent researcher based in Mayotte, a grouping of islands between Mozambique and Madagascar, had long sought Caribbean deposits from the 1755 tsunami. Before he received that phone call, which came in 2013, he and his collaborators had already fruitlessly scoured the region for 2 years.

Then, at the ongoing excavation of 17th and 18th century buildings in Fort-de-France, the researchers were shown an unusual stripe that ran throughout the excavated pits: a roughly 1-centimeter-thick white sandy layer rich in shells, capped by a 6- to 9-centimeter-thick layer of coarse black sand.

On the basis of the ages of ceramic fragments found in adjoining soil layers, Roger and his colleagues estimated that the sandy stripe had been laid down between 1726 and 1783. The researchers then compared this date range with historical tsunami catalogs and meteorological reports of storm surges and heavy rains. The only match was the 1755 earthquake in Portugal, whose tsunami waves would have traveled roughly 5,700 kilometers before washing ashore on the tiny Caribbean island.
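That 5,700-kilometer crossing is consistent with basic tsunami physics: in the open ocean, a tsunami travels at roughly the shallow-water wave speed, the square root of gravity times water depth. A minimal back-of-the-envelope sketch (the 4,000-meter mean Atlantic depth is an illustrative assumption, not a figure from the study):

```python
import math

def tsunami_travel_hours(distance_km, mean_depth_m, g=9.81):
    """Estimate open-ocean tsunami travel time from the
    shallow-water wave speed, sqrt(g * depth)."""
    speed_m_per_s = math.sqrt(g * mean_depth_m)  # ~200 m/s for a 4 km deep basin
    return distance_km * 1000 / speed_m_per_s / 3600

# Lisbon to Martinique: roughly 5,700 km across the Atlantic.
print(round(tsunami_travel_hours(5700, 4000), 1))  # about 8 hours
```

A transatlantic arrival time on the order of 8 hours agrees with the travel-time contours shown in the figure below.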

Travel times in hours of the tsunami from the 1755 Lisbon earthquake. Credit: National Geophysical Data Center/World Data Service, 2017, doi:10.7289/V5PN93H7

A Tiny Beach

One mystery bothered the researchers: Where did the coarse black sand come from? “We investigated all of the beaches around Martinique, but we did not find any similar sand,” said Roger. A large coral patch in Fort-de-France Bay generates only white sand.

The answer came unexpectedly. One day, while Roger and his colleagues walked along the mouth of the Madame River, a large river that bisects Fort-de-France, they stumbled upon a tiny beach, barely 10 meters wide. Several fishermen’s huts were clustered on the beach, but the researchers’ eyes were immediately drawn to the ground. “The sand was black,” said Roger. “When I ran that sand through my hand, I knew it was the same as the sand in the archaeological site.” Laboratory analyses later confirmed the match and revealed that the black sand came from one of Martinique’s volcanoes.

“We had the sand, we knew where it came from, but there was still a problem—the size of the tsunami deposit,” said Roger. The 1755 tsunami was reported to be only 80–100 centimeters high in Martinique, far too small, on the basis of tsunami modeling, to have produced the roughly 8-centimeter-thick sandy stripe that permeated the Fort-de-France site. Instead, the model predicted that such a tsunami should only have produced a layer about 1 centimeter thick, roughly the thickness of the white sandy layer found in Fort-de-France.

A Channeled River

In a new paper submitted to Natural Hazards and Earth System Sciences and posted online by the journal on 7 August while it undergoes peer review, Roger and his colleagues may have found a solution to this last puzzle.

Historical records indicate that walls encased the sides of the Madame River in 1755, the researchers note. The deposit’s thickness might be explained if the tsunami had headed up the Madame River as a turbulent flow, known as a bore, which is highly effective at transporting sediments, Roger said, especially if the river’s enclosure had concentrated the energy of that flow. In this new scenario, tsunami waves propagating directly from the sea into the city transported the thin layer of white sand rich in shells, whereas the much thicker black layer rode in with the bore.

Valérie Clouard, a geophysicist at the Institut de Physique du Globe de Paris and a team member, notes the serendipity of the bore-transported material being a different color than the sand pushed by the tsunami waves. “If the two sands had been the same color, we could have thought that everything was deposited only by the tsunami waves,” she said, which would have resulted in an overestimate of the tsunami’s intensity.

This study “highlights the benefits of collaborative research in geology and archaeology,” said Paula Dunbar, a physical scientist at the Cooperative Institute for Research in Environmental Sciences at the University of Colorado in Boulder who was not involved in the research. Its methodology can be applied in other geographic locations to improve historical tsunami catalogs, she added.

Because historical records of tsunamis are often used to establish tsunami hazard maps and evacuation plans, it’s important to distinguish between direct tsunami inundation and the effects of tsunami bores in rivers, said Roger. This work also highlights the danger of tsunami bores. “Many large cities are located close to rivers,” he noted.

—Katherine Kornei (email: hobbies4kk@gmail.com; @katherinekornei), Freelance Science Journalist

Turning up the Heat on Organic Matter to Track Carbon

Tue, 09/19/2017 - 12:23

When organisms die, they leave behind a wealth of information about carbon cycling and climate. The organic matter in soils, sediments, and water may come from decomposed land plants, dead plankton (tiny marine animals and plants), or burned wood or fossil fuels, and it offers clues about Earth’s past and present environments. These clues are critical to understanding Earth’s current and future responses to a changing climate.

However, the myriad forms of organic matter make it difficult to access the information stored within it. Analyzing the entire organic carbon pool blends information from a wide variety of sources into one often misleading signal. Zeroing in on specific organic compounds provides detailed information on those compounds but misses potentially important context. Promisingly, a new technique bridges these approaches and offers researchers both the broad and narrow analyses they formerly had to choose between.

The new technique, called ramped pyrolysis/oxidation (ramped PyrOx or RPO), takes advantage of the way different kinds of organic matter react to heat. For example, when younger organic matter from plankton is heated using RPO, it generally reacts at lower temperatures than much older organic matter from eroded bedrock. By combining RPO results with analyses of radiocarbon (a radioactive isotope of carbon with a half-life of 5,730 years) and stable carbon isotopes, scientists can identify the sources and ages of organic matter in the environment.
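Radiocarbon's 5,730-year half-life is what makes those age assignments possible: the fraction of radiocarbon remaining in a sample decreases predictably with time. A minimal sketch of the underlying decay relationship (illustrative only; real radiocarbon dating also applies calibration curves and reservoir corrections):

```python
import math

HALF_LIFE_YR = 5730  # half-life of radiocarbon (14C), in years

def fraction_remaining(age_yr):
    """Fraction of the original 14C left after age_yr years of decay."""
    return 0.5 ** (age_yr / HALF_LIFE_YR)

def age_from_fraction(frac):
    """Invert the decay law to recover an age from a measured 14C fraction."""
    return -HALF_LIFE_YR * math.log(frac) / math.log(2)

print(fraction_remaining(5730))        # 0.5 remains after one half-life
print(round(age_from_fraction(0.25)))  # 11460, i.e., two half-lives
```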

In September 2016, the National Ocean Sciences Accelerator Mass Spectrometry Facility hosted a 2-day workshop on the thermal analysis of natural organic matter. A diverse group of more than 30 scientists from the United States, the United Kingdom, Switzerland, and China gathered to discuss the scientific potential of combining RPO and radiocarbon analyses. The meeting included panel discussions on the history and applications of RPO and breakout groups to discuss future technical and scientific priorities for the method.

Marine technicians from the U.S. Antarctic Program help scientists collect sediments in the Gerlache Strait off the Antarctic Peninsula from the R/V Laurence M. Gould in October 2012. RPO analysis has helped improve radiocarbon dating in sediments like those collected in the Antarctic. Credit: Brad E. Rosenheim, University of South Florida

In one session, meeting participants considered the best ways to date sediments using RPO. In environments like Antarctica that seldom preserve foraminifera—tiny organisms whose shells are the gold standard for dating ocean sediments—researchers have relied on dating bulk organic matter. This approach is not ideal, however, because Antarctic sediments contain both marine plankton and land-derived organic matter of vastly different ages. By combining RPO and radiocarbon dating, scientists can link the most-heat-reactive organic matter to the youngest carbon dates and thereby obtain more accurate sediment ages. These ages correspond to the time the surface water plankton in the sediments were alive, making them equivalent to foraminifera-based ages of sedimentation.
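The logic of that dating approach is simple to express in code: of the dated thermal fractions, the one that reacts at the lowest temperature is taken as closest to the true deposition age. A sketch with entirely hypothetical fraction temperatures and ages:

```python
# Hypothetical RPO fractions: (reaction temperature in degrees C, radiocarbon age in years).
# Lower-temperature fractions are the most heat reactive and, in Antarctic
# sediments, tend to carry the youngest, plankton-derived carbon.
fractions = [(250, 4200), (400, 9800), (650, 21500)]

# Sediment age estimate: the age of the most-heat-reactive (lowest temperature) fraction.
sediment_age = min(fractions)[1]
print(sediment_age)  # 4200
```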

Other sessions focused on applications that do not use radiocarbon composition for dating. For instance, RPO alone can be used to track organic carbon cycles in rivers and their watersheds, in soils, and in ocean water. Linking RPO results from riverine and coastal marine sediments will enhance our understanding of what material survives transport from the terrestrial environment to the ocean.

Breakout groups discussed the need for better interpretations of RPO results when combined with radiocarbon dating; the potential to use RPO to study elements other than carbon, such as nitrogen and sulfur; and several other topics.

This workshop, sponsored by the National Science Foundation, was the first to focus on the analysis of natural organic matter using thermal techniques coupled with radiocarbon dating and stable isotope measurements. It provided concrete direction for the new technique and illuminated an exciting path toward a better understanding of biogeochemical cycles in soils, lakes, rivers, and oceans. For more information about this meeting, see the RPO radiocarbon white paper on the Woods Hole Oceanographic Institution website.

—Ann McNichol (email: amcnichol@whoi.edu; @ann_mcnichol), Woods Hole Oceanographic Institution, Mass.; Brad Rosenheim (@Rosenheim_group), University of South Florida, Tampa; and Valier Galy (@valiergaly), Woods Hole Oceanographic Institution, Mass.

AGU Revises Its Integrity and Ethics Policy

Mon, 09/18/2017 - 13:18

The American Geophysical Union (AGU) Board of Directors has approved changes to the AGU Scientific Integrity and Professional Ethics Policy. The revisions adopted on 14 September were made in response to a June 2016 decision by AGU leadership, under then AGU President Margaret Leinen, to form a task force to review the organization’s ethics policy and practices in the wake of high-profile cases alleging sexual harassment in science.

The updated policy is intended to address ongoing issues within the Earth and space science community that have profound impact in the workplace and on scientists’ individual lives and careers. It is the result of the 18-member task force’s efforts over the past year.

New Standards and Expectations

Most notably, the updated policy now identifies harassment, discrimination, and bullying in scientific endeavors as scientific misconduct. The previous ethics policy was silent on code of conduct expectations for AGU members related to the issue of harassment.

Further, the new policy extends to all AGU members, as well as staff, volunteers, contractors, and nonmembers who participate in AGU programs. The policy is aspirational in setting standards for scientific integrity and professional ethics in the Earth and space science community, but it also establishes mechanisms that allow for the imposition of sanctions to deal with breaches of the AGU ethics policy. As the updated policy states, “when an allegation of misconduct involves activity that is against the U.S. code of law, or code of law in other respective regions, AGU will work with all appropriate authorities as needed and required to resolve the allegation.”

Key provisions of this updated policy include

- AGU leadership’s affirmation of the international principle that the free, open, and responsible practice of science is fundamental to scientific advancement and human and environmental well-being
- a definition of scientific misconduct that includes conduct toward others
- definitions of discrimination, harassment (including sexual harassment), and bullying
- a higher standard for AGU volunteer leader conduct
- the extension of the AGU ethics policy to cover participants in all AGU program activities, including honors and awards and AGU governance
- self-reporting requirements for recipients of AGU awards and honors and for candidates for AGU elected positions
- ethical guidelines for publication of scientific research
- ethical guidelines for student-adviser relationships
- a clear and detailed process for reporting and investigating scientific misconduct
- a description of support mechanisms for issues that may not rise to the level of a formal ethics complaint

The initial draft of the ethics policy was open for AGU member review and comment in March 2017. During the comment period, many constructive responses were received and incorporated into the final version.

A Stepping-Stone Toward Further Efforts

AGU is not alone in its efforts to expand research misconduct to include harassment; in September 2016, Celeste Rohlfing, chief operating officer of the American Association for the Advancement of Science (AAAS), proposed that federal agencies do the same. That same month, AGU was joined by cosponsoring organizations, including AAAS, the American Chemical Society, the American Geosciences Institute, the Association for Women Geoscientists, and the Earth Science Women’s Network, in hosting a workshop entitled “Sexual Harassment in the Sciences: A Call to Respond.” The National Science Foundation funded the event.

In rolling out the new policy, AGU will provide additional educational resources to help foster the change in culture needed to eliminate harassment in the Earth and space science community.

For further information about the new ethics policy, please read the latest From the Prow blog post by AGU’s president, Eric Davidson; president-elect, Robin Bell; and immediate past president, Margaret Leinen. You can review the updated policy and associated AGU antiharassment educational resources at this website.

—Michael J. McPhaden (email: ethics@agu.org), Chair, Task Force on Scientific Ethics, AGU; Linda Gundersen, Member, Task Force on Scientific Ethics, AGU; and Billy M. Williams, Vice President, Ethics, Diversity, and Inclusion, AGU

World’s Heavy Dependence on Fossil Fuels Projected to Continue

Mon, 09/18/2017 - 13:10

Global energy consumption will increase 28% between 2015 and 2040, with fossil fuels still providing the bulk, 77%, of energy consumption in 2040, according to a new report by the U.S. Department of Energy’s Energy Information Administration (EIA). That’s a slight tick down from EIA’s 2016 report, which modeled fossil fuels as accounting for 78% of energy consumption by 2040.

The increased energy use will be matched by a 16% increase in energy-related carbon dioxide (CO2) emissions over that same time period, with annual emissions rising from 33.9 billion metric tons in 2015 to 39.3 billion metric tons in 2040, according to EIA’s report, “International Energy Outlook 2017,” released on Thursday. That energy usage, increasing from 575 quadrillion British thermal units (Btus) (607 exajoules) per year in 2015 to 663 quadrillion Btus (700 exajoules) in 2040, assumes an annual 1.7% gross domestic product growth in Organization for Economic Cooperation and Development (OECD) countries—including the United States and many European nations—and a 3.8% growth in non-OECD countries.
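The report's paired units track with the standard conversion (1 quadrillion Btu is about 1.055 exajoules), and its emissions endpoints imply the stated 16% rise, as a quick consistency check shows:

```python
EJ_PER_QUAD_BTU = 1.0551  # 1 quadrillion Btu is about 1.055 exajoules

# Energy totals quoted in the report, in quadrillion Btu.
for quads in (575, 663):
    print(quads, "quadrillion Btu ~", round(quads * EJ_PER_QUAD_BTU), "EJ")
# 575 quadrillion Btu ~ 607 EJ; 663 quadrillion Btu ~ 700 EJ

# Emissions rise from 33.9 to 39.3 billion metric tons: about 16%.
print(round((39.3 - 33.9) / 33.9 * 100))  # 16
```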

The projected 16% increase in emissions noted in the new report differs significantly from the projected 34% increase stated in EIA’s 2016 report, which had emissions rising from 32.2 billion metric tons in 2012 to 43.2 billion metric tons in 2040.

The new report, which provides long-term modeled projections of energy production and consumption, states that energy consumption could be up to 40 quadrillion Btus (42 exajoules) higher or 29 quadrillion Btus (31 exajoules) lower annually, depending on what rates of economic growth actually occur and other factors.

Reduction Expected in the Growth Rate of Emissions

The growth rate of energy-related CO2 emissions is expected to ease, with an average 0.6% increase per year between 2015 and 2040, according to the report. That’s a sizeable drop from the 1.3% annual growth rate from 1990 to 2015. EIA attributes the anticipated slowdown to increases in energy efficiency and a gradual shift from coal to natural gas and renewable energy sources.
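That 0.6% figure follows directly from the report's endpoint totals; the compound annual growth rate over the 25-year span works out as follows:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two endpoint values."""
    return (end / start) ** (1 / years) - 1

# Energy-related CO2 emissions, in billion metric tons, per the EIA report.
rate = cagr(33.9, 39.3, 2040 - 2015)
print(f"{rate * 100:.1f}% per year")  # 0.6% per year
```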

The report, which Ian Mead, EIA’s assistant administrator for energy analysis, presented at a briefing, also forecasts a 2.8% annual increase in consumption of renewable energy, including hydropower. “By 2040, generation from renewable energy sources surpasses generation from coal on a worldwide basis,” the report states.

EIA’s projections find renewables to be the most rapidly growing energy source for electricity generation. Still, the agency may be underestimating the contributions from renewables, according to some economists, including Rachel Cleetus, lead economist and climate policy manager for the Union of Concerned Scientists.

The report also forecasts a continued decrease in carbon intensity, the amount of carbon dioxide emitted per unit of economic output. EIA attributes that decrease to China’s decline in coal use and to global growth in the use of non-CO2-emitting sources of energy.

The industrial sector continues to account for the largest share of energy consumption through 2040, with a 0.7% annual increase in energy use between 2015 and 2040. However, other sectors grow faster, with energy use for the transportation and building sectors increasing 1% and 1.1% annually, respectively.

An Increase in Nuclear Power

Amid concerns about greenhouse gas emissions and energy security, nuclear power for electricity generation increases from 2.5 trillion kilowatt hours in 2015 to 3.7 trillion kilowatt hours in 2040, with much of that increased capacity in China, according to the report.

With the Paris accord on climate change having come into force in November 2016, EIA attempted to incorporate some details about country-specific plans to meet emissions targets. However, the report notes that a great deal of uncertainty remains about the full implementation of policies and how countries will meet their goals. It also mentions that the Paris accord covers more than energy-related CO2 emissions.

“Without pointing fingers at any particular countries, there are some regions where we don’t assume that [country targets are] a binding constraint,” Mead said at the briefing.

Fossil Fuels Continue to Dominate

The report projects that natural gas will be the world’s fastest-growing fossil fuel, increasing by 1.4% annually, whereas petroleum and other liquids will increase 0.7% annually and coal just 0.1%, with declining usage in China and OECD regions offset by growth in India and other non-OECD countries.

Fossil fuel dependence will continue because “a lot of it just has to do with the base that you are starting with,” Mead told Eos. “So it may take a while for some of those [other] sources to move in.”

—Randy Showstack (@RandyShowstack), Staff Writer

Envisioning and Sustaining Science at Summit Station, Greenland

Mon, 09/18/2017 - 12:32

Summit Station, in the center of the Greenland ice sheet, is a vibrant interdisciplinary research hub that has served as a crucial component of the Arctic observing system for nearly 3 decades. This station has yielded numerous scientific insights, but operating Summit and similar remote stations is resource intensive. Keeping these stations at the cutting edge of scientific research requires strategic planning and scientific vision.

Earlier this year, the National Science Foundation supported and hosted a planning session to update the scientific vision and direction for Summit Station. A multidisciplinary group of about 30 scientists, including remote participants, reviewed science activities at Summit, defined future scientific research questions and goals, and made community-based recommendations on science-enabling future scenarios and governance. A full report can be found on the GeoSummit website.

Participants emphasized that Summit’s high-latitude, high-altitude site, which is largely free of environmental pollutants and other local human influences, serves as a flagship for process-based scientific discovery. Scientific inquiries address questions ranging from the outer reaches of space to the bedrock below the Greenland ice sheet and all points in between.

The group also emphasized that Summit remains the only site on the Greenland ice sheet with a long enough history of climatologic, atmospheric, and glaciological measurements to understand and model current changes and place them in a broader context. They concurred on the importance of encouraging funding agencies to recognize how vital the long-term climate records collected at Summit are to a broad swath of the scientific research and modeling communities.

Consensus among the participants also supported the establishment of governance practices to make Summit a protected site—similar in stature to the Long Term Ecological Research (LTER) sites. These practices would include ensuring a core set of publicly available measurements, including temperature, snow accumulation, cloud properties, aerosol concentrations, and others. The measurements would directly benefit society by documenting the current Arctic climate, which can be used to provide constraints on global predictions of sea level rise. The group determined that “Summit is scientifically powerful because it leverages a suite of scientific measurements, co-located over time and at one point in space, allowing researchers to go beyond their own study and put their research into the larger climate perspective.”

Aerial view of Summit Station, Greenland, in 2010. Credit: Polar Field Services

For the future, participants outlined the station’s effects on science under various logistics and operating scenarios, which included very sparse operations of the station, year-round personnel and/or power, and fully robotic operations. They recommended logistics scenarios that maintained year-round measurements to study processes that directly relate to improving atmospheric, climate, and ice sheet models; calibrating satellites; and determining long-term trends and variability for studying key scientific processes.

Other recommendations included expanding Summit Station for future astrophysics studies. The group emphasized the importance of maintaining a clean snow and air sector under any expansion scenario. They outlined a future vision of automated and robotic measurements, and they encouraged continued efforts to collect and publicly disseminate Summit data sets broadly and efficiently.

—Lora Koenig (email: lora.koenig@colorado.edu), National Snow and Ice Data Center, University of Colorado Boulder; Bruce Vaughn, Institute of Arctic and Alpine Research, University of Colorado Boulder; and Jack Dibb, Institute for the Study of Earth, Oceans, and Space, University of New Hampshire, Durham

AGU Marks Peer Review Week

Mon, 09/18/2017 - 12:29

Last week was Peer Review Week, a chance to focus the attention of researchers, publishers, societies, and the broader public on the importance of peer review.

Peer review continues to play a key role in advancing science, through grants as well as research papers and related data, and in serving the public. Strengthening peer review is critical to both roles, and doing so requires both assessing current practices and working to improve them. The Peer Review Congress held in Chicago last week likewise focused on both aspects.

The theme of Peer Review Week 2017 was Transparency in Review. Assessing and improving our processes are both steps toward achieving transparency. Last week AGU published four Editors’ Vox posts featuring new and recent initiatives designed to assess and improve our peer review process.

The excellent uptake of ORCID iDs among AGU authors enables us to connect people more accurately with their research outputs and helps them gain recognition for their work, including reviewing.

Meanwhile, AGU requires that for publication in all AGU journals, the data associated with the research must be stored in a public repository that is accessible to all, ideally a domain repository, and this information must be explicitly stated in the Acknowledgements section of the paper. Issues related to this policy were discussed further by the Editor-in-Chief of JGR: Oceans in a post entitled “Do You Expect Me to Just Give Away My Data?”

Thanks to a grant from the Laura and John Arnold Foundation, AGU is convening a new initiative to develop a set of data management best practices for publishing in the Earth and space science community. With an emphasis on enabling Findable, Accessible, Interoperable, and Reusable (FAIR) data, the project will develop standards to connect researchers, publishers, and data repositories in the Earth and space sciences.

Another improvement to our peer review process arrives this month with the implementation of hypothes.is annotation software across all AGU journals in our editorial system, GEMS. This tool enables reviewers, editors, and authors to comment on a document and has the potential to make the peer review process more collaborative and efficient.

We also recently published a piece by AGU President Eric Davidson and National Academy of Sciences President Marcia McNutt entitled “Red/Blue and Peer Review.” They reported on an alternative to standard peer review for evaluating climate change science.

AGU remains committed to the highest quality in the manuscript submission, peer review, and publication process. These latest initiatives demonstrate a commitment to continual self-assessment and improvement, to embracing new tools, and to being transparent with all involved.

—Brooks Hanson (email: bhanson@agu.org), Senior Vice President, and Jenny Lunn, Director, Publications, American Geophysical Union

Cassini Plunges into Saturn, Ends a 20-Year Mission

Fri, 09/15/2017 - 12:34

This morning around 6:32 a.m. Eastern time, NASA’s Cassini-Huygens mission finally ended as the Cassini spacecraft melted in Saturn’s atmosphere. Scientists registered the loss of signal about an hour and a half later, at 7:55 a.m.

The spacecraft, after orbiting Saturn for 13 years, hit the gas giant 10° north of its equator and disintegrated in the upper atmosphere, 1,500 kilometers above the swirling cloud tops. Saturn’s atmosphere ripped the spacecraft apart as it flew in at 113,000 kilometers per hour.

Cassini captured this global view of Saturn in February 2007. Credit: NASA/JPL-Caltech/SSI/Ian Regan

The mission’s end began at 3:04 p.m. Eastern time on 11 September, when a gravitational shove from Saturn’s largest moon, Titan, nudged the spacecraft into its path of destruction.

So long my fascinating friend. Thank you for being spectacularly confusing beyond my wildest dreams. May our paths cross again someday. <3 https://t.co/w5gXjUy06V

— Sarah Hörst (@PlanetDr) September 11, 2017

Where to Crash?

Scientists decided on Cassini’s fiery end back in 2014. The spacecraft was running out of fuel, and because Titan and the small, icy moon Enceladus may be habitable, scientists chose to send the spacecraft into Saturn rather than risk having it crash onto either moon—or, really, any of Saturn’s 53+ moons.

An animation of the last images of Enceladus as it “set” behind Saturn from Cassini’s point of view. Images taken on 13 September, received on Earth 14 September. Credit: NASA/JPL-Caltech/Space Science Institute/Emily Lakdawalla

“We don’t want to go back to Enceladus and find [there are] microbes that we put there,” Linda Spilker, head scientist on the Cassini mission, told the Atlantic.

End of an Era

Over the past 13 years, Cassini’s cameras and scientific instruments transformed the Saturnian system from an unknown celestial neighborhood into something “as familiar as your own backyard,” Spilker told Eos.

The spacecraft flew almost 8 billion kilometers in its lifetime, starting with its 3.5-billion-kilometer journey from Earth on 15 October 1997. That lifetime includes nearly 300 orbits of Saturn, with many guided by Titan’s gravity.

It collected more than 600 gigabytes of data. Although that doesn’t sound like much today, consider that Cassini was built with 1980s technology and was still sending us data from more than 1 billion kilometers away, explained Earl Maize, the mission’s program manager, at a 13 September press conference.

Cassini’s highest-resolution color image of any part of Saturn’s rings to date. The image, captured 6 July 2017, shows a portion of Saturn’s B ring, between 98,600 and 105,500 kilometers from Saturn’s center. Credit: NASA/JPL-Caltech/Space Science Institute

With those data, Cassini helped scientists discover oceans underneath the icy shells of Titan and the much smaller moon Enceladus. There might even be an ocean inside the moon Dione, but data so far haven’t been conclusive. Cassini’s cameras watched waves ripple through Saturn’s rings and storms froth in Saturn’s clouds, spotted six new moons orbiting the gas giant, and showed us a huge, hexagonal jet stream with a hurricane in its center on Saturn’s north pole.

The discoveries led to new questions. How do Titan’s lakes fill up with liquid methane and ethane? Why does Titan even have an atmosphere? What drives the water jets shooting out of Enceladus’s south pole? How did those mysterious red streaks form on the moon Tethys?

A mosaic of images captured by Cassini of Titan’s second-largest lake, Ligeia Mare, between 2006 and 2007. Scientists still aren’t sure how Titan’s lakes fill with liquid hydrocarbons. Credit: NASA/JPL-Caltech/ASI/Cornell

After another nudge from Titan’s gravity back in April, the spacecraft spent its last several months rocketing through 22 orbits between Saturn and its dazzling rings. During these grand finale orbits, Cassini investigated Saturn’s magnetic field and the age of the rings themselves, a mystery scientists hope to resolve from Cassini’s last data. The final five orbits even sent Cassini dipping into the upper fringes of Saturn’s atmosphere, where it sampled a region into which no spacecraft had ever gone.

In Cassini’s final few moments, its mass spectrometer pointed toward Saturn’s atmosphere and collected its clearest data yet. Those data were beamed immediately to Earth and received by a giant antenna in Canberra, Australia, 86 minutes later.
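That 86-minute delay is simply light travel time; working backward gives the Earth-Saturn distance at the time. A rough check (ignoring relay and processing delays):

```python
C_KM_PER_S = 299_792  # speed of light in vacuum

delay_s = 86 * 60  # 86 minutes, per NASA
distance_km = C_KM_PER_S * delay_s
print(f"{distance_km / 1e9:.2f} billion km")  # about 1.55 billion km
```

The result is consistent with Saturn's distance from Earth of roughly 1.5 billion kilometers.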

“The spacecraft’s final signal will be like an echo. It will radiate across the solar system for nearly an hour and a half after Cassini itself has gone,” Maize said in the days before the end.

Final Good-Bye

When she was in third grade, Spilker used to gaze at Saturn and Jupiter with her very own telescope. She always hoped to have “the chance to visit these worlds that were about half a page in my astronomy book.”

And now she and the Cassini team have. What’s more, everyone watching the Cassini mission has shared in that journey.

For Maize, that shared journey is what matters most. “We’ve left the world informed but still wondering, and I couldn’t ask for more,” he said.

—JoAnna Wendel (@JoAnnaScience), Staff Writer

Enabling Findable, Accessible, Interoperable and Reusable Data

Fri, 09/15/2017 - 12:01

Scientific results, particularly in the Earth and space sciences, depend increasingly on large and complex data sets, as well as on models that transform these data. In many cases, the data are difficult to acquire: one-time observations of the Earth or other planets, such as the 40-year-old data from the Voyager mission, cannot be repeated.

Increasingly, these data sets are stored or made available separately from the actual publication because of their sheer size. But even when data are saved, the ways in which they are stored and catalogued are uneven, making it difficult or impossible to discover and link data sets that should be allied. For example, data are often stored with publishers as PDF files or other supplements without any metadata, or in general repositories without any quality control or curation.

Other researchers are often not able to understand the data sets without contacting the original author, if that is even possible. For examples of such difficulties see a recent Editor’s Vox by the Editor-in-Chief of Journal of Geophysical Research: Oceans.

There is, at best, nascent interoperability across different data repositories and between repositories and scholarly publishers. Domain repositories, which act as a home for data from particular disciplines, have emerged to handle different data sets, but standards and procedures vary between them.

Over the past few years, several organizations, including AGU, have led efforts to address key parts of these problems at a high level, such as the Center for Open Science’s Transparency and Openness Promotion (TOP) Guidelines, the FAIR (Findable, Accessible, Interoperable, and Re-usable) Data Principles, and the Data Citation Principles.

In particular, AGU’s Brooks Hanson and Kerstin Lehnert of Columbia University helped form the Coalition on Publishing Data in the Earth and Space Sciences (COPDESS), whose statement of commitment, signed by major publishers and repositories, calls for data to be included with publications. As a result, most Earth and space science publishers now require authors to make the data supporting a publication available, preferably through domain-specific repositories that provide high-quality data curation. However, adherence to these and other guidelines is haphazard, in part because workflow solutions to connect researchers, publishers, and repositories are neither standard nor widely adopted.

To address this critical need, the Laura and John Arnold Foundation recently awarded a grant to a coalition of groups representing the international Earth and space science community. This project, convened by AGU, will develop standards connecting researchers, publishers, and data repositories in the Earth and space sciences to enable FAIR data, a concept first developed by Force11, on a large scale.

The project is called “Enabling FAIR Data.” The partnership currently includes AGU, the Earth Science Information Partners (ESIP), and the Research Data Alliance (RDA) and has support from the Proceedings of the National Academy of Sciences, Nature, Science, the National Computational Infrastructure, AuScope, the Australian National Data Service, and the Center for Open Science.

This effort will build on the work of COPDESS, ESIP, RDA, the scientific journals, and domain repositories to ensure that well-documented data, preserved in a repository with community-agreed metadata and persistent identifiers, become part of the expected research products submitted in support of each publication. The broader community is expected to play a key role in shaping the recommended guidelines and approach. A key goal is to create a process that is efficient and standard for researchers and thus supports their work from grant application through publication.

A set of best practices will be developed including metadata and identifier standards; data services; common taxonomies; landing pages at repositories to expose the metadata and standard repository information; standard data citation; and standard integration into editorial peer review workflows to facilitate adoption by publishers and a consistent experience for researchers. Visit the COPDESS website to keep up-to-date with the project.
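To picture what “community-agreed metadata” and persistent identifiers might look like in machine-readable form, here is a hypothetical sketch loosely modeled on common repository metadata fields; the field names and the placeholder DOI are illustrative assumptions, not any formal standard:

```python
# Hypothetical sketch of a minimal, citable data set record (illustrative only).
record = {
    "identifier": "doi:10.0000/example",   # persistent identifier (placeholder DOI)
    "creators": ["A. Researcher"],
    "title": "Example ocean temperature profiles",
    "repository": "Example Domain Repository",
    "publication_year": 2017,
    "license": "CC-BY-4.0",
}

# A standard data citation can be assembled from the same fields.
citation = (f"{'; '.join(record['creators'])} ({record['publication_year']}). "
            f"{record['title']}. {record['repository']}. {record['identifier']}")
print(citation)
```

The point of such a structure is that publishers, repositories, and editorial workflows can all read the same fields, which is what makes the data findable and citable at scale.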

Open, accessible, and high-quality data, and related products such as software, are critical to the integrity of published research. They ensure transparency, support reproducibility, and are necessary for the advancement of science. In Earth and space science, such data can also have diverse and important societal benefits and be used for critical real-time decision-making.

AGU’s Data Position Statement affirms that “Earth and space sciences data are a world heritage. Properly documented, credited, and preserved, they will help future scientists understand the Earth, planetary, and heliophysics systems.” By convening the Earth and space science community in this exciting new project, AGU continues to lead the way in developing best practices in scientific research and publishing, and making it accessible for the benefit of humanity.

—Shelley Stall, Director, Data Programs, American Geophysical Union; email: sstall@agu.org; orcid.org/0000-0003-2926-8353

Diamonds Really Do Rain on Neptune, Experiments Conclude

Fri, 09/15/2017 - 11:38

A very hard rain likely falls inside Uranus and Neptune.

In recent high-energy laser experiments, researchers have replicated the pressures and temperatures found deep in the atmospheres of such planets, known as ice giants. Those extreme conditions in the laboratory compressed hydrocarbon plastics, chemically similar to the methane found in ice giants, into tiny diamonds, giving an experimental boost to a long-standing theory about the characteristics of ice giant planets.

“It was a very surprising experiment,” said Dominik Kraus, a researcher at Helmholtz-Zentrum Dresden-Rossendorf in Dresden, Germany. His team had expected very small signs of molecules splitting apart after being subjected to high pressures, “maybe some little hints of diamonds,” he explained.

But instead, they found a very strong signal that under intense pressures, hydrocarbons in Neptune would transform into diamonds. Kraus is lead author of a 21 August Nature Astronomy paper describing the results.

Pressure Shock

In the recent tests, the experimenters first had to find a substance that was chemically similar to methane (CH4), a molecule believed to make up about 1.5% of Neptune and the planet’s most common component after hydrogen and helium. They hit upon polystyrene (C8H8) plastic—not only is it a common material, but it is also easier to use because it’s a solid at room temperature, whereas methane is a gas that would need to be contained.

The researchers then fired two short, but intense, pulses from a high-energy X-ray laser at the polystyrene sample. The two laser bursts hitting the sample at nearly the same time exerted a pressure shock almost 1.5 million times greater than Earth’s surface atmospheric pressure yet kept the temperature below the melting point of diamonds.
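In SI units, the quoted shock pressure is on the order of 150 gigapascals, the regime where theory places hydrocarbon dissociation in ice giant interiors. A quick conversion (the 1.5-million-atmosphere figure comes from the text):

```python
# Convert the reported shock pressure to SI units.
ATM_PA = 101_325                 # one standard atmosphere, in pascals
pressure_atm = 1.5e6             # "almost 1.5 million" atmospheres, from the text
pressure_gpa = pressure_atm * ATM_PA / 1e9

print(f"{pressure_gpa:.0f} GPa") # prints "152 GPa"
```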

The fleeting shock simulated conditions found around 10,000 kilometers below the surfaces of Uranus and Neptune. Then, using X-ray diffraction measurements to continually monitor the chemical nature of the sample after the laser strikes, the researchers witnessed carbon separating from hydrogen and compressing into nanometer-sized diamonds.

Carrying the experiment’s analogy back to Neptune, the results indicated that hydrocarbons known to be within Neptune likely condense into a solid as you go deeper into the planet’s interior. The experiments took place at the Stanford Linear Accelerator Center (SLAC) National Accelerator Laboratory in Menlo Park, Calif.

Why Simulate Neptune’s Atmosphere?

Knowing how hydrocarbons might behave deep within an ice giant’s atmosphere will affect our understanding of how atmospheres transport heat and evolve over time, explained Kraus. What’s more, the implications of this research extend beyond our solar system to exoplanets, as a large fraction of the known exoplanets are similar in size or mass to our ice giants.

The ability to model an ice giant atmosphere’s density from the top down to the core is a critical part of characterizing that planet. For example, an atmosphere made mostly of hydrogen is much puffier than one with diamonds, Kraus noted.

A diamond-studded atmosphere also likely behaves very differently than one without diamonds. For example, atmospheric convection might have to overcome more hurdles, which may lead to sharp changes in chemical composition between different atmospheric layers, the researchers said. This could also inhibit heat flow.

“These experiments can be used to improve our understanding of the behavior of common materials in the universe at high pressures and temperatures, which has a direct connection to modeling planetary interiors,” said Ravit Helled, a computational science and theoretical astrophysics professor at the University of Zurich in Switzerland, who was not involved in the study.

One-Two Punch Keeps Old Theory in the Ring

Planetary scientist Marvin Ross first proposed the idea that Uranus and Neptune could have diamond precipitation in 1981. Other research groups have tried many times since then to observe this chemical reaction in the lab but have seen only hints of hydrogen-carbon separation and diamond formation. Moreover, these changes took place at pressures and temperatures that don’t match theory very well.

In contrast, the shock method used by Kraus and his team produced strong signals from the separation and diamond formation at the temperatures and pressures suggested by theory. The one-two X-ray punch was the key to the experiment’s success, according to coauthor Siegfried Glenzer, professor of photon science at Stanford University and director of SLAC’s High Energy Density Sciences Division.

“We saw carbon clusters forming under the presence of hydrogen, and then we saw those carbon clusters forming diamonds under high pressure,” Glenzer said.

The Matter in Extreme Conditions instrument (central chamber) attached to SLAC National Accelerator Laboratory’s Linac Coherent Light Source can replicate the pressures and temperatures found in the interiors of ice giant planets, enabling researchers to observe chemical reactions that may occur under those conditions. Credit: Matt Beardsley/SLAC National Accelerator Laboratory

Kraus added that “nearly every carbon atom inside the plastic turned, within this 1 nanosecond or less, into a diamond crystal structure.” He said that if the nanometer-scale diamonds could grow for longer spans of time, like they might in ice giant atmospheres, the nanodiamonds “would for sure grow to much larger size.”

To see diamond formation, the sample needed to be highly compressed but not heated beyond the melting point of diamonds, a tricky combination for most laser experiments, Glenzer explained. The lasers compressed the sample for only a few nanoseconds, too short a time to substantially increase the temperature. The team performed its experiment with the Matter in Extreme Conditions (MEC) instrument on SLAC’s Linac Coherent Light Source (LCLS).

SLAC’s full range of experimental techniques allows scientists “to be able to assess these questions of reactivity and kinetics,” said Laura Robin Benedetti, an experimental physicist at Lawrence Livermore National Laboratory in Livermore, Calif., who did not participate in the research. “It’s very exciting to have new work in this field.”

The Hydrogen Solution

Glenzer explained that the shock method’s short timescale is important for keeping hydrogen from escaping during the diamond compression. Past experiments that observed the reactions over the course of a few seconds might have suffered from hydrogen loss, he speculated.

In the shock experiments, “hydrogen is still present, and that’s important because that’s what happens [in] Neptune,” Glenzer emphasized. “You have carbon under high pressure, and hydrogen is still around. And then we see the formation of diamonds.” In this way, the lab simulation “is a much better approximation for what we believe is happening in Neptune.”

The research group has begun conducting similar experiments with plastics of different composition to test the range of reactions that could occur, according to Kraus. He and his colleagues are particularly interested in reactions that include oxygen and helium, two elements abundant not just in ice giants but also in Jupiter-like planets.

“To refine our models of the interiors of the ice giant planets and also to understand their formation processes, we will need every bit of data we can get our hands on!” Benedetti told Eos in an email.

The team also hopes to retrieve the newly formed diamonds from the MEC chamber to analyze their structure and strength, Glenzer said. Through that, there may be a practical application to this science: Harvesting the diamond nanocrystals formed in the experiments is the researchers’ first step in assessing potential applications for the diamonds in material science or industry.

—Kimberly M. S. Cartier (@AstroKimCartier), News Writing and Production Intern

Environment and Labor Groups Push to Protect EPA Budget

Fri, 09/15/2017 - 11:33

If legislation approved by the House of Representatives on Wednesday becomes law, the U.S. Environmental Protection Agency’s (EPA) budget for fiscal year (FY) 2018 could be less austere than the Trump administration’s $5.7 billion request, a request that represents a nearly 30% proposed cut to the agency.

However, the House bill, which the Senate still needs to weigh in on, is no cause for celebration, according to environmental groups and union groups representing federal government employees. They say that the House bill’s $7.8 billion would be the agency’s lowest funding level since FY 2001. The House’s funding level, employee reductions that could cut staff to pre-1990 levels, and demoralizing working conditions threaten to hollow out the agency and prevent it from fulfilling its mandate of protecting public health and the environment, the groups said at a 13 September briefing in Washington, D. C.

The House bill, which would fund the EPA at a level 7% lower than the final fiscal year 2017 level, would cut agency science and technology by 12% compared with the administration’s proposed 44% cut in those areas. Additionally, the bill that passed the House on Wednesday targets climate change programs and includes an amendment to bar the enforcement of EPA’s methane rule that sets limits for greenhouse gas emissions for new oil and gas sources. Another amendment bars new rule making from using the Obama administration’s social cost of carbon formula for estimating monetary impacts of climate change from carbon dioxide emissions.

EPA is “grossly underfunded and [has] been grossly underfunded for years,” said John O’Grady, president of the American Federation of Government Employees (AFGE) Council 238, which represents about 9,000 EPA employees nationwide. Council 238 and the broader AFGE, which represents 700,000 government workers, are spearheading Save the U.S. EPA, a national campaign to protect the agency.

O’Grady said that the House’s $7.8 billion level of funding is well below the agency’s FY 2001 budget, which, when adjusted for inflation, would be equivalent to $10.8 billion.
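That claim is easy to sanity-check by deflating the $10.8 billion figure back to 2001 dollars. The sketch below assumes approximate CPI-U annual averages for 2001 and 2017 (illustrative values, not from the article); the dollar figures come from the text:

```python
# Sanity-check the inflation claim: deflate $10.8B (2017 dollars) back to 2001.
# The CPI-U annual averages below are approximate, assumed for illustration.
CPI_2001 = 177.1
CPI_2017 = 245.1

fy2018_house = 7.8                  # House bill funding level, billions of dollars
fy2001_equiv_2017 = 10.8            # O'Grady's inflation-adjusted FY 2001 figure
fy2001_nominal = fy2001_equiv_2017 * CPI_2001 / CPI_2017

print(f"${fy2001_nominal:.1f}B nominal in FY 2001")                       # about $7.8B
print(f"shortfall: ${fy2001_equiv_2017 - fy2018_house:.1f}B in 2017 dollars")
```

Under these assumptions, the House's $7.8 billion roughly matches the FY 2001 budget in nominal terms, which is exactly why the groups call it a real cut of about $3 billion in today's dollars.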

EPA in the Administration’s Crosshairs

“The administration has laid out its priorities, and it has made it crystal clear: EPA is in its crosshairs,” said Rep. Debbie Dingell (D-Mich.), who also spoke at the briefing. Dingell, who serves on the House Committee on Energy and Commerce, introduced legislation in July to prevent EPA from closing any regional or program office, including the agency’s Region 5 office in Chicago, which works on Great Lakes issues. Dingell told Eos that she will try to attach that bill to another piece of legislation. An amendment to the House funding bill that would block the closure or consolidation of any EPA regional office fell short of approval.

The Trump administration is “just blowing up programs that have taken years to create,” Dingell told Eos. “They are decimating an agency that is critical for babies that haven’t been born yet [and for] seniors who are in their 90s and 100s.”

The Art of the Deal

Briefing speaker Collin O’Mara, president and CEO of the National Wildlife Federation, told Eos that a 7% cut to EPA’s budget doesn’t sound bad. “But this is the classic art of the deal,” he said, referring to a book by President Donald Trump. O’Mara said the cut is not as drastic as the administration requested, but it would still severely affect the agency, and it would come after previous years of incremental reductions.

O’Mara said at the briefing that he worries not only about changes to the agency’s climate and other high-profile programs. “There are bread and butter bipartisan things the agency does day in and day out that are not part of these top line ideological debates about the role of government,” he said. “Less sexy programs” such as cleanup and assessment programs are still on the chopping block, he said.

Mary Anne Hitt, director of the Beyond Coal campaign for the San Francisco–based Sierra Club, said at the briefing that it’s not only budget cuts that threaten EPA’s ability to do its job. “The agency can also be hollowed out by making life miserable for the people who work there, so that they leave, and then just failing to replace them,” she said.

Hitt said she has heard of EPA staff working conditions “that are pretty jaw dropping,” including staff being told not to bring enforcement actions against polluters and being told to rewrite public health safeguards. “Between the budget cuts and this kind of political interference, the EPA just won’t be able to fulfill its mission,” she said.

A Shot Across the Bow

Energy expert Frank Maisano, who observed the briefing, told Eos that the administration’s proposed drastic cut to EPA “was a shot across the bow” to send a message that it planned to reduce government spending.

The administration “can’t reduce government spending in certain areas,” such as entitlements, said Maisano, a principal at the Policy Resolution Group at Bracewell, a Washington, D. C.–based law and government relations firm serving the oil and gas, power, and other industries. “So to get a reasonably balanced budget, they reduced [funding] in all kinds of areas that are never going to be cut by Congress. That’s the way all budgeters are,” he said, noting that, for instance, the Obama, Bush, and Clinton administrations used similar strategies. “I think you’ll see that Congress is going to be a lot more reasonable than the initial proposal that [the administration] put out.”

—Randy Showstack (@RandyShowstack), Staff Writer

Deepwater Horizon Dispersant Cleared the Air, New Model Shows

Thu, 09/14/2017 - 12:10

As the crippled Deepwater Horizon oil well spewed nearly 5 million barrels of crude oil and gas into the Gulf of Mexico in 2010, cleanup crews attacked the spill with almost 2 million gallons of Corexit, a controversial brand of dispersant used to break up oil into smaller droplets.

Now, new modeling of the oil’s behavior as it leaked out of the well has simulated the characteristics of the spill’s oil and natural gas bubbles with and without the addition of Corexit EC9500A, the main dispersant used against the spill. The research finds that the dispersant diminished the average oil droplet’s volume by more than thirtyfold, which would have likely made droplets dissolve more easily and rise more slowly, allowing them to become trapped below the sea surface.

The droplets’ shrinkage would have reduced the amount of volatile oil components that reached the surface of the ocean and, in turn, dramatically improved air quality for responders, the researchers report. In the model’s simulations, droplet diameters decreased from approximately 4 millimeters without dispersant to 1.3 millimeters with it.
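The two figures are consistent: a droplet's volume scales with the cube of its diameter, so shrinking droplets from about 4 millimeters to 1.3 millimeters reduces their volume roughly thirtyfold. A quick check using the diameters from the text:

```python
# The reported diameter change implies the roughly thirtyfold volume reduction.
d_without = 4.0   # mm, mean droplet diameter without dispersant
d_with = 1.3      # mm, mean droplet diameter with dispersant

volume_ratio = (d_without / d_with) ** 3   # volume scales with diameter cubed

print(f"{volume_ratio:.0f}x smaller by volume")  # prints "29x smaller by volume"
```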

“If dispersants had not been used, then according to our model, emergency responders would have had to spend much more time worrying about health risks,” said J. Samuel Arey, a senior adjunct researcher at the Swiss Federal Institute of Aquatic Science and Technology. “Anything you do that affects logistics in a negative way is going to have a huge impact” on response, he added. “Logistics is your ability to get work done. That has a direct multiplier on the scale of the disaster.”

The new model indicates that dispersant use lowered the amount of volatile organic chemicals in the atmosphere by about 30%, on average. For chemicals most harmful to humans, the reduction was far steeper. The atmospheric concentration of benzene, a component of crude oil, decreased 6,000-fold, for example. That had big implications for cleanup crews working on the site.

Woods Hole Oceanographic Institution researchers donned respirators to work on the top deck of R/V Endeavor during a rapid response expedition to the Deepwater Horizon spill in the Gulf of Mexico in June of 2010. Credit: Dan Torres, Woods Hole Oceanographic Institution

The new study shows “how the dispersant was effective in retaining [more] of the hydrocarbons at depth than would otherwise be the case. That’s really important,” said Gary Andersen, a senior scientist at Lawrence Berkeley National Laboratory in Berkeley, Calif., who has also studied the effects of dispersants.

Arey and his colleagues published the new work on 28 August in the Proceedings of the National Academy of Sciences of the United States of America (PNAS).

Dispersants at Depth

Dispersants work similarly to how dish soap breaks up grease. The chemicals in the dispersants, which are typically used on surface slicks, lower the interfacial tension of the boundary between oil and water, which allows waves and wind to break up a slick. “As a consequence, less oil reaches the coastline because it’s dispersed into the environment,” Arey said. But how dispersants would behave when used at depth was still an open question in 2010.

Since then, numerous scientists have attempted to answer that question. In theory, a dispersant would have reduced the size of the oil droplets under the water’s surface. Smaller droplets would be more likely to get trapped within the water, rising to the surface more slowly. If the droplets were trapped, or if microbes degraded them first, the oil might not reach the surface at all.

Limited underwater observations near the spill with a camera mounted on a remotely operated vehicle (ROV) captured some data on droplet sizes. Operated by a team led by the Woods Hole Oceanographic Institution in Woods Hole, Mass., the ROV looked at droplets 1–9 kilometers from the wellhead, observing sizes of 0.03–0.40 millimeter.

These sizes are consistent with the model, which simulated droplets only 200 meters from the leaking wellhead, according to the paper. The model’s larger droplets would have broken up by the time they drifted kilometers from the disabled oil rig because of natural turbulence and the dispersant’s effect.

The new study model shows that the dispersant injection decreased the overall concentration of all volatile organic chemicals in the atmosphere by a modest amount. However, the dispersant reduced the amount of chemicals most harmful to humans, including benzene and toluene, by a much greater amount. Credit: Natalie Renier, Woods Hole Oceanographic Institution

However, the in situ measurements are “a relatively small set,” Arey noted. Thus, many researchers have turned to lab experiments or models.

A 2015 modeling study by scientists at the University of Western Australia in Perth and the University of Miami in Florida found that mechanical energy did most of the dispersing. In addition, regarding how much spilled oil remained at depth rather than breaching the surface, “the simulations suggest that the application of dispersant actually made only a marginal difference,” the authors wrote. They estimated that only 1%–3% less oil reached the surface because of dispersants.

Using another model last year, Jeremy Testa, a researcher at the University of Maryland Center for Environmental Science in Solomons, reported droplet sizes similar to those simulated in the new paper by Arey and his coauthors.

In experiments reported in June, also in PNAS, Andersen and his colleagues recreated the presumed conditions of the Deepwater Horizon spill in bottles, rotating the bottles so that the oil droplets didn’t become slicks. They found that the dispersant did not inhibit microbial work. Microbes degraded most of the oil from the spill “in a matter of weeks or months,” Andersen said. A lot of the work was done by a newfound organism now thought to be part of the genus Bermanella.

“I come down strongly in favor of using dispersants, much more so now than ever,” Andersen added. “Dispersants are really important in trying to retain as much [oil as] possible in the lower depths, where there’s greater potential for microbial degradation.”

The model used in the new August PNAS study allowed the scientists “to fill in everything that happened from the wellhead to the sea surface and describe in detail what was happening at the wellhead itself,” according to Arey. He said that he hopes studies like his can be used to help plan for future oil spill management.

“We still rely heavily on fossil fuel reserves,” he said. “To the extent that we can shed light on the benefits of these intervention technologies, [we can] continue to inform the policy around emergency preparedness.”

—Rachel Kaufman (email: rk@readwriterachel.com; @rkaufman), Freelance Science Journalist

Annotation Tool Facilitates Peer Review

Thu, 09/14/2017 - 12:08

We are pleased to announce that this month AGU is implementing a new way to review manuscripts across all our journals: hypothes.is. Hypothes.is is a nonprofit organization building open source software that enables annotation of digital documents. It is already being used in diverse fields, including education, research, journalism, and publishing.

In a traditional review, a reviewer might provide a long, itemized list of comments tied to pages and paragraphs or line numbers in a manuscript. Some also mark up a manuscript using track changes in Word or use the editing features in a PDF document while also providing other comments and a summary separately. Multiple reviews yield separate markups that cannot easily be merged.

Why not have all the comments in one document for the author, organized by who created them and tagged by the type and importance of each comment? Even better, also have the comments in a list that can be exported, filtered, and sorted, and allow the authors to respond either inline or via the list.

We are implementing this solution for our journals by incorporating the hypothes.is annotation tool into GEMS, our editorial system. This work is a cooperative effort by EjournalPress, which provides GEMS, and Hypothes.is, with initial support from the Alfred P. Sloan Foundation, and some extensive work and testing by the publications staff at AGU.

The hypothes.is software allows each reviewer to make inline comments and organizes these by reviewer, editor, author, and importance, all in one file. Confidential comments for the editor can also be included.

Currently, each reviewer sees only their own annotations, while the editor can see the full collection and the authors can see all nonconfidential annotations. Thus, anonymity and viewing rights are preserved.

Each annotation is labeled to help organize the review. Current labels include a summary comment, major issue, minor issue, text edit, comment on a figure, and comment regarding a reference, and these can be filtered for ease of viewing. The annotations can also be collated separately and formed into a “traditional” review.
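The label-and-filter workflow described above can be pictured with a short generic sketch. To be clear, this is illustrative Python, not the hypothes.is or GEMS API; only the label names follow the list above, and the sample comments are invented:

```python
# Illustrative only: tagged annotations filtered by label, as in the review workflow.
from dataclasses import dataclass

@dataclass
class Annotation:
    author: str      # reviewer, editor, or author
    label: str       # e.g. "major issue", "minor issue", "text edit"
    text: str

annotations = [
    Annotation("Reviewer 1", "major issue", "The methods section omits the sampling rate."),
    Annotation("Reviewer 1", "text edit", "Change 'effect' to 'affect'."),
    Annotation("Reviewer 2", "minor issue", "Figure 2 axis labels are too small."),
]

def filter_by(items, label):
    """Return only the annotations carrying a given label."""
    return [a for a in items if a.label == label]

major = filter_by(annotations, "major issue")
print(len(major))  # prints 1
```

Because every annotation carries an author and a label, the same list can be collated into a "traditional" review or filtered down to just the major issues.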

Authors can respond inline to reviewers’ comments on the same file, providing a means for a more effective dialog among the participants in the peer review process.

Using annotations in a review does require online access (although annotations are saved immediately), but online availability has expanded greatly in the past few years. Some instructions are available here.

Conducting a review using annotations is entirely optional—a reviewer can still choose the traditional way if they wish. For those who do wish to try it, we welcome feedback, which is important for refining and improving the experience and building broader familiarity with this development.

We appreciate that any changes and innovations may take some time to get used to; however, this approach opens up some additional opportunities for collaboration and improvement in the peer review process.

—Brooks Hanson, Senior Vice President; Jeanette Panning, Director; Randy Townsend, Senior Program Manager; Paige Wooden, Senior Program Manager, Publications, American Geophysical Union; email: bhanson@agu.org

Hot Water, Cold Ice

Thu, 09/14/2017 - 12:06

Our series of Editors’ Vox to discover what AGU journal editors do when they’re not reviewing manuscripts continues with the Editor-in-Chief of JGR: Earth Surface, Bryn Hubbard. We asked him about his recent field research on the flanks of Mount Everest.

Where did you go for your most recent field research?

The tongue of the Khumbu Glacier is covered in debris, making access to the site difficult, to say nothing of the ice below. Credit: Bryn Hubbard

With most of my previous field research in the polar regions, it was something of a departure for me to pack my bags in late spring for a trip to the Himalaya.

My destination was Nepal’s Khumbu Glacier, specifically the 10-kilometer-long debris-covered tongue of the glacier, whose upper reaches flow off the flank of Mount Everest itself and provide access to climbers.

Despite almost a year of planning and preparation, three fairly significant questions remained before departing: Would the team be able to reach the site? Would the equipment work when we got there? And would we be able to collect the data we wanted? Some of these uncertainties were captured in the media coverage before we left, including a BBC news article (with two videos).

How was your journey to get there?

Flying into hot and busy Kathmandu was quite different from arriving at my usual field destinations, such as Greenland or Antarctica, characterized by open spaces and subfreezing temperatures. But that was just the start…

Selecting an aircraft for our internal flight from Kathmandu to Lukla in the Himalayan foothills was a somewhat chaotic activity involving seemingly randomly distributed piles of trekking and camping gear and a great deal of arm waving and shouting; the scene was made all the more surreal by the presence of inquisitive (and acquisitive) monkeys within the terminal.

If that experience was eye-opening, landing at Lukla airstrip (2,850 meters above sea level) was rather more eye-closing. The landing strip extends at a downward slope of more than 10 degrees toward a vertical mountainside. I had somehow omitted to read up on this aspect of the trip beforehand, although I did suspect something might be out of the ordinary when the passenger next to me began to clasp their hands and pray loudly five minutes before landing.

On the eight-day trek towards Everest Base Camp. Credit: Bryn Hubbard

Thankfully we landed safely and, after a quick breakfast, began our eight-day trek to Khumbu Glacier. I remember the first day as being relatively straightforward and filled with light-hearted banter, due in large part to the low(ish) elevation, a largely downhill route, and dust-free air.

The next seven days, however, were not. The scenery was unquestionably spectacular, our pack bags were carried by porters, and we had overnight stays at convivial guesthouses, but the trek was something of a physical challenge for one who is used to snow scootering around flat ice shelves just a few meters above sea level.

Suffice it to say that we arrived nine days later, in far from decent shape but fundamentally intact, at our first camp on the glacier, a couple of hundred meters up-flow of Khumbu’s terminus (see image at top).

What was the focus of your scientific research?

Almost nothing is known about the interior of high-altitude, debris-covered glaciers. The reason for this lack of information is straightforward: working on such glaciers at about 5 kilometers above sea level, to say nothing of accessing their interior, is logistically difficult.

With my experience of using pressurized hot water to drill boreholes on ice masses – albeit normally in polar regions – Duncan Quincey of the University of Leeds brought me in to the EverDrill project. Together with Evan Miles, a Post-Doctoral Research Assistant, and PhD student Katie Miles, our task was to drill boreholes at three sites along Khumbu’s tongue and to install probes within those holes designed to measure the internal temperature, deformation, and structure of the glacier.

How did your drilling equipment reach the site?

Some of the equipment was slung by helicopter to the site and the rest carried by porters. En route it experienced various minor ailments ranging from surface scratches and ruptured pipes to sheared switches and major dents. This is to say nothing of a petrol motor having been filled with diesel, and then emptied by inverting the entire unit rather than via its drain plug. After a year of planning and more than a week’s trek to get there, we wondered whether we would be able to do anything that we had planned.

What were the challenges of operating the equipment at high altitude?

Our hot water drill was adapted from a car wash and all our apparatus was run using combustion motors. Although we had tested them in Kathmandu, we were now about three-and-a-half kilometers higher in altitude and we simply didn’t know if they would operate at that elevation.

Of the four motors that we needed to run, only one – the large generator with the initially ruptured fuel feed and broken recoil start – was running at the end of day one. By the end of day two this had been joined by the second generator and the water pump, leaving only the somewhat recalcitrant high-pressure-pump motor. Our boreholes were not going to get too far without this motor running as drilling by pressurized hot water requires, well, hot water under pressure.

By day three, having dismantled and tested almost every component of the motor, involving helpful – but ultimately unfruitful – voluntary contributions from at least 15 passers-by, we reassembled the motor, attempted to start it (unsuccessfully), gave it one final kick and set off to clear our minds by exploring the glacier.

On our way back to camp some hours later, reports reached us that the motor was finally running. Our enterprising (and invaluable) field guide had pulled the recoil start repeatedly for a full 15 minutes until the motor block had warmed up, at which point it fired and ran. Throughout the subsequent research, the motor never needed more than half a dozen pulls on the recoil start, which left me sagging and gasping for air. To have done this continually for 15 minutes (not to forget, at an elevation of 5,000 meters) not only probably saved the project but represented a somewhat superhuman effort.

How did the drilling of boreholes go?

With all our motors eventually operational in Khumbu’s rarefied atmosphere, albeit spluttering and burning far from cleanly, we still didn’t know whether we’d actually be able to drill the boreholes.

Knowing almost nothing about the internal structure or temperature of a glacier does not help in planning the most effective means of gaining access to that interior. For example, various experts had conversationally predicted that the ice would be so cold, flowing down from about 7,000 meters off Everest itself, that our drilling rig would simply not be able to supply the energy to melt a borehole. Others had suggested that the glacier would be so infused with debris that the drill tip, which delivers the hot pressurized water as a millimeter-diameter high-pressure jet, would not be able to penetrate any useful distance.

The hot water drill in operation on the Khumbu Glacier. Credit: Katie Miles

With these warnings foremost in our minds, we began drilling at our first site, through thin ice near the glacier’s terminus, advancing slowly and carefully at about 0.5 meters per minute.
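To give a sense of why the energy supply was a genuine concern, a rough back-of-envelope calculation can tie the advance rate to the heat the rig must deliver. The borehole diameter and ice temperature below are illustrative assumptions (neither figure is given here), not EverDrill measurements:

```python
import math

# Assumed values (illustrative only):
DIAMETER_M = 0.10           # borehole diameter, m
ICE_TEMP_C = -10.0          # ice temperature, deg C
ICE_DENSITY = 917.0         # kg/m^3
LATENT_HEAT = 334e3         # J/kg, latent heat of fusion of ice
SPECIFIC_HEAT_ICE = 2100.0  # J/(kg K)

def melt_energy_per_meter(diameter_m, ice_temp_c):
    """Energy (J) to warm cold ice to 0 deg C and melt it,
    for one meter of borehole depth."""
    volume = math.pi * (diameter_m / 2.0) ** 2 * 1.0  # m^3 per m of depth
    mass = ICE_DENSITY * volume                        # kg
    warming = mass * SPECIFIC_HEAT_ICE * (0.0 - ice_temp_c)
    melting = mass * LATENT_HEAT
    return warming + melting

energy_per_m = melt_energy_per_meter(DIAMETER_M, ICE_TEMP_C)
# At the stated advance rate of 0.5 m per minute:
power_w = energy_per_m * 0.5 / 60.0
print(f"~{energy_per_m/1e6:.1f} MJ per meter; ~{power_w/1e3:.0f} kW at 0.5 m/min")
```

Under these assumptions the rig must deliver on the order of tens of kilowatts of useful heat just to sustain that advance rate, before any losses in the hose, which is why the predictions of very cold ice were taken seriously.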

We managed to create a first borehole to about 35 meters below the surface. Since we were unsure whether we had hit the bed or an internal debris layer, we subsequently drilled a second borehole to about 45 meters depth.

Not knowing the internal temperature of the glacier – and hence fearing that it may freeze shut within hours – we immediately logged the longer borehole by optical televiewer camera and installed along it strings of thermistors and accelerometers. This all went surprisingly smoothly and, with data being logged automatically every 5 minutes, that evening was accompanied by much celebration.
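The thermistor strings record resistance, which the loggers convert to temperature. As a minimal sketch of how such a conversion typically works, here is the standard beta-parameter approximation with generic coefficients (a 10 kΩ sensor with β = 3950); the actual EverDrill sensors and their calibration are not described in this article:

```python
import math

# Assumed generic calibration (NOT the EverDrill sensors):
R0 = 10_000.0   # reference resistance, ohms, at T0
T0_K = 298.15   # reference temperature, K (25 deg C)
BETA = 3950.0   # beta parameter, K

def thermistor_temp_c(resistance_ohm):
    """Temperature (deg C) from measured resistance via the beta model:
    1/T = 1/T0 + ln(R/R0) / beta."""
    inv_t = 1.0 / T0_K + math.log(resistance_ohm / R0) / BETA
    return 1.0 / inv_t - 273.15

# Resistance rises as the ice cools; for these coefficients,
# ~33.6 kOhm corresponds to roughly 0 deg C.
print(f"{thermistor_temp_c(33_619.0):.2f} deg C")
```

A logger sampling every 5 minutes would simply apply a conversion like this to each channel on the string before storing the value.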

The second drilling site, located about 4 kilometers upglacier, proved more difficult due to debris within the ice and required the drilling of ten boreholes to a maximum depth of almost 30 meters. The third site, near Everest Base Camp at an elevation of about 5,200 meters above sea level, proved more successful: we drilled a single borehole, uninterrupted, to a depth of about 150 meters over three days.

What’s next?

Despite the difficulties, we managed to successfully drill, log and instrument medium-length boreholes at a high elevation on a debris-covered glacier for the first time. Some of the team will return later this year to monitor the instrumentation and download ice velocity and borehole data. A second drilling campaign is planned for spring 2018.

I’ve not even begun to tell you about what we hope the data collected from the boreholes will reveal, how we will use them in a model to predict glacier evolution, and how the resulting forecasts could be of wider use to policy-makers (find out more about the EverDrill project).

Bryn Hubbard (right) and PhD student Katie Miles pictured as the sun rises over Mount Everest. Credit: Bryn Hubbard

As scientists, our peers see the outputs of our work in terms of data, analysis and results presented in scholarly journal articles. But sometimes it’s worth drawing attention to the hard work behind the scenes that results in a 10,000-word manuscript.

We set out to access something fairly inaccessible and measure something about which very little was known. Despite much planning, we began our fieldtrip with a range of uncertainties, practical challenges and physical hurdles.

Thankfully, in this instance, we succeeded and are now in a position to tell the tale and soon report the results. For now though, I am back to the safer task of reviewing manuscripts for JGR: Earth Surface…

—Bryn Hubbard, Editor-in-Chief, JGR: Earth Surface and Centre for Glaciology, Institute of Geography & Earth Sciences, Aberystwyth University, UK
