EOS

Earth & Space Science News

Airborne Fireball

Fri, 01/19/2018 - 19:44

I was driving when it happened. It was dark out but the sky brightened over the course of a few seconds. I turned my head to look out the side window and saw the flashing streak in the sky.

My first thought was, “meteor,” but I didn’t know for certain because I didn’t hear the “boom” of the shockwave. I was in a car with the radio on, so I guess I just missed it. Sure enough, reports started coming in on social media that an asteroid had hit Earth around 8:10 p.m. local time on 16 January 2018, over the skies of southeast Michigan.

When an object from space plunges into Earth’s atmosphere, the streak of light it produces is called a meteor. The meteor showers that occur regularly several times a year happen when Earth flies through debris left behind by a comet, and the dust grains burn up in our upper atmosphere. When a boulder from outer space hits our atmosphere, it makes it down a little lower, causing a brilliant light display and, at the end, an explosion. Instead of just a meteor, we now call it a bolide. The small fragments that make it down to the surface are called meteorites.

Bolide strikes occur dozens of times per year, as shown in the map below. They can happen pretty much anywhere on Earth at any time of day.

Bolide events from 1994 to 2013. Credit: NASA/Planetary Science

These space rocks, really tiny asteroids, range from a basketball to a building in size. They enter the atmosphere at thousands of miles per hour, well above the speed of sound. The friction caused by moving the air out of the way strips material off the front of the object, weakening it and eventually resulting in catastrophic failure: the explosion. The sonic boom from this fast-moving object sometimes reaches the ground, shaking buildings and perhaps breaking windows, like an earthquake.

The one that hit Michigan was probably the size of an exercise ball. Its sonic boom wasn’t big enough to break windows, but scientists estimate the shaking was equivalent to a magnitude 2.0 earthquake. After it exploded, fragments that did not disintegrate could have fallen to the ground. So, somewhere north of Detroit, there could be a meteorite debris field scattered in the snow.

The bigger bolides can be dangerous. The one that blew up over the Ural region of Russia in February 2013 produced a sonic boom strong enough that roughly a thousand people in Chelyabinsk went to the hospital, mostly with cuts from flying glass but also with broken bones from falling objects. No one died, and much of the meteorite debris landed outside of town. That object was estimated at roughly 10,000 tons, a rock about 20 meters (65 feet) across.

Another bolide that blew up over Siberia, in 1908, was gigantic. The estimate for the size of the Tunguska meteor is a couple hundred meters across – picture a rock that fills a sports stadium. It obliterated several hundred square miles of forest land. If the one over Michigan had been that big, I probably would not be here to write this!

We didn’t know the Michigan bolide was coming because asteroids that small are very hard to see. NASA has telescopes looking for near-Earth objects and big asteroids heading toward our planet, but these smaller ones are too hard to see; we have to be very lucky to spot one before it reaches Earth. So, usually, we have no warning before a bolide strike.

If it’s a larger object, then hopefully we see it early enough to deflect it out of Earth’s path. NASA has plans for this type of operation, including blasting the object with lasers or hitting it with a missile. Luckily, we haven’t had to exercise those plans yet!

As you can see from the NASA map, bolides release many gigajoules of energy into the atmosphere. This occurs over just a few seconds; for the Michigan bolide that I witnessed, it was less than 5 seconds of continual brightening and then a flash of light at the end. Dr. Brown’s DeLorean, from the Back to the Future movies, could easily make it to 1955 and back from one of these bolide strikes, because the energy release rate is often much more than 1.21 gigawatts of power. The only trick is to predict when it will happen – sounds like a sequel to me!
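For a rough sense of scale, the average power is just the energy release divided by its duration. The 10-gigajoule figure below is an assumed, map-typical value, not a measurement of the Michigan event:

$$P = \frac{E}{\Delta t} \approx \frac{10\ \mathrm{GJ}}{5\ \mathrm{s}} = 2\ \mathrm{GW} > 1.21\ \mathrm{GW}$$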

—Mike Liemohn, Editor-in-Chief of JGR: Space Physics, and Department of Climate and Space Sciences and Engineering, University of Michigan; email: liemohn@umich.edu

Editor’s Note: See Mike Liemohn talking further about bolides here.

Correction, 19 January 2018: In an earlier version of this article, two numbers were transposed in the amount of power required for time travel in the Back to the Future movies. This has been corrected.

Global Average Temperatures in 2017 Continued Upward Trend

Fri, 01/19/2018 - 19:37

Earth’s average surface temperature in 2017 ranked as the second or third highest on record, according to new analyses by NASA and the National Oceanic and Atmospheric Administration (NOAA).

NASA’s analysis, released yesterday during a press conference, showed that 2017 was the second-hottest year on record and that the average global temperature was 0.9°C (1.6°F) above the 1951–1980 average. The temperature increase was calculated from thousands of measurements: data from more than 6,000 weather stations, ship- and buoy-based observations of sea surface temperatures, and readings from Antarctic research stations.
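As a rough illustration of what such an anomaly calculation involves, the sketch below compares each location’s temperature with its own 1951–1980 average and combines the deviations with area weights. This is a minimal sketch under simplifying assumptions, not NASA’s actual GISTEMP code:

```python
import numpy as np

# Minimal sketch of a global mean temperature anomaly calculation (not
# NASA's GISTEMP code). temps holds annual mean temperatures for a set of
# latitude bands; each band is compared with its own 1951-1980 mean, and
# bands are weighted by the surface area they represent.

def global_anomaly(temps, years, lats, base=(1951, 1980)):
    """temps: (n_years, n_bands) array; returns one anomaly per year."""
    in_base = (years >= base[0]) & (years <= base[1])
    baseline = temps[in_base].mean(axis=0)      # per-band baseline mean
    deviations = temps - baseline               # anomaly at each band
    weights = np.cos(np.radians(lats))          # bands shrink toward the poles
    return (deviations * weights).sum(axis=1) / weights.sum()
```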

An analysis from NOAA, released during the same press conference, produced similar results: According to NOAA’s models, 2017 ranked as the third-warmest year on record. Specifically, NOAA scientists found that temperatures rose 0.84°C (1.5°F) above the 20th century average (1901–2000).

“Despite colder than average temperatures in any one part of the world, temperatures over the planet as a whole continue the rapid warming trend we’ve seen over the last 40 years,” said Gavin Schmidt, director of NASA’s Goddard Institute for Space Studies in New York City, at the press conference.

The World Meteorological Organization (WMO) also released a 2017 climate report yesterday, which likewise placed last year among the top three warmest years on record. According to WMO’s figure for 2017, the world’s average surface temperature has risen 1.1°C since preindustrial times. 2016 remains the warmest year in the WMO record, with temperatures reaching 1.2°C above the preindustrial era.

Atmosphere to Ocean

At the press conference, Deke Arndt, chief of the monitoring branch at NOAA’s National Centers for Environmental Information, described warming trends in different layers of the Earth system. He explained that temperatures in the middle troposphere, between 3,000 and 10,000 meters (where most commercial jets fly), ranked third or fourth warmest on record, depending on which group assembled the data. The upper ocean, which scientists know captures much of the excess energy trapped in the atmosphere, also reached its largest heat content on record in 2017, Arndt said.

The Warming North

In the Arctic, which is warming faster than the rest of the globe, minimum sea ice extent continued to fall in 2017, the newly released analyses show. Similar results were highlighted in December, when NOAA released its annual Arctic Report Card. In that report, scientists concluded that the mean Arctic temperature in 2017 rose 1.6°C above average (the second-highest value after 2016) and that March 2017 saw the lowest maximum sea ice extent on record.

Observations of Arctic conditions in 2017 “confirm that the Arctic shows no signs of returning to the reliably frozen state that it was in just a decade ago,” said Jeremy Mathis, director of NOAA’s Arctic Research Program, when the report card was unveiled.

At play here is a key feedback mechanism, Arndt noted. Sea ice, with its bright white surface, reflects solar energy back into the atmosphere, helping to cool surface temperatures. But when sea ice melts, it exposes the darker surface of the underlying water, which absorbs solar energy. And the more that sea ice melts, the more energy is absorbed—a positive feedback mechanism of accelerating warming and ice loss, he said.
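A toy numerical sketch can make the loop Arndt described concrete. Every number here is arbitrary, chosen only to make the amplification visible; this is not a climate model:

```python
# Toy ice-albedo feedback: warming melts ice, darker open water absorbs
# more sunlight, and the extra absorbed energy adds further warming.
ice_cover = 0.8    # fraction of the region covered by reflective ice
warming = 0.1      # initial temperature push (arbitrary units)

for year in range(10):
    ice_cover = max(0.0, ice_cover - 0.05 * warming)  # warmth removes ice
    absorbed = 1.0 - 0.6 * ice_cover                  # less ice, more absorption
    warming += 0.1 * absorbed                         # absorbed energy warms further
    print(f"year {year}: ice cover {ice_cover:.2f}, warming {warming:.2f}")
```

Each pass through the loop melts a little more ice and therefore absorbs a little more energy, so the warming increments themselves grow: the signature of a positive feedback.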

ENSO Effects

Some warmer than average temperatures can be attributed to a global climate phenomenon called the El Niño–Southern Oscillation, or ENSO. ENSO comprises two distinct phases: El Niño, in which the tropical Pacific Ocean warms, and La Niña, in which it cools. Both can bring anomalously cool or warm, or dry or wet, conditions to different regions of the world. On short timescales, an El Niño can amplify warming signals.

Spanning all of 2015 and the first third of 2016, for example, warming from an extreme El Niño fed into overall observed warming. However, Schmidt stressed that even when scientists statistically remove the effects of El Niño and La Niña from the record, 2017 is still one of the warmest years on record. The WMO’s analysis similarly showed that 2017 was the warmest year without an El Niño. What’s more, studies have shown that as more greenhouse gases are released, extreme El Niños could become more frequent.

—JoAnna Wendel (@JoAnnaScience), Staff Writer

Prestigious Climate-Related Fellowships Rescinded

Fri, 01/19/2018 - 12:57

Last March, Katie Travis, who was finishing a Ph.D. in atmospheric chemistry at Harvard University, got what seemed like a major boost for her budding career: She had been selected as one of eight fellows for the 2017 class of the National Oceanic and Atmospheric Administration’s (NOAA) prestigious Climate and Global Change Postdoctoral Fellowship Program. But the announcement came with an ominous caveat—NOAA program managers did not actually have the money in hand.

This past August, Travis learned that her fellowship offer had been rescinded because of budget cuts. “This was the first grant I wrote myself,” she said. “It was really validating for me to be selected, which is why it’s so crushing that the program ended up the way it did.”

Three other scientists chosen for the fellowships also found their offers revoked. With only four fellows ultimately accepted in 2017, the prestigious program is now funding fewer researchers than it ever has since it was launched in 1991. At least two other postdoctoral fellowship programs in the United States for climate scientists have also been defunded or put on hold, giving young climate scientists fewer options for continuing their careers.

Illustrious Alumni

The Climate and Global Change (CGC) program has built a reputation for preparing scientific leaders, said emeritus climate researcher Richard Somerville of the Scripps Institution of Oceanography in La Jolla, Calif., who served on the program’s steering committee in the 1990s. Some 90% of the program’s 218 alumni have gone on to academic positions, according to program documents. Alumni include Gavin Schmidt, director of NASA’s Goddard Institute for Space Studies in New York; Heidi Cullen, chief scientist for the nonprofit organization Climate Central in Princeton, N.J.; and Jeff Severinghaus, a Scripps paleoclimatologist recently elected to the National Academy of Sciences.

Graduate students who aim to participate in the program team up with professors at host institutions to coauthor proposals. Fellowships provide 2 years of salary and benefits and funds to travel to meetings and a summer institute attended by other fellows and alumni. The program’s annual budget, which has fluctuated around $2 million, “is among the best dollars NOAA spends in terms of return on investment,” said Somerville.

Nancy Williams samples seawater from below the surface of the Chukchi Sea of the Arctic Ocean off the coast of Alaska. She measured carbon dioxide content and alkalinity in hundreds of seawater samples on board the NOAA ship Ronald H. Brown as part of an effort led by Laurie Juranek at Oregon State University to study late summer productivity in the Chukchi and Beaufort seas. Credit: Michael Wojahn

Seeta Sistla, a professor at Hampshire College in Amherst, Mass., and a 2013 fellow, said having independent funding enabled her to jump from studying the Arctic as a graduate student to researching agroforestry in the tropics during her postdoc at the University of California, Santa Barbara. Two years later, the fellowship’s name recognition and connections helped her land several faculty position offers. “It opened doors that otherwise would not have been opened,” she said, “both in terms of my research career and also in how I was seen when I was on the job market.”

The program receives more than 100 applications annually. Program managers anticipated making eight awards in 2017, consistent with recent years, but were told by officials in NOAA’s Climate Program Office in August that they could make only four, Meg Austin, a staff member at the University Corporation for Atmospheric Research (UCAR), told Eos. The CGC program makes up a small part of the Boulder, Colo.–based nonprofit’s portfolio, which also includes managing the National Center for Atmospheric Research in Boulder.

“It was extremely disappointing, and it still is, to know that the current federal policy and the budgetary constraints, whether they’re real or just a chilling effect, are directly hurting my career,” said Nancy Williams, who, like Travis, was selected but ultimately not funded. Williams is finishing her Ph.D. work at Oregon State University in Corvallis and has secured a National Research Council (NRC) fellowship that will fund her postdoc at NOAA’s Pacific Marine Environmental Laboratory in Seattle, where she will study carbon exchange between the ocean and the atmosphere. But she will miss networking with other CGC fellows, she said—and the NRC annual salary is some $10,000 lower. “That’s a lot of money when you have student loans to pay.”

Diversity Takes a Hit

Especially troubling to Abigail Swann, an ecologist at the University of Washington in Seattle, is that three of the rescinded offers were to women, whereas the four who were funded are all men. That makes the 2017 class the only one in the program’s 27-year history other than the first to be all male.

Swann and two program alumni wrote a letter—since signed by more than 100 program alumni, hosts, selection committee members, and others—expressing concern that the lack of diversity makes it even harder for female geoscientists to bridge the “PhD-to-Professor gap,” a precarious career stage when many women scientists leave the field. They also noted that NOAA itself has committed to increasing diversity.

Austin said the eight selected applicants were ranked, and when the selection committee learned the funding had been reduced, program managers decided to fund the top four. She noted that almost 40% of all program fellows have been women. But she added that the 2017 outcome was “unfortunate.”

The selection committee members discussed gender diversity and even read recent reports on implicit bias while reviewing applications, said David McGee, a Massachusetts Institute of Technology (MIT) paleoclimatologist who chaired the 2017 committee. In addition to gender, the committee strives for a geographically diverse class that represents different areas of climate and global change science, he added. He said that committee members did not intend to rank their selections, and he believes there was a “misunderstanding” with program managers at UCAR. McGee signed the letter and said he hopes the program adds gender diversity as an explicit goal.

Murky Future?

The program’s future is uncertain, however. Recruiting for the 2018 class would normally have begun last August, but because of uncertainty surrounding the 2018 budget, application materials were posted just this week. A NOAA spokesperson said that although agency officials continue to see the program as “important,” because of budget uncertainties, “NOAA’s conservative plan is to award 4 fellowships this year as well.”

Katie Travis in her home office. Credit: Katie Travis

The delay and uncertainty compound an already difficult funding situation for early-career climate scientists, Travis said. In addition to the CGC program, a smaller NOAA program, Postdocs Applying Climate Expertise, did not accept any fellows in 2017 and will not in 2018. What’s more, a National Science Foundation (NSF) program that has funded 45 postdocs in atmospheric and geospace sciences since August 2014, including many who study climate and global change, has not accepted any new applications since January 2016. Amanda Adams, a program director at NSF, said the program is “paused” while the agency assesses its effectiveness and that the hiatus is not related to budgetary considerations.

Postdocs can also receive support from their institutions or from the professors with whom they work. But funding postdocs is “really, really difficult” for faculty, said McGee, because it requires having a large grant or other source of money to pay their salary and benefits. Such funding typically requires postdocs to do research aligned with an existing project rather than develop their own.

“Graduate students are probably always somewhat concerned” about funding, said Swann, “but I think they’re feeling extra stressed right now.”

Despite the challenging environment, Travis still hopes to land a job at a university or federal agency. She got a break when MIT, her current institution, stepped in with a year of support after the NOAA offer fell through. But that money runs out in June, and she still needs to secure another source of funds. “It made starting my postdoc really stressful—I immediately felt like I had to start applying for funding again,” she said. “It slowed my research down for sure.”

—Gabriel Popkin (email: gpopkin@gmail.com), Freelance Science Journalist

Pedotransfer Functions Bring New Life to Earth System Modeling

Fri, 01/19/2018 - 12:55

Soil is a crucial feature of Earth’s surface, essential for the vitality of ecosystems and for supporting life. It is also a very active interface: a porous layer between the solid ground below and the air above that allows both water and gases to pass through. Understanding the key properties of soils and how they influence different processes is thus an important area of science. In an article recently published in Reviews of Geophysics, Van Looy et al. [2017] gave an overview of pedotransfer functions, a tool used to represent soils in Earth system modeling. The editor asked two of the authors to explain more about pedotransfer functions, their applications, and recent developments in this field.

What are “pedotransfer functions”?

Pedotransfer functions relate simple-to-measure soil properties (left) to less available parameters of Earth system processes (right). Credit: Van Looy et al., 2017, Figure 1

Pedotransfer functions (PTFs) are used to translate soil data that we have into data that we need but that are not available; in other words, they are a kind of tool for converting proxy measures. Some PTFs are simple, while others are very complex.

The concept of PTFs was developed by Johan Bouma and Henny van Lanen in the 1980s. Some of the first applications used survey data on soil texture and structure to estimate the water holding capacity of soil.

Since then, difficult-to-measure soil properties, such as the water retention curve and hydraulic conductivity, have been the focus of the most comprehensive PTF development work.
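For illustration, a PTF in the spirit of those early texture-based functions might look like the sketch below. The functional form (a simple linear regression) is typical of early PTFs, but the coefficients are invented for demonstration and are not from Van Looy et al. [2017]:

```python
# Hypothetical pedotransfer function: estimate a hard-to-measure property
# (volumetric water holding capacity) from easy-to-measure survey data
# (texture fractions and organic matter). Coefficients are illustrative only.

def water_holding_capacity(sand_pct, clay_pct, organic_pct):
    """Return an illustrative volumetric water content (cm^3 per cm^3)."""
    return 0.35 - 0.0020 * sand_pct + 0.0015 * clay_pct + 0.010 * organic_pct

# Example: a loam with 40% sand, 20% clay, and 2% organic matter
print(water_holding_capacity(40, 20, 2))  # ~0.32
```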

How are pedotransfer functions applied in theory and practice?

Most models describing exchange processes at Earth’s surface can benefit from using knowledge and data on soils through PTFs. PTFs provide the key soil properties and functions needed to operate soil water balance models, hydrological and ecological models, crop growth models, and land surface models. Traditionally, PTFs have mostly been applied in land management, but more recently they have been applied to large-scale crop modeling and Earth system modeling. Recent developments in this field include parameterizations of solute transport, heat exchange, soil respiration and organic carbon content, root density, and vegetation water uptake.

What are some of the challenges of applying pedotransfer functions in different locations?

PTFs should encompass the variability of the estimated soil property or process in such a way that parameter estimates can be validated and can confidently support extrapolation and upscaling, capturing the spatial variation in soils. Novel sensing techniques, such as new satellite observations of soil moisture, provide a true breakthrough here, supplying ample data for developing and validating PTFs. These are complemented by improved modeling and spatial inference techniques, such as geographically weighted regression, which allows locally based inference, as shown for active layer thickness in the Alaskan permafrost region. Further improvements are needed in methods for dealing with uncertainty and for validating applications at the global scale.

How are pedotransfer functions used in different disciplines within Earth system science?

PTFs are used across the Earth system sciences, including applications in water flow, solute transport, root water uptake, heat exchange, and carbon and nutrient cycling processes. Further challenges are being addressed in the parameterization of vegetation water content, biotic processes, soil erosivity, and land use change impacts at multiple scales. In our review article, we give some examples of high-resolution global applications of pedotransfer functions in land surface models to highlight the strong potential of soil information to improve the inference of Earth system processes over large scales. We argue that a comprehensive set of PTFs can be applied throughout a wide range of disciplines of Earth system science, with emphasis on land surface models.

Pedotransfer functions enable the characteristics of soils, such as saturated hydraulic conductivity, to be considered in Earth systems modeling. Credit: Van Looy et al., 2017, Figure 5a

What are some of the remaining methodological challenges?

Efforts to improve the development and application of PTFs in Earth system science need to focus on the identification, validation, and integration of relationships between soil properties, states, and process parameters. There are still many unknowns: relating biotic processes and parameters to soil properties, for example.

For validation of larger scale applications, there is now a strong focus on estimates of preferential flow and carbon stocks. For integration, the linkages between hydraulic and biogeochemical parameters are particularly ripe for further investigation. However, in addition to methodological improvements, experimental studies are needed to derive PTFs for complex models, such as solute transport including biotic adsorption or boundary layer heat exchange processes.

Finally, most of the PTFs developed until now assume that predictors remain constant in time. Our review also describes the first attempts toward developing time-dependent PTFs, which include time-dependent predictor variables.

—Kris Van Looy and Harry Vereecken, Institute of Bio and Geosciences, Forschungszentrum Jülich, Germany; email: k.van.looy@fz-juelich.de

Scientists Create Catalog of Altotiberina Fault in Italy

Fri, 01/19/2018 - 12:52

The Apennine Mountains dominate the Italian peninsula, spanning 1,200 kilometers and reaching peaks as high as 2,912 meters. In the northern part of the range lies the Altotiberina fault, and in a recent study spanning 4.5 years, scientists from Italy’s Istituto Nazionale di Geofisica e Vulcanologia (INGV) created a detailed catalog of the seismic activity in the region that is giving them the best look yet at the fault’s behavior and seismic potential.

The Altotiberina fault is a normal fault, meaning the two overlapping slabs of Earth’s crust are being pulled apart, with the hanging wall sliding down the face of the footwall. This movement can occur in either abrupt slips or gradual “creeping.” The Altotiberina fault is also categorized as low angle, meaning the angle the fault plane makes with the horizontal is small, about 15°–20° in this case. Often, the activity of faults like Altotiberina is dominated not by occasional large earthquakes but by persistent clusters of tiny ones, a phenomenon known as microseismicity.

To get a more complete picture of what was going on at the fault, Valoroso et al. used a dense network of seismic and geodetic sensors to record the crust’s movement in the region from 2010 to 2014 at depths between 4 and 16 kilometers. The networks belong to the Altotiberina Near Fault Observatory, a modern multidisciplinary research and monitoring infrastructure managed by the INGV. The researchers detected more than 37,000 quakes with magnitudes less than 3.9, occurring at a very consistent rate of roughly 22 events per day. The enormous trove of data provided the most detailed record to date of how the fault is evolving over time and in space.

In particular, the authors found 97 clusters of small repeating earthquakes. These miniquakes tended to occur in pairs, and the time interval between them seems to predict the rate at which the hanging wall slides down the footwall. Using these results, the authors suggest that this consistent creeping may drive the behavior of the fault at large, which has previously been calculated to be slipping at a rate of 1.7 millimeters per year. (Journal of Geophysical Research: Solid Earth, https://doi.org/10.1002/2017JB014607, 2017)
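The logic connecting repeaters to creep can be sketched in one line: if the same small patch re-ruptures each time the surrounding fault has crept by the slip released in one event, the creep rate is roughly the slip per event divided by the recurrence interval. The numbers below are hypothetical, chosen only to reproduce the magnitude of the published rate; they are not values from Valoroso et al.:

```python
# Back-of-the-envelope creep estimate from a repeating-earthquake pair.
# Hypothetical values, not from the Altotiberina catalog.
slip_per_event_mm = 5.0   # assumed slip released by one small repeater
recurrence_yr = 3.0       # assumed time between repeats of the same event
print(slip_per_event_mm / recurrence_yr, "mm/yr")  # ~1.7 mm/yr
```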

—David Shultz, Freelance Writer

New Observations of Mysterious Radar Echoes

Fri, 01/19/2018 - 12:50

150-kilometer radar echoes are a frequently observed geophysical phenomenon in the daytime equatorial ionosphere. These very high frequency (VHF) radar echoes were first detected more than 50 years ago; however, it remains unclear how exactly they are generated. A recent study by Oppenheim and Dimant [2016] suggested the Sun’s extreme ultraviolet (EUV) radiation as the main cause of the 150-kilometer radar echoes. Patra et al. [2017] present new observational evidence of an anticorrelation between solar EUV radiation and these echoes, which contradicts the recent study’s expectation of a positive relationship. This calls for further investigation into the generation mechanisms of the 150-kilometer radar echoes, as they are manifestations of plasma density irregularities that may affect communications in the near-Earth space environment.

Citation: Patra, A. K., Pavan Chaitanya, P., St.-Maurice, J.-P., Otsuka, Y., Yokoyama, T., & Yamamoto, M. [2017]. The solar flux dependence of ionospheric 150 km radar echoes and implications. Geophysical Research Letters, 44, 11,257–11,264. https://doi.org/10.1002/2017GL074678

—Gang Lu, Editor, Geophysical Research Letters

Revised AGU Position Statement Addresses Climate Intervention

Thu, 01/18/2018 - 13:48

The American Geophysical Union (AGU) has adopted a revised position statement on climate intervention, which is purposeful intervention by humans to alter Earth’s climate. The updated position statement, approved yesterday by AGU’s Board of Directors, replaces a prior AGU statement in which such interventions were referred to as “geoengineering solutions.”

The statement, titled “Climate Intervention Requires Enhanced Research, Consideration of Societal and Environmental Impacts, and Policy Development,” also discusses two distinct categories of intervention that are most prevalent in current research: carbon dioxide removal and albedo modification. In addition, it includes updated references, such as two 2015 reports by the National Academies.

A panel of subject matter experts who are also AGU members crafted the newly adopted position statement during the past year. The group worked to ensure that the statement was updated to reflect current scientific understanding in the field.

“Climate intervention could play a key role in managing the effects of climate change, but our scientific understanding of its impacts remains poor,” said David Victor of the University of California, San Diego, and Brookings Institution, who chaired the panel. He stressed that more research is needed to better understand the potential risks and opportunities of climate intervention.

Under Current Discussion by Policy Makers

This update to AGU’s position on this topic is timely, given that the U.S. House of Representatives Committee on Science, Space, and Technology recently held a hearing on climate intervention. The hearing, on 8 November 2017, titled “Geoengineering: Innovation, Research, and Technology,” addressed current scientific understanding of geoengineering, the need for research, and the need for caution in implementation.

AGU has taken a public position since 2009 on climate intervention (then called geoengineering) by originally adopting a statement on 13 December of that year in collaboration with the American Meteorological Society (AMS). AMS had adopted the statement during the preceding summer. AGU independently revised and reaffirmed its initial statement in February 2012.

Resources for Policy Makers and AGU Members

Position statements by scientific societies can serve as resources for policy makers as they seek to understand science issues and craft legislation. AGU develops and maintains position statements to provide scientific expertise on significant policy issues related to the understanding and application of the Earth and space sciences.

AGU encourages members to use our organization’s position statements to guide conversations with students, local communities, policy makers, and other members of the public. AGU makes available its position statements in the AGU Resource Center. They, along with AGU’s Advocacy Policy, are valuable resources for those looking to connect with the public on issues related to Earth and space sciences.

Panelists

• David Victor, University of California, San Diego, and Brookings Institution (chair)
• Ken Caldeira, Carnegie Institution for Science
• Piers Forster, University of Leeds
• Ben Kravitz, Pacific Northwest National Laboratory
• Marcia McNutt, National Academy of Sciences
• Joyce Penner, University of Michigan
• Alan Robock, Rutgers University
• Naomi Vaughan, University of East Anglia
• Jennifer Wilcox, Colorado School of Mines

—Elizabeth Landau (email: elandau@agu.org), Assistant Director, Public Affairs, AGU

The Amazon River’s Ecosystem: Where Land Meets the Sea

Thu, 01/18/2018 - 13:45

The Amazon River basin and the waters in the Atlantic Ocean into which the Amazon flows are home to the world’s most diverse ecosystems. This region embodies a rich history of scientific discovery.

During the 1980s, one scientific team discovered that vast amounts of waterborne carbon seemed to simply disappear in transit between the upper and central reaches of the Amazon River and the sea. These researchers, part of the Carbon in the Amazon River Experiment (CAMREX) project, made early observations of organic matter and suspended sediments flowing through the upper and central reaches of the river. By 2002, researchers discovered that most of the carbon escaped the river as carbon dioxide (CO2), a phenomenon now recognized as being globally ubiquitous across inland waters at all latitudes.

What drives these large evasive gas fluxes? How do these processes evolve as the river meets the sea?

From 2010 to 2014, an international team of scientists led by Patricia Yager (University of Georgia) set out to decode the linkage between microbial and biogeochemical processes occurring along the lower reaches of the Amazon River and its plume, a broad swath just offshore where river water mingles with ocean water. The effort was called the River Ocean Continuum of the Amazon (ROCA) project.

During the ROCA project, Yager led a series of cruises through the tropical North Atlantic Ocean and into the river plume. Meanwhile, a team of ROCA collaborators led by Jeff Richey (University of Washington) simultaneously probed the lower reaches of the Amazon River, from the Óbidos downstream gauging station to the mouth of the river, where tides completely reverse the river’s flow. Even so, the river’s discharge is so strong that water can remain fresh a great distance offshore from the mouth.

ROCA represented the first systematic effort to connect processes occurring in the lower reaches of the river to those occurring in the ocean plume. Previous understanding of the land-sea connection of the Amazon contained a data gap of some 1,000 kilometers between Óbidos and the river mouth, and there were no temporally overlapping studies in the river and plume.

Our team’s most recent project, which began in 2014, aims to further our understanding of biogeochemical dynamics in the lower river. This project is dubbed the Trocas Líquidas de Carbono do Ecossistema do Baixo Rio Amazonas: Da Terra para o Oceano e Atmosfera (Net Ecosystem Exchange of the Lower Amazon River: From Land to the Ocean and Atmosphere, or TROCAS).

Taking the Data

The goal of TROCAS is to develop a holistic understanding of how carbon speciation (e.g., carbon dioxide, carbonate minerals, organic matter) evolves as it travels from the landscape, through river networks to the sea, and, in the case of CO2, back to the atmosphere. The research framework is based on the concept of net ecosystem exchange, which tracks the evolution of the partial pressure of dissolved CO2 (pCO2) on a mass balance basis through defined boundaries of the river system.
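For context, outgassing terms in such budgets are commonly estimated with the standard bulk formula for air–water gas exchange, in which the flux scales with a gas transfer velocity, the gas solubility, and the air–water pCO2 difference. The sketch below applies that formula with illustrative numbers; none of the values are TROCAS measurements:

```python
# Bulk formula for CO2 outgassing: F = k * K0 * (pCO2_water - pCO2_air).
# All values below are illustrative assumptions, not TROCAS data.
k = 2.4              # gas transfer velocity, m/day (assumed)
K0 = 3e-5            # CO2 solubility, mol m^-3 uatm^-1 (approx., warm fresh water)
pco2_water = 4400.0  # uatm, an assumed supersaturated river value
pco2_air = 400.0     # uatm, roughly atmospheric

flux = k * K0 * (pco2_water - pco2_air)  # mol CO2 m^-2 day^-1
print(f"{flux:.2f} mol m^-2 day^-1")     # ~0.29; positive means outgassing
```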

We have recently completed the sixth TROCAS expedition (Figure 1). The first four TROCAS cruises involved performing measurements while the vessel was underway and doing cross-channel sampling transects along the entire study domain. We started near the river mouth at the city of Macapá and navigated upstream to Óbidos. Then we followed a water mass downstream (an approach called a Lagrangian mode) while also sampling the major clear-water tributaries—the Xingu and Tapajós rivers—each of which discharges a volume of water on the same order of magnitude as the Mississippi River.

Fig. 1. Our initial study domain for the lower Amazon River extended from Óbidos to the city of Macapá (black), including the Tapajós and Xingu rivers. In our most recent expeditions, we have traveled to the actual river mouth (green), but waters remain completely fresh up to 60 kilometers offshore (blue). From Sawakuchi et al. [2017].

During these expeditions, we performed a suite of experiments to measure how quickly different types of organic matter from terrestrial and aquatic plants were converted to CO2. We also investigated processes governing the production, emission, and oxidation of methane in the river, the influence of river hydrodynamics on in situ microbial respiration rates, and the optical signature of organic matter in the river that can be seen from space.

The TROCAS team measures gas fluxes and geochemical parameters in the Lago Grande do Curuaí during the February 2016 expedition.

The physical flow of water through this complex and highly dynamic reach of the river is central to all of these questions. We measured river velocity and discharge in situ along the tidally influenced study domain and then incorporated these data into a model capable of evaluating biogeochemical transformations (Base System for Environmental Hydrodynamics, SisBaHiA).

Carbon Inputs and Outputs

Data from the ROCA project allowed us to estimate that water took roughly 3–5 days to travel from Óbidos to the mouth. On the basis of initial incubation experiments, we considered this length of time significant relative to the 1–2 weeks it can take for organic matter from vascular plants to turn over.

After adding into the hydrodynamic model the actual river flow across the entire domain, along with bathymetric measurements, we now estimate that complex tidal dynamics extends the water transit time closer to 8–9 days (M. L. Barros et al., unpublished data, 2017). By comparison, more sophisticated incubation experiments showed that it took anywhere from hours to a day for organic matter derived from leachates of different plants to degrade, with organic matter leached from grasses and aquatic plants decomposing several times faster than that from harder wood tissues [Ward et al., 2016].
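A one-line decay calculation shows why these timescales matter. The 1-day turnover below is an assumed round number within the reported range, applied to simple first-order decay, which is itself a simplification:

```python
import math

# If labile organic matter decays exponentially with a ~1-day e-folding
# time, almost none survives an 8- to 9-day transit without resupply.
turnover_days = 1.0    # assumed e-folding time for labile material
transit_days = 8.5     # mid-range of the modeled 8- to 9-day transit
print(f"{math.exp(-transit_days / turnover_days):.4f}")  # ~0.0002 remaining
```

That near-total loss in transit is why the high CO2 levels observed downstream demand a continual resupply of reactive material, the perspective described below.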

Continuous measurements made during discharge surveys and throughout the field campaign revealed an intriguing correlation between the river’s flow speed and the concentration of CO2 dissolved in the water. This observation motivated us to develop a shipboard system designed to measure microbial respiration rates under various degrees of mixing. Results from these experiments showed a direct link between microbial respiration and physical mixing rates across the lower Amazon River. Respiration rates measured with this system were an order of magnitude higher than those in past experiments that did not account for river flow and could almost entirely account for measured rates of CO2 outgassing [Ward et al., 2017].

From these insights, we have developed the perspective that although land-derived organic matter is rapidly and continuously degraded to CO2 in the river, constant input from the surrounding land and floodplains maintains high levels of reactive organic matter in the river until these sources are cut off in the inner sectors of the Atlantic Ocean plume.

In fact, measurements in the plume made during the ROCA project revealed observable levels of reactive land-derived organic matter that were degradable during both dark and light incubation experiments. These reactive molecules quickly disappeared as the water became saltier near the ocean, leaving behind relatively stable molecules that persisted throughout the plume.

We conducted experiments with and without light to mimic conditions at various locations in the Amazon River and its plume. The river remains dark below the water’s surface because of its high suspended sediment load, so microbial respiration is the primary pathway for organic matter decomposition upriver. However, as sediments settle in the plume, light can also begin to break down these molecules while also promoting primary production (plants’ conversion of inorganic carbon compounds into organic compounds). The stable molecules that persist throughout the plume might feed the pool of ~5,000-year-old dissolved organic carbon in the deep ocean [Medeiros et al., 2015].

Where the River Meets the Sea

The full suite of ROCA expeditions and the initial TROCAS expeditions laid the groundwork for interpreting chemical and biological signatures across the river-to-ocean continuum. However, we still had to answer one large question before we could accurately constrain fluxes to the ocean and atmosphere: How do tides influence the distribution and transformation of geochemicals near the river mouth?

Although our initial efforts were highly ambitious, they did not truly connect the river to the sea. This gap was due in part to the logistical difficulties involved in large oceanographic vessels sampling close to shore and small river boats sampling far offshore. For example, an additional 150 kilometers remain between our river end point, Macapá, and the actual river mouth, and waters remain completely fresh more than 60 kilometers offshore from the mouth.

As such, we spent our final two TROCAS expeditions (November 2016, low water, and April 2017, late rising water) exploring as close to the river mouth as logistically possible in our current research vessel, the Mirage, and performing daily time series measurements in fixed locations throughout entire tidal cycles. Measurements made during the last two trips revealed that CO2 and methane concentrations can vary by order(s) of magnitude in small, but not insignificant, side channels, and these tidal effects are seen even in the main stem of the river (the main channel of the river, into which the tributaries flow).

The Mirage, captained by Valterci “Cica” Almeida de Melo (standing at the bow), has traversed the Amazon River from the river mouth to Óbidos while making continuous measurements of CO2, methane, and other geochemical parameters.

On the most recent voyage, we traveled just beyond the final end point of the geographical river mouth (where water remained entirely fresh throughout the tidal cycle at surface and depth). We are still processing our geochemical measurements, but one striking observation emerged in real time. High levels of pCO2 persisted all the way to the river mouth, and gas fluxes measured with floating chambers here were similar to rates measured even as far upstream as Óbidos.

When scaled up across the lower river domain, these fluxes are significant not only on a basin scale but also globally. The most recent CO2 outgassing estimates by Sawakuchi et al. [2017] suggest that including the lower reaches of the Amazon River in an updated basin-scale budget increases global outgassing estimates by as much as 40% because of the massive surface area that the lower river encompasses as it widens and channelizes.

These estimates still do not include the extension of freshwater into the ocean, 60 kilometers offshore, where surface area is order(s) of magnitude greater than for the river itself. Likewise, CO2 budgets for the plume in the Atlantic Ocean still do not include the inner reaches of the plume and nearshore waters, which likely maintain high levels of CO2 because of continued breakdown of any remaining reactive organic matter from the river.

Working Together to Find Answers

From our long-term involvement in Amazon research, we recognize that fully constraining the cycling of material through Earth systems requires close collaboration across disciplines and cultures. None of the important discoveries made in the Amazon throughout history would have been possible without the partnership of diverse groups of researchers and, of course, faith from funding agencies.

Our current TROCAS project represents a healthy collaboration among Brazilian and U.S.-based funding agencies, universities, national laboratories, and researchers that enabled an ambitious field and analytical effort. Through our efforts, we hope to inspire future generations to continue probing the connection between the land, ocean, and atmosphere to develop a holistic understanding of how Earth functions and responds to change.

Some of this work was presented at the American Geophysical Union’s 2017 Fall Meeting during the session “Progress in Biogeochemical Research of the World’s Large Rivers II” in a talk titled “The influence of tides on biogeochemical dynamics at the mouth of the Amazon River” (Abstract B54D-02).

Acknowledgments

CAMREX was supported by the National Science Foundation, NASA, and the government of Brazil. ROCA was supported by the Gordon and Betty Moore Foundation Marine Microbiology Initiative. TROCAS is funded by the São Paulo Research Foundation and the National Science Foundation.

Climate Change Is National Security Risk, Congress Members Warn

Thu, 01/18/2018 - 13:36

A bipartisan group of more than 100 members of Congress has urged U.S. president Donald Trump to recognize climate change as a national security risk, and they called on him to reconsider this “omission” from the administration’s National Security Strategy issued on 18 December 2017.

“As global temperatures become more volatile, sea levels rise, and landscapes change, our military installations and our communities are increasingly at risk of devastation. It is imperative that the United States address this growing geopolitical threat,” states the letter, signed by a bipartisan group of 106 members of Congress and released on 12 January. Signatories include Rep. Elise Stefanik (R-N.Y.), chair of the House Committee on Armed Services’ Emerging Threats and Capabilities Subcommittee, and Rep. James Langevin (D-R.I.), the committee’s ranking Democratic member.

The letter, which they also sent to Secretary of Defense James Mattis, quotes testimony Mattis gave before the committee in January 2017. He stated then that “I agree that the effects of a changing climate—such as increased maritime access to the Arctic, rising sea levels, desertification, among others—impact our security situation.”

The letter also notes that the National Defense Authorization Act, which Trump signed into law on 12 December 2017, states that climate change “is a direct threat to the national security of the United States” and calls for a report on vulnerabilities to military installations and combatant commander requirements resulting from climate change over the next 20 years.

“Failing to recognize this threat in your National Security Strategy represents a significant step backwards on this issue and discredits those who deal in scientific fact,” the letter to Trump states.

Stark Difference from the Obama Strategy

The White House’s National Security Strategy differs starkly from the Obama administration’s February 2015 strategy, which identified climate change as a top strategic risk to the country, and a September 2016 White House memorandum that cited threats of climate change to national security.

“Leaders throughout the defense and intelligence communities agree that climate change poses a direct threat to our national security, a position that was affirmed by Congress in the 2018 National Defense Authorization Act signed into law by the president himself,” Rep. Langevin told Eos. “We have not yet received a response to our bipartisan request; however, it is my hope that the President will take this opportunity to listen to his own national security experts and reincorporate climate change into the National Security Strategy.”

“Not So Fast”

“The significance of this letter is that it demonstrates there is bipartisan support in Congress for addressing climate security issues,” John Conger, senior policy adviser with the Center for Climate and Security, told Eos. The center is a Washington, D. C.–based nonpartisan policy institute. Conger served in the Department of Defense (DOD) as principal deputy undersecretary from 2015 to 2017. Earlier at DOD, he oversaw a portfolio including climate change and energy security while performing the duties of the assistant secretary of defense for energy, installations, and environment from 2009 to 2015.

Conger said that when the “very forward leaning” Obama administration moved ahead on climate initiatives, Congress said, “not so fast.” Now, with the Trump administration trying “to take a step back on climate,” Congress is also saying, “not so fast.” “There are clearly many Republicans who think that [climate security] does need to be addressed [and] it shouldn’t be ignored,” he continued.

“It’s encouraging to see members of Congress, both Republicans and Democrats, urging the president to include climate change in the National Security Strategy,” Mark Reynolds, executive director of Citizens’ Climate Lobby, told Eos. The lobby is a nonpartisan grassroots advocacy organization based in Coronado, Calif. “The mass migration of millions of climate refugees will create humanitarian crises and destabilize nations. Our armed forces, already stretched to the max, will be called upon to respond to these crises, and so it is foolhardy to ignore the risks posed by climate change.”

Letter Sends a Strong Signal

David Michel, a fellow in the environmental security program at the Stimson Center, a nonpartisan policy research center based in Washington, D. C., noted that Trump based his administration’s new National Security Strategy on four pillars: protecting the United States from threats, promoting American prosperity, preserving peace, and advancing American influence. “The strategy’s failure to recognize global climate change as a serious threat to U.S. welfare at home and our interests abroad undermines all four of these goals. The recent letter to the president, signed by a bipartisan group of over 100 representatives, reflects this conviction,” Michel told Eos.

Although Michel said he doubted that the letter would change the president’s mind, “it does send a strong and public signal of support to the Department of Defense and other agencies for their continuing efforts to identify, evaluate, and prepare for the growing climate risks to America’s security and global stability alike.”

—Randy Showstack (@RandyShowstack), Staff Writer

Correction, 18 January 2018: An earlier version of this article misidentified the speaker of a quote that is now attributed correctly.

Iron Readings Hint That Ocean Depth Influences Seabed Volcanism

Thu, 01/18/2018 - 13:34

Scientists long doubted that changes in sea level could affect volcanoes erupting deep in the sea along mid-ocean ridges. Recently, however, measurements of iron released long ago from hydrothermal vents along those ridges, a proxy for magmatic activity, suggest a possible connection.

As Earth’s climate cycled in and out of periods of cooling and glaciation over the past 2.6 million years, sea levels fell as ice sheets formed and locked up water on land, then rose again as glacial melting returned water to the oceans. The new findings indicate that the activity of magma at mid-ocean ridges, which lie an average of 2.5 kilometers underwater, may fluctuate in response to those sea level ups and downs and the accompanying relatively small changes in water pressure, said Jennifer Middleton, a geochemist who was a Harvard Ph.D. student at the time of the studies.

Previous studies in Iceland have suggested that glaciers thickening on the summits of volcanoes exert massive increases in pressure that suppress eruptions, and that thinning glaciers have the opposite effect. Taking cues from those findings, researchers have moved this sort of research offshore to explore the potentially more nuanced effects of decreases and increases in glaciation, and the associated changes in sea level and water pressure, on underwater volcanoes.

Forging an Iron Link

Iron arises in plumes from hydrothermal vents on mid-ocean ridges and is often used by scientists as a measure of the activity of magma at these locations. In preliminary findings from new investigations conducted at the Mid-Atlantic Ridge, East Pacific Rise, and Juan de Fuca Ridge, “we see that when sea levels are changing, you see changes in hydrothermal iron deposition at all of the sites that have been explored,” said Middleton. “This suggests there is a global change in magmatic activity associated with changing sea level.”

Measurements of copper, which has likewise served as a measure for the activity of magma, during those same investigations yielded a different pattern than iron, Middleton noted. However, there are reasons to think that the iron correlation remains valid, she said.

The new research gives a snapshot of the Juan de Fuca Ridge between 400,000 and 600,000 years ago, Middleton pointed out this past December as she reported their findings in a poster presentation at the American Geophysical Union’s 2017 Fall Meeting in New Orleans, La.

Unexpected Results

Middleton and her fellow researchers took a 2-week cruise in the Pacific Ocean to gather samples at the Juan de Fuca Ridge, just a few hundred kilometers west of Seattle. They were accompanied by scientists across many disciplines, each aiming to provide different sides of the same story.

To get the “hydrothermal story,” Middleton gathered sediment cores from the ocean floor near the ridge. “It’s pretty low-tech,” she said. “We essentially attach a rope to a length of PVC pipe and drive it into the ocean floor with a rock on top.” The sediment cores keep a historical record in their layers. The researchers examined variations in iron depositions in the samples and looked for correlations with variations in sea level. They did the same for copper, which was thought to behave much like iron.

Their findings from the cruise were not quite what they expected: The researchers had anticipated that iron and copper would fluctuate with sea level and in tandem with each other. However, copper levels occasionally spiked in a way that iron levels didn’t. Middleton told Eos that copper goes through chemical reactions upon settling on the seafloor. These reactions complicate the measurements, which may make copper levels a poor estimate of hydrothermal activity.

Although iron readings seemed tied to sea level variations, those brought their share of surprises, too. Initially, the researchers thought magma production and iron deposition would increase in times of low sea level and water pressure. They observed the opposite, they reported in their poster, finding that iron quantities instead increased as sea levels did.

Lagging or Cracking?

Middleton offered two possible ways to account for this relationship. Magma is produced several kilometers beneath the ocean floor, and scientists don’t know how long it takes to reach the surface. “It may take 1,000 years, it may take 10,000,” she said. Therefore, there may be a lag between changes in water pressure and shifts in hydrothermal activity.

Alternatively, there may be no lag, but higher water pressure may lead to cracks in Earth’s crust. This explanation draws upon a theory that glacial cycles affect crust thickness under the ocean, leaving it more vulnerable to fracture at certain times. In this case, plumes of hydrothermal metals would be released when ocean levels are high. Bridgit Boulahanis, a marine geophysicist at Columbia University’s Lamont-Doherty Earth Observatory in Palisades, N.Y., investigated this cracking hypothesis on the research cruise to Juan de Fuca.

Newfound Inconsistency

Middleton said that the output of hydrothermal metals from mid-ocean ridges was thought to be steady over large timescales, but the iron findings suggest otherwise. The findings tie the fluctuation of iron release to global climate variations. She said she finds it fascinating “how the climate cycle and the solid Earth interact with each other on timescales of tens of thousands of years.”

David Lund, a professor of marine sciences at the University of Connecticut–Avery Point in Groton who was not involved in the research, said the new findings highlight a growing understanding of links between liquid and rocky components of Earth. “The research suggests a connection between the fluid Earth and the solid Earth that we didn’t know about 10 years ago,” he said. “It’s very exciting.”

—Nicoletta Lanese (email: nlanese@ucsc.edu; @NicolettaML), Science Communication Program Graduate Student, University of California, Santa Cruz

Preserving a 45-Year Record of Sunspots

Wed, 01/17/2018 - 13:16

In 1964, the late solar researcher Patrick McIntosh launched an ambitious effort to track sunspots—relatively cool, dark blotches on the Sun caused by disturbances in the star’s magnetic field. He traced sunspots and other solar surface features from daily photographs, creating a map of the full Sun approximately every 27 days. This led to important advances in the prediction of solar flares and helped to reveal the large-scale organization of the Sun’s magnetic field. Now scientists are working to preserve and digitize McIntosh’s project, a uniquely consistent record of solar activity over 45 years.

The Sun’s magnetic field is driven by the interior flow of hot plasma, or electrified gas, which creates a magnetic generator called a dynamo. McIntosh’s records showed that the location and number of sunspots and filaments—huge arcs of dense plasma that appear as dark lines on the Sun’s surface—are indicators of just how this dynamo works.

By carefully documenting the position and number of sunspots over time, for example, McIntosh’s record illustrated how the Sun’s entire magnetic field flips polarity every 11 years. The number of visible sunspots helps researchers predict this flip: When the Sun emits more X-ray and ultraviolet radiation, a period called the solar maximum, the number of sunspots peaks. When solar activity wanes during the solar minimum, sunspot numbers fall. McIntosh’s maps were unique in also tracking the positions of filaments and other features that change as the magnetic field evolves, drifting poleward or toward the Sun’s equator at different stages of the solar cycle.

Webb et al. scanned and digitally processed the hand-drawn maps that McIntosh created, known as synoptic maps, to create a free, public online archive. Ultimately, they plan to use the data to investigate long-term variations in the Sun’s activity and invite other researchers to use it as well. In recent years, the maps have provided important context for coronal mass ejections, the explosive bursts of solar wind plasma from the Sun that create the northern and southern lights and can pose a threat to Earth’s communication systems and power grid. (Space Weather, https://doi.org/10.1002/2017SW001740, 2017)

—Emily Underwood, Freelance Writer

Rising Ocean Temperatures Threaten Carbon-Storing Sea Grass

Wed, 01/17/2018 - 13:13

Sea grasses are part of a group of coastal vegetation, including mangroves and salt marshes, that stores up to 100 times more carbon than tropical forests at 12 times the speed. Vast prairies of sea grasses stretch for kilometers along the seafloor, storing enough carbon to rival the world’s forests.

If rising ocean temperatures cause these sea grass ecosystems to fail, the loss will only expedite the global warming that did them in, scientists say. So, how exactly will the world’s sea grasses fare in the face of climate change? Thanks to a newly made model, researchers now have answers.

“We can see that the coasts of Australia, Polynesia, and Hudson Bay will lose sea grass if ocean temperatures rise 1.5°C,” said Orhun Aydin, a spatial statistician and product engineer at the Environmental Systems Research Institute (ESRI) in California. “The species Zostera marina only grows in these areas and will become extinct.”

Aydin and his coauthor Kevin Butler, a product engineer at ESRI, developed their model from publicly available data on sea grasses and their environments from the U.S. Marine Cadastre. They identified key environmental conditions involved in sea grass abundance and modeled how these would change with increased temperatures. Then they scaled up their model to encompass the global ocean, using the Ecological Marine Units data set, which provides 3-D maps of ocean ecosystems around the world.

Aydin presented the team’s predictions for the fate of sea grass last month at the American Geophysical Union’s 2017 Fall Meeting in New Orleans, La.

Rising Temperatures and Rising Concerns

The researchers looked at five environmental conditions affected by rising ocean temperatures: salinity, dissolved oxygen, nitrate, phosphate, and silicate concentrations. They compared ocean ecosystems using these parameters and grouped similar environments. They then cranked up the model’s thermostat and predicted how each ecosystem type would likely change with each 0.1°C increase in ocean temperature.
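
To make that workflow concrete, here is a minimal sketch of the grouping-and-warming approach described above. It assumes entirely hypothetical data and response rules; the cluster count, sensitivity values, and suitability test are illustrative stand-ins, not parameters of Aydin and Butler’s model.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for gridded ocean data: one row per location, with
    # columns for salinity, dissolved oxygen, nitrate, phosphate, and silicate.
    conditions = rng.normal(size=(500, 5))
    has_seagrass = rng.random(500) < 0.2  # assumed present-day occurrence

    # Group similar marine environments (the cluster count is illustrative).
    labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(conditions)

    # Assumed linear response of each condition to warming, per degree Celsius,
    # and an envelope of conditions where sea grass is found today.
    sensitivity = rng.normal(scale=0.1, size=5)
    ref = conditions[has_seagrass].mean(axis=0)
    tolerance = np.linalg.norm(conditions[has_seagrass] - ref, axis=1).max()

    # Crank up the thermostat in 0.1°C steps and retest each environment type.
    for warming in np.arange(0.0, 2.01, 0.1):
        shifted = conditions + warming * sensitivity
        ok = np.linalg.norm(shifted - ref, axis=1) <= tolerance
        n_types = sum(ok[labels == k].mean() > 0.5 for k in range(8))
        print(f"+{warming:.1f}°C: {n_types} of 8 environment types mostly suitable")

In the actual study, the grouping came from the Ecological Marine Units data set, and the response of each condition to warming was derived from observations rather than assumed.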

“We found an increase of 1°C was the tipping point,” said Aydin. Changing patterns in sea grass occurrence reveal themselves at 1°C and are exacerbated at 2°C and beyond, he explained.

For example, the Gulf of Mexico, a current sea grass hotbed, will be preserved as a haven for underwater meadows. But some places, such as Australia and Polynesia, will become increasingly unsuitable for sea grass.

Other places will become more suitable for growth, Aydin continued. For instance, if ocean temperatures rise 1.5°C, the frigid Arctic Ocean off the north coast of Siberia could become suitable for sea grass.

However, “Just because sea grass might be able to grow in new places doesn’t mean it will,” said Aydin. “The seeds still need to get there.”

A map shows how, over the course of a 2°C increase in ocean temperature, the suitability of areas to serve as sea grass habitats will likely change. To build the map, scientists began with today’s ocean conditions, then simulated sea grass ecosystem health every tenth of a degree as ocean temperatures rose. Aggregating these simulations through this 2°C temperature change yields a map that shows the trends of sea grasses at any given point. “Recently” refers to a switch in state; for example, ocean around southern South America started off as unsuitable and switched to being suitable for sea grasses over the course of the temperature change. “Increasingly” and “decreasingly” refer to an amplification of the trend; for example, areas in the Caribbean started off suitable and became more so over the 2°C temperature increase. “Consistently” means that the trend continued unchanged; for example, areas in the North Sea were consistently suitable for sea grasses over the 2°C temperature increase. “Improving” and “declining” mean that locations were trending toward being good or bad habitats. For example, areas north of Siberia are “recently improving”: They started off as slightly unfavorable to sea grasses and switched to being favorable as the simulation progressed but are still not prime locations for sea grasses to grow. “Sporadically” means that areas that started one way oscillated between conditions throughout the simulation. For example, regions north of Canada started as unsuitable and oscillated between suitable and unsuitable as the simulated temperatures rose. Credit: Orhun Aydin, ESRI

The Future of Sea Grass

The researchers note that the model does not take into account polar ice sheet melting, which would affect ocean salinity and thus environmental hospitableness for sea grass. The model also does not account for how changing levels of atmospheric carbon dioxide will increase the acidity of oceans.

Acidity matters. According to a report from the Intergovernmental Panel on Climate Change, “changes in salinity and temperature and increased sea level, atmospheric CO2 [carbon dioxide], storm activity and ultraviolet irradiance alter sea grass distribution, productivity and community composition.”

Altered salinity and acidity would likely lower the threshold at which rising temperatures change distribution of sea grass, Aydin noted. The researchers’ next steps involve incorporating the compounding conditions into their model.

“Global warming is actively destroying mechanisms for storing carbon dioxide,” said Aydin. “This means increasing temperature will not be a linear process; intuitively, I’d say it will be exponential.”

—Nicoletta Lanese (email: nlanese@ucsc.edu; @NicolettaML), Science Communication Program Graduate Student, University of California, Santa Cruz

Stefan Rahmstorf Receives 2017 Climate Communication Prize

Wed, 01/17/2018 - 13:11
Citation

Stefan Rahmstorf

Stefan Rahmstorf has a unique ability to explain science in a highly understandable yet accurate way to diverse audiences, from children to government ministers. He has perfected this ability in writing hundreds of blog articles: He was cofounder of RealClimate in 2005 and the German KlimaLounge blog in 2008. His articles are devoted to public understanding of research in the best sense: They do not merely explain results but showcase the scientific method, the way scientists think. He takes his audience seriously in not “dumbing down” the science but using every opportunity to deepen their understanding.

Stefan is remarkable in the breadth of the topics of which he has a firm grasp, not only in his popular writing but also in his research: paleoclimate, ocean circulation, sea level, extreme weather events, global temperature evolution, and more. His scientific publication record is outstanding; he has been honored for his scientific work by being elected a Fellow of AGU in 2010. He has played an important role in advancing both the scientific and public debates on the issues of sea level rise, the slowdown of the Gulf Stream system, and the impact of global warming on increasing extreme weather events.

He has (co)authored four popular books. The Climate Crisis (with David Archer) explains the findings of the Intergovernmental Panel on Climate Change in plain language, lavishly illustrated. Our Threatened Oceans (with Katherine Richardson) provides a highly readable overview of the state of the world ocean. His first book, Der Klimawandel (with me), was also published in Korean, Vietnamese, Russian, and Arabic and as an audiobook. Stefan is a father of two, and his latest book is the award-winning children’s book Wolken, Wind und Wetter.

Stefan has acted as a mentor to many young scientists, encouraging and helping them to speak to the media or write their first blog post. He has advised the German government as a member of the German Advisory Council on Global Change for 8 years. He is a sought-after public speaker, has appeared hundreds of times on radio and TV, and has written countless newspaper articles and commentaries, some of which have been translated into 15 languages. He has a large social media following and is regularly contacted by leading international media.

It was an honor for us to nominate him, and he rightly deserves to be the first scientist working outside the United States to receive the AGU Climate Communication Prize!

—Hans Joachim Schellnhuber, Potsdam Institute for Climate Impact Research, Potsdam, Germany

Response

I am thrilled and humbled to receive this award! Let me first of all thank John Schellnhuber for being such a great communicator and role model and for supporting my climate communication work for more than 20 years now. I also thank those who supported this nomination. I could not do this work without a great network of colleagues around the world with whom I am in constant exchange of information and insights. Many would deserve this prize.

We all share a passion for science. But in addition to that, we are driven by deeply caring for humanity and by the conviction that scientific insight and foresight can prevent avoidable human suffering. Climate change is not just an “environmental” issue; it is foremost a massive problem for human society. A stable climate is a foundation of human civilization. Without it, we could not rely on harvests to feed us every year or build lasting cities on the oceans’ shores. Two centuries of climate science have established beyond reasonable doubt that human activities are causing a global warming that is about to catapult us well out of the stable Holocene climate of the past 10,000 years, the period during which human civilization thrived.

Those who understand this threat to humankind have a duty to speak up. All the more so as there are powerful interests on the other side whose income depends on the general public not understanding the science and who have no scruples about going to great lengths to obfuscate scientific findings. This has been amply documented, for example, by the work of Harvard science historian Naomi Oreskes.

That shouldn’t deter us from talking truth to power—and to the ostriches, as last year’s winner of the AGU Climate Communication Prize, my good colleague Richard Alley, explains in his excellent video series How to Talk to an Ostrich. It’s not enough to do good science. As atmospheric scientist and Nobel laureate Sherwood Rowland was quoted as saying in the 1986 New Yorker article “Annals of Chemistry: In the Face of Doubt” by Paul Brodeur, “What’s the use of having developed a science well enough to make predictions if, in the end, all we’re willing to do is stand around and wait for them to come true?”

So I would like to encourage many more climate researchers to get engaged in climate communication. You might even win a prize. But even more rewarding, you will likely help humanity navigate through the climate crisis with less suffering and loss.

—Stefan Rahmstorf, Potsdam Institute for Climate Impact Research, Potsdam, Germany

Accounting for the Missing Silica in the Marine Sediment Cycle

Tue, 01/16/2018 - 13:10

During the past few decades, scientists have increasingly realized that the cycling of silica in the world’s oceans is entwined with other key biogeochemical cycles, including those of carbon and nitrogen. And even though this coupling means silicon plays an important role in primary production and the amount of carbon dioxide in the atmosphere, a large mismatch remains between estimates of the amount of this nutrient entering the oceans and the amount being removed.

One of the least understood pathways in the marine silica budget is the burial of biologically derived silica along continental margins with high sedimentation rates. In contrast to standard assumptions that much of the biogenic silica dissolves following burial, researchers have recently discovered that some of this silica is being converted to clay. Although this conversion would help explain where some of the silica is going, current estimates of this process account for just one quarter of the “missing” material.

To better constrain this pathway, Rahman et al. used cosmogenic silica, which is produced naturally in the atmosphere by cosmic ray bombardment, to trace the fate of biogenic silica in coastal sediments. Despite the very low levels of biogenic silica found along continental margins, the team was able to determine the amount and type of biogenic silica stored in samples collected from four depositional settings, including subtropical and tropical deltas and temperate coastal zones.

The results indicate that traditional techniques consistently underestimate clay formation and hence the amount of biogenic silica buried in subtropical deltas and temperate coastal zones by a factor of 2–4. This discrepancy, argue the authors, is due to an underestimate of the amount of clay and other products produced during the first stages of chemical alteration and is consistent with observations of rapid clay formation in laboratory experiments.

Their data indicate the amount of biogenic silica stored in clays along continental margins ranges from 4.5 to 4.9 teramoles per year, which could account for the entire discrepancy in marine silica budgets. (Global Biogeochemical Cycles, https://doi.org/10.1002/2017GB005746, 2017)

—Terri Cook, Freelance Writer

Will Clean Air Fade Away?

Tue, 01/16/2018 - 13:06

In the proposed budget for the U.S. government for fiscal year 2018, the U.S. Environmental Protection Agency’s (EPA) research spending is cut by as much as 40% [Cornwall, 2017]. This reduction in funding could jeopardize progress in environmental quality, cost America its environmental leadership role, and allow polluted air and water to once again become commonplace in U.S. cities.

EPA’s research-based regulations have fostered significant strides in environmental improvement in the United States and inspired analogous endeavors around the globe. Notably, air quality has improved over North America and Europe in recent decades, and China and India are making concerted efforts to protect the health of their citizens and environment.

Despite this progress, the United States plans to defund research and pollution control efforts. We call for action from the scientific community, public, nongovernmental organizations, Congress, and the U.S. government. The scientific community should effectively engage in increasing public awareness about the detrimental effects of air pollution, and the public should work with their representatives and senators to strengthen EPA’s efforts to promote healthy air quality.

Societal Implications of Air Pollution

Clean air is fundamental for human survival, yet it is often compromised in urban areas, and megacities emit enormous quantities of pollutants into the air. Air pollution, mostly by particulate matter less than 2.5 micrometers in diameter (PM2.5), causes an estimated 3.3 million premature deaths worldwide each year, and air quality–related mortality rates could double by 2050 under the business-as-usual climate scenario [Lelieveld et al., 2015].

Air pollution is associated with reduced life expectancy and a range of health problems, including cardiovascular diseases, cancer, respiratory diseases, and cognitive diseases such as Alzheimer’s and dementia [Baccini et al., 2017; Pope et al., 2009; Lin et al., 2017; Gordon et al., 2014; Power et al., 2016]. In addition, poor air quality harms infrastructure, damages iconic art and architecture, and impairs visibility. Polluted air also disrupts regional and global energy balances by altering the ratio of solar energy that Earth absorbs to the energy it radiates back into space [Liu et al., 2015].

Bucking a Global Trend

Fig. 1. Trends of mean annual coarse particulate matter (PM10) levels for the western United States from 1997 through 2015. Blue triangles signify downward trends at the 5% significance level, red triangles display upward trends, and gray circles represent no trend. The size of the triangles is proportional to the slope of the linear regression model fitted to the data; that is, larger triangles represent stronger trends.

A recent study by the World Health Organization (WHO) analyzed coarse and fine particle (PM10 and PM2.5) concentrations in 795 cities in 67 countries from 2008 through 2013 and concluded that the global urban air pollution level rose 8% during this period [WHO, 2016]. Our analysis of air quality in the western United States, however, shows a declining trend in PM10 and PM2.5 levels between 1997 and 2015 (Figure 1). This analysis is preliminary and has not been formally published or peer reviewed.

We found that of 425 air monitoring stations that provided a relatively long record (more than 10 years), 232 showed a significant downward trend, 175 displayed no significant trend, and only 18 stations were associated with a significant upward trend in PM10. Trend analysis of annual mean PM2.5 levels yielded a similar pattern.
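
The station-by-station classification behind Figure 1 can be sketched in a few lines. The code below is a hypothetical illustration, assuming a made-up annual mean PM10 series for a single station; it is not the authors’ processing pipeline, which is not published with this article.

    import numpy as np
    from scipy import stats

    # Hypothetical annual mean PM10 record for one station (µg/m³), 1997-2015.
    years = np.arange(1997, 2016)
    pm10 = 40 - 0.8 * (years - 1997) + np.random.default_rng(1).normal(0, 2, years.size)

    # Fit a linear trend; classify it at the 5% significance level, as in Fig. 1.
    fit = stats.linregress(years, pm10)
    if fit.pvalue >= 0.05:
        trend = "no trend"
    else:
        trend = "downward" if fit.slope < 0 else "upward"
    print(f"slope = {fit.slope:+.2f} µg/m³ per year ({trend})")

Running such a test at each of the 425 long-record stations, with the slope setting the symbol size, would yield a map in the style of Figure 1.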

Given these preliminary findings, one key question emerges: How did the western United States manage to go against the global trend?

One reason rises to the top: Despite rapid population growth and industrial expansion, air quality in the western United States has likely improved because of the regulatory efforts of the U.S. EPA to control pollution sources [Samet, 2011]. This trend is similar to the EPA’s findings for the United States overall.

Broad Regulation Efforts

The U.S. Air Pollution Control Act of 1955 was the first federal legislation to recognize the detrimental effects of poor ambient air quality on human well-being and to provide funding for research and technical assistance to control air pollution. However, it was the Clean Air Act (CAA) of 1963 that empowered EPA to enforce regulation of air pollution at the source. Subsequent legislation has recognized the right of state agencies to establish environmental regulations specific to their needs, leading to more stringent regulations in states such as California.

Scientific publications of the 1990s that describe the air quality status of the nation corroborate the effectiveness of EPA’s execution of the Clean Air Act in coordination with state, local, and tribal governments. According to one study [Cramer, 1998, p. 45], in the 1980s and 1990s, California suffered from “notoriously polluted air,” which posed “perhaps the most urgent environmental problem” facing the state. The study noted, however, that the air quality condition was improving “due to aggressive regulatory efforts.”

Western U.S. Air Quality: A Work in Progress

Fig. 2. Probability of daily PM10 levels exceeding the WHO healthy air threshold of 50 μg/m3 from 1997 through 2015. Larger circles are associated with greater probability values on a linear scale. Data here are preliminary—they have not been formally peer reviewed or published.

Despite the effectiveness of the EPA’s regulatory measures to reduce air pollution since the CAA went into effect, major metropolitan areas still suffer from poor air quality, assuming that the air quality measured at monitoring stations is representative of the cities where those stations are located. Our quick calculations show that between 1997 and 2015, the probability of daily PM10 values exceeding the threshold for healthy air as established by WHO, 50 micrograms per cubic meter (μg/m3), reached disturbing values for some large cities in the western United States (Figure 2).

We also found that in Los Angeles and Phoenix, residents breathe unhealthy air on average 84 and 170 days per year, respectively. Even worse, Maricopa, Arizona, 55 kilometers (34 miles) south of Phoenix, experiences unhealthy PM10 levels on average 278 days per year. Other major cities, such as San Diego, Salt Lake City, Las Vegas, Reno, and Denver, also experience unhealthy air quality on a regular basis (Figure 2).
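
The exceedance statistics behind Figure 2 follow from a simple frequency count. Here is a minimal sketch, again with fabricated data standing in for a station’s daily record:

    import numpy as np

    WHO_DAILY_PM10 = 50.0  # µg/m³, the WHO healthy air threshold used above

    # Hypothetical daily PM10 record for one station over 19 years (µg/m³).
    daily = np.random.default_rng(2).lognormal(mean=3.6, sigma=0.6, size=19 * 365)

    p_exceed = np.mean(daily > WHO_DAILY_PM10)  # probability mapped in Fig. 2
    print(f"P(PM10 > 50 µg/m³) = {p_exceed:.2f}, "
          f"about {p_exceed * 365:.0f} unhealthy days per year")

Maricopa’s 278 unhealthy days per year, for instance, corresponds to an exceedance probability of roughly 0.76.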

Thus, even though EPA’s endeavor to ensure healthy air quality has been effective (Figure 1), even more stringent regulations may be required in select areas and circumstances (Figure 2). However, we note that the current administration is unlikely to embrace added regulations, given that it has its sights set on dismantling current ones.

EPA’s Research Programs’ Fate

EPA relies heavily on accountability research programs to assess the impact of pollution mitigation and intervention actions on human health [Özkaynak et al., 2009]. To develop targeted source reduction measures and to maintain air quality standards, EPA and its collaborating local agencies conduct and support research and simulations. Research and scientific discovery are the founding pillars of the agency’s endeavor to justify the regulations for protection and improvement of the country’s environmental conditions.

The budget proposed by President Trump’s administration, however, would cut EPA’s Office of Research and Development’s spending by 40% [Cornwall, 2017]. This reduction would limit the EPA’s ability to provide a safe and healthy environment for current and future generations. The proposed budget threatens the already insufficient mitigation and intervention actions of EPA.

Cost-Benefit Analysis of Air Quality Regulation

The 28 March 2017 executive order to roll back the Clean Power Plan only adds to the concerns of the scientific community about the future health of Americans because fossil fuels have long been identified as a major source of toxic PM2.5. Decreased air quality will most certainly increase medical costs associated with the treatment of illnesses caused by air pollution, likely offsetting or exceeding any possible savings in government spending from these cuts.

A recent study concluded that “monetized human health benefits associated with air quality improvements can offset 26–1,050% of the cost of U.S. carbon policies” [Thompson et al., 2014, p. 917]. So although the current administration argues that relaxing regulations is necessary for economic growth and for enhancing the competitiveness of American industries, costly environmental side effects that threaten American lives will take a major economic toll.

But we also take issue with the very premise of the current administration’s argument for deregulation of air quality. We believe that EPA’s regulations do not impede economic growth. In fact, the robust growth of California’s economy over the past decades in spite of, or perhaps because of, stringent air quality standards proves otherwise.

Likewise, China now recognizes the benefits of pollution control and has proposed a plan to replace coal with cleaner energy sources and to reduce energy consumption. When Chinese Premier Li Keqiang addressed the National People’s Congress on 5 March 2017, he noted that “having reached the current stage of development, China can now advance only through reform and innovation,” adding that “we will strengthen research on the causes of smog to improve the scientific basis and precision of the steps taken” [McLaughlin, 2017].

The U.S. government’s 2018 budget is still being debated in Congress, with government agencies being funded at 2017 levels through a series of continuing resolutions. We implore Congress to reverse the proposed budget cuts and keep EPA at the forefront of international endeavors for environmental protection.

Clean air is a basic need for livable cities, vibrant communities, and healthy populations. We can’t afford to let our clean air fade away.

New Thermodynamic Model for Computing Mantle Mineralogy

Tue, 01/16/2018 - 13:04

Understanding the tectonic processes that shape the surface of our planet requires knowledge of the inner workings of its deep interior. Most critical is an accurate model for mantle mineralogy, especially one that allows for variation in bulk composition. Chust et al. [2017] present a newly developed open-access software package called “MMA-EoS” that self-consistently evaluates mineral equilibria and thermodynamic properties of multicomponent systems by Gibbs energy minimization. The software allows for the computation of properties of single phases, solution phases, and multiphase aggregates. Application of this new code to a homogeneous “pyrolitic” mantle and to a mechanical mixture of subducted lithosphere lithologies suggests no significant differences between the two compositional models, indicating that a heterogeneous mantle structure may not be required.
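
For readers unfamiliar with the approach, the core idea of Gibbs energy minimization can be illustrated with a toy problem. The sketch below is not MMA-EoS or its interface; it uses made-up Gibbs energies for four stoichiometric phases in the MgO–SiO2 system, for which the minimization reduces to a linear program.

    import numpy as np
    from scipy.optimize import linprog

    phases = ["forsterite Mg2SiO4", "enstatite MgSiO3", "periclase MgO", "quartz SiO2"]
    g = np.array([-2050.0, -1460.0, -570.0, -855.0])  # kJ/mol, hypothetical
                                                      # values at one fixed P and T

    # Mass balance: moles of MgO and SiO2 supplied by one mole of each phase.
    A_eq = np.array([[2.0, 1.0, 1.0, 0.0],   # MgO
                     [1.0, 1.0, 0.0, 1.0]])  # SiO2
    b_eq = np.array([1.25, 1.00])            # pyrolite-like bulk composition

    # Minimize total Gibbs energy subject to the bulk composition constraint.
    res = linprog(c=g, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 4)
    for name, n in zip(phases, res.x):
        if n > 1e-9:
            print(f"{name}: {n:.3f} mol")

With these placeholder energies, the program selects a forsterite-plus-enstatite assemblage. The real package repeats such a minimization across pressure, temperature, and composition, including solution phases whose energies vary with their own composition.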

Citation: Chust, T. C., G. Steinle-Neumann, D. Dolejs, B. S. Schuberth, and H. P. Bunge (2017), MMA-EoS: A Computational Framework for Mineralogical Thermodynamics, Journal of Geophysical Research: Solid Earth, 122, https://doi.org/10.1002/2017JB014501.

—Michael Walter, Editor, JGR: Solid Earth

Connecting Scientific Data and Real-World Samples

Tue, 01/16/2018 - 13:02

Physical samples play an important role in Earth and environmental sciences. Data derived from samples are the basis of the interpretations published in the scientific literature. The vast collections of samples, data, and publications amassed in the past are a trove of scientific information, but their volume and variety make systematic interpretation challenging.

“Semantic Web” and “linked data” strategies have been proposed to support linking samples, data, interpretations, and reports. But are those the right tools to use at the scale of large specimen collections? There are also conceptual challenges: What does a sample represent? What do data from samples represent? Can a uniform approach to data representation be applied when different disciplines and projects have very different approaches to sampling itself? These approaches range from collecting small numbers of key specimens to large numbers for representative statistics.

A group of Earth and environmental science informaticians met to tackle these questions at a symposium last summer. More than 70 participants (including 18 non-Australian and 20 early-career researchers) from the solid Earth sciences, marine science, oceanography, ecosystems, biodiversity, soil science, and remote sensing met to discuss these issues.

The week kicked off with a field trip to the rock store at Geoscience Australia and continued on to Australia’s National Botanic Garden, the National Herbarium, and the National Insect Collection to view examples of the problem at hand. The realities of large collections include practical concerns around identifiers and metadata that put the linked data theory to the test.

The formal meeting started with attendees from specific science disciplines explaining their motivations for linking samples and data. Technical sessions then focused on Web linking and identifiers, emerging semantic tools, data delivery services, and data publication. Standard approaches are necessary to be able to easily move from, for example, a figure in a paper to the underlying data in a repository to an unambiguous representation of the sample on which the observations were made.

A conference field trip to a sample archive confronts this informatician with the realities of large sample collections. Credit: Simon J. D. Cox

Attendees concurred that relationships between samples, parts of samples, and sampling artifacts such as drill holes and cores must also be recorded to allow the resulting observations to be related back to the world. They noted that incentives for the adoption of common standards vary between disciplines and between sectors (e.g., researchers versus agencies). Disciplines that rely on shared platforms, such as marine science and oceanography, tend to embrace standardization, but others that would benefit, like long-term ecosystem studies, may be tied to competitive identification cultures (e.g., taxonomy) that challenge standardization.

Formal presentations took less than half of the time at the symposium. The remainder of the time was spent on “unconference” sessions. Some of these tackled topics in response to the presentations earlier that day.

For example, one session explored whether the concept of “sample” is actually shared across the different communities. Participants devised a cross-disciplinary definition that encompasses material samples or specimens, along with sampling stations and statistical samples from populations. This definition can be used as an anchor for data, concepts, and interpretations via hyperlinks embedded in reports, data sets, and publications.
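
What such an anchor might look like in practice can be sketched with a minimal linked-data record. Every identifier, property name, and URL below is a hypothetical placeholder; settling on shared vocabularies for samples was exactly what the meeting set out to discuss.

    import json

    # A hypothetical linked-data record for one physical sample. The IGSN-style
    # identifier, property names, and URLs are illustrative placeholders only.
    sample = {
        "@id": "https://igsn.example.org/XYZ123",          # the sample itself
        "@type": "MaterialSample",
        "description": "Sediment core section, continental margin",
        "subSampleOf": "https://igsn.example.org/XYZ100",  # parent core
        "producedData": ["https://repo.example.org/dataset/456"],
        "citedBy": ["https://doi.example.org/paper-789"],
    }
    print(json.dumps(sample, indent=2))

A reader of a paper could then follow "producedData" to the underlying measurements in a repository, and the sample identifier itself to an unambiguous description of the specimen on which the observations were made.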

An “anticonference” session allowed participants to describe project failures, providing complementary insights to the usual boasts about successes. The meeting closed with a session in the Up-Goer Five Challenge format, consisting of four science presentations using only the 1,000 most used words in English. This feat requires considerable creativity when the topic is ontologies, diamond exploration, or remote sensing of polar ice.

The meeting’s theme of “linking” was achieved through bringing multiple disciplines together and improving our understanding of how the theory of linked data can work in practice for environmental data and samples.

—Simon Cox (email: simon.cox@csiro.au; @dr_shorthair), CSIRO Land and Water, Clayton, Vic, Australia; Jens Klump (@snet_jklump), CSIRO Minerals, Kensington, WA, Australia; and Kerstin Lehnert, Lamont-Doherty Earth Observatory, Palisades, N.Y.

Michael Strasser Receives 2017 Asahiko Taira International Scientific Ocean Drilling Research Prize

Tue, 01/16/2018 - 13:00
Citation

Michael Strasser

Michael “Michi” Strasser is a key science driver for increasing our understanding of submarine mass movements through scientific ocean drilling. He has enthusiastically conducted research on mass transport deposits induced by historic mega earthquakes in the Nankai Trough and also by the 2011 Japan Trench mega earthquake and tsunami. His achievements have significantly contributed to our understanding of the causes and mechanisms of such sediment deformation and its tectonic settings. Importantly, these scientific achievements are also highly relevant to human society in terms of natural geohazards.

He began his research during his Ph.D. with the study of Swiss lake sediments, proposing a novel method to reconstruct magnitudes and source areas of prehistoric earthquakes. By combining sedimentology, exploration geophysics, and geotechnical methods on seismic slope stability, he quantified the prehistoric earthquake intensities recorded by subaquatic sediment failure. In 2007–2008, he participated in the Nankai Trough Seismogenic Zone Experiment sailing on the D/V Chikyu as a member of the scientific team during Integrated Ocean Drilling Program (IODP) Expedition (Exp.) 316. As a shipboard sedimentologist, he clarified the origin and evolution of a tsunamigenic thrust system based on slope failure sediments. In 2010, he assumed a leadership role in proposing the Nankai Trough Submarine Landslide History (NanTroSLIDE) project, again using the D/V Chikyu, and served as a co–chief scientist during IODP Exp. 333. One of the most fascinating scientific achievements resulting from IODP Exp. 333 was his 2011 paper, which presents several novel aspects of a submarine landslide study combining the use of X-ray computed tomography and 3-D seismic interpretations of the targeted area.

In 2011, he established his own lab at the Swiss Federal Institute of Technology Zurich and systematically pursued a conceptual research scheme to study earthquake-triggered subaquatic landslides and sediment stability along subduction margins. Major scientific achievements emanating from these projects include important discoveries of transient geochemical signals in the slump deposit that constrained the triggering of the slump associated with the 2011 Japan Trench mega earthquake and the history of methane release from hydrate dissociation induced by recent offshore earthquakes. Michi’s research has expanded further to include trans- and interdisciplinary directions to integrate both observational and theoretical processes. His interdisciplinary research achievements have broadened to include the impacts of active margin tectonics on the deep carbon cycle and biosphere and the integration of numerical modeling using IODP data. Since 2010, he has been serving as a leader of the international scientific community, for example, as cochair of the United Nations Educational, Scientific and Cultural Organization’s (UNESCO) International Geoscience Programme IGCP 585 and 640 and as subchair of the Proposal Evaluation Panel of IODP.

As a recipient of the Asahiko Taira International Scientific Ocean Drilling Research Prize, Michael Strasser is honored for his outstanding contributions to the investigation of submarine mass movements using multidisciplinary approaches through scientific ocean drilling.

—Yasuhiro Yamada, Center for Ocean Drilling Science, Japan Agency for Marine-Earth Science and Technology, Yokohama

Response

I feel deeply honored to receive the Taira Prize. I thank AGU, the Japan Geoscience Union (JpGU), and IODP for establishing this prestigious prize and express my supreme gratitude to Yasuhiro Yamada for his gracious citation.

The enthusiastic lectures by Judy McKenzie, Gretchen Bernasconi, and Gerald Haug triggered my fascination for studying Earth’s structure and history through scientific ocean drilling. I cannot overemphasize the encouragement and support I received from them to apply for the ODP student trainee program in 2002. I had the good fortune to join the JOIDES Resolution with fantastic international colleagues during Leg 205 to study subduction zone processes offshore Costa Rica. I am thankful to co–chief scientists Julie Morris and Heinrich Villinger and staff scientist Adam Klaus, who nurtured my scientific growth from a student trainee to a shipboard sedimentologist.

After this cruise, I did my Ph.D. project on lakes with Flavio Anselmetti, who taught me how to conduct my own little IODP-style project in lakes as model oceans and introduced me to the fascinating research of subaquatic mass movements and paleoseismology. Thereafter, I had the great opportunity to be involved in the IODP Nankai Trough Seismogenic Zone Experiment (NanTroSEIZE), to get exposed to the tremendous technological opportunities of Chikyu, and to establish exciting interdisciplinary collaboration with many NanTroSEIZE scientists. I would particularly like to thank Greg Moore, Achim Kopf, Mike Underwood, and Gaku Kimura in addition to the NanTroSEIZE chief scientists Harold Tobin and Masa Kinoshita and all co–chief scientists of Expeditions 316, 333, and 338, who were most influential on my research development. They encouraged and supported me in writing my first drilling proposal to study submarine mass movement, which was implemented during Expeditions 333 and 338. Similarly, I am deeply thankful for the great momentum created by my colleagues within the UNESCO IGCP 585 and 640 projects, in particular, Angelo Camerlenghi and Roger Urgeles, to foster submarine landslide research within IODP. On behalf of all colleagues and friends not mentioned here within the bigger “IODP family,” I also thank Dick Kroon as past chair of the Science Evaluation Board, the panel membership of which provided me with yet another highly rewarding experience in learning how outstanding new research proposals are emerging. I acknowledge my host institutions, ETH Zürich, MARUM Bremen, and the University of Innsbruck, for all their support, and also my students for conducting their research projects with me. Finally, I thank my wife and family for all their incredible support.

—Michael Strasser, University of Innsbruck, Innsbruck, Austria

Tests Indicate Which Edible Plants Could Thrive on Mars

Fri, 01/12/2018 - 15:50

What do kale, carrots, lettuce, sweet potatoes, onions, dandelions, and hops have in common? They could all potentially be grown in Martian soil by future colonists, according to a recent project by a class of astrobiology students and their professor. In a new addition to the class, the students tried growing a variety of vegetables in simulated Martian soil to discover which edible species may be prospects for future colonists.

“The project combined my research—astrobiology and Mars—and my hobby of growing things,” said Edward Guinan, a professor of astronomy and astrophysics at Villanova University in Villanova, Pa. Guinan developed the Red Thumbs Mars Garden Project for his annual undergraduate-level astrobiology class and supervised the experiments, which concluded last month. The students, Guinan said, were very enthusiastic about growing their own Martian vegetables.

The undergraduate researchers attempted to raise more than a dozen vegetables and herbs in a Mars-like soil with Mars-level light conditions. They found that kale, sweet potatoes, certain lettuces, and, surprisingly, hops grew very easily, tasting no different than their terrestrial counterparts. Other foods, including regular russet potatoes—the famous staple of the stranded astronaut in the 2015 film The Martian—required special soil or light treatments.

Guinan plans to repeat these experiments with future astrobiology classes and incorporate more rigorous scientific testing of the results. He presented the results this morning at the 231st meeting of the American Astronomical Society in National Harbor, Md.

Soil of a Different Color

The Red Thumbs Mars Garden Project was a new addition to his regular astrobiology course, Guinan explained. He was inspired by NASA’s Vegetable Production System (Veggie) on the International Space Station and wanted to bring those types of experiments to his students.

Guinan (right) and two of his students (left) tend some of the plants grown in a brick-red Martian soil simulant, visible in the frontmost pot (holding a pea plant). Some of their plants grew well in a Mars-like environment, whereas others required different light or temperature conditions that weren’t possible with the class’s single greenhouse setup. Credit: Villanova University

Guinan and his class grew the plants in a greenhouse in a commercially available Mars soil simulant similar to one developed by NASA and the Jet Propulsion Laboratory in Pasadena, Calif. The soil is mostly an iron-rich basalt with some additional reagents to better approximate the composition of Mars’s regolith as measured by NASA’s Curiosity rover and other instruments.

“The major differences are that Mars soil has about double the iron as Earth soil, mainly iron oxides, and Earth soil is more organic,” Guinan said.

The soil used in the trial is approximately 93% similar to Martian regolith, with the main differences being the absence of some poisonous perchlorates present on Mars and the class’s addition of inorganic fertilizers to aid plant development. Mars’s soil lacks the living organisms in Earth’s soil that help plant life flourish, Guinan said, so Mars farmers would need to augment the soil with biologically rich materials like compost waste.

The students grew the plants in pots under Mars-like light conditions—about 44% the light level on Earth. They then compared their plants to the same plant varieties grown in pots in regular potting soil.

Although the light intensity was Mars-like, the atmosphere was Earth-like: Plants on Mars would need to be grown in a greenhouse with an Earth-like atmosphere, Guinan said, because they would struggle to survive in Mars’s thin, cold, and dusty atmosphere. The need for indoor cultivation actually offers a benefit, he said, because the plants’ respiration could become part of the atmospheric recycling of a colony.

Even with the right, nonpoisonous composition, the students ran into some issues with the soil. Mars’s regolith, a very fine, claylike powder, dried out very quickly if not constantly watered. Regular watering solved the problem, although Guinan estimated that Martian greenhouses would need to maintain a constant 50%–60% humidity to keep the soil moist.

The soil also packs too tightly to let roots or subsurface vegetables grow, they found. The class solved the density problem by aerating the soil with shredded cardboard or vermiculite to give the roots and veggies room to grow. Cardboard was the better choice, Guinan explained, because it might already be part of the shipping material people would take to Mars, so colonists wouldn’t need to import extra supplies.

A Variety of Martian Produce

With soil aeration and moisture levels accounted for, Guinan’s students found that each of the plants they tested grew moderately well. However, sweet potatoes, carrots, onions, kale, dandelions, basil, garlic, and hops were particularly robust crops under Martian conditions. The greenhouse was too hot for peas and spinach, Guinan explained, or they probably would have survived, too.

Some of the vegetables, herbs, and salad greens grown in a Martian soil simulant by Villanova undergraduate students. In this corner of the greenhouse, hops flourish in the frontmost tray, kale and lettuces fill the left tray, and dandelion bins occupy planters in the center rear of the image. Credit: Villanova University

“Of course, the students also picked potatoes because of The Martian,” Guinan said, “but the soil was too dense at first and the potatoes would not grow in it—they were squeezed. Once we added about one-third of some filler into the soil to give the potatoes room to breathe, they grew very well.”

Some business majors in the class opted to grow hops, a beer-brewing ingredient, and toyed with ways to market “Mars Beer,” Guinan explained. He noted, however, that some enterprising brewers beat them to the punch with their own hops and sorghum grown in simulated Martian soil prior to the Villanova tests.

Although none of the produce raised in Mars simulant soil tasted noticeably different from the experimental control crops, Guinan expressed concern that some of the typically iron-rich leafy vegetables, like kale and spinach, might take up excessive iron from the Martian soil. Too much iron in food, he explained, can cause indigestion or even food poisoning. Guinan plans to have future classes test the iron content of the Mars-grown salad greens to see if the soil’s iron enrichment is reflected in the leaves. If so, that would be another potential concern for future Mars farmers.

Although NASA may be growing veggies in space, Guinan said that his class’s experiments add variety. “Once we treated the Martian soil correctly, pretty much everything grew well,” he said.

“The farther and longer humans go away from Earth, the greater the need to be able to grow plants for food, atmosphere recycling, and psychological benefits,” said Gioia Massa, a payload scientist for Veggie at NASA’s Kennedy Space Center in Cape Canaveral, Fla. “I think that plant systems will become important components of any long-duration exploration scenario,” Massa added.

Future Harvests

This first set of Villanova vegetable tests was just the start, according to Guinan. In his opinion, the students chose their vegetables on the basis of what they liked to eat, rather than what would be the most nutritious or valuable for potential Mars colonists. When he repeats this project with future classes of astrobiology students, he said, he plans to have the students test vegetables, herbs, and possibly fruit that would be more likely to be selected by colonists.

Now that the Red Thumbs project has proven successful, Guinan has received a more dedicated greenhouse space to use for his class’s next vegetable patch. The new space, he explained, will give his class more control over the temperature, humidity, and light conditions for the plants and let them refine their experiments.

“This time, we’ll be growing from January through the summer, so we’ll have more time to evaluate what’s growing” and to also test slower-growing plants, Guinan said.

He also recommends this sort of project to other astrobiology teachers looking for a way to spruce up their courses. “It’s easy to set up, it worked well, and the students loved it.”

—Kimberly M. S. Cartier (@AstroKimCartier), News Writing and Production Intern

How Drought Plays Out

Fri, 01/12/2018 - 13:19

Drought takes many forms. There’s meteorological drought, in which snow and rainfall are abnormally scarce. There’s hydrologic drought, in which rivers, lakes, and underground aquifers draw down or dry up; typically, a hydrologic drought is declared if water levels drop below the 25th percentile in a given region. And there’s agricultural and socioeconomic drought, which occurs when rain, surface water, and groundwater are not sufficient to sustain crops or other human activities. Now, a new study shows how different patterns of rainfall drive hydrologic drought when combined with human water use, such as groundwater pumping.

Meteorological droughts drive hydrologic droughts, but humans exacerbate water scarcity, too, by extracting water from rivers and underground aquifers faster than it can be replenished. To explore how humans and climate interact to impact drought conditions, Apurv et al. built a virtual watershed that included some fixed properties, such as silty loam soil, but allowed climate and human properties—like how much rain fell at different times of the year and how much water people removed from reservoirs and aquifers—to change.

The team used precipitation data from the Global Historical Climatology Network from 10 sites around the world, including Australia, the United States, China, India, and Poland, to mimic a range of different climates. Then they simulated a typical human response to changing water supplies: When rivers, lakes, and reservoirs were full, people drew on them for water to drink, bathe, water crops, and run businesses. When those sources dwindled, people started pumping water from the ground.
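
That human-response rule can be sketched with a toy monthly water balance. All the storages, demands, and rainfall statistics below are invented for illustration; they are not the parameters of Apurv et al.’s virtual watershed.

    import numpy as np

    rng = np.random.default_rng(3)

    reservoir, groundwater = 100.0, 500.0   # starting storage (arbitrary units)
    CAPACITY, DEMAND, LOW_MARK = 100.0, 12.0, 25.0

    for month in range(120):                # ten simulated years
        rain = rng.gamma(2.0, 5.0)          # variable monthly supply, mean 10
        reservoir = min(CAPACITY, reservoir + rain)
        if reservoir > LOW_MARK:
            reservoir -= min(DEMAND, reservoir)  # draw on surface water first
        else:
            groundwater -= DEMAND                # pump once surface water dwindles

    print(f"after 10 years: reservoir = {reservoir:.0f}, groundwater = {groundwater:.0f}")

Making the rainfall spikier, for example by raising its variance while holding the mean fixed, lets occasional deluges refill the reservoir and spare the aquifer, mirroring the California-like result described below; steadier rain that never quite refills the reservoir keeps the pumps running, as in the Zambia-like case.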

The researchers found that different annual patterns of rainfall played a major role in how people used water and how much water was available in aquifers long term. For example, in California, where the annual rain supply is highly variable, the model showed little groundwater depletion in response to drought. Although people pumped groundwater during dry spells, they stopped after sporadic, heavy rains filled the reservoirs, allowing the water table to recover. (The model is not intended to mimic real-life California, where overwhelming demand for water has indeed depleted aquifers, the authors note.)

By contrast, in a climate such as Mwinilunga, Zambia, where annual rainfall is less variable, the model predicted significant groundwater depletion in response to drought. When the reservoir dried up, people started pumping groundwater, which, in turn, depleted the flow in the rivers. No periodic deluges arrived with enough precipitation to refill the reservoirs, so people kept pumping, creating a positive feedback loop.

The results need to be validated with data from the modeled sites, including more accurate details about human responses. For example, intense, rising demand for water nearly always leads to groundwater depletion, regardless of the climate. But the study does shed light on how different precipitation patterns affect the human response to drought and could point to strategies for making dwindling resources last a little longer. (Water Resources Research, https://doi.org/10.1002/2017WR021445, 2017)

—Emily Underwood, Freelance Writer
