Earth & Space Science News

Lab Tests Probe the Secrets of Steep and Rocky Mountain Streams

Fri, 04/21/2017 - 12:08

Water flowing over a mountainous landscape can be serene or tumultuous, depending on a variety of physical factors: how much sediment it is carrying, the angle of incline, the ruggedness of the terrain, and more. Understanding the mechanics—or, rather, hydraulics—of this flow is essential to preparing for floods, classifying and restoring natural habitats, and undertaking geologic and engineering projects.

Over the past century, the hydraulics of gently sloping riverbeds have been more thoroughly studied than those with sharp inclines, despite the fact that steep, rocky channels are a key part of mountain drainage systems. Although observational studies have revealed some general differences between the two, experimental tests to better explain these differences have proved difficult.

For the most part, this difficulty is because the ideal test conditions so rarely occur in nature. The prime location to conduct further research would be a steep stream devoid of particle clusters (gravel and cobbles that clump together at the bottom of a river), large boulders, and bed forms (geologic features formed by water interacting with a riverbed), all of which complicate scientific analysis of a stream’s flow.

Similar studies conducted on bed form–free, gently sloping rivers over the past few decades have allowed scientists to clearly assess the role of bed forms in the momentum conservation, flow resistance, sediment transport, and velocity profile (a function of the water’s average speed at varying heights above the surface of the riverbed) for these types of streams.
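As an illustration of what a velocity profile describes, the classic logarithmic "law of the wall" gives mean flow speed as a function of height above the bed. This is a generic textbook relation, not the model developed in the study; the shear velocity `u_star` and roughness height `z0` below are made-up illustrative values.

```python
import math

def log_law_velocity(z, u_star=0.1, z0=0.005, kappa=0.41):
    """Textbook logarithmic velocity profile: u(z) = (u*/kappa) * ln(z/z0).

    z: height above the bed (m); u_star: shear velocity (m/s);
    z0: roughness height (m); kappa: von Karman constant.
    The parameter values here are illustrative, not from the study.
    """
    return (u_star / kappa) * math.log(z / z0)

# Mean speed increases with height above the rough bed:
profile = [(z, log_law_velocity(z)) for z in (0.01, 0.05, 0.2)]
```

In this idealized relation, velocity is zero at the roughness height and grows logarithmically above it, which is why rougher beds (larger `z0`) imply slower flow at a given depth.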

Sadly, few natural streams exist that are both steep and free of bed forms.

To get around this obstacle, Lamb et al. designed a simulated environment that would allow them to explore the hydraulics of water flow over a rough, planar (flat-layered) riverbed. Their design allowed them to study a wide range of conditions that could occur in steep mountain streams.

Their test environment, called a flume, is encased in glass and measures about 15 meters by 1 meter. The researchers can adjust the water level as well as the slope of the “riverbed” by tilting it back and forth. A motorized cart travels along railroad-type tracks, transporting various instruments used to measure the velocity of the flow.

Stones glued to the floor simulate the natural formations typically found in a rough-bedded mountain stream (a laser scanner precisely maps the bed’s topography), but simulated bed forms are, notably, absent. The video below shows the flume in action, discharging 510 liters per second over a riverbed at an 8° slope.

In all, the researchers conducted 58 experiments, including inclines much steeper than those that have been studied in the past. By comparing the results of these experiments to a flow velocity model they developed, they found the velocities of steeply and gradually sloping streams to be surprisingly similar, given how different these types of streams appear to be at the surface (steep streams are generally splashier and bubblier).

The team was also somewhat surprised to find that two aspects of the steep, simulated riverbeds (which lacked bed forms) closely matched observations from natural mountain streams (which have them). First, flow resistance tended to increase as the roughness of the riverbed increased. Second, their model implied that steeper slopes should carry smaller loads of sediment. Given these results, the researchers wonder whether bed forms might not be causing the observed variations after all or, at least, not to the extent previously believed. (Water Resources Research, https://doi.org/10.1002/2016WR019579, 2017)

—Sarah Witman, Freelance Writer

Could Stratospheric Ozone Depletion Make Hadley Cells Expand?

Fri, 04/21/2017 - 12:06

In 1735, meteorologist George Hadley shook up his field by proposing a novel model of global atmospheric circulation, since named the Hadley cell, in which warm air rises at the equator and heads toward the poles before cooling and sinking back toward Earth’s surface at midlatitudes. Over time, especially within the past few decades, these large-scale circulation patterns have been visibly widening for reasons unknown to scientists.

The southernmost edges of the Hadley cell in the Southern Hemisphere are defined by the set of points where sea level pressure is highest. Using a technique called optimal fingerprinting analysis, Kim et al. compared a series of model simulations with observations to examine long-term changes in these edges during the austral summers (December through February) of 1979–2009. They found that the southernmost edges of the Hadley cell over the Atlantic and Indian oceans expanded farther poleward during this period.
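At its core, optimal fingerprinting treats observations as a scaled combination of model-simulated response patterns ("fingerprints") and estimates the scaling factors by regression. The sketch below is a deliberately simplified, non-optimal version using ordinary least squares on synthetic data; it only illustrates the idea, not the actual method or data of Kim et al.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "fingerprint": a model-simulated response pattern (e.g., a
# poleward-shift signal sampled at 50 grid points). Purely illustrative.
fingerprint = np.sin(np.linspace(0, np.pi, 50))

# Synthetic "observations": the fingerprint scaled by an amplitude, plus
# noise standing in for internal climate variability.
true_amplitude = 1.3
obs = true_amplitude * fingerprint + rng.normal(0, 0.2, size=50)

# Ordinary least squares estimate of beta in: obs ~ beta * fingerprint.
beta = float(fingerprint @ obs / (fingerprint @ fingerprint))
# A beta estimate well separated from zero "detects" the forced signal
# in the observations; beta near 1 means the model amplitude matches.
```

The "optimal" in optimal fingerprinting refers to weighting this regression by the inverse covariance of internal variability, which the plain least-squares sketch above omits.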

While examining causes of these trends, the researchers detected within the models strong signals of anthropogenic forcing, or human activities such as industry and agriculture, which have ultimately led to increases in greenhouse gases. Specifically, the models showed a link between the expansion of Hadley cells and the depletion of stratospheric ozone. Ozone depletion can lead to “holes” in the ozone layer, like the one detected over Antarctica in 1985. Such depletion would induce cooler conditions over Antarctica, shifting the lower-latitude circulation system, including Hadley cells, poleward.

This newfound knowledge provides an important link in the chain for scientists seeking to understand Earth’s evolving climate. The authors note that the correlation they uncovered needs to be fleshed out with causal mechanisms that pinpoint exactly how ozone depletion leads to Hadley cell expansion and what the future holds for Hadley cells if ozone does or does not recover. (Geophysical Research Letters, https://doi.org/10.1002/2016GL072353, 2017)

—Sarah Witman, Freelance Writer

Earth and Space Science for the Benefit of Humanity

Thu, 04/20/2017 - 14:05

The March for Science is appropriately being held this weekend on Earth Day 2017. The broad theme for the March is “Science is Essential,” and this is applicable also to Earth Day. It may seem that with our growing cities, air conditioning, modern infrastructure, and energy-enabled amenities, we can be more isolated from our environment and less dependent on Earth than our ancestors, but the opposite is true: We are more intimately connected than ever before. Many aspects of modern society depend critically on rich real-time data and sophisticated models about all aspects of our planet and its space environment. Growing populations and development are taxing natural resources and increasingly altering Earth’s land, ecosystems, atmosphere, ice sheets, rivers, and oceans on a global scale. Globalization makes our societies, including the most developed ones, more sensitive to disruptions. These interdependencies make research in the Earth and space sciences critically important for society.

A collection of essays and other recent Special Collections across the American Geophysical Union journals illustrates, celebrates, and illuminates these deep connections. Three broad and generally underappreciated themes emerge across this collection. These themes have important implications in the context of recent U.S. and international political developments.

The first theme is that the notion that “basic” or “curiosity-driven” research is distinct from “applied” research is increasingly an anachronism. Most of the cutting-edge research being conducted by Earth and space scientists has direct relevance to society. This relevance is not new but is more extensive and broadly connected than in the past. Geologic research has long been a key contributor to energy and mineral exploration. But research motivated by curiosity about how the Earth works has also led to important resource discoveries. For example, deep ocean drilling to improve understanding of the ocean crust and sediments in the Gulf of Mexico in the late 1960s led to the discovery of vast oil resources.

Today, the connections are broader. Businesses, societies, and economies operating from local to global scales are critically dependent on real-time data about our planet, increasingly at very fine spatial and temporal scales. In turn, these data feed improved models that both address new research questions and provide operational data and forecasts for societal decisions, from governments to individual farmers and shippers. Examples abound. Detailed real-time mapping of ocean currents helps us understand how the oceans mix, directly helping companies save fuel in ocean transportation, trade, fishing, and recreation. Understanding subtle changes in Earth’s rotation tells us about Earth’s core and history but also improves GPS signals on which we increasingly rely. A huge amount of global data of great variety, including from citizen science as well as research into numerical methods and statistics, is necessary to provide ever more accurate weather and water-supply forecasts, yielding major economic benefits and protecting people, crops, and ecosystems. Observations of the sun and of our near-space environment are used to protect our electrical grids, satellites, and airline passengers as well as to improve the fidelity of GPS signals. Testing of sensors on other planets has improved or led to new satellites that provide key data on Earth. And Earth and space science information provides critical insights for addressing many health concerns, from air pollution to human and agricultural pandemics.

The second theme is that these current capabilities have developed through, and are critically dependent on, international collaborations, cooperation, and funding. These collaborations include scientists, of course, but they also involve governments and businesses. Global data for a global economy require global research and data-collection efforts, which require global collaboration and cost-sharing. In addition, it is clear that understanding of local weather requires rich global data; snowfall in the Sierra Nevada is influenced by dust entrained in the atmosphere from Asia and Africa. Understanding the course of one volcanic eruption or earthquake improves understanding of the next one elsewhere in the world. The costs of research and infrastructure, including satellites, have increasingly been shared worldwide. The U.S. economy, like that of every country, greatly benefits from this global research collaboration and shared financing for Earth observations. These collaborations are needed to maintain and expand our global observing effort and the economic and security benefits that it enables.

The third theme, already introduced, is the inclusion of rich data from monitoring all parts of Earth’s processes and its environments (present and past) into sophisticated models that are used both to understand Earth’s processes and to inform critical societal decisions. This understanding is regularly included in engineering models used to mitigate hazards or design better structures. Likewise, such models provide weather forecasts, help predict water supply and coastal erosion, prepare cities and regions for natural hazards and climate change, and help coordinate responses to disasters in real time. Improvements to these models depend on global data, including data whose collection was originally motivated by scientific research.

Although there has been great progress over several decades in using research in Earth and space science for the benefit of humanity, the collection of essays also highlights many areas where further progress is both possible and needed. These include new applications, constraining uncertainty, and improving models and forecasts. The authors of these essays also discuss how Earth and space scientists can better communicate both what we know and don’t know and where further improvements are within reach. Earth is complex, and the desire for more effective understanding and communication is strong.

Two critical threats have emerged to the societal benefits provided by Earth and space science. The first is increasing nationalistic tendencies worldwide that threaten the international collaborations that have facilitated the development of global research, funding, and data collection. Our understanding of Earth processes and current global capabilities – and the economic and societal benefits – have developed directly because scientists and students have been allowed to interact internationally, conduct research worldwide, share global observation platforms, secure temporary and permanent positions in other countries, and attend international conferences. Restricting this exchange will directly harm existing capabilities and limit future scientific advances. Because this international cooperation is critical to understanding the Earth as a system, the Earth and space sciences are particularly vulnerable to such restrictions.

The second threat is proposed funding cuts in major science agencies in the United States and elsewhere. These cuts will do the most harm in two critical areas: collecting and interpreting important data, and training and engaging new scientists. The infrastructure supporting scientific data, especially relating to our planet, is fragile and needs new support for long-term preservation and connectivity, as well as broader availability and sharing of data given its critical economic and scientific role. We need better and more systematic data about our impact on the environment, not less. Instead, U.S. agencies are facing the prospect of substantial cuts, spurring efforts to “save the data.” As Harold Varmus noted in commenting on the proposed cuts to the NIH budget, the cuts are likely to fall most heavily on the youngest aspiring scientists. The proposed cuts send a message that these jobs are not valued, and that the resources needed to support both the long-term collection of data and the training of the next generation of scientists are not guaranteed.

Earth Day and the March for Science both celebrate the increasingly valuable benefits of Earth and space science research for society. They are also opportunities to appreciate how these impacts are rooted in a very deep understanding of our planet and its past, present, and future environments. This connection between science and society can and should be made even stronger, for even greater benefit to humanity.

—Brooks Hanson, Director, Publications, AGU; email: bhanson@agu.org; Jenny Lunn, Assistant Director, Publications, AGU; Ben van der Pluijm, Editor-in-Chief, Earth’s Future; John Orcutt, Editor-in-Chief, Earth and Space Science; Rita Colwell, Editor-in-Chief, GeoHealth; Susan Trumbore, Editor-in-Chief, Global Biogeochemical Cycles; Thorsten W. Becker, Editor-in-Chief, G-Cubed; Noah Diffenbaugh, Editor-in-Chief, Geophysical Research Letters; Robert Pincus, Editor-in-Chief, JAMES; Mike Liemohn, Editor-in-Chief, JGR: Space Physics; Uri ten Brink, Editor-in-Chief, JGR: Solid Earth; Peter Brewer, Editor-in-Chief, JGR: Oceans; Minghua Zhang, Editor-in-Chief, JGR: Atmospheres; Steven A. Hauck II, Editor-in-Chief, JGR: Planets; Bryn Hubbard, Editor-in-Chief, JGR: Earth Surface; Miguel Goni, Editor-in-Chief, JGR: Biogeosciences; Ellen Thomas, Editor-in-Chief, Paleoceanography; Philip Wilkinson, Editor-in-Chief, Radio Science; Mark Moldwin, Editor-in-Chief, Reviews of Geophysics; Delores J. Knipp, Editor-in-Chief, Space Weather; John Geissman, Editor-in-Chief, Tectonics; and Martyn Clark, Editor-in-Chief, Water Resources Research

New Study Ranks Asteroid Effects from Least to Most Destructive

Thu, 04/20/2017 - 13:58

If an asteroid struck Earth, which of its effects—scorching heat, flying debris, towering tsunamis—would claim the most lives? A new study has the answer: violent winds and shock waves are the most dangerous effects produced by Earth-impacting asteroids.

The study explored seven effects associated with asteroid impacts—heat, pressure shock waves, flying debris, tsunamis, wind blasts, seismic shaking, and cratering—and estimated their lethality for asteroids of varying sizes. The researchers then ranked the effects from most to least deadly, based on how many lives would be lost to each.

Overall, wind blasts and shock waves were likely to claim the most casualties, according to the study. In experimental scenarios, these two effects accounted for more than 60 percent of lives lost. Shock waves arise from a spike in atmospheric pressure and can rupture internal organs, while wind blasts carry enough power to hurl human bodies and flatten forests.

“This is the first study that looks at all seven impact effects generated by hazardous asteroids and estimates which are, in terms of human loss, most severe,” said Clemens Rumpf, a senior research assistant at the University of Southampton in the United Kingdom, and lead author of the new study published in Geophysical Research Letters, a journal of the American Geophysical Union.

Rumpf said his findings, which he plans to present at the 2017 International Academy of Astronautics Planetary Defense Conference in Tokyo, Japan, could help hazard mitigation groups better prepare for asteroid threats because it details which impact effects are most dominant, which are less severe and where resources should be allocated.

Though studies like his are necessary to reduce harm, deadly asteroid impacts are still rare, Rumpf said. Earth is struck by an asteroid 60 meters (more than 260 feet) wide approximately once every 1,500 years, whereas an asteroid 400 meters (more than 1,300 feet) across is likely to strike the planet only once every 100,000 years, according to Rumpf.
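Those recurrence intervals can be turned into rough per-century odds with simple arithmetic, assuming one independent annual impact chance of 1/recurrence. This is a back-of-the-envelope simplification, not a calculation from the study:

```python
def prob_at_least_one(recurrence_years, window_years):
    """Chance of at least one impact in a time window, assuming an
    independent annual probability of 1/recurrence_years (a simplifying
    assumption for illustration)."""
    annual = 1.0 / recurrence_years
    return 1.0 - (1.0 - annual) ** window_years

# Using the article's figures: ~60 m asteroids every ~1,500 years,
# ~400 m asteroids every ~100,000 years.
p_60m_century = prob_at_least_one(1_500, 100)     # about 6.5%
p_400m_century = prob_at_least_one(100_000, 100)  # about 0.1%
```

The asymmetry is the point of Rumpf's remark: even the more frequent, smaller impactors carry only a few-percent chance per century, while the rare large ones are far less likely still but far more consequential.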

“The likelihood of an asteroid impact is really low,” said Rumpf. “But the consequences can be unimaginable.”

Modeling Asteroid Effects

Rumpf and his colleagues used models to pepper the globe with 50,000 artificial asteroids ranging from 15 to 400 meters (50 to 1,312 feet) across—the diameter range of asteroids that most frequently strike the Earth. The researchers then estimated how many lives would be lost to each of the seven effects.

Land-based impacts were, on average, an order of magnitude more dangerous than asteroids that landed in oceans.

Large, ocean-impacting asteroids could generate enough power to trigger a tsunami, but the wave’s energy would likely dissipate as it traveled and eventually break when it met a continental shelf. Even if a tsunami were to reach coastal communities, far fewer people would die than if the same asteroid struck land, Rumpf said. Overall, tsunamis accounted for 20 percent of lives lost, according to the study.

The heat generated by an asteroid accounted for nearly 30 percent of lives lost, according to the study. Affected populations could likely avoid harm by hiding in basements and other underground structures, Rumpf said.

Seismic shaking was of least concern, accounting for only 0.17 percent of casualties, according to the study. Cratering and airborne debris were similarly minor, each accounting for less than 1 percent of deaths.

Only asteroids that spanned at least 18 meters (nearly 60 feet) in diameter were lethal. Many asteroids on the lower end of this spectrum disintegrate in Earth’s atmosphere before reaching the planet’s surface, but they strike more frequently than larger asteroids and generate enough heat and explosive energy to deal damage. For example, the meteor involved in the 2013 impact in Chelyabinsk, Russia, was 17 to 20 meters (roughly 55 to 60 feet) across and caused roughly 1,500 injuries, inflicting burns and temporary blindness on people nearby.

Understanding Risk

“This report is a reasonable step forward in trying to understand and come to grips with the hazards posed by asteroids and comet impactors,” said geophysicist Jay Melosh, a distinguished professor in the Department of Earth, Atmospheric and Planetary Sciences at Purdue University in West Lafayette, Indiana.

This chart shows reported fireball events for which geographic location data are provided. Each event’s calculated total impact energy is indicated by its relative size and by a color. Credit: NASA

Melosh, who wasn’t involved in the study, added that the findings “lead one to appreciate the role of air blasts in asteroid impacts as we saw in Chelyabinsk.” The majority of the injuries in the Chelyabinsk impact were caused by broken glass sent flying into the faces of unknowing locals peering through their windows after the meteor’s bright flash, he noted.

The study’s findings could help mitigate loss of human life, according to Rumpf. Small towns facing the impact of an asteroid 30 meters across (about 98 feet) may fare best by evacuating. However, an asteroid 200 meters wide (more than 650 feet) headed for a densely populated city poses a greater risk and could warrant a more involved response, he said.

“If only 10 people are affected, then maybe it’s better to evacuate the area,” Rumpf said. “But if 1,000,000 people are affected, it may be worthwhile to mount a deflection mission and push the asteroid out of the way.”

What to Expect from Cassini's Final Views of Titan

Thu, 04/20/2017 - 13:56

Since the Cassini spacecraft entered Saturn’s orbit in 2004 and dropped a probe onto its largest moon, Titan, scientists have been captivated. Titan’s icy surface is dotted with lakes and seas, its equator wrapped in a field of dunes. Its rainstorms are eerily Earth-like, and its atmosphere swells with prebiotic chemistry.

But in a few short months, Cassini will vaporize in Saturn’s atmosphere, and scientists will wave goodbye to studying Titan up close.

“It’s going to be a very emotional next several months,” said Elizabeth “Zibi” Turtle, a planetary scientist at Johns Hopkins University’s Applied Physics Laboratory (JHUAPL) in Laurel, Md. Turtle, along with about 60 other scientists inside and outside the Cassini mission, gathered at NASA’s Goddard campus in Greenbelt, Md., in early April for the fourth Titan Through Time workshop.

There, presenters covering Titan from its interior all the way to the top of its thick atmosphere reminded us that before Cassini’s September demise, there’s still plenty of fun in store.

On 22 April, for example, the spacecraft will sideswipe Titan and skim its ionosphere a little less than 1000 kilometers away from its surface. This flyby, designated T-126, will be Cassini’s last close trip to Titan. After 22 April, Cassini’s subsequent flybys of Titan will be from hundreds of thousands of kilometers away while it swings in and out of Saturn’s rings.

In the past 13 years, “Titan went from being a mystery, which is exciting, to being a frontier to explore,” Turtle said. With these last views of Titan—both near and far—scientists hope to see the bottom of its lakes, improve their maps of the north pole, and even spot some storm clouds.

A Strange Surface

Cassini didn’t give us our first glimpse of Titan. That came from the Voyager spacecraft, which passed by Saturn in 1980 and 1981. But Voyager couldn’t see down to Titan’s surface: Those views came only with Cassini and the short-lived Huygens probe.

Cassini obtained this radar image of the Shangri-La Sand Sea in 2016. The image contains hundreds of sand dunes that look to be moving from west to east. Dunes cover 20% of Titan’s surface and appear primarily around its equator. Credit: NASA/JPL-Caltech/ASI/Université Paris-Diderot

During Cassini’s fourth flyby in 2005, its radar instrument revealed wind-swept dunes wrapping around Titan’s equator. Dunes are exciting because they “can be an instantaneous marker for climate and wind,” said Jani Radebaugh, a planetary scientist at Brigham Young University in Provo, Utah. A dune’s shape can, on Earth at least, reveal which way the wind is blowing.

However, as with most things on Titan, even the discovery of dunes raised more questions. Currently, the sand looks like it’s moving in one direction, but climate models show the wind blowing in a different direction, Radebaugh said. And when observations and models don’t match up, scientists know that they should search for more clues.

Dunes aren’t the only unexpected feature dotting Titan’s cold landscape. Early in the mission, scientists also discovered dark patches of liquid: lakes and seas. Thanks to Cassini’s infrared spectrometer and other instruments, scientists know that these lakes are filled with liquid methane, ethane, other more complex hydrocarbons, and possibly nitrogen.

What’s more, scientists recently spotted waves on the surface of Punga Mare, a northern lake, which can tell them something about Titan’s winds and whether a future submarine exploration mission would splash or splat.

High Hopes for T-126

A mosaic of Titan’s north polar lakes and seas stitched together from Cassini’s radar images from 2004 to 2013. Scientists are hoping that the final close-up flyby, T-126, will help them understand features on Titan’s lake beds. Credit: NASA/JPL-Caltech/ASI/USGS

Thus far, however, the angle of Titan flybys hasn’t allowed the spacecraft to see the bottoms of Titan’s smaller lakes.

Scientists hope that T-126 will change that, said Marco Mastrogiuseppe, a telecommunications engineer at Sapienza University in Rome. During the last close flyby, Cassini scientists will aim its radar at the northern lakes to peek at their depths.

T-126 could also help illuminate the lakes’ origins, Mastrogiuseppe said. Could they form like sinkholes on Earth, where rain and groundwater dissolve rock from above and below? Or could there be a tectonic origin, perhaps involving rifts opening basins and liquid rushing in? Scientists also suspect there could be a subsurface network connecting the lakes and seas, but they aren’t yet sure.

Zooming Out to the Big Picture

Even Cassini’s subsequent far-off flybys, from hundreds of thousands of kilometers away, will help scientists better understand the lakes and seas, said Conor Nixon, a planetary scientist at NASA’s Goddard Space Flight Center and one of the original cofounders of the Titan Through Time workshops.

From up close, the radar can show scientists small patches in high resolution as the spacecraft zooms by, but it can’t get wide shots of the entire region. Imagine driving by a house at 100 kilometers per hour and snapping a picture. There isn’t much time to get a complete view. But driving by from 100 kilometers away, you’d have more time to snap multiple pictures, Nixon said.

Similarly, during the faraway flybys, Cassini will sail over Titan’s north pole and spend hours capturing radar images of the entire region, Nixon said. These images will allow scientists to improve their maps and watch for changes along the lakes’ and seas’ shorelines.

An Active Atmosphere

As a scientist who works with Cassini’s remote sensing instruments, Turtle actually prefers the faraway flybys. From farther away, Cassini’s infrared mapping instrument and high-resolution camera can capture a more complete profile of the atmosphere, Turtle said.

And Titan’s atmosphere is quite the mystery. Titan is the only large moon in the solar system swaddled in a thick atmosphere, and the Huygens probe revealed that it’s dominated by nitrogen, like Earth’s. Likewise, Titan is the only other body in the solar system with liquid on its surface. Plus, Titan boasts liquid cycling akin to Earth’s hydrologic cycle, although in Titan’s case, it’s primarily methane that gets evaporated, condenses in the atmosphere, and precipitates as rainstorms that erode and shape the surface.

But scientists have no idea how Titan’s atmosphere got there or what replenishes its nitrogen and its methane, another major constituent of the atmosphere. One particularly surprising find from Cassini was the upper atmosphere’s complex organic molecules, Turtle said. No one expected to see benzene rings or long, complex chains of hydrogen and carbon.

Another surprising find in Titan’s upper atmosphere was heavy ions, said Sarah Hörst, an atmospheric chemist at Johns Hopkins University in Baltimore, Md. Heavy ions are key ingredients to prebiotic chemistry, which means Titan’s atmosphere could hold clues to life-generating chemistry.

A Future Window into Titan’s Skies

In May, scientists will recruit an Earth-based system to help them observe Titan’s atmosphere. Nixon and his team have scheduled time to observe Titan using the Atacama Large Millimeter/submillimeter Array (ALMA) observatory in northern Chile’s Atacama Desert.

The May observation will match up with one of the closer of the distant Cassini flybys, Nixon said, and will allow scientists to look for an even wider range of molecules in Titan’s atmosphere. This is because some molecules can be viewed only in certain wavelengths, beyond the capabilities of Cassini’s instruments. Using ALMA will allow researchers to see molecules that might be invisible to Cassini.

This simultaneous observation will give scientists a benchmark set of data that will allow them to continue to observe Titan’s atmosphere decades into the future, Nixon said, while they look for more prebiotic signatures, like sulfur, or a molecule called vinyl cyanide that could form cell-like membranes in Titan’s liquid oceans and lakes.

A Portal to Data

Even after Cassini ends, scientists will still be digging for clues, said astronomer Trina Ray from NASA’s Jet Propulsion Laboratory (JPL) in Pasadena, Calif. Ray, along with her colleagues at JPL, has made it her mission to ensure that future scientists can use the mountains of data that Cassini has beamed to Earth.

Cassini scientists upload their raw data into an online database called the Planetary Data System, which scientists even outside the mission can use. But these data aren’t necessarily formatted in an intuitive way for those scientists, Ray said. So she cofounded a group that is puzzling out ways to help future scientists interpret Titan data specifically. She presented at the Titan Through Time workshop to solicit input from scientists studying Titan about how to aggregate all the data.

One of the ideas is to build a Cassini “master timeline,” Ray said, a narrative that could help guide future scientists through the mission. This timeline would include more than times, dates, and instrument information: It would include details about the intent of an activity. Why was Cassini’s camera pointing here; why was the infrared instrument pointed there?

Ray and her team have also considered a strategy that would incorporate Titan data into a ready-to-use platform like Mars Trek, an interactive, publicly available map that layers data from various Mars missions and their landing sites. Mars Trek users can toggle between layers, explore the different sites, and save and share what they’ve found with others. Ray imagines a similar map for Titan, where scientists or users could flip through layers of data from Cassini’s different instruments.

Mysteries Within Mysteries

In the subsequent seven flybys of Titan before the end of Cassini, Turtle and her team will be looking for clouds over the moon’s northern hemisphere. All the climate models predict that large storm clouds should form over Titan’s high northern latitudes as Titan enters its long summer. But so far, no clouds have appeared, another sign that the hunt for clues isn’t over.

Cassini’s narrow-angle camera spotted a wisp of clouds in Titan’s northern hemisphere in October 2016, but not the big storm clouds scientists are looking for. Scientists hope that these clouds will form in the next few months, before Cassini plunges into Saturn. Credit: NASA/JPL-Caltech/Space Science Institute/Univ. of Arizona

“Titan has really been teasing us with the clouds,” Turtle said.

Turtle may not glimpse the elusive clouds. And maybe T-126 won’t provide answers to long-standing questions about Titan’s lakes. The end of Cassini’s mission means no more sniffing the atmosphere with spectrometers, no more close-up images of meandering dunes, and no new views of its mysterious seas.

But the workshop ended optimistically, with scientists turning their focus to a future Titan mission. Perhaps a drone-like quadcopter could fly above Titan’s surface, researchers mused, taking data at multiple research sites. Or a submarine could swim through a sea.

And whatever new information comes to light will inevitably generate more questions.

“That’s the other thing that’s been really fun about [studying Titan]: mysteries within mysteries,” Turtle said.

—JoAnna Wendel (@JoAnnaScience), Staff Writer

Acquiring a Taste for Advocacy

Thu, 04/20/2017 - 13:54

On the Sunday before last year’s American Geophysical Union (AGU) Fall Meeting—about a month after the U.S. elections—I found myself in a windowless room, deep in the basement of the San Francisco Marriott Marquis. It was 8:00 a.m., and I was shaking off jet lag to attend a science communications workshop hosted by AGU’s Sharing Science program.

Communications, in this context, means talking about science to people who aren’t scientists. The workshop covered three topics: interacting with the general public, which I love; talking to journalists, which is what drew me; and interacting with politicians. I figured I’d zone out through that last bit.

But that isn’t quite how things turned out.

A Eureka Moment

Let’s back up a bit. Even when I was a postdoc in the Washington, D.C., area, almost 2 decades ago, I avoided politics like the plague.

This disengagement isn’t just unreasoned distaste. Some years ago, prompted by a request from a scientist I greatly admire to sign a policy statement I wholeheartedly agreed with, I thought long and hard about whether to engage in political discourse as a scientist.

In the end, after seeing a timely and terrific set of talks, I did not sign that statement. Instead, I settled on the idea that science can tell us what is true but not what to do—that policy might be grounded in truth but ultimately reflects values.

That principle served me well enough for a decade, even in the face of growing unease about increasingly counterfactual arguments against the increasingly deep understanding of climate change. But times have changed, and it’s become unnervingly normal to advance political agendas by denying the truths that science provides.

Our democratic system means that I expect power to be mutable. But I also expect the rules to stay more or less the same, and these days that’s not such a sure bet.

Sitting in that basement room, listening to the people who argue for our work to the people who use what we learn (and pay for our curiosity), I knew that even if I’m unwilling to advocate for particular policies in my role as a scientist, I am more than ready to advocate for what we do and to make the case for stability, intellectual freedom, and openness in how we do it.

A Visit to Congress

With this readiness to advocate, I took advantage of a trip to Washington earlier this year to spend a day of my own time visiting members of Congress.

The visits were arranged by AGU’s public affairs staff, the same people who run AGU’s Congressional Visits Days. In the weeks before the trip, the staff helped me craft a message, then refine a script with stories and concrete requests. They identified whom I could meet with and what committee memberships or legislative sponsorships would make the visit most relevant. They joined me on those visits, guiding me through a bewildering building and an even more bewildering social world.

The legislative side of the federal government is like another country: unfamiliar, sometimes uncomfortable, but easy enough to get the hang of, once I had dusted off my interview suit.

Capitol Hill runs on young people. Most staff are the age of graduate students, the really senior staff about as old as postdocs. Nonetheless, it quickly becomes apparent that these are smart people with knowledge and power.

Some social customs take getting used to: In everyday conversation, it would be rude to make a request without establishing a connection, but on the Hill “the ask” is the first thing on the table because this gives your hosts the chance to calibrate who you are and how to respond to you.

One-off stories have little value in scientific arguments, but they are gold in congressional offices. That’s partly because time is short (most visits last 30 minutes or less). But it’s also because anecdotes make facts personal—and people, after all, are far more important than ideas.

My biggest surprise, however, was learning firsthand how hard congressional staff work to find common ground. For my visit, AGU had arranged meetings with representatives from both political parties, guaranteeing that some of the people I met would have political viewpoints different than my own. And yet every single staff member went out of his or her way to hear what I had to say, respond thoughtfully, and identify one or more points on which our agendas aligned.

No Landscape Is Permanent

Of course, no single visit by a single scientist is going to change any legislator’s well-developed policy stance. Does that mean the visits are pointless or self-indulgent?

I don’t think so. On the most basic level, politicians respond to public pressure and opinion, and newly engaged, vocal, and organized communities can be valuable support or formidable opponents.

More fundamentally, politicians have to triangulate an enormous range of priorities. The fate of Earth science is rarely the most pressing or compelling. And yet they listen, and I’d like to think that positions might change.

Musing on the way home from my congressional visit, I couldn’t shake the image of a stream running over rocks. The landscape right now might seem so permanent, but then there’s the Grand Canyon…

—Robert Pincus (email: robert.pincus@colorado.edu), University of Colorado Boulder

La Niña Subtype May Have a Big Impact on Aerosols in China

Wed, 04/19/2017 - 12:23

Tiny particles known as aerosols pepper the atmosphere worldwide. Some aerosols are natural, including windblown dust and sea salt, whereas others consist of pollution produced by human activities, such as wood fires. Aerosols have complex effects and can influence climate, atmospheric visibility, environmental pollution, and human health.

On a regional scale, weather can affect aerosol concentrations, sometimes with major implications for air quality. This is of particular interest for certain regions in eastern China, where aerosols are a serious environmental problem. A new study by Feng et al. suggests that depending on its strength, a recently identified type of La Niña known as La Niña Modoki could have very different effects on aerosol concentrations over eastern China.

During La Niña Modoki, the cool sea surface temperature anomalies that characterize La Niña occur farther west in the Pacific than usual. However, debate is ongoing over whether La Niña Modoki is truly different from the canonical La Niña. Nonetheless, events that fit the La Niña Modoki profile may have unique regional effects.

To examine the effects of La Niña Modoki on aerosols, the team investigated two such events from the past 2 decades, one occurring in 1998–1999 and the other in 2000–2001. Using a model of atmospheric composition known as GEOS-Chem, they explored the influence that each event may have had on aerosol concentrations over eastern China. Their methods followed those used in their previous analysis of El Niño Modoki, a similarly nontraditional event.

A diagram showing the influences of the strong La Niña Modoki event on aerosols in eastern China during boreal winter. Red and blue shaded areas indicate positive and negative sea surface temperature (SST) anomalies, respectively. The brown dotted area indicates more aerosols. Solid lines indicate anomalous circulation, and thick arrows represent anomalous wind directions. Cyclonic (C) and anticyclonic (AC) circulation anomalies are also marked. Credit: J. Feng and J. Li

The results suggest that at its peak, the strong La Niña Modoki of 1998–1999 resulted in increased aerosol concentrations in the south and decreased aerosols in the north. In contrast, the moderate 2000–2001 event had the opposite effect: It decreased aerosols in the south while increasing them in the north. The analysis also revealed additional contrasts in regional aerosol levels over the course of each event.

The different effects on aerosols likely resulted from the different wind circulation patterns that occurred during each event. The researchers calculated that the two events may have increased or decreased aerosol concentrations by up to 20% of mean levels. Therefore, they say, the effects of La Niña Modoki cannot be ignored, and the influences of air-sea interactions in the tropical Pacific should be considered in haze forecasts. (Journal of Geophysical Research: Atmospheres, https://doi.org/10.1002/2016JD026175, 2017)

—Sarah Stanley, Freelance Writer

Arctic River Ice Deposits Rapidly Disappearing

Wed, 04/19/2017 - 12:13

Climate change is causing thick ice deposits that form along Arctic rivers to melt nearly a month earlier than they did 15 years ago, a new study finds.

A river icing on a small unnamed river that drains into Galbraith Lake, Alaska. The people in this photo are researchers and students associated with Toolik Field Station. Credit: Jay Zarnetske

River icings form when Arctic groundwater reaches the surface and solidifies on top of frozen rivers. They grow throughout the winter until river valleys are choked with ice. Some river icings have grown to more than 10 square kilometers (4 square miles) in area—roughly three times the size of New York’s Central Park—and can be more than 10 meters (33 feet) thick.

In the past, river icings have melted out around mid-July, on average. But a new study measuring the extent of river icings in the U.S. and Canadian Arctic shows most river icings disappeared 26 days earlier, on average, in 2015 than they did in 2000, melting around mid-June. In addition, the study found most icings that don’t completely melt every summer were significantly smaller in 2015 than they were in 2000.

“This is the first clear evidence that this important component of Arctic river systems—which we didn’t know was changing—is changing and it’s changing rapidly,” said Tamlin Pavelsky, a hydrologist at the University of North Carolina at Chapel Hill and lead author of the new study published in Geophysical Research Letters, a journal of the American Geophysical Union.

Scientists have studied the effects of climate change on other types of Arctic ice like glaciers and sea ice, but until now no study has systematically looked at whether river icings are changing in response to a warming climate, according to the authors.

Although the decline in river icings is likely a result of climate change, the authors are unsure whether it is a direct result of rising temperatures or whether climate change is altering how rivers and groundwater interact.

“While glaciers tell us about climate in the mountains and sea ice tells us about sea-atmosphere interactions, the processes that control river icing may offer great insight into how groundwater and surface waters are connected in the Arctic and how our headwaters will be connected to the ocean in the future,” said Jay Zarnetske, a hydrologist at Michigan State University in East Lansing, Michigan, and co-author of the study.

The decline in river icings is remarkably rapid and if it continues, it could have huge impacts on Arctic river ecosystems, Pavelsky said.

River icings are found all over the Arctic and create wide channels that are important habitats for animals and fish. So much water is tied up in river icings that when they melt in summer, usually in July and August, they keep rivers flowing that might otherwise dry up, providing important freshwater habitat for fish and other animals, he said.

Disappearing Ice

The idea to study river icings came to Pavelsky in 2013 during a flight to northern Alaska for a recreational canoe trip. The pilot of the small plane, who had flown in the area for more than 30 years, said he had noticed river icings melting earlier in the season and that the timing was becoming more unpredictable. River icings pack down the gravel on riverbeds, and pilots use them as makeshift runways.

“My scientist antenna went right up,” Pavelsky said. “I said, ‘Hey, I think I know how to look at that.’”

Jay Zarnetske explores river icings in Alaska in 2004. A new study coauthored by Zarnetske shows river icings are melting out, on average, 26 days earlier in the year than they did in 2000. Credit: Jay Zarnetske

When Pavelsky returned from the trip, he downloaded data from the moderate-resolution imaging spectroradiometer (MODIS) aboard the NASA Terra satellite, which takes daily images of Earth. Pavelsky and Zarnetske then analyzed daily MODIS images of the U.S. and Canadian Arctic from 2000 to 2015, wondering if they could see evidence of changes to the ice that Pavelsky’s pilot had described.

They could. Pavelsky and Zarnetske detected 147 river icings using the MODIS data and found that of those, 84 are either becoming smaller or disappearing earlier in the season. The rest were unchanged. None of the river icings they analyzed grew or persisted later in the season.
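The study’s actual MODIS processing is more involved (cloud screening, image classification, and per-icing delineation). As a rough illustrative sketch only, not the authors’ method, the melt-out date for one icing can be estimated from a daily time series of ice-covered area; the function name, toy series, and slopes below are all hypothetical:

```python
from datetime import date, timedelta

def melt_out_day(daily_area_km2, year):
    """Return the date the icing first disappears (area reaches zero)
    and stays gone for the rest of the season, or None if it persists.
    daily_area_km2: iterable of (day_of_year, area_km2) pairs."""
    gone_since = None
    for doy, area in sorted(daily_area_km2):
        if area == 0.0:
            if gone_since is None:
                gone_since = doy       # candidate melt-out day
        else:
            gone_since = None          # ice detected again; reset
    if gone_since is None:
        return None
    return date(year, 1, 1) + timedelta(days=gone_since - 1)

# Toy series: an icing of ~12 km2 shrinking linearly through spring,
# with a slightly faster melt in the later year.
series_2000 = [(d, max(0.0, 12.0 - 0.08 * d)) for d in range(120, 220)]
series_2015 = [(d, max(0.0, 12.0 - 0.10 * d)) for d in range(120, 220)]

d2000 = melt_out_day(series_2000, 2000)
d2015 = melt_out_day(series_2015, 2015)
shift = (d2000.replace(year=2015) - d2015).days
print(d2000, d2015, shift)
```

Applied to the toy series, this reports melt-out on 29 May 2000 versus 30 April 2015, a 29-day advance, the same kind of shift the study quantifies across 147 real icings.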

The minimum area of ice they measured also shrank considerably over the study period. In 2000, Pavelsky and Zarnetske measured a minimum ice area of 80 square kilometers (30 square miles)—roughly half the area of Washington, D.C. By 2010, that number had dwindled to just 4 square kilometers (1.5 square miles)—smaller than San Diego’s Balboa Park. By 2015, the ice had rebounded slightly, with a minimum area of about 7 square kilometers (3 square miles).

“I think it’s a really important study, as another example of the types of changes we’re seeing in the Arctic landscape,” said Ken Tape, an ecologist at the University of Alaska Fairbanks who was not connected to the study. “This is not a prediction about something that will change, it’s demonstrating something that has changed, likely in response to warming.”

How Do Microbial Ecosystems and Climate Change Interact?

Tue, 04/18/2017 - 11:48

Microorganisms have been changing the climate and have been changed by the climate throughout most of Earth’s past. A new joint report, “Microbes and Climate Change,” from the American Society for Microbiology (ASM) and the American Geophysical Union (AGU), explores these dynamics and provides insights for better understanding and future work.

The report is the output of a 1-day research colloquium jointly sponsored by ASM and AGU. It highlights how microorganisms respond, adapt, and evolve in their surroundings at higher rates than most other organisms. This accelerated pace allows scientists to study the effects of climate change on microbes to understand and hopefully predict the future effects of climate change on all forms of life.

The 3 March 2016 colloquium brought together experts from multiple scientific disciplines to discuss the current understanding of microbes and our changing climate, as well as gaps and priorities for future study. Thirty invited scientists of various backgrounds, about half from each of the two scientific societies, participated in the meeting and coauthored the report. Colloquium steering committee members Stanley Maloy, Mary Ann Moran, Margaret Mulholland, Heidi Sosik, and John Spear facilitated the work of the authoring committee.

The 24-page report provides a primer on biogeochemical processes and climate change, then addresses impacts in three areas: terrestrial polar regions; soil, agriculture, and freshwater; and oceans. The report also explores ecological communities of microorganisms (i.e., microbiomes), effects of climate change on these communities, and how they adapt. The report is written for public audiences, including policy makers, educators, and science-interested students, as well as scientists.

“There is much more to learn and understand about how shifts in the Earth’s climate affect complex and interconnected microbial functions and the biogeochemical cycles they mediate,” said Eric Davidson, AGU President and a biogeochemist. “The information in this joint report lays out the current understanding of microbial ecosystem feedbacks that accelerate or mitigate climate change, as well as gaps and priorities for future study. This collaborative effort between ASM and AGU is a model for future joint society activities.”

—Billy M. Williams (email: bwilliams@agu.org), Science Director, AGU

An Improved Model of How Magma Moves Through the Crust

Tue, 04/18/2017 - 11:46

Volcanic eruptions of basalt are fed by intrusions of magma, called dikes, which advance through Earth’s crust for a few hours or days before reaching the surface. Although many never make it that far, those that do can pose a serious threat to people and infrastructure, so forecasting when and where a dike will erupt is important to assessing volcanic hazards.

However, the migration of magma below a volcano is complex, and its simulation is numerically demanding, meaning that efforts to model dike propagation have so far been limited to models that can quantify either a dike’s velocity or its trajectory but not both simultaneously. To overcome this limitation, Pinel et al. have developed a hybrid numerical model that quantifies both by dividing the simulations into two separate steps, one that calculates a two-dimensional trajectory and a second that runs a one-dimensional propagation model along that path.
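The real model solves elastic and fluid mechanics equations; the sketch below only mirrors the two-step structure described above, with deliberately toy closures. The stress-direction and velocity functions, step sizes, and numbers are all hypothetical illustrations, not the authors’ formulation:

```python
import math

def trajectory_2d(start, stress_dir, step=100.0, n_steps=60):
    """Step 1: trace a dike path in a vertical cross section by following
    a caller-supplied propagation-direction field (a stand-in for the
    stress calculation in the real model). z = 0 is the surface."""
    x, z = start
    path = [(x, z)]
    for _ in range(n_steps):
        theta = stress_dir(x, z)          # propagation angle, radians
        x += step * math.cos(theta)
        z += step * math.sin(theta)
        path.append((x, z))
        if z >= 0.0:                      # dike reached the surface
            break
    return path

def propagate_1d(path, velocity):
    """Step 2: integrate travel time along the fixed 2-D path using a
    position-dependent velocity model (again, a toy stand-in)."""
    t = 0.0
    for (x0, z0), (x1, z1) in zip(path, path[1:]):
        ds = math.hypot(x1 - x0, z1 - z0)
        t += ds / velocity(0.5 * (x0 + x1), 0.5 * (z0 + z1))
    return t

# Toy scenario loosely echoing the Etna case: a dike rises vertically
# from 5 km depth, then bends and slows in its final kilometer.
def stress_dir(x, z):
    return math.pi / 2 if z < -1000.0 else math.pi / 2 + 0.3

def velocity(x, z):
    return 1.0 if z < -1000.0 else 0.4    # m/s; slower near the surface

path = trajectory_2d((0.0, -5000.0), stress_dir)
ascent_time = propagate_1d(path, velocity)
print(len(path), ascent_time)
```

The point of the split is computational: the expensive 2-D geometry is solved once, and the cheaper 1-D propagation then supplies timing along that fixed path, which is what lets the hybrid model deliver both trajectory and velocity.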

The results indicate that the migration of magma is heavily influenced by surface loading—the addition or removal of weight on Earth’s surface—such as that caused by the construction of a volcano or its partial removal via a massive landslide or caldera eruption. The team confirmed previous research that showed that increasing surface load attracts magma while also reducing its velocity, whereas unloading diverts much of the magma.

To test their approach, the team applied their model to a lateral eruption that occurred on Italy’s Mount Etna in July 2001. The eruption was fed by two dikes, including one that in its final stages clearly slowed down and bent toward the west while still 1–2 kilometers below the surface. The results showed that the two-step model was capable of simulating that dike’s velocity and trajectory and thus offers a new means of constraining the local stress field, which partially controls these properties.

In the future, report the authors, more complex versions of this model that incorporate information on local topography and magmatic properties could be integrated with real-time geophysical data to improve forecasts of when and where a propagating dike could erupt at the surface. (Journal of Geophysical Research: Solid Earth, https://doi.org/10.1002/2016JB013630, 2017)

—Terri Cook, Freelance Writer

NSF Director Hopes for a Fair Budget

Mon, 04/17/2017 - 11:57

Although the U.S. National Science Foundation (NSF) doesn’t yet have any news about its budget for the upcoming fiscal year (FY) 2018, NSF director France Córdova is trying to think positively.

“We’re trying not to be overly anxious. We’re trying to be optimistic. Clearly, we could prepare for all sorts of scenarios,” Córdova said at a 12 April meeting of NSF’s Advisory Committee for Geosciences at the agency’s headquarters in Arlington, Va.

The Trump administration’s budget blueprint for FY 2018, issued on 16 March, did not include any references to NSF, but it did call for significant cuts to some federal science agencies, including the Environmental Protection Agency and the National Oceanic and Atmospheric Administration. Córdova anticipates that the administration will release a more complete budget proposal in May for Congress to consider. In the meantime, the federal government is operating under a continuing resolution (CR) that generally funds the government at FY 2016 levels; the CR expires on 28 April for the current FY 2017.

Córdova said that by focusing on the importance and possibilities of science, “we won’t let all these rumors and predictions [about the budget], some of which are not good, weigh us down.”

She hopes that the administration and Congress will recognize the value of funding basic research and of innovative efforts such as the agency’s “10 Big Ideas for Future NSF Investments” initiative: a set of forward-looking proposals that includes navigating the new Arctic and integrating research across fields.

An “Opportune Time for Science to Make Its Mark”

NSF director Córdova often speaks with officials about the agency’s science. Here she confers with Florida Governor Rick Scott and Florida State University administrators (not shown) in the governor’s office. Credit: Bill Lax/FSU Photography Services

The administration, Córdova said, is “still under formation with respect to basic research. So that’s a good thing. This, I think, is an opportune time for science to make its mark on what its importance is.”

Córdova stressed to the committee the importance of communicating the value of NSF and the science it supports to the new administration. “You can all help as a committee by thinking [about] what are the expressed administration’s priorities and how do those dovetail into what we’re doing, what the president says that he and his administration really care about,” Córdova said. “Obviously, there’s jobs and national security, keeping our country prosperous and safe, growing the economy. Those are all things that the geosciences contribute immensely toward. But have we framed our messages and the way the stories that we illustrate them with appropriate for that?”

She added that “every administration has appreciated the importance of basic research,” although they may have different priorities about what they think is important.

Directorate-Level Funding

Córdova recently testified before the House Committee on Science, Space, and Technology. Pictured, Córdova (left) is shaking hands with Committee chair Rep. Lamar Smith (R-Texas) at the geographic South Pole during a 2014 congressional delegation visit to Antarctica. Credit: Scott Borg, NSF

On another budget concern, Córdova said she hopes that funding for NSF does not include congressional funding directives that wade down to the agency’s directorate level for the geosciences or other directorates. NSF in the past has been spared from proposed directorate-level directives, with support from the geosciences community. The agency, she said, has good message points, which have been heard in Congress, about the importance of the science and engineering NSF funds, how it all works together to address common challenges, and how one never knows where the next discovery will come from.

“I personally would be surprised if anything untoward was done to our [funding] flexibility, which we very much appreciate, of having the science and engineering communities choose the priorities,” she commented. “It would be a very different world if science, especially basic research, became political, which it would become if there were directorate-to-directorate level funding. I really feel that very, very strongly.”

She added, “You can’t solve those big questions by taking down one branch of science or another or plussing up some because that would mean that you have the crystal ball that no one does have.”

The Impact of Lifting the Hiring Freeze

Córdova said that NSF is beginning to sort through the meaning and implications of a 12 April memorandum from the White House Office of Management and Budget that lifts a 23 January federal hiring freeze. She said that the memo, titled “Comprehensive Plan for Reforming the Federal Government and Reducing the Federal Civilian Workforce,” provides good news in lifting the hiring freeze but that agencies now need to develop workforce reduction plans.

“We are trying to balance those two. We have to figure out exactly what the rules of the road are and what that all means,” she said. Noting that NSF has “a very small workforce relative to our budget,” with about 1200 federal employees and about 170 employees assigned to NSF through the Intergovernmental Personnel Act, the agency doesn’t have “much fat on our bones,” she said. “It’s hard to think about being much leaner.”

—Randy Showstack (@RandyShowstack), Staff Writer

Establish a Scientific Integrity Advisory Board, Says New Report

Mon, 04/17/2017 - 11:54

A report released by the National Academies of Sciences, Engineering, and Medicine (NASEM) recommends the establishment of an independent, nongovernmental board to help research institutions foster scientific integrity across fields.

The new report, published 11 April and titled Fostering Integrity in Research, also makes several recommendations to improve the investigations of and responses to scientific misconduct. The recommendations were influenced by several high-profile cases of scientific misconduct, all of which involved to some degree issues with reproducibility, data fabrication or falsification, retaliation against whistle-blowers, or consequences for graduate students or coauthors.

“We don’t think the system is broken, but we think there is a lot more we as a community can do,” Robert Nerem, a professor emeritus of bioengineering at Georgia Institute of Technology in Atlanta and chair of the committee that wrote the report, told Science.

Integrity in Science

Because of rapid globalization and changing technologies, NASEM’s Committee on Science, Engineering, and Public Policy commissioned the new report as a follow-up to a similar report released in 1992, titled Responsible Science: Ensuring the Integrity of the Research Process. The 1992 report, which was also inspired by cases of scientific misconduct, first recommended that the research community establish an independent board, but that recommendation was overlooked.

A nonprofit, independent Research Integrity Advisory Board (RIAB) would share expertise in minimizing and handling research misconduct, the new report notes. The committee acknowledges that individual U.S. institutions and funding agencies do “valuable work to foster research integrity,” but it also stresses that no such entity exists to “foster research integrity at a national scale.”

An RIAB could provide support, resources, or advocacy to various stakeholders in the research process. It would have no direct role in investigations, regulation, or accreditation, the committee notes, but it would serve as a “neutral resource based in research enterprise that helps the enterprise respond to ongoing and future changes.”

Recently, sexual harassment in the sciences has become a more mainstream issue as women come forward to share their experiences. Although sexual harassment is not specifically addressed in the report, Nerem told Eos that “the RIAB thus may be ideal to foster further discussions about this serious issue.”

Other Recommendations

The new report lists 11 recommendations to parties within the scientific enterprise, including research institutions, universities, sponsors, journals, and funding agencies. One of these recommendations was that all bodies within the scientific enterprise should update and align their practices and strategies in response to a world where globalization has prompted new and international collaborations and where the Internet and new technologies have affected data collection and reporting of findings.

The report also encourages these parties to protect whistle-blowers and calls on federal funding agencies to supply “sufficient funds” to improve long-term data storage and thereby strengthen reproducibility efforts.

You can read the full report here.

—JoAnna Wendel (@JoAnnaScience), Staff Writer

A New Data Set to Keep a Sharper Eye on Land-Air Exchanges

Mon, 04/17/2017 - 11:52

The turn of the millennium marked a decade of systematic measurements of exchanges of carbon, water, and energy between the biosphere and the atmosphere, made by a few pioneering scientists at their own study sites. New knowledge was—and still is—generated from these data at individual sites. It quickly became clear, however, that pooling together data from multiple sites would create a whole that is greater than the sum of its parts: a powerful tool not only for comparing and combining sites but also for studying land-air exchanges at regional and even global scales.

Scientists formed regional networks to foster sharing of data and methodologies, and FLUXNET was created, establishing a global “network of networks.” But a challenge remained: Data sets were still too diverse, incompatible, and hard to compare. In June 2000, at the Marconi Convention Center on the northern California coast, a group of scientists resolved to change that. The first FLUXNET data set was born.

FLUXNET2015 is the third data set in the series, following the original Marconi data set from 2000 and the widely used 2007 version produced after a workshop in La Thuile, Italy. Released in December 2015, with two augmentations in 2016, the FLUXNET2015 data set includes more than 1500 site-years of data at 30-minute intervals from 212 sites, ranging from the Norwegian tundra in Svalbard to the South African savanna in Kruger National Park, from the birch forests of Hokkaido, Japan, to the scrub oak near the launch pads of the Kennedy Space Center in Florida. Each site has records of carbon, water, and energy fluxes, along with other ecosystem and biometeorological variables.
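Because the records are standardized half-hourly tables, working with a site file can be as simple as parsing CSV rows. The snippet below is a toy illustration only: the inlined data and the bare "NEE" column name are hypothetical simplifications, not the data set's actual variable names or quality-control conventions:

```python
import csv
import io

# Hypothetical half-hourly fragment in the spirit of a FLUXNET site file;
# real FLUXNET2015 files use their own documented variable names.
half_hourly = io.StringIO(
    "TIMESTAMP_START,NEE\n"
    "200001010000,-1.2\n"
    "200001010030,-0.8\n"
    "200001010100,0.5\n"
)

def mean_flux(f, column="NEE"):
    """Average one flux column over all 30-minute records in the file."""
    rows = list(csv.DictReader(f))
    return sum(float(r[column]) for r in rows) / len(rows)

print(mean_flux(half_hourly))
```

Negative values by the usual sign convention indicate net carbon uptake by the ecosystem, so averaging (and, at full scale, gap-filling and integrating) such columns over a year is how site-level annual budgets are built from these files.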

The fidelity of these flux measurements and their singular spatial and temporal coverage uniquely position them not only to help answer a broad range of questions about ecosystems, climate, and land use but also to bridge gaps between field observations and larger-scale tools like climate models and remote sensing. If past usage of FLUXNET data sets is any guide, we expect that scientists will use these data to validate satellite measurements, inform Earth system models, provide insight into a host of questions in ecology and hydrology, and fuel novel applications, many harnessing big data tools, on scales ranging from microbes to continents.

Updating FLUXNET Data Sets

Fig. 1. This map shows the geographical extent of the FLUXNET tower site network, including active and historical sites. The FLUXNET2015 data set includes a subset of the sites shown in the map. The color and size of the circles indicate the length of measurements (as of December 2015).

FLUXNET sites span all continents and all major climate zones (Figures 1 and 2; Figures 3 and 4 and the photograph above illustrate three sites). Some regions of the globe and ecosystem types are still underrepresented, but this newest release has improved coverage (see Figures 2a and 2b). Increased data contributions from regional networks in the Americas, Asia, Europe, and Oceania helped expand the geographic and temporal coverage of the data set.

The longest continuous flux data records now exceed 25 years. AmeriFlux and the European networks reached 20 years of age in 2016, and other networks host records nearly as long. This combination of longer time series and new sites from undersampled regions makes the FLUXNET2015 data set the state of the art for understanding long-term interactions between the atmosphere and the biosphere and for addressing questions about year-to-year variability and trends in fluxes.

Such long flux records are also essential for gaining insight on extreme events and the ways that disturbed ecosystems respond over time. They can also help address emerging science questions, such as identifying the causes and effects of the greening of the Arctic, detected by nearly 30 years of remote sensing, or finding what drives the increasing amplitude of seasonal variability of atmospheric carbon dioxide (CO2), observed in nearly 60 years of records from the Mauna Loa Observatory in Hawaii.

Fig. 2. The length of record and diversity of environmental conditions that FLUXNET2015 covers have expanded over the 2007 FLUXNET LaThuile data set and give a more representative picture of the range of conditions experienced over the entire globe. The panels show FLUXNET sites distributed along axes of annual precipitation and average air temperature, a scheme also known as Whittaker's biome classification. In Figures 2a-2c, tower sites are shown as circles, with sizes and colors representing the length of the record for a site. The land surface from the terrestrial globe is plotted with gray dots using the CRU TS 3.22 gridded climate data set: Temperature and precipitation grid cells (pixels) used here were averaged over the past 30 years, at 0.5° resolution, and include only land surface (excluding ice-covered areas). (a) Sites included in the 2007 FLUXNET LaThuile data set, (b) sites included in the FLUXNET2015 data set, and (c) all sites present in the FLUXNET network (compiled from multiple sources). (d) Histograms comparing the distribution of land surface (gray), FLUXNET sites (black), and sites in the FLUXNET2015 data set (purple) across the temperature and precipitation ranges.

This update was also motivated by the opportunity for higher-quality data. The FLUXNET data team and site teams collaborated on extensive data quality control that allowed us to avoid or correct many of the data issues that are common in observed data, such as missing sensor calibrations or inconsistent processing of a measured variable. Advances in the science and the availability of complementary data sets also allowed for the creation of new data products, such as uncertainty estimates and better methods for filling long gaps in micrometeorological data.

Earlier FLUXNET data sets were used in hundreds of peer-reviewed papers, for studies ranging from soil microbiology to effects of climate change at a global scale. As a simple metric, a Web of Science keyword search for FLUXNET now yields more than 400 papers. Stimulating new science with useful data products is a main motivation behind the work to create FLUXNET data sets.

What's Inside

Data collection at FLUXNET sites engenders only minimal disturbance to the ecosystem and produces flux estimates representative of spatial scales of hundreds of meters and multiple temporal scales, from hours to years, and now decades. To make the data simple to use by scientists from many disciplines, data included in FLUXNET2015 were consistently quality controlled and gap filled. Also, ecosystem exchange of CO2 was partitioned into ecosystem respiration (as CO2 released into the atmosphere) and gross primary production (as CO2 uptake by the ecosystem).
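The flux partitioning described above rests on a simple balance: at night there is no photosynthesis, so measured net ecosystem exchange (NEE) of CO2 equals ecosystem respiration, and a temperature-response model fitted to nighttime data can be extrapolated to daytime. The sketch below illustrates the idea with a toy Q10 respiration model and made-up numbers; it is not the FLUXNET2015 pipeline, and the function names and values are assumptions for illustration only.

```python
# Illustrative nighttime-based partitioning of NEE into respiration and GPP.
# Micrometeorological sign convention: NEE < 0 means net CO2 uptake.
import numpy as np

def q10_respiration(t_air, r_base, q10=2.0, t_ref=15.0):
    """Simple Q10 model: respiration grows exponentially with temperature."""
    return r_base * q10 ** ((t_air - t_ref) / 10.0)

# Toy half-hourly record: air temperature (degC), NEE (umol CO2 m-2 s-1),
# and a night flag (in practice derived from incoming shortwave radiation).
t_air = np.array([10.0, 12.0, 18.0, 20.0, 14.0, 11.0])
nee   = np.array([ 2.1,  2.4, -8.0, -9.5,  2.8,  2.2])
night = np.array([True, True, False, False, True, True])

# At night NEE == ecosystem respiration (Reco), so fit the base rate to
# nighttime fluxes, then extrapolate the model to daytime temperatures.
r_base = np.mean(nee[night] / q10_respiration(t_air[night], 1.0))
reco = q10_respiration(t_air, r_base)

# Gross primary production follows from the flux balance NEE = Reco - GPP.
gpp = reco - nee
```

During the two daytime half hours, the strongly negative NEE combined with modeled respiration yields a large positive GPP, while nighttime GPP stays near zero, as the balance requires.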

The FLUXNET2015 data set includes a number of new features. One is the revised and extended data quality checks, which not only increase data quality for individual sites but also help harmonize quality levels among all sites. This uniformity in the data is important for synthesis analyses, which require sites to be comparable.

Fig. 3. Tonzi Ranch, in the lower foothills of the Sierra Nevada Mountains, is one example of an oak savanna flux site (FLUXNET ID US-Ton). Credit: Dennis Baldocchi

Also included for the first time are estimates of uncertainties for key steps in the processing. Some of the steps, such as filtering for low wind conditions and partitioning of CO2 fluxes, were implemented with multiple methods, resulting in more thorough estimates of uncertainties. These uncertainty estimates have been a long-standing request from ecosystem and Earth system modelers and make this data set especially useful for applications like validating and constraining models.

FLUXNET2015 also includes estimated energy corrections that were applied to achieve energy balance closure (between storage and incoming and outgoing energy), which makes the data more useful to climate and ecosystem models that require closed energy budgets when using flux data.
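A widely used way to close the energy budget, and the spirit of the corrections described here, is to scale the measured turbulent fluxes so that they match the available energy while preserving their ratio (the Bowen ratio). The exact FLUXNET2015 correction factors are computed differently; the function and numbers below are a minimal illustrative sketch.

```python
# Bowen-ratio-preserving energy balance closure (illustrative). W/m2 throughout.
def close_energy_balance(rn, g, h, le):
    """Scale H and LE so that H + LE = Rn - G, keeping the Bowen ratio H/LE."""
    available = rn - g              # net radiation minus ground heat storage
    factor = available / (h + le)   # closure correction factor
    return h * factor, le * factor

# Typical midday case: measured H + LE recovers only 80% of Rn - G.
h_corr, le_corr = close_energy_balance(rn=500.0, g=50.0, h=120.0, le=240.0)
# Corrected fluxes now sum to the available energy (450 W/m2), and the
# Bowen ratio H/LE is unchanged at 0.5.
```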

To fill long gaps in meteorological data, FLUXNET2015 used downscaled data based on the ERA-Interim global reanalysis data set, which provides a gridded and uninterrupted record derived using a data-informed model. This approach improved the accuracy of gap-filled micrometeorology data points and of temporally aggregated products such as those at daily or yearly resolutions.
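The idea behind reanalysis-based gap filling can be sketched in a few lines: debias the gridded product against the tower record where the two overlap, then use the adjusted reanalysis values to fill the tower's gaps. This is a toy example with made-up numbers and a simple linear fit; the actual ERA-Interim downscaling is considerably more elaborate.

```python
# Fill gaps in a tower record with debiased reanalysis values (illustrative).
import numpy as np

tower = np.array([5.0, 6.1, np.nan, 8.2, np.nan, 4.9])  # site air temp (degC)
era   = np.array([5.6, 6.5, 7.4, 8.6, 6.0, 5.5])        # reanalysis, same times

# Fit a linear bias correction on the overlapping (non-gap) records.
ok = ~np.isnan(tower)
slope, intercept = np.polyfit(era[ok], tower[ok], 1)

# Fill only the gaps; measured tower values are kept untouched.
filled = np.where(np.isnan(tower), slope * era + intercept, tower)
```

Debiasing before filling matters because reanalysis grid cells average over terrain and land cover that can differ systematically from conditions at the tower.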

Some of these data products and processing steps, especially the data quality checks, prevented some sites from being included in the data set. As a result, not all sites that contributed data or were part of the LaThuile data set were included in FLUXNET2015. The FLUXNET data team continues to work with these site teams to include data from their sites in future FLUXNET data sets.

Embracing the Benefits--and Challenges--of Open Data

Open data sharing has gained momentum and is becoming a cornerstone of scientific research. In principle, open data sharing doesn't just benefit users--it also benefits the teams collecting the data by enhancing community integration, creating collaboration opportunities, improving data collection protocols, providing recognition for data collection and curation work, and fulfilling funding agency requirements on data availability.

In practice, however, sharing data remains complicated. The extra work and the logistics of supporting open sharing are beyond the reach of many scientists. FLUXNET data sets are based on field data collected by many independent teams from many countries. The multisource nature of the data set brings the added challenge of ensuring that the various data collectors receive proper credit when their data are used. Proper attribution is also necessary for site teams to get a better measure of the impact of their work and opportunities to participate in the science using their data. Data policies aimed at addressing these requirements have been put in place to enable data sharing via the services provided by regional networks and FLUXNET.

Data policies have evolved at least as much as the FLUXNET data sets themselves. For the Marconi data set, the policy stated that site teams had to be informed of the data usage and could request work on conflicting topics be postponed. For the LaThuile data set, each site team classified their data into one of three data policy tiers. The most lenient required only acknowledgment of the data source. A middle tier required users to submit proposals, which needed approval by a committee with representatives from each regional network and FLUXNET. In the strictest tier, data access was restricted to the teams who contributed data.

Fig. 4. Ankasa, on the coast of Ghana, is a tropical evergreen broadleaf forest flux site (FLUXNET ID GH-Ank). Credit: Giacomo Nicolini

The FLUXNET2015 data policy has matured to allow access to all interested users, with two tiers differing only in the terms for using the data, for example, for publications or class assignments. One tier requires acknowledgments, and the other requires that data providers be given the opportunity to add intellectual contributions and potentially become coauthors. This policy approach allows much broader access to the data and more opportunities to experiment with it.

Over the past decade, many FLUXNET site teams moved from being cautious about data sharing to being advocates for increased openness. FLUXNET's track record of more than 20 years of sharing data helps pave the way toward reproducible science and should encourage other communities to tackle the challenges of sharing data sets with many sources and users.

Where We Go from Here

The FLUXNET2015 data set can be downloaded from the FLUXNET-Fluxdata website. Download it, use it, discover--have fun! And let us know: Questions, suggestions, and comments can be sent to fluxdata-support@fluxdata.org. Site teams interested in contributing data to future FLUXNET data sets are encouraged to contact their regional networks.

There is still much room for improving the spatial coverage and representativeness of future data sets, considering all the existing sites that are potential new data contributors (Figure 2c). We thank the site teams that helped us prepare their site data for inclusion in this data set and urge teams from sites that were not ready in time for this release to work with us to add their sites to upcoming data sets.


Site teams are the true engine of all FLUXNET data sets. Their extensive effort in collecting data, often under harsh working conditions, generating quality data products, and providing their data for use by a broad community is invaluable and irreplaceable. The FLUXNET2015 data set is truly a community achievement!

This work used eddy covariance data acquired and shared by the FLUXNET community, including these networks: AmeriFlux, AfriFlux, AsiaFlux, CarboAfrica, CarboEuropeIP, CarboItaly, CarboMont, ChinaFlux, Fluxnet-Canada, GreenGrass, ICOS, KoFlux, LBA, NECC, OzFlux-TERN, TCOS-Siberia, and USCCC. The ERA-Interim reanalysis data are provided by ECMWF and processed by LSCE. The FLUXNET eddy covariance data processing and harmonization was carried out by the European Fluxes Database Cluster, AmeriFlux Management Project, and Fluxdata project of FLUXNET, with the support of CDIAC and ICOS Ecosystem Thematic Center, and the OzFlux, ChinaFlux and AsiaFlux offices.

Author Information

G. Z. Pastorello (email: gzpastorello@lbl.gov), Computational Research Division, Lawrence Berkeley National Laboratory, Berkeley, Calif.; D. Papale, Department for Innovation in Biological, Agro-food and Forest systems (DIBAF), University of Tuscia, Viterbo, Italy; H. Chu, Department of Environmental Science, Policy, and Management, University of California, Berkeley; C. Trotta, DIBAF, University of Tuscia, Viterbo, Italy; D. A. Agarwal, Computational Research Division, Lawrence Berkeley National Laboratory, Berkeley, Calif.; E. Canfora, Impacts on Agriculture, Forests and Ecosystem Services, Euro-Mediterranean Center on Climate Change, Viterbo, Italy; D. D. Baldocchi, Department of Environmental Science, Policy, and Management, University of California, Berkeley; and M. S. Torn, Climate and Ecosystem Sciences Division, Lawrence Berkeley National Laboratory, Berkeley, Calif.

As Winters Get Warmer, Sugar Maples May Absorb Less Silicon

Mon, 04/17/2017 - 11:50

Scientists predict that rising winter temperatures will reduce snow cover in mountainous and temperate forests. Without the insulating effects of snow, the underlying soil will freeze more readily. New research by Maguire et al. suggests that increased soil freezing will hinder uptake of silicon by sugar maple roots. This lack of silicon could have significant ecological effects.

Sugar maples and other plants obtain silicon-containing substances from groundwater. The trees convert this silicon into biogenic silica, which can perform a variety of functions, such as structural support and protection against harmful fungi. Increased availability of silicon for plants has been linked to thicker leaves, increased chlorophyll content, enhanced seedling growth, and increased seed production. Silicon uptake by plants also affects silicon levels in downstream ecosystems.

Photograph showing snow removal plot at Hubbard Brook Experimental Forest. Credit: Anne Socci

The team hypothesized that silicon uptake could be hindered by reduced snow cover (previous research had shown similar results for nitrogen uptake). To test this idea, they obtained sugar maple roots that had been collected during a prior study in the Hubbard Brook Experimental Forest. Some roots were from trees grown in plots that were left alone for the first 6 weeks of two consecutive winters. The rest were from plots where snow was cleared within 48 hours of each snowfall during those first 6 weeks.

The researchers measured and compared the amount of biogenic silica in sugar maple roots from each type of plot. Specifically, they examined the fine roots: the thin, filamentous roots responsible for absorbing water and nutrients from surrounding soil.

They found that soil freezing caused by snow cover removal reduced biogenic silica in sugar maple fine roots by 28%. This decrease represents a reduced uptake of about 8 kilomoles of silica per square kilometer of forest; a similar amount of silica is regularly transported out of temperate forests by streams and rivers. These results suggest that warmer winters could raise downstream silicon levels significantly, potentially affecting downstream ecosystems.

The researchers also made the first reported estimates of silicon content in sugar maple fine roots. The team calculated that the fine roots of a single sugar maple contain 29% of the total amount of biogenic silica in the tree while accounting for just 4% of its total biomass.

Why might sugar maple fine roots contain such a high proportion of biogenic silica? The researchers hypothesize that biogenic silica could protect growing, water-seeking roots from abrasive soils. As temperatures rise and soils freeze more often, reduced silicon uptake could hinder the ability of trees to find and obtain water and nutrients.

Overall, the findings highlight the importance of sugar maples in controlling silicon levels in their ecosystem. Future studies could reveal further details of this role and clarify the potential impacts of climate change. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1002/2016JG003755, 2017)

--Sarah Stanley, Freelance Writer

How Arctic Ice Affects Gas Exchange Between Air and Sea

Fri, 04/14/2017 - 12:14

Climate change is rapidly transforming the world's oceans, and researchers are scrambling to understand what that means for the physical and biogeochemical processes that govern ocean systems around the world. Scientists have measured dissolved carbon dioxide (CO2) gas dynamics in many ocean regions to predict future CO2 exchange between the air and sea, which will influence ocean acidification and global warming. Nonetheless, such data are sorely lacking for remote polar regions, where sea ice hinders ship access.

Woods Hole and Purdue researchers auger an ice hole for deployment of equipment through the ice in 2014. Credit: C. Beatty, University of Montana

To help fill the polar data gap, Islam et al. investigated gas exchange in the waters of the Arctic Ocean's vast Canada Basin. In August 2012, they deployed ice-tethered profilers in two regions, one with a dense cover of sea ice and another with only sparse ice. Each profiler included a bundle of sensor instruments suspended about 6 meters deep in the water and tethered to the ice floating above.

For almost 50 days, the sensors measured carbon dioxide and oxygen levels, temperature, salinity, and chlorophyll a fluorescence, which helps reveal biological production. The sites were 222 kilometers apart, on average, and as the sea ice drifted, the tethered sensors did too.

The team previously published their gas measurements in May 2016, reporting that carbon dioxide levels at both sites were below atmospheric saturation during the study period, whereas dissolved oxygen was slightly supersaturated. In the new study, the scientists compared the two sites to examine how ice cover influenced observed variability in oxygen and carbon dioxide levels. They used computational modeling to analyze sensor data in the context of concurrent oceanic and atmospheric conditions.

The results suggested that in the region with sparse ice cover, biological production, gas exchange with the atmosphere, and mixing between different layers of seawater all influenced oxygen and carbon dioxide variability. In the ice-dense region, mixing played a dominant role in gas variability, and biological production and gas exchange provided a negligible contribution.

These findings could help improve understanding of gas exchange in the Arctic Ocean. Arctic sea ice is declining rapidly, and some researchers predict that fresh meltwater will inhibit nutrient transport and limit biological activity, allowing the surface ocean to come into equilibrium with atmospheric CO2 and promoting acidification. The authors say that continued CO2 monitoring in the Canada Basin is necessary to better understand current trends and future possibilities. (Journal of Geophysical Research: Oceans, https://doi.org/10.1002/2016JC012162, 2017)

--Sarah Stanley, Freelance Writer

Better Estimates of Clouds' Climate Effects Are on the Horizon

Fri, 04/14/2017 - 12:08

The water that makes up a cloud can exist as liquid droplets, ice crystals, or a mixture of both phases. Cloud phase affects how much radiation from the Sun reaches the ground, stays in the atmosphere, or makes its way back into space; all three influence Earth's temperature. However, inadequate tools and data have made it challenging for scientists to accurately incorporate cloud phase into predictions of future climate.

In a new study, Matus and L'Ecuyer present a recent update to an algorithm for processing satellite data that could make such predictions more accurate. They used the algorithm to determine the influence of different cloud phases on solar radiation. The results confirm that the mixture of liquid and ice in a cloud can significantly influence how the cloud affects its environment.

The new version is an update of 2B-FLXHR-LIDAR, which is designed to use satellite data to estimate the amount of radiation passing through a given part of the atmosphere. The 2B-FLXHR-LIDAR algorithm uses data from CloudSat and the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite, which fly in formation to gather data on the structure of clouds, including their liquid and ice particles. It also incorporates cloud imaging data from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Aqua satellite.

The latest version of 2B-FLXHR-LIDAR features improved representation of cloud phase. Specifically, it is now better at realistically representing supercooled liquid water clouds, thin ice clouds, and clouds composed of a mixture of liquid and ice, yielding improved estimates of the ability of a cloud to reflect solar radiation and trap emitted thermal radiation.

To evaluate the strength of the new update, the team compared its outputs with measurements made by NASA's Clouds and the Earth's Radiant Energy System (CERES) instrument, which also flies aboard Aqua. The researchers used the updated algorithm to estimate radiation fluxes at the top of the atmosphere; they found that these estimates agreed better with CERES observations than the outputs of previous versions of 2B-FLXHR-LIDAR did.

The researchers then used the newly updated algorithm to calculate, for the first time, the radiative effects of clouds of different phases seen in satellite data collected from 2007 to 2010. For each type of cloud phase, they calculated the net effect on the exchange of solar and thermal radiation at the top of the atmosphere. A negative number yielded by the calculation indicated that more radiation was reflected into space than was retained through the cloud greenhouse effect, resulting in a net cooling effect, whereas a positive number indicated net warming.

NASA's CloudSat satellite is able to detect the liquid droplets and ice crystals that make up clouds. A new update to CloudSat's algorithm can quantify the degree to which the amount of solid ice or liquid water in clouds contributes to the net cooling or warming effect in the atmosphere. Credit: Alex Matus, UW-Madison/NASA/CloudSat

The algorithm showed that clouds cool Earth by an average of 17.1 watts per square meter (W/m2). In the study's sign convention, negative values denote net cooling: warm liquid clouds have a net effect of -11.8 W/m2, whereas ice clouds have a net warming effect of 3.5 W/m2. For the first time, the study found that clouds consisting entirely of liquid or ice account for nearly half of the total cooling effect of clouds on the climate. Clouds containing a mixture of ice crystals and water droplets (mixed-phase clouds) cool the globe by 3.4 W/m2, and clouds consisting of multiple distinct ice and liquid layers cause an additional cooling of 5.4 W/m2.
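As a consistency check, the per-phase effects quoted above sum to the overall figure under the study's sign convention (negative = net cooling, positive = net warming). This quick sketch just reproduces that arithmetic; the grouping into a dictionary is for illustration only.

```python
# Net cloud radiative effects by phase, in W/m2 (negative = net cooling).
components = {
    "liquid": -11.8,      # warm liquid clouds
    "ice": 3.5,           # ice clouds (net warming)
    "mixed_phase": -3.4,  # clouds mixing ice crystals and water droplets
    "multilayer": -5.4,   # distinct liquid and ice layers
}

net = sum(components.values())
print(round(net, 1))  # -17.1: clouds cool Earth by 17.1 W/m2 on average

# Single-phase (all-liquid plus all-ice) share of the total effect:
single = components["liquid"] + components["ice"]
print(round(single / net, 2))  # 0.49, i.e., "nearly half"
```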

The large variation in these effects by region and season highlights the importance of accurately simulating cloud phase when making climate predictions. Although further improvements and better data will be needed to reduce uncertainty, the newly updated algorithm could prove especially helpful in discerning the effects of mixed-phase clouds, which are known to play an important yet not fully understood role in the global energy budget. (Journal of Geophysical Research: Atmospheres, https://doi.org/10.1002/2016JD025951, 2017)

--Sarah Stanley, Freelance Writer

Former NOAA Chief Scientist Warns of Threats to Science

Fri, 04/14/2017 - 12:05

Walking stick in hand, Rick Spinrad, the former chief scientist for the National Oceanic and Atmospheric Administration (NOAA), has averaged 24 kilometers a day on his postretirement 5,091-kilometer trek across the country.

Spinrad, 63, who started out from Cape Henlopen, Del., on 5 March, has already hiked 675 kilometers to McKeesport, Pa., east of Pittsburgh, through a "meteorological smorgasbord" of snow, sleet, driving rain, and clear blue skies. He plans to conclude his trek in early October in Newport, Ore.

During the walk, he is reflecting on his time at the agency and what's happening now with science under the Trump administration.

An Antiscience Attitude

Spinrad says that there is a critical need right now to understand the Earth system well enough to predict its behavior and response to human activity.

However, he worries that the Trump administration's budget blueprint for fiscal year 2018 will cause that need to go unmet or be delayed. The proposed budget unveiled on 16 March will sharply cut funding for science, including for climate science programs and some Earth observing satellites. Spinrad also worries about attitudes toward science within the administration.

"It's a code orange," Spinrad told Eos over hot chocolate in a restaurant in Washington, D. C., on a cold day earlier in his walk.

"Generally, there's a strong antiscience attitude within this administration. I have heard nothing that suggests support for a scientific agenda," said Spinrad. He expressed specific concern about some administration appointees "who have clear antiscience agendas" and about proposed drastic cuts to the NOAA budget that include slashing the satellite division by 22% and the Office of Oceanic and Atmospheric Research by 26%. Cuts that big are "not something you can recover from," he said.

However, Spinrad's concern isn't yet in the red zone because he has confidence in those who are still working diligently on the scientific agenda in U.S. federal agencies.

Promoting Research at NOAA

A political appointee who retired in December, Spinrad served as NOAA chief scientist for about 2.5 years during the Obama administration. It was his second stint at the agency, where he had served as an assistant administrator from 2003 until 2010.

The highlights of his tenure as chief scientist include a policy to transition research and development output into operations, a strategic research guidance memorandum to help direct future research at NOAA, and a "chief scientist's annual report," issued for the first time in December 2016, that not only documents research at the agency but focuses on the beneficial impact of scientific investments on the American public.

Spinrad said he hopes that the Trump administration will maintain the position of chief scientist, which currently is vacant. "Even if the NOAA administrator is an environmental scientist, he/she will never have the bandwidth to focus on just the scientific issues. The administrator needs a trusted agent without a particular agenda or bias, who can advise him/her on strategic scientific issues; that's what a chief scientist can do," he told Eos.

Spinrad was vice president for research at Oregon State University in Corvallis from 2010 to 2014, where he earlier had received his master's degree and Ph.D. in oceanography. Now he lives in Bend, Ore., and his deep ties to that state made it a good end point for the trek.

During his "long walk home," Spinrad's wife, Alanna, has helped him travel light by driving him to and from lodgings and helping with other logistics.

Targeting Anything About Climate Change

Spinrad told Eos that he does not believe NOAA is being particularly targeted by the Trump administration at this point "because I don't think it has risen above the radar." He said the big targets right now are higher-visibility agencies, including the Environmental Protection Agency and the Department of Energy.

However, Spinrad does think that anything associated with climate is being targeted. "You can see that everywhere. This administration has a very different view of climate change, climate research, and the need to address the issues associated with climate change," he said. "I think somebody is probably doing a global search for anything that has climate in the title and saying this is not consistent with administration policies."

Meeting some policy priority by "surgically" removing anything from the budget related to climate change "is neglectful of the fact that so much of climate research, climate observations, is integrally connected with the same observations and research that we would use for weather," Spinrad noted. He said, for instance, that data collected on sea surface temperature are as valuable for numerical weather prediction as they are for understanding the climate record.

"This sense that you can somehow segregate components of a research portfolio and therefore align the research with some ideology is woefully ignorant of what research is all about," he said.

Concern About NOAA Satellites

Spinrad acknowledged that there is some validity to the argument that the commercial sector could help to maintain and operate satellite systems for the government. However, he said that because the Trump administration "emphasizes almost exclusively the transactional nature of everything, there is an assumption that as long as it makes good business sense, it's OK to have commercial entities provide [satellite] data."

Sometimes it's not about the return on investment but about protecting lives and property, Spinrad said. "It's like saying, 'Would you be comfortable with commercializing the military?' Of course not," he commented, adding that Americans want to know that their military forces are aligned with the public interest.

"The same should be true for environmental security," which satellite observations can help to provide, he continued. "The fact that public safety and the economy are so dependent on environmental factors means that absent the capability to understand and predict the environment, we will suffer both economically and in terms of safety."

Communicating the Relevance of Science

Spinrad said that the scientific community is partly to blame for an antiscience attitude and potential big budget cuts. "We have benefited from eras of relatively healthy support and felt that the value of what we did was self-evident," he told Eos.

Rick Spinrad strides through downtown Washington, D. C., on his "long walk home" from Cape Henlopen, Del., to Newport, Ore. Credit: Randy Showstack

However, Spinrad urged scientists to become better at explaining the value of their work to the public. He said that in the grand scheme of things, the Earth science research portfolio "is viewed as less relevant to the American public than health care research. I don't necessarily disagree with that. But I think it is much more relevant than most people think it is. That's on us to raise the visibility."

He is hopeful that can happen and that the science will gain more support.

Spinrad also is hopeful about completing his trek. First, though, he needs to recover from plantar fasciitis, a painful heel condition that has temporarily halted the walk in Pennsylvania. He hopes to resume his walk in a few weeks.

The delay, however, will not keep him from participating in the 22 April March for Science. Spinrad told Eos that he will give a keynote speech at the march in Newport, Ore. "Any pain that I might endure from the hike won't compare with the suffering that could result from the cuts to research by our federal government," he said.

--Randy Showstack (@RandyShowstack), Staff Writer

Hydrogen Molecules Hint at Habitability of Enceladus's Ocean

Thu, 04/13/2017 - 18:03

In a Science paper published today, scientists from NASA's Cassini mission to Saturn announced new findings that have big implications for the habitability of Saturn's tiny, ice-covered moon Enceladus.

"Habitability, as the astrobiologists have defined it, is pretty much manifested in the interior ocean of Enceladus."Spoiler alert: It's not aliens. But in the geyser-like plumes that constantly spew out of Enceladus's southern fissures, the team found molecular hydrogen (H2). This finding adds more evidence to the idea that on the floor of the moon's ocean, liquid water percolates into the rocky core and drives chemical reactions similar to those found at hydrothermal vents on Earth.

Finding molecular hydrogen, in addition to carbon dioxide, methane, and other molecules that scientists had already detected in the plumes, means that "habitability, as the astrobiologists have defined it, is pretty much manifested in the interior ocean of Enceladus," said lead author Hunter Waite, a planetary scientist at the Southwest Research Institute in San Antonio, Texas.

Hunting for Hydrogen

Cassini spotted Enceladus's plumes in 2005, a year after the spacecraft entered Saturn's orbit. But scientists weren't sure where the plumes' water came from. Since then, studies of Enceladus's gravity indicate that the moon hosts a global ocean under its ice shell.

In 2015, Cassini detected nanometer-sized particles of silicon-rich material in Saturn's rings. This was an odd finding because Saturn's rings are composed mostly of ice. The researchers traced those particles--analyzed to be silicon dioxide, also known as silica--back to Saturn's E ring, where Enceladus orbits. Cassini images had previously shown scientists that material from the plumes shoots into the E ring, so the researchers concluded that this silica must also be coming from Enceladus's plumes.

In 2015, Cassini's narrow-angle camera captured this view of Enceladus's southern pole, where plumes of water from the moon's ocean shoot into space. Credit: NASA/JPL-Caltech/Space Science Institute

In space, it's rare to find floating bits of silica, but in Earth's oceans, silica disperses from hot rock interacting with water at hydrothermal vents. Scientists suggested that hydrothermal reactions on Enceladus's ocean floor must be releasing the silica, but they were missing a key indicator of hydrothermal activity: H2, a molecule seen in abundance around Earth's hydrothermal vents.

In the new paper, researchers examined data from Cassini's deep dive through Enceladus's plumes in October 2015. Gas particles zoomed through Cassini's Ion and Neutral Mass Spectrometer (INMS), which "you could call Cassini's 'nose,'" Waite said, because it "sniffs" the molecules in the plume to determine what they are. In this paper, the team's data point to a wealth of H2 coming out of Enceladus's oceans.

But there are other potential sources of hydrogen on Enceladus that wouldn't involve hydrothermal vents, the paper notes. When the moon formed, billions of years ago, it could have incorporated hydrogen as part of its primordial material.

However, Waite said, vestiges of this primordial material are unlikely to be the source of the newly detected hydrogen. Hydrogen molecules are so small and light that they readily escape from rock and ice. And Enceladus itself is small, with a diameter only about one tenth the distance from Los Angeles to New York, so it couldn't have held on to hydrogen from its formation, Waite noted. Any loose hydrogen would have leaked into space by now.
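To see why so small a moon cannot hold on to hydrogen, here is a back-of-the-envelope comparison (not from the paper) of Enceladus's escape velocity with the thermal speed of an H2 molecule. The mass and radius are published figures; the 100 K gas temperature is an assumed round number for illustration.

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.380649e-23   # Boltzmann constant, J/K

# Enceladus (published values)
M = 1.08e20          # mass, kg
R = 252e3            # mean radius, m

# Escape velocity from the surface: v_esc = sqrt(2GM/R)
v_esc = math.sqrt(2 * G * M / R)

# rms thermal speed of an H2 molecule at an assumed 100 K: v_rms = sqrt(3kT/m)
m_H2 = 2.016 * 1.6605e-27   # mass of one H2 molecule, kg
T = 100.0                   # assumed gas temperature, K
v_rms = math.sqrt(3 * k_B * T / m_H2)

print(f"escape velocity: {v_esc:.0f} m/s")   # roughly 240 m/s
print(f"H2 thermal speed: {v_rms:.0f} m/s")  # roughly 1100 m/s
```

A typical hydrogen molecule moves several times faster than Enceladus's escape velocity, so free H2 leaks away on short timescales, which is why primordial hydrogen cannot explain what Cassini sniffed in the plumes.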

This leaves an internal origin for the hydrogen, which most likely comes from water interacting with minerals on a hot, rocky ocean floor, Waite explained.

Earth-Like Chemistry?

Enceladus is quite different from Earth. Earth, at more than 12,000 kilometers in diameter, is huge compared with Enceladus's 504 kilometers and holds enough residual energy to heat its interior. Enceladus does not have hot, buoyant rock rising from its innards to heat a mantle or any magma pushing through a crust. Instead, scientists think that tidal stretching and squeezing of Enceladus by Saturn's gravity cause friction that creates enough heat to support liquid water.

Finding hydrogen in Enceladus's plumes suggests that the moon's rocky core may be hot enough to drive chemical reactions with water that trickles through the moon's seafloor into the porous interior, reactions similar to those found at hydrothermal vents, Waite said.

And the hint of hydrothermal vents is why the finding is so exciting. At Earth's hydrothermal vents, bacteria and other creatures don't need sunlight to create energy. Instead, these critters take advantage of the smorgasbord of chemical reactions happening around them, which result from water interacting with minerals in hot rock.

Methanogenic bacteria, for example, get energy from converting carbon dioxide and hydrogen--among other things--into methane. Waite said that the high level of hydrogen found in the plumes suggests that Enceladus's ocean could be extremely rich in molecules necessary for this kind of reaction.

So "if you had those same conditions [at] a hydrothermal vent on Earth, methanogenic bacteria would be eating the hydrogen as if it were candy," he continued. But this is all supposition; Waite stressed that there is no current evidence for life on Enceladus.

Until Next Time, Enceladus

"The Enceladus results provide a new location for potential life well outside the Goldilocks zone, that narrow region around a star where liquid water can exist on the surface of a planet," said Linda Spilker, Cassini project scientist at NASA's Jet Propulsion Laboratory in Pasadena, Calif., who was not involved in the research. Finding hydrogen means that "liquid water is chemically interacting with the rock beneath the ocean, producing the kind of chemistry that is favorable for potential life," she said.

Because NASA plans to send the aging Cassini spacecraft plunging into Saturn in September, it's goodbye to Enceladus for now. But it might not be goodbye forever: Future spacecraft may return and send back more clues about the moon's newfound Earth-like environment.

--JoAnna Wendel (@JoAnnaScience), Staff Writer

What Led to the Largest Volcanic Eruption in Human History?

Thu, 04/13/2017 - 11:53

In the northern part of the Indonesian island of Sumatra lies the Toba caldera, a massive crater formed by what scientists think is the largest volcanic eruption ever experienced by humanity. The eruption, called the Youngest Toba Tuff supereruption, took place about 74,000 years ago.

By dating zircon, a hard gemstone sometimes used as a diamond substitute, and other minerals in the area, such as quartz, Reid and Vazquez have pieced together clues about the activity of magma below the surface prior to the supereruption.

Zircon is the oldest dated mineral on Earth. With a hardness of 7.5 on the Mohs scale, it resists chemical and mechanical weathering and can withstand metamorphism (structural changes due to heat, pressure, and other natural processes). All of these factors make it an ideal mineral for geological dating, especially of magma. Because zircon incorporates uranium but excludes lead as it crystallizes, even at magmatic temperatures, any lead it contains must come from the radioactive decay of that uranium, so scientists can use the ratio of the two elements to determine the age of a sample.
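The dating arithmetic described above follows directly from the radioactive decay law. Here is a minimal sketch for the 238U-to-206Pb system; the input ratio is hypothetical, chosen only to illustrate the calculation.

```python
import math

LAMBDA_U238 = 1.55125e-10  # decay constant of uranium-238, per year

def u_pb_age(pb206_per_u238: float) -> float:
    """Age in years from the ratio of radiogenic 206Pb to remaining 238U.

    Decay law: Pb/U = exp(lambda * t) - 1, so t = ln(1 + Pb/U) / lambda.
    """
    return math.log(1.0 + pb206_per_u238) / LAMBDA_U238

# Hypothetical example: a geologically young zircon with Pb/U of 4.65e-5
print(f"{u_pb_age(4.65e-5):,.0f} years")  # roughly 300,000 years
```

Very young crystals accumulate only traces of lead, which is why precise measurement of tiny Pb/U ratios is what makes dating eruption-scale timescales possible at all.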

The way in which zircon crystals in the Youngest Toba Tuff magma appear to have nucleated and grown over time, the researchers found, provides evidence of intermittent changes in the composition of the underground body of magma that eventually erupted. Certain characteristics of the zircon also indicate repeated episodes of magma recharge--fresh influxes of magma that often trigger eruptions--occurring tens of thousands to hundreds of thousands of years before the supereruption.

The team's findings are significant for modern-day humans, given that aerosols and ash that erupted from the Youngest Toba Tuff are thought to have entered the atmosphere, causing global cooling and the near extinction of the human race. A supereruption of equal or greater magnitude today could therefore have similarly drastic consequences. By better understanding the conditions that led up to the Youngest Toba Tuff supereruption, scientists can help paint a clearer picture of the future. (Geochemistry, Geophysics, Geosystems, https://doi.org/10.1002/2016GC006641, 2017)

--Sarah Witman, Freelance Writer

How "Godzilla" El Niño Affected Tropical Fish in Low-Oxygen Zone

Thu, 04/13/2017 - 11:50

It's been a weird past few years for the Pacific Ocean, thanks to rising global temperatures. First, there was "the Blob," a mass of warm water that hit the West Coast of the United States in late 2014, killing whales, sea lions, and many other types of marine life. Then, in spring 2015, the "Godzilla" El Niño arrived, a climate anomaly driven by the warmest sea surface temperatures in half a century.

Many effects of Godzilla El Niño were immediately obvious, such as wildfires in Australia and the Amazon and severe drought in Southern California. Others, such as its impact on marine life, are still being investigated. In a new study, Sánchez-Velasco et al. examine how the event changed the distribution of four species of tropical fish larvae in a particularly sensitive environment: a large, shallow, low-oxygen zone off the coast of Mexico.

Bregmaceros bathymaster, one of the fish that researchers studied to see how a monster El Niño affected distribution and abundance. Credit: S.P.A. Jiménez Rosenberg

The warmer and saltier water is, the less oxygen it can hold. Although most fish become extremely stressed when oxygen levels fall below 1 milliliter per liter, different species have varying tolerances for low-oxygen environments. The researchers collected larvae from four tropical fish taxa--Diogenichthys laternatus, Vinciguerria lucetia, Bregmaceros bathymaster, and Auxis species (spp.), a genus of tunas--to see how their distribution and abundance changed before and after the monster El Niño. They also collected zooplankton, tiny drifting organisms that provide an important source of food for fish and whales.
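The inverse relationship between temperature, salinity, and dissolved oxygen can be made concrete with the widely used Weiss (1970) solubility fit. This sketch is illustrative context, not part of the study, and gives equilibrium solubility at the surface rather than actual concentrations in an oxygen minimum zone.

```python
import math

def o2_solubility_ml_per_l(temp_c: float, salinity_psu: float) -> float:
    """Equilibrium O2 solubility in ml/l from the Weiss (1970) empirical fit."""
    t = temp_c + 273.15  # absolute temperature, K
    ln_c = (-173.4292
            + 249.6339 * (100.0 / t)
            + 143.3483 * math.log(t / 100.0)
            - 21.8492 * (t / 100.0)
            + salinity_psu * (-0.033096
                              + 0.014259 * (t / 100.0)
                              - 0.0017000 * (t / 100.0) ** 2))
    return math.exp(ln_c)

# Warmer and saltier water holds less oxygen:
print(o2_solubility_ml_per_l(10, 0))   # cool fresh water, roughly 8 ml/l
print(o2_solubility_ml_per_l(25, 35))  # warm seawater, roughly 4.7 ml/l
```

Note that even warm seawater at equilibrium holds far more than the roughly 1 milliliter per liter stress threshold; in oxygen minimum zones it is respiration and sluggish circulation, not solubility alone, that drive concentrations that low.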

To their surprise, the team found more fish larvae after the El Niño event, whereas zooplankton levels declined. The distribution of the species had changed, however: Before, V. lucetia, a small, bioluminescent fish commonly known as the Panama lightfish, and Auxis spp. were most abundant close to the ocean surface, where oxygen levels are highest, whereas B. bathymaster, a type of codlet, and D. laternatus, another bioluminescent fish, dominated the deeper, lower-oxygen layers.

After the El Niño event, however, all four taxa were more abundant in the upper, higher-oxygen layer, suggesting that they had moved upward to avoid an extremely low oxygen layer rising from below. This migration could signal an adaptive response to prolonged periods of ocean warming, the team suggests. (Journal of Geophysical Research: Oceans, https://doi.org/10.1002/2016JC012622, 2017)

--Emily Underwood, Freelance Writer
