EOS

Earth & Space Science News

Tsunami Records Show Increased Hazards for Chile’s Central Coast

Mon, 07/24/2017 - 11:55

In the early morning of 8 July 1730, residents of central coastal Chile felt what would later be known as the largest earthquake to strike this region since the beginning of local written history (around 1540). The tremor destroyed buildings along more than 1000 kilometers of the coast. Researchers previously thought that the quake may have reached a magnitude of Mw 8.5 to 9.0.

Now Carvajal et al. suggest that this historical quake was even larger than previous estimates and likely reached a magnitude of more than Mw 9, meaning that it was a truly giant event.

Despite the 1730 tremor’s strength, few people were killed, thanks to a strong foreshock that prompted many to leave their homes before the big one hit. People also survived by fleeing to higher ground when they saw seawater receding—a warning sign of the ensuing tsunami that inundated residential areas.

In fact, historical observations of this tsunami, which also reached Japan, were what prompted the authors to reexamine the quake’s magnitude.

An anonymous Jesuit priest’s account of the 1730 earthquake and tsunami and their impact in the city of Concepción (now Penco), Chile. After a similar tsunami event in 1751, the city was relocated to higher ground. The top paragraph reads, “Relation of the pitiful and terrible damage to the city of La Concepción of the kingdom of Chile, caused by the trembling and flooding of the day July 8 of 1730.” Credit: Archivum Romanum Societatis Iesu (Roman Archives of the Society of Jesus).

In one account, a Jesuit priest in the historical city of Concepción reported the flooding of several religious and public buildings. In Valparaíso, about 500 kilometers north, first- and second-hand accounts describe the flooding. Records from Japan detail damage to barriers, rice fields, and evaporation ponds where salt was harvested but report no human injuries or deaths.

The researchers used these reports to reconstruct the tsunami’s height and the extent of flooding. They then investigated the size and depth of the earthquake required to generate such a tsunami.

Using modern knowledge of tsunami generation and propagation, the scientists ran simulations of tsunamis produced by hypothetical earthquakes of varying magnitudes, depths, and slip amounts off the coast of central Chile. They found that a quake of Mw 9.1–9.3 best fit the historical tsunami records in both Chile and Japan.
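In essence, this fitting procedure is a forward-model parameter search: simulate the tsunami produced by each candidate source and score it against heights reconstructed from the historical accounts. The sketch below illustrates that workflow in Python; the `simulate_tsunami_heights` placeholder, the site names, the observation values, and the misfit measure are all hypothetical illustrations, not the authors’ actual model or data.

```python
import itertools

import numpy as np

# Hypothetical tsunami heights (meters) reconstructed from historical
# accounts at a few sites; the site names and values are illustrative only.
observed = {"Concepcion": 4.0, "Valparaiso": 3.0, "Japan_coast": 1.0}

def simulate_tsunami_heights(magnitude, slip_m, depth_km):
    """Placeholder for a real hydrodynamic tsunami simulation.

    A real study would run a shallow-water solver seeded with a fault
    dislocation model of seafloor deformation; here a crude scaling keeps
    the search loop runnable.
    """
    scale = 10 ** (magnitude - 9.0) * (slip_m / 12.0) / (1.0 + depth_km / 30.0)
    return {"Concepcion": 4.0 * scale, "Valparaiso": 3.2 * scale, "Japan_coast": 0.9 * scale}

def misfit(simulated, observed_heights):
    """Root-mean-square difference between simulated and observed heights."""
    diffs = [simulated[site] - observed_heights[site] for site in observed_heights]
    return float(np.sqrt(np.mean(np.square(diffs))))

# Search over candidate earthquake sources.
magnitudes = np.arange(8.5, 9.6, 0.1)   # Mw
slips = np.arange(6.0, 18.0, 2.0)       # average slip, meters
depths = np.arange(10.0, 50.0, 10.0)    # representative slip depth, kilometers

best = min(
    itertools.product(magnitudes, slips, depths),
    key=lambda params: misfit(simulate_tsunami_heights(*params), observed),
)
print(f"Best fitting source: Mw {best[0]:.1f}, slip {best[1]:.0f} m, depth {best[2]:.0f} km")
```

A real study would replace the placeholder with a hydrodynamic solver driven by a fault dislocation model, but the outer search-and-score loop would look much the same.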

According to the best fitting simulation, this earthquake would have occurred along a rupture 600–800 kilometers in length, with an average slip of 10 to 14 meters. The tsunami records and additional evidence of coastal uplift suggest that the slip was shallower toward the northern end of the rupture and deeper to the south.

The researchers note that since 1730, tremors in the same region have involved little slip at shallow depths. Shallow slip is widely agreed to pose the greatest tsunami hazard, so its absence since 1730 may indicate that stress along the shallow portion of the fault has been building for nearly 300 years.

If this potential shallow stress buildup is released in a future earthquake, the subsequent tsunami could be devastating. The authors point out that such a shallow quake might cause only moderate shaking, which could give the local population a false sense of security.

The researchers recommend that this possibility be used to inform disaster prevention plans in the area, which is home to most of Chile’s coastal population. (Journal of Geophysical Research: Solid Earth, https://doi.org/10.1002/2017JB014063, 2017)

—Sarah Stanley, Freelance Writer

Tiny Particles with Big Impact on Global Climate

Mon, 07/24/2017 - 11:53

Aerosols, tiny particles in the atmosphere, influence cloud-forming processes and scatter and absorb sunlight, thus altering the climate. A recent review article published in Reviews of Geophysics focused on one particular type of aerosol, secondary organic aerosol (SOA), highlighting recent developments in understanding of the formation and properties of SOA and discussing how these developments could affect estimates of global climate forcing. The editors asked one of the authors to explain the significance of SOA and suggest where further research efforts are needed to better understand its influence on climate.

What is “secondary organic aerosol” and why is it important?

Unseen by the naked eye, large quantities of carbon-containing vapors enter the atmosphere as they are released from trees and escape during the combustion of fossil and biomass fuels. The atmosphere serves as a large chemical reactor, “cooking” these emissions and transforming them chemically into new molecules with new properties. Some of these molecules can stick together, forming new particles, or condense onto existing particles; the condensed material is commonly referred to as “secondary organic aerosol”.

Secondary organic aerosol is important because it makes up a major fraction of global submicron-sized atmospheric organic aerosol. SOA can cool or warm the atmosphere by changing the amount of solar energy reaching Earth’s surface. SOA processes can also result in the formation of new particles that act as cloud condensation nuclei, affecting clouds and precipitation.

In the last decade, great advances have been made in understanding how SOA particles form, what their properties are, and how they evolve in the atmosphere as a function of temperature and relative humidity. These advances have major implications for representing processes related to the atmospheric lifecycle of SOA in models. This work critically assesses the understanding developed in the last decade and outlines several future research directions needed to address the outstanding issues in SOA.

Processes governing the climatic importance of SOA. Credit: Shrivastava et al., 2017, Figure 1

What are the challenges of accounting for secondary organic aerosol in climate models?

SOA represents a complex aerosol system that involves thousands of organic species in both gas and particle phases. Although the understanding of SOA formation and properties has greatly advanced in the past decade, substantial gaps remain.

A better understanding of SOA formation mechanisms is needed. The condensation of molecules onto particles depends not only on their volatility but also on the phase of the particles, with uptake and mixing limited in highly viscous particles. Ultimately, all important processes that shape atmospheric aerosol size distributions and concentrations of cloud condensation nuclei need to be adequately represented in climate models.
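One widely used simplification for the volatility part of this problem is equilibrium absorptive partitioning, the relation underlying volatility basis set schemes. The minimal sketch below shows how the particle-phase fraction of a species depends on its effective saturation concentration C* and the total organic aerosol loading; the bin values and loading are illustrative assumptions, and the calculation deliberately ignores the viscosity-limited uptake just mentioned.

```python
import numpy as np

def particle_fraction(c_star, c_oa):
    """Equilibrium particle-phase fraction from absorptive partitioning.

    xi = 1 / (1 + C*/C_OA), where C* is the species' effective saturation
    concentration and C_OA is the total organic aerosol mass loading
    (both in micrograms per cubic meter).
    """
    return 1.0 / (1.0 + c_star / c_oa)

# Illustrative volatility bins (C*) and an assumed ambient organic loading.
c_star_bins = np.array([0.1, 1.0, 10.0, 100.0, 1000.0])  # ug/m3
c_oa = 5.0                                               # ug/m3

for c_star in c_star_bins:
    xi = particle_fraction(c_star, c_oa)
    print(f"C* = {c_star:7.1f} ug/m3 -> fraction in particle phase = {xi:.2f}")
```

Low-volatility species (small C*) end up almost entirely in the particle phase, while volatile species stay mostly in the gas phase, which is why representing the volatility distribution well matters for simulated SOA mass.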

The computational burden associated with advection of gas- and particle-phase SOA species could be a major challenge. To this end, efforts are needed to determine the most influential non-linear processes affecting SOA, so that they can be efficiently represented in atmospheric chemistry-climate models.

What are the major unsolved or unresolved questions in this field?

As the understanding of SOA formation processes evolves, an important question is how this new understanding affects our estimates of aerosol climate forcing, which is calculated as the difference between present-day and pre-industrial conditions. Of particular interest are processes wherein anthropogenic emissions interact with biogenic emissions and affect the chemical mechanisms of SOA formation. However, currently, we only understand a small subset of these processes, and even this subset is not holistically represented in climate models.

Also, the natural background aerosol state strongly affects calculations of aerosol climate forcing, but is not accurately characterized, in large part because we cannot go back in time and measure past aerosol concentrations, as has been done for greenhouse gases. Thus, we are highly reliant on process-level understanding of the atmospheric lifecycle of particles in general, and SOA specifically, when assessing the climate impacts.

How could additional data or modeling efforts contribute to a better understanding?

Integrated measurements and modeling efforts are needed to determine the non-linear effects of SOA on clouds and radiation. It is important to continuously develop experimental techniques that provide information about the entire lifecycle of SOA, beginning with formation of molecular clusters to their growth to particle sizes that enable the particles to function as cloud condensation nuclei.

New model parameterizations will need to be developed and these model predictions will need to be validated with as many experimental constraints as possible, to ensure accuracy and ascertain that we are getting the right answers for the right reasons.

—Manishkumar Shrivastava, Pacific Northwest National Laboratory, Washington; email:  manishkumar.shrivastava@pnnl.gov

Seafloor Data from Lost Airliner Search Are Publicly Released

Fri, 07/21/2017 - 12:28

A vanished airliner rather than scientific curiosity prompted a recent extensive campaign to map a large swath of the seafloor, one of the last unexplored frontiers of our planet. Now these data have been publicly released for the first time and may offer a lasting boon to science.

On 8 March 2014, Malaysia Airlines flight MH370 took off from Kuala Lumpur en route to Beijing, lost contact with air traffic control, and disappeared. The Boeing 777, with 239 people on board, is presumed to have veered wildly off course, eventually crashing into a remote part of the southern Indian Ocean.

Searching for Debris

In the days and weeks following the disappearance of MH370, boats and aircraft scoured the area in search of debris. Wreckage found in the western Indian Ocean allowed researchers to estimate where the aircraft might have entered the water.

Between June 2014 and June 2016, scientists used ship-mounted sonar technology to map the topography of the seafloor in the MH370 search area roughly 2000 kilometers off the western coast of Perth, Australia. Despite having to penetrate more than 6000 meters of water in some places, these new observations yielded maps with at least 15 times the resolution of older maps. That substantial increase in resolution allowed researchers to record details of seafloor features such as canyons and landslides. These data can further our understanding of the deep-ocean floor and forces, like earthquakes, that shape it, researchers suggest.

The data release included videos that give viewers three-dimensional tours through some of the seafloor regions mapped by the search.

Fly-through of Broken Ridge and Diamantina Trench from © Commonwealth of Australia (Geoscience Australia) 2017 on YouTube.

A Thoroughly Mapped Region

A fault in the seabed slices through a series of submarine volcanoes spotted from above by searchers looking for wreckage from lost Malaysia Airlines flight MH370. Credit: © Commonwealth of Australia (Geoscience Australia) 2017

“It is estimated that only 10 to 15 per cent of the world’s oceans have been surveyed with the kind of technology used in the search for MH370, making this remote part of the Indian Ocean among the most thoroughly-mapped regions of the deep ocean on the planet,” according to aquatic and environmental chemist Stuart Minchin. He’s chief of the Environmental Geoscience Division of Geoscience Australia, the main Earth science agency of the Australian government. The agency, which served as an adviser for the seafloor mapping, publicly released these detailed maps of the Indian Ocean seafloor earlier this month.  Geoscience Australia also incorporated the data in an interactive story map on its website.

The extensive search of the seafloor spanned more than 275,000 square kilometers, but no additional wreckage from MH370 was found. The presumed crash remains one of the greatest mysteries in aviation history. In January 2017, the governments of Malaysia, China, and Australia jointly announced that the search for MH370 would be suspended until “credible evidence is available that identifies the specific location of the aircraft.”

However, scientists are looking forward to analyzing the trove of seafloor data that was collected. “The reason we mapped this area was of course to find the aircraft and bring closure to the families, but it is a unique data set that would never have been collected without the search,” says Minchin.

—Katherine Kornei (email: hobbies4kk@gmail.com; @katherinekornei), Freelance Science Journalist

Natural Resource Exploitation Could Reach New Depths

Fri, 07/21/2017 - 12:22

Buildings, infrastructure, mobile phones, batteries, and electric cars contain valuable metals such as copper, zinc, silver, and gold. With the seemingly insatiable demand for products enabled by these materials, new sources are needed, particularly as land-based reserves become scarce or are located in places that are too difficult, dangerous, or costly to access.

Hydrothermal vent structures at the Mariner site in the South Pacific. Such vents are deep-sea locations where minerals are often found. The presence of iron in sulfide structures tints these chimneys red. Credit: WHOI/National Deep Submergence Facility, CC BY-NC-SA 3.0 US

These metals exist, however, in mineral deposits on rocky submarine mountains, on abyssal plains, at mid-oceanic ridges, and around underwater hydrothermal vents. For example, it has been suggested that nodules of manganese found in places such as the Clarion-Clipperton zone, an extensive area on the Pacific Ocean floor, could satisfy current demand for decades.

Beaulieu et al. reflect on a session at the 2017 American Association for the Advancement of Science annual meeting that posed the question, Should we mine the seafloor? As scientists, they wanted to approach the question with objective, scientific evidence.

The question is not without controversy. Although some people see the oceans as the last untapped resource on Earth, others see them as a precious natural asset to be protected.

The authors note that exploitation of the shallow seafloor already takes place, with the dredging of sand and gravel and extraction of tin, gold, and diamonds from shallow reserves. Technology also enables the oil and gas industry to operate on seabeds up to 3 kilometers below the surface.

It is not yet possible to mine reserves that lie deeper, but the authors point out that it may become a reality. As of mid-2017, 27 exploration licenses had been granted by the International Seabed Authority, a United Nations body established in 1994 with responsibility for regulating deep seabed mining.

The main thrust of the discussion is whether deep seafloor mining is economically feasible, technologically possible, environmentally appropriate, ecologically sustainable, and legally manageable. However, the authors’ attempt at an objective assessment was limited because the topic is mired in so many uncertainties.

Little is known about the quantity and quality of the resources that exist, so they may not even be worth exploiting. No one yet knows whether extraction from the ocean could be competitive with land-based mining.

Technological uncertainties also abound. For example, will equipment work effectively in the extreme environment of the deep oceans? Although technologies to map and mine these resources have developed significantly, much further testing remains.

And perhaps most important, not enough is known about the ecological implications of deep seafloor mining. Deep-sea environments aren’t well studied, and even less is known about the vulnerability or resilience of marine ecosystems to such interference.

But the authors see a key opportunity: Unlike most other forms of natural resource exploitation in human history to date, scientists could work with lawyers to put a legal framework in place before large-scale exploitation starts, thereby ensuring a responsible and regulated approach. There are developments to this end; for example, the International Seabed Authority is currently working with scientists on a first draft of environmental regulations for mining in areas beyond national jurisdictions.

However, with a lack of accurate scientific information and so many economic, technological, and environmental uncertainties, is it even possible to create effective environmental regulations? To this end, the researchers suggest that when exploration and testing contracts are granted, those executing the contracts not only survey potential resources and try new technologies but also use the opportunity to study ecosystem responses and provide valuable data to researchers. They also call for a transdisciplinary approach, drawing on the expertise of researchers from across different fields in the physical and social sciences to inform such international agreements.

Recent years have seen a shift from speculation to limited exploration of the deep seafloor, but at some point, the authors stress, resource exploitation will become a reality. (Earth’s Future, https://doi.org/10.1002/2017EF000605, 2017)

—Jenny Lunn, Contributing Writer

The Value of Disaster Damage Data

Fri, 07/21/2017 - 12:18

The past decade has seen increasingly severe floods across the world, which have caused widespread damage to life, property, and economic resources. Data on the spatial extent and cost of damage can help not only in the immediate aftermath of a flood but also in the longer term to mitigate future risk through effective planning and preparedness. A new book entitled Flood Damage Survey and Assessment: New Insights from Research and Practice, just published by the American Geophysical Union, presents a compilation of real-world examples and best practices in the collection, storage, analysis, and sharing of flood damage data at different spatial scales, from global to national and regional. Here, the editors answer some questions about research and practice in this field.

What kind of information can be collected after a flood event in order to assess damage?

Wet documents in a firm affected by a 2012 flood in Marsciano, Umbria, Italy. Credit: Molinari et al., Politecnico di Milano

Several types of information are useful to build a picture of the impacts of a flood event. First it is important to collect data on the physical characteristics of the flood, such as the extent of the flooded area, the spatial distribution of water depth and velocity, and the concentration of sediments and contaminants.

Next it is essential to gather information on damage, both due to direct contact with floodwater and due to systemic connections with the flood.

All features susceptible to damage – such as residences and businesses, farms and crops, people and livestock, infrastructure and public services, cultural and environmental heritage – are known as “exposed assets.”

How can this information be used?

Damage data collected in the aftermath of a disastrous event can support a variety of actions. During the emergency stage and immediate aftermath, the information can help to identify priorities for intervention. Later it can be used to identify the victims who will benefit from compensation schemes. In terms of learning from the disaster and planning for the future, the information can be used to create a complete event scenario to understand the fragility of the location and tailor risk mitigation strategies. It also supports the development of damage models that feed cost-benefit analyses of structural and non-structural mitigation actions, including insurance schemes.

What are the challenges to collecting good quality data?

Damage data collection is not mandatory, and standards for its collection do not exist. Nor are there common guidelines for the effective use of damage data for risk mitigation objectives. As a consequence, damage data are scarce and incomplete (not all the useful information described above is necessarily collected). Data are also not usually comparable between locations because they are collected at different scales and in different formats; these depend not only on the country in which an event occurs but may also vary from event to event within the same country.

What tools or approaches would improve the collection of such data?

The variety of exposed assets and possible strategies for flood risk management calls for an interdisciplinary approach to damage assessment, including expertise ranging from hydrology and engineering to spatial planning, economics, and law. Only strong cooperation between scientists and practitioners can ensure that all relevant information is collected. However, it is not always easy to get people from different backgrounds with diverse interests to coordinate and work together effectively.

Consistency is necessary if data are to be used to compare situations, and it is possible to achieve despite differences in spatial scale or legal context by centralizing data collection or by adhering to agreed formats. More difficult is securing the commitment of many stakeholders, particularly but not only those from the private sector, to exchange and share damage data with others.

Meanwhile, recent developments in information technology – from satellite imagery to social media – have been changing and improving the ways in which damage data are collected, stored, shared, and accessed. In particular, the ability to collect “big data” holds interesting future potential.

Flood Damage Survey and Assessment: New Insights from Research and Practice, 2017, 288 pp., ISBN: 978-1-119-21792-3, list price $149.95 (hardcover), $119.99 (e-book)

—Daniela Molinari, Scira Menoni, and Francesco Ballio, Politecnico di Milano, Milan, Italy; email: daniela.molinari@polimi.it

Volcano’s Toxic Plume Returns as Stealth Hazard

Thu, 07/20/2017 - 12:07

Toxic airborne emissions from volcanoes in Iceland may be dangerous even when pollution-monitoring instruments indicate that the air is safe, according to a new study. Volcanologists observing a 2014–2015 eruption in Iceland have found that even when air quality detectors had determined that sulfur dioxide (SO2) gas had fallen to acceptable levels in the air downwind from an eruption, another form of contamination known as sulfate aerosols had sneaked back, sullying the air.

The aerosols’ ingredients, tiny particles of sulfuric acid and trace metals, likely also harm people, but less is known about their toxicity than about the health consequences of SO2. Although Iceland’s public and environmental health agencies do not track sulfate aerosol concentrations, those pollutants contain “heavy metals found in human-made air pollution that are linked to negative health effects,” said Evgenia Ilyinskaya, a volcanologist at the University of Leeds in the United Kingdom, who led the study.

The findings by Ilyinskaya and her colleagues will appear in the August edition of Earth and Planetary Science Letters, which posted the results online in early June.

Record-Breaking Sulfur Dioxide Release

For 6 months spanning the end of 2014 and the beginning of 2015, an eruption from the Holuhraun fissure in the Bárðarbunga volcanic system of southeastern Iceland captivated the world. More than a cubic kilometer of lava gushed from the rift, along with record-breaking amounts of SO2, a common volcanic emission that can be extremely harmful to humans. The gas can damage the lungs, and it causes irritation of the nose and throat. Other scientists have estimated that the Holuhraun eruption emitted 1130 kilograms of SO2 every second.

During the eruption, the Environment Agency of Iceland (EAI) and the Icelandic Meteorological Office (IMO) issued regular warnings about SO2 pollution and encouraged residents to remain indoors during times of heightened levels. Readings came from the volcano site and monitoring stations in cities downwind from the eruption. Also during the eruption, Ilyinskaya and her collaborators collected their own gas samples from a helicopter hovering just tens of meters above the fiery rift.

Long after the eruption had ended, when the researchers started analyzing their data along with data from monitoring stations in areas surrounding the volcano, something odd stuck out. In two communities, the small town of Reykjahlíð, 100 kilometers downwind from the eruption, and Iceland’s capital, Reykjavík, 250 kilometers downwind, the researchers noticed that on some days when monitoring stations detected low levels of SO2, their data indicated simultaneous high levels of sulfate aerosols, which form when sulfur dioxide cools in the air and reacts with other airborne molecules with the help of sunlight.

On those days, IMO characterized SO2 levels as safe but did not issue warnings for aerosols. The particles in aerosols are fine enough to deeply penetrate lungs, and airborne particulates may contribute globally to millions of deaths every year, the researchers noted.

“Plumerangs”

By comparing their direct observations to models of how sulfur dioxide converts into sulfate aerosols, Ilyinskaya and her team found that a plume of sulfur dioxide would evolve into aerosol particles and travel back toward Iceland in a matter of days—they dubbed these secondary plumes “plumerangs.” The researchers were stunned to see that the SO2 plumes could last long enough to “mature,” or fully convert into plumes of aerosols, Ilyinskaya said.
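Conceptually, the maturing of a plume can be pictured as a simple first-order conversion of SO2 into sulfate over time. The sketch below is only an illustration of that idea, not the team’s model: the 2-day e-folding conversion time is a hypothetical placeholder, whereas the real rate depends on sunlight, humidity, and oxidant availability.

```python
import numpy as np

# Hypothetical e-folding time for SO2-to-sulfate conversion; the real rate
# depends on sunlight, humidity, and oxidant availability.
conversion_time_days = 2.0

days = np.linspace(0.0, 6.0, 7)
so2_remaining = np.exp(-days / conversion_time_days)  # fraction of SO2 left
sulfate_formed = 1.0 - so2_remaining                  # fraction converted to aerosol

for t, so2, sulfate in zip(days, so2_remaining, sulfate_formed):
    print(f"day {t:3.1f}: SO2 remaining {so2:4.2f}, sulfate formed {sulfate:4.2f}")
```

Within a few conversion times, most of the plume’s sulfur is in aerosol form, which is why a returning “plumerang” can carry high sulfate loads even when SO2 readings look safe.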

When Ilyinskaya mentioned the data to a colleague at EAI, he offered some anecdotal evidence: On one particular day in Reykjavík, when SO2 levels were reportedly low, residents still reported burning eyes and throats. The day was 20 September 2014. The researchers checked their data and found that a plumerang had blown in that very day.

“On at least 18 days during the 6-month-long eruption, the plumerang was in the capital city of Reykjavík, while the official forecast showed ‘no plume,’” Ilyinskaya said in a press release.

Public Health Follow-Up

The researchers are currently conducting a follow-up study to determine what kinds of health effects resulted from the plumerangs, Ilyinskaya said. In the meantime, the researchers suggest that SO2-to-aerosol conversions should be considered in future air pollution forecasts, especially considering the profuse SO2 emissions known to come from Iceland’s volcanoes.

“Now we know what to look out for, not if, but when, we get the next eruption,” Ilyinskaya said.

—JoAnna Wendel (@JoAnnaScience), Staff Writer

Antarctic Microbes Shape Nutrient Content of Snowmelt

Thu, 07/20/2017 - 12:04

Snowmelt on the Antarctic Peninsula has been increasing for decades because of rising temperatures, and it shows no signs of stopping. As the snow melts, it can release nutrients that eventually make their way into downstream marine ecosystems. However, along the way, snow-dwelling microbes may use and transform these nutrients, influencing their downstream effects.

Despite the potential importance of microbes residing in Antarctic snow, few studies have investigated their role in melting. New research by Hodson et al. addresses this gap in knowledge by illuminating the activity and influence of microbes inhabiting wet snow on Livingston Island, located off the tip of the Antarctic Peninsula.

During austral summer in 2013 and 2014, the research team collected snow samples at five different sites in the island’s Hurd Peninsula region. Three sites were less than 100 meters from the shoreline, one site was 500 meters inland, and the fifth site was 750 meters inland. The scientists investigated the chemical, physical, and biological characteristics of the snowpack at each site.

To assess the snow algal population, they used microscopy and laser counting techniques (flow cytometry), finding that red-pigmented algae were dominant at the snow surface, whereas green algae were present below. Bacterial cells are too small for detailed characterization in this way, so the team focused on the cells’ ribosomal RNA (16S rRNA). Although microbes have evolved over millions of years, rRNA genes have changed very little. Measuring slight changes in this easily sequenced gene therefore gives clues to how closely microbes in a given population are related and allows researchers to establish the bacterial community structure.

Genetic sequencing of 16S rRNA genes in the snow samples revealed that microbial communities closer to the coast consisted of a greater diversity of bacterial species than those in the inland glacial snowpacks. Inland communities consisted mostly of typical snow bacteria, whereas coastal communities also had species associated with soil and the ocean.
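To compare such communities quantitatively, researchers often summarize sequence-derived taxon counts with a diversity index. The sketch below computes the Shannon index from made-up 16S rRNA read counts for a coastal and an inland sample; it illustrates the general approach only and is not the study’s actual analysis pipeline.

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i) from taxon read counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

# Made-up 16S rRNA read counts per bacterial taxon (illustrative only).
coastal_counts = [120, 95, 80, 60, 45, 30, 22, 15, 10, 8]  # many taxa, fairly even
inland_counts = [400, 60, 20, 10, 5]                        # dominated by a few taxa

print(f"Coastal snow H' = {shannon_diversity(coastal_counts):.2f}")
print(f"Inland snow  H' = {shannon_diversity(inland_counts):.2f}")
```

A sample with many evenly represented taxa scores higher than one dominated by a handful of taxa, matching the coastal-versus-inland contrast described above.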

The scientists also found evidence that algae growing on the surface of the snowpacks can accelerate melting. This is because some algae produce pigments that darken the snow and increase absorption of solar radiation. Such algae were much more plentiful near the coast than inland, likely thanks to fertilization with calcium from rock debris and with ammonium from the droppings of penguins and seals.

At both inland and coastal sites, microbial activity increased concentrations of dissolved organic and inorganic carbon in the snow. It also increased the amount of carbon dioxide in interstitial air pockets within the snow. These concentrations were higher for snowpacks closer to the shoreline.

Dissolved organic carbon can serve as a food source for other microbes and for marine life, so these findings indicate that Antarctic snow-dwelling microbes can indeed influence nutrients in meltwater. The authors propose that the effects of microbial activity on downstream nutrients be taken into account when considering the full impact of warming on the Antarctic Peninsula. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1002/2016JG003694, 2017)

—Sarah Stanley, Freelance Writer

2017 AGU Union Medal, Award, and Prize Recipients Announced

Thu, 07/20/2017 - 12:02

Every year, the American Geophysical Union (AGU) recognizes individuals through our renowned honors program for their exceptional achievements, outstanding contributions and service to the scientific community, and attainment of eminence in Earth or space science fields. This distinguished group of honorees—scientists, leaders, educators, and communicators—has explored new frontiers in geophysical or space research through their creativity, original thinking, and groundbreaking advances. On behalf of AGU’s Honors and Recognition Committee, our Union selection committees, and our organization’s leadership and staff, we are very pleased to present the recipients of AGU’s 2017 Union medals, awards, and prizes.

Fulfilling AGU’s Mission

Our 2017 honorees’ work and scientific achievements embody AGU’s mission of promoting discovery in Earth and space science for the benefit of society. Their discoveries or other contributions have helped improve the lives and prosperity of people and communities around the world. Their passion for scientific excellence and their dedication continue to enhance our understanding of Earth and space science.

We thank all who have given their support and commitment to AGU’s honors program, including the volunteers who serve on the medal, award, and prize selection committees that have chosen this year’s Union honors recipients. We also are grateful for the commitment and engagement of all of the nominators and supporters who make our honors program possible through their dedicated efforts to recognize and commend their colleagues.

Celebrate at Fall Meeting

We look forward to celebrating these 29 exceptional recipients of our Union medals, awards, and prizes and their contributions and achievements at this year’s Honors Tribute to be held on Wednesday, 13 December 2017, at the Fall Meeting in New Orleans.

Please join us in congratulating our esteemed class of 2017 Union honorees listed below.

Medals

William Bowie Medal
Thomas H. Jordan, Southern California Earthquake Center, University of Southern California

James B. Macelwane Medal
Robert E. Kopp, Rutgers University–New Brunswick
Michael P. Lamb, California Institute of Technology
Yan Lavallée, University of Liverpool
Wen Li, Boston University and University of California, Los Angeles
Tiffany A. Shaw, The University of Chicago

John Adam Fleming Medal
Mary K. Hudson, Dartmouth College

Maurice Ewing Medal
Donald W. Forsyth, Brown University

Robert E. Horton Medal
Eric F. Wood, Princeton University

Harry H. Hess Medal
Roberta L. Rudnick, University of California, Santa Barbara

Roger Revelle Medal
Kevin E. Trenberth, National Center for Atmospheric Research

Inge Lehmann Medal
Brian L. N. Kennett, Research School of Earth Sciences, Australian National University

Devendra Lal Memorial Medal
S. K. Satheesh, Indian Institute of Science

Awards

Africa Award for Research Excellence in Earth or Ocean Sciences
Bruno V. E. Faria, National Meteorological and Geophysics Institute

Africa Award for Research Excellence in Space Science
Melessew Nigussie, Washera Geospace and Radar Science Laboratory, Bahir Dar University

Ambassador Award
Jean Marie Bahr, University of Wisconsin–Madison
Robert A. Duce, Texas A&M University
Richard C. J. Somerville, Scripps Institution of Oceanography, University of California, San Diego

Robert C. Cowen Award for Sustained Achievement in Science Journalism
Richard Monastersky, Nature

Edward A. Flinn III Award
Robert L. Wesson, U.S. Geological Survey

Excellence in Earth and Space Science Education Award
Thure E. Cerling, University of Utah
James R. Ehleringer, University of Utah

Charles S. Falkenberg Award
Hook Hua, NASA Jet Propulsion Laboratory

Athelstan Spilhaus Award
Erik Meade Conway, Jet Propulsion Laboratory

International Award
Hubert H. G. Savenije, Delft University of Technology

Walter Sullivan Award for Excellence in Science Journalism–Features
Tony Bartelme, Post and Courier (Charleston, S.C.)

David Perlman Award for Excellence in Science Journalism–News
Courtney Humphries, Freelance Journalist, Boston, Mass.

Prizes

The Asahiko Taira International Scientific Ocean Drilling Research Prize
Michael Strasser, University of Innsbruck

Climate Communication Prize
Stefan Rahmstorf, Potsdam Institute for Climate Impact Research

—Eric A. Davidson, President, AGU; and Samuel Mukasa (email: agu_unionhonors@agu.org), Chair, Honors and Recognition Committee, AGU

Tracking Water Through the North Atlantic Ocean

Wed, 07/19/2017 - 11:50

Warm, salty water from the northeastern Atlantic Ocean enters the Nordic seas through the passages between Iceland, the Faroe Islands, and Scotland. These waters continue north along the eastern margin of the Nordic seas, where they gradually cool, sink to progressively greater depths, and fill all basins with dense water. This water eventually flows back into the deep North Atlantic, compensating for the water transport and associated heat flux to high latitudes that help maintain the mild climate of central and northern Europe.

This flow of warm water north and the return of cold water at depth have been a major focus of many institutional and international initiatives over the years. In January 2017, the Faroe Marine Research Institute (FAMRI) hosted a workshop to bring these activities into clearer focus and to identify issues of concern.

A strength of the workshop was the review and discussion of sustained observations, novel techniques, and modeling activities. Sustained observation includes hydrography, which began with the seminal study by Helland-Hansen and Nansen in 1909 and continues today with quarterly hydrographic surveys by FAMRI and Marine Scotland Science. Participants agreed that these surveys have played a major role in tracking interannual and decadal changes in water properties and long-term variations in transport.

While sailing across the North Sea and the North Atlantic, the high-seas ferry M/S Norröna collects valuable data on ocean currents and water properties on its transits from Denmark to Iceland via the Faroe Islands. Credit: Jógvan í Dávastovu (Smyril Line)

In addition, since 1994, numerous deployments of ocean current meters and acoustic Doppler current profilers on moored platforms have given researchers a far better understanding of the variability of currents at selected sites over a wide range of timescales. The high-seas ferry M/S Norröna now also observes ocean currents and water properties underway at high horizontal resolution on her passage across the North Sea, the Faroe-Shetland Channel (FSC), and the Iceland-Faroe Ridge (IFR).

Remote sensing of sea surface height complements the above hydrographic and moored and underway approaches by providing a valuable space-time context of the upper ocean, workshop attendees agreed. Roughly half of all water entering the Nordic seas spills back into the North Atlantic through the Faroe Bank Channel and across the IFR. Of great interest to participants was how these waters flow across and slide down the ridge’s slopes, mix with surrounding waters, and eventually equilibrate. They noted that this flow needs to be understood to properly parameterize large-scale general circulation models.

Major topics to emerge at the workshop included the following:

- the recognition of large temporal variations in Atlantic inflows toward the Nordic seas, much, but not all, of which is wind driven
- the surprisingly large variations in the southward flow on the western side of the FSC, the causes of which are not yet fully understood
- the challenge of reconciling estimates of Nordic seas exchange rates from diverse observational approaches
- processes with shorter timescales (less than 2 weeks) and smaller spatial scales (5–10 kilometers) and their impact on the larger-scale flow patterns

Participants formed several subgroups to explore how the different observational methods can be best used to support and complement each other. Each subgroup will report its findings for continued discussion and development at a planned follow-up meeting in Bergen, Norway, in September 2017. Interested parties are welcome to join.

A comprehensive report of workshop discussions can be found on the FAMRI website.

Acknowledgments

Participants are grateful to the various funding agencies that have ensured sustained observations in the FSC and IFR and supported related research projects. In particular, the attendees acknowledge the financial support of the following organizations and projects: the Norröna project (U.S. National Science Foundation and Smyril Line); the European Commission through the North Atlantic Climate (NACLIM) program (FP7, G.A. number 308299), AtlantOS (H2020, G.A. number 633211), and Blue-Action (H2020, G.A. number 727852); the Scottish government; the Danish government through Dancea projects Western Valley Overflow (WOW; 2016–2018) and Faroese Monitoring (FARMON; 2017–2019); and the Bjerknes Center for Climate Research, Research Group 4 for travel funding.

—Barbara Berx (email: b.berx@marlab.ac.uk), Marine Scotland Science, Aberdeen, UK; Karin Margretha H. Larsen, Faroe Marine Research Institute, Tórshavn, Faroe Islands; and T. Rossby, Graduate School of Oceanography, University of Rhode Island, Kingston

Quantifying Coastal Rain Forest Carbon Transport

Wed, 07/19/2017 - 11:48

Coastal margins are dynamic zones at the interface between land and ocean, where freshwater and nutrients flow downstream from coastal watersheds into the nearshore marine environment and where anadromous fish, many species of seabirds, and semiaquatic animals return marine-derived nutrients to the land. Quantifying this flux of materials is imperative to understanding linkages between terrestrial and marine environments, but it is often neglected in discipline-specific research.

The north Pacific coastal temperate rain forest (green) extends from Alaska to northern California; glaciers and ice fields (blue) constrain the forested ecosystem to the north and east. Credit: A. Bidlack

The temperate rain forest of the northeastern Pacific, stretching from Alaska to northern California, is a particularly useful model system for understanding material flux. Abundant rainfall and snowmelt transport nutrients from the carbon-dense forests and peatlands of this region to the ocean via a distributed network of thousands of steep, small streams. These material contributions to the ocean may play a critical role in supporting productive marine food webs, abundant salmon runs, and a variety of fisheries.

Aquatic biogeochemistry was the subject of the first Coastal Rainforest Margins Research Network workshop. Thirty-five participants from the United States and Canada took part in 2.5 days of plenary talks, group discussions, and focused work sessions. Attendees included researchers from academia, the nonprofit sector, First Nations and Alaska Native organizations, provincial governments, and federal agencies.

Workshop discussions centered on modeling hydrologic and carbon fluxes for the Pacific coastal temperate rain forest region, with the ultimate goal of developing a data-driven regional flux model for water and carbon. Participants split into five working groups that focused on hydrology, forest and soil ecology, dissolved and particulate organic carbon flux, dissolved inorganic carbon flux, and nearshore marine ecology. The groups discussed data collection methods, model building, and how best to link field and modeling efforts among disciplines and subregions.

The working groups identified several major data needs, which include the following:

- a seamless hydrology layer that incorporates the cryosphere and the hydrosphere (e.g., glacier mass balance and accurate precipitation models)
- scale-appropriate soil maps to use in hydrological routing and biochemical flux models
- multispecies carbon models for freshwater, including dissolved organic, dissolved inorganic, and particulate carbon
- a stronger understanding of the freshwater-marine interface, specifically a region-wide estuary classification based on watershed type and connection to ocean

Attendees agreed that the immense but distributed delivery of minimally processed carbon to nearshore marine ecosystems is the defining feature of this and other similar coastal rain forest ecosystems, which may be underappreciated in the global carbon cycle. Workshop participants debated the relative significance of each carbon component in these fluxes (e.g., particulate, organic, inorganic), and they agreed that to better understand nearshore carbon flux, evaluations will be needed across time and spatial scales.

Meeting participants resolved to initiate a specific effort to quantify the fluxes of carbon to coastal systems and to present a conceptual framework for the importance of the Pacific coastal temperate rain forest in regional- and global-scale carbon cycling.

The next Coastal Rainforest Margins Research Network workshop will focus on nearshore marine ecosystem processes, carbon integration into food webs, and nearshore ocean chemistry. It is scheduled for February 2018 in Vancouver, B.C., Canada. More details about the network and its workshops can be found on its website.

This project is funded by a grant (1557186) from the National Science Foundation and is supported by the Hakai Institute, the Alaska Coastal Rainforest Center at the University of Alaska Southeast, and the University of Washington Freshwater Initiative.

—Allison Bidlack (email: albidlack@alaska.edu), Alaska Coastal Rainforest Center, University of Alaska Southeast, Juneau; Brian Buma, Department of Environmental Sciences, University of Alaska Southeast, Juneau; and David Butman, School of Environmental and Forest Sciences and Department of Civil and Environmental Engineering, University of Washington, Seattle

Saturn Unveiled: Ten Notable Findings from Cassini-Huygens

Wed, 07/19/2017 - 11:47

As summer closes and children around the country gear up for a new school year, a mission that fundamentally changed how we think of life in the universe moves deep into its winter. On 15 September, NASA’s Cassini spacecraft will plunge into Saturn’s atmosphere and burn up, just weeks shy of its twentieth birthday.

An illustration of NASA’s Cassini spacecraft at Saturn. The spacecraft has traveled more than 3 billion kilometers since its arrival at Saturn and downlinked more than 300,000 raw images in the past 13 years. The Cassini-Huygens mission launched in October 1997 and will end on 15 September, when the spacecraft dives into Saturn’s atmosphere. Credit: NASA/JPL

The Cassini-Huygens mission launched on 15 October 1997, carrying 12 scientific instruments and a 2-meter-wide saucer-shaped probe called Huygens to land on Saturn’s hazy moon Titan. With it launched a generation of scientific careers and scientists who dedicated more than a decade of their lives to combing through gigabytes of data to understand the gas giant, its rings, and its moons. And those schoolchildren? They have never known a world without Cassini-Huygens.

The mission itself was conceived in the 1980s, after the Pioneer and Voyager spacecraft sent back images and data from the Saturn system that left scientists wanting more. Cassini’s primary goal was simple: explore Saturn and its surroundings and teach us about the strange places we saw and the data we received during previous spacecraft rendezvous.

Cassini-Huygens did not disappoint. Since its arrival at Saturn in 2004, Cassini has traveled more than 3 billion kilometers in more than 200 orbits around Saturn. It has flown by Titan more than 100 times and the icy ocean moon Enceladus 23 times. And on 14 January 2005, when the Huygens probe touched down on Titan, the mission became the first to successfully drop a lander on an outer solar system moon.

Over the course of its travels, Cassini-Huygens has transmitted more than 300,000 raw images of the planet, its moons, and its rings. Caught in Cassini’s web, our imaginations were—and still are—ensnared by Titan’s murky methane lakes; Saturn’s vast, roiling storms; and, perhaps most important of all, the potential for life below Enceladus’s sheath of ice.

But soon Cassini’s exploration of Saturn’s neighborhood will end and with that the steady stream of images, data, and scenes of curious landscapes. For Linda Spilker, the Cassini-Huygens mission’s head scientist, Cassini’s end feels like a high school graduation. After the mission ends, she said, the Cassini team will disperse “and go all different directions. We’re going to take our knowledge that we’ve learned from Cassini and use that to go forward into the future for the next mission,” whatever that may be.

There are feelings of sadness, of course, she continued, but also of “tremendous pride in all that Cassini has accomplished.”

And those accomplishments pile high. Here are 10 notable findings from data spanning the Cassini-Huygens mission.

1. Cassini Revealed Enceladus’s Potentially Habitable Internal Ocean

Cassini captured this image of Enceladus, with its south pole plumes of water shooting into space, from a distance of 777,000 kilometers. The 500-kilometer-wide moon and its internal ocean have captivated the scientific world ever since the Cassini team detected them in 2005. Credit: NASA/JPL-Caltech/Space Science Institute

In July 2005, mission scientists sent Cassini flying by Enceladus’s south pole. Enceladus, its ice-covered south pole gouged with deep crevasses, orbits within Saturn’s magnetic field. But the magnetic field acts strangely around Enceladus—it responds to the moon much as magnetic fields respond to bodies with thin atmospheres. So researchers sent Cassini to look for that atmosphere.

What they found changed almost the entire focus of the mission. During that flyby, instruments detected water vapor and ice grains. These data, along with data showing that the south pole is warmer than the rest of Enceladus, tipped them off that Enceladus is geologically active. Then, in November, images returned to Earth displayed bright plumes of water and ice spewing from south pole crevasses.

Scientists were stunned. Enceladus doesn’t have an atmosphere. It hosts jets of water spewing out of its icy surface.

Intrigued, the team added 19 more Enceladus flybys; by mission’s end, Cassini had swung past the moon 23 times.

Through those flybys, scientists found that Enceladus’s ice shell encloses a global ocean. By sending the spacecraft through the fountains again and again, researchers detected sodium, water vapor, and organic molecules composed of oxygen, nitrogen, carbon, and hydrogen—molecules thought to be important for life. They even recently found molecular hydrogen in the plumes, which suggests that the ocean’s floor may host minerals reacting with hot water. Could these be hydrothermal vents, which on Earth teem with life?

“Cassini has shown that Enceladus satisfies almost all of the present criteria that define habitability,” said Hunter Waite, principal investigator on Cassini’s Ion and Neutral Mass Spectrometer and researcher at the Southwest Research Institute in San Antonio, Texas. Those criteria would be liquid water, organic molecules, and an energy source for microbes (the possible seafloor hydrothermal activity).

Unfortunately, Cassini does not hold instruments that can study the ocean’s depths, Spilker said. “Had we known that Enceladus had geysers, we would have maybe picked some instruments to help answer” the question of whether life exists underneath the icy shell.

“We must go back and look for life,” Waite added.

2. Huygens Showed Us Titan, a Possibly Primordial Earthlike World

When Voyager passed by the Saturn system, it sent back images of Saturn’s largest moon, Titan, covered in a thick, yellowy haze. Voyager’s infrared spectrometers detected a nitrogen-rich atmosphere peppered with hydrocarbons and organic molecules thought to be biological precursors, leading scientists to speculate that Titan could resemble a primordial Earth. To investigate further, the European Space Agency designed the Huygens probe, which hitched a ride to Titan on Cassini’s back.

On 24 December 2004, the saucer-shaped craft popped off Cassini and careened toward Titan. Several weeks later, on 14 January 2005, Huygens parachuted onto Titan’s surface. Throughout the 2.5-hour descent, Huygens gathered the first in situ data about Titan’s atmosphere and sent humanity the first pictures of the moon’s surface. Mission scientists later stitched those images together into a video.

Cassini has flown by Titan more than 100 times since 2005. Some flybys aimed to sample its upper atmosphere and observe surface features, whereas others allowed Cassini to gaze at Titan’s clouds and study its climate.

The flybys showed that Titan is more complex than anyone imagined, said Elizabeth Turtle, a planetary scientist who works with Cassini’s Imaging Science Subsystem and radar instruments. Cassini-Huygens revealed that Titan’s atmosphere holds heavy, carbon- and hydrogen-rich complex ions, thought to be precursors of life. Dunes of electrified hydrocarbon sand sweep over Titan’s surface, hinting at an atmosphere whipped by wind. Storm clouds dump methane rain that may carve river channels through the frozen surface. This rain may also help to fill the lakes of liquid methane and ethane that spread across the north pole. The liquid cycling of methane is akin to Earth’s water cycle, scientists speculate.

Although Huygens and Cassini revealed much about Titan, there are still mountains more mysteries to solve, Turtle noted. How do complex molecules form in the atmosphere? How do the lakes and seas get their liquid hydrocarbons, solely from the rain or possibly from a subsurface reservoir? Could Titan be a window into our own planet’s past?

3. Cassini Changed How We Think of “Habitability”

The Cassini-Huygens mission’s observations of Enceladus and Titan marked one of the mission’s most profound legacies: a tectonic shift in how we think of “habitability.”

Before Cassini-Huygens, when scientists pondered life beyond Earth, they imagined searching terrestrial planets like Mars or Venus, which orbit close enough to the Sun for liquid water to pool on a rocky surface. Unfortunately, Venus turned out to be a scorching hot planet with a runaway greenhouse effect, and a cavalcade of rovers has yet to find biological traces on Mars.

But Cassini-Huygens changed the game when it found complex hydrocarbons on Saturn’s ocean moons, hinted at hydrothermal vents on Enceladus, and revealed a cycle of liquid methane on Titan, explained Scott Edgington, the Cassini team’s deputy project scientist.

The bottom line is, ocean worlds encapsulated by ice are now targets in the broader search for life beyond Earth. “Those two tiny little moons in the Saturn system have changed our paradigm” of what a “habitable” world might look like, Edgington said.

4. Cassini Found Enceladus Ocean Material in the E Ring

Saturn’s outermost ring, the E ring, is huge, stretching from Titan to the smaller moon Mimas (a distance of about 1 million kilometers). The ring is also puffier than the other flatter rings encircling Saturn.

Saturn’s icy ocean moon Enceladus orbits within the densest part of the E ring, which made scientists wonder if material from Enceladus’s plumes had enough energy to escape the moon. Turns out, they were right. This image was snapped by Cassini’s wide-angle camera in September 2006 as the Sun was directly behind Saturn from Cassini’s point of view. Credit: NASA/JPL/Space Science Institute

But that’s not the strangest thing about the E ring. Enceladus, that game-changing moon, orbits Saturn within this ring. In fact, Enceladus orbits in the densest part of this ring. Once the Cassini team discovered Enceladus’s geyserlike plumes in 2005, they wondered, Could particles from Enceladus’s plumes be bulking up the E ring?

Analyzing icy tendrils of particles jetting out of Enceladus and comparing those particles to E ring particles answered this question: Yes, some of the plume material does indeed leave the moon fast enough to populate the ring, rather than falling back onto Enceladus’s surface.

This finding is particularly exciting because it “makes the entire ring a sample of the moon’s interior,” said Matthew Hedman, a physicist at the University of Idaho in Moscow who studies Saturn’s rings. Perhaps by studying E ring particles, scientists can learn more about Enceladus’s interior ocean, he noted.

5. Cassini Unlocked Mysteries of Saturn’s Hexagon

This image of Saturn’s hexagonal polar jet stream was captured by Cassini’s wide-angle camera in April 2014. Researchers think that the hexagon could be what a jet stream might look like when undisturbed by landmasses or oceans. Credit: NASA/JPL/Space Science Institute

“If you’re a meteorologist, you love Saturn,” said Edgington.

One of Saturn’s features that fascinates atmospheric enthusiasts is the six-sided north polar jet stream simply deemed “the Hexagon.” Voyager’s images hinted at the structure back in the 1980s, but Cassini confirmed its existence once and for all. The Hexagon spans 25,000 kilometers, and thermal imaging revealed that it may extend 100 kilometers down into the planet’s atmosphere.

The Hexagon is a polar jet stream of winds, similar to the river of winds whipping around Earth’s latitudes. The difference is that Earth’s jet stream dips and wiggles because of landmasses and oceans pulling it every which way, Edgington said.

In contrast, Saturn’s hexagon might be what an undisturbed polar jet stream looks like in an atmosphere devoid of land and oceans. Why the jet stream is six sided rather than five or eight sided is still a mystery.

6. Cassini Showed Us One of Saturn’s Huge, Infrequent Storms…

Small thunderstorms pop up from time to time on Saturn, but huge storms arise only about once every 30 years, Spilker said. Because Cassini has orbited the planet for such a long period of time, scientists got to witness one of these massive, global storms from beginning to end.

When Voyager flew by Saturn, it detected lightning, but researchers were unsure of where the lightning signature came from; some even thought it could be from the rings, said Robert West, a coinvestigator on Cassini’s Ultraviolet Imaging Spectrograph (UVIS) instrument. But Cassini’s instruments showed us storms “rising and setting over the horizon” as the planet turned, confirming that the lightning came from Saturn’s atmosphere, he continued.

Then, in 2010, a large white spot formed, signaling a storm. The storm stretched 10,000 kilometers across, about the size of Earth. The planet’s rotation dragged this storm along until it became a band streaking across the planet, widening north to south to span 15,000 kilometers.

From late 2010 through mid-2011, this massive storm raged on Saturn. The storm wrapped around the planet’s northern hemisphere. Storms like this are rare on Saturn; scientists think they occur about every 30 years. Click here for the full caption and description of each stage of the storm. Credit: NASA/JPL-Caltech/Space Science Institute

Over the next several months, as Saturn rotated, the storm continued to drag around the planet like a ribbon, covering 5 billion square kilometers, until one day the tail of the storm collided with the head of the storm and “they annihilated each other,” West said. Then, over just a week, the storm’s activity ceased.

Scientists still aren’t sure why the storm abruptly ended, he noted.

7. …And That Storm Helped Cassini Detect Atmospheric Water

Studying this massive storm also showed that water exists in a lower layer of Saturn’s atmosphere.

On Earth, storms arise because of atmospheric convection from surface heating aided by water vapor that is slightly more buoyant than the rest of the atmosphere. Cool, dense air sinks, and hot, light air rises. The turbulence from this mixing can induce a thunderstorm.

But on Saturn, water is the heavier component. It condenses under hydrogen, helium, and clouds of ammonium hydrosulfide and ammonia. This cap of dry upper layers over wet layers suppresses storm formation for a long time, researchers found.

Because of the lack of mixing between dry and wet layers, the dry region above the water layer eventually cools, becomes heavy, and sinks, pushing the wetter layer upward. This sets off a domino effect of convection like on Earth, but on a much larger scale. During the 2010 storm, that wet layer eventually “punched through the upper clouds” as the dry layer sank, and Cassini was able to clearly detect water clouds in the atmosphere, West said.

8. Cassini Dazzled Scientists with Saturn’s Color-Changing Atmosphere

When Cassini reached Saturn, its northern hemisphere was just coming out of a 7-Earth-year-long winter. Upon arrival, scientists found a crisp, blue atmosphere, West said, rather than the yellowy beige that researchers were used to from ground-based observations.

Scientists were quick to discover the reason: This blue was a hint of a “clean” atmosphere. Saturn’s upper atmospheric layers are formed mostly of hydrogen and helium, which scatter blue light, and methane, which absorbs red light, West said. So light that scatters off the atmosphere takes on a bluer hue.

Then, as summer peaked in Saturn’s northern hemisphere, Cassini observed a color shift to yellow. This happens because sunlight breaks apart methane, and those broken pieces form long chains of hydrocarbons that form particles. These particles, in turn, scatter light toward the red end of the spectrum. These chemical reactions, which are similar to the creation of smog on Earth, create Saturn’s yellowy haze.

What’s more, Saturn’s hexagonal polar jet stream often remains blue long after the surrounding atmosphere turns yellow: This is partly because it doesn’t receive as much sunlight as the lower latitudes, Edgington said, and partly because the sides of the jet stream block mixing from lower latitudes.

9. Cassini Spied Saturn’s Rings Acting Like a Seismometer

Cassini snapped this photo from 4° above the ring plane, from 456,000 kilometers away. By looking at thousands of ring images, scientists noticed that waves propagate through the disk. These waves, they eventually found, could potentially tell us something about Saturn’s interior. Credit: NASA/JPL-Caltech/Space Science Institute

Scientists on Earth can figure out our planet’s internal structure by studying the way earthquake waves move through the ground. But Saturn is a gas giant, devoid of earthquakes. With Cassini, however, and its hundreds of images of Saturn, scientists noted something curious: waves propagating through Saturn’s rings.

Gravitational pull from the planet’s many ring-bound moons can create undulations, but calculations showed that these tugs alone were not enough to produce the signal observed by Cassini. Something else was going on.

By studying thousands of images and running thousands of calculations, scientists tracked the source of these mysterious waves to Saturn itself.

“It’s like the rings are giving you a window into the interior of the planet, much like earthquakes and the frequencies of the waves they generate give you a window into the structure of the Earth,” Carolyn Porco, the imaging team leader for the Cassini mission, told National Geographic.

But what these waves tell us about Saturn’s interior remains a mystery.

10. Cassini Showed Us Saturn’s Other Dynamic Moons

This color-enhanced mosaic, captured in 2015, shows mysterious red streaks painted across the northern hemisphere of Tethys. The streaks look superimposed on craters and other surface features, indicating that they formed relatively recently. Credit: NASA/JPL/Space Science Institute

Enceladus and Titan may capture scientists’ imaginations when it comes to habitability, but Saturn’s other moons beckon with mysteries as well.

Tethys, for instance, which orbits in Saturn’s E ring, sports long, mysterious red streaks across its surface. “Nobody has seen anything like them anywhere else in the solar system,” said Amanda Hendrix, a coinvestigator on Cassini’s UVIS instrument and resident icy moon expert at the Planetary Science Institute in Tucson, Ariz.

Unfortunately, Cassini spotted the stripes too late in the mission to garner closer inspection—it’s not a trivial task to change the trajectory of the spacecraft for more flybys, Hendrix noted. The most mysterious aspect of the stripes is their apparent young age. Researchers can estimate the ages of planetary surface features by counting craters and the relative age of features by what’s covering or underneath a crater. But these stripes travel over craters and the surrounding surface, a signal of youth. Nothing seems to have weathered the stripes or hit them.

Two views from Cassini’s closest ever encounter with Pan, Saturn’s walnut-shaped moon. Pan orbits Saturn within the A ring, in a gap it carved out called the Encke Gap. Researchers suggest that Pan’s ridge formed from ring material falling onto its equator. Credit: NASA/JPL/Space Science Institute

“It really is like someone took a crayon and drew on the surface yesterday,” Hendrix said.

Then there are Saturn’s “tutu” moons (or enchilada moons or walnut moons or ravioli moons, depending on whom you ask). The tiny moon Pan, which cleared out its own path through Saturn’s A ring, made a splash on social media when new high-resolution raw images were released by the mission team.

Even tiny Dione, half the size of our own Moon, could harbor an internal ocean, Hendrix said. Instruments on Cassini have detected a magnetic signal similar to the signal from Enceladus that hinted at its plumes. But subsequent flybys of Dione have uncovered nothing, Hendrix said.

“They all have their own personalities,” Hendrix joked.

Cassini’s Farewell

With oceans of data left to process and interpret, the Cassini-Huygens mission will influence research decades into the future. But it’s no easy task to say goodbye to a mission that, to some, shaped their very careers. West likened the end of a mission to a divorce, whereas Hendrix went even further: “It really is almost a death,” she said. “Just thinking about it, how [Cassini is] going to be gone, it’s a little weird.”

Leigh Fletcher, an astronomer at the University of Leicester in the United Kingdom, started working on his Ph.D. studying Saturn’s and Titan’s atmospheres in 2004, about the time Cassini entered Saturn’s orbit. He still works with Cassini data to this day, studying Saturn’s atmosphere using Cassini’s Composite Infrared Spectrometer.

“Whenever I watch the [animated] movie of Cassini’s final demise, it’s hard not to feel moved,” Fletcher said, referring to a video NASA released in April that simulates Cassini’s upcoming crash with Saturn. Fletcher feels “pride in what we’ve accomplished, gratitude that I was offered a chance to become involved, and sadness that a team I’ve worked with for 13 years will now be moving on to pastures new.”

Sarah Hörst of Johns Hopkins University, who studies Titan’s atmosphere, says she’ll miss looking up to the sky every morning and saying, “Hello Saturn, hello Cassini!” on her way to work.

In the spacecraft’s last few seconds, before it disintegrates in Saturn’s atmosphere and the molecules that make up its instruments and structure become part of Saturn itself, “Cassini will be like a Saturn probe,” Spilker said, collecting its first—and last—direct measurements of the upper atmosphere.

And a billion kilometers away, past planets, moons, and asteroids, scientists will receive those data “until the very end,” she said.

—JoAnna Wendel (@JoAnnaScience), Staff Writer

Testing Models of Near-Space Electrical Currents

Tue, 07/18/2017 - 11:45
Artist’s conception of the large-scale Birkeland currents at Earth shown as “conical” sections, with arrows indicating the direction of electrical current and their relationship to the auroral emissions at low altitudes shown at the bottom of the currents. Credit: Johns Hopkins University Applied Physics Laboratory/S. G. Smith

High in Earth’s atmosphere, electrical currents called Birkeland currents flow along the planet’s magnetic field lines, connecting the magnetosphere to the ionosphere and powering the aurora. These currents often exceed 10 million amperes during geomagnetic storms and are crucial to scientists’ understanding of extreme electromagnetic events, which can disrupt satellite operations, radio communications, GPS navigation, and even the electric power grid.

However, it can be difficult to reliably simulate the magnetosphere-ionosphere (M-I) system because of the many complex factors that feed into its behavior. Because of this modeling challenge, space weather scientists have taken part in a concerted effort, led by the National Science Foundation’s Geospace Environment Modeling program, to compare and evaluate the available models with the physical data.

Most of that work has used data from magnetometers at ground level because they are easily available and can be helpful for evaluating the effects of space weather on the ground. In a new study, Anderson et al. use observations from the Active Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE) to compare how well four different models match real Birkeland currents during two storm events. AMPERE uses magnetic field data from the Iridium Communications constellation of 66 satellites in 780-kilometer-altitude orbits to measure the Birkeland currents. Conveniently, these currents are one of the basic physical quantities calculated in modern physical simulations of Earth’s interaction with the solar wind, so comparison between these observations and models is easy.

Actual magnetic signatures of the Birkeland currents from AMPERE for 00:20–00:30 Universal Time on 4 August 2010. Arrows show the magnetic signal residuals relative to the background magnetic field of Earth projected into the horizontal plane. Different colors indicate data from different satellites of the Iridium Communications constellation. The view is from above the north magnetic pole, with local noon at the top and dawn on the right. Credit: Johns Hopkins University Applied Physics Laboratory/AMPERE/R. J. Barnes

To get an accurate picture of how well the models matched the actual Birkeland currents, the authors used data on how the total current and the spatial distribution of the current density changed over time. They found that all of the models gave the general behavior of the total current, but the quantitative results for the current strength varied considerably between models. When it came to the current density distribution, the models did not fare so well. In some cases, they even predicted the opposite of what actually occurred.
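To make the two diagnostics concrete, here is a minimal sketch of this kind of model-observation comparison, assuming gridded field-aligned current density maps from a model and from AMPERE on a common latitude/local-time grid. The array shapes, units, cell areas, and function names are illustrative assumptions, not the analysis code used in the study.

```python
import numpy as np

def total_upward_current_MA(j_uA_m2, cell_area_km2):
    """Integrate the upward (positive) current density to a total current in megaamperes."""
    upward = np.clip(j_uA_m2, 0.0, None)
    amps = np.sum(upward * 1e-6 * cell_area_km2 * 1e6)  # uA/m^2 -> A/m^2, km^2 -> m^2
    return amps / 1e6

def pattern_correlation(j_model, j_obs):
    """Pearson correlation between modeled and observed current-density patterns."""
    return np.corrcoef(j_model.ravel(), j_obs.ravel())[0, 1]

# Synthetic 10-minute snapshots on a 50 x 24 latitude/local-time grid (made up for illustration)
rng = np.random.default_rng(0)
j_obs = rng.normal(0.0, 0.5, size=(50, 24))             # "observed" current density map
j_model = j_obs + rng.normal(0.0, 0.3, size=(50, 24))   # an imperfect "model"
area = np.full((50, 24), 4.0e4)                          # ~4e4 km^2 per cell, invented

print(f"Total upward current (obs):   {total_upward_current_MA(j_obs, area):.2f} MA")
print(f"Total upward current (model): {total_upward_current_MA(j_model, area):.2f} MA")
print(f"Pattern correlation:          {pattern_correlation(j_model, j_obs):.2f}")
```

Repeating such comparisons snapshot by snapshot over a storm yields the time series of total current and pattern agreement the authors examined.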

Each model has its own strengths and weaknesses, but they all had trouble accurately predicting the latitude spread in currents, pattern variations in current density, and many of the currents on the nightside of the planet. This outcome suggests that the models were good at reproducing the directly driven aspects of the currents but did not account for more complex processes like the ring current and substorm dynamics.

In the end, no one model clearly stood out to the researchers as better than the rest. The results highlight the need to acquire observations continuously to track space weather as it happens. As new generations of simulations are developed, tests against observations will always be needed to know when our predictions of space weather impacts are reliable. (Space Weather, https://doi.org/10.1002/2016SW001529, 2017)

—Leah Crane, Freelance Writer

Mesmerized by Gracefully Gliding Albatrosses

Tue, 07/18/2017 - 11:43

This is one of a series of Editors’ Vox posts exploring what else AGU journal editors do when they’re not reviewing manuscripts. Here, Uri ten Brink, Editor-in-Chief of JGR: Solid Earth, looks back on some recent fieldwork off the coast of Alaska.

Mesmerized by gracefully gliding albatrosses, holding on for dear life as the boat violently rocked, engaging in late-night scientific discussions, and being free of internet tyranny: that is how I spent three and a half weeks last May and June doing fieldwork.

The boat was the 106-foot (32-meter) research vessel Medeia, a converted crab-fishing boat from the Gulf of Mexico, now operated by the Alaska Department of Fish and Game. Persistently bad weather and small shared quarters meant that all 13 of us (5 crew and 8 scientists) had to get along or else … and we did get along very well, which added to the pleasure of conducting the field experiment.

View through the wheelhouse on a lovely day at sea! Credit: Uri ten Brink

Becky, the ship’s cook, and Brian, a USGS geographer, barbequing in the rain. Credit: Uri ten Brink

Queen Charlotte Fault (red line) and the survey area (shaded grey). The red stars show earthquakes of magnitude 7 and greater during the past 90 years; stripes in the ocean show magnetic anomalies. Credit: Uri ten Brink

The science we pursued was mapping the northern part of the Queen Charlotte Fault, a 900-kilometer-long fault stretching from southern Alaska to Vancouver Island, which constitutes the plate boundary between the Pacific and North American plates.

The dimensions of this plate boundary are very similar to those of the much better known plate boundary farther south in California, and the rate and sense of motion between the two plates are the same in both places. Yet the observed deformation due to the relative plate motion is remarkably different.

In California, the deformation is distributed among many faults across the entire state. Off southeast Alaska, the deformation is largely concentrated on a single remarkably straight fault located at the shelf edge and upper slope.

That fault has experienced seven earthquakes greater than magnitude 7 in the past 90 years. The interaction between the fault and the extremely high sediment influx during deglaciations adds an interesting perspective to the study of this place.

During this USGS-funded survey, we mapped the seafloor along the fault and its surrounding area using two main devices deployed simultaneously.

A multibeam echosounder – the head of which was fitted on a customized hydraulic wing attached to the vessel – collected data used to map bathymetry (seafloor depth) and backscatter (an indicator of seafloor texture and composition).

Meanwhile, a high-resolution seismic-reflection system – consisting of a 32-channel hydrophone streamer and a sparker sound source towed on a cable behind the boat – collected seismic-reflection profiles (vertical images of sediment layers beneath the seafloor).

As data was coming in, it felt like nature was slowly turning a page in a book to reveal its secrets to us.

Chief scientist Daniel Brothers trimming the edges of the mini-sparker elements to improve their performance. Credit: Uri ten Brink

Albatrosses and shearwaters gliding behind the boat while a cable (lower right) tows sub-seafloor imaging devices. Credit: Uri ten Brink

—Uri ten Brink, Editor-in-Chief, Journal of Geophysical Research: Solid Earth

USGS Library Cuts Would Harm Research, Education, Say Scientists

Tue, 07/18/2017 - 11:42

The U.S. Geological Survey (USGS) Library, home to one of the largest Earth and natural science collections in the world, faces a 52% funding decrease in the fiscal year (FY) 2018 federal budget proposed by President Donald Trump.

The potential funding loss of $3 million would close at least three of the library’s four branches, eliminate three quarters of the supporting staff, and end public and researcher access to USGS Library collections, according to the FY 2018 USGS budget justification.

This rollback of librarian services and other impacts would damage geoscience research and education, said Earth scientists, educators, and scientific society leaders interviewed by Eos. The harm would also ripple through libraries and other institutions that rely on the USGS Library for materials and guidance not available elsewhere, said librarians and others from outside USGS.

“Defunding the USGS Library has the potential to be devastating,” said Aaron Johnson, executive director of the American Institute of Professional Geologists (AIPG) in Thornton, Colo., referring to the possible effect on research projects of AIPG members.

“If these resources are rendered inaccessible, the nation will lose an invaluable scientific asset and the opportunity for continued commercial return from the information housed in the Library,” wrote 23 science organizations in a 16 June letter to several members of Congress urging continued library funding in 2018 at the level of $5.8 million that USGS currently receives. If that doesn’t occur, the nation “would also lose the federal investment that has already been made in the Library’s collections,” they warned. (The publisher of Eos, the American Geophysical Union, is a signatory of the letter).

Access to Collections and Librarians May Cease

Currently, “USGS librarians provide assistance with finding publications, data, and relevant information to support research activities,” Catharine Canevari, director of the USGS Libraries Program, told Eos. “Librarians assist with hand searching older literature to locate material that has not been digitized, is not catalogued, and is not listed in online indexes.”

With so few staff left after the anticipated cuts, those who would remain are expected to focus on “inward-facing, technical, and operational tasks, with minimal capacity for research support and digitization,” according to a statement that the USGS Office of Communications and Publishing (OCP) provided to Eos. Branch closures would restrict public, researcher, and educator access to nondigitized collections and USGS librarians, the statement also noted. The library operates branches in Reston, Va.; Lakewood, Colo.; Flagstaff, Ariz.; and Menlo Park, Calif.

The most important information to which geologists could lose access is “foundational” materials, such as topographical maps, land use patterns, and historical records, that serve as the starting points for geophysical research projects, Johnson said. He believes that losing access to the nondigitized collections could derail ongoing and future research projects.

Under the proposed funding restrictions, USGS would not be able to maintain its Publications Warehouse, the online official index to USGS-authored publications, according to the OCP statement. The warehouse site received more than 1.2 million unique visitors in 2016 and was the 11th most visited website in the U.S. Department of the Interior in the last 30 days, according to analytics.usa.gov.

No Alternative

Much of the USGS Library’s content is unique or available from fewer than 10 libraries around the world, the agency reported in a 2014 blog post about digitization of its library holdings.

During 2015 and 2016, the USGS Library filled “over 3,600 requests for resources from 820 individual institutions,” according to the OCP statement. “Many other libraries use it as a resource, to get documents and information that they can’t get anywhere else,” said Maeve Boland, director of geoscience policy at the American Geosciences Institute (AGI) in Alexandria, Va.

For example, according to Lisa Long, the librarian at the Ohio Geological Survey (OGS), collections held by the USGS Library and the OGS have little overlap. “In some cases,” she told Eos, “we depend on the USGS collections to have items or be able to explain the provenance of items that we may or may not have in our collections. The organization of the information and help in accessing it that the librarians bring to their public service is not replaceable.”

Potential Education Impacts

Beyond geological research, the USGS Library has provided resources for geology educators and the public for years. USGS estimated that 40% of visitors to the Denver, Colo., branch and 80% of visitors to the Reston, Va., branch were from outside USGS.

Students studying geology in college would also be hit hard by the loss of access to the USGS Library, according to Johnson. An associate professor of geology at Missouri State University for 9 years, Johnson recalled that he relied on data from the USGS Library to create course content for undergraduate classes ranging from introductory to advanced senior-level courses. In particular, he regularly used the USGS Library materials to provide his students with real-world applications of difficult geological concepts. “In one exercise,” Johnson described, “my intro students used peak flooding data available from the USGS stream gauging program to…make predictions of peak flooding events.”

“My students have found working with USGS data to be one of the most valuable parts of their preparation to be professional geoscientists,” he added. “In my opinion, you can’t underestimate the impact on undergraduate education in the geosciences.”
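For readers curious about that kind of classroom exercise, the sketch below pulls a year of daily streamflow from the publicly available USGS water services API and reports the peak daily flow. The gauge number and date range are placeholders chosen for illustration; this is not the exercise Johnson describes, only a starting point in the same spirit.

```python
import requests

# Daily mean discharge (USGS parameter code 00060, cubic feet per second)
# from the USGS daily-values web service. The site number is a placeholder example.
SITE = "01646500"  # example gauge ID; substitute a gauge of interest
URL = "https://waterservices.usgs.gov/nwis/dv/"
params = {
    "format": "json",
    "sites": SITE,
    "parameterCd": "00060",
    "startDT": "2016-01-01",
    "endDT": "2016-12-31",
}

resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()
series = resp.json()["value"]["timeSeries"][0]["values"][0]["value"]
peak = max(series, key=lambda v: float(v["value"]))
print(f"Peak daily mean flow: {peak['value']} cfs on {peak['dateTime']}")
```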

What’s Next?

The requested cut to the USGS Library budget is part of a 15% reduction of overall USGS funding in President Trump’s FY 2018 budget request. In an 18 July draft 2018 spending bill, the House Appropriations Subcommittee for the Interior recommended $116.8 million more for USGS than the president’s request. However, the agency total still falls $46.2 million short of its current funding level, and congressional actions overall on the FY 2018 federal budget remain at an early stage.

In the meantime, “options are being identified and evaluated to inform implementation strategies and decisions that will define the full impact of changes to library services, resources, and collections,” the OCP statement said.

As Boland noted, the agency and its library “are not in control of their own destiny.”

—Kimberly M. S. Cartier (@AstroKimCartier), News Writing and Production Intern

Cities Partner to Prepare for Natural Hazards and Climate Change

Tue, 07/18/2017 - 11:37

Bringing satellite observations and climate projections together with key data collected on the ground helps cities better anticipate geophysical hazards and adapt to climate change.

In November 2016, 10 scientists, engineers, and officials from Rio de Janeiro City Hall visited NASA Goddard Institute for Space Studies (GISS) in New York and received specialized training on urban heat islands, sea level rise, and water quality. At the workshop, Rio officials and NASA scientists learned from each other’s operational experiences.

The first day’s presentations outlined several efforts by Rio de Janeiro and New York to help city planners face current and future challenges. A talk discussed Rio’s sophisticated Operations Center, which monitors many aspects of the city’s operations, including traffic, crime, and weather. Another outlined New York’s efforts to strengthen its climate resilience, for example, by building new coastal defenses at Jamaica Bay. Further talks highlighted NASA’s range of Earth science products with applications for situational awareness and decision support, as well as efforts of the Urban Climate Change Research Network’s (UCCRN) Latin American Hub in Rio.

A science policy roundtable, which included representatives from Rio City Hall, the city of New York, NASA GISS, and the C40 Cities Climate Leadership Group (a global network of megacities committed to addressing climate change), focused on the applicability of Earth observations to support cities’ climate resilience efforts. Remote sensing data sets (e.g., precipitation) and climate projections were deemed to be valuable tools. For decision-making at the city level, however, participants noted that global data sets and projections must be paired with high-resolution in situ measurements and local knowledge.

The second day of the workshop consisted of concurrent training sessions on the workshop’s three main themes. NASA technical experts presented summaries of new research and introduced relevant data sets and resources. In the session on urban heat islands, participants learned about the effects of different surfaces on air temperature, as well as the impacts of green infrastructure on the urban environment. In the session on sea level rise, participants learned that sea level is rising by approximately 3 millimeters per year in the New York area, reflecting a combination of the warming ocean and long-term glacial adjustment. Rising sea level has major consequences for coastal cities, including accelerated erosion, saltwater intrusion, and more frequent “nuisance” street flooding. In the session on water quality, presenters discussed the challenges of studying ocean ecology and monitoring water quality with remote sensing. The presenters noted that several planned satellite missions, such as NASA’s Plankton, Aerosol, Cloud, Ocean Ecosystem (PACE) and the joint Argentinian-Brazilian Satellite of Environmental Information of the Sea (SABIA-Mar), are expected to provide high-quality measurements, particularly in coastal zones.

On the last day of the workshop, the group visited areas of Lower Manhattan affected by Hurricane Sandy in 2012. Participants saw the high-water mark of the storm surge generated by the hurricane, well above the boardwalk at Battery Park. Seeing the affected areas emphasized the need for coastal cities to continue their preparations for further sea level rise and more frequent extreme weather events. A Facebook Live event at Battery Park highlighted the NASA-Rio partnership and featured several of the workshop participants.

Advancing climate resilience through city partnerships requires strong leadership and sustained communication. Rio de Janeiro plans to leverage NASA Earth observations and models, as well as international networks like UCCRN, to develop city-specific sea level rise projections, disaster response plans, and water quality monitoring.

A full workshop summary and description of the NASA–Rio de Janeiro partnership are available at a dedicated NASA-Rio partnership Web page.

—Margaret M. Hurwitz (email: margaret.m.hurwitz@nasa.gov), NASA Goddard Space Flight Center, Greenbelt, Md.; also at Science Systems and Applications, Inc., Greenbelt, Md.; Felipe Mandarino, Instituto Pereira Passos, Rio de Janeiro, Brazil; and Dalia B. Kirschbaum, Hydrological Sciences Laboratory, NASA Goddard Space Flight Center, Greenbelt, Md.

A Powerful New Tool for Research

Mon, 07/17/2017 - 12:00

The open-source Generic Mapping Tools (GMT) software is widely used to process, manipulate, and display geoscience data sets in multiple dimensions. Although earlier versions of GMT provided basic grid input/output for MATLAB®, a separate “mapping toolbox” and programming language developed by MathWorks, the two products could not directly share their data or methods.

Now Wessel and Luis have developed a simple and flexible interface between the two programs that increases their interoperability and extends the capabilities of both tools. The GMT/MATLAB Toolbox provides GMT users with full access to MATLAB’s robust computational abilities while also allowing MATLAB users to access GMT’s specialized applications, including those that produce publication-quality illustrations. The new toolbox is able to access not only the core components of the GMT software package but also custom extensions installed by the user, including one specially developed for the Global Seafloor Fabric and Magnetic Lineation Data Base Project. These advances are made possible by the GMT library, which enables similar interfaces for Octave, Julia, and soon Python.
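All of these interfaces follow the same pattern of calling GMT modules from a host language. As a rough illustration only, the sketch below uses the later Python wrapper, PyGMT, rather than the MATLAB toolbox described in the study; it assumes a current PyGMT installation and is not code from Wessel and Luis.

```python
import pygmt

# Build a simple global map by chaining GMT modules from a scripting language,
# the same module-call pattern the GMT/MATLAB Toolbox exposes to MATLAB users.
fig = pygmt.Figure()
fig.basemap(region=[-180, 180, -60, 60], projection="M20c", frame=True)
fig.coast(land="gray80", water="lightblue", shorelines="0.25p")
fig.savefig("gmt_from_scripting.png")
```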

A screen capture shows the programs in use. The figure on the left is a PNG produced by GMT but loaded into MATLAB. On the right is a MATLAB graph of the individual profiles, with the averaged profile drawn in a thicker line. Credit: Wessel and Luis [2017]

In addition to an overview of the new toolbox, the researchers provide several detailed examples of how it can be applied to data sets of interest to the geoscience community, including an analysis of crossover error that could not easily be accomplished by either program alone. The GMT/MATLAB Toolbox, which the team describes as “a giant step forward in interoperability,” is freely available online for all computing platforms at the University of Hawaii’s GMT website. (Geochemistry, Geophysics, Geosystems, https://doi.org/10.1002/2016GC006723, 2017)

—Terri Cook, Freelance Writer

Algorithm Discerns Where Tweets Came from to Track Disasters

Mon, 07/17/2017 - 11:57

There’s a smartphone in nearly everyone’s pocket these days, and crowdsourced data are downright plentiful: photos, videos, and posts on Facebook or Twitter, to name just a few. Now, researchers have developed a new algorithm to pinpoint the geographic locations of natural disasters mentioned in tweets.

The method, which groups tweets together on the basis of the location likely being referred to in the message, can help emergency personnel gauge the magnitude of a disaster and react quickly to deliver relief services, its developers say. This algorithm, which works nearly in real time, complements traditional methods of investigation such as satellite observations, which can be impeded by cloud cover, smoke, or other obstructions.

“We need information about what is happening right now,” said Jeroen Aerts, a geographer at Vrije Universiteit Amsterdam (VU Amsterdam) and a member of the research team. “We have to rely on people who are actually in the area.”

“In a natural disaster, landlines might fail, a local cellular network may go down, but as long as a single connection to the internet remains, the disaster’s victims can get their calls for help out,” said Christopher Brown, a linguist at The University of Texas at Austin who was not involved in the research.

Which Boston?

Researchers have previously used Twitter data to investigate natural disasters. At the American Geophysical Union’s 2016 Fall Meeting, for example, scientists reported that Twitter posts accurately mapped the extent of flooding in Japan.

Unfortunately, geotagging tweets isn’t as simple as recording the latitude and longitude of the sender’s computer or smartphone. That’s because the Twitter feature of attaching GPS coordinates to a tweet is turned off by default, which results in fewer than 1 in 100 tweets having associated coordinate information. So Jens de Bruijn, a geographer at VU Amsterdam, and his team turned to more creative ways of inferring the geographic origins of tweets.

De Bruijn and his colleagues started by collecting tweets about floods because the researchers had expertise assessing flood risks worldwide. They gathered more than 35 million tweets published in 12 different languages between 2014 and 2017. From this database, the researchers extracted the roughly 11 million tweets that mentioned one or more flood locations.

The researchers then faced a challenge: locations noted in individual tweets were often ambiguous. Did “Flooding houses! #BostonFlood” refer to the capital of Massachusetts in the United States, the town in eastern England, or the municipality in the Philippines?

Metadata to the Rescue

To overcome this uncertainty, de Bruijn and his collaborators analyzed the tweets’ metadata, ancillary information that includes a user’s time zone, hometown, and, if available, GPS coordinates. The team used these data to further refine the location referred to in a particular tweet.

For instance, if a user with a hometown of Waltham, Mass.—a city near Boston, Mass.—tweeted “Flooding houses! #BostonFlood,” the researchers concluded that the flooding was occurring in the United States as opposed to England or the Philippines.
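To make the idea concrete, here is a minimal sketch of this kind of metadata-based disambiguation, using a tiny hand-built gazetteer and a simple distance rule. The gazetteer, coordinates, and function names are hypothetical illustrations of the approach, not the authors’ code.

```python
import math

# Toy gazetteer: candidate (name, lat, lon) entries for an ambiguous place name.
GAZETTEER = {
    "Boston": [
        ("Boston, Massachusetts, USA", 42.36, -71.06),
        ("Boston, Lincolnshire, England", 52.98, -0.03),
        ("Boston, Davao Oriental, Philippines", 7.87, 126.37),
    ]
}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def resolve_place(name, user_home_latlon):
    """Pick the gazetteer candidate closest to the user's stated hometown."""
    candidates = GAZETTEER.get(name, [])
    if not candidates or user_home_latlon is None:
        return None
    return min(candidates, key=lambda c: haversine_km(*user_home_latlon, c[1], c[2]))

# A user from Waltham, Mass. (42.38 N, 71.24 W) tweeting "#BostonFlood"
print(resolve_place("Boston", (42.38, -71.24))[0])  # -> Boston, Massachusetts, USA
```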

“It’s an exciting area of research,” said Aerts. “It combines the natural sciences and the social sciences.”

Unlike other geotagging algorithms that analyze tweets on an individual basis, the one developed by de Bruijn and his colleagues assigns locations to groups of tweets. It lumps together tweets that share a keyword tied to a particular location and that were published close together in time. This method made it possible to geotag more tweets than would have been possible considering each tweet on its own, according to the team.
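The grouping step might look something like the following sketch, which lumps tweets mentioning the same place keyword into one event if they fall within a shared time window. The window length and data structures are assumptions for illustration, not the published algorithm.

```python
from datetime import datetime, timedelta

def group_by_keyword_and_time(tweets, window=timedelta(hours=6)):
    """Group tweets that share a location keyword and were posted close together in time.

    tweets: iterable of (timestamp, keyword, text) tuples, e.g.
            (datetime(2017, 3, 1, 14, 5), "Boston", "Flooding houses! #BostonFlood")
    Returns a list of groups, each a list of tweets likely referring to the same event.
    """
    groups = []
    open_groups = {}  # keyword -> (timestamp of latest tweet, group list)
    for ts, keyword, text in sorted(tweets, key=lambda t: t[0]):
        last = open_groups.get(keyword)
        if last is not None and ts - last[0] <= window:
            last[1].append((ts, keyword, text))
            open_groups[keyword] = (ts, last[1])
        else:
            new_group = [(ts, keyword, text)]
            groups.append(new_group)
            open_groups[keyword] = (ts, new_group)
    return groups
```

Once tweets are grouped, any metadata carried by one member of a group can help pin down the location for all of them.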

“It’s a group effort: A single tweet can only convey 140 characters, but together with hundreds or thousands of other related tweets, the available information adds up to a more actionable sum,” explained Brown.

Putting the Method into Practice

Using a control set of thousands of tweets that one researcher had manually geotagged, the scientists showed that their algorithm automatically and correctly geotagged approximately 70% of tweets. That’s a roughly twofold increase over other geotagging methods that analyze tweets on an individual basis, the team reported.

This new algorithm can churn through millions of tweets in a matter of hours, and de Bruijn and his team are working on how to best share their findings. “We want to see how we can use [this algorithm] in practice,” said Aerts. “We’ve been in contact with the Red Cross and the World Bank.”

—Katherine Kornei (email: hobbies4kk@gmail.com; @katherinekornei), Freelance Science Journalist

Climate Change Indicators Are Not Enough

Fri, 07/14/2017 - 11:21

Dealing with climate change will require transforming both science and society in the coming decades. Society is struggling to accept the idea that Earth’s climate will change radically this century unless we double or triple energy efficiency and rapidly shift from burning fossil fuels to renewable energy sources.

Scientists, who were trained to think of climate as 30-year averages, are struggling to understand a complex physical, ecological, and social system that is changing every decade. We are all faced with an uncertain future and mounting risks that could overwhelm our societies unless we change direction very soon [Oreskes and Conway, 2014].

Ironically, political and scientific interests alike can use our limited understanding of the complexity of the Earth system to rationalize waiting either for greater certainty or for more data. The desire of politicians to protect vested interests is clear, but scientific reticence is looking foolish as extremes of weather and climate increase. Earth science itself needs a paradigm shift.

A Shifting Baseline

Science deals with measurable quantities: We search for the clearest indicators of climate change that we can present to the public so that they can understand what is happening, face reality, and adapt. Clear examples are the shrinking of the Arctic ice cap in summer, the shortened duration of the Northern Hemisphere’s cold season, and the rising sea level caused by warming oceans and the melting of grounded ice sheets.

Although year-to-year records show strong variability, the decadal trends are clear. The 60-year record of rising atmospheric carbon dioxide levels documented at the Mauna Loa Observatory in Hawaii and the satellite record of the summer melting of the Arctic ice cap are striking evidence of change.

But for society, slow changes can be ignored, especially if the interannual variability is large. In contrast, extremes of precipitation and flooding, strong hurricanes and storm surges, heat waves and severe drought cannot be ignored, although they can be dealt with as one-time events that are unlikely to recur.

Some belief systems may attribute these events to factors that are beyond our control. For science, extremes are challenging because they are statistically rare, and our statistics of extremes must necessarily use scarce data from the past.

A still greater challenge is that because the climate has changed, the weather statistics from the past are no longer valid [Milly et al., 2008]. So the public is confused when it hears that a disaster was a “100-year flood” when it knows there have been several similar floods in recent decades.

This dependence for comfort on statistics from the past that are known to be obsolete illustrates the challenge of facing and accepting climate change. Risks are mounting, and given the large costs and lead times needed to build new infrastructure, society needs guidance now to deal with extremes in the future. But the scientific community is struggling to find satisfactory models to deal with extremes outside the historic range [Serinaldi and Kilsby, 2015].

Those who would rather stall for time and argue that it is wise to wait for greater certainty will find instead that the climate system is moving into new unexpected regimes.

Understanding Interconnected Systems

Science is important in providing new understanding of the changing climate, especially in the face of false information provided by many vested interests. For example, it’s critical for local communities in northern latitudes to understand that the rapid melting of the Arctic sea ice in summer in the past 30 years is driven by the same chain of processes that decrease snow cover and produce the warmer winters that they are experiencing.

Less snow or ice cover reduces the reflection of sunlight from Earth’s surface in the daytime and also increases the evaporation of water into the air. This evaporation reduces the amount of heat that radiates into space because water vapor is a powerful greenhouse gas that traps heat within Earth’s atmosphere. Less reflection of sunlight and more water vapor in the atmosphere increase the warming and melt even more snow and ice. As a result, snow cover on land acts as a fast climate switch [Betts and Tawfik, 2016]. In fact, the average temperature of winter increases directly with the decrease in the number of days with snow cover [Betts et al., 2014].

Similarly, small lakes at northern latitudes provide local communities with clear, undeniable evidence of their warming winter climate [Hodgkins et al., 2002; Betts, 2011]. For example, the freeze-up and ice-out (thaw) dates have been recorded over a span of decades for a small lake in northern Vermont called Stile’s Pond. Since 1970, the winter frozen period has shrunk on average by 7 days per decade. But in recent years the variability between one winter and the next has been very large, and communities want to understand why.
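The trend itself is simple to estimate. The sketch below fits a least-squares line to a hypothetical frozen-period record; the numbers are invented for illustration and are not the Stile’s Pond data.

```python
import numpy as np

# Hypothetical record: winter ending in each year and the number of days the pond was frozen.
years = np.array([1971, 1981, 1991, 2001, 2011, 2021])
frozen_days = np.array([140, 133, 126, 119, 112, 105])  # made-up values for illustration

# Least-squares linear trend in days per year, scaled to days per decade.
slope_per_year, intercept = np.polyfit(years, frozen_days, 1)
print(f"Trend: {slope_per_year * 10:.1f} days per decade")  # -> about -7 days per decade
```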

To understand, they have to see the global picture (Figure 1), which contrasts the global mean temperature anomalies for January–February–March periods in 2015 and 2016. The Northern Hemispheric patterns are radically different between the 2 years. In 2015, there was a strong cold anomaly over eastern North America that lasted all 3 months, with 3 months of snow cover in northern New England. The strong temperature contrast between the cold land and the warm ocean produced a series of strong coastal storms that gave Boston a record 9 feet (2.7 meters) of snow. In 2016, this same region was very warm, with very little snow cover. Stationary patterns like this produce climate extremes, but the public needs to know that we cannot yet forecast them on seasonal timescales.

Fig. 1. Contrasting global temperature anomaly patterns for January–March of (left) 2015 and (right) 2016. Data are from NASA Goddard Institute for Space Studies.

The Challenge of Extremes

The public may simply accept these decadal climate changes coupled with large interannual variability. However, society reacts to extreme events. A large earthquake can lead to new building codes, even though such disasters are rare and generally unpredictable. A major flood from Tropical Storm Irene galvanized Vermont to rethink its river buffers and flood plain management.

Hurricane Sandy caused devastating flooding to New York City and New Jersey. The flood hazard assessment for New York Harbor [Orton et al., 2016] estimated that similar flooding could recur in 260 years, but after a 1-meter rise of sea level that return period shrinks to a few decades. Remarkably, the certain rise of sea level in coming decades, which will continue for centuries, has not prompted the radical rethink of coastal management that is needed.
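To see why a modest rise in sea level collapses a return period so dramatically, consider a toy calculation with a Gumbel distribution for the annual maximum storm tide. The parameters below are invented for illustration and are not from Orton et al. [2016].

```python
import math

# Toy Gumbel (extreme value type I) model of annual-maximum storm tide at a coastal site.
# Location mu and scale beta are invented, not fitted to any real record.
mu, beta = 1.0, 0.45  # meters

def return_period(threshold_m):
    """Average recurrence interval (years) for storm tide exceeding a threshold."""
    cdf = math.exp(-math.exp(-(threshold_m - mu) / beta))
    return 1.0 / (1.0 - cdf)

flood_level = 3.50  # meters above today's mean sea level
print(round(return_period(flood_level)))        # ~260 years today
print(round(return_period(flood_level - 1.0)))  # ~30 years after 1 meter of sea level rise
```

Raising mean sea level by 1 meter effectively lowers the flood threshold by 1 meter, which is why a once-in-centuries flood becomes a once-in-decades flood.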

Choosing a Path Forward

Science, like much of society, clings to traditional ways: the deification of rationality, the objectification of the living natural world, and the dream that we have power and control over the natural world. All of these are partial truths, which we inherited from our traditions, but they are inadequate to deal with climate change.

Humanity is embedded in a deeply interconnected living Earth system, whereas science is funded by a society embedded in a consumer-market-growth economy based on the exploitation of the finite resources of the planet.

The shift from the arrogance of our own power to a frame where we understand and cooperate with the living Earth system will not be easy. Earth’s physical climate and ecosystem are simply adapting to changing atmosphere and oceans; climate change is accelerating. The 2015 Paris agreement is global recognition of the challenge we face, but the global transition that is needed will require a transformation of science as well as society in the coming decades.

For scientists, this change will require a new willingness to engage strongly with society, from the local to the national and international levels. I have provided some suggestions on how scientists and lay people can engage to address this issue on my website. Climate change indicators are powerful tools, but they are not enough.

Acknowledgment

This work was partially supported by NSF grant OIA 1556770 to the University of Vermont.

How Geomagnetic Storms Light Up the Geocorona

Fri, 07/14/2017 - 11:19

At the very edge of the exosphere, the outermost layer of Earth’s atmosphere, sunlight scatters off hydrogen atoms. This effect creates the geocorona, a luminous glow when seen in far-ultraviolet light. Its brightness and hydrogen density vary over time, especially after geomagnetic storms, but previous sounding rockets and satellites have never had the temporal resolution to study those shifts in detail.

Japan’s Hisaki satellite, launched in 2013, was intended to study the atmospheres and magnetospheres of the other planets in our solar system from a low orbit around Earth. In a lucky coincidence, the images captured by its Extreme Ultraviolet Spectroscope for Exospheric Dynamics (EXCEED) happened to contain information about Earth’s corona as well, masquerading as EXCEED’s foreground contamination. In a new study, Kuwabara et al. examine this useful information about geocoronal emission to assess observed brightness and density changes.

The researchers analyzed EXCEED observations during a 5-day time period in February 2014, during which three separate geomagnetic storms occurred. They found that between approximately 2 and 6 hours after changes in magnetic activity, the geocorona’s hydrogen density abruptly increased, indicated by an increase in brightness when seen in far-ultraviolet light.

After geomagnetic storms, huge amounts of energy are injected into high-latitude regions of the thermosphere, which expands and transfers energy to neutral particles in the exosphere. This heating process affects the composition of the exosphere, but because the process isn’t complete until more than 10 hours after the storm, this slow energy transfer couldn’t be the cause of the geocorona’s observed sudden brightening. So what exactly was the cause?

The authors found that the brightening may have something to do with the plasmapause, the boundary marking where the atmosphere’s plasma density drops precipitously. Following the geomagnetic storms covered in this study, the EXCEED data showed that the plasmapause tightened around Earth, squeezing from 5 Earth radii above the surface to only 2. This contraction occurred between 2 and 6 hours after the onset of each geomagnetic storm, at the same time as the geocorona’s brightening.

EXCEED observed charge exchange between hydrogen ions in the plasmasphere and hydrogen atoms in the exosphere, which then pushed hot particles to the geocorona. The time frame of the charge exchange supports the authors’ idea that the contraction of the plasmapause could cause the abrupt increase in exospheric hydrogen after geomagnetic storms. Further research into the behavior of Earth’s atmosphere after geomagnetic storms will help us better predict and prepare for them. (Journal of Geophysical Research: Space Physics, https://doi.org/10.1002/2016JA023247, 2017)

—Leah Crane, Freelance Writer

Build Four New U.S. Polar Icebreakers, Report Urges

Fri, 07/14/2017 - 11:17

Deeming the United States “ill-equipped to protect its interest and maintain leadership” in the polar regions, a national science advisory committee has issued a report calling for construction of four new ships with heavy icebreaking capability.

The United States currently owns just one operational, heavy polar icebreaker, the Polar Star, which was built in 1976 and is long past its 30-year design life. The Polar Star and one medium polar icebreaker, the Healy, constitute the entire U.S.-owned operational polar icebreaker fleet, whereas Russia has 16 polar icebreakers with 4 more under construction, Finland has 7, Sweden 4, and Canada 3.

Heavy icebreakers come with high price tags. To hold down costs, which nonetheless would average nearly $800 million per ship, the report from the National Academies of Sciences, Engineering, and Medicine (NASEM) recommends using essentially the same design for all the ships and buying them all through a block purchase.

The document, entitled “Acquisition and Operation of Polar Icebreakers: Fulfilling the Nation’s Needs,” also recommends assigning the U.S. Coast Guard to own and operate the vessels, three of which would serve in the Arctic and one in the Antarctic.

To support future scientific use of the ships, the NASEM committee encourages spending extra initially on the vessels to make it easier and more cost-effective to later equip them for science missions. All of the ships should be built to a standard of “science ready,” at a cost of about $10–$20 million extra per ship, the report recommends. For yet another $20–$30 million, one of the four ships should begin its career as fully “science capable.”

“If you’re going to build a ship that goes to places where no other ship can go, to oceans that we don’t know a hell of a lot about, then you ought to have some ability to do a little science while you’re there,” said Rear Adm. Richard West (retired), who chaired the NASEM committee that wrote the report.

Elements of a science-ready design, the report notes, could include structural supports, flexible accommodations for up to 50 science personnel, and the means to avoid interference with sonar transducers. Full science capability could include carrying oceanographic equipment, instrumentation, and facilities “comparable with those of modern oceanographic research vessels,” according to the report.

West added that with science-ready designs, “you don’t have to go back and retrofit that capability at probably a tenfold more expense.”

Failure to Respond to the Need

There’s no time to lose, according to West and his committee. “For more than 30 years, studies have emphasized the need for U.S. icebreakers to maintain presence, sovereignty, leadership, and research capacity—but the nation has failed to respond,” they state in the document.

West told Eos that he hopes that this study will make a difference. “We’ve been procrastinating on investing in polar icebreakers for a long time, and so this is a crucial report,” he said. “The nation is in extremis. If you don’t do something now, you will be without icebreaking capability very, very quickly.”

The admiral, who from 2002 to 2008 served as president and CEO of the Consortium for Oceanographic Research and Education, which was renamed the Consortium for Ocean Leadership during that period, noted that both the Obama and Trump administrations have expressed support for polar icebreakers.

Funding Prospects

A government project to acquire a new polar icebreaker received about $221 million between fiscal year (FY) 2013 and FY 2017, according to a June 2017 Congressional Research Service (CRS) report. The CRS report also summarizes some of the Coast Guard’s long-standing efforts to beef up its icebreaking capability, including $20 million in contracts awarded to companies in February 2017 for heavy polar icebreaker design studies and analysis.

The Coast Guard’s proposed FY 2018 budget requests $19 million in acquisition funding for a polar icebreaker. U.S. president Donald Trump said in a 17 May speech at the Coast Guard Academy that during his administration, “we will be building the first new heavy icebreakers the United States has seen in over 40 years.”

On Capitol Hill, the Senate Armed Services Committee last month unanimously approved a provision of the FY 2018 National Defense Authorization Act that calls for procurement of up to six Coast Guard polar-class icebreakers.

How Best to Accommodate Science

Carin Ashjian, a biological oceanographer who is a senior scientist at Woods Hole Oceanographic Institution and a member of the NASEM committee, said she would prefer that all four ships be made science capable.

However, she said the report “is a measured, realistic assessment. Given the present levels of funding available to the nation, we could not support [full science capability] on four of those ships. We simply do not have the research funds, nor do funding agencies have funds to maintain the operations of the science portions of a ship.” She said the report builds in flexibility by recommending science-ready and science-capable icebreaker designs.

Kelly Falkner, director of the Office of Polar Programs at the National Science Foundation, said that she appreciates the efforts by the committee to take a pragmatic approach to building polar icebreakers. However, she told Eos that she has concerns about how thoroughly the new icebreakers would meet science needs.

“In an ideal world, you’d have an asset that is controlled and scheduled completely by the science community and operated in the most efficient and effective way for science,” she said, noting that the Coast Guard has many other important priorities, including national security and search and rescue missions. “When it comes to being at the forefront of marine science in the polar regions, you really want to dedicate a vessel to that and optimize a vessel for that.”

Some people think that if the science community doesn’t throw its hat in with the Coast Guard, it will never get a science-suitable icebreaker, Falkner said. However, “I would argue that we’re not at the point where you take what you get,” she said.

This report “is very good from the standpoint of reinforcing that we do need immediately to get going on construction of our [ice]breaker fleet. And I’m hoping we can continue the conversation on exactly how best to accommodate science.”

—Randy Showstack (@RandyShowstack), Staff Writer
