EOS

Earth & Space Science News

An “Old School” Approach and a Community Effort

Wed, 11/15/2017 - 13:12

When I became Editor in Chief of Reviews of Geophysics in October 2009, I decided to do something “old school” and read every paper that we handled. I was fortunate to have had the opportunity to talk with two earlier incumbents—Alex Dessler (1970–1974) and Andy Nagy (1980–1984)—each of whom was not only the Editor in Chief but, at the time, the journal’s only editor. They were single-handedly responsible for all the papers submitted across the entire spectrum of Earth and space science disciplines that the AGU membership spans. Since Reviews generally publishes 24 or so papers per year, I thought that I could be as “hands-on” as these former editors.

Besides learning much more about the breadth and importance of the science that the AGU community studies, I learned that my AGU colleagues are awesome. All of the authors of review papers published in the last eight years have made a tremendous impact on our disciplines. Their review papers not only synthesized the current understanding but also helped resolve controversies and define future directions of the field. A significant fraction of Reviews of Geophysics papers are in the top 1% of highly cited papers, clearly indicating the quality and impact of the author teams’ contributions [Moldwin et al., 2017].

An excellent review paper requires a large and talented pool of referees willing to commit significant effort to providing extensive, timely, and helpful feedback to the author teams and to the Editorial Board. Referees’ comments not only helped authors make their reviews more concise and well structured but also helped them provide context for the broader geoscience community.

The role of the editorial board (Fabio Florindo, Gregory Okin, Alan Robock, Eelco Rohling, Bayani Cardenas, Annmarie Carlton, Kate Huihsuan Chen, Michel Crucifix, Andrew Gettelman, Alun Hubbard, Tomoo Katsura, and Thomas Painter) in identifying author teams and referees to work with them was also crucial to the success of the journal. The Editorial Board has shown tremendous dedication to assisting author teams, referees, and each other in making outstanding decisions. During our tenure, we handled only a few papers that were problematic in one way or another and I know that the editors made thoughtful, fair and defensible decisions.

Assisting the Editorial Board were incredible AGU Editorial Assistants and Publications staff (Julie Dickson, Rochelle Odon, Jenny Solecki, Pam Calliham, Rebecca Knowlton, Graciano Petersen, Edith Judd, Zach Stahly, Randy Townsend, Jeanette Panning, Lorraine Hall-Petty, Indrani DasGupta, Victoria Forlini, Sara Young, Paige Wooden, Bill Cook, Jenny Lunn and Brooks Hanson). They provided clear communication channels between the authors, referees, editors and Wiley production staff. As is true in so many areas of science, without highly professional, competent and friendly staff support, nothing would get done.

I am very grateful for all the support and collegiality offered to me as I served in this role. I am confident that the new Editor in Chief, Fabio Florindo, will lead Reviews of Geophysics well into the future.

At the AGU Fall Meeting in New Orleans, please join me in thanking the AGU publications staff, editors, authors, and referees for all they have done for our community. When you are standing in a beer or coffee line, thank the person next to you, since they have probably been involved with some aspect of the publication process as an author, an editor, or a referee. Our future depends on our continued unselfish cooperation in research and our willingness to assist each other in communicating our science.

—Mark Moldwin, Editor in Chief, Reviews of Geophysics, and Department of Climate and Space Sciences and Engineering, University of Michigan; email: mmoldwin@umich.edu

Pine Island Glacier and Ice Sheet Stability in West Antarctica

Wed, 11/15/2017 - 13:10

The future of Earth’s large ice sheets in Antarctica and Greenland represents the greatest uncertainty in predictions of sea level. The behavior of the West Antarctic Ice Sheet, in particular, is difficult to predict. Of all glaciers worldwide, West Antarctica’s Pine Island Glacier now makes the biggest single contribution to sea level rise. This glacier’s unusually large contribution has been driven by changes in the Amundsen Sea, which increasingly have brought warm ocean waters into contact with floating ice.

The United Kingdom’s Natural Environment Research Council (NERC) Ice Sheet Stability Research Programme (iSTAR) is an effort to improve understanding of West Antarctica’s combined Pine Island Glacier–Amundsen Sea system. The iSTAR Programme formally ended this year, and in May, delegates gathered at the University of Leeds to review its contribution.

The meeting’s goal was a synthesis of iSTAR’s combined achievements and an assessment of these achievements against the program’s original objectives: understanding the transport of ocean heat toward the ice sheet, understanding the sub-ice processes affecting ice melt, and understanding the response of the inland glacier to the ocean-induced changes.

Two significant messages came out of the talks and extended discussions. First, future research in this field should include atmospheric processes, not just ice and ocean processes. Overall, water transport is driven by the ocean’s density structure. However, the way that warm water’s contribution to melting ice varies over time is controlled by atmospheric forcing (particularly wind patterns) at both local (100 kilometers) and synoptic (1,000 kilometers) scales.

Second, the value of long time series observations has proved critical. Many talks demonstrated that Amundsen Sea records have been collected for a long enough period to show that ice sheet changes are an accumulated response to a sequence of oceanographic and meteorological events over recent decades, rather than simply a response to long-term trends. Presenters emphasized the exceptional value of long-term moorings in the Amundsen Sea. Meeting participants agreed unanimously that continuation of these time series efforts should be a high priority in the future.

Several talks demonstrated that numerical models of Pine Island Glacier are now limited—not by their own resolution and abilities, but by the paucity of essential observations, in particular ice sheet thickness in the critical region where the ice goes afloat. More observations are needed in that region before models can reliably predict this glacier’s future behavior.

An operator prepares an ice-sounding radar during windy weather on Pine Island Glacier. The radar measures ice thickness and detects internal layering within the ice. Credit: David Vaughan

The meeting included discussion of the new “tractor traverse” capability, in which tracked snow vehicles transport large equipment and portable living quarters across the ice. The program used this new capability in its ice sheet data acquisition efforts. The wide range and high quality of the ice sheet data included at the meeting showed how successful this had been. Now proven, this capability can support further ice sheet research across a range of geoscience disciplines, including glaciology, geophysics, geology, paleoclimate, and atmospheric observations.

The meeting was supported by NERC; the iSTAR program, which is funded by NERC, has research collaborations with the United States, South Korea, and Germany. Karen Heywood, Adrian Jenkins, David Vaughan, Andy Shepherd, and Keith Nicholls helped improve the quality of the manuscript.

—A. M. Smith (email: amsm@bas.ac.uk), British Antarctic Survey, Cambridge, U.K.

Maintaining Momentum in Climate Model Development

Wed, 11/15/2017 - 13:01

Climate models have gotten steadily more sophisticated over the past 5 decades, representing a wider range of timescales and spatial scales and capturing increasing degrees of complexity and interconnections among different components of the climate system. Climate models use mathematical tools to represent physical processes like evaporation of water from the ocean’s surface, moisture transport in the atmosphere, or mixing of heat in the ocean; the better the math is at mimicking the processes, the more accurately the models can explain past variations and predict future conditions.

Toward this end, in 2003 the U.S. Climate Variability and Predictability (U.S. CLIVAR) national research program, with funding from the National Oceanic and Atmospheric Administration (NOAA) and the National Science Foundation (NSF), assembled a group of climate process teams (CPTs) to focus on improving global climate models. Each team comprised 7–12 principal investigators from academia, partners from modeling centers, and several postdoctoral researchers (some of whom were embedded at modeling centers).

Each CPT tackled a particular physical process (e.g., mixing by internal waves in the ocean or formation of clouds in the atmosphere) and how it is represented in one or more global climate models. The CPTs have a universal mission: improving the representation of physical processes in climate and weather models to help make better predictions of the Earth system. Over the years, CPTs have made significant advances in model performance, allowing us to better represent, understand, and predict climate change.

Funding for these teams was renewed in 2010, but now this funding is running out. Yet we still require improvements to climate models to more accurately predict how environmental conditions will vary in coming years and decades. Thus, the CPTs’ vital work is not finished, and this effort must continue to receive support.

CPT Successes

Past CPTs have produced important improvements in global climate models. New convective parameterizations [Bretherton and Park, 2009], improved ocean model representations of shear-driven mixing [Jackson et al., 2008], bottom boundary mixing [Legg et al., 2006], and mixed-layer submesoscale restratification [Fox-Kemper et al., 2008] are all now included in one or more state-of-the-art global climate models.

By focusing in depth on a single problem for a finite time, CPTs have accelerated scientific understanding of particular processes. Successful examples of this in the oceanographic community include a more complete picture of the ocean internal wave energy distribution [MacKinnon et al., 2017] and new research into ocean submesoscale processes that are not typically resolved by global climate models [Boccaletti et al., 2007].

Boundary layer cloud processes have proven very difficult to parameterize in global climate models, yet they play a critical role in modulating Earth’s climate, so representing them accurately is necessary for understanding Earth’s climate and its response to forcing [Bretherton and Park, 2009; Guo et al., 2015]. Thanks to CPTs, targeted studies have now led to a firmer grasp of the role of clouds in Earth’s energy balance.

Researchers now have a better understanding of the role of clouds in Earth’s energy balance, thanks to the work done by CPTs on improved convection schemes in climate models. For example, water condensation in rising air masses forms the characteristic “tower” structure of cumulonimbus clouds, like this cloud formation over Africa. Credit: NASA

CPTs also led focused research efforts to improve the representation of sea ice and iceberg processes in climate models, yielding better predictions of iceberg calving size distributions over the Antarctic Peninsula [Stern et al., 2016].

The CPTs were instrumental in helping the involved scientific communities to develop strong and enduring links between academia and modeling centers, allowing better use of resources and expertise. Waterhouse et al. [2014], for example, synthesized ocean mixing data from a variety of observing platforms over a long period to provide an observational benchmark for improved mixing parameterizations in global ocean models. This synthesis product has the potential to increase our understanding of global ocean processes, such as the meridional overturning circulation, along with the heat and energy balance of the global climate.

Taking Stock

Currently funded CPTs are coming to an end, and members of the U.S. CLIVAR Process Study and Model Improvement Panel perceived a need to review the benefits the teams provide and to devise a plan for future efforts. The panel decided to seek input from the observational, modeling, and theoretical communities on how best to achieve a translation of process understanding into climate model improvements.

To collect feedback on the utility of CPTs, the panel sent surveys to representatives of U.S. modeling centers, process studies, recent satellite missions, recent CPTs, and U.S. CLIVAR working groups. The results of these surveys confirmed broad community interest for a scoping workshop to identify processes for which newly available observational data and understanding could inform future model improvements. Subsequently, a workshop was held at Princeton University in October 2015 that brought together 90 leaders from the community to discuss a path forward.

All of the outreach and information gathered from the community emphasized that CPT activities have advanced climate models further than would have been possible with traditional funding mechanisms and smaller groups of principal investigators working on such projects. This is evident in a comprehensive U.S. CLIVAR white paper [Subramanian et al., 2016] that shows the need for launching a new CPT-like effort and addresses the questions of what form such an effort ought to take, which areas need to be tackled, and how such an effort might be implemented.

Moving Ahead

The white paper recommends that CPT activities continue in the future, drawing on feedback from the surveys and the workshop. The community’s consensus is that new activities should retain many successful aspects of the past CPTs. These include the formation of teams involving modelers, observationalists, and theoreticians. Team members should be drawn from modeling centers as well as academia, and funds should support postdocs dedicated to the task.

The white paper also lends strong support to approaches involving multiple modeling centers and multiple agencies that are well suited to delivering sustainable and comprehensive improvements to climate models. New developments should enlarge the scope of such activities to consider not only teams built around the theme of improving the representation of a specific process but also new teams focused on coupled processes and model component interactions to address specific biases or climate phenomena. New activities should also consider the emerging computational and expanded observational capabilities.

The U.S. CLIVAR survey demonstrates that the climate science community broadly supports future mechanisms to facilitate the translation of process understanding into improvements in climate models over the coming decade. We encourage our colleagues to form the cross-institutional collaborations among modelers, theoreticians, and observationalists that will enable these model improvements, and we hope funding agencies will continue to welcome these team efforts.

Space Weather Threat to Australian Power Networks Assessed

Tue, 11/14/2017 - 13:48

When the Sun unleashes its powerful storms, some of the most vulnerable pieces of human infrastructure are power grids. The mass of plasma hitting Earth’s magnetic field causes surges of current through the ground that can knock out transformers and cause entire networks to fail. This exact scenario occurred during a powerful solar storm that hit in March 1989, which caused a blackout spanning the entire Canadian province of Quebec.

In the past, power companies largely assumed that even in the strongest storms, the effects would be limited to high-latitude regions like Canada, the northern United States, and Scandinavia.

But then the “Halloween” solar storm of October 2003 struck, damaging 12 power transformers in midlatitude South Africa so badly that they had to be replaced. Since then, scientists have begun surveying and modeling power grids in midlatitude countries like Spain to determine their vulnerability to space weather.

Now Marshall et al. have analyzed the power grids of Queensland and Tasmania in eastern Australia. They combined mathematical models of space weather with maps of the power grid provided by the power companies to simulate how geomagnetic storms would affect the grid. They started by testing the model on two storms considered intense, in October 2013 and June 2015. They found that the modeled currents were very close to the actual currents that the storms induced in the power grid, which were recorded by monitoring equipment installed in 2011.

Having validated the model, they then unleashed it by feeding it space weather data from the 2003 Halloween superstorm, from before the monitoring equipment had been installed.

They found that the network in Tasmania did not experience very strong currents, partly as a result of the relatively small physical extent of the island’s power network. Induced currents were usually less than 5 amps in intense storms, with occasional spikes above 10 amps in superstorms.

However, the Queensland power network was slightly more vulnerable because it is bigger and its power lines tend to share a dominant alignment along the continent’s northeastern coast. When Earth’s geoelectric field happens to be oriented parallel to the coast, the induced currents tend to increase.
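Why this orientation effect matters can be sketched with a toy calculation: in a locally uniform geoelectric field, the voltage driven along a transmission line is the dot product of the field with the line’s end-to-end vector, and the resulting quasi-DC current follows from the circuit resistance. The short Python sketch below illustrates the principle only; the field strength, line geometry, and resistance are hypothetical, and real studies such as this one solve a full network model.

import numpy as np

# Toy geomagnetically induced current (GIC) calculation for one line.
# A real analysis solves the whole network; this only shows why a line
# aligned with the geoelectric field picks up much more current.
def induced_current(e_field_vpkm, line_start_km, line_end_km, resistance_ohm):
    """Voltage = E . L for a uniform field along a straight line; I = V/R."""
    segment = np.asarray(line_end_km, dtype=float) - np.asarray(line_start_km, dtype=float)
    voltage = float(np.dot(e_field_vpkm, segment))  # (V/km) * km = volts
    return voltage / resistance_ohm                 # amps

e_field = np.array([1.4, 1.4])  # hypothetical (east, north) field, V/km, pointing northeast

# A line parallel to a northeast-trending coast vs. one perpendicular to it
parallel = induced_current(e_field, (0.0, 0.0), (140.0, 140.0), 10.0)
perpendicular = induced_current(e_field, (0.0, 0.0), (140.0, -140.0), 10.0)
print(f"coast-parallel line: {parallel:.0f} A")            # ~39 A
print(f"coast-perpendicular line: {perpendicular:.0f} A")  # ~0 A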

During intense storms, Queensland’s geoinduced currents tended to remain below 20 amps. But during superstorms, they could reach higher than 40 amps. Under such circumstances, power companies should consider precautionary action to avoid transformer damage and potential blackouts, the authors say, including reviewing planned maintenance and operating the grid to better manage the extra currents. (Space Weather, https://doi.org/10.1002/2017SW001613, 2017)

—Mark Zastrow, Freelance Writer

Solar Wind Sets the Magnetosphere Ringing

Tue, 11/14/2017 - 13:46

A Sudden Commencement of a geomagnetic storm is the result of an abrupt solar plasma front striking the magnetosphere and disturbing the geomagnetic field. The characteristics of Sudden Commencements have long been described. Using a combination of data from many satellites and ground-based instruments, Takahashi et al. [2017] demonstrate for the first time that the relative importance of two wave modes, the fast and shear Alfvén modes, changes as the disturbance moves earthward, with the latter becoming relatively more important closer to Earth. This gives new insight into solar wind–magnetosphere–ionosphere interactions.

Citation: Takahashi, N., Y. Kasaba, Y. Nishimura, A. Shinbori, T. Kikuchi, T. Hori, Y. Ebihara, and N. Nishitani (2017), Propagation and evolution of electric fields associated with solar wind pressure pulses based on spacecraft and ground-based observations, Journal of Geophysical Research: Space Physics, 122, 8446–8461, https://doi.org/10.1002/2017JA023990.

—Alan Rodger, Editor, JGR: Space Physics

U.S. Weather Alert Systems Must Modernize, Say New Reports

Tue, 11/14/2017 - 13:44

Weather forecasting and hazard prediction capabilities have improved significantly in the past decade, but the United States’ emergency alert and warning systems have not kept pace with advancements, according to two new reports from the National Academies of Sciences, Engineering, and Medicine (NASEM). The reports were released on 1 November.

Research improving the accuracy of weather forecasts and hazard prediction must continue, the reports state. However, to make the best use of forecasts, the nation’s alert capabilities “will need to evolve and progress as the capabilities of smart phones and other mobile broadband devices improve and newer technologies become available,” according to an official summary of one of the reports. The summary adds that “this evolution will need to be informed by both technical research and social and behavioral science research.”

Expand the System to New Technologies

One report, titled “Emergency Alert and Warning Systems: Current Knowledge and Future Research Directions,” identifies knowledge and coverage gaps in the current alert systems, outlines the potential challenges in building and implementing a new system, and sets a research agenda to improve the nation’s alert and warning capabilities by integrating new science and technology.

Weather and natural hazard alerts, like this text message warning residents of an earthquake near Edmond, Okla., on 18 January 2016, can quickly inform the public about dangerous conditions nearby. The new federal reports argue that the most effective alert messages will come in familiar formats through devices that people already use. Credit: J Pat Carter/Getty Images News/Getty Images

For example, the current Wireless Emergency Alerts (WEA) system, a part of the Integrated Public Alert and Warning System (IPAWS), leverages the ubiquity of cell phones in modern life. But the system can fail when cellular networks are congested or unavailable, and it does not use the diverse communication capabilities of smartphones, says the report. It states that social media and private companies, including Facebook, Twitter, and Google, have begun to incorporate hazard warnings and alerts into their platforms, which likely reach more individuals than WEA does.

NASEM’s Committee on the Future of Emergency Alert and Warning Systems: Research Directions, which wrote the report, suggests that “IPAWS could be augmented so that it draws on a wide variety of data sources, enhances public understanding of emergencies and public response, and uses a wider range of potential technologies and devices for delivering messages.” The committee adds that “alerts and warnings that reach people through tools and communication devices they are using and present information in a way they are accustomed to will be the most effective.”

Social and Behavioral Sciences Should Guide System Updates

The report on emergency alerts and warning systems proposes an interdisciplinary research agenda to incorporate research in social and behavioral sciences that could improve the systems’ effectiveness at delivering weather warnings. A separate report, titled “Integrating Social and Behavioral Sciences Within the Weather Enterprise,” expands on that agenda.

The report shows how people’s knowledge, experiences, perceptions, and attitudes toward severe weather forecasts shape how they respond to potential hazards. The report also highlights the need to integrate expertise in social and behavioral sciences to reduce property damage, injury, and loss of life.

To illustrate this need, the report explains that nearly 6,000 people are killed and more than 445,000 people are injured each year in weather-related vehicle crashes on U.S. roadways, despite forecasts, reports, and alerts of hazardous driving conditions. In addition, the report notes that severe weather events with widespread warnings can still result in large-scale loss of life and property damage, as was the case with Hurricane Sandy in 2012 and Hurricanes Harvey, Irma, and Maria earlier this year. A system informed by knowledge of how people respond to warnings, and why, could promote better public safety.

NASEM’s Committee on Advancing Social and Behavioral Science Research and Application Within the Weather Enterprise wrote this second report. In it, the committee explains that “an individual’s response to a severe weather event may depend on their understanding of the forecast, prior experience with severe weather, concerns about their other family members or property, their capacity to take the recommended protective actions, and numerous other factors.”

The report adds that research in social and behavioral sciences “offers great potential not just for improving communications of hazardous weather warnings but also for improving preparedness and mitigation for weather risks, for hazard monitoring, assessment, and forecasting processes, for emergency management and response, and for long-term recovery efforts.”

Challenges Ahead

Both reports acknowledged that their proposed modernization efforts may face significant challenges. They explain that an ever-changing technological landscape and slow adoption of new technologies mean that an updated system would need to be compatible with new and old technologies simultaneously. The summary of the alert system report also recognizes that “a system that instructs large populations to take a particular action may represent a significant target for spoofing or attacks on service availability” and that security and privacy issues would be paramount.

Nonetheless, the two reports agree that integrating new technologies into the current weather emergency alert system, guided by expertise in social and behavioral sciences, can improve disaster preparedness and mitigation. Together, the reports show that updating alert systems can enhance emergency management and response actions and ultimately save homes, lives, and communities from preventable losses.

—Kimberly M. S. Cartier (@AstroKimCartier), News Writing and Production Intern

The Gravity of Geophysics

Mon, 11/13/2017 - 13:01

Volcanoes, glaciers, tectonic activity and ocean dynamics all have an effect on the Earth’s gravity. A recent review article in Reviews of Geophysics described techniques for measuring changes in gravity over space and time. The journal’s editors asked the authors some questions about developments in this field and its application to the geosciences.

In simple terms, what is gravity and how does it vary?

Gravity literally brings us down to the ground. As many people learn in high school, “standard gravity” near the Earth’s surface is about 9.8 meters per second squared. But this is an average; gravity actually varies in both space and time. Because the Earth is not a perfect sphere, its shape, together with other influencing factors such as altitude and the local density of the crust, means that gravity is stronger in some places than in others. Meanwhile, factors such as fluctuations in Earth’s rotation, tides, changes in groundwater content, underground movements of magma, and vertical land movements mean that gravity also changes over time.
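As a concrete example of the spatial variation, a standard reference model (the 1980 international gravity formula, quoted here as an approximation) gives normal gravity at sea level as a function of latitude \varphi:

g(\varphi) \approx 9.780327 \left( 1 + 0.0053024 \sin^2\varphi - 0.0000058 \sin^2 2\varphi \right) \ \mathrm{m/s^2}

This yields roughly 9.78 meters per second squared at the equator and 9.83 meters per second squared at the poles, a spread of about 0.5%.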

Gravity (g) depends on the location on Earth; the time; the relative positions of the Moon, the Sun, and the planets; the climate system; and the mass distribution. For example, ice mass changes and fluid motions in volcanic systems influence the value of g, as do the deformations and mass redistribution associated with large earthquakes. Credit: Michel Van Camp


By measuring changes in gravity over time, what can be learned about the Earth and different processes?

Monitoring changes in gravity over time provides information on deformations of the solid Earth and changes in the distribution of mass. These changes can be related to geophysical causes such as tectonic and volcanic activity, past and present ice mass changes, tides, and the dynamics of the oceans. Hence, such data have a range of applications.

For example, gravimetry has proven useful on volcanoes, where combining gravity and deformation measurements has made it possible to discriminate among gas, water, and magma intrusions and to assess the opening of voids or the magma density changes associated with degassing.

How do you measure these changes in gravity over time?

The science of measuring gravity is known as “gravimetry,” and two techniques coexist: absolute and relative measurements.

Superconducting (blue) and absolute (black) gravimeters measuring at Conrad Observatory, Austria. Credit: Olivier Francis

Using absolute gravimeters, you repeatedly drop a mass in a vacuum chamber, measure its position over time, and infer gravity from these data. These free-fall instruments are subject to wear and are heavy and cumbersome, but this is the only way to accurately measure the absolute value of gravity.
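The principle can be illustrated with a toy fit; this is a sketch only, since real absolute gravimeters track the falling mass interferometrically and apply many corrections (tides, air pressure, polar motion). Here the simulated fall distance z(t) is fit to a parabola, and g is read off the quadratic coefficient.

import numpy as np

# Toy free-fall gravimeter: fit z(t) = z0 + v0*t + (g/2)*t^2 to sampled
# drop positions and recover g from the quadratic coefficient.
g_true = 9.811                                   # value used to synthesize data, m/s^2
t = np.linspace(0.0, 0.2, 50)                    # sample times during the drop, s
z = 0.5 * g_true * t**2 + np.random.normal(0.0, 1e-7, t.size)  # fall distance + noise, m

coeffs = np.polyfit(t, z, 2)                     # returns [g/2, v0, z0]
print(f"recovered g = {2.0 * coeffs[0]:.4f} m/s^2")  # ~9.811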

Using relative gravimeters, you measure the force needed to keep a test mass still, counteracting gravity. These instruments drift, which requires calibration to correct and makes them unable to determine the absolute value of gravity, but they are effective at monitoring gravity changes.
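A common way to handle that drift, shown here as a generic illustration rather than any particular instrument’s procedure, is to reoccupy a base station during a survey and subtract a drift model fitted to the repeated readings:

import numpy as np

# Generic linear drift correction for a relative gravimeter survey.
# Two readings at the same base station should agree; the difference
# is attributed to instrument drift and removed from field readings.
base_times = np.array([0.0, 6.0])                # hours since survey start
base_readings = np.array([1234.50, 1234.62])     # instrument units (hypothetical)

drift_rate = np.diff(base_readings)[0] / np.diff(base_times)[0]  # units per hour

def drift_corrected(reading, time_h):
    """Remove the linear drift accumulated since the first base reading."""
    return reading - drift_rate * (time_h - base_times[0])

print(drift_corrected(1250.30, 3.0))             # field reading at hour 3, corrected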

Presently, absolute instruments are used to perform campaign-style surveys, usually providing one gravity value after measuring for dozens of hours. Relative gravimeters, on the other hand, can measure continuously for years at the same place or perform surveys in which each site is measured for just a couple of minutes.

However, both types of terrestrial gravimeters are generally cumbersome, expensive (typically US$100,000–300,000 for relative instruments and US$500,000 for absolute ones), and tricky to use, and they require a continuous power supply, all of which limits their contribution to Earth study.

How has technology advanced in recent years?

In recent years, our understanding of exactly what existing instruments measure, and of how to get the most interesting science from them, has improved greatly. Absolute atom gravimeters have been under development since the 1990s. Field cold-atom gravimeters are coming and should facilitate the deployment of absolute instruments, as they do not experience wear and will be lighter and smaller.

Microelectromechanical systems (MEMS), microscopic mechanical devices such as those present in smartphones, are also being developed as gravimeters. Presently, the noise level of MEMS is too high, and their ability to measure location-dependent gravity variations has yet to be demonstrated. However, these light instruments could revolutionize air- and sea-borne gravimetry, for example, if such a device could be installed on a drone. They could also be deployed as dense arrays around specific structures such as volcanic, hydrothermal, or karst systems.

Today’s instruments can achieve a high level of accuracy. For example, a superconducting gravimeter can detect the gravitational effect of a child sitting 1 meter above the instrument, which is equivalent to the effect of a 1-millimeter change in groundwater content.
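That sensitivity is consistent with a back-of-the-envelope Newtonian point-mass estimate, using the 45-kilogram mass and 1-meter distance reported in the caption below:

\delta g = \frac{Gm}{r^2} = \frac{6.674 \times 10^{-11} \times 45}{1^2} \approx 3 \times 10^{-9}\ \mathrm{m/s^2} \approx 0.3 \times 10^{-9}\, g

This is close to the measured 0.28 billionth of g; the measured effect is slightly smaller because a person is an extended body rather than a point mass.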

Left: A superconducting gravimeter installed in a shaft at the Rochefort station, Belgium. Center: A 13-year-old boy, weighing 45 kilograms, sat with his navel 1 meter above the instrument. Right: The gravitational effect of the boy sitting there for 6 minutes: his mass induced a decrease in gravity of 0.28 billionth of g. Credit: Royal Observatory of Belgium/Van Camp et al., 2017, Figures A4d and A7


Are there any limitations to this technology?

Gravity is an integrated quantity, which means that a single measured value results from the combined actions of many masses around the instrument, both nearby and very far away. A pervasive problem in gravimetry is that surface gravity measurements alone cannot uniquely determine the mass distribution within the subsurface. However, by combining different geophysical exploration techniques and applying appropriate signal processing, it is possible in many cases to discriminate the signature of the investigated phenomenon from the other contributions to gravity.

How might this technique develop further in the future?

Hopefully, the next generation of gravimeters (atom and MEMS) will be more transportable and may be less expensive if more instruments are produced. The ideal instruments would be cheap absolute gravimeters able to monitor continuously, with a precision equivalent to 10⁻¹¹ g at a period of 1 minute. Such low-consumption gravimeters should weigh no more than a few kilograms and be easy to operate. That way, it will be possible to deploy arrays of gravimeters to understand the functioning of volcanoes and of specific hydrogeological and hydrothermal systems, or to study postseismic and postglacial relaxation.

—Michel Van Camp, Royal Observatory of Belgium; email: mvc@oma.be; Olivier de Viron, Université de La Rochelle, France; Arnaud Watlet, University of Mons, Belgium; Bruno Meurers, University of Vienna, Austria; Olivier Francis, University of Luxembourg; and Corentin Caudron, Université Libre de Bruxelles, Belgium

Hunting Rare Fossils of the Ediacaran

Mon, 11/13/2017 - 13:00

Look at the picture above. What do you see? Tire tracks? Snakeskin? Something else?

The image above is a fossil. But it’s not the kind of fossil you can pluck out of a rock, like a bone. It’s the fossilized cast of an organism, created after an imprint of the creature later filled up with sediment. And it’s among the rarest of the rare: The organism that made this infilled imprint lived around 550 million years ago.

This fossil is of Gaojiashania cyclus, a wormlike creature that crept over ocean sediments long before living things evolved the ability to build hard shells. Its squishy body had nothing that could turn to stone; the only way we know these organisms existed is because scientists have found impressions of them—some that filled to make casts and others that remained hollow—in sediments.

These fossils are tiny, so how exactly do scientists find them? One tip: Be patient and search in low-angle sunlight, said Emmy Smith, a paleontologist at Johns Hopkins University in Baltimore, Md.

Smith has spent the past 3 years searching for evidence of these creatures. The hunt “is a little bit of a wild goose chase,” she said. The trick to finding the tiny fossils? Nothing, Smith explained, beats a healthy dose of serendipity.

Fossil Hunting Tips

Tiny fossils of these ancient worms have ridges that cast shadows in low-angle light, so prepare to stand outside for hours, picking up rocks and turning them every which way, Smith said. “If the Sun is directly overhead, it’s very difficult to see these fossils.” Some of the fossils are the size of fingernails but are recognizable to Smith’s trained eye.

The most important tip of all? Don’t go out specifically searching for these rare fossils. Start your field adventure with a different question in mind. For example, what kinds of environmental changes signaled the end of the Ediacaran and the start of the Cambrian period about 542 million years ago?

Then, just hope for a big find, Smith said.

A fossil of Conotubus found outside Pahrump, Nev. This creature is also a Cambrian cousin that lived during the late Ediacaran. The fossil is about 10 millimeters wide and formed as minerals replaced organic matter within the creature. Credit: Smith et al. 2017, doi:10.1098/rspb.2017.0934

Smith got that big find in 2014 while on a field trip outside Gold Point, Nev. The town’s population, which reached about 14,000 at the height of the gold rush, now ranges from 4 to 8, depending on the season.

There, Smith discovered a fossil of a wormlike creature called Conotubus that lived during the last 10 million years of the Ediacaran period. Analysis showed that it is a fossil in the same sense as a dinosaur bone: minerals directly replaced matter within the organism. The specimen is among the first of its kind to be discovered in the United States.

“So many people have been to these sections” outside of Gold Point, Smith said. “It’s sort of surprising to be finding all of these different early forms of life” just 300 kilometers from the Las Vegas strip.

Fossil Mysteries

Why does Smith hunt for the tiny tubes, anyway? Smith studies the boundary between the Ediacaran period and the Cambrian period and what environmental changes might have occurred over that boundary.

Cambrian creatures were the very first to have body shapes similar to those of today’s arthropods: invertebrates that include everything from centipedes to crayfish to crabs. The Ediacaran, meanwhile, hosted some forms of life completely unfamiliar to us. Scientists call them the “Ediacaran biota.”

Researchers know that these two groups of creatures overlapped. For example, soft-bodied G. cyclus has familiar forms that we associate with Cambrian creatures, although it predates the Cambrian’s onset. Although G. cyclus and other tube-shaped organisms like it lived during the Ediacaran, it is not what scientists consider Ediacaran biota. Members of the Ediacaran biota may have looked more like ridged disks—nothing like anything that exists today.

A fossil cast of a creature from the group known as the Ediacaran biota. These creatures existed during the Ediacaran period, which spanned from 635 million to 540 million years ago. Members of the Ediacaran biota are unlike anything that exists today. Researchers typically find these fossils in rocks that are between 580 million and 541 million years old. Credit: James Gehling

One of the biggest questions surrounding the Ediacaran biota is simple: Why did they disappear? And why did the Cambrian creatures (our ancestors) survive? Was it climate change? A huge disaster like volcanism or a meteorite?

Two leading theories attempt to explain the Ediacaran biota’s extinction, Smith said. One involves simple competition. Maybe Cambrian animals were simply better equipped to live in their ancient aquatic environment. Maybe they got all the food and stole all the resources, until the Ediacarans died out.

Another hypothesis is an environmental one, much like how large dinosaurs went extinct. Maybe an environmental change wiped out Ediacarans, and the Cambrian creatures, like the mammals that lived with the dinosaurs, blossomed to fill empty ecological niches.

To answer these questions, Smith and other scientists go fossil hunting. Their goal is to examine the chemical properties of the rocks in which the fossils reside; chemical changes within those rock layers could point to widespread environmental disasters that signaled Ediacaran doom.

A fossil cast of a specimen most likely in the Wutubus genus. It’s also a Cambrian cousin that lived during the late Ediacaran. The fossil is about 10 centimeters long. Credit: Smith et al., 2016, https://doi.org/10.1130/G38157.1

Eureka!

Fossils of G. cyclus and other wormy creatures of the late Ediacaran are not the only fossils that Smith has found. She’s also discovered fossils of the Ediacaran biota, here in the United States. And Smith couldn’t have found these specimens without a bit of luck.

This story begins 14 years ago, with a structural geologist named Jeremiah Workman and his wife, Nichole. The pair were hiking just outside Pahrump, Nev., while Jeremiah was studying the landscape for the U.S. Geological Survey’s National Cooperative Geologic Mapping Program. On the long hike back to their car, Nichole spotted some interesting-looking fossils in a slab of rock about a meter long. The rock came from a unit that Jeremiah knew dated to the Late Cambrian.

“It was getting dark, I was tired, and not particularly interested in fossils from the Cambrian,” Jeremiah told Eos. “See what I knew?”

Jeremiah’s backpack was full, so Nichole “carried it in her arms down the steep, rocky hillslope,” Jeremiah said.

For 14 years, the slab sat in various places in the Workman household, even spending 5 of those years on their porch “exposed to harsh Colorado winter weather,” Jeremiah said. Then, in 2014, after Smith contacted Jeremiah about something completely different, he sent her a picture of the slab and asked whether it was anything interesting.

When she received the pictures, Smith did a double take. She saw what the Workmans had not: The ridges and dips scattered across the rock were extremely rare fossil casts of the Ediacaran biota. “I was like, ‘Oh my god, people have been looking for these for decades,’” Smith said.

Close-up shot of the slab found by Nichole Workman in Nevada. The image shows several fossil casts and molds, marked by white arrows. All are examples of the elusive Ediacaran biota. Scale bar is 5 centimeters. Credit: Emmy Smith

She rushed out to Denver, Colo., where the Workmans live, and collected the slab, carting it to Washington, D. C., where she worked at the Smithsonian Institution at the time. During the next field season, in 2016, Smith and a team visited the site outside Pahrump and ended up finding many more fossils of the Ediacaran biota.

The team’s findings were published this summer in Proceedings of the Royal Society B. But none of the specimens they found were as well preserved as those in the original slab, Smith said.

Watch Where You Step?

Finding Ediacaran and Cambrian fossils in Nevada opens up the southwestern United States as a new place to find these extremely rare signatures of life, Smith said. Scientists have found these fossils only in a few other remote places in the world, like the Australian Outback and the deserts of Namibia. Gold Point and Pahrump, meanwhile, are relatively easy to access. She plans to go back to both these locales to hunt for more fossils of the Ediacaran biota.

Now that people have a clear idea of where to search and what to look for, Smith expects that more fossil hunters looking for rare specimens may comb the Nevada wilderness. Here Smith gives us our final tip: Don’t worry where you step.

“Because they’re so rare and in remote places, I don’t worry that people are stepping on them,” she said.

—JoAnna Wendel (@JoAnnaScience), Staff Writer

Gonzalez Receives 2017 Space Weather and Nonlinear Waves and Processes Prize

Mon, 11/13/2017 - 12:58
Citation

Walter Gonzalez

The 2017 Space Weather and Nonlinear Waves and Processes Prize has been awarded to Dr. Walter Gonzalez of the Brazilian Space Research Institute (INPE) by the AGU Space Physics and Aeronomy section and the Nonlinear Geophysics focus group. Dr. Gonzalez has been a longtime leader in space weather research and related international collaboration.

Walter conducted early pioneering work on magnetic reconnection with Prof. Forrest Mozer. Reconnection is the fundamental process that largely governs the interaction between the solar wind and the magnetosphere. This work has been essential to the development of various coupling functions that try to quantify the energy transfer from the solar wind to the magnetosphere.

Walter is best known for his work on magnetic storms. His 1994 Journal of Geophysical Research paper “What is a geomagnetic storm?” is a seminal work. Many of Walter’s other papers quantified the solar wind input that leads to magnetic storms and the interplanetary origin of those features. Among these contributions is work done with Bruce Tsurutani that identified the effect on the magnetosphere of large-amplitude Alfvén wave trains in high-speed streams. Another paper is the first modern analysis of the great magnetic storm of 1859, the Carrington Event, which is the largest magnetic storm on record and the presumed upper limit for the most extreme space weather event that can befall our civilization. Walter has also been a leader in international collaboration in the study of magnetic storms and reconnection, organizing many workshops on these topics.

His contributions to the study of space weather, extending over 40 years, make him exceptionally well suited to receive this award.

—Ramon E. Lopez, University of Texas at Arlington

Response

I am greatly honored to receive the 2017 AGU Space Weather and Nonlinear Waves and Processes Prize. I would like to thank the related AGU award committee as well as my nominator, Dr. Ramon Lopez, and supporters for this award. Dr. Lopez is a brilliant space physicist who has strongly contributed to many important areas of space research, especially in magnetospheric physics.

For the two main topics of magnetospheric research related to space weather in which I have worked, magnetopause reconnection and magnetic storms, I would like to especially acknowledge the contribution of Prof. Forrest Mozer of UC Berkeley (my Ph.D. thesis adviser) and of Dr. Bruce Tsurutani of NASA Jet Propulsion Laboratory. Prof. Mozer’s insight and experimental support were crucial for the elaboration of our first quantitative model on component reconnection at the magnetopause. Similarly, the important and extensive contribution of Dr. Tsurutani in our joint work on magnetic storms over the years has resulted in many achievements toward the definition and development of research in space weather.

I would also like to thank Prof. Vytenis Vasyliunas of the Max Planck Institute, Prof. Yohsuke Kamide of Nagoya University, Prof. Eugene Parker of the University of Chicago, and Dr. David Sibeck of NASA Goddard Space Flight Center for their encouragement and help.

Finally, I would like to give special thanks to my wife, Dr. Alicia Gonzalez, and to the other research members of the Brazilian National Institute of Space Research for their important contributions during my research career.

—Walter Gonzalez, Instituto Nacional de Pesquisas Espaciais, São Paulo, Brazil

Mainprice Receives Paul G. Silver Award

Mon, 11/13/2017 - 12:56
Citation

David Mainprice

As befits a Silver awardee, David Mainprice’s scholarship transcends boundaries of mineral physics, tectonophysics, and seismology, enabling improved understanding of S wave splitting, thermal diffusivity, phase transitions, and relations among deformation, elastic moduli, and seismic properties. David was directly connected to Paul Silver; they coauthored two influential papers and maintained a personal friendship. But more important in the context of this award are David’s intellectual generosity, enthusiastic mentorship, and kind cooperation with students and colleagues. Beginning in 1990, he committed to making his petrophysics programs and databases freely available. Nowadays, collaborating with Hielscher, Bachmann, Schaeben, and others, David is a major contributor and teacher of MTEX, an open-source code providing robust statistical assessments of crystal preferred orientation, seismic velocity anisotropy, and shear wave polarization. At any given meeting, there are always a spectacular number of posters displaying figures using MTEX or his older software. David’s generosity extends to hosting research visits in Montpellier and providing workshops worldwide. It was easy to collect heartfelt and eloquent quotations illustrating his influence as a mentor and collaborator. D. Prior, Otago University, said, “Mainprice’s contributions…to texture measurement have been trendsetting, yet openly available…. [His lab] made world-class EBSD instruments available to international users…. [When] launched 15 years ago, it included many components built in-house, sometimes out of commonly available household items (including coffee filters).” K. Michibayashi, Shizuoka University, commented, “I was very much inspired by David…[and] still rely on his products and basic ideas.” Q. Wang, Nanjing University, commented, “[Dave is] a generous teacher and friend. He helped to establish my career and taught me how to become an honorable scientist.” S. Misra, Indian Institute of Technology, Kanpur, quoted a Sanskrit aphorism: “Sharing knowledge gives humility; humility gives character.” Misra concluded that David Mainprice epitomized the essence of that saying.

—Brian Evans, Massachusetts Institute of Technology, Cambridge

—Andrea Tommasi, University of Montpellier, Montpellier, France

Response

First, I thank Brian Evans and Andrea Tommasi, who wrote the citation. It is very humbling to have received this award associated with the name of Paul Silver, an extraordinary seismologist, genial colleague, and great friend who is greatly missed by all. I acknowledge the people who taught me many things, starting with lectures at Kingston Polytechnic based on practical work and fieldwork. Ernie Rutter at Imperial College, where my research started under his direction, is an exceptional person with unlimited talents. He taught me countless things, including tensors, which at the time was of little interest to me! I was fortunate to have Mervyn Paterson as my Ph.D. supervisor at Australian National University, a man with a very strong background in physics and a very critical eye for detail. He set the standards I tried to follow. Some years later I took a sabbatical in Brian Evans’s group at Massachusetts Institute of Technology, where again I learned more about rock deformation from one of the masters of the subject. In addition to the names above, I have collaborated with many researchers who were young at the time. I will mention some of these people, as space is limited: Yvon Montardi, Shaocheng Ji, Philippe Blumenfeld, Jan Behrmann, Keith Benn, Geoff Lloyd, John Wheeler, Bernard Seront, Guilhem Barruol, Alain Vauchez, Hartmut Kern, Anke Wendt, Walid Ben Ismaïl, David Jousselin, Gwen Lamoureux, Benoit Ildefonse, Marcos Egydio-Silva, Luigi Burlini, Benoit Dewandel, Jerome Bascou, Benoit Gibert, Ela Pera, Katsuyoshi Michibayashi, Patrick Cordier, Philippe Carrez, Stanislav Ulrich, Fabrice Fontaine, Miki Taska, Manuele Faccenda, Arnaud Metsue, Qin Wang, Richard Law, Ralf Hielscher, Helmut Schaeben, Bjarne Almqvist, Razvan Caracas, Alex Mussi, Claudio Madonna, Lucille Bezacier, Marie Violay, Rolf Bruijna, Sylvie Demouchy, Santanu Misra, Luiz Morales, Victoria Shushakova, Ewin Frets, Takako Satsukawa, Mainak Mookherjee, Tanvi Chheda, Steve Peuble, Fabio Arzilli, José Alberto Padrón-Navarta, Thomas Chauve, Maurine Montagnat, Sandra Piazolo, and Baptiste Journaux.

—David Mainprice, Université de Montpellier, Montpellier, France

Sounding the Black Smoker Plumes

Fri, 11/10/2017 - 12:59

Hydrothermal vents act like pores in Earth’s skin, providing portals for the exchange of heat and chemicals between Earth’s interior and the overlying ocean. These vents play a major role in the Earth-ocean system, coupling submarine magmatic, tectonic, biological, and hydrothermal processes.

Vent systems and their interactions have been a subject of intense study over the past few decades [Kelley et al., 2014]. Heat flux, the amount of heat discharged from a vent site over a given time span, is one of the most important field measurements for understanding these interactions [Rona et al., 2015]. Despite the importance of quantifying the amount of heat discharged from a hydrothermal system over time and space, this task has proven challenging for scientists.

We have tackled this challenge using an unprecedented long-term acoustic data set, spanning more than 4 years, that captures hydrothermal venting off the coast of the Pacific Northwest region of North America. The Cabled Observatory Vent Imaging Sonar (COVIS) recorded this data set while it was connected to the world’s first high-power, high-bandwidth cabled seafloor observatory: the North East Pacific Time-series Underwater Networked Experiments (NEPTUNE) observatory. Ocean Networks Canada (ONC) operates NEPTUNE at the Endeavour segment of the Juan de Fuca Ridge, just off the coast of British Columbia.

Artist’s conception of COVIS in place near the hydrothermal plumes at the Grotto vent cluster. Plumes that were not imaged by COVIS are not shown. The lighter, circular segment represents the coverage provided by COVIS’s sonar for a single ping while making plume measurements. For such measurements, the sonar head rotates vertically in 1° steps to cover the initial 30-meter rise of the plumes. Credit: Background imagery by L. Liu (Rutgers University); final artwork by K. Reading (University of Washington)

Going Beyond Snapshots and Keyhole Views

Almost all heat flux measurements made to date are either one-time snapshots or annually repeated samples. Thus, they have limited application for studying the gradual evolution of seafloor venting at finer timescales. Furthermore, many of these measurements have limited spatial coverage and resolution. Extrapolating such spot measurements over larger areas like a vent field can produce results with substantial uncertainty [Rona et al., 2015].

Fig. 1. COVIS deployed at the Grotto vent cluster on the Juan de Fuca Ridge. Credit: Ocean Networks Canada

New instruments like COVIS (Figure 1), connected to cabled seafloor observatories, show considerable potential for quantitative monitoring of the heat and fluid fluxes of hydrothermal discharge over significant areas (~100 × 100 meters) of seafloor vent sites.

The acoustic data from COVIS yield 3-D images of focused-flow areas, including plumes rising tens of meters from black smoker vents on a sulfide structure named Grotto. COVIS also produces 2-D maps of the distribution of diffuse flow venting in the surrounding areas where optically transparent hydrothermal fluids discharge at relatively low temperatures.

More important, further analysis of the acoustic data yields time series measurements of the flow rate, volume flux, and, subsequently, the heat flux of hydrothermal venting. These measurements are essential for studying how a hydrothermal system changes over time and how it interacts with geological, oceanic, and biological processes.

What Is COVIS?

COVIS was specifically designed to acquire water column multibeam backscatter data for imaging hydrothermal plumes and measuring their vertical velocity and heat flux and to use acoustic backscatter from the seafloor to detect and monitor diffuse hydrothermal discharge. COVIS uses a multibeam sonar with three-axis rotation (Figure 2) mounted on a 4.2-meter tower. The electronics and data-handling hardware are contained in a pressure vessel near the base of the tower.

Fig. 2. COVIS sonar transducers, transmitter (T) and receiving array (R), and elevation motor (E). Reprinted from Deep Sea Research Part II: Topical Studies in Oceanography, 121, Karen G. Bemis, Deborah Silver, Guangyu Xu, Russ Light, Darrell Jackson, Christopher Jones, Sedat Ozer, and Li Liu, The path to COVIS: A review of acoustic imaging of hydrothermal flow regimes, pp. 159–176, Copyright 2015, with permission from Elsevier.

COVIS evolved from an initial theoretical study in the 1980s that examined whether black smoker plumes could be imaged acoustically. In the 1990s and early 2000s, researchers used acoustic backscattering to image plumes using various sonars mounted on human-occupied and remotely operated vehicles (ROVs). Later, scientists extended acoustic methods to image and measure both focused and diffuse flow [Bemis et al., 2015].

In 2000, a sonar mounted on ROV Jason achieved a milestone along this development timeline. Jason’s 23-hour series of acoustic images of plumes and diffuse flow at the Grotto vent cluster in the Main Endeavour Field marks the transition from snapshots to time series observations of hydrothermal flow [Rona et al., 2006]. Even that short time series shows the plume alternately bending northeast and southwest, with corresponding variations in mixing with ambient seawater driven by tidal currents.

Most recently, COVIS was installed and connected to ONC’s NEPTUNE observatory at the Grotto vent cluster in the Main Endeavour Field on the Juan de Fuca Ridge. COVIS operated at Grotto (with a few interruptions for cable repairs and network outages) from 29 September 2010 to 10 September 2015, collecting an unprecedented time series of hydrothermal flow. The current COVIS database consists of about 4 years of plume images, Doppler vertical flow, volume flux, heat flux, and diffuse discharge maps at various time intervals (Figures 3 and 4); these data can be accessed at the Ocean Networks Canada Oceans 2.0 website.

Fig. 3. Example of one 3-hour instance of COVIS data at Grotto vent, Main Endeavour Field, Juan de Fuca Ridge, on 20 March 2014 at 03:00–04:00 coordinated universal time (UTC). (a) Three plumes are shown using isosurfaces of acoustic backscatter above the bathymetry at Grotto. (b) Vertical velocity for the same plumes. (c) Vertical velocity and volume flux are plotted for the largest plume. (d) The seafloor is draped with acoustic decorrelation estimates. Yellow areas indicate where we detected diffuse discharge. Credit: Karen Bemis

From Sonar Echoes to Heat Flux

The water column imaging capabilities of COVIS build on standard backscatter-based imaging. We increase the signal-to-noise ratio by replacing the magnitude of backscatter from one ping with the magnitude of the waveform difference in backscatter between successive pings. COVIS’s imaging is sensitive primarily to rapid, turbulent temperature fluctuations in high-temperature plumes and secondarily to particulates [Xu et al., 2017].
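
The ping-differencing step can be illustrated with a minimal sketch (synthetic arrays and made-up dimensions, not the COVIS processing code):

```python
import numpy as np

def ping_difference_intensity(pings):
    """Average magnitude of the waveform difference between successive pings.

    pings: complex backscatter series, shape (n_pings, n_samples).
    Echoes from stable targets (seafloor, sonar structure) repeat from ping
    to ping and cancel in the difference; rapidly decorrelating plume
    turbulence does not, so the plume stands out against the background.
    """
    diff = np.diff(pings, axis=0)           # successive-ping waveform differences
    return np.mean(np.abs(diff), axis=0)    # average difference magnitude per sample

# Synthetic demo: steady background everywhere, fluctuations beyond sample 150
rng = np.random.default_rng(0)
background = np.ones((50, 200), dtype=complex)
plume = 0.5 * (rng.standard_normal((50, 200)) + 1j * rng.standard_normal((50, 200)))
plume[:, :150] = 0                          # "plume" occupies only the far samples
intensity = ping_difference_intensity(background + plume)
print(intensity[:150].mean(), intensity[150:].mean())  # near zero vs. clearly elevated
```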

In an alternative mode, COVIS data are processed for Doppler phase shifts to estimate line-of-sight velocities from the backscattered sound waves. These velocities can then be geometrically converted to yield estimates of plume vertical velocities, which in turn are used to estimate the volume flux of material rising from the vent. We then use the theoretical dependence of heat flux on the vertical variation of volume flux to calculate the heat flux.
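
As a rough illustration of that last step: classic plume theory predicts that the volume flux Q of a buoyant plume grows with height z roughly as Q = k B^(1/3) z^(5/3), where B is the buoyancy flux, so fitting a measured Q(z) profile yields B, which converts directly to heat flux. The sketch below is a hypothetical version of that calculation, with assumed seawater constants and an assumed plume-theory coefficient; it is not the COVIS processing chain itself:

```python
import numpy as np

# Assumed seawater constants (illustrative values only)
RHO = 1030.0      # density, kg/m^3
CP = 3985.0       # specific heat capacity, J/(kg K)
G = 9.81          # gravitational acceleration, m/s^2
ALPHA_T = 2.0e-4  # thermal expansion coefficient, 1/K
K_PLUME = 0.1     # plume-theory constant for an assumed entrainment coefficient

def heat_flux_from_volume_flux(z, Q):
    """Fit Q(z) = K_PLUME * B**(1/3) * z**(5/3) by least squares to recover the
    buoyancy flux B, then convert to heat flux H = RHO*CP*B / (G*ALPHA_T)."""
    x = z**(5.0 / 3.0)
    coeff = np.sum(Q * x) / np.sum(x * x)  # least-squares slope of Q against z^(5/3)
    B = (coeff / K_PLUME) ** 3             # buoyancy flux, m^4/s^3
    return RHO * CP * B / (G * ALPHA_T)    # heat flux, W

# Demo: a synthetic volume flux profile generated with a "true" B of 0.01 m^4/s^3
z = np.linspace(5.0, 20.0, 16)             # heights above the vent, m
Q = K_PLUME * 0.01**(1.0 / 3.0) * z**(5.0 / 3.0)
print(f"{heat_flux_from_volume_flux(z, Q) / 1e6:.1f} MW")  # ~20.9 MW
```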

COVIS demonstrated the ability to monitor hydrothermal venting by producing a 4-year record with 3-hour temporal resolution of heat flux from a focused vent cluster (Figure 4) [Xu et al., 2014]. It has also monitored the areal extent of diffuse venting with submeter resolution over a 30 × 30 meter region [Rona et al., 2015] at the Grotto vent (Figure 3).

This combination of spatial and temporal coverage and resolution is unmatched by any previous measurement. These observations demonstrate the ability to capture the response of hydrothermal systems to such external forcings as volcanic eruptions, earthquakes, ocean tides, and even surface winds [Xu et al., 2013, 2014].

What Has COVIS Shown Us?

The heat flux time series shown in Figure 4 implies that the heat source driving the venting at Grotto remained relatively steady over the 26-month period. Comparison with seismic measurements taken over the same time period indicates that this period of steady hydrothermal heat output coincides with a period of reduced local seismicity.

Fig. 4. (a) An example 26-month segment from the 4-year time series of the heat flux of the Grotto North Tower plume. The green dot indicates an in situ measurement from June 2012 (red marks show the 95% confidence interval). (b) Thirty-day moving average of the time series in Figure 4a. The pink shaded area shows the 95% confidence interval. Reprinted from Earth and Planetary Science Letters, 404, Guangyu Xu, Darrell R. Jackson, Karen G. Bemis, and Peter A. Rona, Time-series measurement of hydrothermal heat flux at the Grotto mound, Endeavour Segment, Juan de Fuca Ridge, pp. 220–231, Copyright 2014, with permission from Elsevier.

Further comparison with historical heat flux and seismic measurements suggests that the evolution of hydrothermal heat output from Grotto since 1988 correlates with the rate of local seismicity: short episodes of increased heat flux followed pronounced seismic activity, and longer periods of reduced, steady heat flux occurred during periods of seismic quiescence [Xu et al., 2014].

To detect diffuse venting, COVIS measures phase changes caused by variations in sound travel time, which, in turn, are caused by variations in the water temperature between COVIS and the diffuse flow–covered seafloor. Because the backscatter from the stable seafloor is expected to remain constant, those phase changes indicate the presence of diffuse flow and can be used to create 2-D maps of this flow (Figure 3d).
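
The underlying signal processing can be sketched as a ping-to-ping correlation estimate; what follows is a minimal, hypothetical version with synthetic data, and the actual COVIS algorithm is more involved:

```python
import numpy as np

def decorrelation_map(pings):
    """Per-sample decorrelation between successive seafloor echoes.

    pings: complex echoes, shape (n_pings, n_samples). Travel-time jitter
    imposed by warm, shimmering diffuse flow perturbs the echo phase, so the
    ping-to-ping correlation drops even though the seafloor itself is fixed.
    """
    a, b = pings[:-1], pings[1:]
    corr = np.abs(np.sum(a * np.conj(b), axis=0)) / np.sqrt(
        np.sum(np.abs(a) ** 2, axis=0) * np.sum(np.abs(b) ** 2, axis=0))
    return 1.0 - corr   # ~0 over stable seafloor, elevated over diffuse flow

# Synthetic demo: one fixed seafloor echo, phase jitter only beyond sample 200
rng = np.random.default_rng(1)
echo = rng.standard_normal(300) + 1j * rng.standard_normal(300)
jitter = 0.6 * rng.standard_normal((40, 300))
jitter[:, :200] = 0.0
pings = echo * np.exp(1j * jitter)
d = decorrelation_map(pings)
print(d[:200].max(), d[200:].mean())   # essentially zero vs. clearly nonzero
```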

The areal extent of diffuse flow shown in those maps, combined with in situ temperature and flow rate measurements, yields estimates of about 33 megawatts of total heat flux from diffuse venting at Grotto [Rona et al., 2015]. In addition, the 2-D maps created in 3-hour intervals reveal the temporal variations of the diffuse flow area at Grotto, which can be further used to study the influences of bottom currents and seafloor tidal pressure on diffuse flow venting.

Acoustic Imaging: A Promising New Tool

COVIS has collected an unprecedented data set capturing the heat flux of high-temperature venting and the local distribution of diffuse discharge over a 4-year period at the Grotto sulfide vent cluster. In doing so, COVIS has demonstrated the viability of acoustic imaging for the quantitative measurement of heat flux from high-temperature hydrothermal venting across a vent cluster and the qualitative capture of diffuse venting in the surroundings.

New developments in acoustic methods promise to increase quantitative applications to the heat flux of diffuse venting and to broaden the application of acoustic imaging to alternate platforms. Our ongoing work focuses on using the acoustic signal to quantify the intensity of the diffuse venting in the areas where we mapped the distribution of this activity. Three-dimensional imaging capabilities are potentially valuable for interpreting the sources of anomalies in sensor data collected along a vehicle track. Thus, a future challenge is to transfer these techniques from a stationary platform to a moving vehicle.

Honoring Earth and Space Scientists

Fri, 11/10/2017 - 12:56

Four researchers studying aspects of Earth and space science were awarded a 2017 Packard Fellowship for Science and Engineering. The fellowship, awarded to 18 early-career scientists, includes a 5-year, unrestricted research grant of $875,000. The David and Lucile Packard Foundation announced the fellowship recipients on 16 October.

Konstantin Batygin, assistant professor of planetary science at the California Institute of Technology in Pasadena, won the fellowship for his work revealing the mysteries of the Kuiper Belt, a region beyond the orbit of Neptune that’s scattered with small, icy debris. Batygin’s research provided evidence that an undiscovered Neptune-sized planet may lurk in the outskirts of the solar system, and he plans to use the fellowship funds to computationally narrow down the planet’s location and observe it with telescopes.

Marine A. Denolle, assistant professor of Earth and planetary sciences at Harvard University in Cambridge, Mass., received the fellowship for her work exploring the amplification of seismic energy and ground shaking in developing urban areas. With the research grant, Denolle aims to quantify how environmental and groundwater changes in these areas are influencing regional seismic hazard and develop better predictive models of seismic hazard in urban areas.

Magdalena Osburn, assistant professor of Earth and planetary sciences at Northwestern University in Evanston, Ill., won the fellowship for her studies of microbes that live deep underground. She plans to use funds to research microbes whose DNA sequences are known but that have yet to be grown in the lab. Osburn hopes to replicate the microbes’ natural conditions in the lab, cultivate them, and explore their role in shaping Earth.

Laurence Yeung, assistant professor of Earth science at Rice University in Houston, Texas, received the fellowship for his isotope geochemistry research developing new tracers for Earth system cycles, such as atmospheric circulation, biosphere productivity, and nutrient cycling. With the fellowship, Yeung aims to improve empirical constraints on these natural processes and better understand how atmospheres, biospheres, and geospheres interact over short and long timescales.

Other Accolades

Susan Trumbore, a biogeochemist at the University of California, Irvine, and the Max Planck Institute for Biogeochemistry in Jena, Germany, was awarded the 2018 Benjamin Franklin Medal in Earth and Environmental Science for her pioneering climate change work. Trumbore measures carbon levels in plants and soil to better understand the flow of carbon between the biosphere and the atmosphere. She is a past president of the American Geophysical Union’s (AGU) Biogeochemistry section, a Fellow of AGU and the American Association for the Advancement of Science, and the current editor in chief of Global Biogeochemical Cycles. The Franklin Institute announced its Awards Class of 2018, which includes Trumbore and seven other scientists and engineers, on 7 November and will honor them in a ceremony to be held on 19 April 2018 in Philadelphia, Pa.

Anne Egger, assistant professor of geological sciences and science education and director of undergraduate research at Central Washington University in Ellensburg, Wash., was selected by the National Association of Geoscience Teachers (NAGT) as the next editor in chief of the Journal of Geoscience Education. Egger has previously served as the president of NAGT and chair of the NAGT Professional Development Planning Committee. She has spent her career working to improve Earth science literacy and education, develop teaching materials for Earth science educators, and emphasize the process of science in her teaching. NAGT announced the appointment on 2 October, and Egger will begin her responsibilities as editor in chief in January 2018.

Prince Albert II of Monaco received the 2017 Stroud Award for Freshwater Excellence on 17 September. The award recognizes Prince Albert’s commitment to conserving and protecting freshwater resources through the Prince Albert II of Monaco Foundation, which focuses on preserving biodiversity, limiting the effects of climate change, promoting renewable energies, and managing water resources to combat desertification. The Stroud Water Research Center presented the award to Prince Albert at its 50th anniversary gala in Winterthur, Del.

Tracing Water’s Path Through the Santa Clara Valley Aquifer

Fri, 11/10/2017 - 12:54

California Governor Jerry Brown declared a drought state of emergency in January 2014, following years of wintertime rainfall below historic averages. A lack of rainfall throughout 2015—precipitation was 20% below average—sustained the drought. Surface water levels fell so low that residents increasingly tapped groundwater to meet agricultural, urban, and industrial needs. This usage put immense pressure on groundwater resources and made it extremely difficult to manage water supplies across the state.

These pressures can have long-term effects too. For example, rapid drawdown of groundwater resources can cause the land above it to sink. A new study examines groundwater levels and sinking land in California’s Santa Clara Valley in the context of the state’s widespread drought.

When an aquifer—a system of porous rocks that allow water to flow through them—reaches an all-time-low water level, large areas above it experience land subsidence. Subsidence is deeply damaging to infrastructure, such as buildings and roads, and can increase flooding in coastal areas.

The aquifer system in California's Santa Clara Valley, home to Silicon Valley's robust tech industry, is made up of many alternating layers of clay and sand, deposited over time by rising and falling sea levels. It was here, in the early 1900s, that scientists first observed land subsidence due to groundwater withdrawal in the United States.

For their study, Chaussard et al. compared physical changes in the Santa Clara Valley aquifer system during California’s recent drought to prior studies of the area to pinpoint changes that may be related to the drought. Past studies have used a type of satellite-based radar called interferometric synthetic aperture radar to measure changes in surface elevations. Although this technique provides many measurements over a large area, these data sets contain sizable gaps in time, with many days, and sometimes many weeks, passing between measurements.

Given the rapidly changing conditions of the aquifer system throughout the drought and drought recovery process, the team tried a different tack. Using a constellation of four Earth-observing satellites, Italy's COSMO-SkyMed, the researchers were able to acquire data at much more frequent intervals (as often as once per day) from 2011 to 2017. Then, by analyzing these data, the researchers were able to trace small changes in ground elevation and relate them to the movement of water through the aquifer system.
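
One simple way to relate the two quantities, offered here only as an illustration rather than as the study's own method, is the standard aquifer-mechanics relation Sk = Δb/Δh, where Δb is the reversible vertical surface displacement that accompanies a groundwater head change Δh and Sk is the elastic skeletal storage coefficient:

```python
def skeletal_storage_coefficient(dh_m, db_m):
    """Elastic skeletal storage coefficient Sk = db/dh (dimensionless), from
    paired amplitudes of groundwater head change (dh, meters) and reversible
    vertical surface displacement (db, meters)."""
    return db_m / dh_m

# e.g., 3 cm of seasonal uplift accompanying 10 m of seasonal head recovery
print(skeletal_storage_coefficient(10.0, 0.03))  # Sk = 0.003
```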

The researchers found that water levels and elevations both hit an all-time low in 2014, shortly after Brown declared the drought state of emergency, but then started to rebound in late 2014, while the drought was still going strong, amid new groundwater management efforts. Water levels were back to normal by the start of 2017, whereas elevations took a bit longer to normalize.

Although no permanent water level changes were sustained, the researchers found, some land elevation was permanently lost—more than 4 times the amount typically lost during seasonal variations. The researchers also found that changes in the aquifer's water mass caused by tapping groundwater stores could alter the stress placed on nearby faults and potentially influence the occurrence of earthquakes.

This study allows scientists to better understand rapidly changing drought conditions, as well as the long-lasting strain that these changes have on a region’s aquifer system. With a more intimate knowledge of these processes, researchers can better assess and help improve existing water resource management practices, which is especially important in a global climate that is increasingly prone to drought. (Journal of Geophysical Research: Solid Earth, https://doi.org/10.1002/2017JB014676, 2017)

—Sarah Witman, Freelance Writer

Improved Simulation of Gross Primary Productivity

Fri, 11/10/2017 - 12:52

Diffuse solar radiation can increase light use efficiency in plant canopies compared to direct sunlight, which can affect terrestrial gross primary production (GPP). Yan et al. [2017] have developed a novel diffuse-fraction-based two-leaf model of GPP that better explains seasonal variations, particularly at a test site in the Amazon forest. The model also directly accounts for the effects of soil water variations and for the difference between the C3 and C4 photosynthetic pathways for biomass production that exist in different types of plants. Explained variance in observed monthly GPP increased from about 50 per cent in an existing “big leaf” GPP algorithm to more than 70 per cent in the new scheme, with potential to improve biomass modeling and predictions.
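
In outline, a two-leaf light use efficiency model splits the canopy into sunlit and shaded leaf fractions and assigns the shaded leaves, which receive mostly diffuse light, a higher efficiency. The sketch below shows only this general form, with made-up parameter values and a simplified canopy partition, not the calibrated formulation of Yan et al.:

```python
import math

def two_leaf_gpp(par, diffuse_fraction, lai,
                 eps_sunlit=1.0, eps_shaded=2.5, k=0.5):
    """Schematic diffuse-fraction-based two-leaf light use efficiency model.

    par: incident photosynthetically active radiation (MJ/m^2/day)
    diffuse_fraction: fraction of PAR arriving as diffuse light (0-1)
    lai: leaf area index (m^2/m^2)
    eps_*: light use efficiencies (g C/MJ); shaded leaves convert light more
           efficiently, which is why diffuse light can enhance GPP
    k: canopy extinction coefficient for the direct beam
    """
    lai_sunlit = (1.0 - math.exp(-k * lai)) / k   # sunlit leaf area
    lai_shaded = lai - lai_sunlit
    # Crude partition: sunlit leaves get all direct light plus their share of
    # diffuse light; shaded leaves get diffuse light only.
    apar_sunlit = par * (1 - diffuse_fraction) + par * diffuse_fraction * lai_sunlit / lai
    apar_shaded = par * diffuse_fraction * lai_shaded / lai
    return eps_sunlit * apar_sunlit + eps_shaded * apar_shaded

# Same total PAR, higher diffuse fraction -> higher modeled GPP
print(two_leaf_gpp(10.0, 0.2, 4.0), two_leaf_gpp(10.0, 0.7, 4.0))  # ~11.7 vs ~16.0
```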

Citation: Yan, H., Wang, S.-Q., Yu, K.-L., Wang, B., Yu, Q., Bohrer, G., … Shugart, H. H. (2017). A novel diffuse fraction-based two-leaf light use efficiency model: An application quantifying photosynthetic seasonality across 20 AmeriFlux flux tower sites. Journal of Advances in Modeling Earth Systems, 9. https://doi.org/10.1002/2016MS000886

—Paul A. Dirmeyer, Editor, Journal of Advances in Modeling Earth Systems

Steele-MacInnis and Watkins Receive 2017 Hisashi Kuno Award

Fri, 11/10/2017 - 12:51
Citation for Matthew Steele-MacInnis

It is my pleasure and honor to introduce Matt Steele-MacInnis, recipient of the Hisashi Kuno Award for 2017. It is appropriate for Matt to be recognized by the VGP section of AGU, as he has made significant contributions in volcanology, geochemistry, and petrology, as well as other areas. His research defines and quantifies fundamental chemical and physical processes and provides a sound basis for interpreting field- and laboratory-based observations in a broad range of geologic environments. Matt earned his B.Sc. (Honors) in Earth sciences (with a minor in math) from Memorial University in Newfoundland, where he received numerous honors, including the University Medal for Academic Excellence in Geoscience (Lou Visentin Award). At Virginia Tech, Matt was named an Institute for Critical Technology and Applied Science Doctoral Fellow and, upon completion of his Ph.D., was honored with the Virginia Tech College of Science 2013 Outstanding Doctoral Student Award. During his tenure at Virginia Tech, Matt conducted experimental studies to determine the phase behavior of iron-bearing hydrothermal fluids, developed thermodynamically based methods to interpret volatile contents obtained from melt inclusion analyses, conducted theoretical studies to predict speciation and structural states of ions in solution, and developed numerous numerical methods to interpret fluid chemistry based on laser ablation inductively coupled plasma–mass spectrometry (ICP-MS) and microthermometric analysis of fluid inclusions. Matt’s Ph.D. research, as well as collaborative research with other students and faculty at Virginia Tech and elsewhere, resulted in more than 20 publications in top international journals. Following his Ph.D. studies at Virginia Tech, Matt was awarded the prestigious Marie Curie Postdoctoral Fellowship and conducted postdoctoral research at ETH in Zurich. In 2015, Matt joined the Department of Geosciences at the University of Arizona as a tenure-track assistant professor, and in August 2017 he returned to his native Canada to accept a faculty position at the University of Alberta.

—Robert J. Bodnar, Virginia Polytechnic Institute and State University, Blacksburg

Response

Thank you, Bob, for your kind words and support. I want to thank my nominators, the Volcanology, Geochemistry, and Petrology section, and the Kuno Award Committee for bestowing this honor. I also want to thank many people who have guided me along my career path so far. John Hanchar at Memorial University was instrumental in guiding me toward graduate school and specifically toward joining Bob Bodnar at Virginia Tech. I owe much to Bob. He introduced me to the world of geologic fluids and fluid inclusions, but he also taught me how to be an effective scientist, and he conveyed an infectious enthusiasm and work ethic that drove me and my fellow students to push harder. I am indebted to my cohort of fellow grad students at Bob’s lab, too many to name. During my Ph.D., I spent 1 year as a visitor at GFZ Potsdam, where Christian Schmidt introduced me to experiments on fluids and melts. I was also extremely fortunate to have two exceptional postdoc advisors at ETH Zurich: Both Chris Heinrich and Thomas Driesner broadened my perspectives on geologic fluids immensely, from large-scale processes to microscopic-scale properties. Throughout this time, several early-career colleagues became regular collaborators and helped me branch out into different fields, especially Georg Spiekermann, Rosario Esposito, Joachim Reimer, and Kyle Ashley. And, of course, another regular and long-term collaborator also happens to be my wife, Pilar Lecumberri-Sanchez. I cannot express how grateful I am to Pilar for all of her support. It helps that we can discuss science both at work and at home, without arguing too much.

—Matthew Steele-MacInnis, University of Alberta, Edmonton, Alta., Canada

Citation for James Watkins

James Watkins embodies the essence of modern dynamic petrology and geochemistry, continuing and extending the Kuno legacy of quantitative study of igneous rocks. Jim’s work has combined novel high-temperature experiments, careful isotopic analysis, numerical modeling, and nonequilibrium thermodynamics to bring us a new level of understanding of diffusion-related isotopic effects in magmas and other liquids. He also takes his insight to the field and has produced novel measurements and interpretations of chemical effects around vesicles in glassy lavas that yield information about the pressure history of magma as it approaches eruption. The elegant combination of careful experimental work, detailed observations using multiple methods of analysis, and thoughtful modeling is Jim’s hallmark and places him in a rare class of Earth scientists.

Since establishing his research program at the University of Oregon, Jim’s scientific reach has expanded beyond igneous systems. He has derived and tested with experiment a nearly complete description of nonequilibrium isotopic effects in the formation of calcite from aqueous solution, which puts these effects on a firm theoretical basis and promises to make calcite an even richer recorder of past Earth surface conditions. It also compels us to further investigate departures from equilibrium in high-temperature systems.

Jim exhibits no fear in learning new techniques and theory or bravely tackling daunting problems with which he has little previous experience. But he is also more than comfortable working with multiple collaborators to pool expertise on difficult problems. He is the type of person you want as both a colleague and a friend, and these traits will help him continue to spread his scientific influence in the future. In recognition of his original research in fundamental aspects of magma transport processes and aqueous mineral growth, I am indeed pleased to see James Watkins recognized as a 2017 recipient of the AGU Kuno Award.

—Donald J. DePaolo, University of California, Berkeley

Response

I am very grateful to my nominators and the Kuno committee for this honor and for this rare opportunity to publicly acknowledge those who nurtured my curiosity from a young age.

My father is an accountant who loves numbers, and my mother is a teacher who loves nature. It’s neat to think that these traits are expressed in the work that I do for a living. Despite being from a small town in rural Wisconsin, I received a fine public education thanks to some amazing, yet underappreciated, teachers: Laird, Shelton, Majeski, Flynn, Hughes, and Rosenbush—thank you for your unwavering dedication.

University of Wisconsin–Eau Claire was a wonderful place to receive a liberal arts education. Getting into Berkeley was a dream, and I’m forever thankful to Don DePaolo and Michael Manga for taking a chance on me, teaching me how to (among many other things) integrate experiments with mathematical models, and for all the wisdom and resources they continue to provide. I met a lot of brilliant people at Berkeley; Rick Ryerson and Chris Huber were that and more—special thanks to them for basically being my third and fourth Ph.D. advisors.

The University of Oregon invested in me early, and I share this award with my tremendously supportive UO colleagues. I also appreciate my collaborators from other institutions, in particular, Jim Gardner, Fang-Zhen Teng, Casey Saenger, Laurent Devriendt, Kenny Befus, Kate Huntington, Jon Hunt, and Shaun Brown—it is a delight working with all of you alongside our students. There are many others not mentioned here who have had a positive impact on me, wittingly or not, and I will thank them individually in person. Finally, I thank my wife and personal seismologist, Amanda Thomas, for all her support. My favorite collaborative project is rearing our lovely daughter, Ophelia.

—James Watkins, University of Oregon, Eugene

Manning and Marty Receive 2017 Norman L. Bowen Award

Fri, 11/10/2017 - 12:49
Citation for Craig Manning

At a time when many of us focus on models of multidimensional chemical systems, pursue the first measurements of new isotope systems, analyze ever smaller samples, or write short, “silver-bullet” papers, Craig Manning brings exceptional rigor and simplicity to experimental geochemistry. As a result, his experimental results are timeless benchmarks for future work. The same results are timely contributions to understanding complex topics such as the evolution of aqueous fluids in subduction zones, and speciation in fluids at high pressure. This is a unique combination. In his dedication to a simple, physical chemistry approach, Craig stands alone among his generation of experimental petrologists. His insight into design of single-phase solubility experiments, and their application to multiphase, multicomponent systems, is unmatched. Craig’s work calls to mind the giants of experimental geochemistry: Norman Bowen, who merged observational geology with the rigor of chemical thermodynamics; George Kennedy, whose experiments brought similar discipline to hydrothermal systems; Hal Helgeson, who, like Bowen, brought physical chemistry to bear on the study of water–rock reaction; and Bruce Watson, whose innovative experiments showed a generation how mineral solubility data could be applied to real geologic problems. Craig is a sought-after and conscientious advisor, with many first-author papers by his students. He is an experienced field geologist who spent many seasons in Greenland and the Himalaya. He has published more than 95 papers during this century, so one might expect him to be something of a nerd. Yet this is far from the truth. Craig’s wife, Becky, is an accomplished filmmaker, producer, and professor at UCLA, and he spends much more time socializing with Becky’s interesting colleagues than with boring geoscientists. He’s a great reader, a generous friend, and a sophisticated traveler. Craig brings honor, credibility, and style to the Bowen award, AGU, and geoscience in general.

—Peter Kelemen, Columbia University, New York, N.Y.

Response

Thank you, Peter. Your eclectic list of geochemical greatness emphasizes my convoluted path, starting with Bowen’s The Evolution of the Igneous Rocks, assigned by Barry Doolan for my undergrad petrology class at the University of Vermont. I was hooked from the first phase diagram and probably should have foreseen my future as an experimentalist. Instead, I went to Stanford to work on ophiolites with Bob Coleman, then with Dennis Bird, who was rigorously applying thermodynamics to the fossil hydrothermal systems of East Greenland. I got hooked on that too, and we had so much fun discovering how they worked while defending ourselves in the Arctic. A newly minted aqueous geochemist cannot fail to notice the complex high-pressure veining of the Franciscan Formation, but it was frustrating to discover that the beautiful Helgesonian framework for solutes only worked to 5 kilobars. I persuaded Steve Bohlen to take me on for a postdoc at the U.S. Geological Survey. His enthusiasm and willingness to try anything spurred my initial attempts to measure high-pressure quartz solubility in water while I was not working on other things. I was too dumb or obstinate to accept the many failures. Finally, enough capsules held that upon arriving at UCLA I repurposed Art Montana’s piston cylinders for their true calling: determining high-pressure mineral solubility in fluids. Bob Newton eventually joined the fray; he has provided constant inspiration and lasting friendship. Meanwhile, An Yin and Mark Harrison indulged returns to my field roots in the deserts of central Asia. Like so many past recipients of this honor, I can testify that traveling the anastomosing paths of field and experimental study will always reward. Thanks to all of you, to my parents for creating a family of Earth and environmental scientists, and to Becky for companionship, insight, wit—and friends.

—Craig Manning, University of California, Los Angeles


Citation for Bernard Marty

Bernard Marty has made major contributions to our understanding of the origins of volatile elements in the terrestrial planets. One could perhaps highlight four areas, centered on neon, carbon, nitrogen, and xenon. In parallel with Sarda and others, he showed that the neon isotopic composition of oceanic basalts is light relative to the atmosphere and argued that either the atmosphere was residual to a major fraction of lost volatiles or it was added later. He went on to show that some plume basalts have even higher 20Ne/22Ne than previously thought and used this to argue for a component of solar neon in the Earth. Using C/3He ratios of basalts, he estimated the mantle budget for carbon and demonstrated that budgets in arcs are dominated by recycling. With Dauphas he also made the observation that the nitrogen budget of oceanic basalts correlates with 40Ar/36Ar and used this to infer that nitrogen in the mantle was dominated by subduction of clays. He also made groundbreaking discoveries of the zoned nitrogen isotopic composition of the solar system based on Genesis samples. What is most spectacular is his recent work on xenon, where he and his team have made major inroads into long-standing problems. Working on early sediments, he found evidence that the fractionated isotopic composition of the atmosphere has become more so over time and reflects progressive losses, possibly from early UV irradiation. His well gas studies resolved chondritic xenon in the mantle. Finally, with analyses from Comet 67P sampled by Rosetta, he showed that Pepin’s original prediction of U-Xe, the anomalous isotopic composition of Earth’s primordial xenon, is a feature of comets, adding powerful new evidence for a cometary component in heavy noble gases. For these and other contributions, Bernard Marty is an extremely worthy recipient of the 2017 Bowen Award.

—Alexander Halliday, University of Oxford, United Kingdom

Response

I am deeply honored to receive the prestigious Bowen Award, and I would like to thank the people who nominated me, the awards committee, and all at AGU for their selfless efforts. I am particularly indebted to Alex Halliday, who has always kept his eyes wide open to the magical mystery tour that is the geochemistry of noble gases. I was first introduced to this marvelous field by Minoru Ozima in Tokyo, and I have been inspired by some prominent scientists along my way, including Francis Albarède, Chris Ballentine, Keith O’Nions, Yuji Sano, Igor Tolstikhin, and many others in Paris, Cambridge, and Nancy. I have had the chance to work with fantastic colleagues, students, and postdocs at Centre de Recherches Pétrographiques et Géochimiques (CRPG) Nancy, especially Pete Burnard, with whom I developed a state-of-the-art noble gas laboratory at CRPG. Pete was a great noble gas geochemist as well as a true human being. I thank Annie, Louise, and Edwige for personal balance in a life busy with science.

The noble gases are fantastic tracers whose chemical inertness and radiogenic isotopes provide a quantitative approach for investigating mass balance at planetary scales. Their origins in planets have been traced back, thanks to their diverse cosmochemical signatures. However, there remained the need to calibrate “useful” volatile elements, such as water, carbon, and nitrogen, to noble gases to gain insights into their origins and cycles, something I have tried to do throughout my career. Interestingly, none of my research has been directly related to mineralogy and petrology, so I feel particularly humble and blessed to receive an award named after a petrologist as great as Norman Bowen, illustrating to me the fact that in science, our tools do not represent the end of the story but are instead keys for unlocking some of the universe’s mysteries.

—Bernard Marty, University of Lorraine, Nancy, France

White Draws Fire as Nominee to Head Key Environmental Agency

Thu, 11/09/2017 - 17:46

Less than a week after an authoritative White House climate science report reaffirmed that human activities are the dominant cause of global warming, the Trump administration’s nominee to lead a key environment office disputed those findings and disavowed the report itself at her confirmation hearing.

Kathleen Hartnett White, whom the administration wants to chair the White House Council on Environmental Quality (CEQ), dismissed the report “as a product of the past administration and not of this president” at a 7 November hearing conducted by the Senate Committee on Environment and Public Works.

White, a nonscientist who does not believe in the scientific consensus that humans are the main driver of modern climate change, acknowledged at the hearing that climate change is real and that human activity “more than likely” has an effect on climate change. However, she said “the extent to which [it has had an effect] I think is very uncertain.”

If White is confirmed and becomes chair of the CEQ, she would lead an office that coordinates federal environmental efforts. She currently is a senior fellow for energy and environment at the Texas Public Policy Foundation and was chairwoman and commissioner of the Texas Commission on Environmental Quality (TCEQ) from 2001 to 2007.

Climate Science Takes Center Stage

At the hearing, White said that carbon dioxide (CO2) in the atmosphere has none of the characteristics of a pollutant “that can have direct impact on human health,” and she labeled CO2 as “a plant nutrient.”

Asked by committee member Sen. Jeff Merkley (D-Ore.) whether she believes that CO2 levels have risen dramatically, White responded, “No, I would not say they have gone up drastically. I know they have risen from preindustrial times.”

The Climate Science Special Report, which the administration released on 3 November, concludes that “it is extremely likely that human activities, especially emissions of greenhouse gases, are the dominant cause of the observed warming since the mid-20th century.” The report also states that “today the global CO2 concentration is increasing rapidly” and that the growth rate of the atmospheric CO2 concentration driven by human emissions has increased from 1.5 to 2 parts per million per year over the past 15–20 years.

The climate science report “needs to be taken seriously,” Merkley told Eos after the hearing. “Obviously, anything that disagrees with [White’s] preestablished view that there is no damage from carbon dioxide she rejects outright. That’s not helpful in taking on the challenges of the world.”

White’s Track Record Debated

During the hearing, White pointed to her track record at TCEQ and said that during her chairmanship, the state’s population, economy, and jobs grew while point source emissions were dramatically reduced. She said, “I think totally in terms of fundamental protection of human health and welfare, and risks to children particularly motivate me.” In her written testimony, she described herself as “an environmental optimist” who believes in and played a part in win-win environmental and economic outcomes.

White also called herself “a great champion of getting rid of red tape,” noting that she would welcome “the challenge” of reforming the National Environmental Policy Act, including shortening the process for federal agencies to assess environmental impacts of their proposed actions before making decisions.

Republicans, including Sen. James Inhofe (R-Okla.), rose to her defense. “I understand several of the extremists are driving a narrative that you hate the environment and worked to give cover to polluters when you were with [TCEQ],” Inhofe said. However, he said that during White’s tenure there, administrative environmental enforcement orders and the amounts of penalties “increased significantly.”

Yet a 17 October editorial in the Dallas News offers a different perspective on White’s time at TCEQ. The paper wrote that her record “is abominable,” that as TCEQ chair she regularly sided with business interests at the expense of public health, and that she lobbied for lax ground-level ozone standards.

Committee member Sen. Sheldon Whitehouse (D-R.I.) told Eos that the editorial “is a pretty strong signal that the public interest won’t get a fair shake” from White.

“Far out of the Mainstream”

Democrats voiced strong opposition to White’s nomination during her confirmation hearing and at a news briefing directly afterward. Pictured (left to right) at the news briefing are Sen. Tom Carper (D-Del.), Sen. Sheldon Whitehouse (D-R.I.), and Sen. Jeff Merkley (D-Ore.). Credit: Randy Showstack

Other Democrats at the hearing lined up for their opportunity to challenge White about climate science and some earlier statements she had made, including one equating belief that global warming is occurring with paganism.

“Your positions are so far out of the mainstream that they’re not just outliers, they’re outrageous,” said Sen. Ed Markey (D-Mass.).

Sen. Tom Carper (D-Del.), the ranking Democrat on the committee, called White a “science denier” and said that she “has shown a disdain for science, a disregard for the laws and regulations already on the books, and a staggering disrespect for people who have views with whom she disagrees.”

At a news briefing after the hearing, Carper called White “a dangerous person” and said that he would work to defeat her nomination.

However, committee chair Sen. John Barrasso (R-Wyo.) said that he expects White’s nomination to clear the committee and be sent to the Senate floor sometime after 20 November, her deadline for responding to written questions from committee members.

At the briefing, Whitehouse said that Democrats need to use their Senate floor time to alert the public about the conflicts of interest and “extremism” that White and other nominees embody. Democrats, he said, need to call out “the danger that [White and others] present to public health down the road, from all of the signals and warnings that they are going to deliberately tune out because they don’t want to hear them.”

Whitehouse added, “It’s really most important to make sure that these individuals go into office clearly marked as the industry hacks who they are, so that there’s no surprise about that.”

—Randy Showstack (@RandyShowstack), Staff Writer

Human Activity Increasing Rate of Record-Breaking Hot Years

Thu, 11/09/2017 - 13:23

WASHINGTON, D.C. — A new study finds that human-caused global warming is significantly increasing the rate at which hot temperature records are being broken around the world.

Global annual temperature records show there were 17 record hot years from 1861 to 2005. The new study examines whether these temperature records are being broken more often and if so, whether human-caused global warming is to blame.

The results show human influence has greatly increased the likelihood of record-breaking hot years occurring on a global scale. Without human-caused climate change, there should only have been an average of seven record hot years from 1861 to 2005, not 17. Further, human-caused climate change at least doubled the odds of having a record-breaking hot year from 1926 to 1945 and from 1967 onwards, according to the new study.

The study also projects that if greenhouse gas emissions remain high, the chance of seeing new global temperature records will continue to increase. By 2100, every other year will be a record breaker, on average, according to the new study accepted for publication in Earth’s Future, a journal of the American Geophysical Union.

The new findings show how climate change is visibly influencing Earth’s temperature, said Andrew King, a climate extremes research fellow at the University of Melbourne in Australia and lead author of the new study.

“We can now specifically say climate change is increasing the chance of observing a new temperature record each year,” he said. “It’s important to point out we shouldn’t be seeing these records if human activity weren’t contributing to global warming.”

The study strengthens the link between human activity and recent temperature trends, according to Michael Mann, a climatologist and director of the Earth System Science Center at Pennsylvania State University, who was not involved with the new research.

“This work builds on previous research establishing that, without a doubt, the record warmth we are seeing cannot be explained without accounting for the impact of human activity on the warming of the planet,” Mann said.

Record-Breaking Heat

Record hot years have been occurring more frequently in recent decades. 2014 was the hottest year on record since 1880, but that record was quickly broken in 2015 and again in 2016. Research published earlier this year in Geophysical Research Letters found these three consecutive records in global temperatures were very likely due to anthropogenic warming.

Record-breaking temperatures tend to attract attention because they are one of the most visible signs of global warming. As a result, understanding how and why the rate of record-breaking is changing is critical for communicating the effects of climate change to the public, King said.

Previous research examined changes in rates of record-breaking temperatures in specific countries or regions. However, these studies couldn’t analyze global temperature trends because they relied on gathering large numbers of daily temperature records from different sources, according to King. Additionally, they didn’t directly attribute changes in record-breaking to human activity.

In the new study, King developed a method to isolate the human role in changing rates of record-breaking temperatures globally. Unlike previous studies, the method uses a single source of temperature data, in this case global annual temperatures, allowing King to study temperature records on a global scale.

King first looked at global temperature data from 1861 to 2005 and identified which years were hot record breakers. He then used a wide array of climate models to simulate global temperatures in this period. Some of the models included only natural influences on the climate such as volcanic eruptions, while other models featured both natural influences and human influences such as greenhouse gas emissions and the release of aerosols into the atmosphere.
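
The bookkeeping at the heart of this comparison, counting how many years in a series set a new all-time high, is straightforward. Here is a minimal, hypothetical sketch (not King's analysis code) that also shows why a warming trend inflates the count:

```python
import random

def count_hot_records(annual_temps):
    """Count the years that set a new all-time high in an annual series
    (the first year counts as the first record)."""
    records, running_max = 0, float("-inf")
    for t in annual_temps:
        if t > running_max:
            records, running_max = records + 1, t
    return records

# Toy comparison over 145 "years" (1861-2005): stationary climate vs. warming trend
random.seed(42)
flat = [random.gauss(0.0, 0.15) for _ in range(145)]
warming = [random.gauss(0.006 * i, 0.15) for i in range(145)]  # ~0.9 degC per century
print(count_hot_records(flat), count_hot_records(warming))     # trend breaks far more records
```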

Historically observed and model-simulated numbers of hot and cold global annual temperature records for 1861–2005. The numbers of real-world record occurrences are shown as black circles. The boxes represent the range of record numbers predicted by models with human and natural influences (red boxes) and natural influences only (orange boxes). The central lines in the boxes represent the median. Credit: Andrew D. King

King found only the climate models that included human influences had the same number of record-breaking hot years as historical temperature records—15 to 21, on average. The models without human influences only had an average of seven record-breaking hot years from 1861 to 2005.

He also determined human-caused climate change at least doubled the odds of having a record-breaking hot year from 1926 to 1945 and from 1967 onwards. The odds didn’t increase from 1945 to 1967 because man-made aerosol emissions generated a cooling effect, which counteracted warming due to anthropogenic greenhouse gases.

King’s research can also be applied to quantify the influence of human activities on a specific record-setting event. He applied his method to record-setting hot global temperatures in 2016 and record-setting hot local temperatures in central England in 2014. He found human influence led to a 29-fold increase in the likelihood of seeing both new records compared to a situation with no human influence on climate.

Analog Modeling Recreates Millions of Years in a Few Hours

Thu, 11/09/2017 - 13:21

Geoscientists face a multitude of difficulties when they study tectonic processes, but scale is probably the biggest. Tectonic processes are not just large on a spatial scale—they also take place over long periods of time. Using models is a very effective way of understanding some of the processes governing tectonic deformation at timescales that humans can comprehend.

Analog models—physical representations of tectonic processes—contribute substantially to the development of new structural and tectonic concepts, improved benchmarking of numerical models, and interactive teaching methods, but the U.S. analog modeling community is dispersed and small. Thus, 50 analog and numerical modelers gathered for a workshop at the Bureau of Economic Geology of the University of Texas at Austin to discuss new techniques, boundary conditions, new materials, and ways to strengthen the analog modeling community.

Making Better Models Together

Analog models are intuitive, and they have been used for more than 200 years. In 1815, Sir James Hall was the first to publish a research paper on a “tectonic” experiment where he shortened different layers of sand and clay to study the evolving structures. However, constructing a useful model is no simple task: It requires suitably scaling experiments so the results can be extrapolated to natural systems, and it requires the modeler to understand the necessary simplifications.

Roger Buck and Xiaochuan Tian (both from Columbia University, New York, N.Y.) use green gelatin and red-colored water to simulate the way magma intrusion breaks up Earth’s crust. Credit: Zachary Kornse, Iowa State University

Workshop participants presented exciting innovations in analog modeling research, highlighting new modeling materials used in scientific research as well as teaching. These materials will eventually become accessible to the wider community via Carleton College’s Science Education Resource Center web page to help encourage the use of authentic, research-worthy analog models in the teaching communities.

A primary goal of this workshop was to strengthen the interaction and collaboration between physical and numerical modelers. Even though the two communities face similar challenges, the exchange between them has traditionally been limited.

During the workshop, participants created analog and numerical models of an extensional rift system. Participants discussed the strengths and limitations of both approaches, as well as ways to increase collaboration between the two groups. This hands-on approach during the workshop encouraged participants to exchange ideas directly, and it facilitated networking between the different communities.

Workshop participants created the numerical model of an extensional rift system that produced this elevation map of surface features, which they compared with the corresponding physical model.

In addition to 22 posters and two teaching examples, the workshop included 10 oral presentations on such topics as the application of model results to field and seismic data, different experimental techniques (e.g., centrifuge models versus 1g models), and scaling and reproducibility issues. Workshop attendees constituted a diverse group, ranging from early- to late-career geoscientists.

Future Plans

A discussion on future needs of the community at the end of the workshop led to two insights. Educators need a better system to exchange teaching materials for using analog models in classrooms. For example, they need dedicated ways to share information on how to build models and plan experiments, as well as actual model-building materials. Also, the community has an ongoing need for a system or database to store and exchange model results, as well as a database of various materials that are suitable for building models.

This workshop was sponsored by U.S. National Science Foundation grant 1700033.

—Jacqueline E. Reber (email: jreber@iastate.edu), Department of Geological and Atmospheric Sciences, Iowa State University, Ames; Tim P. Dooley (@timdooley), Bureau of Economic Geology, University of Texas at Austin; and Elizabeth Logan, Institute for Computational Engineering and Sciences, University of Texas at Austin

Future Looks Drier as Drylands Continue to Expand

Thu, 11/09/2017 - 13:18

Drylands currently constitute about 41 per cent of the Earth’s land surface and are home to more than 38 per cent of the world’s population. Drylands are particularly vulnerable to environmental change. In fact, the areas categorized as dryland have been increasing over recent decades, with further expansion set to continue under the influence of climate change. This will have knock-on effects on communities on almost every continent, affecting their crops and livestock, health, and livelihoods. A recent review article in Reviews of Geophysics described recent progress in dryland climate change research. The journal’s editors asked two of the authors some questions about how and why dryland areas are changing and for an overview of scientific research in this field.

What are “drylands” and where are they found?

Drylands are areas where the annual potential evapotranspiration greatly exceeds annual precipitation. Over drylands, the air is almost always “thirsty” for water but precipitation is not enough to meet this demand. The locations of drylands are determined mostly by atmospheric circulation and topography. They are primarily found in middle and low latitudes such as northern and southern Africa, Central and East Asia, southwestern North America, the west coast of South America, and much of Australia.
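
Quantitatively, drylands are usually delimited by the aridity index AI = P/PET, the ratio of annual precipitation to potential evapotranspiration. The sketch below uses the widely cited UNEP-style thresholds, which are an assumption here; the review's exact cutoffs may differ:

```python
def classify_aridity(precip_mm, pet_mm):
    """Classify a location by aridity index AI = P/PET (annual means).

    Thresholds follow the widely used UNEP convention (an assumption here);
    drylands are locations with AI < 0.65.
    """
    ai = precip_mm / pet_mm
    if ai < 0.05:
        return "hyperarid"
    elif ai < 0.20:
        return "arid"
    elif ai < 0.50:
        return "semiarid"
    elif ai < 0.65:
        return "dry subhumid"
    return "humid (not dryland)"

print(classify_aridity(150.0, 1800.0))   # arid (AI ~ 0.083)
print(classify_aridity(450.0, 1200.0))   # semiarid (AI ~ 0.375)
```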

How have dryland areas changed over recent decades?

Major drivers of the dryland climate system. Credit: Huang et al., 2017, Figure 22

Both observations and model simulations indicate that global drylands have expanded over recent decades. For example, the area classified as drylands in the period 1990 to 2004 was 4 per cent larger than that for the period 1948 to 1962.

Each of the subtypes of dryland region – hyperarid, arid, semiarid, and dry subhumid – has expanded, although the largest expansion has been in semiarid regions, which now account for more than half of total dryland expansion. Semiarid regions on five continents have all expanded, but East Asia accounts for nearly 50 per cent of this global growth.

How will expanding drylands affect people?

The landscapes of drylands are characterized by low vegetation cover, low soil nutrient content, and low capacity for water conservation. Dryland expansion means vegetated and fertile land permanently degrading into this state, a process known as “desertification.” Climate change model results suggest that under a high-emission scenario, about 78 per cent of dryland expansion by the end of this century will occur in less developed countries, increasing the dryland coverage rate in these countries to 61 per cent. These areas are already home to disproportionately more poor and vulnerable people; environmental changes, including rising temperatures, water shortages, and soil loss, will exacerbate poverty and may stimulate large-scale migrations.

What are some of the recent major developments in drylands research?

Recent findings indicate that long-term trends in aridity are mainly attributable to increased greenhouse gas emissions, whereas anthropogenic aerosols exert small effects that nonetheless complicate the attribution. Meanwhile, human-induced land use and land cover change has likely contributed to aridity trends at regional scales.

Research has shown that the greatest atmospheric warming over land during the last 100 years was over drylands; this accounted for more than half of all continental warming. However, the global pattern and interdecadal variability of aridity changes are modulated by oceanic oscillations. The different phases of those oceanic oscillations induce significant changes in land-sea and north-south thermal contrasts, which in turn alter global changes in temperature and precipitation.

What are the major unsolved or unresolved questions in this field?

So far, we are still not in a position to distinguish quantitatively between increasing aridity caused by natural variability in the climate system and changes caused by human activities. In addition, studies of dryland climates should pay attention not only to long-term trends but also to decadal, multidecadal, and even interannual variability and their impacts on ecosystems and society. More practically, another pending task is to identify a catastrophic threshold for drylands, which would support early warning systems for dry-climate-related disasters, proactive adaptation, and mitigation of impacts.

Where are additional data or modeling efforts needed?

The changes occurring in drylands are part of the dynamics of the global climate system, so we need “big data,” including high-quality ground-based observations, improved satellite retrieval products, and climatic proxies from paleoclimate research. Well-developed global and regional climate system models with parameterization schemes better suited to dryland areas are also particularly needed. A special international project comparing dryland climate simulations is planned as part of the current Phase 6 of the World Climate Research Programme’s Coupled Model Intercomparison Project (CMIP6), and this will also yield useful information.

—Jianping Huang, College of Atmospheric Sciences, Lanzhou University, China; email: hjp@lzu.edu.cn; and Congbin Fu, Institute for Climate and Global Change Research & School of Atmospheric Sciences, Nanjing University, China
