Earth & Space Science News

Scientists Predict Active Hurricane Season

Fri, 05/26/2017 - 18:08

It’s time to get prepared. Scientists at the National Oceanic and Atmospheric Administration (NOAA) predict higher-than-normal hurricane activity for the 2017 Atlantic hurricane season, which lasts from 1 June through 30 November.

Specifically, NOAA scientists predict a 70% chance of 11–17 named tropical storms this season. Five to nine of those storms may become hurricanes (with wind speeds above 119 kilometers per hour), of which two to four may become major hurricanes, meaning category 3 or higher (with wind speeds above 179 kilometers per hour).
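For illustration, the wind speed cutoffs quoted above can be turned into a tiny classifier. This is a simplified sketch using only the two thresholds in the article; NOAA's full Saffir-Simpson scale distinguishes five hurricane categories.

```python
def classify_storm(wind_kmh: float) -> str:
    """Classify a named tropical storm by sustained wind speed.

    Simplified sketch using only the two thresholds quoted in
    NOAA's outlook: 119 km/h for a hurricane and 179 km/h for a
    major (category 3 or higher) hurricane. The full
    Saffir-Simpson scale has five categories.
    """
    if wind_kmh >= 179:
        return "major hurricane"
    if wind_kmh >= 119:
        return "hurricane"
    return "tropical storm"
```

So a storm with 130 km/h sustained winds would count toward the five to nine predicted hurricanes but not toward the two to four major ones.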

NOAA’s hurricane season outlook for 2017. Credit: NOAA

What conditions might bring on this heightened hurricane season? Warm sea surface temperatures and low vertical wind shear across the tropical Atlantic Ocean and Caribbean Sea may contribute, plus a weak or nonexistent El Niño, said Gerry Bell, lead hurricane forecaster at NOAA’s Climate Prediction Center in College Park, Md.

He and other government officials discussed NOAA’s 2017 hurricane outlook and urged storm readiness at a briefing for reporters at the center yesterday.

Hurricane Essentials

Hurricanes can’t form without warm air, and that warm air comes from a warm ocean surface. When warm ocean water evaporates, the water vapor rises, cools, and condenses, forming large clouds. In the Atlantic, wind blows westward, and the continued contact with new sources of warm water creates a thick pool of these storm clouds.

A graphic showing how swirling hot and cold air in a column of clouds can fuel a tropical storm. Credit: NOAA

Condensation releases heat, which warms the surrounding air as the clouds form. This warmed air is pulled back into the column of clouds, where it rises and condenses yet again. This cycle repeats, forming a larger and taller column of clouds. The tops of these clouds expand as more and more water droplets coalesce, forming a classic anvil shape.

As more air is sucked into the clouds, air pressure at the surface of the ocean drops. To fill this void, air from the edges of the expanded cloud tops rushes down, where it’s warmed again by the ocean’s surface and rises. This process repeats over and over, fueling the storm.

The wind then whips these clouds around a central point, and if winds get strong enough—about 119 kilometers per hour—the storm is classified as a hurricane.

The 2017 Season

Why do researchers forecast high hurricane activity in 2017? This season, researchers predict near-average or warmer-than-average sea surface temperatures across the tropical Atlantic Ocean and Caribbean Sea, enough to provide ample fuel for big storms.

What’s more, winds over the Atlantic are not expected to have as many different speeds or directions at different altitudes. This below- or near-average vertical wind shear means that columns of air above the ocean surface, even those within storms, may not dissipate. Instead, the gathering storm is likely to grow. By contrast, high vertical wind shear can “rip apart” a growing storm, Jennifer Collins, a climatologist at the University of South Florida in Tampa, told Eos.

In general, a strong El Niño leads to strong vertical wind shear, helping to suppress storms, Collins said. So this year’s weak or nonexistent El Niño could decrease the wind shear and foster storm growth.

West Coast Prospects

NOAA also weighed in on Pacific Ocean hurricanes, known generically as tropical cyclones. The agency forecasts 80% odds of a near- or above-normal season in both the central and eastern Pacific Ocean regions. For the eastern zone in particular, the agency calculated a 70% probability of 14 to 20 named storms. Of those, 6 to 11 are expected to become hurricanes, including 3 to 7 major ones.

Be Prepared

Officials from NOAA and the Federal Emergency Management Agency (FEMA) stressed the importance of preparedness ahead of the hurricane season. They encouraged people who live in hurricane-prone areas to create emergency plans, home safety kits, and evacuation plans.

“Have a family discussion about what you will do, where you will go, and how you will communicate with each other when a storm threatens,” said Robert Fenton Jr., acting administrator of FEMA. “Know your evacuation route, tune into your local news or download the FEMA app to get alerts, and finally, listen to local authorities as a storm approaches.”


—JoAnna Wendel (@JoAnnaScience), Staff Writer

First Science Results from NASA’s Juno Mission

Fri, 05/26/2017 - 11:14

Early science results from NASA’s Juno mission to Jupiter portray the largest planet in our solar system as a complex, gigantic, turbulent world, with Earth-sized polar cyclones, plunging storm systems that travel deep into the heart of the gas giant, and a mammoth, lumpy magnetic field that may indicate it was generated closer to the planet’s surface than previously thought.

“We are excited to share these early discoveries, which help us better understand what makes Jupiter so fascinating,” said Diane Brown, Juno program executive at NASA Headquarters in Washington, D.C. “It was a long trip to get to Jupiter, but these first results already demonstrate it was well worth the journey.”

Juno launched on Aug. 5, 2011, entering Jupiter’s orbit on July 4, 2016. The findings from the first data-collection pass, which flew within about 2,600 miles (4,200 kilometers) of Jupiter’s swirling cloud tops on Aug. 27, are being published this week in two papers in the journal Science, as well as a 44-paper special collection in Geophysical Research Letters, a journal of the American Geophysical Union.

“We knew, going in, that Jupiter would throw us some curves,” said Scott Bolton, Juno principal investigator from the Southwest Research Institute in San Antonio. “But now that we are here we are finding that Jupiter can throw the heat, as well as knuckleballs and sliders. There is so much going on here that we didn’t expect that we have had to take a step back and begin to rethink of this as a whole new Jupiter.”

Among the findings that challenge assumptions are those provided by Juno’s imager, JunoCam. The images show both of Jupiter’s poles are covered in Earth-sized swirling storms that are densely clustered and rubbing together.

“We’re puzzled as to how they could be formed, how stable the configuration is, and why Jupiter’s north pole doesn’t look like the south pole,” said Bolton. “We’re questioning whether this is a dynamic system, and are we seeing just one stage, and over the next year, we’re going to watch it disappear, or is this a stable configuration and these storms are circulating around one another?”

Another surprise comes from Juno’s Microwave Radiometer (MWR), which samples thermal microwave radiation from Jupiter’s atmosphere, from the top of the ammonia clouds to deep within the atmosphere. The MWR data indicate that Jupiter’s iconic belts and zones are mysterious: the belt near the equator penetrates all the way down, whereas the belts and zones at other latitudes seem to evolve into other structures. The data also suggest that the ammonia abundance is quite variable and continues to increase as far down as MWR can see, a depth of a few hundred kilometers.

Prior to the Juno mission, Jupiter was known to have the most intense magnetic field in the solar system. Measurements of the massive planet’s magnetosphere from Juno’s magnetometer investigation (MAG) indicate that Jupiter’s magnetic field is even stronger than models expected and more irregular in shape. MAG data show the field reaching 7.766 gauss, greatly exceeding expectations and about 10 times stronger than the strongest magnetic field found on Earth.

Juno’s Microwave Radiometer (MWR) passively observes beneath Jupiter’s cloud tops. This artist’s conception shows real data from the six MWR channels, arranged by wavelength. Credit: NASA/SwRI/JPL

“Juno is giving us a view of the magnetic field close to Jupiter that we’ve never had before,” said Jack Connerney, Juno deputy principal investigator and the lead for the mission’s magnetic field investigation at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “Already we see that the magnetic field looks lumpy: it is stronger in some places and weaker in others. This uneven distribution suggests that the field might be generated by dynamo action closer to the surface, above the layer of metallic hydrogen. Every flyby we execute gets us closer to determining where and how Jupiter’s dynamo works.”

Juno also is designed to study the polar magnetosphere and the origin of Jupiter’s powerful auroras—its northern and southern lights. These auroral emissions are caused by particles that pick up energy, slamming into atmospheric molecules. Juno’s initial observations indicate that the process seems to work differently at Jupiter than at Earth.

Juno is in a polar orbit around Jupiter, and the majority of each orbit is spent well away from the gas giant. But, once every 53 days, its trajectory approaches Jupiter from above its north pole, where it begins a two-hour transit (from pole to pole) flying north to south with its eight science instruments collecting data and its JunoCam public outreach camera snapping pictures. The download of six megabytes of data collected during the transit can take 1.5 days.
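Those figures imply a strikingly low average downlink rate. A back-of-the-envelope check, taking the article's 6 megabytes per pass and 1.5-day download time at face value:

```python
# Back-of-the-envelope downlink rate implied by the article:
# about 6 megabytes of science data per pass, taking roughly
# 1.5 days to download.
data_bits = 6e6 * 8            # 6 megabytes expressed in bits
download_s = 1.5 * 24 * 3600   # 1.5 days in seconds
rate_bps = data_bits / download_s
print(f"average rate: {rate_bps:.0f} bits per second")  # roughly 370 bps
```

That works out to a few hundred bits per second averaged over the download, a reminder of how constrained deep-space communication links are.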

“Every 53 days, we go screaming by Jupiter, get doused by a fire hose of Jovian science, and there is always something new,” said Bolton. “On our next flyby on July 11, we will fly directly over one of the most iconic features in the entire solar system—one that every school kid knows—Jupiter’s Great Red Spot. If anybody is going to get to the bottom of what is going on below those mammoth swirling crimson cloud tops, it’s Juno and her cloud-piercing science instruments.”

Antenna Towers Attract Additional Lightning Strikes

Fri, 05/26/2017 - 11:12

Over the past 30 years, a proliferation of new technologies (especially cell phones) has more than tripled the number of antenna towers in the United States. Advances in broadcasting technologies also aided the development of the National Lightning Detection Network (NLDN), a web of 100 sensors that detect the electromagnetic signals emitted when lightning strikes the ground. Within seconds, these sensors transmit data on the location, time, and polarity (positive or negative electrical charge) of each strike to a global database via satellite.

The NLDN database is the backbone of numerous climate studies, as it catalogs lightning strokes and flashes across a vast area. Since an upgrade in 1995, this U.S. network has consistently detected cloud-to-ground lightning strikes—the classic bolt of lightning—95% of the time.

The system is imperfect, however. Studies dating back to the 1960s show that antenna towers attract lightning strikes to a greater extent than mountain peaks at similar elevations, yet many studies summarize lightning over wide areas (10–20 kilometers), potentially masking smaller-scale lightning anomalies. Thus, lightning near other human-made structures might, in reality, behave differently than broad use of the data reflects.

To test the accuracy of current lightning measurements, Kingfield et al. (a research group from the University of Oklahoma in Norman, Okla., the heart of storm country) mapped 20 years of NLDN cloud-to-ground lightning data onto a grid of 500-meter cells. The researchers found that nearly all (99.8%) of the grid cells with more than 100 recorded cloud-to-ground strikes lay within a kilometer of an antenna tower registered with the Federal Communications Commission. They also found that the taller the tower, the greater the likelihood of a cloud-to-ground strike.
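The gridding step described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' code; it assumes strike positions have already been projected from latitude/longitude into meters.

```python
import math
from collections import Counter

CELL_M = 500  # grid spacing used in the study, in meters


def cell_index(x_m: float, y_m: float) -> tuple[int, int]:
    """Map a strike's projected position (meters) to a 500 m grid cell."""
    return (math.floor(x_m / CELL_M), math.floor(y_m / CELL_M))


def strikes_per_cell(strikes):
    """Count cloud-to-ground strikes falling in each grid cell.

    `strikes` is an iterable of (x_m, y_m) positions. Real NLDN
    records give latitude/longitude and would be projected first.
    """
    return Counter(cell_index(x, y) for x, y in strikes)


counts = strikes_per_cell([(120, 80), (130, 90), (700, 80)])
# The first two strikes share cell (0, 0); the third falls in (1, 0).
```

Cells with high counts could then be cross-referenced against registered tower locations, as the study did with the Federal Communications Commission database.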

For instance, 619 cloud-to-ground lightning strikes, the most measured in a single grid cell, were recorded near a 331-meter-tall tower located in the Boston Mountains 30.6 kilometers southeast of Fayetteville, Ark., whereas 163 cloud-to-ground strikes were measured near the Willis Tower (520 meters tall) in Chicago, Ill., over the 20-year period. Furthermore, there was a 631% increase in lightning near a 512-meter tower in northern Wisconsin when compared to an area roughly 2–5 kilometers away.

Most past studies have examined limited geographies and seasons, both of which significantly affect lightning frequency. The researchers, however, covered a wide range of locations and dates. For example, they found that from September to February, throughout the northern Great Plains, the frequency of cloud-to-ground lightning strikes near a tower was about 138% higher than in a region roughly 2–5 kilometers away. From March to August, the frequency at the same locations was about 117% higher. An especially surprising finding was the identification and tracking of so-called hot spots, where cloud-to-ground lightning increased immediately after a tower’s construction.

As a whole, the study quantifies the increased likelihood of lightning strikes occurring near human-made towers, especially the tallest of these towers. Its illustration of the variability, yet predictability, of this common atmospheric phenomenon will inform many meteorological and climatological studies to come. (Geophysical Research Letters, https://doi.org/10.1002/2017GL073449, 2017)

—Sarah Witman, Freelance Writer

Close Encounter with Jupiter

Thu, 05/25/2017 - 18:14

Jupiter is the largest planet in the solar system, more than 300 times Earth’s mass. Surrounded by a system of dust rings and more than 60 moons, the gas giant is already known in some detail. Its atmosphere is composed of about 75% hydrogen and 24% helium by mass, with trace amounts of ammonia and other compounds, including methane and water vapor. The planet bulges around the equator because of its rapid rotation, and it is perpetually covered with clouds of ammonia, with its outer atmosphere segregated into several bands at different latitudes and featuring a persistent anticyclonic storm, the Great Red Spot, south of the equator.

But there is still a lot more to discover about Jupiter’s interior, magnetic field, aurora, radio emissions, and history. NASA’s Juno mission carried to Jupiter a suite of nine scientific instruments designed to gather data on Jupiter’s atmosphere, gravity, magnetic field, energetic particle and radiation environment, aurora, and radio emissions. Entering the planet’s orbit on July 4, 2016, Juno has been collecting a vast amount of new information in a series of close flybys to within 4200 kilometers of the planet (1/17 of the planet’s radius)!

The first results of the Juno mission are the subject of a special issue of Geophysical Research Letters, and they reveal several interesting new findings. For example, Juno has found ammonia upwelling near the equator that exhibits significant variability in its abundance at depths corresponding to 30 bars (30 times Earth’s atmospheric surface pressure). The measured gravity hints at a more gradual “differential rotation” (difference in rotation speed between the pole and the equator), and the measured magnetic field is both stronger and more structured than current models expect.

Contrary to theoretical predictions, this measured magnetic field does not exhibit any perturbations associated with the electrical currents in the high-latitude regions. Likewise, contrary to expectations, the observed intensity variations of the ultraviolet aurora do not quite correlate with the fluctuations of the solar wind dynamic pressure.

Another big surprise is the occurrence of protons originating from the planet energized to hundreds of kilo-electron volts and moving away from the planet. At the same time, there are downward beams of electrons in the polar region that are possibly the source for Jupiter’s intense radio bursts, which have long been detected from Earth.

These new insights offer important clues to how Jupiter may have evolved over its history into what it is and where it is today. By illuminating the underlying physics, they contribute to our understanding of the other planets, and they raise many more questions than they answer.

—Andrew Yau, Andrew Dombard, W. K. Peterson, and Paul D. Williams, Editors, Geophysical Research Letters; email: yau@phys.ucalgary.ca

Ancient Impact May Have Triggered Long-Term Volcanic Eruptions

Thu, 05/25/2017 - 11:49

Approximately 1.85 billion years ago, a comet up to 15 kilometers wide smashed into shallow seawater in what is now the city of Sudbury, Ontario. On impact, the comet may have torn through the continental crust down to the mantle, creating a 30-kilometer-deep transient crater that scientists estimate was between 150 and 260 kilometers in diameter. New research by Ubide et al. suggests that this impact prompted local volcanic activity that persisted long after the collision.

When a comet or an asteroid hits Earth, the extreme pressures and temperatures can melt enough surrounding rock in Earth’s crust to drive short-term volcanic activity in and near the newly formed crater. However, recent satellite observations of crater structures and magma deposits on Mercury, Venus, Mars, and the Moon suggest that some impacts may also be related to longer-term volcanic activity.

This new evidence prompted the researchers to reinvestigate the Sudbury structure’s Onaping Formation: a 1.5-kilometer-thick rock deposit that fills in the impact basin. They visited the formation and used detailed mapping and geological exploration to guide collection of rock samples, which were later analyzed using scanning electron microscopy and laser ablation mass spectrometry.

The analysis revealed that various igneous materials are peppered throughout the Onaping Formation, suggesting that volcanic activity persisted throughout its development. The composition of the lower part of the formation is consistent with short-term volcanic activity driven by the initial impact melt. As seawater flooded the new basin, it interacted with the impact melt to cause explosive eruptions.

However, the upper 1000 meters of the Onaping Formation contain igneous materials that are richer in magnesium, suggesting that they have a deeper origin. The scientists propose that as the impact-melted rock cooled, the initial volcanic activity subsided and was replaced by underwater eruptions fed by magma traveling up from Earth’s mantle. These eruptions lasted for up to 1.5 million years after the impact event.

This hypothesis invokes the controversial idea that large impacts like the Sudbury collision can relieve pressure on the underlying mantle, causing mantle rock to melt and form eruptive magma. In the future, the researchers say, further analysis of rocks from the upper layer of the Onaping Formation could help clarify the later magma source.

The Sudbury crater is the second-largest impact crater on Earth and one of just a few large craters that are still well preserved despite tectonic activity, so these new findings could provide clues to other major impacts in the past. Although Earth’s geology is unique, the findings could also improve understanding of large impacts on other planets and the Moon, some of which are thought to have torn through to the mantle. (Journal of Geophysical Research: Planets, https://doi.org/10.1002/2016JE005085, 2017)

—Sarah Stanley, Freelance Writer

David S. Evans (1936–2016)

Thu, 05/25/2017 - 11:46

Dave Evans was one of the pioneers of rocket measurements over the aurora. He is perhaps best known for his incisive research on plasma electron acceleration that produces vivid auroral displays. On 14 October 2016, he passed away peacefully at his home in Boulder, Colo., at age 80.

Dave was born in Milwaukee, Wis., on 17 June 1936. He received a bachelor’s degree in physics from the University of Michigan. His long and productive research career began with graduate research at the Space Sciences Laboratory of the University of California, Berkeley; he then moved to NASA’s Goddard Space Flight Center (GSFC) in Maryland.

Dave moved to Boulder, Colo., in 1970, where he was a research scientist with the National Oceanic and Atmospheric Administration’s (NOAA) Space Environment Center (later renamed the Space Weather Prediction Center) until he retired in 2003. Before he retired, Dave did two tours of duty at NASA Headquarters in Washington, D.C., returning each time to Boulder.

During the 1960s, Dave worked with the Bendix Corporation to improve their “channeltron” electron multipliers by curving the channels to reduce ion feedback, which improved their gain and sensitivity dramatically. He used that technical knowledge and experience at GSFC to develop an “open windowed electron multiplier” to measure auroral particles.

Examining the Aurora

This innovation also led to improvements in imaging detectors based on microchannel plates. To this day, nearly all space plasma observations are based on these devices. Sounding rockets launched from Canada carried instruments incorporating these imaging detectors. Dave used data from these instruments to perform some of the earliest measurements of the low-energy electron precipitation into the atmosphere that produces the aurora borealis and australis.

His work with the rocket data in the mid-1970s led to his most famous paper, in which he convincingly answered one of the most actively debated questions of that period. He proved for the first time that electric fields aligned parallel to magnetic field lines were responsible for energizing the electrons that form Earth’s magnificent aurora. He did this by showing that auroral backscatter and secondary electrons from the atmosphere are trapped and that the observed characteristics of auroral electrons inevitably result. Satellite observations confirmed his prediction that such electrons flow to high altitudes in the absence of discrete aurora.

Dave’s interests, and his views on auroras and the requisite electron acceleration, were perhaps best captured in a quote often attributed to him: “Waterfalls are turbulent. But turbulence doesn’t make the water fall.” He also disparaged Faraday’s concept of magnetic field lines of force, believing these field lines to be a conceptual crutch that obscured a clear understanding of the magnetic fields produced by moving charged particles (electrical currents). He was critical of the idea that magnetic field lines act like rubber bands that can “reconnect.” There is no doubt that his opinions on magnetic fields were controversial; he was not afraid to challenge conventional wisdom. In doing so, he set an example for all of us scientists.

Many of us also know him for his statistical maps of auroral energy influx and characteristic energy. These maps became an operational product at NOAA, and scientists continue to use them today. His auroral maps are used in physics-based models of the upper atmosphere and in assimilative models of auroral electrodynamics.

From 1989 to 1991, during his first stint at the Space Physics Division at NASA in Washington, D.C., he reorganized the peer review process for research proposals and coordinated many of the reviews over the following years. His mantra was: What’s the science question you want to address? Why is it important? And how are you going to answer the question? The basis of his review process is still used at NASA today.

A Passion for Science

Dave was a penetrating and insightful thinker on many subjects and issues, some political in nature. When one of us first met him, his office featured a “Nixon countdown calendar” that marked the days until the end of Nixon’s term in office. During his tours at NASA Headquarters, his interests in space policy were ignited. He was an avid reader of policy-related works, notably The Heavens and the Earth: A Political History of the Space Age by Walter A. McDougall.

Many of us also know that Dave had a wry sense of humor: Upon receiving a book that had been on loan from his personal library for more than 20 years, he asked, “What happened to the $100 bill I was using for a bookmark?” One had to have a quick retort to match his quick wit.

Dave rode a BMW motorcycle as a race marshal in the Coors Classic Bicycle Race for several years. He bought his own BMW when the races ended and enjoyed many hours riding through the mountains around Boulder. Credit: Susan B. Evans

Dave was passionate about science and very generous with his knowledge, wisdom, and guidance of early-career scientists. He never hesitated to share what he knew, which meant that many young scientists could build upon his accomplishments. Dave was a truly unselfish leader and motivator, and he loved to chat about physics, particularly over a beer! His scientific integrity and passion for physics have been an inspiration to the life and career of many scientists.

Dave was a family man and is survived by Susan, his wife of 57 years; 8 children; 15 grandchildren; and 5 great-grandchildren. Dave loved the Green Bay Packers, Judy Collins songs, and reading. He enjoyed constructing intricate models of World War II airplanes, and he loved riding his motorcycle along back mountain roads and Sunday drives with his family. He was especially fond of XXX IPA beer and sudoku at Southern Sun Pub & Brewery in Boulder’s Table Mesa neighborhood, where the employees knew him affectionately as Doctor X.

Dave was a talented, unselfish, and courageous man who will be sorely missed by colleagues, family, and friends, both local and worldwide.

—Thomas E. Moore, Heliophysics Science Division, NASA Goddard Space Flight Center, Greenbelt, Md.; and Tim Fuller-Rowell (email: tim.fuller-rowell@noaa.gov), Cooperative Institute for Research in Environmental Sciences, University of Colorado Boulder; and Space Weather Prediction Center, National Oceanic and Atmospheric Administration, Boulder, Colo.

Federal Science Funding Drops Sharply in Trump Budget Plan

Wed, 05/24/2017 - 19:28

The U.S. federal budget proposal for fiscal year (FY) 2018, which the Trump administration released yesterday, slashes funding for many federal science agencies. Attention now turns to Congress, which appropriates funding for federal agencies.

Scientists and environmental advocates say that the proposed budget, which the administration billed as a funding plan to “make America great again,” would weaken the country economically and reduce public health, safety, and environmental measures.

They hope that Congress will muster continued bipartisan support for Earth and space sciences, just as it did earlier this month in passing omnibus legislation to fund the government through FY 2017, which ends on 30 September. President Donald Trump signed that legislation into law even though it sharply contrasted with his FY 2018 “skinny budget” released in March that preceded the fleshed-out FY 2018 budget unveiled yesterday.

Science Agencies Take a Big Hit

As the skinny budget had foreshadowed, the full version of the FY 2018 budget proposal would dramatically cut discretionary funding compared to the FY 2017 budget to offset increases in military funding and other administration priorities (see Table 1).

“The White House’s 2018 budget plan, if it were to become law, would devastate America’s science and technology enterprise and negatively affect our nation’s economy and public well-being,” Rush Holt, CEO of the American Association for the Advancement of Science in Washington, D.C., said in a briefing held by the association on Tuesday.

Table 1. Proposed Federal Budget for Selected Earth and Space Science Agencies and Departments for FY 2018 (a)

Federal Agency/Department | FY 2017 Enacted (b) | FY 2018 Budget (b) | Change (b) | Percent Change
Department of Energy, Office of Science | 5392 | 4472 | –919 | –17.0
Department of Energy, ARPA-E | 306 | 20 | –286 | –93.5
Environmental Protection Agency | 8058 | 5700 | –2358 | –29.3
NASA | 19,653 | 19,052 | –600 | –3.1
National Science Foundation | 7472 | 6652 | –819 | –11.0
National Oceanic and Atmospheric Administration | 5675 | 4775 | –899 | –15.9
U.S. Geological Survey | 1085 | 922 | –163 | –15.0

(a) Sources: Budget of the U.S. Government: A New Foundation For American Greatness—Fiscal Year 2018; American Geophysical Union Public Affairs Department analysis.

(b) In millions of U.S. dollars, rounded to the nearest million.
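The change and percent-change columns of Table 1 follow directly from the two budget columns; a quick recomputation (values in millions of dollars, as in the table) is sketched below. Differences in the last digit are possible because the table's figures are rounded to the nearest million.

```python
# FY 2017 enacted and FY 2018 proposed budgets from Table 1,
# in millions of U.S. dollars: agency -> (fy17, fy18).
budgets = {
    "DOE Office of Science": (5392, 4472),
    "DOE ARPA-E": (306, 20),
    "EPA": (8058, 5700),
    "NASA": (19653, 19052),
    "NSF": (7472, 6652),
    "NOAA": (5675, 4775),
    "USGS": (1085, 922),
}

for agency, (fy17, fy18) in budgets.items():
    change = fy18 - fy17
    pct = change / fy17 * 100  # percent change relative to FY 2017
    print(f"{agency}: {change:+d} million dollars ({pct:+.1f}%)")
```

Running this reproduces, for example, the 93.5% cut to ARPA-E and the 15.9% cut to NOAA discussed below.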


Proposed NOAA Cuts

The budget for the National Oceanic and Atmospheric Administration (NOAA) would drop to $4.775 billion, down 15.9% from the FY 2017 budget. At NOAA, steep cuts would hit the Oceanic and Atmospheric Research Office, including that office’s Climate Research Program; the National Ocean Service; and the National Environmental Satellite, Data, and Information Service.

The proposed NOAA funding loss “would be a disaster,” former NOAA administrator Conrad Lautenbacher told Eos. “NOAA already functions on a budget well below national requirements. This reduction on top would mean the loss of vital programs that support fisheries, agriculture, transportation, ocean and coastal management, and the scientific research and development essential to national prosperity now and in the future.”

Lautenbacher served as NOAA administrator from 2001 to 2008 under President George W. Bush. He is now the CEO of GeoOptics, a Pasadena, Calif., company developing a constellation of small satellites to collect climate and environmental data.

Major Drop in Funding Requested for EPA

Mick Mulvaney, director of the White House Office of Management and Budget, holds up a copy of President Donald Trump’s proposed fiscal year 2018 federal budget as he speaks to the media on Tuesday. Credit: Associated Press/Andrew Harnik

The Environmental Protection Agency (EPA) would receive $5.700 billion, down 29.3%. Within EPA, funding would discontinue for more than 50 programs, including the Obama administration’s Clean Power Plan to limit power plant emissions, international climate change programs, and climate change research and partnership programs.

White House Office of Management and Budget director Mick Mulvaney said that the administration is “absolutely not” antiscience, despite cuts to science agencies and climate change initiatives. “We’re simply trying to get things back in order to where we can look at the folks who pay the taxes and say, ‘Look, yeah, we want to do some climate science, but we’re not going to do some of the crazy stuff the previous administration did,’” he said.

Other Agency Cuts

The Department of Energy’s Office of Science allocation would drop to $4.472 billion (down 17.0%), with the Advanced Research Projects Agency–Energy (ARPA-E) plummeting to $20 million, down 93.5%. The administration said that the private sector is “better positioned to finance disruptive energy research and development and to commercialize innovative technologies.”

However, Kateri Callahan, president of the Alliance to Save Energy in Washington, D. C., said that ARPA-E advances high-potential and high-impact energy technologies “that are too early for private-sector investment.” She added that cuts to federal energy efficiency programs are “not the way to make America great again.”

The National Science Foundation and the U.S. Geological Survey also would lose ground under this budget, as would Earth sciences within NASA. Although the budget for NASA planetary sciences would increase 4.5% under the proposed budget, the plan basically instructs NASA “to stop looking at Earth and look at other planets,” said David Doniger, director of the climate and clean air program of the Natural Resources Defense Council, which is headquartered in New York.

Antonio Busalacchi, president of the Boulder, Colo.–based University Corporation for Atmospheric Research, worried that the administration’s proposed cuts to research into Earth system sciences would undermine scientific progress aimed at better protecting the nation from natural disasters. He said the cuts “would have serious repercussions for the U.S. economy and national security and for the ability to protect life and property.”

Reaction from Congress

Democrats in Congress panned the budget proposal. With this proposal, the president has shown that “he has no intention of prioritizing investments in the [research and development] that drives our economy, keeps our nation competitive, and protects the environment and public health,” said Rep. Eddie Bernice Johnson (D-Texas), ranking member of the House Committee on Science, Space, and Technology. “I know that many members of Congress on both sides of the aisle share my concerns about this harsh and misguided budget request, and I hope and expect that by the time the appropriations process is over we will have achieved a saner outcome.”

Some Republicans also have qualms about the budget. Rep. John Culberson (R-Texas), chair of the House Appropriations Subcommittee on Commerce, Justice, Science, and Related Agencies and a strong supporter of NASA exploration, said he will ensure that his committee “strikes a balance between fiscal conservatism and funding our nation’s law enforcement agencies, space program, and scientific agencies.”

Upcoming Eos stories will provide additional coverage of the federal budget.

—Randy Showstack (@RandyShowstack), Staff Writer

Editor’s Note: AGU, which publishes Eos, issued a statement yesterday about the Trump administration’s FY 2018 budget request. The proposal “charts a course of destructive under-funding for scientific agencies,” writes AGU executive director and CEO Christine McEntee.

Unseasonable Weather Entrenches Climate Opinions

Wed, 05/24/2017 - 11:55

In February 2015, Oklahoma Senator Jim Inhofe presented a snowball to the U.S. Senate as proof that climate change isn’t real. His argument? That the current weather outside was just too cold, so global warming must be a hoax.

Jeremiah Bohr, a sociologist at the University of Wisconsin in Oshkosh, told Eos that Inhofe’s stunt was “a powerful argument”—in a political sense.

Bohr recently published a paper in Climatic Change exploring the relationship between short-term climate anomalies, like an unseasonably cold February or an unseasonably warm March, and the political polarization surrounding the climate change conversation.

“It’s interesting [to see] whether part of what shapes our perception of climate change is our experience of everyday weather and our memory of seasons in the recent past,” Bohr said.

Even though climate scientists overwhelmingly agree that Earth is warming and that humans drive this warming by releasing greenhouse gases, as soon as a cooler day blows through or “you get that snowball,” those who don’t believe in climate change—people who are more likely to be Republicans or conservative leaning—will “double down” on those beliefs, Bohr said. Similarly, Bohr found that warmer- or cooler-than-average days will reinforce a belief in climate change among Democrats.

Watch the snowball incident here:

Surveys of Opinion

To find out how short-term climate anomalies affected the polarization of climate change opinions, Bohr looked at CBS/New York Times surveys of American adults for four periods over 2 years: February and March 2013 and February and May 2014.

The surveys asked respondents whether they thought global warming is an environmental problem causing serious impact now, will have an impact in the future, or will have no impact at all. They also asked respondents to identify whether they believe that climate change is real, caused by humans, or caused by natural variation. Bohr then compared these results with national monthly temperature averages compiled by the National Oceanic and Atmospheric Administration over those 2 years and assumed that those who answered the survey questions were in their states of residence at the time.

Bohr found that when temperatures ran 3° or more above or below a 5-year average, a higher percentage of Democrats provided answers supporting the existence of climate change, whereas a higher percentage of Republicans provided answers denying the existence or danger of climate change. Both ends of the political spectrum committed more to their respective opinions, Bohr said.

For Democratic or liberal-leaning respondents, this trend meant that slightly cooler or warmer temperatures convinced them even more that climate change was real. For Republican or conservative-leaning respondents, the opposite was true: slightly cooler or warmer temperatures convinced them even more that climate change was either not real or not a problem.
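As a rough illustration of the threshold described above, the sketch below flags hypothetical state-month temperature records that depart from a 5-year baseline by 3° or more. The data, state labels, and function are invented for illustration; this is not the study’s actual analysis code.

```python
# Illustrative sketch (not Bohr's analysis code): flag months in which the
# temperature departs from a 5-year baseline by 3 degrees or more, the
# threshold the study associates with entrenched opinions.

def is_anomalous(monthly_temp, five_year_avg, threshold=3.0):
    """Return True when the month departs from the baseline by >= threshold degrees."""
    return abs(monthly_temp - five_year_avg) >= threshold

# Hypothetical records: (state, observed monthly temp, 5-year average for that month)
records = [
    ("OK", 34.0, 40.5),  # unusually cold February
    ("TX", 51.0, 50.0),  # near normal
    ("WI", 41.5, 36.0),  # unusually warm March
]

anomalous = [state for state, temp, avg in records if is_anomalous(temp, avg)]
print(anomalous)  # -> ['OK', 'WI']
```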

Media Matters

Bohr wonders whether the “entrenched” political polarization of news media could contribute to these responses.

A 2014 Pew Research Center survey found that 47% of “consistent conservatives” mainly get their news from Fox News, and 88% of consistent conservatives trust Fox News over other news outlets like NPR, the New York Times, MSNBC, BBC, and others. Fox News has been criticized for its climate change coverage—or lack thereof. A Media Matters study from 2013 found that when covering a United Nations Intergovernmental Panel on Climate Change report, Fox News cast doubt on the science behind climate change 75% of the time over a 2-month period.

Meanwhile, BBC, NPR, and the New York Times were the most trusted by liberal respondents, and those outlets haven’t faced as much criticism for their climate change–related coverage (although recently the New York Times has come under fire for hiring a columnist who some people say holds contrarian views on climate change that fly in the face of the scientific evidence for it).

This polarization of the media means that “we don’t really have a common space where we can all agree that there [is an] objective set of studies and facts,” Bohr said. “I just don’t think, as long as our media networks are as polarized as they are, that we’re going to see this polarization going away.”

“Political ideology is the strongest predictor of your opinion on climate change in the [United States],” said Peter Howe, a geographer who studies the perception of climate change at Utah State University in Logan and wasn’t involved with the new paper.

Although Howe finds the new study intriguing, he said he’d like to see more research into how particular climate events—and their effects on the public’s opinion about climate change—might affect conversations surrounding mitigation efforts.

—JoAnna Wendel (@JoAnnaScience), Staff Writer

Mining Ancient Texts Reveals Clues to Space Weather of Yore

Wed, 05/24/2017 - 11:53

Scientists use sophisticated instruments such as satellites to monitor space weather today, but watchers of the skies have been observing phenomena such as auroras for millennia. Japanese researchers recently identified what may be the earliest known, datable sketch of an aurora and say it can shed light on solar activity more than 1000 years ago.

The crude marginalia were found in the Zūqnīn Chronicle, a history of events from Creation to the late 8th century that is preserved in the Vatican Apostolic Library. Composed in 775 and 776 CE, the manuscript is written in a dialect of Aramaic and attributed to a monk dubbed Joshua the Stylite, who lived in the monastery of Zūqnīn in what is now eastern Turkey. The manuscript yielded a total of 10 drawings of heavenly phenomena, including a sketch of horizontal bands from 771/772 CE. The chronicle describes it thus:

It was seen at harvest time, occupying the entire northern side from the eastern corner to the western corner. Its form was as follows: a blood-red scepter, a green one, a black one, and a saffron-colored one. It was going from below to above. When one scepter was extinguished, another one went up. And when someone was looking at it, it was changed into seventy shapes.

“One of the most obvious scientific merits for doing this research is that we can confirm past extreme events,” said Hiroaki Isobe, an associate professor in the Graduate School of Advanced Integrated Studies in Human Survivability at Kyoto University. Isobe has collaborated with more than a dozen scientists and historians in searching and analyzing various archives for records of sightings in the heavens.

“For instance, in 775 and 994 there were sharp peaks in carbon-14 seen in tree rings, which is evidence of large amounts of cosmic rays in the atmosphere,” Isobe told Eos on Monday after he spoke at a joint conference of the Japan Geoscience Union (JpGU) and the American Geophysical Union (AGU) in Chiba, Japan.

“This tells you that cosmic rays were there, but not their origin, such as whether they were from extreme space weather or gamma ray bursts or supernovas. If we can find evidence of low-latitude auroras in the same year as these peaks, it strongly supports the hypothesis that this carbon-14 is due to strong solar activity.”

Stormy (Magnetic) Weather

That the circa 771–772 auroras were visible from the relatively low geomagnetic latitude of eastern Turkey suggests they were associated with strong geomagnetic storms, according to Isobe. Although the Zūqnīn sketch was known to some historians, Isobe said his team is the first to investigate it in detail and confirm it was an aurora. Sketches are particularly useful in historical astronomy, he added, because words alone can be harder to interpret.

Written descriptions of auroras have been found in cuneiform clay tablets from Babylonia, whereas ancient Chinese and Japanese observers used terms such as “red vapor” or “white rainbow” to describe auroras. To determine whether a record really describes an aurora, features such as the time, moon phase, color, size, and direction, as well as other contemporary observations, must be taken into account. If the report was made during a full moon, for instance, it could be the result of atmospheric scattering of moonlight. But if it can be established that the phenomenon was an aurora, such reports also serve as records of solar activity.
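In the spirit of the screening criteria above (time, moon phase, color, direction), here is a toy filter over hypothetical sky reports. The rules and records are deliberate simplifications for illustration, not the researchers’ actual method.

```python
# Toy screening of historical sky reports: keep candidates that face
# north(ish), show aurora-like colors, and were not made under a full moon
# (moonlight scattering can mimic a glow in the sky).

def plausible_aurora(record):
    """Apply simplified, hypothetical aurora-screening rules to one report."""
    aurora_colors = {"red", "green", "white"}
    return (
        record["moon_phase"] != "full"
        and record["direction"] in {"N", "NE", "NW"}
        and bool(aurora_colors & set(record["colors"]))
    )

# Invented example reports:
reports = [
    {"year": 772, "direction": "N", "colors": ["red", "green"], "moon_phase": "new"},
    {"year": 765, "direction": "S", "colors": ["white"], "moon_phase": "full"},
]

kept = [r["year"] for r in reports if plausible_aurora(r)]
print(kept)  # -> [772]
```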

“You can actively mine these archives, and there’s a huge treasure trove of information in them,” said Martin Connors, Canada Research Chair in Space Science, Instrumentation and Networking at Athabasca University in Alberta. “I think they’re being very careful about interpretation, such as using observations taken on a moonless night and correlating them with changing magnetic latitude.”

Ryuho Kataoka of Japan’s National Institute of Polar Research, one of Isobe’s collaborators and a fellow speaker at a space weather session of the JpGU-AGU conference, shared the results of a study showing that aurora sightings in Japan in 1204 were likely caused by significant magnetic storms resulting from multiple coronal mass ejections. He noted that the observations were made at Kyoto during a time when, because of the orientation of Earth’s magnetic field, the region was especially susceptible to geomagnetic effects. Kataoka showed attendees images of handwriting from the Meigetsuki (The Record of the Clear Moon), a diary written by Fujiwara no Teika, a poet-scholar who died around 1241.

“Their best events were when the northern latitude in Kyoto was most favorable for seeing them. That corroborates these reports being of aurora,” Connors said.

Comet Tale

The Zūqnīn manuscript also contains a sketch of a comet plus a description that says the comet had two tails. The authors noted that the date coincides with the appearance of Halley’s Comet in May 760 and that it is known to have also been observed by Chinese astronomers. A simulation using astronomy software led the researchers to conclude that the Zūqnīn chronicle contains the earliest known description of two tails in a comet.

Isobe and his colleagues have published more than 10 papers about ancient records of celestial phenomena, including a January 2017 study about the aurora sketches in Publications of the Astronomical Society of Japan and a February 2017 study in Space Weather.

—Tim Hornyak (@robotopia), Science and Technology Journalist

Correction, 25 May 2017: This article was updated with the correct day of the week when an interview took place.

A 1.4-Billion-Pixel Map of the Gulf of Mexico Seafloor

Wed, 05/24/2017 - 11:51

The geology of the Gulf of Mexico (GOM) is dynamic, driven not by plate tectonics but by the movement of subsurface bodies of salt. Salt deposits, a remnant of an ocean that existed some 200 million years ago, respond to the weight of overlying sediments by deforming: they compact, squeeze into cracks, and balloon into the overlying material.

Such salt tectonics continue to sculpt the geologic strata and seafloor in the GOM like few other places on Earth. Because of this salt tectonism and a steady supply of sediment delivered to the basin by rivers, the GOM’s seafloor is a terrain continually in flux. The bathymetry is rife with active faults and escarpments, slump blocks and slides, canyons and channels, sediment waves, pockmarks and mud volcanoes, and other natural oil and gas seeps.

Now a new regional seafloor data set created by the U.S. Department of the Interior’s Bureau of Ocean Energy Management (BOEM) reveals that dynamic environment with stunning new clarity. The data include detailed seismic surveys originally shot by 15 different companies involved in the oil and gas industry. BOEM gained permission to release the relevant proprietary data publicly in a freely downloadable aggregate map of the seafloor.

Fig. 1. Northern Gulf of Mexico deepwater bathymetry grid created from 3-D seismic surveys. The grid defines water depth with 1.4 billion 12 × 12 meter cells and is available in feet and meters. BOEM grid coverage is limited to the area defined by rainbow colors. Shaded relief is vertically exaggerated by a factor of 5. Locations of Figures 2–9 are annotated. Credit: BOEM

With a resolution as fine as 149 square meters per pixel, about equal to the areal footprint of an American single-family house, BOEM’s bathymetry map is at least 16 times finer than the map historically used for the northern GOM. Most of those house-sized pixels lie 1–3 kilometers underwater, and the product contains 1.4 billion of them, making this a gigapixel map.
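The resolution figures above can be sanity-checked with a little arithmetic. The pixel areas and cell count come from the article; the calculation itself, and the implied-area figure it produces, are ours.

```python
# Back-of-the-envelope check of the quoted resolution and grid size.

boem_pixel_area = 149      # m^2 per pixel, roughly a 12.2 m x 12.2 m cell
legacy_pixel_area = 2500   # m^2 per pixel, the historic NOAA/NGDC/GCOOS grid
n_pixels = 1.4e9           # cells in the BOEM grid

# Ratio of pixel areas: how much finer the new map is.
improvement = legacy_pixel_area / boem_pixel_area
print(round(improvement, 1))  # -> 16.8, i.e., "at least 16 times" finer

# Total area implied by the grid, in square kilometers.
area_km2 = n_pixels * boem_pixel_area / 1e6
print(round(area_km2))  # -> 208600
```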

How Did the Salt Get There?

It is hypothesized that the salt precipitated out of hypersaline seawater when Africa and South America pulled away from North America during the Triassic and Jurassic, some 200 million years ago. The GOM was initially an enclosed, restricted basin into which seawater infiltrated and then evaporated in an arid climate, causing the hypersalinity (similar to what happened in the Great Salt Lake in Utah and the Dead Sea between Israel and Jordan).

Salt filled the basin to depths of thousands of meters until it was opened to the ancestral Atlantic Ocean and consequently regained open marine circulation and normal salinities. As geologic time progressed, river deltas and marine microfossils deposited thousands more meters of sediments into the basin, atop the thick layer of salt.

The salt, subjected to the immense pressure and heat of being buried kilometers deep, deformed like putty over time, oozing upward toward the seafloor. The moving salt fractured and faulted the overlying brittle sediments, in turn creating natural pathways for deep oil and gas to seep upward through the cracks and form reservoirs within shallower geologic layers [Buffler and Sawyer, 1985; Hudec et al., 2013].

Out with the Old? Not So Fast

The most popular bathymetry map of the northern Gulf of Mexico has been the version generated in the 1990s by the National Oceanic and Atmospheric Administration (NOAA), the National Geophysical Data Center (NGDC), and Texas A&M’s Gulf of Mexico Coastal Ocean Observing System (GCOOS). The organizations compiled it using data from various multibeam sonar surveys and 2-D seismic lines spaced kilometers apart, providing a resolution of up to 2500 square meters per pixel. This is excellent resolution, geophysically speaking, and for the past 2 decades the map has been a respected and popular regional data set within science, academia, and the oil and gas industry.

Figure 2. Horseshoe Basin in the western Gulf of Mexico, as compared using (left) the historic NOAA bathymetry map and (right) BOEM’s new map. The basin contains a salt dome at its center and is flanked by salt sheets. Movement of the salt is evident from the network of faults and rifts expressed on the seafloor around the basin, as well as from the sediment debris flows seen falling down the slopes of the basin and onto its floor. Credit: BOEM

BOEM’s new map, derived exclusively from 3-D seismic data, doesn’t cover as large an area as the NOAA/NGDC/GCOOS map, but its enhanced resolution and consistent pixel size reveal undiscovered and previously poorly resolved geologic features over the continental slope, salt minibasin province, abyssal plain, Mississippi Fan, and the Florida Shelf and Escarpment. However, because of the new map’s smaller coverage, the historic map will continue to be very useful.

Figure 3. BOEM’s new map extends by hundreds of kilometers the visualization of Joshua Channel on the eastern Gulf of Mexico abyssal plain [Posamentier, 2003] compared with older data. It’s visible on the seafloor for 280 kilometers, far beyond the bounds of this image, and an additional 240 kilometers is buried beneath younger sediment systems and muddy drape. BOEM research has established updip linkage with the ancestral Pearl River in Louisiana [Frazier, 1974; Mobley, 2005], and similar-scale channel–levee complexes have been observed in the Amazon Fan [Lopez, 2001]. Credit: BOEM

BOEM’s Seismic Database

BOEM researchers constructed the map using BOEM’s confidential database of 3-D seismic surveys, each survey having been originally shot by the oil and gas industry in its search for hydrocarbons. Because BOEM is the bureau responsible for issuing geophysical survey permits in offshore federal waters, the U.S. Code of Federal Regulations gives it the right to request a copy of each survey after the survey has been processed and cleaned up to meet specific quality standards.

After receiving a survey from a geophysical contractor or oil company, BOEM scientists use the data to assist with other important regulatory duties, such as assessing the geology for potential and discovered reservoirs of oil and gas. As of 2017, this 3-D seismic database of confidential data covers 350,000 square kilometers of the Gulf of Mexico, an area larger than the state of New Mexico. The oldest surveys in this database date back to the 1980s.

Deepwater Horizon and the First Integrated Map

In an ongoing effort since 1998, BOEM has used that database to map the seafloor across hundreds of surveys with the goal of identifying potential hard-ground substrates at naturally occurring oil and gas seeps suitable for benthic communities of corals and chemosynthetic organisms (e.g., mussels, clams, and tubeworms). These organisms consume the hydrocarbons and hydrogen sulfide released from those seeps.

When the tragic Deepwater Horizon oil spill occurred in 2010, marine biologists of NOAA’s Natural Resource Damage Assessment division needed a detailed map of the seafloor surrounding the incident to model how many of those benthic communities may have been affected. NOAA biologists, aware of BOEM’s expansive seafloor database, requested that its geoscientists create a semiregional map that NOAA could use to model the area affected by the oil plume.

The effort required the researchers to devise a method for combining their multiple overlapping seafloor maps of the spill region, made using different 3-D seismic surveys, into a single gridded surface. Through that, the idea for an even broader gigapixel map was born.

Creating a Gigapixel Grid

Having developed the method and delivered the map to the biologists, the geoscientists realized the potential they had available to them: They could combine the rest of their seafloor maps to cover most of the northern GOM under deep water.

BOEM geoscientists used 3-D time-migrated surveys (in which the vertical axis records seismic two-way travel time in milliseconds, not depth in feet or meters) to create the original grid. The researchers then converted the grid cells from time to depth using an algorithm developed by Advocate and Hood [1993]. They then compared the resulting depth grid with more than 300 well penetrations across the GOM to determine the time-depth conversion error, which averaged 1.3% of water depth.

The highest average error, 5%, occurs in water depths shallower than 150 meters because of the nature of conventional seismic acquisition in shallow water and the high variability of temperature and salinity in shallow water, which affect the velocity of sound in water. BOEM scientists decided that the seismic data acquired on the GOM’s shallow shelf often contain too much noise for the seafloor interpreter to accurately determine where the water ends and the sediment begins. This meant that BOEM’s map could not include certain areas of the shelf, making it smaller than the historic NOAA map, which does cover the shelf.

Within the depth range of 500 to 3300 meters (where the largest part of the grid exists), average error was calculated to be less than 0.5% of water depth. This low error meant that data from these depths would reveal the finest-resolution regional deepwater bathymetry ever created.
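The sketch below illustrates the shape of the time-depth check described above. BOEM’s actual conversion uses the Advocate and Hood [1993] velocity algorithm; here a constant seawater velocity stands in purely for illustration, and the well ties are invented.

```python
# Minimal time-depth conversion sketch. A constant water velocity replaces
# the Advocate and Hood [1993] velocity function used by BOEM (assumption),
# and the well depths below are hypothetical.

WATER_VELOCITY = 1500.0  # m/s, a typical seawater sound speed (assumption)

def twt_to_depth(two_way_time_ms, velocity=WATER_VELOCITY):
    """Convert two-way travel time in milliseconds to depth in meters."""
    one_way_s = (two_way_time_ms / 1000.0) / 2.0
    return one_way_s * velocity

def percent_error(predicted_m, well_depth_m):
    """Error expressed as a percentage of water depth, as in the article."""
    return abs(predicted_m - well_depth_m) / well_depth_m * 100.0

# Hypothetical well ties: (two-way time in ms, depth measured at the well in m)
wells = [(2000.0, 1485.0), (4000.0, 3030.0)]
errors = [percent_error(twt_to_depth(t), d) for t, d in wells]
print([round(e, 1) for e in errors])  # -> [1.0, 1.0]
```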

Figure 4. Megafurrows carved into the Sigsbee Escarpment and abyssal plain around Green Knoll, central Gulf of Mexico. The furrow fields (see right image), not visible in the previous bathymetry grid (left image), extend more than 200 kilometers along and in front of the escarpment. They form when currents, measured up to 2 knots, excavate the seafloor. The megafurrows, first discovered in 1999 by Texas A&M deep-tow data, can be 1–10 meters deep and 5–50 meters wide [Bryant et al., 2000, 2004]. Credit: BOEM

Making an Aggregate Map

The geoscientists began with more than 200 individual seafloor maps created from 3-D surveys dating from the late 1980s to the 2010s. In the U.S. portion of the Gulf of Mexico, few areas are covered by only a single survey (some are covered by four or more), and the interpreters needed to compare one with another to determine which was made using the best data. They created a mosaic of more than 100 of their highest-quality bathymetry maps, spanning water depths of 40 to 3379 meters and interpreted on seismic surveys originally shot by 15 different geophysical companies.

Even though BOEM maintains copies of all the seismic data, the original companies retain legal ownership for a period of 25 years. Mergers and acquisitions through the years meant that instead of being required to ask 15 companies for permission to publish, BOEM needed to request it from only 7: CGG Services (U.S.), Inc.; ExxonMobil Corporation; Petroleum Geo-Services (PGS); Seitel, Inc.; Spectrum USA; TGS-NOPEC Geophysical Company; and WesternGeco, LLC.

Obtaining permission from these seven companies took months, much longer than anticipated, but eventually, BOEM received all necessary permissions and began the publication process. The new high-resolution grid is downloadable from the BOEM website. The site also offers GIS layers that classify over 34,000 seafloor features such as pockmarks, channels, hard grounds, mud volcanoes, natural seeps, and others.

Figures 5–9 showcase the detail of BOEM’s GOM gigapixel map, the payoff for 19 years of mapping efforts.

Figure 5. Subaqueous dunes and pockmarks on the upper continental slope in the northwestern Gulf of Mexico. Longitudinal megadunes measure 0.5–1 kilometer crest to crest, 1–10 kilometers long, and 3–10 meters tall. Pockmarks occur atop Nueces Dome (top center) and, Gulf-wide, pockmarks occur within a general water depth range of 300–600 meters. Pockmarks in this region have been attributed to explosive dissociation of natural methane hydrate following basinward migration of the hydrate stability zone during the Wisconsin Glacial sea level lowstand [Hovland and Judd, 1988; Roberts and Carney, 1997]. BOEM has identified more than 4000 pockmarks in federal waters of the northern Gulf of Mexico. Credit: BOEM

Figure 6. Gas expulsion mounds with adjacent thrust folding and faulting caused by lateral salt migration in the southern Terrebonne Basin in the central Gulf of Mexico. The image illustrates some of the features formed by the dynamic processes shaping the Gulf, specifically salt tectonics and natural hydrocarbon seepage. Thrust faulting and folding are due to southeast verging lateral movement of salt. Movement of salt is what gives the Gulf of Mexico seafloor its wrinkled nature, also creating faults and fracture networks that provide pathways for oil and gas seeps. These particular expulsion mounds were formed as a result of basin compaction and compression, resulting in upward gas migration [McConnell and Kendall, 2002]. Credit: BOEM

Figure 7. Spectacular new detail of Alaminos and Perdido canyons and their associated fans, western Gulf of Mexico. The canyons funnel sediments to create an intermingling basin-floor fan system hundreds of meters thick. Core sampling determined that drainage from the Rio Grande provides coarse, sandy sediments to the Perdido system [Damuth et al., 2006], whereas cores and well logs in Alaminos Canyon reveal primarily fine-grained deepwater sediment [Bouma et al., 1968; Meyer et al., 2005]. Credit: BOEM

Figure 8. A salt dome has uplifted shallow sediments in the eastern Gulf of Mexico’s abyssal plain. Expulsion and depression features suggest ongoing natural fluid and/or gas seepage. As salt domes move shallower relative to the subsiding basins of sediment around them, sediments atop the domes are uplifted and form seafloor mounds. Over this dome, the movement created a network of extensional faults dividing the mound into three wedges. Faults can also provide pathways for fluid and/or gas migration, as indicated here by the circular depression, or pockmark, on the southeastern face of the mound, and an expulsion feature with a crater on the northwestern side. Credit: BOEM

Figure 9. A comparison of a region of the upper continental slope in the northwestern Gulf of Mexico, using (left) the historic NOAA bathymetry map and (right) the new BOEM bathymetry map. The NOAA grid combined areas of widely spaced multibeam sonar bathymetry with other, more coarsely spaced data from 2-D seismic lines, providing a resolution no finer than 50 meters. The BOEM grid uses 3-D seismic throughout, offering a resolution as detailed as 12 meters. Credit: BOEM

Acknowledgments

We thank CGG Services (U.S.), Inc.; ExxonMobil Corporation; PGS; Seitel, Inc.; Spectrum USA; TGS-NOPEC Geophysical Company; and WesternGeco, LLC for granting us permission to publish their data.

Ground Surveys Reveal Space Weather Risk to Spain’s Power Grid

Tue, 05/23/2017 - 11:58

When you think of regions that are vulnerable to solar storms, the first places that probably come to mind are at high latitudes, like Canada, the northern United States, and Scandinavia—regions that regularly experience strong geomagnetic activity and spectacular displays of aurora.

A country like Spain, with its balmy Mediterranean climate and geomagnetic latitude equivalent to Florida, probably doesn’t leap to mind. However, history shows that lower-latitude regions need to be on guard as well. During the famous solar storm of 1859, the strongest on record, auroras were reported as far south as Cuba, and telegraphs failed across the United States and Europe.

Now a new study by Torta et al. demonstrates how to improve vulnerability assessments of Spain’s power grid by measuring the conductivity of the bedrock below critical substations. The method could be used by other countries seeking to quickly assess their vulnerability to space weather.

The threat to power grids during such storms stems from the storms’ ability to induce strong geomagnetic activity, which, in turn, can induce strong currents in power lines. Complicating matters is the fact that the strength of these currents also depends on the conductivity of the ground underneath. This varies with the type of rock and can change significantly across geological boundaries or where large bodies of water are present. Although some nations have dedicated resources to comprehensively map these differences over thousands of locations, like the EarthScope project in the United States, other nations may be able to survey only at a few places, near important substations in the power grid.

To evaluate surveys’ usefulness, the authors conducted one such survey near a substation in Spain’s power grid located in Vandellòs, in Catalonia on the Mediterranean Sea, near a nuclear power plant.

These surveys measure the natural electromagnetic signal present at each site, which reflects the region’s induction response to natural variations in the magnetic field. Those fluctuations act like electromagnetic pulses fired into the ground, sending currents racing through the rock below. Electromagnetic disturbances then propagate back to the surface, where instruments can measure them and infer the conductivity of the underlying rock.

The team then used these measurements, called magnetotelluric (MT) soundings, in a model to predict how strong the currents would be at the Vandellòs substation during a solar storm and compared them to actual data from several storms in 2011 and 2012. The team found that MT readings significantly improved predictions, in particular, because they revealed that the nearby Mediterranean Sea strongly affects Earth’s electromagnetic response in the region. By one metric of model performance, MT readings improved the accuracy of the predictions by a factor of 8.

Furthermore, the team found that another key input in the model, the strength of Earth’s magnetic field overhead, could be interpolated from existing data without significantly hurting the model’s performance. That’s partly because at lower latitudes, Earth’s magnetic field is more uniform than at the poles. The team concluded that conducting MT surveys at critical power stations would be an effective way for regions at lower latitudes to assess their risk and be prepared when the next solar storm strikes. (Space Weather, https://doi.org/10.1002/2017SW001628, 2017)

—Mark Zastrow, Freelance Writer

Cosmic Muons Reveal the Land Hidden Under Ice

Tue, 05/23/2017 - 11:32

The land surface under a glacier is sculpted and shaped by the ice passing over it. Data about the shape of the bedrock yield information crucial to understanding erosional processes underneath a glacier. However, the inaccessibility of sites where glacial erosion currently occurs presents big challenges for advancing this understanding.

A range of techniques has been used to map the bedrock beneath glaciers, including drilling, seismic surveys, multibeam bathymetry, gravity measurements, and radio-echo soundings. The accuracy of results has been limited, so Nishiyama et al. tested a different technique: emulsion film muon radiography.

A muon detector in the Jungfrau railway tunnel awaiting arrival of the cosmic ray muons. Credit: Nishiyama et al.

Muons are formed when cosmic rays collide with atoms in Earth’s upper atmosphere. They descend toward Earth, with about 10,000 muons reaching each square meter of Earth’s surface every minute. A key property of muons is their ability to pass through matter, even dense, solid objects.

Particle detectors can be used to measure the quantity of muons and their trajectories, which can reveal information about the materials that they have passed through.

Because cosmic muons travel only downward, detectors need to be located below the objects to be surveyed. Geophysicists have used this technique to scan the interior architecture of volcanoes, seismic faults, and caves and to detect carbon leaks, but the need for underground access has made surveying the bedrock beneath glaciers a challenge.

The team of researchers found a solution in the central Swiss Alps: the Jungfrau railway tunnel, which runs through the bedrock beneath the Aletsch glacier. They set up three particle detectors in the tunnel, oriented upward with a view of the bedrock beneath the base of Europe’s largest glacier.

Three-dimensional reconstructed bedrock shape (blue) under the uppermost part of the Aletsch glacier. The shape of the interface was determined from the cosmic ray muon measurement performed at three muon detectors (D1, D2, and D3) along the railway tunnel (gray line). Bedrock that pokes through ice is in gray tones. Jungfraufirn is a small glacier that feeds the Aletsch glacier. Blue dots on the gray line represent points where scientists sampled rocks within the tunnel. The image is Figure 5b in Nishiyama et al.; dashed lines outline a cross section of this 3-D map that can be found in Figure 5c. Credit: Nishiyama et al.; base map from SWISSIMAGE, reproduced by permission of swisstopo (BA17061)

Different types of particle detectors are available for muon radiography, but the team selected emulsion films, a special type of photographic film that can be used in remote and harsh environments because it does not require any electric power or computers for operation.

Because of the density contrast between ice and rock, the patterns of muons captured on the film over a 47-day period could be used to accurately map the shape of the bedrock below the glacier.
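The principle behind this mapping can be sketched with a toy calculation. Muon absorption along a trajectory is governed by the trajectory’s opacity, the density integrated along the path; the simple two-material inversion below illustrates that idea and is not the authors’ full analysis.

```python
# Muon absorption along a trajectory is set by its "opacity": the density
# integrated along the path. Rock (~2.65 g/cm^3) is far denser than ice
# (~0.92 g/cm^3), so a trajectory's opacity reveals how much of the path
# crossed rock rather than ice. A toy two-material inversion:

RHO_ICE = 0.92   # g/cm^3
RHO_ROCK = 2.65  # g/cm^3

def opacity(path_cm, rock_fraction):
    """Density integrated along a path with a given fraction in rock."""
    rho_mean = rock_fraction * RHO_ROCK + (1 - rock_fraction) * RHO_ICE
    return path_cm * rho_mean  # g/cm^2

def rock_fraction_from_opacity(path_cm, measured_opacity):
    """Invert a measured opacity to the fraction of the path in rock."""
    rho_mean = measured_opacity / path_cm
    return (rho_mean - RHO_ICE) / (RHO_ROCK - RHO_ICE)

# A hypothetical 300 m (30,000 cm) trajectory that is 40% rock, 60% ice:
x = opacity(30_000, 0.4)
print(round(rock_fraction_from_opacity(30_000, x), 3))  # recovers 0.4
```

In practice the inversion runs the other way: the measured muon count along each trajectory yields an opacity, and many crossing trajectories from the three detectors pin down where the ice–rock boundary lies.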

Using this technique, the researchers were able to map the bedrock-ice interface beneath the glacier over a 4000-square-meter area. They were also able to infer how the glacier may respond to global warming. In particular, the team predicts more frequent rock avalanches as the ice shrinks, a hazard exacerbated by the steep bedrock geometry their reconstruction revealed. This increase is of particular concern because buildings are situated on top of the bedrock. These include tourist facilities, a research station, and communications infrastructure, as well as the railway tunnel itself, which cuts through the bedrock.

The use of cosmic muon radiography is spreading in various fields, including geophysics and civil engineering. This first application of the technique in glacial geology complements data collected by other methods and has the potential to be applied in other glacial locations underlain by a tunnel. (Geophysical Research Letters, https://doi.org/10.1002/2017GL073599, 2017)

—Jenny Lunn, Contributing Writer

Researchers Propose New Type of Planetary Object

Tue, 05/23/2017 - 11:28

Scientists suggest in a new study the existence of a planetary object called a “synestia,” a huge, spinning, donut-shaped mass of hot, vaporized rock, formed as planet-sized objects smash into each other.

At one point early in its history, Earth was likely a synestia, said Sarah Stewart, a planetary scientist at the University of California, Davis, and coauthor of the new study in the Journal of Geophysical Research: Planets, a journal of the American Geophysical Union.

Stewart and Simon Lock, a graduate student at Harvard University in Cambridge, Massachusetts, and lead author of the new study, explore how planets can form from a series of giant impacts. Current theories of planet formation hold that rocky planets such as Earth, Mars, and Venus formed early in the solar system when smaller objects smashed into each other.

These collisions were so violent that the resulting bodies melted and partially vaporized, eventually cooling and solidifying to the nearly spherical planets we know today.

Lock and Stewart are particularly interested in collisions between spinning objects. A rotating object has angular momentum, which must be conserved in a collision. Think of a skater spinning on ice: if she extends her arms, she slows her rate of spin. To spin faster, she holds her arms close by her side, but her angular momentum stays constant.

Now consider two skaters turning on ice: if they catch hold of each other, the angular momentum of each skater adds together so that their total angular momentum stays the same.
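The conservation rule behind the skater analogy can be written in a few lines of code; the moment-of-inertia and spin values below are hypothetical illustration numbers, not figures from the study.

```python
# Angular momentum L = I * omega is conserved when no external torque
# acts. A skater pulling in her arms lowers her moment of inertia I,
# so her spin rate omega must rise to keep L constant.

def spin_after_change(I_initial, omega_initial, I_final):
    """New spin rate that conserves angular momentum L = I * omega."""
    L = I_initial * omega_initial   # angular momentum before the change
    return L / I_final              # omega after: same L, new I

# Hypothetical skater: arms out, I = 4.0 kg m^2, spinning at 2.0 rad/s.
# Pulling her arms in halves I to 2.0 kg m^2 and doubles her spin rate.
print(spin_after_change(4.0, 2.0, 2.0))  # 4.0 rad/s
```

The same bookkeeping applies to colliding planet-sized bodies: however the mass rearranges itself after impact, the total angular momentum of the system is fixed.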

In the new study, Lock and Stewart modeled what happens when the “ice skaters” are Earth-sized rocky planets colliding with other large objects with both high energy and high angular momentum.

“We looked at the statistics of giant impacts, and we found that they can form a completely new structure,” Stewart said.

Lock and Stewart found that over a range of high temperatures and high angular momenta, planet-sized bodies could form a new, much larger structure, an indented disk rather like a red blood cell or a donut with the center filled in. The object is mostly vaporized rock, with no solid or liquid surface.

They have dubbed the new object a “synestia,” from “syn-,” “together” and “Estia,” Greek goddess of architecture and structures.

The key to synestia formation is that some of the structure’s material goes into orbit. In a spinning, solid sphere, every point from the core to the surface is rotating at the same rate. But in a giant impact, the material of the planet can become molten or gaseous and expands in volume. If it gets big enough and is moving fast enough, parts of the object pass the velocity needed to keep a satellite in orbit, and that’s when it forms a huge, disc-shaped synestia, according to the new study.
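The orbital threshold can be illustrated with the standard circular-orbit formula; the 10,000-kilometer radius below is a hypothetical choice for an expanded, post-impact structure, not a value from the study.

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # Earth's mass, kg

def orbital_velocity(mass_kg, radius_m):
    """Speed needed for a circular orbit at a given distance from the center."""
    return math.sqrt(G * mass_kg / radius_m)

# Hypothetical: material flung to ~10,000 km from the center of an
# Earth-mass body. Material moving faster than this stays aloft in
# orbit, the condition that lets a synestia form.
v = orbital_velocity(M_EARTH, 1.0e7)
print(f"{v:.0f} m/s")  # about 6.3 km/s
```

Material inside that speed stays bound to the rotating body; material beyond it orbits, producing the indented, donut-like structure the authors describe.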

Previous theories had suggested that giant impacts might cause planets to form a disk of solid or molten material surrounding the planet. But for the same mass of planet, a synestia would be much larger than a solid planet with a disk.

Most planets likely experience collisions that could form a synestia at some point during their formation, Stewart said. For an object like Earth, the synestia would not last very long—perhaps a hundred years—before it lost enough heat to condense back into a solid object. But synestias formed from larger or hotter objects such as gas giant planets or stars could potentially last much longer, she said.

The synestia structure also suggests new ways to think about lunar formation. The moon is remarkably like Earth in composition, and most current theories about how the moon formed involve a giant impact that threw material into orbit. But such an impact could have instead formed a synestia from which the Earth and Moon both condensed, Stewart said.

No one has yet observed a synestia directly, but they might be found in other solar systems once astronomers start looking for them alongside rocky planets and gas giants, she said.

Plastic Waste Knows No Bounds

Mon, 05/22/2017 - 11:58

It’s like the island of misfit toys, only less endearing. Way out in the South Pacific Ocean, more than 5000 kilometers away from any large land mass, tiny, uninhabited Henderson Island carries the highest density of plastic debris reported anywhere on the planet.

Researchers occasionally visit the island, which is the largest of the United Kingdom’s Pitcairn Islands, but only every 5 to 10 years. This time around, a group from the University of Tasmania in Hobart, Australia, didn’t just find glassy aquamarine waves lapping against pristine, white sand—they found piles and piles of plastic debris: an estimated 37.7 million pieces of plastic, weighing as much as 17 metric tons.

The researchers just published a new paper about the plastic-covered island in the Proceedings of the National Academy of Sciences of the United States of America.

The team, led by marine ecotoxicologist Jennifer Lavers, found 671 pieces of plastic per square meter, and that’s without even a full sweep of the island, she said. Their data likely underestimate the actual amount of trash because they picked out only pieces larger than 2 millimeters down to a depth of 10 centimeters into the sand and couldn’t sample along the cliffs and rocky parts of the island. They estimate that more than 3570 new pieces wash up every day on just one of the island’s beaches.
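As a back-of-envelope check, the headline figures hang together; the implied beach area and per-piece mass below are inferences from the reported numbers, not figures stated by the researchers.

```python
pieces_per_m2 = 671      # measured density of plastic pieces
total_pieces = 37.7e6    # estimated total pieces on the island
total_mass_kg = 17_000   # 17 metric tons

# Area of beach implied by extrapolating the measured density:
implied_area_m2 = total_pieces / pieces_per_m2
# Average mass of a single piece of debris:
grams_per_piece = total_mass_kg * 1000 / total_pieces

print(round(implied_area_m2))     # ~56,000 square meters
print(round(grams_per_piece, 2))  # ~0.45 g: mostly small fragments
```

The sub-gram average mass is consistent with the team’s 2-millimeter size cutoff: most of the debris is small fragments rather than intact objects.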

“What’s happened on Henderson Island shows there’s no escaping plastic pollution even in the most distant parts of our oceans,” Lavers said. The trash likely journeys from South America or elsewhere, washed offshore or dumped into the sea by fishing boats.

This plastic threatens marine animals. Not only is it a hazard to eat, the plastic also blocks some animals from moving around the shore. Plastic debris also contributes to lowering biodiversity along coastlines, Lavers said.

Scientists aren’t sure exactly how much plastic swirls around Earth’s oceans, but a 2015 study narrowed that number down to between 4 and 12 million metric tons—that’s 1.5%–4.5% of the world’s annual plastic production. And that plastic waste reaches unexpectedly remote places, like the Arctic seafloor and even along the Mariana Trench wall, one of the deepest places on Earth.

—JoAnna Wendel (@JoAnnaScience), Staff Writer

Deep Trouble! Common Problems for Ocean Observatories

Mon, 05/22/2017 - 11:54

Scientific observation of the ocean is difficult. The cost to repair or replace a failed device can run many orders of magnitude higher than the base component cost.

Instruments, cables, and connectors supplied by commercial oceanographic equipment vendors fail at unacceptably high rates. Most ocean observatories have developed testing and burn-in procedures to weed out problem instruments early, but tested instruments still commonly fail after deployment.

In September 2016, representatives from ocean observatories around the world attended a workshop at the Monterey Bay Aquarium Research Institute to share their experiences and exchange ideas for improvement. Attendees were asked to describe specific examples of trouble and approaches for mitigation. Several oceanographic equipment vendors were also invited, and a few were brave enough to attend. Bravery was needed because products from many vendors have been identified as sources of trouble by more than one observatory.

Biofouling during long-term deployments, such as the algae growing on the Regional Cabled Array 200 Meter Mooring and Shallow Profiler located at the Juan de Fuca Ridge’s Axial Seamount, could prevent sensors from reading data. At the time the picture was taken, the mooring had been deployed for 2 years, and the profiler (above the yellow winch drum) had been operating continuously for 1 year. Such biofouling is just one of the issues facing cabled ocean observatories. Credit: UW/NSF-OOI/WHOI; V16

One attendee described accumulated corrosion on connectors after a yearlong deployment (Figure 1), which may indicate that performance has been compromised. Another common failure discussed at the meeting was seawater intrusion into “atmospheric” housings. These housings enclose their contents in dry gas environments at or near normal surface-level pressures. Seawater leaking into an electronics housing creates a dangerous situation beyond the simple destruction of the electronics. Electrolysis can produce hydrogen and oxygen at seafloor pressures, and a compromised housing can spontaneously explode when brought to the surface or weeks later during postrecovery inspection.

Human interference, both purposeful and accidental, is an ongoing problem, attendees noted. Observatories using surface buoys regularly find their equipment vandalized, stolen, or damaged by human activity. Subsea equipment is subject to damage by fishing operations. In Figure 2, a node in the Northeast Pacific Time-Series Undersea Networked Experiments cabled system (NEPTUNE) is askew, likely due to a trawl net catching the lower edge of what was billed to be a trawl-resistant frame. This incident revealed two failures. The first was nontechnical: The trawler was in a marked no-operations area. The other was a design fault where no latches were installed to keep the node in the frame.

Fig. 1. Connectors suffering damage following a yearlong deployment. (left) Pressure-balanced oil-filled electrical connector that leaked oil and opened a conduction path to seawater. Green is probably copper corrosion. Also note heavy corrosion on the connector in the background due to the stainless-steel snap ring used in the assembly of a titanium connector shell. (middle) Instrument connector showing unknown red deposits. (right) Camera connector that delaminated, exposing power pins to seawater. Credit: Eric McRae, Applied Physics Laboratory, University of Washington

After attendee presentations, working groups were formed to tackle issues in three categories: cables and connectors, systems, and testing and operations. The groups brainstormed ways to gain improvements in these arenas. The participants agreed that it was useful to share experiences, and they look forward to continued collaboration to advance the state of the art in ocean observing.

Fig. 2. A Northeast Pacific Time-Series Undersea Networked Experiments cabled system (NEPTUNE) trawl-resistant frame with its node ejected. A trawl net had caught on the lower edge of the trawl-resistant frame. That tilted the entire assembly, and the node fell out of the frame. Credit: Ocean Networks Canada and Global Marine Systems

Their recommendations, and general discussions of the larger group, will be detailed in the full workshop report, which will be posted on the workshop website when completed. More information about the workshop agenda and presentations can be found on the workshop’s website.

The workshop was supported by the National Science Foundation Ocean Technology and Interdisciplinary Coordination program with a grant to the University of Hawai‘i. The Monterey Bay Aquarium Research Institute graciously made their facilities available.

—Bruce M. Howe (email: bhowe@hawaii.edu), University of Hawai‘i at Mānoa, Honolulu, Hawaii; and Eric McRae, Applied Physics Laboratory, University of Washington, Seattle

A Sea Change in Paleoceanography

Mon, 05/22/2017 - 11:51

After 32 years of existence, the journal Paleoceanography is changing its name. On January 1, 2018, it will become Paleoceanography and Paleoclimatology. The new name reflects the growth and evolution of a field of research over the years; it is not a major change of course, nor a break with the journal’s history.

Reconstructed Arabian Sea summer wind stress curl anomalies (for the sea surface) and Indian summer monsoon rainfall (over land) for 10, 8, 6, 4 and 2 thousand years ago, presented as % departures from the present day (1981-2010) climatology. Data sources used to construct this figure include lake records, cave records, peat, marine salinity and/or discharge, marine biological productivity, and reconstructed sea surface temperatures. Credit: Gill et al., 2017

In 1986, Jim Kennett, Paleoceanography’s founding editor, asked for contributions dealing with all aspects of understanding and reconstructing Earth’s past climate, biota and environments, while emphasizing global and regional understanding. At the time, such research papers were based dominantly on the marine sedimentary record, with study materials commonly supplied by scientific ocean drilling.

Since then, the technologies of sampling, sample analysis, data analysis and model development have evolved greatly and rapidly. Articles in Paleoceanography today routinely compare and combine proxy records from ice cores, speleothems, terrestrial sediments and/or lake deposits with multiple stacked proxy records from marine sediment cores, while data are integrated into a broad spectrum of geochemical, earth system, ecosystem and climate models.

The process of recognizing this de facto expansion in the scope of Paleoceanography has taken a few years. It was started in 2014 by then editor-in-chief Chris Charles, who announced in Eos that the journal was expanding to ‘embrace all aspects of global paleoclimatology’. The journal’s name was amended (informally) to “Paleoceanography: An AGU Journal exploring Earth’s Paleoclimate.” New Associate Editors with a broad variety of expertise joined the editorial board.

Finally, after discussions at the 2016 AGU Fall Meeting, the leadership of the AGU Focus Group Paleoceanography & Paleoclimatology, together with the journal editors, organized a survey to gauge the community’s opinion. A large majority (~65%) of the 751 respondents was in favor of a change in the name of the journal.

Inserting the word ‘climate’ into the name allows us to celebrate the growth and evolution of our scientific undertaking. Understanding climates of the past has been an integral part of earth sciences since their early days. Lyell (1830–1833) devoted three chapters in ‘Principles of Geology’ to cyclically changing climates (as shown by fossil distributions), influenced by the position of the continents: the present as key to the past. Chamberlin (1906) wondered how Earth’s climate could have remained sufficiently stable to allow life to persist, ‘without break of continuity’, writing that ‘On the further maintenance of this continuity hang future interests of transcendent moment’. With foresight, he argued that for such continuity to persist ‘a narrow range of atmospheric constitution, notably in the critical element carbon dioxide, has been equally indispensable’.

In the near future, we may move outside the range of atmospheric CO2 concentrations that has prevailed for tens of millions of years, as documented in a number of papers using various proxies, quite a few of them published in Paleoceanography. We now use, in addition to fossils, a broad and growing range of stable isotope compositions, trace element concentrations, and organic biomarkers in fossils and sediments as quantitative proxies for a growing number of environmental properties (e.g., temperature, oxygenation, pH, pCO2).

In our present time of environmental change, it is, more than ever, important to use proxy data on Earth’s past in order to evaluate Earth’s future, thus making our past a key guide to our future.

Paleoceanography has always aimed to publish thorough, innovative studies that add to our understanding of the planet on which we live and the past variability in its environments over the full range of Earth history. It will continue to do so under its new name. Any paper submitted after July 1, 2017, will be considered under the new title, and all papers accepted after December 1, 2017, will be published under the new title.

—Ellen Thomas, Department of Geology and Geophysics, Yale University; email: ellen.thomas@yale.edu

Rural Areas Becoming Less Water Efficient over Time

Fri, 05/19/2017 - 11:22

A nationwide analysis of water use over the past 30 years finds that there is a disconnect between rural and urban areas, with most urban areas becoming more water efficient and most rural areas becoming less and less efficient over time.

“Understanding water use is becoming increasingly important, given that climate change is likely to have a profound impact on the availability of water supplies,” said Sankar Arumugam, a professor of civil, construction and environmental engineering at North Carolina State University in Raleigh and lead author of a new study on the work. “This research helps us identify those areas that need the most help, and highlights the types of action that may be best suited to helping those areas.”

The new paper in Earth’s Future, a journal of the American Geophysical Union, stems from a National Science Foundation–funded, interuniversity research project that focuses on understanding how water sustainability in the United States has changed over the past 30 years because of climate change and population growth.

For this paper, researchers evaluated water use data at the state and county level for the 48 contiguous states. Specifically, the researchers looked at water-use efficiency, measured as per capita consumption, in 5-year increments, from 1985 to 2010.
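The efficiency metric described here can be sketched as a simple count of 5-year steps in which per capita use declined; the county values below are invented for illustration, not study data.

```python
# Efficiency trend for one county: per capita water use in 5-year
# increments, scored by how many increments saw a decline.

def count_improving_steps(per_capita_use):
    """Number of consecutive 5-year steps in which per capita use fell."""
    return sum(
        1 for earlier, later in zip(per_capita_use, per_capita_use[1:])
        if later < earlier
    )

# Hypothetical county, gallons per person per day, 1985-2010:
use = [190, 185, 178, 180, 172, 165]
steps = count_improving_steps(use)
print(f"{steps} of {len(use) - 1} steps improved")  # 4 of 5 steps improved
```

A county improving in most or all steps reads as gaining efficiency; one declining in most steps reads as falling behind, the pattern the study found in many rural counties.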

Patterns of water use efficiency across the continental United States. Color-coded values indicate the change in per-capita consumption in gallons per day per person between 1985 and 2010. Large numbers indicate the number of years per-capita withdrawals decreased from 1985 to 2010. Credit: AGU/Earth’s Future/Sankar Arumugam

“This is the first systematic evaluation of water use across the continental U.S.,” Arumugam said. “We found that some states—including Washington, Pennsylvania and Wyoming—were becoming more efficient every five years. Meanwhile, other states—such as South Carolina, Oklahoma and Mississippi—have gotten worse every five years.”

But a look at the county-level data reveals what may be the most important finding: most rural counties are getting less efficient, while most urban counties are getting more efficient.

“In other words, as we are facing a more uncertain future regarding water resources, rural counties are being left behind,” Arumugam said.

The researchers found that investment in new water-efficiency technologies, and retrofitting existing water infrastructure, are big reasons for the improvement in urban areas.

“Rural counties appear to lack the resources, the political will, or both, to keep pace,” Arumugam said.

Another important finding is that technologies and strategies focused on efficiency—as opposed to large-scale projects, such as building new reservoirs—have been extremely successful. These efforts have allowed urban areas to avoid sharp increases in water use, even as their populations have grown significantly.

“There may be a role for huge infrastructure projects at some point, but these findings underscore the value of focusing on efficiency measures—and the need to pursue those measures in rural counties,” Arumugam said.

Scientists, Policy Makers Push for Mars Exploration

Fri, 05/19/2017 - 11:21

Going to Mars won’t be easy, “even if we sent Matt Damon,” star of the 2015 film The Martian, U.S. Sen. Ted Cruz (R-Texas) quipped at a Tuesday forum about deep-space exploration held in Washington, D. C.

But the venture is worth doing, helps unify and propel space exploration going forward, and is codified in the NASA Transition Authorization Act of 2017 (S. 442) that President Donald Trump signed into law in March, said Cruz, chair of the Senate Subcommittee on Space, Science, and Competitiveness. He sponsored the legislation, which calls for a human exploration road map that includes “the long-term goal of human missions near or on the surface of Mars in the 2030s.”

Although Cruz said that NASA space exploration should not come at the expense of the agency’s Earth science missions, he said Earth science is not central to NASA. “There are a host of agencies that do science research, that have a science focus. That’s not NASA’s central mission,” he said. “Space exploration is NASA’s central mission, and I certainly am doing everything I can to encourage as many resources as possible [and] as much of NASA’s leadership to be focused on exploration.”

The forum, sponsored by the Atlantic magazine, focused on the issues of sending astronauts to deep space, including Mars; efforts to support commercial space endeavors; the challenge of retaining American leadership in space; and bipartisan support for space exploration.

Bipartisanship on Space Exploration

In an intensely partisan environment, Cruz said that there is bipartisan commitment to American leadership in space. “There are not many issues to which there is bipartisan commitment, but that’s one, and I think that’s very good for those of us who care about continuing to explore space.”

That bipartisan congressional and administration support stems in part from people looking at the agency as a symbol of leadership for the country, said Robert Lightfoot, acting administrator of NASA.

“It’s written in our hearts: We want to explore, we want to push forward. And I think NASA is probably the symbolic piece of that,” he said.

Another reason for agency support is that NASA is “basically changing textbooks,” he noted. The pursuit of knowledge and scientific discovery intrigues people and is “different than some of the other things most people talk about in government,” Lightfoot added.

NASA is “basically changing textbooks,” said the agency’s acting administrator Robert Lightfoot in conversation at the forum with Alison Stewart of AtlanticLive. Credit: Kristoffer Triplaar/The Atlantic

The authorization act, he said, also provides a sense of constancy of support for the agency and for long-term projects such as the International Space Station and efforts to journey to Mars. NASA fared well in the fiscal year (FY) 2017 budget, signed into law on 5 May, and the agency hopes for steady support in the FY 2018 budget that the administration plans to release next week.

Getting to Mars calls for a number of intermediate steps that NASA has outlined. These include using the International Space Station in low Earth orbit as a “local” proving ground to learn about needed technologies and the impacts of a microgravity environment on the human body, and then building infrastructure in the vicinity of the Moon to test and improve technologies.

But Lightfoot emphasized that the momentum is there to go to Mars. “Can you imagine the first steps on Mars?” Lightfoot asked. He said it could be a “civilization level change event for us,” just as the first steps on the Moon were. “I tell my guys all the time, ‘You’re making history. You just don’t know it.’”

A Unique Moment

Ellen Stofan, former chief scientist for NASA, said at the forum that now is “a unique moment” for pushing on toward Mars. “We know where we want to go, we understand the path of technologies that we need to get there, we think there’s an affordable plan…and I think you’ve got broad public support.”

Robert Zubrin agreed at the forum that Mars should be the destination. “Mars is where the science is, Mars is where the challenge is, Mars is where the future is,” said Zubrin, founder and president of the Mars Society, a Lakewood, Colo.–based organization that promotes the exploration and settlement of Mars.

However, he complained that “what we have right now is just drift, it is not a program,” and he questioned the necessity of some intermediate steps, such as a lunar-orbiting space station, to get to Mars. “Right now, NASA is not spending money to do things. It is doing things to spend money,” Zubrin said. “We need leadership at the top of government and at the top of NASA, and we don’t have it right now.”

Support for the Space Industry

At the forum, Cruz announced that his Senate committee will hold a hearing on 23 May to revisit the Outer Space Treaty to see how the treaty can help expand commerce and settlement in space. The treaty, which entered into force 50 years ago, provides the basic framework for international space law.

He predicted that the first trillionaire would be a person in the space exploration world “who invests and makes discoveries in space that we cannot even envision.”

Space Leadership

The senator tied American efforts in space not just to the spirit of exploration and economic opportunities but also to national security and the safety of the nation’s satellites that GPS and other critical technologies rely on.

“The development other countries are making in space weaponry to take out our communication equipment is truly chilling,” he said. “Some of the classified briefings would take your breath away at the potential threats we face.” He called for “serious investments” to address that vulnerability.

—Randy Showstack (@RandyShowstack), Staff Writer

Why Is There So Much Carbon Dioxide in Rivers?

Fri, 05/19/2017 - 11:18

Studies have shown that many of the world’s freshwater rivers and streams are oversaturated with carbon dioxide. Yet the physical and chemical mechanisms that allow carbon dioxide to be released from waterways into the atmosphere are not well understood.

This is especially important given freshwater systems’ role in the global carbon cycle: Freshwater rivers are important components in the linkage of carbon transfer between terrestrial ecosystems, the atmosphere, and the oceans. In 2013, scientists discovered that freshwater rivers and streams release about 5 times more carbon dioxide into the atmosphere than all the world’s lakes and reservoirs combined, a much higher amount than previously believed.

A U.S. Geological Survey scientist collects carbon dioxide data near Fifty-Six, Ark. Credit: Sydney Wilson, U.S. Geological Survey

To learn more about the processes that regulate carbon dioxide levels in freshwater rivers and streams, Stets et al. collected water samples from about 100 field sites throughout the contiguous United States and matched these with thousands of observations by other researchers. They examined concentrations of carbon dioxide, dissolved oxygen, and other water quality parameters in the samples and compared them to data from the U.S. Geological Survey (USGS).

Ecosystem respiration consumes oxygen and produces carbon dioxide, so the researchers expected that as oxygen levels decreased, carbon dioxide levels would increase. They observed this relationship, but there was usually much more carbon dioxide than expected, which was puzzling.

An important breakthrough came when the researchers examined their results from the perspective of carbon dioxide in the oceans. In ocean waters with high buffering, that is, a greater capacity to neutralize acid, carbon dioxide and oxygen behave very differently than they do in waters with low buffering. For one, much more excess carbon dioxide lingers in highly buffered ocean water.

The researchers found that the degree of carbonate buffering likewise controls the amount of excess carbon dioxide in fresh surface water. Alkalinity is a good marker for the degree of buffering and is thus an important consideration when examining carbon dioxide in rivers.

Further exploring these processes will tell us more about how carbon, as well as nutrients and other pollutants, flows through aquatic ecosystems. (Global Biogeochemical Cycles, https://doi.org/10.1002/2016GB005578, 2017)

—Sarah Witman, Freelance Writer

Tornado Casualties Depend More on Storm Energy Than Population

Thu, 05/18/2017 - 11:57

When a dark, swirling funnel cloud dips toward the ground, people living in a U.S. region in and near the Great Plains popularly known as Tornado Alley know to move to a safe spot. Tornadoes can destroy concrete buildings and send railcars rolling, and these violent windstorms account for roughly 20% of natural hazard–related deaths in the United States.

Despite tornadoes’ danger, the correlations among the number of storm-related casualties, a twister’s energy, and the size of the population in its path are not well understood. Better understanding of those relationships could help scientists, policy makers, and emergency management personnel predict future tornado deaths and injuries based on trends in population growth and tornado activity. Now researchers have used a principle of economics to show that a tornado’s casualty count depends more strongly on the energy of the storm than on the size of the local population.

This study is “likely to spur conversation and additional research,” said Todd Moore, a physical geographer at Towson University in Towson, Md., not involved in the study. “It provides a framework that can be modified to include additional risk variables.”

Fear Becomes an Obsession

Tyler Fricker grew up hearing his father’s stories of the 1974 Xenia, Ohio, tornado that killed 33 people and injured more than 1000 others. Fricker, now a geographer at Florida State University in Tallahassee and the lead author of the new study, has also lived through a few tornadoes of his own. He explains his fascination with tornadoes as “fear becoming an obsession.”

In the new research, he and his colleagues analyzed 872 casualty-causing tornadoes that swept through parts of the United States between 2007 and 2015. They defined “casualty” as a death or injury related to a storm. “By understanding tornado behavior better…we get a deeper understanding of what may be causing the death and destruction we see in these storms,” said Fricker.

The team borrowed a principle of economics known as elasticity to investigate how a tornado’s casualty toll scaled with its energy and the size of the nearby population. Elasticity is commonly used by economists to investigate how two measurements—for example, supply and demand—are related.

The researchers used National Weather Service data to determine the energy dissipated by a tornado. They calculated this energy as proportional to the area of a tornado’s path multiplied by its average wind speed raised to the third power. Knowing this quantity for each tornado allowed the team to uniformly define the intensity of each storm. The researchers then collected population measurements in roughly 1 × 1 kilometer squares for the path of each tornado using a database of world population maintained by Columbia University.
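The energy proxy described above (path area times mean wind speed cubed) can be written down directly. This is a hedged sketch of that proportionality, not the study’s actual code; the function name, the example numbers, and the unit air-density factor are all illustrative assumptions.

```python
# Illustrative sketch: a quantity proportional to a tornado's dissipated
# energy, following the article's description (path area x mean wind
# speed cubed). The density factor is a placeholder constant.
def tornado_energy(path_length_m, path_width_m, mean_wind_ms, air_density=1.0):
    """Return a value proportional to the energy dissipated along the path."""
    area_m2 = path_length_m * path_width_m
    return air_density * area_m2 * mean_wind_ms ** 3

# Hypothetical example: a 10-km-long, 500-m-wide path with 50 m/s mean winds
e = tornado_energy(10_000, 500, 50.0)
```

Because the quantity depends on wind speed cubed, modest differences in wind speed dominate differences in path size, which is why it serves as a uniform intensity measure across storms.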

Predicting Casualties

Armed with these two measurements and the published casualty counts for each of the tornadoes in their sample, Fricker and his colleagues investigated how casualties scaled with storm energy and the size of the nearby population. The scientists found that storm energy was a better predictor of the number of storm-related injuries and deaths: Doubling the energy of a tornado resulted in 33% more casualties, but doubling the population of a tornado-prone area resulted in only 21% more casualties. These results, which the team reported last month in Geophysical Research Letters, can inform emergency planning, the team suggests.
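The reported doubling effects translate into power-law elasticities, which makes the comparison concrete. The sketch below is an interpretation of the article’s 33% and 21% figures, not the paper’s regression model, and the function name is hypothetical.

```python
import math

# Hedged sketch: if doubling energy raises casualties by 33% and doubling
# population raises them by 21%, the implied power-law exponents are
# log2(1.33) and log2(1.21).
b_energy = math.log2(1.33)      # ~0.41
b_population = math.log2(1.21)  # ~0.28

def casualty_scale(energy_factor, population_factor):
    """Multiplicative change in expected casualties when energy and
    population are each scaled by the given factors."""
    return energy_factor ** b_energy * population_factor ** b_population
```

Under this reading, `casualty_scale(2, 1)` returns about 1.33 while `casualty_scale(1, 2)` returns about 1.21, reproducing the finding that storm energy is the stronger predictor.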

The relatively larger impact of tornado energy on casualties might be cause for concern, Fricker and his colleagues note. If climate change is triggering more powerful tornadoes, an idea that’s been suggested and debated, emergency managers might have to contend with larger casualty counts in the future. But scientists are by no means certain that larger tornadoes are imminent. “There is no doubt climate change is influencing hazards, but for tornadoes, we just simply don’t know to what extent yet,” said Stephen Strader, a geographer at Villanova University in Villanova, Pa., not involved in the study.

It is “far more likely” that the population will double in the future rather than the tornado energy, notes Victor Gensini, a meteorologist at the College of DuPage in Glen Ellyn, Ill., who was not involved in the study. Effective communication and good city planning might help reduce storm-related casualties, Fricker and colleagues suggest. “It’s hard to control the behavior of tornadoes, but it’s somewhat within our control to smartly advance how we organize cities and suburbs,” said Fricker.

Many More Factors

Of course, changes in storm energy and population can’t fully explain all variations in storm-related deaths or injuries. “There are also more factors that combine to determine a casualty, one of the most important being what type of structure a person is in when the tornado strikes,” said Gensini.

Fricker said he and his colleagues are looking forward to examining factors such as how a victim’s age, socioeconomic status, and race might correlate with vulnerability to harm from a tornado. “Maybe we’ll be able to profile communities more susceptible to casualties based on all of these other determinants,” said Fricker.

The team hopes that their findings will be useful to emergency personnel, who could target these most vulnerable populations when they spread information about tornado preparedness, for example. After all, “you might have only 10 or 15 minutes to get to a safe spot,” said Fricker.

—Katherine Kornei (email: hobbies4kk@gmail.com; @katherinekornei), Freelance Science Journalist
