GeoSpace: Earth & Space Science

By AGU staff and collaborators

Study finds increased moisture facilitated decline in African fires

Thu, 06/27/2019 - 18:28

By Leigh Cooper

MOSCOW, Idaho — The amount of area burned across Africa declined by 18.5 percent between 2002 and 2016, according to a new study, and this reduction was likely driven by an increase in plant-available moisture and not solely changes in human behavior, as previous studies have found.

The study was published in the AGU journal Geophysical Research Letters.

A prescribed burn travels across the Kruger National Park, South Africa. Photo by Luigi Boschetti/University of Idaho.

Africa is the most fire-prone continent in the world, accounting for more than half of the globe’s burned area and fire-related greenhouse gas emissions. Over the past 15 years, however, satellite observations indicate Africa has led a worldwide decline in the amount of burned area.

“In past studies, people have assumed that humans drive trends in fire activity in Africa. But, when you think about the main drivers of fire, you usually think of climate. Up until now, researchers haven’t found strong connections between fire and climate in Africa,” said Maria Zubkova, a University of Idaho doctoral student in the College of Natural Resources and lead author of the study. “We wanted to understand the relationship between fire, climate, vegetation and humans in Africa.”

Using satellite and climate data, the researchers analyzed changes in fire activity in Africa and investigated the impact climate and human factors had on fire trends.

The team found the size of burned area in Africa averaged approximately 1 million square miles a year, roughly four times the size of Texas, from 2002 to 2016. However, the burned area declined by 18.5 percent across the study period, or roughly 200,000 square miles – about twice the size of Oregon. Most of this reduction occurred in the northern half of the continent, where savanna transitions into tropical forest.
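Those comparisons are easy to sanity-check. Below is a quick back-of-envelope calculation in Python; the state areas are standard reference values, an assumption of this sketch rather than numbers from the study.

```python
# Back-of-envelope check of the burned-area figures reported above.
# State areas are assumed standard reference values, not from the study.
texas_sq_mi = 268_596   # assumed total area of Texas, square miles
oregon_sq_mi = 98_379   # assumed total area of Oregon, square miles

mean_burned = 1_000_000          # average annual burned area, square miles
decline = 0.185 * mean_burned    # the reported 18.5 percent decline

print(mean_burned / texas_sq_mi)   # ~3.7, i.e. roughly four times Texas
print(decline / oregon_sq_mi)      # ~1.9, i.e. about twice Oregon
print(decline)                     # ~185,000 sq mi, reported as "roughly 200,000"
```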

Roughly two-thirds of the reduction occurred on natural lands — savannas and forests. Zubkova and her colleagues estimate 71 percent of the decline in fire in these areas could be linked to an increase in plant-available moisture. A rise in moisture availability would likely reduce the flammability of vegetation, Zubkova said, especially in wet savannas.

Only about a third of the reduction in burned areas occurred in croplands. The study did not attempt to attribute this trend to specific human behaviors, such as increases in cropland and grazing land, fire prevention or roads, which can reduce the spread of fire.

“We’ve never had a long record of fire in Africa,” said Luigi Boschetti, a University of Idaho professor and co-author on the study. “With these satellite images, this is really the first time we have had enough data to search for trends.”

Contrary to previous studies, the researchers found changes in climate, especially variables that influence moisture availability, explain a larger portion of the decline in burned area in Africa than human pressures do, and that changing climate patterns and increased human pressure are together responsible for the decline in fire activity.

“At some point we will be able to use this data for fire predictions, but first we simply needed to characterize what trends are happening and where these trends are happening,” Boschetti said.


Leigh Cooper is a Science and Content Writer for University of Idaho Communications and Marketing. This post was originally published on the University of Idaho website. 


Study shows how to produce natural gas while storing carbon dioxide

Thu, 06/27/2019 - 16:51

By Constantino Panagopulos

New research shows that injecting air and carbon dioxide into methane ice deposits buried beneath the Gulf of Mexico could unlock vast natural gas energy resources while helping fight climate change by trapping the carbon dioxide underground.

The study, published May 26 in the AGU journal Water Resources Research, used computer models to simulate what happens when mixtures of carbon dioxide and air are injected into deposits of methane hydrate, an ice-like, water-rich chemical compound that forms naturally in high-pressure, low-temperature environments, such as deep in the Gulf of Mexico and under Arctic permafrost.

Gas hydrates, shown here on the Gulf of Mexico floor, are an ice-like material that form naturally under extreme pressure in low temperature environments where water is abundant. A new study from The University of Texas at Austin has shown that hydrates under the Gulf floor can be tapped for energy and provide safe storage for greenhouse gas emissions. Credit: NOAA

Lead author Kris Darnell, a recent doctoral graduate from the University of Texas Jackson School of Geosciences, said the research is the next step in solving two significant global challenges: energy security and carbon storage.

“Our study shows that you can store carbon dioxide in hydrates and produce energy at the same time,” said Darnell.

In the process, the nitrogen in the injected air sweeps the methane toward a production well and allows carbon dioxide to take its place, researchers said. The beauty of this approach is that it extracts natural gas from methane hydrate deposits and at the same time stores carbon dioxide, a greenhouse gas, in a deep environment where it is unlikely to escape into the atmosphere and contribute to climate change.

The study’s lead author, Kris Darnell, at the University of Texas Pressure Core Center in the Jackson School of Geosciences, the only university-based facility that can study methane hydrate cores under pressure. The lab allows researchers to study methane hydrate under the same environmental conditions in which they are found. Credit: UTIG

This is not the first time that hydrate deposits have been proposed for carbon dioxide storage. Earlier attempts either failed or produced lackluster results. The new study breaks down the physics behind the process to reveal why previous attempts failed and how to get it right.

The next step, said Darnell, is to test their findings in a lab. The method is currently being tested in a specialized facility at the University of Texas, which is one of the few in the world that can store and test methane hydrate. The work is being led by Peter Flemings and David DiCarlo, who are co-authors on the paper.

“Two things are really cool. First, we can produce natural gas to generate energy and sequester CO2,” said Flemings. “Second, by swapping the methane hydrate with CO2 hydrate, we disturb the (geologic) formation less, lowering the environmental impact, and we make the process energetically more efficient.”

If the process can be shown to work in the field on an industrial scale, it has enormous potential.

Methane hydrate is one of a group of chemical compounds known as gas hydrates in which gas molecules become trapped inside cages of water ice molecules rather than chemically bonding with them. Researchers are studying naturally forming methane hydrates with the aim of figuring out their potential as an energy resource.

Estimates suggest that methane harvested from hydrate deposits found beneath the Gulf of Mexico alone could power the country for hundreds of years.

In the paper, the authors showed that a process in which one type of molecule trapped in hydrate is exchanged for another (called guest molecule exchange) is a two-stage process and not a single, simultaneous process, as it was previously thought to be.

First, nitrogen breaks down the methane hydrate. Second, the carbon dioxide crystallizes into a slow-moving wave of carbon dioxide hydrate behind the escaping methane gas.

The computer simulations indicate that the process can be repeated with increasing concentrations of carbon dioxide until the reservoir becomes saturated. The authors said that unlike some methods of carbon storage, this provides a ready incentive for industry to begin storing carbon dioxide, a major driver of climate change.

“We’re now openly inviting the entire scientific community to go out and use what we’re learning to move the ball forward,” Flemings said.

Constantino Panagopulos writes for the University of Texas Institute for Geophysics (UTIG). This post was originally published on the University of Texas website.


Climate change is transforming northernmost Arctic landscapes

Wed, 06/26/2019 - 16:44

By Joshua Learn

Isachsen, a permafrost monitoring site that sits at a latitude of 78 degrees north on the Arctic Canadian island of Ellef Ringnes, seemed like the last place that would feel the effects of climate change. But a new study in the AGU journal Geophysical Research Letters shows warmer-than-usual periods are causing frozen ground to thaw earlier than scientists expected.

Isachsen is extremely remote and plagued by unpredictable weather, so researchers struggle to get there with any regularity. It is a two-hour flight from the nearest weather station at Prince Patrick Island, and researchers trying to make the extra jump between the islands have to carry enough fuel for two attempts since the chances of weather being clear enough to land on Ellef Ringnes are slim.

A. Map of the study area showing permafrost monitoring site locations. B. Examples of the even terrain at each site at the start of monitoring (2003, 2004, 2005) and the terrain after a decade of monitoring (2016). Thermokarst development was observed at all sites. C. Examples of thermokarst topography and landforms observed at each site in 2016: i), ii) subsidence and trough formation at Isachsen, iii) trough formation and pond development at Mould Bay, iv) subsidence and trough formation at Green Cabin. All images are taken from within 500 m of the permafrost monitoring station with the exception of iv which was taken aerially but includes the monitoring station within the frame. Credit: AGU

In 2006, researchers from the Geophysical Institute Permafrost Lab at the University of Alaska Fairbanks reached the island. The landscape around the site at Isachsen and two other monitoring stations on Banks and Prince Patrick islands was flat, with little topography. The researchers were not able to return to Isachsen until 2013.

“In 2013 the landscape was completely different, and that’s when we got excited,” said Vladimir Romanovsky, a geophysicist at the Geophysical Institute Permafrost Lab at the University of Alaska Fairbanks.

The landscape at Isachsen had gone from basically flat to a hummock-filled landscape full of meltwater ponds generally referred to as thermokarst – the topography formed from the thawing of ice-rich permafrost and melt of ground ice.

“The system is already undergoing changes that were projected to occur in many decades,” said Louise Farquharson, a postdoctoral researcher at the Geophysical Institute Permafrost Lab at the University of Alaska Fairbanks and the lead author of the new study.

Permafrost is a broad term that refers to ground that remains below freezing for two or more years. Permafrost isn’t always rich in ice, but when ice-rich permafrost thaws, the landscape can take on a varied topography of hummocks and water-filled hollows formed through the melting of this ground ice.

Thawing permafrost and the landscape changes it prompts are common thousands of miles to the south of Isachsen, where ground that formerly stayed cold enough year-round is now thawing due to longer, hotter summers. But many researchers believed the northernmost reaches of the Arctic were still immune to the warming effects of climate change.

“This brings cold permafrost to the story basically,” Farquharson said of the new research.

Equipment installed at Isachsen and other northern monitoring sites in 2003 revealed that since 2006 the area had been subjected to summer temperatures 200 to 300 percent higher than anything recorded since the 1970s, when regional stations began keeping weather records, Farquharson said.

Researchers were surprised by how quickly the landscape changes occurred.

“Just a few warm summers can change everything,” Romanovsky said. “We were very lucky because we just set up everything and it just happened before our eyes.”

The permafrost beneath the thermokarst terrain at the High-Arctic sites is still thermally stable at about -15 degrees Celsius (5 degrees Fahrenheit), and Romanovsky said the area is still a long way from thawing like permafrost farther south, but the physical changes show that a few warm summers in a row can have a huge transformative effect on ice close to the surface.

He said that while there isn’t much human infrastructure in the area, this type of understudied change to the topography could affect oil pipelines in Alaska or other human infrastructure formerly believed to be outside the reach of the current effects of climate change.

Farquharson said the thawing of permafrost can result in the release of more carbon into the atmosphere, accelerating climate change. The release of sediments within the permafrost can also have transformative effects on the way waterways flow in the area, which could have far-reaching ecological consequences for the wildlife and vegetation in the region.

“It basically completely changes the habitat. You’re going from a very dry level surface to something that is covered in thermokarst with ponds popping up,” Farquharson said, adding that this may provide benefits for some species like water birds but negative impacts to other wildlife.

— Joshua Learn (@JoshuaLearn1) is a freelance science writer. 


Ice-squeezed aquifers might create marsquakes

Tue, 06/25/2019 - 14:00

By Larry O’Hanlon

As the Mars InSight lander begins listening to the interior of Mars, some scientists are already proposing that some marsquakes could be signals of groundwater beneath the frozen surface of the Red Planet. 

The idea, proposed by Michael Manga, a planetary scientist at the University of California at Berkeley, and his colleagues, is that Mars could be experiencing quakes a lot like those being felt in Oklahoma and Texas due to wastewater injections from fracking. 

A view from the “Kimberley” formation on Mars taken by NASA’s Curiosity rover. The strata in the foreground dip towards the base of Mount Sharp, indicating flow of water toward a basin that existed before the larger bulk of the mountain formed. NASA/JPL-Caltech/MSSS

On Earth, water from fracking is injected deep into the ground where it increases the pressure in the pores — tiny spaces between the grains that make up the ground. That pressure can loosen up faults and cause them to slip and send vibrations — the shaking of an earthquake — far and wide. 

On Mars, it might also be about the pore pressure, said Manga, who is the lead author of a paper describing the hypothesis in the AGU journal Geophysical Research Letters. But instead of fracking, Manga proposes that the wintry temperatures of Mars’s surface might penetrate downward into liquid groundwater, freezing its top layers and compressing the still-liquid water below. That pressurized groundwater could be loosening faults on Mars and causing just the sorts of shallow marsquakes that have already been detected by the InSight lander, he said. 

As Mars cools, the boundary between frozen ground and liquid water in aquifers moves downward. The volume expansion upon freezing will compress the remaining liquid water and increase pore pressure in aquifers.
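The basic mechanics can be illustrated with a back-of-envelope calculation: water expands roughly 9 percent by volume when it freezes, so in a sealed, rigid aquifer the expanding ice must squeeze the remaining liquid. The sketch below uses standard water properties and assumed freezing fractions for illustration only (it is not the authors’ model, which accounts for real aquifer compliance).

```python
# Rough, illustrative estimate of how freezing the top of a sealed, rigid
# aquifer could pressurize the liquid water below. NOT the authors' model;
# the freezing fractions are assumptions chosen purely for illustration.
K_WATER = 2.2e9       # bulk modulus of liquid water, Pa
EXPANSION = 0.09      # water expands ~9% by volume when it freezes

def pressure_rise(frozen_fraction):
    """Pressure increase when a fraction of the aquifer's water freezes.

    The expanding ice squeezes the remaining liquid; in a perfectly rigid,
    sealed pore space the liquid must absorb the extra volume elastically,
    so delta_P = K * (dV / V).
    """
    imposed_strain = EXPANSION * frozen_fraction / (1.0 - frozen_fraction)
    return K_WATER * imposed_strain

for f in (0.001, 0.01, 0.05):
    print(f"{f:.1%} frozen -> ~{pressure_rise(f)/1e6:.1f} MPa of overpressure")
```

Even freezing one percent of the water in such an idealized aquifer yields pressure changes of a couple of megapascals, comparable in spirit to the pore-pressure perturbations implicated in injection-induced earthquakes on Earth.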

That’s not the entire story, however, because marsquakes need triggers. By modeling their hypothetical ice-squeezed aquifers, Manga and his colleagues found that the two likely triggers of marsquakes are the tidal tugs of Mars’s moon Phobos and the Sun, and barometric pressure changes. The latter is caused by the warming and cooling of Mars’s thin atmosphere by the Sun. 

If Manga and his colleagues are correct, InSight should start to detect a pattern of marsquakes that matches changes in the tidal forces and the barometric pressure. If that happens, it could be taken by some as evidence of deep, pressurized groundwater. If that water really exists, future Mars explorers might be able to drill down to it, and the water would come shooting out of the ground under its own pressure, like an artesian spring, Manga said. 

The hypothesis might also explain some of the features — like icy ridges and ice volcanoes — seen on icy moons in the solar system, he said. 

And if the patterns of marsquakes do not fit the pattern of pressurized groundwater? “Either way, the answer is fascinating,” said Manga.

Larry O’Hanlon is a freelance science writer and AGU’s Blog manager.  


Atmospheric rivers getting warmer along U.S. West Coast

Mon, 06/24/2019 - 13:08

By Mary Caperton Morton

Most of the West Coast of the United States relies on a healthy winter snowpack to provide water through the dry summer months. But when precipitation falls as rain rather than snow, it can diminish summer water supplies, as well as trigger floods and landslides.

A new study in AGU’s Journal of Geophysical Research: Atmospheres finds atmospheric rivers – plumes of moisture that deliver much of the west’s precipitation – have gotten warmer over the past 36 years.

Warmer atmospheric rivers generally produce more rain than snow, potentially causing problems for the region, according to Katerina Gonzales, an atmospheric scientist at Stanford University and lead author of the new study.

The massive Big Sur landslide on the coast of California was triggered by heavy rains. Credit: Bob Van Wagenen/USGS

“The west coast relies on atmospheric rivers as a source of precipitation and for much of this region, it’s really important that this precipitation falls as snow, rather than rain,” she said.

To study whether atmospheric rivers are warming in response to climate change, Gonzales and colleagues at Stanford, University of California-Los Angeles and Colorado State University in Fort Collins turned to 36 years of temperature data collected before, during and after atmospheric river events made landfall on the western U.S.

By combining observational data with models that can track the plumes backwards from where they made landfall to where they originated, the team was able to quantify the temperature of each of the atmospheric rivers that reached the west coast between 1980 and 2016.

“We found warming of atmospheric rivers at both seasonal and monthly scales,” Gonzales said. During the study period, temperatures during landfall rose between 0.69 and 2 degrees Celsius (1.24 to 3.6 degrees Fahrenheit), with the most widespread warming occurring between the months of November and March.

(a) Tracks for all West Coast landfalling atmospheric rivers (ARs). Highlighted tracks are selected to exemplify different track orientations. (b) Total track density for every track occurrence from (a), gridded onto a 3° x 3° grid. Gray dots denote the weighted mean latitude position for all tracks. (c) Overall mean cool-season AR track temperature. (d) Track centroid locations for each AR track that occurs in January, colored by eventual temperature at landfall. From Gonzales, et al., 2019.

Temperature is a key metric for atmospheric rivers, Gonzales said. Due to thermodynamic properties, warmer air can hold more moisture. But warmer and wetter is not necessarily better, since precipitation falling as snow is important for water storage. Additionally, when rain falls on snow, it can have a destabilizing effect on the snowpack that can lead to enhanced melting, flooding and landslides.
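The “warmer air holds more moisture” relationship can be made concrete with the Magnus approximation for saturation vapor pressure, which yields the familiar rule of thumb of roughly 6 to 7 percent more water vapor capacity per degree Celsius. The formula below is a standard textbook approximation, not something taken from the new study.

```python
import math

# Saturation vapor pressure via the Magnus/Bolton approximation (hPa).
# Illustrates why warmer atmospheric rivers can carry more moisture:
# capacity grows roughly 6-7% per degree Celsius of warming.
def saturation_vapor_pressure(t_celsius):
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

for t in (5.0, 10.0, 15.0):
    growth = saturation_vapor_pressure(t + 1) / saturation_vapor_pressure(t) - 1
    print(f"{t:.0f} C -> {growth:.1%} more moisture capacity per extra degree")
```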

To better understand the underlying causes of this warming trend, Gonzales and colleagues also charted the density and temperature of the atmospheric river events from genesis in the Pacific to landfall in five different regions of the west coast.

“Part of the unique character of atmospheric rivers is that they have been shown to source their moisture, as well as their temperature, both near landfall and remotely,” Gonzales said. For example, plumes that originate from typhoons in the tropics may carry large amounts of moisture across the Pacific until the system makes landfall. But in the new analysis the authors found patches of warm ocean water near the coast can also influence the temperature of an atmospheric river by the time it makes landfall.

“More research is needed to disentangle the factors that are causing this warming trend,” Gonzales said. Warming temperatures at sea and at landfall locations may not be the sole influences. “Our analysis suggests that a heterogeneous mix of influences are at work that vary region by region and time of year. The result ends up being a mix of background regional warming and warming over the ocean.”

The findings may also have implications for understanding precipitation trends elsewhere in the world.

“Atmospheric rivers are not unique to the western U.S. They occur globally,” Gonzales said. “As our climate continues to warm, it is important to understand how the characteristics of atmospheric rivers are changing.”


Northern lights’ social networking reveals true scale of magnetic storms

Fri, 06/21/2019 - 13:44

By Peter Thorley

Magnetic disturbances caused by phenomena like the northern lights can be tracked by a ‘social network’ of ground-based instruments, according to a new study from the University of Warwick.

The researchers, led by Professor Sandra Chapman from the University’s Department of Physics, have for the first time characterised the observations from over 100 ground-based magnetometers in terms of a time-varying directed network of connections. They monitored the development of geomagnetic substorms using the same mathematics used to study social networks. The magnetometers ‘befriend’ one another when they see the same signal of a propagating disturbance.

The research, published in the AGU journal Geophysical Research Letters, opens up the opportunity to develop more accurate models of substorms and helps us to understand the impact of space weather on our electrical and communication systems.

 

The northern lights, or Aurora Borealis, occur when charged particles from our Sun bombard the Earth’s magnetic field. The magnetic field stores up this energy like a battery and then releases it, creating large-scale electrical currents in the ionosphere that generate disturbances of magnetic fields on the ground. Small versions of these substorms are common, but occasionally larger storms occur that can have a larger impact.

Using over 100 magnetometers that form the SuperMAG Initiative led by Dr Jesper Gjerloev, the researchers used the mathematical concepts from network science to monitor the development of substorms in the arctic auroral region. As a substorm develops and the electrical current in the ionosphere grows, individual magnetometers will register a change in the magnetic field. Pairs of magnetometers became linked when their measurements correlated with each other, expanding their network of ‘friends’ and allowing the researchers to monitor how the auroral disturbance from the substorm forms and propagates, and how quickly.
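Conceptually, the network-building step resembles the toy sketch below: treat each magnetometer’s time series as a node and link pairs whose signals correlate strongly. The published method uses lagged, directed correlations and carefully chosen thresholds; this plain-Pearson version, with synthetic data, is only an illustration.

```python
import numpy as np

# Toy version of the network idea: link station pairs whose magnetic-field
# time series correlate strongly. The real SuperMAG analysis (Orr et al.,
# 2019) uses lagged, directed correlations; this sketch uses plain Pearson
# correlation on synthetic data for illustration only.
rng = np.random.default_rng(42)
n_stations, n_samples = 8, 600
signal = rng.standard_normal(n_samples)            # a shared "disturbance"
data = 0.8 * signal + rng.standard_normal((n_stations, n_samples))  # + noise

corr = np.corrcoef(data)                           # station-by-station matrix
adjacency = (np.abs(corr) > 0.3) & ~np.eye(n_stations, dtype=bool)

# Stations "befriend" one another when they see the same common signal.
print("links per station:", adjacency.sum(axis=1))
```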

 

(Video: The magnetic field from ground-based magnetometers in the auroral region superposed over space-based images of the aurora. Courtesy SuperMAG)

Substorms from the Aurora Borealis create an electrical current in the atmosphere that is echoed at ground level. Localised changes in the Earth’s magnetic field can disrupt power lines, electronic and communications systems and technologies such as GPS. They are just one form of space weather that affects our planet on a constant basis.

 

(Video: The dynamical network constructed from the magnetometer data, indicating which magnetometers are seeing the same propagated signal. Credit: Orr, et al., 2019, GRL)

Professor Sandra Chapman from the University of Warwick Department of Physics said: “When talking about space weather, it is useful to provide a single number or rating that indicates how severe it is. To do this, we need to capture the full behaviour of how intense the event is, how widespread spatially, and how rapidly it is changing. Our aim is to use network science to develop useful parameters that do this, encapsulating all the information from 100+ observations.

“SuperMAG is a great example of how essential international co-operation is to solve problems like space weather that are on a planetary scale, using data from stations located in all the countries that abut the Arctic Circle.”

The paper is entitled “Directed network of substorms using SuperMAG ground-based magnetometer data,” by L. Orr, S.C. Chapman and J.W. Gjerloev. 

Peter Thorley is the Media Relations Manager at the Warwick Medical School and Department of Physics. This post was originally published on the University of Warwick news website.


Climate change may shift timing of summer thunderstorms

Fri, 06/07/2019 - 13:30

By Joshua Learn

Climate change could affect the regularity of summer afternoon thunderstorms in some parts of the world, according to new research.

A new study in the AGU journal Geophysical Research Letters modeled weather patterns in western Germany, northern France and parts of Belgium, the Netherlands and Luxembourg, under climate change.

Thunderstorm seen from Belfort, France. Thomas Bresson, Flickr

Under a strong climate scenario, where greenhouse gas emissions continue to increase, extreme summer thunderstorms in these areas might break out more often overnight and in the morning rather than during their customary late afternoon periods, according to the new research.

“In future climates, this afternoon period may no longer be the most likely period to experience an extreme [thunderstorm],” said Edmund Meredith, a postdoctoral meteorologist at the Free University of Berlin and the lead author of the new study.

Thunderstorms are caused by atmospheric convection, the overturning that results when layers of air with different temperatures interact and become unstable. This instability drives the development of cumulus clouds and increased winds, while moisture in some layers can lead to thunderstorms.

Previous research has examined seasonal or yearly changes in storm patterns. The new research used kilometer-scale climate models, which simulate atmospheric convection directly rather than approximating it, removing some of the earlier uncertainty in predicting average increases or decreases in thunderstorms in some areas of the world.

In the new study, Meredith and his colleagues wanted to see whether daily storm patterns would be affected by predicted changes in atmospheric convection using a climate change scenario in which greenhouse gas emissions continue to increase.

When Meredith and his co-authors looked at future changes in thunderstorms in high-resolution climate models, they found that during warm weather in western Europe, the number of morning or overnight storms could increase at a higher rate than those during the afternoon.

While short, booming storms in the study area are currently more common in the afternoon, climate change may result in heavy rain and lightning more often overnight and in the morning. The new study also confirmed that extreme storms will intensify in the study area, though the authors found that changes in atmospheric instability are not reflective of changes in storm intensity.

“We found that all across the future climate daily cycle, more extreme levels of atmospheric instability are found, but that this does not always translate into more extreme storms,” Meredith said.

While the new study was focused on western Europe, Meredith said that other parts of the world may also see daily shifts in storm patterns – particularly with extreme events. He said future climate change often increases instability in the atmosphere while reducing moisture availability, or vice versa, making projection of extreme storms more challenging.

Joshua Learn is a freelance science writer. 


One third of the African urban population exposed to extreme heat by 2090

Wed, 06/05/2019 - 16:40

Locations of the 173 cities analyzed in the new study, with 2015 population (millions). Credit: AGU.

By Aurélie Kuntschen 

Climate change, population growth and urbanization are instrumental in increasing exposure to extreme temperatures. A new study in AGU’s journal Earth’s Future assessed a range of possible scenarios regarding the rate of climate change and socio-economic development in 173 African cities for the years 2030, 2060 and 2090. The results show a third of African city-dwellers could be affected by deadly heat waves in 2090. The projections also highlight the influence of socio-economic development on the impact of climate change.

The effects of climate change are felt specifically in countries with tropical climates, which are characterized by high humidity and very high temperatures. Furthermore, countries in these regions – especially in Africa – are experiencing heavy urbanization and socio-economic development, leading to an explosion in the size of urban populations. A combination of these two factors is having a major impact on the living conditions of city-dwellers in Africa, especially in terms of exposure to extreme – or even lethal – temperatures.

“We consider the critical threshold to be 40.6 degrees Celsius in apparent temperature, taking humidity into account,” said Guillaume Rohat, a researcher at UNIGE’s Institute for Environmental Sciences (ISE). High outdoor humidity levels disrupt our ability to thermoregulate, with potentially fatal consequences. The scientists based their research on scientific climate projections and future urban demographics, rather than on current demographic data, to calculate the risk in the years ahead – which was in itself a first.
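Apparent temperature folds air temperature and humidity into a single “feels like” value. One widely used version is the U.S. National Weather Service heat index (the Rothfusz regression); the study’s exact index may differ, but the sketch below shows how humidity pushes apparent temperature past the danger threshold even when the air itself is cooler.

```python
# One common "apparent temperature" measure: the U.S. NWS heat index
# (Rothfusz regression). The study's exact index may differ; this only
# illustrates how humidity pushes felt temperature above air temperature.
def heat_index_f(temp_f, rel_humidity):
    """Heat index in degrees F from air temperature (F) and RH (percent)."""
    t, rh = temp_f, rel_humidity
    return (-42.379 + 2.04901523 * t + 10.14333127 * rh
            - 0.22475541 * t * rh - 6.83783e-3 * t * t
            - 5.481717e-2 * rh * rh + 1.22874e-3 * t * t * rh
            + 8.5282e-4 * t * rh * rh - 1.99e-6 * t * t * rh * rh)

# 95 F (35 C) air at 55% relative humidity already "feels" hotter than the
# 40.6 C (105 F) critical threshold used in the study.
print(heat_index_f(95.0, 55.0))   # ~109 F
```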

“The idea was to factor in all possible scenarios regarding climate change and urban population growth, the best and the worst, so we could find out what the future holds,” Rohat explained. The scientists then combined five scenarios based on socio-economic projections with three climate change projection scenarios developed by the Intergovernmental Panel on Climate Change (IPCC) for the years 2030, 2060 and 2090.

“This gave us twelve different plausible combinations for each of the years. It also meant we could calculate the number of people per day exposed to apparent temperatures above 40.6 degrees Celsius (105 degrees Fahrenheit) in cities in Africa on an annual basis. The same individual can be counted several times, because he or she may be exposed to these heat waves several days a year,” Rohat said. Based on these twelve models, the scientists analyzed the demography, urbanization and climate in 173 cities with at least 300,000 inhabitants in 43 countries across Africa.

Sharp rise in the number of people at risk

The initial results of the new study show that, regardless of the scenario selected, a drastic increase in the number of people affected by extreme temperatures on an annual basis is inevitable.

“In the best case, 20 billion person-days will be affected in 2030, compared to 4.2 billion in 2010 – a jump, in other words, of 376 percent,” Rohat said. “This figure climbs to 45 billion in 2060 (up 971 percent) and reaches 86 billion in 2090 (up 1,947 percent).”

When the researchers modeled the worst-case scenario for each of these three years – a very steep population increase, an explosion in urbanization and a climate badly disturbed by a continuous increase in CO2 – the figures rose even more sharply: 26 billion in 2030 (up 519 percent compared to 2010), 95 billion in 2060 (up 2,160 percent) and 217 billion in 2090 (up 4,967 percent). If every inhabitant of the 173 cities studied were exposed every day of the year in 2090, the figure would rise to 647 billion.

“We see that the worst scenario for 2090 affects 217 billion person-days – that’s a third of Africa’s urban population potentially exposed on a daily basis!” Rohat said. This means that one third of the population would be exposed every day to apparent temperatures of at least 40.6 degrees Celsius (105 degrees Fahrenheit), or that every African city would experience this heat for four months of the year. The figure falls to 10 percent in the best possible scenario for 2030.
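The “one third” figure follows directly from the two person-day totals quoted above; a minimal check, using only numbers from the text:

```python
# Reconciling the person-day figures quoted above (all inputs from the text).
worst_2090 = 217e9          # person-days of exposure, worst case, 2090
everyone_all_year = 647e9   # person-days if every city-dweller were exposed daily

print(worst_2090 / everyone_all_year)   # ~0.34 -> "a third" of the urban population
print(everyone_all_year / 365 / 1e9)    # implied 2090 urban population: ~1.8 billion
```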

The Paris Agreement and Sustainable Development Goals

The team of scientists investigated whether it was possible to reduce the exposure to extreme heat. They performed the calculations a second time using the best possible climate scenario combined with the different socio-economic models, and found exposure was reduced by 48 percent for the year 2090.

“This proves that if we follow the Paris Agreement, we’ll halve the number of people at risk in 2090, which is encouraging!” Rohat said. Under the best socio-economic scenario for each of the climate models, the number of people exposed to extreme temperatures drops by 51 percent, according to the researchers. “We can see the importance of the UN Sustainable Development Goals: access to education, a drop in the number of children per woman, developments in the standard of living, and so forth.”

The study makes it clear that exposure to extreme temperatures is going to rise sharply. But it also shows that, if we act quickly, the increase can be at least partially curbed.

“That’s why we’re currently in contact with several cities that we studied,” Rohat said. “The local actors are interested in the results for 2030 and 2060 so they can adapt to the inevitable and take measures to restrict urbanization, especially by improving the quality of life in rural areas or promoting the development of other cities of more modest size.”

—Aurélie Kuntschen is an attachée de presse at the Université de Genève in Switzerland. This story was originally published as a press release in French and English on the Université de Genève website.


Feeling Heat on the Roof of the World

Wed, 06/05/2019 - 14:43

The Tibetan Plateau, also known as the “roof of the world,” is getting hotter. This process is especially fast in places marked by retreating snow, according to new research by scientists from the University of Portsmouth and the Institute of Tibetan Plateau Research of the Chinese Academy of Sciences (ITPCAS).

“It is critically important to understand what is happening as a result of global warming at high elevations on the plateau where nearly all of the current snow and ice in the region exists. Changes in these mountain snow reserves are critical for the supply of water to billions of people in both China and India, and they are threatened by climate change,” said Nick Pepin, lead author of the study in AGU’s Journal of Geophysical Research: Atmospheres.

Map of the Tibetan Plateau showing the 87 Chinese Meteorological Administration stations used to develop the model. The three mountain ranges selected for further analysis are represented by colored boxes. Image: Pepin, et al., 2019 / AGU

Earlier research indicated that the rate of warming can be amplified with elevation, such that high-altitude environments often experience more rapid changes in temperature than lower ones. This phenomenon, known as Elevation-Dependent Warming, drove the scientists to explore temperature trends at high elevations across the Tibetan Plateau, where temperature readings are scarce yet crucial for understanding global warming.

Direct measurements of air temperature are unavailable in remote, high-elevation regions, since harsh conditions often prohibit setting up staffed weather stations. Scientists instead have to rely on satellites for information about these areas.

The raw satellite data, though potentially useful, are not representative enough for temperature trend analysis, since clouds can confuse the readings and local surface factors, such as vegetation or built-up ground, can obscure the wider picture.

This is where the team’s research came in. They built a customized model so that precise air temperatures in the Tibetan mountains could be deduced from satellite data.
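In spirit, the approach resembles the sketch below: calibrate a statistical relationship between satellite-derived surface temperature and station air temperature, then apply it where no stations exist. The actual model in the paper is more elaborate, and the numbers here are entirely synthetic.

```python
import numpy as np

# Illustrative sketch of the general approach: calibrate a statistical model
# between satellite land-surface temperature (LST) and station air
# temperature, then apply it at unmonitored sites. The real model in the
# paper is more elaborate; all numbers below are synthetic assumptions.
rng = np.random.default_rng(0)
lst = rng.uniform(-20, 20, size=200)              # synthetic satellite LST, deg C
air = 0.7 * lst - 1.5 + rng.normal(0, 1.5, 200)   # synthetic station air temps

slope, intercept = np.polyfit(lst, air, 1)        # fit: air ~ slope*LST + intercept

def estimate_air_temperature(lst_value):
    """Predict near-surface air temperature from a satellite LST reading."""
    return slope * lst_value + intercept

print(estimate_air_temperature(-5.0))             # estimate for an unmonitored site
```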

With this model, the researchers found a marked peak in warming rates around 5,000-5,500 meters (16,000-18,000 feet) in the Nyenchen Tanglha Mountains, one of the major ranges in the central part of the plateau. This warming is particularly strong during the day. The disappearance of snow cover seems to be the most obvious reason for this increased warming.

“Snow reflects sunlight during the day. So when it is reduced it causes even more warming, especially at the height where it is disappearing fastest,” said Pepin. During the night there is also enhanced warming more broadly at higher altitudes (up to 6,500 meters / 21,000 feet), which is thought to be related to changes in both cloud patterns and moisture.


Patagonia ice sheets thicker than previously thought, study finds

Tue, 06/04/2019 - 14:24

Glaciologists characterize protected region with new methods

By Brian Bell

After conducting a comprehensive, seven-year survey of Patagonia, glaciologists have concluded that the ice sheets in this vast region of South America are considerably more massive than expected.

Glaciers in South America’s Patagonia region, including Argentina’s Viedma Glacier (pictured), are much thicker than expected, according to a seven-year survey conducted by scientists from California, Chile and Argentina that will enable researchers and planners to more accurately model the effects of global warming and plan for potential disruptions in freshwater resources. Jeremie Mouginot / UCI

Through a combination of ground observations and airborne gravity and radar sounding methods, the scientists created the most complete ice density map of the area to date and found that some glaciers are as much as a mile (1,600 meters) thick. Their findings were published today in the American Geophysical Union journal Geophysical Research Letters.

“We did not think the ice fields on the Patagonian plateau could be quite that substantial,” said co-author Eric Rignot, Donald Bren Professor and chair of Earth system science at the University of California, Irvine (UCI). “As a result of this multinational research project, we found that – added together – the northern and southern portions of Patagonia clearly hold more ice than anticipated, roughly 40 times the ice volume of the European Alps.”

Patagonia is home to the largest ice fields in the Southern Hemisphere outside Antarctica, and its glaciers are among the fastest-moving in the world. Surface elevation observations from satellite radar altimetry and optical imagery have shown that most of the ice slabs in the region have been thinning rapidly over the past four decades. The contribution to global sea level rise from their melting has increased at an accelerating pace during that time.

Study co-author M. Gabriela Lenzano, a researcher with Argentina’s National Scientific and Technical Research Council, said the results will “help the scientific community better explain the interactions and consequences of ice sheet dynamics and climate on this cold environment – and the impact on communities and ecosystems downstream.”

With more precise knowledge of the size and shape of the glaciers in this highly protected region – much of which is contained in one of the world’s largest national park systems – researchers and planners will be able to more accurately model the effects of global warming and plan for potential disruptions in freshwater resources that serve its inhabitants.

“This is why having accurate maps of the ice thickness is a priority,” said lead author Romain Millan, who was a UCI graduate student in Earth system science for the bulk of this research project and is now a postdoctoral scholar at the Institute of Environmental Geosciences in Grenoble, France. “It is fundamental to get the right contours and depth of the glacial valleys; otherwise, simulations of glacier retreat will always be wrong.”

The difficulty in quantifying bed elevation and thickness has limited scientists’ ability to predict the region’s potential contribution to sea level rise; model glacier dynamics in response to climate change; study the impacts on freshwater resources; or prepare against such hazards as lake outburst flooding, which occurs when a dam containing a glacial lake fails.

Past attempts to gauge the total heft of the ice have fallen short, because traditional sounding techniques were limited to the shallowest sections of the ice field. Another obstacle has been the temperate nature of Patagonian ice. The frozen water in the glaciers is near its melting point from the top to the bottom; the higher water content makes this kind of ice more difficult to measure with radar.

To overcome these challenges, the scientists took to the skies, flying over broad stretches of terrain in helicopters and fixed-wing aircraft equipped with gravimeters, devices that can determine the ice volume by reading changes in Earth’s gravitational field. The addition of data collected by glaciologists from Chile’s Center for Scientific Studies, who had mapped ice thickness with low-frequency airborne radar sounding since 2002, was instrumental in creating a more comprehensive description of the area’s conditions.
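The physics behind “weighing” ice from the air is straightforward: a slab of ice of thickness h perturbs local gravity by about 2πGρh, the Bouguer slab approximation. A rough illustration using standard constants (the values and the simple slab geometry are assumptions of this sketch, not anything from the survey itself):

```python
import math

# Why gravimeters can "weigh" ice: a broad slab of ice of thickness h
# changes local gravity by about 2*pi*G*rho*h (Bouguer slab approximation).
# Illustrative only; real airborne-gravity inversions are far more involved.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
RHO_ICE = 917.0      # density of glacier ice, kg/m^3

def slab_anomaly_mgal(thickness_m):
    """Gravity effect of an ice slab, in milligals (1 mGal = 1e-5 m/s^2)."""
    return 2 * math.pi * G * RHO_ICE * thickness_m / 1e-5

# A mile-thick glacier (about 1,600 m) produces a ~60 mGal signal, well
# within reach of airborne gravimeters.
print(slab_anomaly_mgal(1600.0))   # ~61.5 mGal
```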

“This research has been enhanced and successfully completed thanks to our collaboration with the Rignot group at UCI and our Argentinean colleagues, with whom we have worked at both sides of the southern Patagonia ice field – disregarding the political border that divides the region,” said co-author Andrés Rivera of the Chilean center.

Brian Bell is a communications officer at UCI. This post was first published on the UCI website. 

 


Loss of Arctic sea ice stokes summer heat waves in southern U.S.

Mon, 06/03/2019 - 14:00

Continued ice loss may mean more heat waves

By Mary Caperton Morton

Over the last 40 years, Arctic sea ice thickness, extent and volume have declined dramatically. Now, a new study finds a link between declining sea ice coverage in parts of the Canadian Arctic and an increasing incidence of summer heat waves across the southern United States.

The new study in AGU’s Journal of Geophysical Research: Atmospheres explores how seasonal fluctuations of sea ice coverage trigger changes in atmospheric circulation patterns during the boreal summer.

The study draws upon four decades of satellite data of Arctic sea ice coverage collected between 1979 and 2016, overlapped with heat wave frequency data across the United States during the same time period.

The team found evidence for a strong statistical relationship between the extent of summer sea ice in the Hudson Bay and heat waves across the southern Plains and southeastern U.S.

Composites of summer extreme (left panels) and oppressive heat wave (right panels) frequency during summers of low (top), neutral (middle) and high (bottom) Hudson Bay sea ice extent. Credit: AGU

“The latest research on this topic suggests that declining Arctic sea ice may be linked to increased incidence of extreme weather patterns across the northern hemisphere,” said Dagmar Budikova, a climatologist at Illinois State University in Normal and lead author of the new study. “Our results confirm this hypothesis by offering further evidence that Arctic sea ice variability has the potential to influence extreme summer temperatures and the frequency of heat waves across the southern U.S.”

A better understanding of the physical relationships may allow scientists to forecast heat wave-prone summers, Budikova said.

“If Arctic sea ice continues to decline as predicted, then we could expect more summer heat waves across the southern U.S. in the future,” she said.

Warm Arctic spring, hot southern summer

The new study finds the loss of sea ice across the Arctic begins with warmer-than-usual spring temperatures in the Hudson Bay and Labrador regions in the southeastern Canadian Arctic.

“This process starts when temperatures across the southeastern Canadian Arctic and northwestern Atlantic are 2 degrees [Celsius] warmer than expected in March, April and May,” Budikova said.

This springtime warming lessens the north-to-south change in temperature between the high and middle latitudes of eastern North America, leading to a reduction in the strength of regional wind patterns. These conditions are symptomatic of weakened large-scale movements of air that appear to persist into the summer months, Budikova said.

The weakened circulation typically leads to increased undulation in the jet stream and the formation of persistent high-pressure systems over the southern U.S. Such a persistent high-pressure system, also known as an atmospheric block, ultimately promotes unseasonable surface and atmospheric warming, and increased heat wave incidence.

Heat waves can last for days or weeks as high-pressure zones inhibit wind, clouds and other weather systems from entering the area.

“Local humidity, soil moisture, and precipitation conditions are shown to influence the ‘flavor’ of the heat waves, which are more likely to be oppressive in the southeastern U.S. and extreme across the southern Plains during summers experiencing low Hudson [sea ice extent],” Budikova and colleagues wrote in the new study.

The next step will be to use dynamic modeling to confirm the statistical relationships between Arctic sea ice coverage and summer heat waves, and explore in detail the physical and dynamic atmospheric processes that make such linkages possible.

“General circulation models would further elucidate the processes that are taking place in the atmosphere to drive these connections,” Budikova said.

Mary Caperton Morton is a freelance science writer. Follow her on Twitter @theblondecoyote.


Using the past to unravel the future of Arctic wetlands

Wed, 05/29/2019 - 12:50

By Anna Harrison

Canadian High Arctic coastal fen.
Credit: Jennifer Galloway

A new study has used partially fossilized plants and single-celled organisms to investigate the effects of climate change on the Canadian High Arctic wetlands and help predict their future.

The Arctic is warming faster than any other region on Earth, which is causing the region’s ecology to undergo a rapid transformation. Until now there has been limited information on the response of Arctic wetlands to climate change and rising global temperatures.

An international team of scientists led by the University of Leeds and the Geological Survey of Canada have reconstructed past moisture conditions and vegetation histories to determine how three main types of Canadian High Arctic wetlands have responded to warming temperatures over the last century.

Understanding past ecological changes in this region allows for more accurate predictions of how future changes, such as longer growing seasons and increased water from ground-ice thaw, could affect the wetlands.

The study, published in the AGU journal Geophysical Research Letters, found that under 21st century warming conditions and with adequate moisture, certain Arctic wetlands may transition into peatlands, creating new natural carbon storage systems and to some extent mitigating carbon losses from degrading peatlands in southern regions.

Canadian High Arctic polygon mire.
Credit: Jennifer Galloway

Study lead author Thomas Sim, PhD researcher in the School of Geography at Leeds, said: “High Arctic wetlands are important ecosystems and globally-important carbon stores. However, there are no long-term monitoring data for many of the remote regions of the Arctic – making it hard to determine their responses to recent climate warming. Reconstructing the ecological history of these wetlands using proxy evidence can help us understand past ecological shifts on a timescale of decades and centuries.” 

Study co-author Paul Morris, from the University’s research centre water@leeds, said: “Our findings show that these harsh and relatively unexplored ecosystems are responding to recent climate warming and undergoing ecosystem shifts. While some of these wetlands could transition into productive peatlands with future warming, the long-term effects of climate change are likely to vary depending on the type of wetland.

“Although new productive peatlands may form in places such as the High Arctic, degrading peatlands in other areas are a major global concern. Every effort should be made to preserve peatlands across the globe – they are an incredibly important component of the global carbon cycle.”

The team examined ecological responses to twentieth century warming in the three types of High Arctic wetland: polygon mire, coastal fen and valley fen. Plant macrofossils and testate amoebae – tiny, single-celled organisms that live in wetlands – in combination with radiocarbon dating were used as proxies for historic changes in vegetation and moisture levels.

The study found that all three wetland types – with the exception of certain sections of the polygon mire – have experienced ecosystem shifts that coincided with an increase in growing degree days: a unit scientists use to quantify growing season length and warmth. The coastal fen site experienced an increase in shrub cover related to warming, while sections of the polygon mire increased in moss diversity.
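Growing degree days simply accumulate the warmth above a chosen base temperature over a season. A minimal sketch, with an assumed base temperature and made-up daily values purely for illustration (published studies choose bases appropriate to the region):

```python
# Growing degree days (GDD), the warmth metric mentioned above: each day
# contributes the amount by which its mean temperature exceeds a base
# threshold. The base and the sample season below are assumptions.
BASE_TEMP_C = 5.0

def growing_degree_days(daily_mean_temps_c, base=BASE_TEMP_C):
    """Sum of daily mean temperature excesses above the base temperature."""
    return sum(max(0.0, t - base) for t in daily_mean_temps_c)

# A slightly warmer (+1 C) season accumulates noticeably more GDD.
season = [2.0, 4.0, 6.0, 8.0, 7.0, 5.0, 3.0]
print(growing_degree_days(season))                     # 6.0
print(growing_degree_days([t + 1 for t in season]))    # 10.0
```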

Canadian High Arctic valley fen.
Credit: Jennifer Galloway

The study also found that environmental factors other than warming temperatures may be contributing to vegetation changes. The research suggests that grazing Arctic geese may have contributed to the recent shift from shrubs to mosses at the coastal fen site. Arctic geese populations have risen significantly, and food competition at their summer nesting sites may be causing them to seek new grazing sites further north as the Arctic warms.

Study co-author Jennifer Galloway, Associate Professor at the Aarhus Institute of Advanced Studies, Denmark, and the Geological Survey of Canada, said: “Our study highlights the complex ways in which climate change is affecting ecosystems and suggests that the effects of climate warming will vary depending on wetland type. While we can clearly see that climate change is altering ecology across the Arctic wetlands, whether that will result in a transition to productive peatlands will be strongly influenced by the complex dynamics that govern the wetlands.”

— Anna Harrison is a press officer at the University of Leeds. This post originally appeared as a press release on the University of Leeds website. 


Aftershocks of 1959 earthquake rocked Yellowstone in 2017-18

Thu, 05/23/2019 - 19:37

By Paul Gabrielsen

State Highway 287 slumped into Hebgen Lake; damage from the August 1959 Hebgen Lake (Montana-Yellowstone) earthquake. Photo credit: I.J. Witkind/USGS

On Aug. 17, 1959, back when Dwight D. Eisenhower was president, the U.S. had yet to send a human to space and the nation’s flag sported 49 stars, Yellowstone National Park shook violently for about 30 seconds. The shock was strong enough to drop the ground a full 20 feet in some places. It toppled the dining room fireplace in the Old Faithful Inn. Groundwater swelled up and down in wells as far away as Hawaii. Twenty-eight people died. It went down in Yellowstone history as the Hebgen Lake earthquake, with a magnitude of 7.2.

Location of the Maple Creek swarm, made using data from the U Seismograph Stations.
Photo credit: USGS

And in 2017, nearly 60 years and 11 presidents later, the Hebgen Lake quake shook Yellowstone again. More than 3,000 small earthquakes that struck the Maple Creek area (in Yellowstone National Park but outside of the Yellowstone volcano caldera) between June 2017 and March 2018 are, at least in part, aftershocks of the 1959 quake. That’s according to a study published in the AGU journal Geophysical Research Letters by University of Utah geoscientists led by Guanning Pang and Keith Koper.

“These kinds of earthquakes in Yellowstone are very common,” says Koper, director of the University of Utah Seismograph Stations. “These swarms happen very frequently. This one was a little bit longer and had more events than normal.”

“We don’t think it will increase the risk of an eruption,” Pang adds.

A long seismic tail

Taken together, the more than 3,000 small quakes of the Maple Creek swarm can be divided into two clusters. The northern cluster consists of Hebgen Lake aftershocks. The quakes fell along the same fault line, and were oriented the same way, as the Hebgen Lake event. Also, the team didn’t see signs that the northern cluster was caused by movement of magma and other fluids beneath the ground.

Koper and Pang say it’s not unheard of for aftershocks of a large earthquake to continue decades after the initial event. Pang, for example, has also studied aftershocks as recent as 2017 from the 1983 Borah Peak earthquake in central Idaho.

“There are formulas to predict how many aftershocks you should see,” Koper says. “For Hebgen Lake, there looked like a deficit in the number of aftershocks. Now that we’ve had these, it has evened things out back up to the original expectations.”
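The best-known such formula is the Omori-Utsu law, under which the aftershock rate decays as K/(c + t)^p. With p near 1, the decay is so slow that a sequence can still produce events decades later. A minimal sketch with assumed parameters (illustrative values, not fits to the Hebgen Lake sequence):

```python
import math

# Omori-Utsu law: aftershock rate(t) = K / (c + t)**p. For p = 1, the
# expected count between times t0 and t1 integrates in closed form to
# K * ln((c + t1) / (c + t0)). Parameters here are illustrative assumptions.
K, c = 100.0, 0.5   # assumed productivity and delay constants (days)

def expected_aftershocks(t0_days, t1_days):
    """Expected aftershocks between t0 and t1 days after the mainshock."""
    return K * math.log((c + t1_days) / (c + t0_days))

# The decay is heavy-tailed, so a sequence can remain active for decades.
print(round(expected_aftershocks(0, 365)))              # first year: ~660 events
print(round(expected_aftershocks(57*365, 58*365), 1))   # 58th year: still ~1.7
```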

Plot of magnitude versus time in color-matched subsets of earthquakes. The warm colors mark earthquakes in the northern cluster and the cool colors mark the earthquakes in the southern cluster. Photo credit: University of Utah

A second culprit

The southern cluster of the Maple Creek swarm seems to have a different origin. Although the northern cluster was lined up with the Hebgen Lake fault, the southern cluster’s lineup was rotated about 30 degrees and the quakes were about 0.6 miles (1 kilometer) shallower than the northern cluster.

So, the researchers concluded, although the shaking in the northern cluster influenced the southern cluster, the southern shaking was likely caused primarily by subsurface movement of magma. “We do consider it to be one swarm all together,” Koper says. “Because they were so close, there was some feedback and influence between the two sections.”

Koper says that the results highlight how earthquakes are different than other natural hazards. Floods, hurricanes or wildfires are over when they’re over. “Earthquakes don’t happen as a single discrete event in time,” he says. The specter of aftershocks can continue for months, years or even, as Maple Creek shows, decades.

Paul Gabrielsen is a science writer for University of Utah Communications. This post was originally published at the University of Utah UNews site.


New Studies Increase Confidence in NASA’s Measure of Earth’s Temperature

Thu, 05/23/2019 - 15:34

By Jessica Merzdorf

A new assessment of NASA’s record of global temperatures revealed that the agency’s estimate of Earth’s long-term temperature rise in recent decades is accurate to within less than a tenth of a degree Fahrenheit, providing evidence that past and future research is correctly capturing rising surface temperatures.

 

The GISTEMP index shows how much warmer or cooler Earth’s surface is than the 1951-1980 baseline each year, an important tool to help scientists track climate change. The number of areas experiencing warmer than normal temperatures, shown in red, has steadily increased since 1880. Credits: NASA’s Scientific Visualization Studio/Kathryn Mersmann

The most complete assessment ever of statistical uncertainty within the GISS Surface Temperature Analysis (GISTEMP) data product shows that the annual values are likely accurate to within 0.09 degrees Fahrenheit (0.05 degrees Celsius) in recent decades, and 0.27 degrees Fahrenheit (0.15 degrees C) at the beginning of the nearly 140-year record.

This data record, maintained by NASA’s Goddard Institute for Space Studies (GISS) in New York City, is one of a handful kept by major science institutions around the world that track Earth’s temperature and how it has risen in recent decades. This global temperature record has provided one of the most direct benchmarks of how our home planet’s climate has changed as greenhouse gas concentrations rise.

The study, published in the AGU journal JGR: Atmospheres, also confirms what researchers have been saying for some time now: that Earth’s global temperature increase since 1880 – about 2 degrees Fahrenheit, or a little more than 1 degree Celsius – cannot be explained by any uncertainty or error in the data. Going forward, this assessment will give scientists the tools to explain their results with greater confidence.

GISTEMP is a widely used index of global mean surface temperature anomaly — it shows how much warmer or cooler than normal Earth’s surface is in a given year. “Normal” is defined as the average during a baseline period of 1951-80.

NASA uses GISTEMP in its annual global temperature update, in partnership with the National Oceanic and Atmospheric Administration. (In 2019, NASA and NOAA found that 2018 was the fourth-warmest year on record, with 2016 holding the top spot.) The index includes land and sea surface temperature data back to 1880, and today incorporates measurements from 6,300 weather stations, research stations, ships and buoys around the world.

Previously, GISTEMP provided an estimate of uncertainty accounting for the spatial gaps between weather stations. Like other surface temperature records, GISTEMP estimates the temperatures between weather stations using data from the closest stations, a process called interpolation. Quantifying the statistical uncertainty present in those estimates helped researchers to be confident that the interpolation was accurate.
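
As a toy picture of what that interpolation involves, the sketch below estimates the anomaly at an unsampled point as an inverse-distance-weighted average of nearby stations. GISTEMP’s actual scheme differs in detail, and the station coordinates and anomalies here are invented, so treat this only as the general idea:

```python
# Toy inverse-distance-weighted interpolation between stations.
# Illustrates the general idea only; GISTEMP's actual scheme differs, and
# treating lat/lon degrees as planar coordinates is a further simplification.
import numpy as np

def idw_estimate(target, stations, values, power=2.0):
    """Estimate the anomaly at target=(lat, lon) from station anomalies."""
    d = np.linalg.norm(np.asarray(stations, float) - np.asarray(target, float), axis=1)
    if np.any(d == 0):                      # target coincides with a station
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power                    # nearer stations get larger weights
    return float(np.sum(w * values) / np.sum(w))

stations = [(40.0, -105.0), (41.5, -104.0), (39.0, -106.5)]   # invented stations
anomalies = np.array([0.8, 0.6, 1.1])       # deg C relative to a 1951-80 baseline
print(idw_estimate((40.2, -105.2), stations, anomalies))
```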

“Uncertainty is important to understand because we know that in the real world we don’t know everything perfectly,” said Gavin Schmidt, director of GISS and a co-author on the study. “All science is based on knowing the limitations of the numbers that you come up with, and those uncertainties can determine whether what you’re seeing as a shift or a change is actually important.”

The study found that individual and systematic changes in measuring temperature over time were the most significant source of uncertainty. Also contributing was the degree of weather station coverage. Data interpolation between stations contributed some uncertainty, as did the process of standardizing data that was collected with different methods at different points in history.

After adding these components together, GISTEMP’s uncertainty value in recent years was still less than a tenth of a degree Fahrenheit, which is “very small,” Schmidt said.

The team used the updated model to reaffirm that 2016 was very probably the warmest year in the record, with an 86.2 percent likelihood. The next most likely candidate for warmest year on record was 2017, with a 12.5 percent probability.
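
One standard way to turn quantified uncertainty into a “probability of warmest year” is Monte Carlo resampling: perturb each annual value within its error bar many times and count how often each year comes out on top. The sketch below uses invented anomalies with the study’s recent-decade spread of 0.05 degrees Celsius; it is not the authors’ code and will not reproduce their exact percentages:

```python
# Monte Carlo estimate of the probability that each year is the warmest,
# given Gaussian uncertainty on each annual mean. The anomalies are invented
# placeholders; sigma echoes the study's recent-decade value of 0.05 deg C.
import numpy as np

rng = np.random.default_rng(0)
years = np.array([2014, 2015, 2016, 2017, 2018])
anomaly = np.array([0.74, 0.90, 1.01, 0.92, 0.85])   # deg C vs. baseline (made up)
sigma = 0.05

draws = rng.normal(anomaly, sigma, size=(200_000, len(years)))
winner = years[np.argmax(draws, axis=1)]             # warmest year in each draw
for y in years:
    print(y, round(float(np.mean(winner == y)), 3))  # share of draws y wins
```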

“We’ve made the uncertainty quantification more rigorous, and the conclusion to come out of the study was that we can have confidence in the accuracy of our global temperature series,” said lead author Nathan Lenssen, a doctoral student at Columbia University. “We don’t have to restate any conclusions based on this analysis.”

Another recent study evaluated GISTEMP in a different way that also added confidence to its estimate of long-term warming. A paper published in March 2019, led by Joel Susskind of NASA’s Goddard Space Flight Center, compared GISTEMP data with that of the Atmospheric Infrared Sounder (AIRS), onboard NASA’s Aqua satellite.

GISTEMP uses air temperature recorded with thermometers slightly above the ground or sea, while AIRS uses infrared sensing to measure the temperature right at the Earth’s surface (or “skin temperature”) from space. The AIRS record of temperature change since 2003 (which begins when Aqua launched) closely matched the GISTEMP record.

Because the two measurements were similar but recorded in very different ways, they could be treated as independent checks on each other, Schmidt said. One difference was that AIRS showed more warming in the northernmost latitudes.

“The Arctic is one of the places we already detected was warming the most. The AIRS data suggests that it’s warming even faster than we thought,” said Schmidt, who was also a co-author on the Susskind paper.

Taken together, Schmidt said, the two studies help establish GISTEMP as a reliable index for current and future climate research.

“Each of those is a way in which you can try and provide evidence that what you’re doing is real,” Schmidt said. “We’re testing the robustness of the method itself, the robustness of the assumptions, and of the final result against a totally independent data set.”

In all cases, he said, the resulting trends are more robust than what can be accounted for by any uncertainty in the data or methods.

Jessica Merzdorf is a NASA Missions Science Writing and Support Specialist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. This post was originally published on the NASA website.


Domino Droughts

Wed, 05/22/2019 - 16:40

New research finds one drought can amplify or cause another. Decreased moisture recycling and transport affect how droughts form and move across continents.

By Megan Glatzel

Could a drought in California be linked to a drought in the Midwest? A recent Stanford-led study published in the AGU journal Geophysical Research Letters finds that regions may fall victim to water scarcity like dominos toppling down a line.

“We know droughts can travel thousands of miles across continents, but it has not been clear exactly how,” said lead author Julio E. Herrera Estrada, a postdoctoral scholar with the Stanford Water in the West program and the Stanford Department of Earth System Science.

Droughts occur when a lack of precipitation causes a water shortage. Continents receive most of their precipitation from water vapor transported by wind from other land and ocean areas as well as from moisture that evaporates from a region and falls in the same area – a process known as recycling.

In this study, researchers looked at how decreased moisture from recycling and transport amplified the 2012 drought in the Midwest, which resulted in losses of over $33 billion. Using a complex mathematical moisture-tracking model combined with state-of-the-art data on precipitation, evaporation and atmospheric moisture fluxes, they found that reduced precipitation from recycling and from transport off upwind land areas made up 62 percent of the total precipitation deficit experienced by the Midwest. Diminished moisture transported directly from oceans made up the remaining 38 percent.

An example from the study of upwind and downwind regions in North America between which droughts have propagated through reduced moisture transport.

Like most of the U.S., the Midwest relies on moisture imported from other regions. When a drought occurred in the western U.S. that same year, it resulted in less evaporation and drier air. Transported by wind, this drier air likely resulted in less rainfall over the Midwest, according to the researchers. As less moisture arrived in the Midwest, precipitation recycling shut down, further intensifying the drought. This sequence can reinforce itself and lead to new or more severe droughts. The study found that the Midwest eventually recovered from drought when more moisture was imported directly from the ocean, restarting the precipitation recycling process in the region.
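
A cartoon version of that feedback makes the arithmetic concrete: if precipitation P equals imported moisture M plus a recycled fraction r of P, then P = M / (1 - r), and a drop in imports is amplified whenever recycling also weakens. The numbers below are invented for illustration and are not taken from the study:

```python
# Two-box cartoon of precipitation recycling: P = M + r*P, so P = M / (1 - r).
# All numbers are illustrative, not taken from the study.
def precipitation(imported, recycling_ratio):
    return imported / (1.0 - recycling_ratio)

normal = precipitation(imported=80.0, recycling_ratio=0.2)    # P = 100.0 units
drought = precipitation(imported=60.0, recycling_ratio=0.1)   # imports and recycling both drop
print(normal, drought)            # 100.0 vs. ~66.7
print(1 - drought / normal)       # ~0.33 deficit, larger than the 25% import drop
```

In this toy example, a 25 percent drop in imported moisture becomes a one-third precipitation deficit once the weakened recycling is folded in.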

“We show that multiple droughts over a continent may not necessarily be a coincidence,” said Herrera Estrada. “There may be important feedbacks between and within land areas that can propagate and intensify droughts, helping them travel across continents.”

As the U.S. faces more intense climatic events, understanding how droughts form and move will be increasingly important. While there is still a great deal to be learned, it is imperative for water managers and policymakers to prepare for future droughts. Being able to better predict where and when droughts occur and how long they last will be key. To slow the potential domino effect of droughts, the researchers urge the adoption of sustainable land management practices to prevent soil erosion and degradation, and recommend preventing deforestation and desertification. These practices ensure more vegetation and better soil, which help maintain the supply of moisture for recycling and for export to regions downwind.

“It will also be crucial to take a regional approach to drought risk management and facilitate coordination between upwind and downwind communities to reduce the severity and impacts of future droughts,” concluded Herrera Estrada. “In many instances, this will require international cooperation.”

Photo credit: U.S. Fish and Wildlife Service.

Megan Glatzel is the Communications and Program Coordinator for the Stanford Water in the West program. This post was first published on the Water in the West program website. 


Earthquake in 2009 intensified American Samoa’s rising sea levels

Thu, 05/16/2019 - 14:01

Island most severely afflicted by earthquake is sinking, potentially increasing coastal flooding and causing sea levels to rise faster than global average.

By Brendan Bane

The 2009 magnitude-8.1 Samoa earthquake caused extensive damage to the Samoan Islands: Tsunami waves as high as 14 meters (46 feet) wiped out multiple villages, claiming nearly 200 lives and severely damaging water and electrical systems. 

New research reveals the damage is likely to continue on the island of Tutuila, also known as American Samoa. A new study shows the island is now sinking, a product of post-earthquake tectonic shifting that will likely continue for decades.

According to the new study, published in AGU’s Journal of Geophysical Research: Solid Earth, American Samoa’s sinking has intensified the island’s already rising sea levels. The authors predict that, since the 2009 Samoa earthquake, American Samoa’s surrounding sea levels will climb an additional 30-40 centimeters (12-16 inches) throughout this century.

The island’s sea levels are now rising at an accelerated rate roughly five times the global average, threatening regular coastal flooding in an area that has seen cyclones and other extreme weather in recent years, according to the new study.

Before the earthquake, American Samoa’s sea levels were already climbing two to three millimeters (0.07 to 0.11 inches) each year – a rate caused by the melting of polar ice and glaciers, as well as the expanding, warming ocean. Today, said the study’s authors, the climate-driven rise and the earthquake-driven subsidence climb in tandem.
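
For intuition, the two headline figures (a present-day rate roughly five times the climate-only rate, and an extra 30-40 centimeters by century’s end) are mutually consistent if the earthquake-driven subsidence starts fast and decays over decades. The initial rate and decay time in the sketch below are placeholders chosen to match those figures, not values reported by the study:

```python
# Rough consistency check: climate-driven rise plus a decaying subsidence rate.
# s0 (initial subsidence, mm/yr) and tau (decay time, yr) are assumed
# placeholders chosen to match the reported figures, not study values.
import numpy as np

climate_rate = 2.5          # mm/yr, roughly the pre-quake global-average rate
s0, tau = 10.0, 35.0

t = np.linspace(0.0, 80.0, 8001)           # years after the 2009 earthquake
subsidence = s0 * np.exp(-t / tau)         # post-seismic sinking decays with time
extra_mm = float(np.sum(subsidence) * (t[1] - t[0]))

print((climate_rate + s0) / climate_rate)  # ~5x the climate-only rate today
print(round(extra_mm / 10.0, 1), "cm of extra rise by late century")  # ~31 cm
```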

“Before the earthquake, American Samoa was experiencing sea level rise that was roughly equal to the global average. But after the earthquake, that rate drastically increased,” said Shin-Chan Han, Professor of Geodesy at the University of Newcastle in Callaghan, Australia and lead author of the new study. “That’s alarming to me because of its many implications.”

Tremors with lasting impact

The Samoan Islands are an archipelago in the central South Pacific, comprising a handful of islands that are home to roughly 250,000 people. Tropical forests cover portions of the larger islands, which are among the largest in Polynesia.

The Samoa earthquake was the largest of 2009 and gained international attention, as then-U.S. President Barack Obama declared it a major disaster, directing federal disaster aid to relief efforts. The Government of Samoa estimated the total cost of the earthquake’s damage to be just shy of $150 million.

The nature of the tremors was unique, according to the authors, in that they arose from two near-simultaneous earthquakes emanating from the northern tip of the Kermadec-Tonga subduction zone. The Samoan Islands are situated within the Pacific Ring of Fire, a 40,000-kilometer (25,000-mile), volcanically active area where several tectonic plates smash, grind and slide past one another to produce 90 percent of Earth’s earthquakes.

To better characterize the 2009 event, the authors used data from the GRACE satellites to assess changes in Earth’s gravity field caused by tectonic activity, used GPS to track the land’s movement, and analyzed past sea level changes by examining tide gauge records and satellite altimeter data. They then modeled the area’s tectonic activity to estimate how the land will continue shifting in response to the Samoa earthquake.

Crews working near the damage from the 2009 tsunami in American Samoa. Lorn Cramer/Flickr, Wikimedia Commons

The authors found that, because of the Samoan Islands’ placement around the fault zone, each island is responding differently. In Samoa, for example, tectonic shifting now pushes the island both horizontally and vertically at equal rates, according to the study. The island of American Samoa, however, now moves mostly vertically, sinking into the Earth in a geological phenomenon known as subsidence at a rate twice as fast as Samoa’s.

Because of this movement, the authors now consider American Samoa an “extreme case,” as tides may reach increasingly farther inland over the coming decades, potentially flooding the main road running along the island’s perimeter and near its coast.

“The ocean is eating up their land,” said Han. “The major road in American Samoa is around the coastal area, and the coastal area is where they will see the impact of nuisance flooding.”

Han said the study highlights the need for government agencies to re-evaluate sea level rise in afflicted areas after large earthquakes, as tectonic movement can greatly influence the rate that sea levels rise, and should be considered in addition to climate-induced changes.

“When the land subsidence effect is not considered we may misinterpret sea level rise,” Han said. “Land motion is not ignorable. Sometimes, the land motion effect is greater than the climate change effect.”

Brendan Bane is a freelance science writer. 


Study: U.S. methane emissions flat since 2006 despite increased oil and gas activity

Wed, 05/15/2019 - 14:00

By Theo Stein

Natural gas production in the United States has increased 46 percent since 2006, but there has been no significant increase in total US methane emissions and only a modest increase from oil and gas activity, according to a new NOAA study.

The finding is important because it’s based on highly accurate measurements of methane collected over 10 years at 20 long-term sampling sites around the country in NOAA’s Global Greenhouse Gas Reference Network, said lead author Xin Lan, a CIRES scientist working at NOAA.

“We analyzed a decade’s worth of data and while we do find some increase in methane downwind of oil and gas activity, we do not find a statistically significant trend in the US for total methane emissions,” said Lan. The study was published in the AGU journal Geophysical Research Letters.

The study did not attempt to quantify oil and gas methane emissions or methane emissions overall, but sought only to identify whether emissions were increasing by looking at enhancements in atmospheric methane concentrations.

The new analysis found methane emissions from oil and gas activity increasing by 3.4 percent ± 1.4 percent per year – a rate up to 10 times lower than in some recent studies, which derived their methane trend by measuring levels of another petroleum hydrocarbon, ethane. Overall, though, methane concentrations in US air samples were shown to be increasing at the same rate as the global background, meaning there was no statistically significant increase in total methane from the US.

Many sources of methane

Methane is a component of natural gas, but it can also be generated by biological sources, such as decaying wetland vegetation, as a byproduct of ruminant digestion, or even by termites. Ethane is a hydrocarbon emitted during oil and natural gas production and is sometimes used as a tracer for oil and gas activity. By measuring ethane, which is not generated by biological processes, scientists had hoped to produce an accurate estimate of petroleum-derived methane emissions.

However, those studies assumed that the ratio of ethane to methane in natural gas produced by different oil and gas regions is constant. Instead, Lan said, the new NOAA analysis shows that ethane-to-methane ratios are increasing, and that has led to major overestimations of oil and gas emission trends in some previous studies.

“What this means is if you want to track methane, you have to measure methane,” said Lan.
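
A toy calculation shows why. If ethane measurements are converted to methane using the ratio from the start of the record while the true ethane-to-methane ratio is rising, the inferred methane trend is inflated. All numbers below are invented, with the ratio drift exaggerated for clarity:

```python
# Why a drifting ethane-to-methane ratio biases ethane-based methane estimates.
# All values are invented for illustration; the ratio drift is exaggerated.
true_methane = [100.0, 101.0, 102.0]       # arbitrary units, ~1%/yr growth
ratio = [0.05, 0.06, 0.07]                 # true ethane/methane ratio, rising

ethane = [m * r for m, r in zip(true_methane, ratio)]
inferred = [e / ratio[0] for e in ethane]  # wrongly assumes the year-0 ratio holds

print(round((true_methane[-1] / true_methane[0] - 1) * 100, 1), "% true growth")
print(round((inferred[-1] / inferred[0] - 1) * 100, 1), "% inferred growth")
```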

The quest to understand methane releases and leaks associated with oil and natural gas production has taken on a high profile in recent years as production has surged to historic levels in the US. Methane is 28 times more potent than carbon dioxide in trapping heat in the atmosphere over 100 years. It exerts the second largest influence on global warming behind carbon dioxide.

Global methane levels were nearly stable from 1999 through 2006, but since then have increased significantly. Some studies have suggested that U.S. oil and natural gas emissions contributed substantially to the post-2007 increase. Previous NOAA research suggests the global methane increase has been dominated by biogenic emissions.

Ten years of NOAA data analyzed

Lan led an analysis of data collected by a research team from NOAA’s Earth System Research Laboratory in Boulder, Colorado, and Lawrence Berkeley National Laboratory in Berkeley, California, that studied air samples collected from aircraft flights at 11 sites and 9 tall towers that are part of NOAA’s Global Greenhouse Gas Reference Network. Sampling with aircraft and tall towers lets scientists analyze gas concentrations close to the ground, where emissions occur, as well as higher in the atmosphere, where the influence of recent surface emissions is minimal, helping them understand the gases’ fate. The sampling sites were established in locations where sampling would capture well-mixed air masses and avoid samples dominated by local sources.

Three of the five sampling sites located downwind of oil and natural gas production areas did show varying increases in methane, ethane and propane. This could be caused by a different makeup of the underlying oil and gas resource, or different activity levels driven by the price of oil, natural gas and other hydrocarbons, Lan said.

Lan’s study is one of the first to explore trends in methane data from sites established by the 2004 North American Carbon Program, a multi-agency research program focused on carbon sources and sinks in North America and its adjacent oceans, said Arlyn Andrews, chief of the NOAA Global Monitoring Division Carbon Cycle Group.

“With 20 sites across the country, we can make enough measurements to evaluate aggregate emissions at large regional scales,” she said. “If we had more sampling sites, we would be able to provide more specificity about methane sources in regions dominated by agriculture and oil and gas. These study results show the value of GMD’s high quality air sampling network over more than a decade of measurements.”

Theo Stein is a Public Affairs Officer for NOAA Communications. 


La Niña’s effect on droughts can be traced back to U.S. Civil War

Mon, 05/13/2019 - 20:20

By Joshua Rapp Learn

Cyclical variations in wind and sea surface temperatures in the Pacific Ocean may have contributed to a drought that played an important role in the outcome of the U.S. Civil War, according to a new study.

The new research used tree ring data to reconstruct the influence of El Niño and La Niña conditions on droughts across North America for the past 350 years, including during the American Civil War.

The Civil War drought – one of the worst to afflict the U.S. in centuries – lasted from the mid-1850s to the mid-1860s. That drought is infamous for its effects in the U.S. Southwest and parts of the Great Plains, where it led to the near extinction of the American bison, and it played an important role in changing the course of the Civil War by causing food and water shortages that slowed the advance of part of the Confederate army in 1862.

Max Torbenson coring a bristlecone pine in central Colorado. Photo by Daniel Griffin.

The drought effects extended far north of the core southwestern area usually impacted by La Niña, spreading into the Great Plains.

“It may very well be that [La Niña] played a significant role in the evolution of the sustained drought during the early 1860s,” said Max Torbenson, a geosciences PhD candidate at the University of Arkansas and the lead author of the new study in the AGU journal Paleoceanography and Paleoclimatology.

The El Niño/Southern Oscillation (ENSO) is a term for the cyclical variation in winds and sea surface temperatures that occurs in the tropical eastern Pacific Ocean. This includes the warm phase, called El Niño, and the cool phase, called La Niña, each lasting a few months and recurring every few years.

“These two phases affect the direction of storm tracks from the Pacific, and in turn influence how much rain falls, especially over the Southwest,” Torbenson said.

The magnitude of El Niño and La Niña conditions varies as well. A body of previous research has shown that stronger La Niña periods can cause severe droughts in the U.S. Southwest and Mexico, such as the one that afflicted Texas, New Mexico and parts of northern Mexico in 2011.

A tree ring core from a Ponderosa pine, a species used for the reconstructions. Photo by Daniel Griffin.

Researchers previously had only about 70 years of records showing how ENSO affected climate in parts of the U.S. Torbenson and his co-authors wanted to see whether they could push the record of how ENSO affects the extent of droughts back before 1950.

To do that they tapped into the International Tree-Ring Data Bank, a public database of information gleaned from tree ring samples all around the world. Tree rings reveal past climate conditions through the thickness of each year’s growth: a thick tree ring means a year of abundant rain, while a series of thin ones in a row points to a multi-year drought. Because of the strong relationship between ENSO and winter rainfall, the rings can also tell the story of past La Niña and El Niño conditions.
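
Reconstructions like these rest on a calibration step: relate ring width to an observed moisture index over the instrumental era, then invert that relationship for older rings. The sketch below illustrates the idea with synthetic data; it is a simplified stand-in, not the authors’ actual procedure:

```python
# Calibrate ring width against an observed moisture index, then invert the fit
# to estimate moisture in pre-instrumental years. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
moisture = rng.normal(0.0, 1.0, 70)        # observed index over ~70 years
ring_width = 1.0 + 0.4 * moisture + rng.normal(0.0, 0.1, 70)   # wet -> wide rings

slope, intercept = np.polyfit(moisture, ring_width, 1)
old_rings = np.array([0.70, 0.75, 0.72, 0.68])       # a run of thin, dry-year rings
print(np.round((old_rings - intercept) / slope, 2))  # negative = drier than normal
```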

The researchers focused on tree-ring chronologies from parts of northern Mexico, Texas and New Mexico, where the ENSO effects are felt the strongest, and produced estimates of ENSO variability back to 1675. These estimates were then compared to drought reconstructions based on local tree rings from other parts of the U.S., stored in the North American Drought Atlas and broken into a grid system across the country.


Their results indicate that ENSO influence on drought has waxed and waned in areas far beyond the core southwestern U.S. and northern Mexico region. One notable signal they detected was the Civil War drought. During the mid-1800s, significant correlations between the ENSO estimates and drought reconstructions reached further east than at any other time, and included impacts over the Great Plains and even the confluence of the Mississippi and Ohio Rivers. The Civil War drought coincided with one of the most persistent La Niña periods in the estimates.

Torbenson said this long-term examination of the relationship between ENSO and droughts in the region could be a tool for predicting future drought conditions and for water management, especially in areas outside of the core ENSO region.

“There appears to be some pattern that could be helpful moving forward knowing when ENSO influences rainfall in certain areas,” such as eastern Texas and the Great Plains, he said. But the research itself is also compelling, he said, as it reveals the way the climate affected a critical period in U.S. history.

“I definitely think it’s something that makes us imagine the hardships of the past,” he said.

Joshua Rapp Learn is a freelance writer. Follow him on Twitter: @JoshuaLearn1


A new view of wintertime air pollution

Wed, 05/08/2019 - 15:38

Study could help improve air quality in cities across the U.S. West

By Karin Vergoth

The processes that create ozone pollution in the summer can also trigger the formation of wintertime air pollution, according to a new study led by Cooperative Institute for Research in Environmental Sciences (CIRES) and NOAA researchers. The team’s unexpected finding suggests that in the U.S. West and elsewhere, certain efforts to reduce harmful wintertime air pollution could backfire.

Specifically, targeting nitrogen oxides emitted by cars and power plants could actually increase harmful air pollution at first, the researchers reported in their new paper, out today in the AGU journal Geophysical Research Letters.

“This is contrary to what is typically assumed and suggests a new way to mitigate this type of pollution in Salt Lake City, Denver, and beyond,” said Caroline Womack, a CIRES scientist working in the NOAA Earth System Research Laboratory and lead author of the study.

Regulations and cleaner technologies have steadily improved air quality in the United States. Yet valleys in western states still experience high levels of particulate matter (PM2.5), or microscopic droplets suspended in air, during the winter. In Utah’s urban Salt Lake Valley, wintertime levels of PM2.5 exceed national air quality standards an average of 18 days per year. Denver often has the same problem in winter, when brown clouds hang over the city.

Wintertime air pollution in the Salt Lake Valley. Credit: Alessandro Franchin, CIRES/NOAA

A major component of the Salt Lake Valley and Denver PM2.5 pollution is ammonium nitrate aerosol, which forms from emissions of nitrogen oxides, volatile organic compounds (VOCs), and ammonia. Those reactions happen during winter temperature inversions, when warm air aloft traps cold air below, concentrating pollutants.

To combat wintertime PM2.5 pollution, scientists first needed a detailed understanding of the chemical processes that produce it. So in 2017, CIRES and NOAA researchers partnered with the University of Utah, the Utah Department of Environmental Quality, and others to measure PM2.5 and its precursor emissions at several ground sites in and around the Salt Lake Valley. Using the NOAA Twin Otter—a small, instrumented research airplane—the team also collected air samples throughout the pollution layer in the critical altitude region where particulate matter forms.

Based on the observations from the field campaign, Womack and her colleagues found that ozone and ammonium nitrate aerosol pollution are closely related, connected by the unusually named parameter “total odd oxygen.” Since the same chemical processes that form ozone pollution in the summer produce ammonium nitrate pollution in winter, strategies that have effectively controlled ozone could also limit production of ammonium nitrate.

In western valleys with high levels of ammonium nitrate aerosol, mitigation efforts have tended to focus first on controlling one component of the pollution: nitrogen oxides from burning fossil fuels. The researchers found this approach may actually increase ammonium nitrate pollution, at least initially. A potentially more effective way to reduce PM2.5 pollution would be to limit VOCs, according to the new assessment.

“Atmospheric scientists typically don’t look at wintertime air pollution in this way,” Womack said. “Our findings could hold true in other areas with severe winter aerosol pollution, including mountain valleys across the U.S. West and urban areas in East Asia, and Europe.”

PM2.5 pollution is a major cause of premature death worldwide—and besides negatively affecting human health, PM2.5 also affects agricultural yields, visibility, and possibly Earth’s climate.

Up next for the research team is a follow-on study that will look at wintertime air pollution across the entire U.S. West.

Karin Vergoth is a science writer for CIRES-NOAA. This post was also published on the CIRES website.


Roman mining activities polluted European air more heavily than previously thought

Tue, 05/07/2019 - 15:34

By Lauren Lipuma

Roman-era mining activities increased atmospheric lead concentrations by at least a factor of 10, polluting air over Europe more heavily and for longer than previously thought, according to a new analysis of ice cores taken from glaciers on France’s Mont Blanc.

Humans have mined metals since the 6th millennium BCE, but the Romans were the first European civilization to mass produce lead for water pipes and household items, and silver for coins. Mining and smelting release many types of pollutants into the air, including several toxic heavy metals.

Scientists have known the Romans mined lead but were not sure how much their mining activities may have polluted European air, for how long, or how the impact of Roman activities compared to more recent lead pollution.

The remains of Las Médulas, the most important gold mine in the Roman Empire, located in northwestern Spain. The spectacular landscape resulted from the Ruina Montium mining technique. Credit: Rafael Ibáñez Fernández, CC BY-SA 3.0

Now, concentrations of trace metals in some of Mont Blanc’s deepest ice show two spikes in atmospheric lead pollution over Europe during the Roman era, one in the second century BCE and one in the second century CE. Overall, Roman mining and smelting activities polluted the atmosphere for nearly 500 years and also contaminated Europe’s air with antimony, a toxic metalloid that can produce effects similar to arsenic poisoning, according to the new study.

The new study in AGU’s journal Geophysical Research Letters is one of the first to quantify atmospheric lead concentrations over Europe during antiquity, the time period spanning the height of ancient Greek and Roman cultures. Lead is one of the most dangerous environmental pollutants and is toxic to humans at extremely low levels.

The findings add to the evidence that humans have generated lead pollution at large scales for longer than previously thought, according to the study’s authors.

“Our very first study of pollution during the antiquity inferred from an alpine ice core allows us to better evaluate the impact of Roman emissions at the scale of Europe and to compare this old pollution to the recent pollution linked with the use of leaded gasoline in Europe between 1950 and 1985,” said Michel Legrand, an atmospheric scientist at the Université Grenoble Alpes in Grenoble, France, and co-author of the new study.

“This alpine ice shows that the lead emissions during the antiquity enhanced the natural level of lead by a factor of 10. For comparison, recent human activities related to the use of leaded gasoline in Europe enhanced the natural lead level by a factor of 50 to 100,” Legrand said. “Thus, the pollution by the Romans is five to 10 times less than that due to the recent use of gasoline but it took place for a long period of time – several centuries instead of 30 years of leaded gasoline use.”

The new results support previous research challenging the idea that environmental pollution began with the Industrial Revolution in the 1800s, according to Alex More, a climate historian at Harvard University who was not connected to the new study.

Current policies that set standards for acceptable levels of lead pollution use pre-industrial levels as their baseline. But the new findings suggest pre-industrial levels are not an accurate baseline and only levels from before the start of metallurgy can be considered natural, More said.

“Man-made air pollution has existed for a long time, and the baseline that we thought was natural is in fact not so,” More said. “All standards of pollution that rely on this assumption of a pre-modern, pre-industrial baseline, are wrong.”

The original plumbers

Historians credit ancient Rome with being the first civilization to mass produce lead and the Romans were the first to build large-scale plumbing systems with lead pipes. At the height of the Roman Empire, the Romans mined lead from many areas of Europe, including the Iberian Peninsula and Great Britain. Lead production declined after the fall of Rome in the 5th century and did not reach comparable levels until the Industrial Revolution.

Roman ingots of lead from the mines of Cartagena, Spain, housed in the Archaeological Municipal Museum of Cartagena. Credit: Nanosanchez; public domain.

Researchers had previously found lead in an ice core from Greenland that they connected to the detailed story of Roman mining activities, but because Greenland is so far from the pollution’s source, scientists have been unsure exactly what the lead concentrations were in European air at the time.

Several previous studies have looked at past lead contamination in ice cores from the Alps, but none had yet focused on the Roman Era. A 2017 study in AGU’s journal GeoHealth found lead mining activities in Europe during the Middle Ages plummeted to nearly zero during the Black Death pandemic of 1349 to 1353.

Metals in ice

In the new study, researchers measured concentrations of trace metals in an ice core taken from Mont Blanc, the highest peak in the Alps, to understand how Roman activities may have affected Europe’s environment. Studies of lake sediments and peat bogs have shown local lead pollution in some parts of Europe during this time, but ice cores provide better evidence for the European continent as a whole.

The new study provides a record of lead pollution over Europe for roughly the past 5,000 years, spanning the Bronze Age (3000 to 800 BCE), antiquity (800 BCE through the 5th century CE), and into the early Middle Ages.

The researchers found the Romans polluted European air for roughly 500 years, from around 350 BCE to 175 CE. Within that period, they found two times where lead pollution spiked to more than 10 times higher than background levels. The study can’t pinpoint the exact years, but the spikes occur around 250 BCE and 120 CE and may correspond to times of expansion and prosperity of Roman culture. The Roman Republic expanded to the entire Italian peninsula in the 3rd century BCE, and the Roman Empire expanded to most of mainland Europe in the 2nd century CE. By comparison, the Greenland ice core showed lead levels peaking at roughly four times the background level.

The arches of an elevated section of the Roman provincial Aqueduct of Segovia, in modern Spain. Roman aqueducts supplied water to public baths, latrines, fountains, and private households. They also supported mining operations, milling, farms, and gardens. Credit: Bernard Gagnon, CC BY-SA 3.0

Between the two spikes, the study found lead pollution dropped, although not to pre-Roman levels. This could correspond to the Crisis of the Roman Republic, a period of political instability that marked the transition from the Roman Republic to the Roman Empire from around 134 to 44 BCE, although the exact dates are uncertain.

The researchers also quantified antimony pollution during antiquity for the first time and found antimony concentrations at least six times higher than background levels during the Roman era. Lead ores commonly contain elements like arsenic, antimony, copper, silver and gold.

The findings show the Romans impacted air quality beyond simple lead pollution and their effect on the European atmosphere was longer-lived than previously thought, according to the study’s authors.

The ice core data gives scientists a better context for understanding how toxic modern air pollution is, according to More.

“Our ultimate goal is to show the man-made impact on the atmosphere for millennia now,” he said. “The baseline that we can now show is much more detailed, compared to modern times.”

Lauren Lipuma is a Senior Public Information Specialist/Writer at AGU.

