GeoSpace: Earth & Space Science

By AGU staff and collaborators

Earthquake in 2009 intensified American Samoa’s rising sea levels

Thu, 05/16/2019 - 14:01

The island hit hardest by the earthquake is sinking, potentially increasing coastal flooding and causing local sea levels to rise faster than the global average.

By Brendan Bane

The magnitude-8.1 Samoa earthquake of 2009 dealt severe damage to the Samoan Islands: tsunami waves as high as 14 meters (46 feet) wiped out multiple villages, claiming nearly 200 lives and severely damaging water and electrical systems.

New research reveals the damage is likely to continue on Tutuila, the main island of the U.S. territory of American Samoa. A new study shows the island is now sinking, a product of post-earthquake tectonic shifting that will likely continue for decades.

According to the new study, published in AGU’s Journal of Geophysical Research: Solid Earth, American Samoa’s sinking has intensified the island’s already rising sea levels. The authors predict that, as a consequence of the 2009 Samoa earthquake, sea levels around American Samoa will climb an additional 30-40 centimeters (12-16 inches) over the course of this century.

The island’s sea levels are now rising roughly five times faster than the global average, threatening regular coastal flooding in an area that has seen cyclones and other extreme weather in recent years, according to the new study.

Before the earthquake, American Samoa’s sea levels were already climbing two to three millimeters (0.08 to 0.12 inches) each year, a rate driven by the melting of polar ice and glaciers and by the warming, expanding ocean. Today, the study’s authors said, that climate-driven rise and the earthquake-driven subsidence climb in tandem.
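The combined effect described above is simple arithmetic: the relative sea level rise seen at the coast is the climate-driven rise plus the land's sinking. A back-of-envelope sketch using the article's figures; the midpoints and the 2100 horizon are illustrative assumptions, not values from the study:

```python
# Back-of-envelope combination of climate-driven sea level rise and
# earthquake-induced land subsidence. All inputs are illustrative:
# the rates come from the article, but the midpoints and the 2100
# horizon are assumptions, not values from the study.

climate_rate_mm_per_yr = 2.5     # midpoint of the 2-3 mm/yr cited above
extra_subsidence_cm = 35.0       # midpoint of the projected extra 30-40 cm
years_remaining = 2100 - 2019    # remainder of the century

# Climate-only rise over the rest of the century, in centimeters
climate_only_cm = climate_rate_mm_per_yr * years_remaining / 10.0

# Relative sea level rise at the coast = climate rise + land sinking
total_cm = climate_only_cm + extra_subsidence_cm

print(f"climate-only rise: {climate_only_cm:.1f} cm")
print(f"with subsidence:   {total_cm:.1f} cm")
```

Even under these rough assumptions, subsidence more than doubles the rise a coastal community would experience.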

“Before the earthquake, American Samoa was experiencing sea level rise that was roughly equal to the global average. But after the earthquake, that rate drastically increased,” said Shin-Chan Han, Professor of Geodesy at the University of Newcastle in Callaghan, Australia and lead author of the new study. “That’s alarming to me because of its many implications.”

Tremors with lasting impact
The Samoan Islands are an archipelago in the central South Pacific, comprising a handful of islands that are home to roughly 250,000 people. Tropical forests cover portions of the larger islands, which are among the largest of the Polynesian islands.

The Samoa earthquake was the largest of 2009 and gained international attention, as then-U.S. President Barack Obama declared it a major disaster, directing federal disaster aid to relief efforts. The Government of Samoa estimated the total cost of the earthquake’s damage to be just shy of $150 million.

The tremors were unique, according to the authors, in that they arose from two near-simultaneous earthquakes emanating from the northern tip of the Kermadec-Tonga subduction zone. The Samoan Islands sit within the Pacific Ring of Fire, a 40,000-kilometer (25,000-mile), volcanically active zone where several tectonic plates smash, grind and slide past one another, producing 90 percent of Earth’s earthquakes.

To better characterize the 2009 event, the authors assessed changes in Earth’s gravity field caused by tectonic activity from GRACE satellites, used GPS to track the land’s movement and analyzed past sea level changes by examining tide gauge records and satellite altimeter data. They then modeled the area’s tectonic activity to estimate how the land will continue shifting in response to the Samoa earthquake.

Crews working near the damage from the 2009 tsunami in American Samoa. Lorn Cramer/Flickr, Wikimedia Commons

The authors found that, because of the Samoan Islands’ placement around the fault zone, each island is responding differently. Samoa, for example, is now being pushed both horizontally and vertically at equal rates by tectonic shifting, according to the study. Tutuila, however, now moves mostly vertically, sinking into the Earth in a geological phenomenon known as subsidence, at a rate twice as fast as Samoa’s.

Because of this movement, the authors now consider American Samoa an “extreme case,” as tides may reach increasingly farther inland over the coming decades, potentially flooding the main road running along the island’s perimeter and near its coast.

“The ocean is eating up their land,” said Han. “The major road in American Samoa is around the coastal area, and the coastal area is where they will see the impact of nuisance flooding.”

Han said the study highlights the need for government agencies to re-evaluate sea level rise in affected areas after large earthquakes, as tectonic movement can greatly influence the rate at which sea levels rise and should be considered alongside climate-induced changes.

“When the land subsidence effect is not considered we may misinterpret sea level rise,” Han said. “Land motion is not ignorable. Sometimes, the land motion effect is greater than the climate change effect.”

Brendan Bane is a freelance science writer. 

The post Earthquake in 2009 intensified American Samoa’s rising sea levels appeared first on GeoSpace.

Study: U.S. methane emissions flat since 2006 despite increased oil and gas activity

Wed, 05/15/2019 - 14:00

By Theo Stein

Natural gas production in the United States has increased 46 percent since 2006, but there has been no significant increase in total U.S. methane emissions and only a modest increase from oil and gas activity, according to a new NOAA study.

The finding is important because it’s based on highly accurate measurements of methane collected over 10 years at 20 long-term sampling sites around the country in NOAA’s Global Greenhouse Gas Reference Network, said lead author Xin Lan, a CIRES scientist working at NOAA.

“We analyzed a decade’s worth of data and while we do find some increase in methane downwind of oil and gas activity, we do not find a statistically significant trend in the US for total methane emissions,” said Lan. The study was published in the AGU journal Geophysical Research Letters.

The study did not attempt to quantify oil and gas methane emissions or methane emissions overall, but sought only to determine whether emissions were increasing by looking at enhancements in atmospheric methane concentrations.

The new analysis showed methane emissions from oil and gas activity increasing by 3.4 percent ± 1.4 percent per year, as much as 10 times lower than in some recent studies, which derived their methane trends by measuring levels of another petroleum hydrocarbon, ethane. Overall, though, methane concentrations in U.S. air samples were increasing at the same rate as the global background, meaning there was no statistically significant increase in total methane from the U.S.

Many sources of methane
Methane is a component of natural gas, but it can also be generated by biological sources, such as decaying wetland vegetation, as a byproduct of ruminant digestion, or even by termites. Ethane is a hydrocarbon emitted during oil and natural gas production and is sometimes used as a tracer for oil and gas activity. By measuring ethane, which is not generated by biological processes, scientists had hoped to produce an accurate estimate of petroleum-derived methane emissions.

However, those studies assumed that the ratio of ethane to methane in natural gas produced by different oil and gas regions is constant. Instead, Lan said, the new NOAA analysis shows that ethane-to-methane ratios are increasing, and that has led to major overestimations of oil and gas emission trends in some previous studies.
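The overestimation Lan describes can be illustrated numerically. A minimal sketch with made-up numbers (not the study's data): if the ethane-to-methane ratio rises over time but an analysis assumes it is constant, the methane trend inferred from ethane measurements is inflated:

```python
# Illustrative only: how assuming a constant ethane-to-methane ratio
# inflates a methane trend inferred from ethane. All numbers are
# made up for illustration; they are not the study's data.

true_methane = [100.0, 102.0, 104.0]   # true emissions, rising ~2%/yr
ratio = [0.10, 0.12, 0.14]             # ethane/methane ratio, increasing
ethane = [m * r for m, r in zip(true_methane, ratio)]  # what gets measured

# An analysis that assumes the year-0 ratio holds for every year:
inferred_methane = [e / ratio[0] for e in ethane]

true_growth = true_methane[-1] / true_methane[0] - 1              # modest
inferred_growth = inferred_methane[-1] / inferred_methane[0] - 1  # inflated

print(f"true methane growth:     {true_growth:.0%}")
print(f"inferred methane growth: {inferred_growth:.0%}")
```

In this toy case the inferred growth is more than ten times the true growth, entirely because the ratio assumption is wrong.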

“What this means is if you want to track methane, you have to measure methane,” said Lan.

The quest to understand methane releases and leaks associated with oil and natural gas production has taken on a high profile in recent years as production has surged to historic levels in the US. Methane is 28 times more potent than carbon dioxide in trapping heat in the atmosphere over 100 years. It exerts the second largest influence on global warming behind carbon dioxide.

Global methane levels were nearly stable from 1999 through 2006 but have increased significantly since then. Some studies have suggested that U.S. oil and natural gas emissions contributed substantially to the post-2007 increase. Previous NOAA research suggests the global methane increase has been dominated by biogenic emissions.

Ten years of NOAA data analyzed
Lan led an analysis of data collected by a research team from NOAA’s Earth System Research Laboratory in Boulder, Colorado, and Lawrence Berkeley National Laboratory in Berkeley, California. The team studied air samples collected from aircraft flights at 11 sites and from 9 tall towers that are part of NOAA’s Global Greenhouse Gas Reference Network. Sampling with aircraft and tall towers lets scientists analyze gas concentrations both close to the ground, where emissions occur, and higher in the atmosphere, where the influence of recent surface emissions is minimal, helping them understand the gases’ fate. The sampling sites were established in locations that capture well-mixed air masses and avoid samples dominated by local sources.

Three of the five sampling sites located downwind of oil and natural gas production areas did show varying increases in methane, ethane and propane. This could be caused by a different makeup of the underlying oil and gas resource, or different activity levels driven by the price of oil, natural gas and other hydrocarbons, Lan said.

Lan’s study is one of the first to explore trends in methane data from sites established by the 2004 North American Carbon Program, a multi-agency research program focused on carbon sources and sinks in North America and its adjacent oceans, said Arlyn Andrews, chief of the NOAA Global Monitoring Division Carbon Cycle Group.

“With 20 sites across the country, we can make enough measurements to evaluate aggregate emissions at large regional scales,” she said. “If we had more sampling sites, we would be able to provide more specificity about methane sources in regions dominated by agriculture and oil and gas. These study results show the value of GMD’s high quality air sampling network over more than a decade of measurements.”

Theo Stein is a Public Affairs Officer for NOAA Communications. 


La Niña’s effect on droughts can be traced back to U.S. Civil War

Mon, 05/13/2019 - 20:20

By Joshua Rapp Learn

Cyclical variations in wind and sea surface temperatures in the Pacific Ocean may have contributed to a drought that played an important role in the outcome of the U.S. Civil War, according to a new study.

The new research used tree ring data to reconstruct the influence of El Niño and La Niña conditions on droughts across North America for the past 350 years, including during the American Civil War.

The Civil War drought, one of the worst to afflict the U.S. in centuries, lasted from the mid-1850s to the mid-1860s. The drought is infamous for its effects in the U.S. Southwest and parts of the Great Plains, where it contributed to the near extinction of the American bison and helped change the course of the Civil War by causing food and water shortages that slowed the advance of part of the Confederate army in 1862.

Max Torbenson coring a bristlecone pine in central Colorado. Photo by Daniel Griffin.

The drought effects extended far north of the core southwestern area usually impacted by La Niña, spreading into the Great Plains.

“It may very well be that [La Niña] played a significant role in the evolution of the sustained drought during the early 1860s,” said Max Torbenson, a geosciences PhD candidate at the University of Arkansas and the lead author of the new study in the AGU journal Paleoceanography and Paleoclimatology.

The El Niño/Southern Oscillation (ENSO) is a term for the cyclical variation in winds and sea surface temperatures that occurs in the tropical eastern Pacific Ocean. This includes the warm phase, called El Niño, and the cool phase, called La Niña, each lasting a few months and recurring every few years.

“These two phases affect the direction of storm tracks from the Pacific, and in turn influence how much rain falls, especially over the Southwest,” Torbenson said.

The magnitude of El Niño and La Niña conditions varies as well. Previous research has shown that stronger La Niña periods can cause severe droughts in the U.S. Southwest and Mexico, such as the one that afflicted Texas, New Mexico and parts of northern Mexico in 2011.

A tree ring core from a Ponderosa pine, a species used for the reconstructions. Photo by Daniel Griffin.

Researchers previously had only about 70 years of records showing how ENSO affected climate in parts of the U.S. Torbenson and his co-authors wanted to see whether they could push the record of ENSO’s influence on drought extent back before 1950.

To do that they tapped into the International Tree-Ring Data Bank, a public database of information gleaned from tree ring samples all around the world. Tree rings reveal past climate conditions through the thickness of each year’s growth: a thick tree ring means a year of abundant rain, while a series of thin rings in a row points to a multi-year drought. Because of the strong relationship between ENSO and winter rainfall, the rings can also tell the story of past La Niña and El Niño conditions.
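The ring-width logic above boils down to a correlation: wide rings track wet years, so ring width correlates with a rainfall index. A toy sketch with synthetic data (not the study's chronologies):

```python
# Toy illustration of the ring-width logic: wide rings track wet years,
# so ring width correlates with a rainfall index. The data below are
# synthetic, invented for illustration; they are not the study's data.

ring_width = [1.8, 0.6, 0.5, 1.4, 0.7, 1.9, 1.2, 0.4]  # mm of growth/year
rain_index = [1.5, 0.4, 0.3, 1.2, 0.6, 1.7, 1.0, 0.2]  # wet > 1, dry < 1

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

r = pearson(ring_width, rain_index)
print(f"ring width vs. rainfall: r = {r:.2f}")
```

A strong correlation like this is what lets researchers read rainfall, and hence ENSO conditions, out of old wood.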

The researchers focused on tree-ring chronologies from parts of northern Mexico, Texas and New Mexico, where ENSO’s effects are felt most strongly, and produced estimates of ENSO variability back to 1675. They then compared these estimates to drought reconstructions, based on local tree rings from other parts of the U.S., stored in the North American Drought Atlas and broken into a grid across the country.


Their results indicate that ENSO influence on drought has waxed and waned in areas far beyond the core southwestern U.S. and northern Mexico region. One notable signal they detected was the Civil War drought. During the mid-1800s, significant correlations between the ENSO estimates and drought reconstructions reached further east than at any other time, and included impacts over the Great Plains and even the confluence of the Mississippi and Ohio Rivers. The Civil War drought coincided with one of the most persistent La Niña periods in the estimates.

Torbenson said this long-term examination of the relationship between ENSO and droughts in the region could be a tool for predicting future drought conditions and for water management, especially in areas outside of the core ENSO region.

“There appears to be some pattern that could be helpful moving forward knowing when ENSO influences rainfall in certain areas,” such as eastern Texas and the Great Plains, he said. But the research itself is also compelling, he said, as it reveals the way the climate affected a critical period in U.S. history.

“I definitely think it’s something that makes us imagine the hardships of the past,” he said.

Joshua Rapp Learn is a freelance writer. Follow him on Twitter: @JoshuaLearn1


A new view of wintertime air pollution

Wed, 05/08/2019 - 15:38

Study could help improve air quality in cities across the U.S. West

By Karin Vergoth

The processes that create ozone pollution in the summer can also trigger the formation of wintertime air pollution, according to a new study led by Cooperative Institute for Research in Environmental Sciences (CIRES) and NOAA researchers. The team’s unexpected finding suggests that in the U.S. West and elsewhere, certain efforts to reduce harmful wintertime air pollution could backfire.

Specifically, targeting nitrogen oxides emitted by cars and power plants could at first actually increase harmful air pollution, the researchers report in their new paper, published today in the AGU journal Geophysical Research Letters.

“This is contrary to what is typically assumed and suggests a new way to mitigate this type of pollution in Salt Lake City, Denver, and beyond,” said Caroline Womack, a CIRES scientist working in the NOAA Earth System Research Laboratory and lead author of the study.

Regulations and cleaner technologies have steadily improved air quality in the United States. Yet valleys in western states still experience high levels of particulate matter (PM2.5), microscopic particles and droplets suspended in air, during the winter. In Utah’s urban Salt Lake Valley, wintertime levels of PM2.5 exceed national air quality standards an average of 18 days per year. Denver often has the same problem in winter, when brown clouds hang over the city.

Wintertime air pollution in the Salt Lake Valley. Credit: Alessandro Franchin, CIRES/NOAA

A major component of the Salt Lake Valley and Denver PM2.5 pollution is ammonium nitrate aerosol, which forms from emissions of nitrogen oxides, volatile organic compounds (VOCs), and ammonia. Those reactions happen during winter temperature inversions, when warm air aloft traps cold air below, concentrating pollutants.

To combat wintertime PM2.5 pollution, scientists first needed a detailed understanding of the chemical processes that produce it. So in 2017, CIRES and NOAA researchers partnered with the University of Utah, the Utah Department of Environmental Quality, and others to measure PM2.5 and its precursor emissions at several ground sites in and around the Salt Lake Valley. Using the NOAA Twin Otter—a small, instrumented research airplane—the team also collected air samples throughout the pollution layer in the critical altitude region where particulate matter forms.

Based on the observations from the field campaign, Womack and her colleagues found that ozone and ammonium nitrate aerosol pollution are closely related, connected by the unusually named parameter “total odd oxygen.” Since the same chemical processes that form ozone pollution in the summer produce ammonium nitrate pollution in winter, strategies that have effectively controlled ozone could also limit production of ammonium nitrate.

In western valleys with high levels of ammonium nitrate aerosol, mitigation efforts have tended to focus first on controlling one component of the pollution: nitrogen oxides from burning fossil fuels. The researchers found this approach may actually increase ammonium nitrate pollution, at least initially. A potentially more effective way to reduce PM2.5 pollution would be to limit VOCs, according to the new assessment.

“Atmospheric scientists typically don’t look at wintertime air pollution in this way,” Womack said. “Our findings could hold true in other areas with severe winter aerosol pollution, including mountain valleys across the U.S. West and urban areas in East Asia, and Europe.”

PM2.5 pollution is a major cause of premature death worldwide. Besides harming human health, PM2.5 also affects agricultural yields, visibility, and possibly Earth’s climate.

Up next for the research team is a follow-on study that will look at wintertime air pollution across the entire U.S. West.

Karin Vergoth is a science writer for CIRES-NOAA. This post was also published on the CIRES website.


Roman mining activities polluted European air more heavily than previously thought

Tue, 05/07/2019 - 15:34

By Lauren Lipuma

Roman-era mining activities increased atmospheric lead concentrations by at least a factor of 10, polluting air over Europe more heavily and for longer than previously thought, according to a new analysis of ice cores taken from glaciers on France’s Mont Blanc.

Humans have mined metals since the 6th millennium BCE, but the Romans were the first European civilization to mass produce lead, for water pipes and household items, and silver, for coins. Mining and smelting release many types of pollutants into the air, including several toxic heavy metals.

Scientists knew the Romans mined lead but were not sure how much their mining activities polluted European air, for how long, or how the impact compared to more recent lead pollution.

The remains of Las Médulas, the most important gold mine in the Roman Empire, located in northwestern Spain. The spectacular landscape resulted from the Ruina Montium mining technique. Credit: Rafael Ibáñez Fernández, CC BY-SA 3.0

Now, concentrations of trace metals in some of Mont Blanc’s deepest ice show two spikes in atmospheric lead pollution over Europe during the Roman era, one in the second century BCE and one in the second century CE. Overall, Roman mining and smelting activities polluted the atmosphere for nearly 500 years and also contaminated Europe’s air with antimony, a toxic metalloid that can produce effects similar to arsenic poisoning, according to the new study.

The new study in AGU’s journal Geophysical Research Letters is one of the first to quantify atmospheric lead concentrations over Europe during antiquity, the time period spanning the height of ancient Greek and Roman cultures. Lead is one of the most dangerous environmental pollutants and is toxic to humans at extremely low levels.

The findings add to the evidence that humans have generated lead pollution at large scales for longer than previously thought, according to the study’s authors.

“Our very first study of pollution during the antiquity inferred from an alpine ice core allows us to better evaluate the impact of Roman emissions at the scale of Europe and to compare this old pollution to the recent pollution linked with the use of leaded gasoline in Europe between 1950 and 1985,” said Michel Legrand, an atmospheric scientist at the Université Grenoble Alpes in Grenoble, France, and co-author of the new study.

“This alpine ice shows that the lead emissions during the antiquity enhanced the natural level of lead by a factor of 10. For comparison, recent human activities related to the use of leaded gasoline in Europe enhanced the natural lead level by a factor of 50 to 100,” Legrand said. “Thus, the pollution by the Romans is five to 10 times less than that due to the recent use of gasoline but it took place for a long period of time – several centuries instead of 30 years of leaded gasoline use.”

The new results support previous research challenging the idea that environmental pollution began with the Industrial Revolution in the 1800s, according to Alex More, a climate historian at Harvard University who was not connected to the new study.

Current policies that set standards for acceptable levels of lead pollution use pre-industrial levels as their baseline. But the new findings suggest pre-industrial levels are not an accurate baseline and only levels from before the start of metallurgy can be considered natural, More said.

“Man-made air pollution has existed for a long time, and the baseline that we thought was natural is in fact not so,” More said. “All standards of pollution that rely on this assumption of a pre-modern, pre-industrial baseline, are wrong.”

The original plumbers

Historians credit ancient Rome with being the first civilization to mass produce lead and the Romans were the first to build large-scale plumbing systems with lead pipes. At the height of the Roman Empire, the Romans mined lead from many areas of Europe, including the Iberian Peninsula and Great Britain. Lead production declined after the fall of Rome in the 5th century and did not reach comparable levels until the Industrial Revolution.

Roman ingots of lead from the mines of Cartagena, Spain, housed in the Archaeological Municipal Museum of Cartagena. Credit: Nanosanchez; public domain.

Researchers had previously found lead in an ice core from Greenland that they connected to the detailed story of Roman mining activities, but because Greenland is so far from the pollution’s source, scientists have been unsure exactly what the lead concentrations were in European air at the time.

Several previous studies have looked at past lead contamination in ice cores from the Alps, but none had yet focused on the Roman era. A 2017 study in AGU’s journal GeoHealth found that lead mining activities in Europe plummeted to nearly zero during the Black Death pandemic of 1349 to 1353.

Metals in ice

In the new study, researchers measured concentrations of trace metals in an ice core taken from Mont Blanc, the highest peak in the Alps, to understand how Roman activities may have affected Europe’s environment. Studies of lake sediments and peat bogs have shown local lead pollution in some parts of Europe during this time, but ice cores provide better evidence for the European continent as a whole.

The new study provides a record of lead pollution over Europe for roughly the past 5,000 years, spanning the Bronze Age (3000 to 800 BCE), antiquity (800 BCE through the 5th century CE), and into the early Middle Ages.

The researchers found the Romans polluted European air for roughly 500 years, from around 350 BCE to 175 CE. Within that period, they found two times where lead pollution spiked to more than 10 times higher than background levels. The study can’t pinpoint the exact years, but the spikes occur around 250 BCE and 120 CE and may correspond to times of expansion and prosperity of Roman culture. The Roman Republic expanded to the entire Italian peninsula in the 3rd century BCE, and the Roman Empire expanded to most of mainland Europe in the 2nd century CE. By comparison, the Greenland ice core showed lead levels peaking at roughly four times the background level.

The arches of an elevated section of the Roman provincial Aqueduct of Segovia, in modern Spain. Roman aqueducts supplied water to public baths, latrines, fountains, and private households. They also supported mining operations, milling, farms, and gardens. Credit: Bernard Gagnon, CC BY-SA 3.0

Between the two spikes, the study found lead pollution dropped, although not to pre-Roman levels. This could correspond to the Crisis of the Roman Republic, a period of political instability that marked the transition from the Roman Republic to the Roman Empire from around 134 to 44 BCE, although the exact dates are uncertain.

The researchers also quantified antimony pollution during antiquity for the first time and found antimony concentrations at least six times higher than background levels during the Roman era. Lead ores commonly contain elements like arsenic, antimony, copper, silver and gold.

The findings show the Romans impacted air quality beyond simple lead pollution and their effect on the European atmosphere was longer-lived than previously thought, according to the study’s authors.

The ice core data gives scientists a better context for understanding how toxic modern air pollution is, according to More.

“Our ultimate goal is to show the man-made impact on the atmosphere for millennia now,” he said. “The baseline that we can now show is much more detailed, compared to modern times.”

Lauren Lipuma is a Senior Public Information Specialist/Writer at AGU.


New study takes a deep look into the clouds of Venus

Mon, 04/29/2019 - 17:52


By Nanci Bompey

Animation showing the varied morphology of Venus’s middle clouds, with vertical perspective projections of 900 nm images from the IR1 camera aboard Akatsuki (JAXA). The vertical perspective shows the dayside of Venus, with the central point at the intersection of the equator and local noon (12:00). The dashed lines correspond (from top to bottom) to latitudes 60°N, 30°N, the equator, 30°S and 60°S, and to local times 15:00, 12:00 and 9:00 (from left to right). Credit: JAXA.

Venus is known for the sulfuric acid clouds that cover the entire planet and for its powerful winds, which reach speeds of hundreds of kilometers per hour. Scientists have had a hard time studying our neighboring planet in depth, however, mainly because of its thick cloud cover.

Now, researchers have managed to visualize what happens in the middle clouds of this thick layer using infrared images, and they have run into unexpected surprises.

The new study, published in the American Geophysical Union’s (AGU) journal Geophysical Research Letters, shows that this middle cloud layer exhibits a wide variety of cloud patterns that change over time and turn out to be very different from those seen in Venus’s upper clouds, which are usually observed in ultraviolet images. The study also reveals that the albedo of the middle clouds (the degree to which the clouds reflect sunlight) is highly variable, which could indicate the presence of water, methane or other compounds capable of absorbing the Sun’s infrared radiation.

By combining middle-cloud wind speeds from different space missions, the researchers were also able to reconstruct the behavior of Venus’s winds over 10 years, showing that in the middle clouds these strong winds tend to be fastest at the equator and that, as with the upper clouds, the winds vary over time.

Venus’s middle clouds observed in the dusk region with images from the IR1 camera aboard JAXA’s Akatsuki spacecraft. This image was taken on July 1, 2016 and shows clouds with sharp discontinuities as well as clear differences between the planet’s two hemispheres. Credit: JAXA.

These new observations could help scientists better understand our neighboring planet, as well as shed light on the study of exoplanets with Venus-like characteristics.

“We have observed completely unexpected phenomena,” said Javier Peralta, an ITYF researcher at the Japan Aerospace Exploration Agency (JAXA) and lead author of the new study. “We have discovered that Venus’s middle clouds are not as calm or as boring as they seemed in previous space missions.”

Venus’s middle clouds observed in the dawn region with images from the IR1 camera aboard JAXA’s Akatsuki spacecraft. This image was taken on May 17, 2016 and shows an example of the periodic darkening (every 4-5 days) that the northern hemisphere’s clouds underwent during this phase of the mission. Credit: JAXA.

Observing the clouds of Venus
The new work uses images from JAXA’s Akatsuki spacecraft, which arrived at Venus in December 2015 and whose main mission is to understand the superrotation of Venus’s atmosphere. A planet’s atmosphere is said to be in superrotation when it spins much faster than the planet itself. This phenomenon, which remains unexplained, occurs not only on Venus but also on the moon Titan and on many exoplanets. Venus is without doubt one of the most extreme cases: while the planet’s rotation period is 243 Earth days, its atmosphere at cloud level moves 60 times faster, taking barely four days to complete a full lap around the planet.
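The superrotation figures in the paragraph above are internally consistent, as a quick arithmetic check shows:

```python
# Consistency check of the superrotation figures quoted above:
# Venus rotates once every ~243 Earth days, while its cloud-level
# atmosphere moves ~60 times faster than the planet's surface.

venus_rotation_days = 243
atmospheric_speedup = 60

cloud_lap_days = venus_rotation_days / atmospheric_speedup
print(f"cloud-level atmosphere laps the planet in ~{cloud_lap_days:.1f} days")
```

Dividing the 243-day rotation period by the factor of 60 gives roughly four days per lap, matching the figure in the text.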

For the new study, the researchers analyzed nearly 1,000 images of Venus’s clouds taken by Akatsuki’s IR1 camera over the course of a year. The camera was designed to observe the middle cloud layer, located between 50 and 55 kilometers above the planet’s surface. Infrared images were used because photons at these wavelengths can penetrate much deeper into Venus’s thick cloud deck before being reflected, allowing deeper clouds to be seen.

“Although the middle clouds of Venus had been observed by some previous space missions, we had never been able to study them as continuously and in as much detail as with Akatsuki. It is impossible to know how they evolve unless instruments can study them over an extended period,” said Javier Peralta.

900-nm images, taken by the IR1 camera onboard JAXA’s Akatsuki spacecraft, showing the strong variability of Venus’s middle clouds during 2016. The albedo of these clouds shows strong asymmetries between the two hemispheres, stripes oriented parallel to the equator, and sharp discontinuities. The images were taken (from left to right) on 2, 3 and 17 May, 23 June and 1 July. Credits: JAXA.

These new Akatsuki images show that Venus’s middle clouds not only undergo major changes over time but also turn out to be very different from the clouds of the upper layer (at about 70 kilometers altitude). At times, the images show bands of dark clouds being invaded by brighter clouds with spiral or mottled shapes, suggesting atmospheric convection in the form of vertical heat transfer. This is very interesting because on Earth convection can cause storms. At other times, Akatsuki’s images show bright, homogeneous clouds with a less turbulent appearance, little local contrast and multiple stripes.

Between April and May 2016, Venus’s northern hemisphere began to darken every four or five days. This difference in behavior between the clouds of the northern and southern hemispheres is genuinely new and still unexplained. The images also reveal unusual cloud structures, such as a dark, hook-shaped filament more than 7,300 kilometers long that appeared in the northern hemisphere in May 2016 and was observed again in October by amateur astronomers.

The observations have also shown that the albedo of the middle clouds can exhibit much more contrast than reported by previous missions. The study’s authors suggest this could be due to the presence of compounds capable of absorbing the Sun’s infrared radiation. Alternatively, these contrasts could indicate major changes in the thickness of the clouds.

The researchers also reconstructed the behavior of Venus’s winds over 10 years by combining Akatsuki’s observations with those of amateur astronomers and of past missions such as ESA’s Venus Express and NASA’s MESSENGER. This revealed that the super-rotation of the middle clouds is sometimes fastest at the equator, and that the super-rotation can vary by up to 50 kilometers per hour over the course of several months.

900-nm images, taken by the IR1 camera onboard JAXA’s Akatsuki spacecraft, showing the strong variability of Venus’s middle clouds during 2016. The albedo of these clouds shows strong asymmetries between the two hemispheres, stripes oriented parallel to the equator, and sharp discontinuities. The images were taken (from left to right) on 3 and 17 May, 23 June and 1 July. Credits: JAXA.

Trying to understand Venus’s super-rotation
“The results of this work can help us better understand Venus’s super-rotation. Factors such as friction between the surface and the atmosphere, the generation of stationary waves through the interaction of the wind with surface elevations, or the solar heating of the atmosphere could play a decisive role in maintaining the super-rotation or in shaping its long-term evolution,” said Javier Peralta.

“Studying Venus’s clouds and winds at different heights is crucial, since most of the energy Venus receives from the Sun is absorbed in the cloud layer, and it is also in the clouds that the super-rotation reaches its highest speed,” Peralta noted. In fact, the albedo of Venus’s clouds and its variability are suspected to be intimately linked to the super-rotation and to how the atmosphere’s “momentum” and energy are redistributed.

“Moreover, we cannot rule out that the mechanism behind the super-rotation is linked to Venus’s strong greenhouse effect, and this could help us better understand the implications of climate change on Earth,” argued Javier Peralta. “Likewise, these results could shed new light on studies of super-rotation in other solar system bodies, such as Saturn’s moon Titan, or in exoplanets orbiting very close to their stars,” he said.

Nanci Bompey is AGU’s Assistant Director of Public Information.

The post Un nuevo estudio profundiza en las Nubes de Venus appeared first on GeoSpace.

New research takes deeper look at Venus’s clouds

Mon, 04/29/2019 - 17:51

Click here for this blog in Spanish

By Nanci Bompey

Venus is known for its clouds of sulfuric acid covering the entire planet and its super-fast winds moving at hundreds of kilometers per hour, but our neighboring planet’s thick clouds make it difficult for scientists to peer deep inside its atmosphere.

Animation of the middle clouds of Venus as observed with the 900-nm images from Akatsuki/IR1. Credit: Javier Peralta

Now, researchers have used infrared images to spy into the middle layer of Venus’s clouds, and they have found some surprises.

The new research, published in the AGU journal Geophysical Research Letters, finds this middle layer of clouds shows a wide variety of cloud patterns that change over time and are very different from the upper layer of Venus’s clouds, which are usually studied with ultraviolet images. The study also found changes in the albedo of the middle clouds, or how much sunlight they are reflecting back to space, which could indicate the presence of water, methane or other compounds absorbing solar radiation.

The middle clouds of Venus as observed on the evening side at 900 nm by the IR1 camera onboard JAXA’s orbiter Akatsuki. This image was acquired on 1 July 2016 and exhibits an example of the hemispherical asymmetry and sharp contrasts apparent in the albedo. Credits: JAXA.

The motions of the middle clouds, combined with previous observations, allowed researchers to reconstruct a picture of the winds on Venus over 10 years, showing the super-fast winds in the planet’s middle clouds are fastest at the equator and, like the upper clouds, change speed over time.

These new observations could help scientists better understand our neighboring planet and shed light on other planets and exoplanets with similar features, according to the study’s authors.

“We observed completely unexpected events,” said Javier Peralta, ITYF researcher at the Japan Aerospace Exploration Agency (JAXA) and lead author of the new study. “We have discovered that the middle clouds are not as quiet or as boring as they seemed during previous missions.”

The middle clouds of Venus as observed on the morning side at 900 nm by the IR1 camera onboard JAXA’s orbiter Akatsuki. This image was acquired on 17 May 2016 and exhibits an example of the hemispherical asymmetry in the albedo, which reappeared every 4-5 days during this stage of the mission. Credits: JAXA.

Observing Venus’s clouds
The new study used images taken by JAXA’s Akatsuki spacecraft, which arrived at Venus in December 2015 and whose main goal is to understand Venus’s super-rotation. Super-rotation is a puzzling phenomenon also seen on Titan and many exoplanets that makes the atmosphere move much faster than the solid planet. It takes Venus 243 Earth days to complete a rotation. However, it takes only four Earth days for the planet’s atmosphere to go all the way around Venus – about 60 times faster than the planet’s rotation.
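The mismatch between the planet’s spin and its winds can be checked with simple arithmetic, using only the two periods quoted above:

```python
# Quick check of the super-rotation figures: 243-day planetary rotation vs.
# a 4-day atmospheric circuit at cloud level.
venus_rotation_days = 243  # one rotation of the solid planet, in Earth days
cloud_circuit_days = 4     # one full circuit of the cloud-level atmosphere

ratio = venus_rotation_days / cloud_circuit_days
# 243 / 4 = 60.75, consistent with the article's "about 60 times faster"
print(f"The atmosphere laps the planet roughly {ratio:.0f} times per rotation")
```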

In the new study, researchers analyzed nearly 1,000 infrared images of Venusian clouds captured by one of Akatsuki’s cameras over one year. The camera was designed to observe the middle cloud layer, which sits 50 to 55 kilometers above the planet’s surface. Photons at infrared wavelengths can penetrate deeper into the clouds before being reflected, allowing scientists to peer deeper into this cloud layer.
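Why longer-wavelength photons probe deeper can be sketched with the textbook Beer-Lambert attenuation law; the optical depths below are illustrative values, not numbers from the study’s radiative-transfer modeling:

```python
import math

# Beer-Lambert attenuation: the transmitted fraction of light is exp(-tau),
# where tau is the optical depth. A wavelength at which the clouds are less
# opaque (smaller tau) lets photons reach deeper layers before reflection.
for tau in (0.5, 2.0, 5.0):  # illustrative optical depths
    transmitted = math.exp(-tau)
    print(f"optical depth {tau}: {transmitted:.1%} of photons get through")
```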

Previous missions studying Venus’s top-most clouds have seen glimpses of the middle cloud layer but had been unable to get a good, long look at it with infrared images. In order to see how the middle clouds evolve, instruments have to look at them for a longer time than was done during previous missions, according to Peralta.

The strong variability of the middle clouds of Venus as shown in 900-nm images acquired by the IR1 camera onboard JAXA’s orbiter Akatsuki during 2016. Clear hemispherical asymmetries, zonally oriented stripes and sharp discontinuities are visible in the middle clouds’ albedo. Image dates (from left to right): 2, 3 and 17 May, 23 June and 1 July. Credits: JAXA.

The new images taken by Akatsuki show the middle layer of clouds changes over time and is also very different from Venus’s upper cloud layer, which sits at a height of about 70 kilometers. Sometimes the images show a slightly darker band of clouds invaded by bright clouds that at times exhibit swirl shapes or look mottled. These observations are suggestive of convection, the vertical movement of heat and moisture in the atmosphere. On Earth, convection can cause thunderstorms. At other times, the images showed clouds that are less turbulent and appear homogeneously bright or featureless, with multiple stripes.

From April to May of 2016, Venus’s northern hemisphere became periodically darkened every four to five days. Scientists had not previously observed this difference between the hemispheres and the cause is yet to be determined, according to the new study. The images also showed other rare cloud features, including a hook-like dark filament extending more than 7,300 kilometers in the northern hemisphere in May and October of 2016.

Akatsuki also saw unexpectedly high contrasts in the cloud albedo. The new study suggests there could be compounds in the cloud layer able to absorb at infrared wavelengths or, alternatively, there could be changes in the thickness of the clouds.

The scientists have also reconstructed Venus’s winds over 10 years by combining the Akatsuki images with observations by amateur observers and past missions like ESA’s Venus Express and NASA’s MESSENGER mission. They found the super-rotating winds in Venus’s middle clouds are sometimes fastest at the equator and their speed could change by up to 50 kilometers per hour over several months.

The strong variability of the middle clouds of Venus as shown in 900-nm images acquired by the IR1 camera onboard JAXA’s orbiter Akatsuki during 2016. Clear hemispherical asymmetries, zonally oriented stripes and sharp discontinuities are visible in the middle clouds’ albedo. Image dates (from left to right): 3 and 17 May, 23 June and 1 July. Credits: JAXA.

Understanding Venus’s super-rotation
The findings could help scientists better understand Venus’s super-rotation. The frictional drag and mountain waves caused by Venus’s surface or the periodic heating from the Sun are factors that could be playing a key role in the maintenance of the super-rotation by slowing down or accelerating the winds and defining its long-term evolution, according to Peralta.

Since most of the solar energy is absorbed in the cloud layers and the fastest super-rotating winds also occur there, studying several layers of the clouds is critical to understanding the winds, according to Peralta. Scientists suspect changes in Venus’s clouds and their albedo could be linked to the planet’s super-rotation, and how the wind’s momentum and energy is transported.

Uncovering the cause of the super-rotation on Venus and its potential connection to the planet’s runaway greenhouse effect might help scientists understand changes on Earth related to climate change, Peralta said. It could also shed light on the atmospheric super-rotation of other bodies in our solar system like Saturn’s moon Titan, and exoplanets orbiting very close to their stars, he said.

Nanci Bompey is AGU’s Assistant Director of Public Information. 

The post New research takes deeper look at Venus’s clouds appeared first on GeoSpace.

Uncovering polynya: new research unravels 43-year-old Antarctic mystery

Wed, 04/24/2019 - 19:00

Researchers at NYU Abu Dhabi have discovered how the Maud-Rise Polynya, initially spotted in Antarctica in 1974, reappeared in September 2017 at the same location.

Abu Dhabi — A study led by NYU Abu Dhabi (NYUAD) Research Scientist Diana Francis has unraveled the four-decade-long mystery surrounding the occurrence of a mid-sea Polynya – a body of unfrozen ocean that appeared within a thick body of ice during Antarctica’s winter almost two years ago. The new study has been published in AGU’s Journal of Geophysical Research: Atmospheres.

Figure 1: MODIS satellite visible imagery showing the location of the Maud Rise polynya in the Lazarev Sea to the east of the Weddell Sea and the Antarctic Peninsula. Satellite images are from NASA Worldview and adapted by the authors.

The Maud-Rise Polynya was spotted in mid-September 2017 in the center of an ice pack in Antarctica’s Lazarev Sea, causing researchers to question how this phenomenon occurred during Antarctica’s coldest winter months, when ice is at its thickest. Because of the polynya’s difficult-to-access location, NYUAD scientists used a combination of satellite observations and reanalysis data to discover that cyclones (as intense as force 11 on the Beaufort scale) and the strong winds they carry over the ice pack cause the ice to shift in opposite directions, which leads to the opening of the Polynya.

At the time of the discovery, the Maud-Rise Polynya covered approximately 9,500 square kilometers (equivalent to the landmass of the state of Connecticut), and it grew by over 740 percent to 800,000 square kilometers within a month. Eventually, the Polynya merged with the open ocean once the ice started to retreat at the beginning of the austral summer. Prior to 2017, this phenomenon had only been known to occur in the 1970s, when satellite observations first became more commonly used, and it has baffled scientists ever since.

Sketch summarizing the mechanisms by which the cyclone opens the polynya. Credit: Francis, et al., 2019

“Once opened, the Polynya works like a window through the sea ice, transferring huge amounts of energy between the ocean and the atmosphere during winter,” said Francis. “Because of their large size, mid-sea Polynyas are capable of impacting the climate regionally and globally, as they modify the oceanic circulation. It is important for us to identify the triggers for their occurrence to improve their representation in the models and their effects on climate.

“Given the link between Polynyas and cyclones we demonstrated in this study, it is speculated that Polynya events may become more frequent under a warmer climate, because these areas will be exposed to more intense cyclones. Previous studies have shown that under a warmer climate, polar cyclone activity will intensify and extratropical cyclone tracks will move toward Antarctica, which could decrease the sea-ice extent and bring Polynya areas closer to the cyclone formation zone,” she added.

This post was shared from NYU Abu Dhabi

The post Uncovering polynya: new research unravels 43-year-old Antarctic mystery appeared first on GeoSpace.

Aurora create speed bumps in space

Tue, 04/23/2019 - 14:00

By Liza Lester

A new study fills in gaps in a long-running mystery of what causes “speed bumps” in space that slow satellites in orbit.

Using new data from the Rocket Experiment for Neutral Upwelling 2 (RENU2) mission, the new study in AGU’s journal Geophysical Research Letters finds a type of high-altitude aurora is responsible, at least in part, for moving pockets of air high into the atmosphere, where they can cause drag on passing satellites.

The aurora seen dancing in this video through the glass bubble of Kjell Henrikson Observatory in Svalbard are not the typical bright ribbons of light seen at night in Earth’s high latitudes. Known as Poleward Moving Auroral Forms (PMAF), they are less energetic, and though they can be seen from the ground, the observer needs a very dark location to catch a glimpse.

PMAF are dim and distant. But because they occur at such high altitudes, these lower-energy auroras transfer more of their energy to the thin atmosphere at 250-400 kilometers (150-250 miles) above the ground, and produce more interesting effects than more familiar aurora, which sparkle at closer to 100 kilometers (60 miles) up.

When early space programs first put satellites into orbit, they noticed degradation of the satellites’ orbits when the sun was active. Something was creating drag on the satellites, like driving a car into a strong headwind. This is a problem, because when the extra drag slows the satellites down, they move closer to Earth. Without extra fuel to boost them back up to speed, they will eventually fall back to Earth.

Scientists developed a hypothesis that air from lower in Earth’s atmosphere was welling up into the normally wispy upper reaches of the atmosphere and dragging on satellites in low Earth orbit. In the early 2000s, new data showed some of the upwelling that occurs within these “space speed bumps” consists of sharp, intense spikes.
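The drag the hypothesis invokes follows the standard aerodynamic drag formula; the sketch below uses illustrative values for a small satellite near 400 kilometers altitude, which are assumptions for scale rather than numbers from the study:

```python
# Standard aerodynamic drag on an orbiting satellite (textbook formula, not the
# study's model):  F = 0.5 * rho * v**2 * Cd * A
# All values are illustrative for a small satellite near 400 km altitude.
rho = 5e-12  # kg/m^3, typical thermospheric density near 400 km (highly variable)
v = 7700.0   # m/s, orbital speed in low Earth orbit
Cd = 2.2     # drag coefficient commonly assumed for satellites
A = 1.0      # m^2, cross-sectional area
m = 100.0    # kg, satellite mass

force = 0.5 * rho * v**2 * Cd * A  # drag force, newtons
decel = force / m                  # deceleration, m/s^2
print(f"drag force ~ {force:.2e} N, deceleration ~ {decel:.2e} m/s^2")
# Drag is linear in density: an upwelling event that doubles the local air
# density doubles the drag force on the satellite.
```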

Marc Lessard, a physicist at the University of New Hampshire and lead author of the new study, and his colleagues began to suspect that aurora may be instigating the upwelling events, because poleward moving auroral forms seemed to be coincidentally present at the right time. To find out, they needed a closer look at the development of the speed bumps.

RENU2 lifted off from Svalbard on the day this video was captured, carrying instruments up into an upwelling event to measure it in action. The new study found that repeated passes of poleward moving auroral arcs injected enough energy to produce the upwelling of air from up to several hundred kilometers lower in Earth’s atmosphere. RENU2 observed a more complex structure to the upwelling events than expected, more like the rising bubbles of a lava lamp than a smooth wave.

The video was taken by Fred Sigernes looking straight up through a glass dome at Kjell Henrikson Observatory in Svalbard. A shadowy figure clears the glass of snow and ice (with the help of a little Rasputin vodka) to reveal the dance of the poleward moving auroral forms.

Liza Lester is a senior public information specialist and writer at AGU. Follow her on twitter at @lizalester.

Video Credit: Fred Sigernes/Kjell Henrikson Observatory.

The post Aurora create speed bumps in space appeared first on GeoSpace.

Microbes hitch a ride on high-flying dust

Mon, 04/22/2019 - 15:00
High-altitude dust may disperse bacterial and fungal pathogens for thousands of miles, seeding far-flung ecosystems and potentially impacting human health

By Mary Caperton Morton

Dust doesn’t just accumulate under your bed. It can also travel for thousands of kilometers, across continents and oceans.

A new study analyzed the microbial content of dust particles transported from the deserts of central Asia to South Korea and Japan. The new research shows dust can carry potential pathogens to far-flung places, potentially affecting natural ecosystems and human health.

Tucked in the rain shadow of the Tibetan plateau, central Asia is home to the sprawling Gobi and Taklimakan deserts. During the spring, summer, and fall, steady winds blow the top layers of sandy soil east, along with any microbes, such as bacteria and fungi, living on the surface. These winds disperse the particles over eastern Asia, Japan and across the Pacific.

Long-range transport of dust particles has been tracked for decades using satellites, but scientists had not looked at what types of microbes they could be carrying.

The new study, published in AGU’s Journal of Geophysical Research: Atmospheres, collected dust samples from South Korea and Japan, and used DNA sequencing to detect the presence of hitchhiking pathogenic bacterial species.

Red and blue lines show the trajectories of dust particles from the deserts of central Asia to Japan. Red trajectories are dust particles that arrived at Yonago, Japan, at altitudes of 1,000 meters; blue trajectories arrived at 3,000 meters. The paths were calculated using the National Oceanic and Atmospheric Administration Hybrid Single Particle Lagrangian Integrated Trajectory model.

The new study finds some dust events were transporting as many as 400 different species of bacteria, including some potentially harmful to human health, such as Staphylococcus and Bacillus species.

“Our data shows that dust mineral particles include the nucleotide fragments of pathogenic species,” said lead author Teruya Maki of Kanazawa University in Japan. “Additionally, we have demonstrated that the fungi associated with dust events can enhance allergen levels by as much as 10 times [above baseline allergen levels].” The transport of bacteria such as Staphylococcus and Bacillus species could also lead to disease outbreaks in new human populations, he said.

The team is the first to demonstrate the long-range transport of dust-associated microbes at high altitudes using a unique system of helicopter and balloon sampling methods, Maki said.

Along with sampling the dust using helicopters and balloons, the team also relied on two continuous monitoring stations installed on 10-meter-high platforms for four months of the 2015 dust season. The stations were situated downwind of the Asian dust source regions, in Yongin, South Korea, and Yonago, Japan, which fall along the main Asian dust transport trajectory.

After collecting the samples, the team used fluorescent microscopic observation techniques to identify dust, biological particles and pollutants such as black carbon. They then extracted and sequenced short sections of genomic DNA from the biological particles to classify the microorganisms.

The team of scientists from Japan, Korea, Singapore and New Zealand found bacterial content of the dust was highest in early spring, while bacterial communities flourished from early spring into late summer. They also found the dust picked up additional bacteria as it traveled across new terrain: Samples collected on the island of Japan had higher contents of marine bacteria after being transported over water.

“These results indicate that bacteria in Asian dust transported over long distances, including oceans, may increase in community variation, and that variation is also associated with the seasonal changes of airborne bacteria from spring to summer,” they wrote in the new study.

Dust events are expected to intensify worldwide in the coming decades due to climate change. Deserts are likely to become more arid and more frequent extreme weather events could stir up prevailing wind patterns.

This could enhance the long-range transport of disease-causing microbes and atmospheric pollutants such as black carbon.

Developing a database of atmospheric microbes could help provide air quality information for public health purposes, Maki said.

“As new molecular biological techniques are rapidly being developed, these cutting-edge techniques can be used for analyzing dust samples collected at high altitudes,” he said.

Mary Caperton Morton is a freelance science writer. Follow her on Twitter at @theblondecoyote

The post Microbes hitch a ride on high-flying dust appeared first on GeoSpace.

New research explains why Hurricane Harvey intensified immediately before landfall

Mon, 04/22/2019 - 14:10

This image shows Hurricane Harvey’s track and intensity as it passed through the Gulf of Mexico and made landfall along the Texas coast. Harvey intensified from a Category 1 storm to a Category 4 as it crossed the Gulf of Mexico and entered the Texas Bight on August 24 and 25.
Credit: Potter et al. 2019/Journal of Geophysical Research: Oceans/AGU.

By Lauren Lipuma

A new study explains the mechanism behind Hurricane Harvey’s unusual intensification off the Texas coast and how the finding could improve future hurricane forecasting.

Hurricanes are fueled by heat they extract from the upper ocean. But hurricane growth often stalls as the storms approach land, partly because as the ocean gets shallower, there is less water and therefore less heat available to the storm. As a result, most hurricanes weaken or stay the same strength as they get close to making landfall.

But Hurricane Harvey intensified from a Category 3 storm to a Category 4 as it neared the Texas shore in late August 2017, and scientists have been puzzled as to why it was different. In a new study, researchers at Texas A&M University compared ocean temperatures in the Texas Bight, the shallow waters that line the Gulf Coast, before and after Harvey passed through it.

They found the Bight was warm all the way to the seabed before Harvey arrived. Strong hurricane winds mix the ocean waters below the storm, so if there is any cold water below the warm water at the surface, the storm’s growth will slow. But there wasn’t any cold water for Harvey to churn up as it neared the coast, so the storm continued to strengthen right before it made landfall, according to the study’s authors.
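The heat available to a storm is commonly quantified as tropical cyclone heat potential, the heat stored in the water column above the 26 °C isotherm. The sketch below uses a standard oceanographic formula with an invented temperature profile, not data from the Texas Bight or the authors’ code:

```python
# Tropical cyclone heat potential (TCHP): heat stored where water is warmer
# than 26 C, summed over depth layers:
#   TCHP = rho * cp * sum of (T - 26) * layer_thickness
rho = 1025.0  # kg/m^3, seawater density
cp = 3985.0   # J/(kg K), seawater specific heat
dz = 5.0      # m, thickness of each depth layer

# Invented profile: 30 C at the surface, cooling 0.05 C/m down to a 50 m seabed,
# i.e. a column that stays above 26 C all the way down, as Harvey encountered.
profile = [(z, 30.0 - 0.05 * z) for z in range(0, 55, 5)]

tchp = sum(rho * cp * (temp - 26.0) * dz for _, temp in profile if temp > 26.0)
print(f"TCHP ~ {tchp * 1e-7:.0f} kJ/cm^2")  # 1 J/m^2 = 1e-7 kJ/cm^2
# A column warm to the seabed leaves no cold water for the storm to mix up.
```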

“When you have hurricanes that come ashore at the right time of year, when the temperature is particularly warm and the ocean is particularly well-mixed, they can absolutely continue to intensify over the shallow water,” said Henry Potter, an oceanographer at Texas A&M and lead author of the new study in AGU’s Journal of Geophysical Research: Oceans.

The researchers don’t yet have enough temperature data to say if the Texas Bight was unusually warm in 2017. But the findings suggest hurricane forecasters may need to adjust the criteria they use to predict storm intensity, according to Potter. Forecasters typically use satellite measurements and historical data to make intensity predictions, but Harvey’s case shows they need data collected from the ocean itself to know exactly how much heat is there, where that heat is located in the water column and if it’s easily accessible to the storm, Potter said.

 Lauren Lipuma is a senior public information specialist at AGU. Follow her on twitter at @Tenacious_She.  

The post New research explains why Hurricane Harvey intensified immediately before landfall appeared first on GeoSpace.

The Moon’s crust is really cracked

Thu, 04/18/2019 - 14:00

By Larry O’Hanlon

The bombardment of asteroids and meteoroids that pockmarked the Moon’s surface over the eons also created fractures reaching deep into the lunar crust, report researchers in a new study in AGU’s Journal of Geophysical Research: Planets.

The new study finds asteroids as small as 1 kilometer (0.6 miles) in diameter can fracture lunar crust into meter-sized blocks down to depths of 20 kilometers (12 miles) below the Moon’s surface. Larger asteroids fracture rocks to about the same depth, but over a much wider area. A 10-kilometer (6-mile) impactor, for example, fractures the crust to depths of 20 kilometers and as far as 300 kilometers (186 miles) away from the impact zone.
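For a sense of scale, the kinetic energy of the impactors discussed above follows from basic mechanics. This is a back-of-envelope sketch, not the study’s model: the rock density is an assumed typical value, while the 15 km/s speed matches the impact speed used in the study’s simulations:

```python
import math

# Back-of-envelope kinetic energy E = 0.5 * m * v^2 of a spherical rocky
# impactor. Density is an assumption; 15 km/s is the speed from the study.
density = 3000.0  # kg/m^3, typical rocky asteroid (assumed)
speed = 15_000.0  # m/s, impact speed used in the study's simulations

def impact_energy(diameter_m):
    """Kinetic energy in joules of a spherical impactor of the given diameter."""
    radius = diameter_m / 2.0
    mass = density * (4.0 / 3.0) * math.pi * radius**3
    return 0.5 * mass * speed**2

for d_km in (1, 10):
    print(f"{d_km} km impactor: ~{impact_energy(d_km * 1000.0):.1e} J")
# Mass, and hence energy, scales with diameter cubed, so the 10 km impactor
# carries 1,000 times the energy of the 1 km one.
```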


The crust of the Moon may be more deeply fractured from impacts than previously thought. Image: View from Apollo 15 / NASA

The fractured lunar crust is probably very similar to what the crusts of Earth and Mars would look like, except that on the Moon the damage hasn’t been erased by billions of years of weather and plate tectonics, according to the study’s authors.

“The fragmentation goes much deeper than we thought,” said planetary scientist Sean Wiggins, lead author of the paper and a doctoral student at Brown University in Providence, Rhode Island.

The new findings suggest that much of the modern-day lunar surface could have been shaped by many smaller impacts rather than by a few large impact events, according to the new study.

The widespread lunar fragmentation could also help explain a recent mystery about the Moon’s crust, Wiggins said. Gravity measurements have revealed the crust to be less dense than expected. One possible explanation is that the lunar crust has been so churned up by impacts that it contains many voids which, when averaged out, lower the crust’s overall density.


Damage and fragment sizes 20 seconds after a 1-kilometer-diameter impactor strikes a moon-like target at 15 kilometers per second. From Wiggins, et al., 2019


Simulating impacts
In the new study, Wiggins and his colleagues simulated physical tests on Earth’s volcanic rocks similar to those found on the Moon. They also used computer simulations to see how deeply and widely the lunar crust might have been cracked and fragmented by impacts of asteroids and other debris that crowded the early solar system.

Their simulations covered a wider area of fractured crust in more detail, and used far more computational power, than previous modeling of meteor impacts, which tends to focus more narrowly on impact sites and the resulting craters.


Fragment sizes 12 seconds after a 1-kilometer-diameter impactor strikes a moon-like target at 15 km/s under lunar (a), Martian (b), and Earth gravity (c). Material is colored according to fragment size, corresponding to the scale bar. From Wiggins, et al., 2019


The researchers also simulated impacts under the different gravities of the Moon, Earth and Mars. They found the greater gravity of Earth, for instance, makes it harder for the crust to fracture as widely and as deeply as the Moon’s crust. This is because the greater weight of rocks on Earth puts the crust under greater pressure and makes it harder to fracture those pressurized rocks.

Researchers might be able to test that result by looking at fracturing caused by more recent impact events on Earth, Wiggins said. The simulations could lead to insights about hydrothermal systems that can develop as water moves through the cracks of impact zones of such craters on Earth and perhaps on Mars.

Larry O’Hanlon is a freelance science writer, editor and online producer. He manages the AGU Blogosphere. 

The post The Moon’s crust is really cracked appeared first on GeoSpace.

Dust toll in Africa exceeds deaths from HIV

Tue, 04/16/2019 - 12:00
Mineral dust from the Sahara is the biggest contributor to air pollution-related premature deaths on the African continent

By Liza Lester

In Africa, air pollution causes the premature deaths of about 780,000 people each year, potentially more than HIV infection, a new study estimates.

Mineral dust from the Sahara desert is the largest contributor to air quality-related mortality on the continent overall, according to the new study in AGU’s Journal of Geophysical Research: Atmospheres.

“It’s just the sheer amount of material and also how it co-locates with the densely populated parts of Western Africa. These two things together make mineral dust a bigger health threat than anything that’s anthropogenic or coming from the industrial development,” said Susanne Bauer, a researcher specializing in aerosols and climate modeling at NASA’s Goddard Institute for Space Studies in New York, and the lead author of the new study.

The new study used global climate models to simulate particulate and ozone pollution throughout the continent, combined with health models to estimate outcomes for exposed populations. Air pollution monitoring is sparse in Africa, but the authors found their modeling results fit the limited data available; the new study is an effort to help bridge that information gap through modeling.
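Health models of this kind commonly relate excess mortality to pollutant exposure through a log-linear concentration-response function. The sketch below shows the generic approach, not the model used in the new study; every number is illustrative:

```python
import math

# Generic log-linear concentration-response function from air-pollution
# epidemiology (all values illustrative, not figures from the study):
#   excess_deaths = population * baseline_rate * (1 - exp(-beta * delta_pm25))
population = 1_000_000  # exposed population
baseline_rate = 0.008   # baseline annual death rate from affected causes
beta = 0.006            # risk coefficient per ug/m^3 of PM2.5 (assumed)
delta_pm25 = 40.0       # ug/m^3 above a counterfactual clean-air level

attributable_fraction = 1.0 - math.exp(-beta * delta_pm25)
excess = population * baseline_rate * attributable_fraction
print(f"attributable fraction ~ {attributable_fraction:.2f}, "
      f"excess deaths ~ {excess:.0f}")
```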

The new study differs from previous studies by quantifying contributions from natural and anthropogenic sources and accounting for climate feedbacks from human industry. Emissions from human activities can change the location and frequency of dust storms, for example.

The relative risk from natural, industrial, and agricultural sources of pollution varies regionally. Agricultural practices, industrial development, population, distance from the desert and prevailing winds all contribute to risk.

Measurements of annual mean ambient PM2.5 concentrations (micrograms/cubic meter), updated in 2018. Air pollution data for the African continent is limited by scarcity of monitoring stations.
Credit: World Health Organization

In West Africa, wind-carried dust from the desert is responsible for about 40 percent of premature deaths from air pollution, whereas in Southern Africa, dust has a negligible impact and industrial and domestic sources of air pollution cause nearly 90 percent of premature deaths, according to the new study. Smoke from agricultural fires is responsible for more than 50 percent of premature deaths from bad air in Central Africa, the study found.

Nigeria experiences the deadliest air pollution on the continent thanks to a triple threat from desert dust, dense industrialization and smoke from agricultural fires in the West Africa region. Nigeria is also the most populous nation in Africa, with more than 190 million residents.

“Nigeria is maybe a country that should worry about this, and a country that can afford to worry about it,” Bauer said.

Deadly air

Both particulates and ground-level ozone contribute to air pollution. At high concentrations, ozone, which forms near the ground from pollutants emitted by vehicles, fires, and industrial processes, triggers asthma, constricts lung function, and can cause lung disease.

Particulate air pollution is a problem worldwide. More than 90 percent of people breathe air that exceeds safe guidelines, according to the World Health Organization. Breathing fine particles smaller than 2.5 micrometers is known to damage human health, causing or worsening cardiovascular and respiratory disease, asthma and stroke.

These microscopic particles, also known as PM2.5, are small enough to burrow through lung tissue and get into the bloodstream. Inhaled fine particles can also increase exposure to other dangerous chemicals. Though too small to see, these particles have plenty of surface area to collect and transport toxins into the body.

Hundreds of fires smoldered across Central Africa on 27 December, 2017, as farmers burned the residue of the previous season’s crops.
Credit: Jeff Schmaltz/NASA

Smoke and soot from fires are a major source of particulate pollution. Burning brush and agricultural fields is so prevalent after harvest in Central Africa, and in winter in Western Africa, that the smoke plumes are visible from space. The dramatic plumes motivated the new study.

Though responsible for an estimated 43,000 premature deaths per year, biomass burning does not affect as many people as wind-blown desert dust because the population in Central Africa is relatively low.

“So my big motivation to do the study, the agricultural burning effect, turned out to be the smallest killer African-continent wise. But that doesn’t mean for the people who live in there, in Central Africa, it’s not important,” Bauer said.

Bauer says providing people with information about their risk from air pollution is essential to empowering them to protect themselves. Agricultural fires and dust storms can be predicted and prepared for. Anthropogenic pollution can be regulated. Even intractable natural sources like massive dust storms from the Sahara desert can be mitigated by face masks or limiting time outside.

Wind-driven sandstorms, or haboobs, from the Sahara shut down airports and schools in Egypt and Sudan in March 2018.
Credit: Jeff Schmaltz/NASA

— Liza Lester is a public information specialist and writer at AGU. Follow her on twitter @lizalester.

The post Dust toll in Africa exceeds deaths from HIV appeared first on GeoSpace.

Earliest life may have arisen in ponds, not oceans

Mon, 04/15/2019 - 15:53

Study finds shallow bodies of water were probably more suitable for Earth’s first life forms.

By Jennifer Chu

Don Juan Pond in Antarctica. Credit: Pierre Roudier, Flickr

Primitive ponds may have provided a suitable environment for brewing up Earth’s first life forms, more so than oceans, a new MIT study finds.

Researchers report that shallow bodies of water, on the order of 10 centimeters deep, could have held high concentrations of what many scientists believe to be a key ingredient for jump-starting life on Earth: nitrogen.

In shallow ponds, nitrogen, in the form of nitrogenous oxides, would have had a good chance of accumulating enough to react with other compounds and give rise to the first living organisms. In much deeper oceans, nitrogen would have had a harder time establishing a significant, life-catalyzing presence, the researchers say.

“Our overall message is, if you think the origin of life required fixed nitrogen, as many people do, then it’s tough to have the origin of life happen in the ocean,” says lead author Sukrit Ranjan, a postdoc in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “It’s much easier to have that happen in a pond.”

Ranjan and his colleagues have published their results today in the AGU journal Geochemistry, Geophysics, Geosystems. The paper’s co-authors are Andrew Babbin, the Doherty Assistant Professor in Ocean Utilization in EAPS, along with Zoe Todd and Dimitar Sasselov of Harvard University, and Paul Rimmer at Cambridge University.

Breaking a bond
If primitive life indeed sprang from a key reaction involving nitrogen, there are two ways in which scientists believe this could have happened. The first hypothesis involves the deep ocean, where nitrogen, in the form of nitrogenous oxides, could have reacted with carbon dioxide bubbling forth from hydrothermal vents, to form life’s first molecular building blocks.

The second nitrogen-based hypothesis for the origin of life involves RNA — ribonucleic acid, a molecule that today helps encode our genetic information. In its primitive form, RNA was likely a free-floating molecule. When in contact with nitrogenous oxides, some scientists believe, RNA could have been chemically induced to form the first molecular chains of life. This process of RNA formation could have occurred in either the oceans or in shallow lakes and ponds.

Nitrogenous oxides were likely deposited in bodies of water, including oceans and ponds, as remnants of the breakdown of nitrogen in Earth’s atmosphere. Atmospheric nitrogen consists of two nitrogen atoms, linked via a strong triple bond that can only be broken by an extremely energetic event — namely, lightning.

“Lightning is like a really intense bomb going off,” Ranjan says. “It produces enough energy that it breaks that triple bond in our atmospheric nitrogen gas, to produce nitrogenous oxides that can then rain down into water bodies.”

Scientists believe that there could have been enough lightning crackling through the early atmosphere to produce an abundance of nitrogenous oxides to fuel the origin of life in the ocean. Ranjan says scientists have assumed that this supply of lightning-generated nitrogenous oxides was relatively stable once the compounds entered the oceans.

However, in this new study, he identifies two significant “sinks,” or effects that could have destroyed a significant portion of nitrogenous oxides, particularly in the oceans. He and his colleagues looked through the scientific literature and found that nitrogenous oxides in water can be broken down via interactions with the sun’s ultraviolet light, and also with dissolved iron sloughed off from primitive oceanic rocks.

Ranjan says both ultraviolet light and dissolved iron could have destroyed a significant portion of nitrogenous oxides in the ocean, sending the compounds back into the atmosphere as gaseous nitrogen.

“We showed that if you include these two new sinks that people hadn’t thought about before, that suppresses the concentrations of nitrogenous oxides in the ocean by a factor of 1,000, relative to what people calculated before,” Ranjan says.

“Building a cathedral”
In the ocean, ultraviolet light and dissolved iron would have made nitrogenous oxides far less available for synthesizing living organisms. In shallow ponds, however, life would have had a better chance to take hold. That’s mainly because ponds have much less volume over which compounds can be diluted. As a result, nitrogenous oxides would have built up to much higher concentrations in ponds. Any “sinks,” such as UV light and dissolved iron, would have had less of an effect on the compound’s overall concentrations.
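The dilution argument above can be captured with a one-line box model: if a fixed deposition flux of nitrogenous oxides rains into a well-mixed water column of a given depth, and the compounds are destroyed by first-order sinks, the steady-state concentration scales inversely with depth. The sketch below is a back-of-the-envelope illustration, not the study's actual model, and the flux and sink-rate values are arbitrary placeholders.

```python
def steady_state_concentration(deposition_flux, depth, sink_rate):
    """Steady state of a well-mixed water column receiving a surface flux F
    and losing material to a first-order sink k:
    dC/dt = F/depth - k*C = 0, so C* = F / (depth * k)."""
    return deposition_flux / (depth * sink_rate)

# Same flux and sink rate (arbitrary units), different depths:
# a 10 cm pond versus a 100 m ocean mixed layer.
pond_conc = steady_state_concentration(deposition_flux=1.0, depth=0.1, sink_rate=1.0)
ocean_conc = steady_state_concentration(deposition_flux=1.0, depth=100.0, sink_rate=1.0)
enrichment = pond_conc / ocean_conc  # shallow pond concentrates the compounds ~1,000x
```

Under these toy assumptions, the thousand-fold depth difference alone produces a thousand-fold concentration difference, which is why shallow ponds come out ahead even before the UV and iron sinks are considered.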

Ranjan says the more shallow the pond, the greater the chance nitrogenous oxides would have had to interact with other molecules, and particularly RNA, to catalyze the first living organisms.

“These ponds could have been from 10 to 100 centimeters deep, with a surface area of tens of square meters or larger,” Ranjan says. “They would have been similar to Don Juan Pond in Antarctica today, which has a summer seasonal depth of about 10 centimeters.”

That may not seem like a significant body of water, but he says that’s precisely the point: In environments any deeper or larger, nitrogenous oxides would simply have been too diluted, precluding any participation in origin-of-life chemistry. Other groups have estimated that, around 3.9 billion years ago, just before the first signs of life appeared on Earth, there may have been about 500 square kilometers of shallow ponds and lakes worldwide.

“That’s utterly tiny, compared to the amount of lake area we have today,” Ranjan says. “However, relative to the amount of surface area prebiotic chemists postulate is required to get life started, it’s quite adequate.”

The debate over whether life originated in ponds versus oceans is not quite resolved, but Ranjan says the new study provides one convincing piece of evidence for the former.

“This discipline is less like knocking over a row of dominos, and more like building a cathedral,” Ranjan says. “There’s no real ‘aha’ moment. It’s more like building up patiently one observation after another, and the picture that’s emerging is that overall, many prebiotic synthesis pathways seem to be chemically easier in ponds than oceans.”

Jennifer Chu writes for the MIT News Office. This post was originally published on the MIT News website. This research was supported, in part, by the Simons Foundation and MIT.

The post Earliest life may have arisen in ponds, not oceans appeared first on GeoSpace.

Extended winter polar vortices chill Saturn’s strangely familiar moon, Titan

Thu, 04/11/2019 - 18:42

By Liza Lester

Titan’s south polar vortex in 2012.
Credit: NASA/JPL-Caltech/Space Science Institute

Saturn’s hazy moon Titan has a long-lived Earth-like winter polar vortex supercharged by the moon’s peculiar chemistry, according to new research published in AGU’s journal Geophysical Research Letters.

Titan is the second largest moon in the solar system and the only moon with a thick atmosphere comparable to Earth’s. The Saturnian moon may be the most Earth-like place in the solar system, with seasons, rain and surface lakes, although it is about 10 times as far from the Sun as Earth and very cold.

Titan’s stratosphere, like Earth’s, is characterized by cooler layers closer to the surface and warmer layers higher up, and is the realm of the polar vortex, a cap of cold air that sits over the poles in winter. This is the same phenomenon that can cause frigid temperatures in North America during the winter.

On Earth, the polar vortex usually dissipates in spring. The new study found that Titan’s northern hemisphere polar vortex sticks around past the moon’s summer solstice, into what would be late June on Earth, lasting three-quarters of a Titan year, or about 22 Earth years.

The new study used measurements from NASA’s Cassini spacecraft and atmospheric science developed on Earth to understand seasonal changes observed on Titan.

The new study expands on previous work by the researchers, which indicated that the polar vortex explains the enrichment of trace gases in Titan’s stratosphere, and that this enrichment in turn explains the unexpectedly intense cold observed in the southern hemisphere vortex in early winter.

The combination of cooling caused by trace gases and warming caused by sinking air breaks Titan’s winter into two phases, according to the new study.

“Earth cools in winter due to lack of sunlight over the poles, but you don’t get this added effect from extra gases, whereas on Titan you’ve got these weird gases in there that’s making the process even more extreme than it would be otherwise,” said Nick Teanby, a planetary scientist at the University of Bristol in the United Kingdom, and the lead author of the new study.

Previous work from Teanby and his colleagues described the relationship between the trace gases and the polar vortex, but the new study is the first comprehensive analysis of seasonal variation in the temperature and composition of Titan’s stratosphere, based on infrared mapping data from Cassini’s entire 13-Earth-year tour of Saturn’s system.

A vortex and heightened concentration of trace organic gases sits over Titan’s north pole during northern winter, when the pole is tilted away from the Sun, in this artist’s impression of the moon, inspired by data from NASA’s Cassini mission. As Titan moves past equinox and the north pole tilts toward the Sun, a vortex develops over the south pole. Winter is long on the Saturnian moon, where a year lasts 29.5 Earth years. Credit: ESA

“This is the first time one paper has gone into the whole of the Cassini dataset, covering almost half of the Titan year, and looked at how northern and southern polar vortex evolution might differ,” said Claire Newman, an expert in planetary atmospheres at Aeolis Research and a researcher unaffiliated with the new study. “I work on atmospheric models and we rely on these kinds of observations to understand how correctly our models are capturing what is going on on Titan itself.”

In the future, the authors of the new study hope to have enough data to apply Earth’s atmospheric models to Titan and attempt to predict climate trends on the moon. Testing models on a whole new world could help scientists make the models more robust. One day, Saturn’s unusual moon may help scientists better understand the atmosphere of our home planet, Teanby said.

“Why it’s so interesting is that Titan is like a mini Earth with a really exotic and cold atmosphere that we can use to test climate models and things like that,” Teanby said. “That’s the big picture to why we bothered, but I guess the real motivation is just that it’s really cool to try and figure this stuff out.”

Winter whirl

Titan spins on an axis tilted to about the same degree as Earth’s, which gives the moon seasons like Earth’s, but drawn out over the 29 Earth years Titan and Saturn take to circle the Sun. NASA’s Cassini spacecraft observed the turning of Titan’s seasons, from mid-winter through summer solstice in the moon’s northern hemisphere.

When Cassini arrived at Saturn in 2004, Titan’s northern pole was enveloped in a polar vortex from the pole to about 45 degrees north latitude, about where the southern border of Montana is on Earth.

A polar vortex is a large cap of cold air and low pressure that sits over the poles in winter, twisting in the direction of the planet’s, or moon’s, spin. Strong westerly jetstreams encircle the pole and contain the cold, creating a distinct separation from warm equatorial air. These jetstream barriers prevent air masses from mixing and keep chemicals, as well as cold, inside the vortex.

On Earth, the edge of this big atmospheric system sits at about 60 degrees latitude, the southern border of Canada’s Yukon and Northwest Territories in the Northern Hemisphere. Lower latitudes encounter the vortex, as North America did last January, when the circling jetstream weakens or meanders.

Cassini found that Titan’s northern polar vortex persisted through the equinox and broke up in summer, much like on Earth, but lasting later in the year. Meanwhile, a new vortex began forming over the southern pole shortly after the moon’s equinox. The embryonic southern vortex was, surprisingly, colder than the northern vortex, which had only been observed in full winter glory.

The new research suggests the difference could be an early winter extra-cold phase produced by Titan’s chemistry rather than intrinsic differences between the poles.

Strange chemistry

The new study suggests Titan’s atmospheric chemistry may accentuate its polar vortex. Like Earth’s atmosphere, Titan’s atmosphere is mostly nitrogen, and the moon’s surface pressure is about 1.5 times Earth’s at sea level. But unlike Earth, the remaining 2 percent of the atmosphere is mostly methane, the main component of natural gas. When it rains on Titan, it rains hydrocarbons.

High in the moon’s relatively hot, upper atmosphere, methane reacts with energy from the Sun and from Saturn’s magnetic field to produce trace gases like cyanide, ethylene, ethane and larger organic molecules. Some of these gases are building blocks of Titan’s characteristic haze.

Cassini observed enrichment of these trace gases over the winter poles and the new research finds this enrichment is most pronounced in early winter, when the pole is also colder.

Daylight scatters through Titan’s atmosphere, seen from the moon’s night side. A hood of haze sits over the north pole at top, and a hint of the south polar vortex appears at the bottom in this image captured by Cassini in June 2012, about three Earth years past the moon’s equinox into winter in the southern hemisphere.
Credit: NASA/JPL-Caltech/Space Science Institute

On Titan, as on Earth, the difference in temperature between the equator and dark winter pole ultimately drives the formation of the polar vortex. On both worlds, cold air sinks, dragging the upper atmosphere downward at the pole in winter. As the trace gases mix downward into the colder mid-layers of Titan’s atmosphere, they condense to liquid or solid clouds. Condensed trace gases act like a sink, accelerating the movement of more trace gases down from the top of the atmosphere where they are created.

Trace gases make the cold layers of Titan’s stratosphere even colder by emitting infrared light. Infrared light is just beyond the visible light spectrum and is perceptible to humans as heat. When trace gases glow, they lose energy, which has the effect of cooling the atmosphere by radiating energy away into space. The new study proposes the now even colder air sinks faster, in a frigid feedback cycle.

“That’s all happening at the start of winter, so the start of winter is really, really cold,” Teanby said. Eventually, the pressure increase caused by all that sinking air creates its own heat, which counters the feedback cycle. The authors suggest this creates two distinct phases in Titan’s winter.

“As you go deeper into winter and the circulation’s more developed, you get an opposite effect, where you start to warm the stratosphere due to this compression of the air as it’s sinking. So there’s these two phases to winter that are quite strange. We’re not totally sure that’s what’s happening, but that’s our theory at the minute,” Teanby said.

— Liza Lester is a public information specialist and writer at AGU. Follow her on twitter @lizalester.

The post Extended winter polar vortices chill Saturn’s strangely familiar moon, Titan appeared first on GeoSpace.

California ‘browning’ more in the south during droughts

Thu, 03/28/2019 - 19:01

Severe droughts hitting Southern California vegetation harder due to climate change, new study finds

By David Colgan 

Like a climate chameleon, California turned brown during the 2012–16 drought, as vegetation dried or died off. But the change wasn’t uniform. Large areas of the northern part of the state were not severely affected, while Southern California became much browner than usual, according to new research published in the AGU journal Geophysical Research Letters.

“Southern California is more prone than the northern part of the state to getting severe droughts,” said Glen MacDonald, one of the paper’s authors and a UCLA climate scientist. “But that difference seems to be increasing.”

That means additional stress will be placed on wildlife ecosystems and resources that the approximately 24 million people living in Southern California need to survive, including energy, food and water supply.

The problem isn’t just a lack of precipitation. Hotter temperatures due to global warming — which accelerate evaporation and make drought effects worse — are playing a major role in many locations, including Southern California and some parts of the Sierra Nevada.

One band of low-to-middle elevation forest in the western Sierra was hit particularly hard and showed drastic browning, MacDonald said. That area of the Sierra Nevada experienced a high concentration of tree deaths, which contributed to California’s overall loss of more than 129 million trees since 2010.

In contrast, some parts of California became greener — mostly at high elevations and in the far northwestern part of the state, where it’s cooler and moister.

The researchers examined satellite images dating back to 2000 and historical records dating to 1895. They combined that data with information about drought severity and vegetation indexes — which analyze imagery to determine how densely green a patch of land is.
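Vegetation indexes of the kind described above compare how strongly a patch of land reflects different wavelengths. The most widely used is NDVI, sketched below with made-up reflectance values; the study's exact index and data are not specified here, so treat this purely as an illustration of how satellite imagery is turned into a "greenness" number and how a drop in that number reads as browning.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: near +1 for dense green
    vegetation, near zero or negative for dry or bare ground. Healthy
    plants reflect strongly in the near-infrared and absorb red light."""
    return (nir - red) / (nir + red)

# Illustrative surface reflectances (fractions of incoming light), not study data:
wet_year = ndvi(nir=0.45, red=0.10)  # vigorous vegetation: high NIR, low red
drought  = ndvi(nir=0.30, red=0.20)  # dried vegetation: lower NIR, higher red
browning = wet_year - drought         # a decline in the index is "browning"
```

Mapping this difference pixel by pixel between wet and drought years is essentially how satellite records reveal which parts of a state browned and which stayed green.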

The research was partially funded by UCLA’s Sustainable LA Grand Challenge, which seeks to develop informed strategies to transition L.A. County to 100 percent renewable energy, 100 percent local water and enhanced ecosystem health by 2050.

Lead author Chunyu Dong, who worked on the project as a UCLA postdoctoral researcher, said the findings reveal a century-long trend in Southern California toward a drier climate that won’t affect only plants, but also the lives of millions of people.

“The Southern California water shortage will be more severe in the coming decades, especially when we consider the population here is increasing quickly,” Dong said. The changes also have implications for wildfires, he added. Additional dry vegetation and hotter, windy weather could lead to more large fires that are difficult to control.

That lines up with 2017 research by MacDonald, who used the natural climate record contained in ancient tree rings to understand how climate variability and droughts have changed over hundreds of years. That paper found that California is in an unprecedented scenario in which the climate has warmed at the same time that variations in temperature and precipitation have been magnified, supporting rapid plant growth in wet years and then drying in hot summers, which provides more fuel for wildfires.

The 2019 rainy season made California drought-free for the first time since 2011, greening the state and causing wildflower superblooms, even in deserts. But MacDonald said the relief could be short-lived.

“The one thing that seems to keep coming up is that we’ll have more swings in precipitation,” he said. “We’re going to have our seasonally dry summer and that fine fuel is going to dry out. If it’s a hot summer, conditions are ripe for wildfire. The worst thing we can possibly do is say we don’t have to worry about this anymore.”

How climate change and drought will reshape the state’s vegetation in the long term remains to be seen. Some coastal sage scrub and chaparral could be replaced by grasslands, and low-elevation shrubland and woodland might even replace some coniferous forest, MacDonald said, but more study is needed.

— David Colgan is the UCLA press officer. This post was first published by UCLA. 

The post California ‘browning’ more in the south during droughts appeared first on GeoSpace.

Laser Blasts Show Asteroid Bombardment, Hydrogen Make Great Recipe for Life on Mars

Mon, 03/25/2019 - 17:00

By Timothy Childers 

A new study reveals asteroid impacts on ancient Mars could have produced key ingredients for life if the Martian atmosphere was rich in hydrogen. An early hydrogen-rich atmosphere on Mars could also explain how the planet remained habitable after its atmosphere thinned. The study used data from NASA’s Curiosity rover on Mars and was conducted by researchers on Curiosity’s Sample Analysis at Mars (SAM) instrument team and international colleagues.

These key ingredients are nitrites (NO2-) and nitrates (NO3-), fixed forms of nitrogen that are important for the establishment and sustainability of life as we know it. Curiosity discovered them in soil and rock samples it took as it traversed within Gale Crater, the site of ancient lakes and groundwater systems on Mars.

To understand how fixed nitrogen may have been deposited in the crater, researchers needed to recreate the early Martian atmosphere here on Earth. The study, led by Dr. Rafael Navarro-González and his team of scientists at the Institute of Nuclear Sciences of the National Autonomous University of Mexico in Mexico City, used a combination of theoretical models and experimental data to investigate the role hydrogen plays in altering nitrogen into nitrites and nitrates using energy from asteroid impacts. The paper was published in January in AGU’s Journal of Geophysical Research: Planets.

In the lab, the group used infrared laser beam pulses to simulate the high-energy shockwaves created by asteroids slamming into the atmosphere. The pulses were focused into a flask containing mixtures of hydrogen, nitrogen and carbon dioxide gases, representing the early Martian atmosphere. After the laser blasts, the resulting concoction was analyzed to determine the amount of nitrates formed. The results were surprising, to say the least.

A portion of the experimental setup Dr. Rafael Navarro-González, an astrobiologist at the Institute of Nuclear Sciences of the National Autonomous University of Mexico in Mexico City and a co-investigator with the SAM instrument, and his team of researchers used to simulate asteroid impacts in the early Martian atmosphere. The flask (center) contains a composition of carbon dioxide, nitrogen and hydrogen gases. A high-intensity infrared laser is focused into the flask from a lens (left), to simulate the high energy shockwaves produced by asteroids entering the Martian atmosphere. The gas is then evacuated from the flask and analyzed to determine the composition and levels of nitrogen fixation. Credits: courtesy of Dr. Rafael Navarro-González.

“The big surprise was that the yield of nitrate increased when hydrogen was included in the laser-shocked experiments that simulated asteroid impacts,” said Navarro-González. “This was counter-intuitive as hydrogen leads to an oxygen-deficient environment while the formation of nitrate requires oxygen. However, the presence of hydrogen led to a faster cooling of the shock-heated gas, trapping nitric oxide, the precursor of nitrate, at elevated temperatures where its yield was higher.”

Although these experiments were conducted in a controlled lab environment millions of miles from the Red Planet, the researchers wanted to reproduce the results obtained by Curiosity using the SAM instrument on the rover. SAM takes samples drilled from rock or scooped up from the surface by the rover’s mechanical arm and bakes them to look at the chemical fingerprints of the released gases.

“SAM on Curiosity was the first instrument to detect nitrate on Mars,” said Christopher McKay, a co-author of the paper at NASA’s Ames Research Center in California’s Silicon Valley. “Because of the low levels of nitrogen gas in the atmosphere, nitrate is the only biologically useful form of nitrogen on Mars. Thus, its presence in the soil is of major astrobiological significance. This paper helps us understand the possible sources of that nitrate.”

Why were the effects of hydrogen so fascinating? Although the surface of Mars is cold and inhospitable today, scientists think that a thicker atmosphere enriched in greenhouse gases such as carbon dioxide and water vapor may have warmed the planet in the past. Some climate models show that the addition of hydrogen in the atmosphere may have been necessary to raise temperatures enough to have liquid water at the surface.

“Having more hydrogen as a greenhouse gas in the atmosphere is interesting both for the sake of the climate history of Mars and for habitability,” said Jennifer Stern, a planetary geochemist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, and one of the coinvestigators of the study. “If you have a link between two things that are good for habitability – a potentially warmer climate with liquid water on the surface and an increase in the production of nitrates, which are necessary for life – it’s very exciting. The results of this study suggest that these two things, which are important for life, fit together and one enhances the presence of the other.”

Even though the composition of the early Martian atmosphere remains a mystery, these results may provide more pieces for solving this climate puzzle.

NASA is exploring our Solar System and beyond, uncovering worlds, stars, and cosmic mysteries near and far with our powerful fleet of space and ground-based missions. Experimental and theoretical work by Navarro-González was funded by the National Autonomous University of Mexico in Mexico City and the National Council of Science and Technology of Mexico. American co-authors received funding from NASA’s Mars Science Laboratory project and French co-authors received funding from the National Center for Space Studies (CNES), Paris, France. NASA’s Mars Exploration Program for the agency’s Science Mission Directorate (SMD) in Washington funded all work related to the operation of the Curiosity rover, the SAM instrument, and the use of NASA facilities and resources to retrieve and analyze the data. Goddard provided the SAM instrument. NASA’s Jet Propulsion Laboratory in Pasadena, California, built the rover and manages the project for SMD.

This post was first published by NASA. Timothy Childers works at NASA’s Goddard Space Flight Center.


The post Laser Blasts Show Asteroid Bombardment, Hydrogen Make Great Recipe for Life on Mars appeared first on GeoSpace.

Chemical tracers untangle natural gas from agricultural methane emissions

Thu, 03/21/2019 - 17:19

COCCON (Collaborative Carbon Column Observing Network) network of column sensors to measure excess columns of methane (CH4) during tests atop the NCAR Foothills Laboratory. Photo: Mahesh Kumar Sha/KIT/BIRA-IASB

By Katie Weeman

With natural gas booming across the Front Range, drilling rigs may operate within feet of cattle farms. That shared land use can confound attempts to understand trends in methane, a greenhouse gas and air pollutant—the gases emitted from these different sources blend together.

To untangle them, a CIRES-led team has innovated a new, cost-effective technique to efficiently measure methane and a cocktail of associated chemicals in the atmosphere, and to create a kind of chemical identification tag for methane sources.

“Methane is an important greenhouse gas. But it has a high global concentration so it can be challenging to see its specific sources,” said Natalie Kille, CIRES PhD student and lead author on the study published today in the AGU journal Geophysical Research Letters. “This technique allows us to remove the background methane concentrations in our analysis to clearly see unique chemical tracers.”

“Tracers” are chemicals unique to a single source: ethane is a great tracer for oil and gas operations, for example; and ammonia is a tracer for cattle farms, responsible for that unmistakable cow smell. Measuring levels of those two tracers helped the team disentangle sources of methane produced locally by both agriculture and oil and gas operations.

In Colorado, oil and gas operations sit within feet of cattle farms. Photo: Frank Flocke/NCAR

Using instruments that sit on the ground and measure the air above, they can instantly capture a snapshot of chemical concentrations for methane and its tracers in the column of air reaching from the surface all the way up to the top of the atmosphere. The team then uses this information to remove the methane background—a concept known as “excess column”—so that the tracers can take center stage.
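At its core, the excess-column idea is a subtraction: take the total column measured inside the polluted region and remove the background column measured outside it. A minimal sketch, with made-up column values rather than measured data:

```python
# Illustrative sketch of the "excess column" concept: subtracting a
# background column (measured outside the methane dome) from the local
# total column leaves only locally emitted gas. Values are invented.

def excess_column(total_column, background_column):
    """Return the locally produced portion of a column measurement."""
    return total_column - background_column

# Hypothetical column-averaged CH4 in parts per billion (ppb)
local_ch4 = 1905.0       # measured inside the methane dome
background_ch4 = 1860.0  # measured at a site outside local sources

print(excess_column(local_ch4, background_ch4))  # 45.0 ppb produced locally
```

The same subtraction applies to the tracer species, so ethane and ammonia excesses can be compared against the methane excess on equal footing.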

“This was the first study to measure excess columns of all these molecules simultaneously,” said Rainer Volkamer, CIRES Fellow, CU associate professor of chemistry, and corresponding author on the study. “This gives us a better handle to separate and quantify methane sources on a regional scale.”

The team set up a network of these small instruments across Colorado’s Front Range. Frank Hase and Thomas Blumenstock with the German Karlsruhe Institute of Technology developed a novel, portable spectrometer capable of highly precise methane measurements. And CIRES/CU Boulder provided Volkamer’s University of Colorado “CU mobile Solar Occultation Flux” instrument that measured the chemical tracers ethane and ammonia. Both devices harness sunlight to identify each molecule by its light absorption fingerprint.

“These two instruments were set up side-by-side in Eaton, Colorado, within what we call the ‘methane dome’ of the Denver-Julesburg Basin,” said Volkamer. “In the areas where natural gas and cattle farming sites are present, methane is emitted, and mixes together from both sources, forming a bubble inside the atmospheric boundary layer that expands and contracts as if its breathing.”

To measure the background concentrations of methane, the team set up two additional KIT instruments (one operated by the National Center for Atmospheric Research) outside the methane dome, in Boulder and Westminster, each about 60 miles away from Eaton. These data helped Kille calculate, and then remove, the background concentration of methane to isolate locally produced methane and those two key chemical tracers.

In previous work to untangle sources of methane, scientists have often collected flask samples of air, either from the ground or by aircraft, for detailed analysis back in a laboratory. But some chemicals, including ammonia, can stick to the insides of some canisters, creating challenges.

In this work, the small and portable instruments could be deployed almost anywhere for real-time measurements of the open atmosphere. In Eaton, the team set up in the parking lot behind a bed and breakfast.

Based on data from five days’ worth of measurements in 2015, the team found oil and natural gas operations were responsible for most of the methane produced in the Denver-Julesburg Basin, with agricultural sources providing an important but minor source.

The study also uncovered some baffling observations that will require further exploration: for example, when methane concentrations are very low, the agricultural sources are relatively more significant.

These results could help natural gas operators, cattle farmers, and their regulators make more informed decisions about methane mitigation.

In the future, the researchers hope to generate a long-term time series over multiple seasons to see how methane sources in the region change over time—a feat that becomes possible with low-cost, autonomous sensor networks like this. Scientists could also work towards comparing these data with those gathered from satellites, to develop best practices to inform satellite observations, said Volkamer.

This story originally appeared on the CIRES website. Katie Weeman works for the CIRES Communications Office. 

The post Chemical tracers untangle natural gas from agricultural methane emissions appeared first on GeoSpace.

Where do microplastics go in the oceans?

Wed, 03/20/2019 - 16:12

By Liza Lester

Where do tiny bits of plastic go when they are flushed out to sea?

Previous research finds most plastic ends up in the subtropical ocean gyres circling the mid-latitudes of the Atlantic and Pacific oceans. These rotating currents encircle large areas sometimes called “garbage patches” because they are the destination for so much persistent floating junk.

A new modeling study in AGU’s Journal of Geophysical Research: Oceans finds more microplastic may be reaching Arctic waters than previously thought.

The new study drew on what oceanographers know about ocean currents to determine which types of current most influence how microplastics drift.

Generally defined as plastic bits smaller than 5 millimeters, microplastics are durable, non-biodegradable flotsam ranging from the size of polystyrene beads down to microscopic nanoparticles small enough to squeeze through cell membranes. They can persist in surface waters for years.

Microplastics are unhealthy for animals to ingest, causing physical and metabolic damage to sea life, from tiny plankton to whales. Microplastics can also spread chemical pollutants and living organisms carried on their surfaces.

The new simulations of plastics from the millimeter to meter scale show wind-driven surface currents called Ekman currents mostly determine the fate of microplastics in the subtropical gyres.

But the new research also finds ocean waves push microplastics toward the poles. The new study shows that Stokes drift, an element of fluid dynamics theory describing the net transport induced by waves, may have led previous studies to underestimate microplastic pollution in the Arctic. Stokes drift is not always included in ocean models and is currently not observed from satellites.
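For readers unfamiliar with Stokes drift, the textbook deep-water expression gives a feel for its size; this is the standard monochromatic-wave formula, not the ocean model used in the study, and the wave parameters below are illustrative:

```python
import math

# Textbook deep-water Stokes drift for a monochromatic wave of amplitude a
# and wavelength L: surface drift speed u_s = omega * k * a**2, decaying as
# exp(2*k*z) with depth (z <= 0). Illustrative only; not the study's model.

g = 9.81  # gravitational acceleration, m/s^2

def stokes_drift(amplitude_m, wavelength_m, depth_m=0.0):
    k = 2 * math.pi / wavelength_m        # wavenumber
    omega = math.sqrt(g * k)              # deep-water dispersion relation
    return omega * k * amplitude_m**2 * math.exp(-2 * k * abs(depth_m))

# A swell of 1 m amplitude and 100 m wavelength drifts floating debris
# at a few centimeters per second at the surface
print(round(stokes_drift(1.0, 100.0), 3))  # → 0.049 m/s
```

A few centimeters per second sounds small, but sustained over months it moves floating particles hundreds of kilometers, which is why omitting this term can bias simulated microplastic destinations.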

The post Where do microplastics go in the oceans? appeared first on GeoSpace.

Western droughts caused permanent loss to major California groundwater source

Tue, 03/19/2019 - 13:59

By Joshua Rapp Learn

California’s Central Valley aquifer, the major source of groundwater in the region, suffered permanent loss of capacity during the drought experienced in the area from 2012 to 2015.

California has been afflicted by a number of droughts in recent decades, including one between 2007 and 2009 and the severe drought that plagued the state from 2012 to 2015. Due to lack of water resources, the state drew heavily on its underground aquifer reserves during these periods.

According to new research, the San Joaquin Valley aquifer in the Central Valley shrank permanently by up to 3 percent due to excess pumping during the sustained dry spell. Combined with the loss from the 2007 to 2009 drought, the aquifer may have lost up to 5 percent of its storage capacity during the first two decades of the 21st Century, according to Manoochehr Shirzaei, an assistant professor of earth sciences at Arizona State University in Tempe and one of the co-authors of a new study published in AGU’s Journal of Geophysical Research: Solid Earth.

Measures of land subsidence in San Joaquin Valley. Credit: USGS

Groundwater exists in the pore spaces between grains of soil and rock. When fluids are extracted from an aquifer, the pore spaces close. There is a range within which these spaces can shrink and expand elastically, but if the pores close too much, they begin to collapse, causing the land to subside irreversibly.

Figuring out how much the aquifer shrank permanently could help water managers prepare for future droughts, according to the study’s authors. The San Joaquin Valley aquifer supplies freshwater to the Central Valley – a major hub that produces more than 250 different crops valued at $17 billion per year, according to the U.S. Geological Survey.

“If we have even one drought per decade, our aquifers could shrink a bit more each time and permanently lose more than a quarter of their storage capacity this century,” said Susanna Werth, a research assistant professor of earth sciences at Arizona State University, and a co-author of the new study.
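A back-of-envelope compounding calculation shows how repeated small losses reach the "more than a quarter" figure Werth describes. The 3 percent per drought used here is an assumed round number for illustration, not a value from the study:

```python
# Back-of-envelope check: if each decadal drought permanently removes about
# 3 percent of the aquifer's remaining storage capacity, ten droughts over
# a century compound to roughly a 26 percent total loss. The 3 percent per
# drought is an assumed round number, not a study result.

capacity = 1.0                 # fraction of original storage capacity
for _ in range(10):            # one drought per decade for a century
    capacity *= 1 - 0.03       # ~3 percent permanent loss per drought

print(round((1 - capacity) * 100, 1))  # → 26.3 percent of capacity lost
```

Because each loss applies to an already-shrunken aquifer, the compounded total is slightly less than the naive 10 × 3 = 30 percent, but still comfortably past a quarter of the original capacity.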

The new study could also help scientists understand how other areas might be affected by drought.

“That was a curiosity for us to understand how much groundwater has been lost in those particular regions and will give us a picture of what we can expect for arid areas around the globe if groundwater practices are not sustainable,” said Chandrakanta Ojha, a post-doctoral researcher at Arizona State and the lead author of the new study.

Underground water from space
The researchers measured water volume changes due to groundwater variation in the aquifer using data from the Gravity Recovery and Climate Experiment (GRACE), a twin-satellite mission that measured Earth’s gravity field every month from April 2002 to June 2017. The study’s authors compared the groundwater losses based on GRACE measurements with those calculated from vertical land motion measurements obtained by GPS. Land depressions were also measured by a radar technique called InSAR and by multiple extensometers, devices installed in boreholes of groundwater observation wells. They also examined groundwater level records.

The study’s authors found that from 2012 to 2015, the aquifer of the San Joaquin Valley lost a total volume of about 30 cubic kilometers (7.2 cubic miles) of groundwater. The aquifer also shrank permanently by 0.4 percent to 3.25 percent, according to the new study.

Previous research found the 2007 to 2009 drought caused the San Joaquin aquifer to permanently lose between 0.5 and 2 percent of its capacity. Cumulatively, the authors said, the two drought periods (2007 to 2009 and 2012 to 2015) caused the San Joaquin aquifer to shrink permanently by as much as 5.25 percent.

Surface deformation map over San Joaquin Valley during 2015-2017, from satellite radar interferometry. Credit: Chandrakanta Ojha.

Forecasting future drought effects
Shirzaei said the information they have gathered is important for future planning, particularly since the loss of permanent storage capacity is unsustainable in the long run.

By using this type of calculation, Shirzaei said land and water resource managers can predict the effect of droughts on the aquifer system. This can help to make better regulations for groundwater conservation during those periods and prevent permanent loss of aquifer storage capacity.

Shirzaei said the compaction of the aquifer may also cause fissures and cracks on the surface as the land subsides. This could affect roads, power lines, railroads or other infrastructure, but more research is needed to understand the details of these effects.

Joshua Learn is a freelance science writer based in Washington, DC.

The post Western droughts caused permanent loss to major California groundwater source appeared first on GeoSpace.
