Eos

Science News by AGU

This Week: Glacier Mice and Melancholy Blossoms

Fri, 05/29/2020 - 11:17

Herd of Fuzzy Green ‘Glacier Mice’ Baffles Scientists. Okay, this made me smile: Moss balls live on glaciers, and scientists have found that they roll around in synch. Nicknamed “glacier mice,” the green balls of puffy moss aren’t attached to anything, but live perched on glaciers. Scientists figured they must move around to keep their rounded shape, so they tagged 30 moss balls and monitored them for a few months. They found that the colony of moss balls moved at the same speed and in the same directions, almost as though they were a herd. Moss balls travel, on average, an inch a day, and scientists still don’t know why. Winds, downhill slopes, and solar radiation couldn’t explain their movements. Picturing moss herds marching along a glacier is just too delightful!

—Jenessa Duncombe, Staff Writer


Green glacier mice? It seems a bit of a mossy, fluffy story, but it turns out to be a fascinating puzzle: How do these globs of moss move across glaciers, sometimes an inch a day? I’m reminded of the Racetrack at Death Valley.

—Naomi Lubick, International Editor


Demo-2, Here We Go!

The sun has risen on the dawn of a new era in human spaceflight.

At 4:33 p.m. ET @AstroBehnken & @Astro_Doug will liftoff atop a @SpaceX rocket on their way to the @Space_Station. This will be the first time humans have launched from U.S. soil since 2011: https://t.co/p6zJ3XlwdR pic.twitter.com/UJhfftslal

— NASA’s Kennedy Space Center (@NASAKennedy) May 27, 2020

 If you’re like me, you’ve been watching with anticipation as the United States prepares to launch astronauts from our home turf for the first time since 2011. I have my issues with the Artemis program, but I can’t deny that it’s exciting to see my country launch astronauts again—whenever it may happen. As I write this, on Wednesday morning, 27 May, there’s a “will they/won’t they” going on as Tropical Storm Bertha heads toward the East Coast. Everyone is crossing their fingers for a smooth and safe launch.

—Kimberly Cartier, Staff Writer


The First Footprints on Mars Could Belong to This Geologist. If you’re looking for an inspiring and uplifting quick read, check out this fun interview with a member of NASA’s latest class of new astronauts, who also happens to be a planetary geologist and who could one day set foot on Mars.

—Timothy Oleson, Science Editor


Splendid Isolation: A Surreal Sakura Season.

Japan’s usually bustling sakura season (flourishing here in 1849) was cut short this year by COVID-19. Credit: Utagawa Hiroshige/LACMA

I loved this sparkling riff on Japan’s millennia-long history of celebrating sakura (cherry blossoms). It’s geoscience as social history, art, commerce, identity, and the nature of melancholy.

—Caryl-Sue, Managing Editor

Deepwater Horizon and the Rise of the Omics

Fri, 05/29/2020 - 11:13

This is an authorized translation of an Eos article.

A Decade of Science Since Deepwater Horizon:

Modeling Under Pressure
Deepwater Horizon and the Rise of the Omics
Why Sunlight Matters for Marine Oil Spills
Thirty Years, $500 Million, and a Scientific Mission in the Gulf
Leveraging Satellite Sensors for Oil Spill Detection
Deepwater Horizon’s Legacy of Science

Almost everywhere scientists have looked at or near Earth’s surface, from ice-buried Antarctic lakes to ultraviolet-baked arid deserts, and in ecosystems ranging from pristine to heavily polluted, they have found abundant and highly diverse populations of microorganisms. Microorganisms, or microbes, are everywhere. They are adaptable, and they play key roles in elemental cycling and ecosystem functioning in nearly every environment on Earth.

Microbes are the great decomposers in ecosystems. They break down dead and dying organic matter and recycle major nutrients so these can be used by plants. And by reacting quickly and adapting to changing conditions, they act as first responders that help restore balance and stability to ecosystems after disturbances such as pollution or catastrophic storms. Microbes are, for example, intimately involved in ecosystem responses to oil spills.

Like organic matter derived from modern primary production, petroleum formed over geologic time can act as a carbon source that fuels microbial growth and metabolism. Hydrocarbon-degrading microbes have been studied for decades and are thought to be ubiquitous and diverse, having adapted to consuming petroleum over millions of years [Head et al., 2006]. Biodegradation mediated by local microbial communities is considered the primary fate of most petroleum (oil and gas) that enters the marine environment through natural mechanisms such as seeps [Leahy and Colwell, 1990].

Yet even as researchers began to reveal the complexity of microbial communities and illuminate the basics of how they operate in recent decades, little clarity emerged about their structure and functioning in nature, largely because of the scarcity of techniques for studying them. Because of their small size, microbes evade easy observation, and most cannot be cultured in the laboratory. At the time of the Exxon Valdez oil spill in 1989, for example, environmental microbiology was a relatively young field. But over the past decade, a variety of techniques called omics, focused on analyzing the genetic makeup of cells, have emerged and offered researchers powerful new ways to study microbial communities and the roles played by specific groups of microbes.

Omics Techniques Emerge

The 2010 Deepwater Horizon (DWH) oil spill in the Gulf of Mexico is the largest accidental discharge of oil into a marine environment for which a commensurate emergency response effort was mounted. Unlike the Valdez spill, the last major spill to affect the United States before 2010, the DWH discharge occurred in deep water, and extraordinarily large volumes of chemical dispersant were used during emergency response efforts.

An oil sheen covers the surface of the Gulf of Mexico in June 2010 as ships work to help control the Deepwater Horizon spill. Credit: kris krüg, CC BY-NC-SA 2.0

The DWH spill was also the first major environmental disaster to occur after genomic technologies had matured to the point that they could be deployed to quantify microbial responses at large spatial and temporal scales. As a result, the field of environmental genomics matured over the past decade in parallel with the DWH response, as detailed in a recent report published by the American Academy of Microbiology. Technical advances in genomics enabled direct and comprehensive analyses of microbes in their natural habitat, in both oil-contaminated and uncontaminated seawater and sediments. Researchers studying the effects of the DWH spill presided over an explosion of microbial genomics data that enabled major advances in oil spill science and allowed scientists to answer the question, Which microbes are there?, in complex communities in unprecedented detail.

Metagenomics, the sequencing of all genes from all organisms in a sample, made it possible to determine the full range of microbial species present. It also provided assessments of these organisms’ metabolic potential to carry out important ecosystem processes such as photosynthesis and the degradation of particular carbon compounds. Applying metatranscriptomics, the sequencing of active, or expressed, genes, offered opportunities to decipher the functions or activities of those same microbes in nature, essentially answering the question, What are they doing?
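As a toy illustration of the “Which microbes are there?” question, the core of a metagenomic community profile amounts to tallying classified sequence reads. The Python sketch below uses invented read labels, not data from the studies discussed here:

```python
from collections import Counter

# Hypothetical taxonomic labels assigned to sequencing reads by a
# classifier, one entry per read. Real metagenomes contain millions of
# reads; these genus names are illustrative only.
read_assignments = [
    "Marinobacter", "Alcanivorax", "Marinobacter", "Pseudomonas",
    "Marinobacter", "Alcanivorax", "Colwellia", "Marinobacter",
]

counts = Counter(read_assignments)
total = sum(counts.values())

# Relative abundance per taxon, most abundant first.
for taxon, n in counts.most_common():
    print(f"{taxon}: {100 * n / total:.1f}% of reads")
```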

Gene sequences are collected from the environment in fragments. Recent improvements in bioinformatics tools, which use high-performance computing to stitch these fragments together into the genomes of individual microbial species, have allowed scientists to reconstruct microbial genomes at large scales, revealing the incredible diversity and complexity of microbial communities.

Through a systems approach that incorporates genomics along with knowledge and tools from a variety of other disciplines (such as biogeochemistry and oceanography), researchers can now monitor and assess ecosystem health, and identify disturbances that would otherwise go unnoticed, by analyzing microbial populations, which can act as both stewards and bioindicators of ecosystems. With these efforts, global ecosystems can be better protected and, when necessary, restored in the face of a variety of environmental stressors.

Transformative Discoveries

Before 2010, most studies of microbes associated with oil spills were conducted by culturing them in the laboratory using pure cultures or enrichments. Consequently, we had a very limited understanding of the types and distribution of oil-degrading microorganisms, and of what they actually do in the environment, because the vast majority of microorganisms in the natural environment have yet to be cultured. But in the wake of the DWH spill, multidisciplinary scientific partnerships enabled transformative discoveries detailing how microbes respond to oil discharges and facilitate ecosystem recovery.

Many of these partnerships were supported by the Gulf of Mexico Research Initiative (GoMRI), created with a 10-year, $500 million commitment from BP to fund an independent scientific research program dedicated to studying the impacts and mitigation of oil spills, particularly in the Gulf of Mexico. GoMRI has funded 17 international consortia and thousands of researchers.

Armed with genomics tools, GoMRI researchers showed that oil-degrading microbes are indeed nearly ubiquitous, occurring at low abundance almost everywhere in the world, even where crude oil is absent. These microbes, part of the pool of low-abundance species known as “the rare biosphere,” harbor a specialized metabolic capacity to use petroleum as a food source, a capacity that can be rapidly activated upon exposure to oil [Kleindienst et al., 2015].

Jonathan Delgardio and Will Overholt of the Georgia Institute of Technology sample sand layers on 20 October 2010 at Pensacola Beach, Florida, which was heavily contaminated by weathered oil after the Deepwater Horizon discharge. The researchers used genomics to track how microbial communities changed in response to the oil by comparing oiled sand layers with pristine sands. Credit: Markus Huettel

From deep ocean waters to shallow coastal sediments, hydrocarbon-degrading bacteria responded dramatically to hydrocarbon contamination after the DWH spill, increasing in abundance and expressing genes involved in hydrocarbon metabolism over days to months. In some cases, microbial communities were shown to comprise up to 90% oil-degrading species after hydrocarbon exposure [Kleindienst et al., 2015; Huettel et al., 2018].

Over time, successive microbial populations bloomed as they consumed different hydrocarbon compounds in the oil and responded to environmental factors [Kostka et al., 2011; Yang et al., 2016; Kleindienst et al., 2015]. Genomics research revealed that different microbial species are adapted to degrade specific types of hydrocarbon compounds (for example, natural gases, straight-chain aliphatics, or aromatics) depending on environmental conditions such as temperature and nutrient availability. These discoveries underscore the natural capacity of microbes in the Gulf of Mexico and elsewhere to bioremediate petroleum hydrocarbons.

How Oil Affects Ecosystems

Scientists have hypothesized that the fate and impacts of oil in ecosystems are determined by interactions between the physical and chemical characteristics of the environment and the chemistry of the hydrocarbons and biogeochemical processes, which are largely mediated by microbes. However, the complexity of these interactions has hampered our ability to decipher exactly how oil affects ecosystem functioning.

Oil can be a food source for some microbes, but it can be toxic to others, causing adverse effects on microbially mediated ecosystem services such as the decomposition of organic matter and the regeneration of nutrients. After the DWH discharge, GoMRI researchers observed through multiple lines of evidence that liquid and gaseous hydrocarbons from the spill rapidly entered the microbial food web and persisted for years [Fernández-Carrera et al., 2016; Rogers et al., 2019; Chanton et al., 2020], with major implications for the cycling of carbon and nutrients through the environment. Genomics-enabled research revealed, for example, that ecosystem functions related to microbial nitrogen cycling were dramatically affected by oil.

This sand core (left), collected on 30 June 2010 at Pensacola Beach, Florida, contains a pronounced oiled layer (dark brown). More than 50% of the microbes in that layer belonged to the genus Marinobacter (in the order Alteromonadales), a known hydrocarbon-degrading microbial group, far more than in the sands below and above the oiled layer. Credit: Markus Huettel

For example, a metagenomic time series revealed an increase in the abundance of genes encoding nitrogen fixation (via the enzyme nitrogenase) that coincided with an increase in genes related to hydrocarbon degradation pathways [Rodriguez-R et al., 2015]. This increase dissipated as the oil and associated hydrocarbon compounds disappeared. Furthermore, the abundance of genes related to the degradation of specific hydrocarbon classes, such as alkanes and polycyclic aromatics, could be directly correlated with the concentrations of the corresponding classes.
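That kind of gene-versus-chemistry correlation can be sketched in a few lines of Python; the numbers below are invented for illustration and are not from Rodriguez-R et al. [2015]:

```python
from scipy.stats import pearsonr

# Invented (illustrative) time series: normalized abundance of
# alkane-degradation genes and measured alkane concentration at the
# same sampling dates. These are not data from the cited study.
gene_abundance = [0.02, 0.15, 0.40, 0.35, 0.20, 0.08, 0.03]
alkane_mg_per_kg = [5.0, 120.0, 310.0, 260.0, 140.0, 40.0, 10.0]

# A high Pearson r means gene abundance tracks the hydrocarbon class.
r, p = pearsonr(gene_abundance, alkane_mg_per_kg)
print(f"Pearson r = {r:.2f} (p = {p:.4f})")
```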

The genomics data were corroborated by isotopic tracer research, which showed the incorporation of inorganic nitrogen into the microbial food web [Fernández-Carrera et al., 2016]. Nitrogen-fixing microbes, also called diazotrophs, are well known for supporting crop growth in agricultural ecosystems and photosynthetic production in the open ocean [Zehr et al., 2016], but nitrogen fixation by oil degraders in response to hydrocarbon exposure is a new discovery. The recognition that oil-degrading bacteria can supply their own nitrogen indicates that the microbial food web can compensate, at least to some extent, for inputs of nutrient-poor oil. Studies by GoMRI researchers revealed that as overall microbial diversity declined in oil-contaminated environments, the oil selected for a few highly abundant microbial species with the dual capacity to fix nitrogen and degrade petroleum.

“Superbug” Discovered

Fertilizing water with nitrogen and phosphorus to stimulate microbial growth is a common bioremediation strategy for oil spill cleanup. It was used, for example, during the 1989 Valdez spill [Bragg et al., 1994]. But fertilizers are expensive and difficult to apply at large scales, and they can have unintended consequences for ecosystems. Practitioners tasked with cleaning up after oil spills therefore dream of a “superbug,” one that is native to the contaminated environment and capable of removing all components of the oil while generating its own nutrients.

Nature may have provided such an organism. Guided by metagenomic field data, GoMRI researchers assembled the genomes of microbes thought to be diazotrophs that also degraded oil in marine sediments. After examining the potential metabolisms of these microbes, they isolated a particular microorganism from field samples, using hexadecane, a hydrocarbon, as the sole carbon and energy source and providing no nitrogen [Karthikeyan et al., 2019]. Sequencing confirmed that the genome of the newly isolated microbe, KTK-01, contains genes encoding nitrogen fixation and hydrocarbon degradation pathways, as well as biosurfactant production, which together facilitate growth in a nitrogen-limited, oiled environment.

Comparisons with genomes available from previous studies revealed that the newly isolated microbe, provisionally named Candidatus Macondimonas diazotrophica after the Macondo oil discharged during the DWH disaster, represents a new genus of Gammaproteobacteria, a class that includes Escherichia coli and Salmonella, among many others. The examination also revealed a remarkable distribution of sequences identical or nearly identical to those of KTK-01 in hydrocarbon-contaminated sediments from coastal ecosystems around the world: microbes with genomes matching this sequence often made up roughly 30% of their total communities but were nearly absent in pristine sediments or seawater. Macondimonas thus appears to play a key ecological role in natural responses to oil spills in coastal environments worldwide and could be a useful model organism for further study of those responses.

Biomarkers of Oil Contamination

The ultimate goal of GoMRI-supported genomics research is to translate genomic findings into actionable information that helps scientists monitor and restore ecosystem health in the face of natural or human-caused disasters. By examining the organisms, genes, and metabolic pathways present in microbial communities, researchers can take the pulse of an ecosystem and identify functional deficits or gains in communities that affect overall ecosystem health. Such genomic indicators serve as biomarkers to guide mitigation strategies, much as blood tests can point physicians toward disease diagnoses and treatment options.

During the DWH response, microbial genomics techniques demonstrated the potential for developing effective genetic proxies, or biomarkers, of oil inputs, exposure regimes, and hydrocarbon degradation. Oil-induced ecosystem disturbances were identified by a reduction in community diversity, the overgrowth of certain species, or the appearance of new genes, metabolic pathways, and ecosystem functions. For example, Macondimonas was shown to dominate microbial communities in oiled beach sands, and a large increase in the abundance of nitrogen fixation genes signaled nutrient limitation and disruptions to the nitrogen cycle initiated by the oiling [Karthikeyan et al., 2019]. In addition, a decline in the abundance of nitrifying chemolithoautotrophic microorganisms in oiled sediments, followed by the rebound of these microbes in recovering sands, provided evidence of ecosystem recovery [Huettel et al., 2018].
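One of those biomarkers, a reduction in community diversity, is commonly quantified with an index such as Shannon’s H′. A minimal sketch with invented species counts shows how a single oiling-driven bloom depresses the index:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species counts."""
    total = sum(counts)
    return -sum((n / total) * math.log(n / total) for n in counts if n > 0)

# Invented species counts: an even, diverse community versus one
# dominated by a single bloom, as seen after heavy oiling.
pristine = [25, 20, 18, 15, 12, 10]
oiled = [90, 4, 3, 2, 1]

print(f"H' pristine: {shannon_index(pristine):.2f}")  # higher diversity
print(f"H' oiled:    {shannon_index(oiled):.2f}")     # sharply reduced
```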

Preparing for Response and Restoration

GoMRI-supported efforts to characterize the responses of microbial communities in Gulf of Mexico ecosystems after the DWH oil spill generated knowledge with far-reaching impacts and drove a wealth of discoveries. And the newly developed tools and approaches have demonstrated proof of principle for deployment as part of the emergency response tool kit.

The need for continued research in these areas is great, both because the risk of future oil spills like DWH remains as the petroleum industry continues to tap ultradeep offshore wells for oil and gas production and because chemical dispersants, which can be toxic to organisms, remain the primary response strategy. But lessons learned so far from DWH research can be applied to developing new mitigation strategies and improving predictive capabilities for responding to future environmental disturbances, such as those caused by extreme weather events or climate change.

For the first time, a data-driven approach to oil spill response and mitigation is possible. With advanced genomics tools and scientific expertise, microbiologists can rapidly and inexpensively analyze field samples to provide essential information about microbial ecosystems before, during, and after spills.

We envision a future in which omics measurements make it possible to assess environmental risks, identify ecosystem deficits, select appropriate mitigation plans, and monitor ecosystem recovery, and in which scientists play key roles in informing practitioners to improve response and restoration preparedness for future environmental disasters.

Monitoring African Elephants with Raspberry Shake & Boom

Thu, 05/28/2020 - 12:18

African elephants are social animals that use sound and vibrations to communicate and keep tabs on each other. But their conversations aren’t anything like the high-pitched trumpeting sound that we commonly associate with these animals—those calls are reserved for signaling imminent danger or alarm. More often than not, elephants communicate using powerful, low-frequency vocalizations known as rumbles. Rumbles fall outside the threshold of human hearing, but other elephants can detect them over very long distances.

Rumbles are so powerful that they resonate with the ground, producing seismic waves that travel farther through land than through the air. Other elephants can feel these vibrations at distances up to 6 kilometers from the source, maybe more. They pick up these vibrations with their sensitive feet, where they have special organs to help them do just that, earlier research has shown. Elephants use rumbles to locate each other, detect friendly or rival calls, and find mates. They can also feel other elephants running or stomping their feet, telltale signs of danger or threats.

Researchers call this seismic communication, and it opens the possibility of monitoring elephants using seismic sensors, which is very appealing for biologists and conservationists alike. It offers a way of keeping tabs on the animals without the hassle and risk of other methods such as radio collars or tags, which require chasing and tranquilizing the animals.

The main drawback of this approach, though, is that conventional seismic or geophysical sensors cost tens of thousands of dollars.

Raspberry Shake & Boom

The cost is why last October a group of researchers including Oliver Lamb, a geophysicist at the University of North Carolina at Chapel Hill, traveled to South Africa to test whether scientists could use a low-cost seismic sensor called Raspberry Shake & Boom (RS&Boom) to monitor elephants in the field.

RS&Boom is based on Raspberry Shake, an affordable seismograph powered by the popular Raspberry Pi technology. RS&Boom includes an acoustic microphone and a geophone—a device that converts ground vibrations into measurable electronic signals—and researchers can buy one for less than $1,000.

Lamb and his colleagues installed five RS&Boom devices at Adventures with Elephants, a 300-hectare natural reserve home to seven African elephants. Their goal was to record noisy bonding episodes known as “reunions,” where elephants rumble and stomp the ground after being separated for a while.

Their test succeeded only in part. Although the acoustic microphones on RS&Boom clearly recorded intricate details of low-frequency vocalizations, the geophones detected only rumbles produced within a 100-meter range and footsteps within a 50-meter range from the elephants. The new work is currently under review by the journal Bioacoustics.

Although the range of the RS&Boom is very limited, Lamb thinks that the devices could be used in areas where the elephants are known to congregate, such as watering holes. In theory, the vocalizations were recorded with enough clarity that specialized software should be able to differentiate between animals and tell how many different animals are present, said Lamb in a presentation at EGU2020: Sharing Geoscience Online, the 2020 General Assembly of the European Geosciences Union.
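For a sense of how such detections might be automated, here is a minimal, hypothetical sketch using the open-source ObsPy library. The rumble frequency band and trigger thresholds are assumptions drawn from general bioacoustics and seismology practice, not parameters from Lamb’s study:

```python
from obspy import read
from obspy.signal.trigger import classic_sta_lta, trigger_onset

# read() with no arguments loads ObsPy's bundled example seismogram;
# in the field you would read miniSEED data from the RS&Boom instead.
st = read()
tr = st[0]
fs = tr.stats.sampling_rate

# Isolate an assumed rumble band (roughly 15-35 Hz fundamentals, per
# general bioacoustics literature, not parameters from this study).
tr.detrend("demean")
tr.filter("bandpass", freqmin=15.0, freqmax=35.0)

# Classic STA/LTA trigger: flag windows where short-term energy rises
# well above the long-term background, a standard seismic detector.
cft = classic_sta_lta(tr.data, int(0.5 * fs), int(10.0 * fs))
for onset, offset in trigger_onset(cft, thres1=3.0, thres2=1.5):
    print(f"candidate event: {onset / fs:.1f}-{offset / fs:.1f} s")
```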

Learning Curve

Lamb is confident that a more refined field setup could yield better results.

RS&Booms are not designed for outdoor use. To protect the tools from the elements, Lamb and his team tucked each device inside a plastic container and buried it 30 centimeters in the ground. Since the acoustic sensors rely on direct measurements of the atmosphere, they poked a hole in each container and introduced a hollow tube running from the container to the surface. Large car batteries connected to solar panels powered these contraptions.

To protect the tools from the elements, scientists tucked each device inside a plastic container and buried it 30 centimeters in the ground. Credit: Oliver Lamb

“It wasn’t the best way to deploy the sensor, but it was the best we could do,” Lamb said. “The best way to deploy the sensor would be to build a large concrete vault, at least 1 meter into the ground. That just wasn’t viable for the temporary field experiment we were doing.”

Tarje Nissen-Meyer, a geophysicist at the University of Oxford in the United Kingdom, has also used seismic sensors to study elephants in the field but was not involved in the current study. According to Nissen-Meyer, the capability to detect seismic signals from wildlife at long distances strongly depends on many variables, including local soil and geology, vegetation, topography, ambient noise, and, crucially, the coupling of the recording device to the ground.

“Only under very favorable conditions can one expect signals to be detectable at far distances,” Nissen-Meyer said. “In that sense, I entirely agree with their plans to run a new project with better instrument coupling.”

Automated Monitoring and Early-Alarm Systems

In their paper, Lamb and colleagues hint at the possibility of developing an automated system based on seismic detectors that could autonomously track and send alarms about elephant activities. Such systems could be used to detect poachers or as an early-warning system when elephants leave protected areas or get too close to human settlements, crops, or cattle.

Although it’s theoretically possible, “we are not in a position to implement an automated system with reliable accuracy at this point, at least as far as I am aware,” Nissen-Meyer said. “However, we are not starting from scratch, by building on hardware and software advances from other fields.”

For instance, Nissen-Meyer argues, biologists have used low-cost acoustic sensors with success before. On the software and methodological side, such methods shouldn’t be much different from those used for earthquake seismology or nuclear monitoring. It would be a matter of adapting existing technologies in seismic modeling, source inversion, and machine learning. “We are nowhere close to having a system or even proof-of-concept at hand, but the very fact that seismic signals are detectable is enough to inspire us to dig deeper,” he said.

—Javier Barbuzano (@javibarbuzano), Science Writer

Adolphe Nicolas (1936–2020)

Thu, 05/28/2020 - 12:13
Adolphe Nicolas in Oman in the 1990s. Credit: Françoise Boudier, CC BY-SA 4.0

Prof. Adolphe Nicolas, a great teacher, eminent research scientist, and beloved friend to many, passed away 31 March 2020 at age 84. He was an emeritus professor in the Laboratoire Géosciences Montpellier in France at the time.

Nicolas was born in 1936 in Rennes, France. After spending the postwar years in Morocco, where his father served as a doctor in the Organisation Mondiale de la Santé, he attended high school in the United States. Following his studies in physics and Earth science at the University of Paris, including a Ph.D. dissertation on the Piemontese ophiolites in the western Alps, he taught at the School of Mines in Nancy from 1958 to 1965. During this period, in 1959, Nicolas married Odile Rohrer, with whom he had four children: Ronan, Valentine, Alexis, and Clarisse.

After his time in Nancy, Nicolas became a professor at the University of Nantes in 1968. There he departed from tradition and created the relatively small yet innovative Laboratoire de Tectonophysique. He and his team produced a large volume of first-order research on the physical properties of the mantle and on plastic deformation of the solid Earth, as recorded by crystallographic orientations of minerals. Through this research, Nicolas initiated collaborations that lasted for more than 70 years, including work with Dale Jackson, Steve Kirby, and Harry Green during an influential sabbatical in California; studies with Emile Den Tex in the Netherlands; and research with Jean-Paul Poirier that was exemplified by their classic 1976 book Crystalline Plasticity and Solid State Flow in Metamorphic Rocks.

Nicolas’s interest in mantle processes continued throughout his career, producing major advances reported in numerous papers and seven books. His most notable contributions include studies quantifying and interpreting microstructures in rocks in terms of deformation processes and seismic anisotropy, and studies of the formation of oceanic lithosphere at spreading centers, which were based on observations in ophiolites.

Starting in the 1980s, Nicolas focused on the exceptional outcrops of the Samail ophiolite in Oman and the United Arab Emirates, spending weeks in the field there every year through 2016 and leading a team that made systematic, structural measurements in every canyon over the entire 350-kilometer length of that tectonically accreted block of upper mantle and oceanic crust. In this work, Nicolas and his group largely eschewed contemporary debates about exactly which kind of spreading ridge formed the Samail lithosphere—whether a “normal” mid-ocean ridge or one related to a microplate, nascent arc, fore arc, or back arc. Instead, they concentrated on elucidating processes that are shared by mantle melting, melt transport, and crustal formation at all spreading centers. This approach opened and sustained a fertile, dialectical interaction between marine and ophiolite research communities, embodied in international “ridge initiatives” (e.g., InterRidge).

In the 1980s, while still at Nantes, Nicolas served as acting director of the National Institute of Sciences of the Universe, part of the Centre National de la Recherche Scientifique (CNRS). In 1986, at the request of Prof. Maurice Mattauer, Nicolas moved with much of the Laboratoire de Tectonophysique team from Nantes to the University of Montpellier, where he continued as the lab’s director from 1986 to 1994. He then served as director of the Institut des Sciences de la Terre, de l’Eau et de l’Espace de Montpellier from 1994 to 1997 and, in Paris, as Counselor for Earth Sciences and Environment in the French Ministry of Research from 1997 to 2000. On his return to Montpellier, he resumed full-time teaching at the university as well as at the Montpellier branch of the École Polytechnique Féminine until his retirement in 2003.

As an emeritus professor from 2003 onward, Nicolas continued to work with the intensity that characterized his entire career, dividing his efforts between basic science research and outreach to the general public about planetary evolution and climate change. Reflections from his time at the Ministry of Research, interacting with climate scientists, led him to write three books (2050: Rendez-Vous à Risques; Futur Empoisonné: Quels Défis? Quels Remèdes?; and Énergies: Une Pénurie au Secours du Climat?) and to give numerous presentations on the subject. Most recently, in 2018, he coauthored a new edition of his classic textbook Principes de Tectonique with his colleague and friend Jean-Luc Bouchez.

In recognition of his research accomplishments, extraordinary teaching career, and public service, Nicolas was awarded the AGU Harry H. Hess Medal, the Prix Dolomieu of the Academie des Sciences in France, and the Silver Medal of CNRS. He was a Knight of the Legion of Honor and of the Ordre des Palmes Académiques, an AGU Fellow, and a senior member of the Institut Universitaire de France.

Prof. Nicolas is survived by his wife, his children Alexis and Clarisse, and four grandchildren. He will be greatly missed by his country, colleagues, friends, and family.

—Françoise Boudier (francoise.boudier@gm.univ-montp2.fr), Géosciences Montpellier, University of Montpellier, and CNRS, Montpellier, France; Bob Coleman, Stanford University, Stanford, Calif.; Benoit Ildefonse, Géosciences Montpellier, University of Montpellier, and CNRS, Montpellier, France; Peter Kelemen, Lamont-Doherty Earth Observatory, Columbia University, Palisades, N.Y.; and David Mainprice, Géosciences Montpellier, University of Montpellier, and CNRS, Montpellier, France

Venus Exploration Starts in the Lab

Thu, 05/28/2020 - 12:12

In March of 1982, the Soviet spacecraft Venera 13 landed a probe on the surface of Venus. The probe sent back the first color photographs from the surface of another planet, revealing that Venus has a desolate landscape to match its hellish atmosphere. It collected and analyzed a sample of the rocky surface, and its acoustic detector measured vibrations from the wind.

Venera 13 sent back some of the best data we have to date of Venus’s surface. The probe holds the record for the longest-lived Venus surface mission.

It survived for just 127 minutes.

Scientists have been trying to return to Venus’s surface since the late 1980s, but this time with instruments that will last for days or even months. That’s where GEER comes in.

GEER, the Glenn Extreme Environments Rig at NASA Glenn Research Center (GRC) in Cleveland, Ohio, is a test chamber that can create Venus-like conditions to study how materials placed inside the chamber react.

“GEER is a highly adaptable facility that’s constantly evolving its capabilities,” said Kyle Phillips, an aerospace and mechanical engineer at GRC. Phillips is GEER’s primary operator and test engineer. “In past tests, we’ve simulated conditions all the way from Venus surface conditions—both lowlands and highlands—up through the lower atmosphere through where we expect the cloud layers to be, and just slightly above the cloud layers and the upper atmosphere.”

Building Spacecraft to Last

Some types of metal wiring react at Venus surface conditions, causing electronics to break down. Shown here is a metal wire before (top) and after (bottom) a test in the GEER chamber. Credit: GEER/NASA Glenn Research Center

Venera 13, its twin probe Venera 14, and the eight other successful attempts to land a probe on Venus all fell prey to the same thing: temperatures hotter than 450°C, pressures about 90 times that of Earth’s surface (90 bars), and a corrosive carbon dioxide–dominated atmosphere. Under those conditions, a spacecraft that might survive for years on Mars or the Moon would break down in minutes on Venus as the outer casing melts or dissolves, wires corrode, and delicate hardware warps.

The GEER team has “tested things like basic materials that one might use in a spacecraft or around the spacecraft,” said Tibor Kremic, chief of space science projects at GRC. “How do those interact with the environment? How do they fare? How did their properties and their functions change over time in a Venus surface–like environment?”

Test material is placed inside the 1-cubic-meter, corrosion-resistant stainless-steel cylinder. The test engineers then ramp up the pressure, temperature, and gas composition inside the chamber and hold it steady for days, weeks, or even months. “Currently, GEER can replicate temperatures from near ambient up to 1,000° Fahrenheit—that’s 537°C,” Phillips said, “and it can replicate pressures from ambient to rough vacuum to…94 bars.”
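For a sense of scale, the ideal gas law gives a rough estimate of how much carbon dioxide a chamber like GEER holds at Venus-like conditions. This is our own illustrative arithmetic, not a GEER procedure:

```python
# Back-of-the-envelope estimate (ours, not a GEER procedure) of the CO2
# needed to fill a 1-cubic-meter chamber at Venus-like surface
# conditions, using the ideal gas law pV = nRT. CO2 at ~92 bar and
# ~737 K is far from ideal, so treat this as a rough figure only.
R = 8.314        # J/(mol K), universal gas constant
V = 1.0          # m^3, approximate GEER internal volume
p = 92e5         # Pa, roughly Venus's 92 bar surface pressure
T = 737.0        # K, roughly Venus's 464 degrees C surface temperature
M_CO2 = 0.04401  # kg/mol, molar mass of CO2

n = p * V / (R * T)
print(f"{n:.0f} mol of CO2, about {n * M_CO2:.0f} kg")  # ~1500 mol, ~66 kg
```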

“We have done work over time in understanding what materials would be viable for long-term missions and which are not,” said Gary Hunter, a senior electronics engineer with GEER. For example, “copper, you might think, is just fine to use for electrical conductors. Turns out, don’t use copper. In fact, gold would be a better material to use because the reactivity on the Venus surface and at those temperatures is different, and the materials that are viable are different, than you might see in standard high-temperature operations on Earth.”

GEER has been operational since 2014, and the team has already made huge leaps forward in terms of designing Venus-durable spacecraft. During a test a few years ago, “we demonstrated electronics operational in Venus surface condition for 21 days,” Hunter said. Computer chips turned out to be fairly durable. “The longest time anything else had ever lasted before that point in terms of electronics on the surface of Venus…was approximately 2 hours. To go 21 days was showing a significant step up in what might be possible [in] Venus surface exploration.”

To Venus and Back in 80 Days

In its longest test to date, the GEER team subjected common geologic samples to Venus’s harsh surface conditions for 80 continuous days.

Understanding how common geologic materials like basalts and glasses behave on Venus’s surface will help planetary scientists understand data that come back from missions. Using GEER, these geologic samples (top) were exposed to conditions on Venus’s surface for 80 days (bottom). Credit: GEER/NASA Glenn Research Center

“We tested geologic material, so glasses, basalts, minerals, things that we expect might be on the Venus surface,” Kremic said, “to understand how they might change or what they might look like if we’re trying to identify them remotely.” A basalt or a glass or a silicate might have a different spectrum or appearance on Venus than on Earth, the Moon, or Mars.

Tests that reveal the properties of planetary materials at extreme conditions serve a dual purpose, Kremic explained. Mission scientists can tailor their instruments to measure Venus-relevant signatures, and they can use test results as benchmarks to interpret those measurements.

The 80-day test also underscored the need for a second, smaller test vessel that could be run at the same time as the larger one. “It’s a very small, mini GEER,” Kremic said. The aptly named MiniGEER went into operation in 2019. It’s just 4 liters in volume (250 times smaller than GEER) and can be brought up to temperature, pressure, and gas composition, and back down again, much faster than its larger counterpart.

“Maybe we have two things going on or we have tests that don’t require the volume [of GEER],” Kremic said, “and this way [they] can be done quicker and at lower cost.”

The Future of Venus Exploration

NASA might be headed back to Venus in the near future—two of its four finalists for a Discovery-class mission are bound for Venus. If one of those missions is selected, the GEER facility will be involved with getting the technology mission-ready.

But the team has already been hard at work designing its own Venus mission, a small probe called the Long-Lived In-Situ Solar System Explorer (LLISSE). LLISSE would weigh about 10 kilograms and last for at least 60 days on Venus.

“At Venus you get a day-to-night or night-to-day transition at least once in a 60-day period,” said Kremic, who is LLISSE’s principal investigator, “and so we want to make sure that we capture one of those….We’re going to measure temperatures, we’re going to measure pressures, we’ll measure winds, maybe 3D winds on the surface of Venus,” as well as atmospheric composition and how all of those properties change over time. The team plans to build a full-scale ground model of LLISSE and test it inside GEER for the full 60 days by 2023.

The inside of GEER is 1 cubic meter in volume, or about 3 feet wide × 4 feet long. Credit: GEER/NASA Glenn Research Center

The scientists are also exploring how GEER can adapt to simulate other places in the solar system and beyond. “The beauty and one of the unique things about GEER is that we can mix up pretty much whatever chemistry we want,” Kremic said, and new hardware might let GEER reach colder-than-ambient temperatures too.

“The results of what we’re doing will change and enhance our ability to do science, our understanding of our solar system, and of other [planetary] bodies, Venus in particular,” Kremic said, and we can “be more confident in what we send there.”

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

Earthquake-coda Tomography Boosts Illumination of the Deep Earth

Thu, 05/28/2020 - 11:30

Seismic tomography is among our most powerful tools for inferring the deep structure of the Earth, thereby yielding information about its composition, dynamics, and evolution. It is traditionally based on observations of a few well-studied body and surface wave types, which limits tomographic resolution.

Wang and Tkalčić [2020] provide a theoretical framework that enables the use of a kind of wave that has so far been largely ignored in seismic tomography: waves that scatter multiple times after being excited by an earthquake, also known as coda waves. Their method provides a tool that synthesizes otherwise unobservable body waves from seemingly chaotic coda wave recordings. The synthesized waves correspond to complicated propagation paths through the Earth, thereby increasing illumination in deep parts of the planet that are not well sampled by more traditional observations.
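As a rough illustration of the correlation step at the heart of such methods, though not the authors’ actual algorithm, the sketch below cross-correlates two synthetic records that share a scattered wavefield with a known offset; the correlation peak recovers the inter-station delay:

```python
import numpy as np

# Toy illustration: two stations record the same multiply scattered
# "coda" wavefield, offset by a travel time difference, plus noise.
# The real method involves careful windowing, normalization, and
# stacking over many events and station pairs; this is synthetic data.
rng = np.random.default_rng(42)
fs = 20.0                                # samples per second
coda = rng.standard_normal(2000)         # shared scattered wavefield
lag_true = 150                           # 7.5 s inter-station delay

rec_a = coda + 0.5 * rng.standard_normal(2000)
rec_b = np.roll(coda, lag_true) + 0.5 * rng.standard_normal(2000)

# c[k] = sum_n rec_b[n + k] * rec_a[n]; the peak should sit near k = +150.
xcorr = np.correlate(rec_b, rec_a, mode="full")
lags = np.arange(-(len(rec_a) - 1), len(rec_b))
peak = lags[np.argmax(xcorr)]
print(f"peak at lag {peak} samples ({peak / fs:.1f} s)")
```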

In addition to the theory, the authors provide a data-based validation of their approach, as well as an application to the imaging of inner core structure.

Citation: Wang, S., & Tkalčić, H. [2020]. Seismic event coda‐correlation: Toward global coda‐correlation tomography. Journal of Geophysical Research: Solid Earth, 125, e2019JB018848. https://doi.org/10.1029/2019JB018848

—Andreas Fichtner, Associate Editor, JGR: Solid Earth

How Is the Pandemic Affecting AGU Journal Article Submissions?

Wed, 05/27/2020 - 12:00

The COVID-19 pandemic has changed all our lives in many ways. One specific aspect the media has been reporting on—and a topic that AGU has received a number of inquiries about—is the effect the pandemic is having on female academics. Within the AGU Publications Department, we specifically wanted to see how the pandemic is affecting our authors.

We looked at the latest data on people submitting new manuscripts to our journals in the context of historical trends. We focused on the gender, age, and geographical location of the “corresponding” author, who is typically the lead/first author, submits the article, and acts as the main point of contact.

The bottom line is that, so far, we aren’t seeing statistically significant changes in submissions from women or researchers in countries most affected by the COVID-19 outbreak, but it’s worth taking a look at the data more closely.

A bit about the age and gender data: We looked at submissions to all our journals between January 2018 and April 2020. We matched corresponding authors with our AGU database of members and non-members who self-identified their birth year and gender. Of distinct corresponding authors in this analysis, 70% have identified their gender and 53% their age. Corresponding authors provide their country of physical address when submitting their manuscript. The data supporting this analysis is anonymized and provided in aggregate here.
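As a schematic illustration of the kind of aggregation behind the figures below (using made-up records, not AGU’s actual database), the monthly share of female corresponding authors reduces to a short pandas computation:

```python
import pandas as pd

# Hypothetical, anonymized records standing in for AGU's real database,
# which links each corresponding author to a self-identified gender.
df = pd.DataFrame({
    "month":  ["2020-01", "2020-01", "2020-02", "2020-02", "2020-02"],
    "gender": ["F", "M", "F", "unknown", "M"],
})

# Exclude unknown genders (about 30% of corresponding authors in the
# actual analysis), then take the monthly share of female authors.
known = df[df["gender"].isin(["F", "M"])]
share = known.groupby("month")["gender"].apply(lambda s: (s == "F").mean())
print(share)
```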

Month-by-month

First, we compared the current year’s monthly submissions with previous years to see if there are any seasonal trends in manuscript submissions by women. The data suggest no significant trends: So far this year, we’ve seen only minor differences each month in the proportion of female corresponding authors. There is a noticeable difference between June 2018 and June 2019, so we’ll be eager to examine data for the remaining months of 2020 once we have them.

Percentage of corresponding authors who were women (unknown genders excluded, which is about 30% of corresponding authors). Note the y-axis starts at 10% to better show the differences between months.

Age group

Though the dataset of corresponding authors with known age and gender is slightly smaller, it’s still worth breaking out the data by age cohort. In the first four months of 2020, the proportion of female corresponding authors in their 20s increased from February to April; women in their 30s submitting papers noticeably increased in February but dropped in March and stayed about the same in April; and women in their 40s and 50s submitted nearly the same percentage of manuscripts in April and March.

Age group of female corresponding authors calculated as a percentage of total female corresponding authors with known age. 70s and 80s age groups not shown.

A variety of individual factors across and within age groups make it difficult to draw any immediate conclusions from the data.  We’ll continue to monitor the situation with a particular focus on early career researchers as they juggle teaching (or job hunting), research, and childcare, demands only exacerbated by the pandemic.

Geographic region

The timing of the spread of the virus and the restrictions imposed by countries have differed across the world, so how has that affected our authors from different regions? Most regions saw an increase in submissions between February and April, except China and Central and South America. The decline in the Americas was due to decreases in Brazil and Chile specifically. We also saw a dip in submissions from India in April, a month during which reported cases of the coronavirus increased in that region.

Submission rates of corresponding authors by region.

Not too much should be read into slight dips in February because the general trend across all submissions, as shown in the figure below, is a noticeable drop in February each year followed by a recovery in March. There are likely several reasons. For example, January is usually busy for submissions because people have time during the holidays to work on research papers before the university terms start. Late January and early February see the annual Chinese New Year celebrations with a long holiday taken in that region. And, of course, February has fewer days than January and March.

With data for only the first four months of the year, it’s difficult to tell whether the impacts of the virus and lockdowns in different regions have had an effect on our submissions.

Number of submitted manuscripts by month from January 2018 to April 2020.

By looking at these data on journal article submissions, can you tell we are in the midst of a pandemic? Not really. AGU is committed to monitoring the data as different countries ease restrictions on movement and until we are in a position to look back at the whole of 2020. We will also examine how the crisis might have affected others involved in our publishing operations, such as reviewers and editors, for example changes in the volume or rate of reviewing papers and making decisions on manuscripts.

For now, AGU is continuing to support our community by providing pandemic-related resources, including granting deadline extensions for journal authors and reviewers. How has COVID-19 impacted your work and life, and how can AGU help? Share your story with us.

—Paige Wooden (pwooden@agu.org; 0000-0001-5104-8440), Senior Program Manager, Publications Statistics, American Geophysical Union

Keeping Indigenous Science Knowledge out of a Colonial Mold

Wed, 05/27/2020 - 12:00

This is an authorized translation of an Eos article.

During her doctoral research, Dominique David-Chávez was studying her community’s Indigenous climate knowledge. As she reviewed the scientific literature on the subject, she noticed a disturbing pattern.

“Regardless of the type of study, whether it was about ecological indicators of seasonal changes or about agricultural practices, what I mostly read were similar studies in which [non-Indigenous scientists] went in and documented that knowledge in a scientific journal,” said David-Chávez, a postdoctoral researcher working with the Native Nations Institute at the University of Arizona in Tucson and with Colorado State University in Fort Collins. She is also a member of the Arawak Taíno community.

“It was very hard to find who from the community had contributed that knowledge, how the findings were returned to that community, or what questions and concerns that Indigenous community had regarding the research,” she said.

This kind of knowledge extraction from Indigenous communities is one of many aspects of colonialism that affect modern research practice when it comes to Indigenous science knowledge.

“I felt troubled by that way of doing research. I don’t find it respectful,” David-Chávez said. “I really had to look elsewhere to try to find a model that aligned with my cultural values and the scientific standards I needed to uphold in my work.”

David-Chávez and her collaborators developed and field-tested a model to guide scientists in meeting those standards. With that model as a framework, the researchers, together with members of the rural communities of Cidra and Comerío in central Puerto Rico (Borikén), designed and facilitated a youth-led climate research project in 2016-2017.

“The model is really about being intentional in every aspect of the research at every stage, [starting with] the design stage and even before that,” she said. David-Chávez presented the model at AGU’s Fall Meeting on 12 December 2019.

Colonialism in Scientific Research and Education

“We’re really at a moment when there’s a push to involve […] diverse perspectives in the sciences,” David-Chávez said. “But in doing so, we don’t always understand or acknowledge the historical context that has inhibited that kind of participation, for example, in the United States over the past five centuries.”

That context, she continued, includes a “history of colonialism, genocide, and oppression and assimilation in which, for example, the knowledge systems that communities maintained, and the languages those knowledge systems existed in, were sometimes illegal and often suppressed.”

David-Chávez recalls many cases in which Indigenous peoples were concerned about how scientists were using community knowledge, whether research results would be returned to the community, or whether they were consulted at all during the research.

“I also heard from tribal leaders, for example, who said, ‘Yes, we were consulted,’ but their version of consultation was being sent a letter about the research that was being done. And that was it,” she said.

Later, once research is completed and published, a colonial mindset often determines how that science is taught in schools. Indigenous students might learn from elders and knowledge holders how their communities withstood strong hurricanes and years of drought in past generations. However, “if you go to big cities like San Juan, Ponce, or Mayagüez, they don’t know anything about that because they don’t get this experience and this information in school,” said coauthor Norma Ortiz, a member of the Cidra Indigenous community who worked in the education system for more than 20 years.

“School isn’t that interested in teaching us this. Right now, in school we have a class that teaches about climate change, but nothing about being sustainable. Being an island, this is necessary.”

“One of the biggest threats to sustaining Indigenous knowledge that has been documented is this generational gap and the influence of the colonial school system,” David-Chávez said. “So that’s a really important aspect that’s included in the model, and it was something we focused on in our research study: making sure young people have access to that knowledge.”

Centering Research on Values

To design their youth-led climate study, David-Chávez and Ortiz first turned to community elders and healers in Cidra and Comerío, who served as a community advisory group.

A community advisory group in Cidra codesigned the youth-led climate study. Its members identified which outcomes would be most valuable for the young people of their community and made sure the knowledge shared by elders was applied in a way that honored its history. Credit: Dominique David-Chávez

“At first, we identified people in the community who already had an interest in being involved in a study like this, and we just spoke with them informally,” said David-Chávez. “We asked them specifically what Indigenous environmental knowledge they believed was most important for youth and future generations to learn.”

“They mentioned that they wanted [the students] to learn about our Indigenous understanding of the seasonal cycles for planting and harvesting Indigenous plants, and especially Indigenous food plants. That ended up becoming the goal and theme of our study,” she said.

“By shifting the research to focus not just on goals and objectives and broader impacts, but making that language one of core values … the scientific and cultural protocols align with each other throughout the process,” said David-Chávez.

Indigenous Knowledge About Climate Resilience

Next, “we went to the schools, one in Cidra and one in Comerío,” said Ortiz. “We had many students who wanted to participate, but we made a random selection.” After introducing the students to the project’s theme, she continued, “they learned to use a lot of technology they didn’t know how to use, like a GPS [receiver] and a voice recorder with which they interviewed the elders,” documenting environmental knowledge and noting connections to climate science concepts.

Elders and knowledge keepers in Cidra and Comerío told the researchers that the young people of their communities needed to know which food plants had sustained the communities during past hurricanes. Students learned about edible native roots (left) and documented their findings during a camp (right). Credit: Dominique David-Chávez

The elders “told us a lot about Indigenous knowledge [of] how they survived hurricanes, dry seasons, rainy seasons,” said Ortiz.

For example, “my family plants many crops like yautía,” she said. (Yautía is a type of starchy tuber.) Hurricane winds might knock down fruit trees, “but we have the roots, and it doesn’t matter how strong the hurricane is. The roots always stay below the ground, so we have food.”

After Hurricane Maria in 2017, “the port here in Puerto Rico wasn’t used for 2 weeks,” she said. “So many people had nothing to eat. But we [in Cidra] are in the center of the island. We always have plants. We always have farmers, we always have food, so we didn’t suffer much.”

At the end of the camp, the students presented their research to scientists at the International Institute of Tropical Forestry in San Juan. Ortiz presented the results of their youth research program at AGU’s Fall Meeting 2018.

A Responsibility to Future Generations

Indigenous science knowledge is sometimes heavily stigmatized in schools, said David-Chávez, and students from Indigenous communities may encounter it only in a scientific journal in college or later. By participating actively in the research project, the students learned Indigenous environmental knowledge from the source rather than through a colonial lens.

Norma Ortiz interviews an elder in Cidra. Credit: Dominique David-Chávez

“We had a pretest and posttest as part of the study, where we looked at the impact that teaching science this way has on their attitudes toward science, potentially coming to see themselves as scientists who participate in science,” said David-Chávez. “We also looked at their attitudes toward Indigenous knowledge and science knowledge in their community and how they valued it, how they saw it.”

The surveys revealed that students’ interest in climate and environmental science increased when it was framed in a culturally relevant context. “One of the most striking outcomes we identified early in this study was the renewed sense of pride in and value for Indigenous knowledge among the youth researchers, their families, schools, and community members,” David-Chávez and Ortiz wrote in a blog post about the study.

The researchers hope that intergenerational, youth-led research studies like this one can be used to close the generational knowledge gap in other Indigenous communities. The team is preparing a report for the Puerto Rico Department of Education on the impact of this kind of learning in schools and is also working with a local artist on an Indigenous agricultural calendar to bring to the communities.

“We have a responsibility to the next generation [because] they will have to face climate impacts. They need every resource they can get,” said David-Chávez. “And that includes the Indigenous knowledge people have held about how to adapt, how to observe indicators of seasonal change, which foods will grow well and be resilient.”

“Making sure they have those resources is also part of that resilience.”

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

Dominique David-Chávez and Norma Ortiz wish to thank the members of the Cidra and Comerío Indigenous communities for their contributions to this research. AGU’s Fall Meeting 2019 was held on the traditional territory of the Ohlone people, and the Muwekma Ohlone Tribe continues to live on its traditional lands, which include the present-day city of San Francisco.

This translation was made possible by a partnership with Planeteando. Translation by Rebeca Lomelí; editing by Alejandra Ramírez de los Santos.

Geology and Chemistry Drive Animal Migration in the Serengeti

Wed, 05/27/2020 - 11:58

The most famous migration in the animal kingdom is undoubtedly that of wildebeests. Every year, roughly 1.2 million of the ungulates wind their way through Africa’s Serengeti ecosystem. Researchers now have preliminary evidence that this record-setting migration is dictated by more than just precipitation patterns: Soil chemistry is also a likely driver.

A Perilous Journey

Wildebeests resemble shaggy cows with long, skinny legs. “They look funny,” said Simon Kübler, a geoscientist at the Ludwig Maximilian University of Munich. “They look like a mixture of several animals.” Most people know the animals from nature documentaries showing them traversing the Mara River, a perilous crossing marked by drownings and hungry crocodiles.

Every year, the animals journey roughly 500 kilometers through wide plains covered with short grasses, as well as through wooded areas and landscapes with mixed grasses and shrubs. They’re following the route that their ancestors did, and that movement merits study, said Josephine Mahony, an environmental scientist at the University of Oxford not involved in the research. “The Earth has lost a lot of its migratory ecosystems over time. The Serengeti is one of the last great migratory systems we’ve got left.”

Scientists have often studied wildebeest migration from a climatic perspective but rarely from the angle of rock chemistry and weathering, said Kübler. And what’s in the ground might have a significant influence on animal grazing patterns because soil nutrient levels modulate vegetation growth.

Scientists obtained a “chemical fingerprint” of the Serengeti landscape, allowing them to determine how factors such as geology, volcanism, and tectonic activity might be affecting soil chemistry and nutrient availability, which in turn influence vegetation growth and therefore migration patterns. Credit: iStock.com/mantaphoto

The Chemistry of the Serengeti

Last October, Kübler and three colleagues from German and African institutions met in Serengeti National Park in Tanzania. Starting in the southeastern part of the park, the researchers spent 2 weeks in a beige Toyota Land Cruiser retracing the wildebeests’ clockwise migration route.

Along the way, Kübler and his collaborators collected samples of rock, soil, and vegetation. The aim, said Kübler, was to obtain a “chemical fingerprint” of the landscape. That fingerprint would allow the team to determine how factors such as geology, volcanism, and tectonic activity might be affecting soil chemistry and nutrient availability, which in turn influence vegetation growth and therefore migration patterns.
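The team’s laboratory methods aren’t detailed here, but as a rough illustration of how major-element data can be turned into a weathering signal, the Python sketch below computes the Chemical Index of Alteration (CIA), a standard geochemical proxy. The oxide values are invented for demonstration; they are not the team’s measurements.

```python
# Illustrative sketch: the Chemical Index of Alteration (CIA) is one
# standard proxy for how strongly a rock or soil has been chemically
# weathered. Values below are hypothetical, not field data.

MOLAR_MASS = {"Al2O3": 101.96, "CaO": 56.08, "Na2O": 61.98, "K2O": 94.20}

def cia(oxides_wt_percent):
    """CIA = 100 * Al2O3 / (Al2O3 + CaO* + Na2O + K2O), in molar proportions.

    CaO* should count only silicate-bound calcium; the carbonate
    correction (relevant where Ol Doinyo Lengai ash adds calcium
    carbonate to soils) is omitted here for simplicity.
    """
    mol = {ox: wt / MOLAR_MASS[ox] for ox, wt in oxides_wt_percent.items()}
    return 100.0 * mol["Al2O3"] / (
        mol["Al2O3"] + mol["CaO"] + mol["Na2O"] + mol["K2O"]
    )

# Hypothetical soil sample (weight percent oxides).
sample = {"Al2O3": 15.2, "CaO": 3.1, "Na2O": 2.4, "K2O": 1.8}
print(f"CIA = {cia(sample):.1f}")  # ~50 = fresh rock; >80 = intense weathering
```

Higher values would be expected where chemical weathering is strongest, such as the rainy area near the park’s northern border.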

Most of the samples are still awaiting analysis in a laboratory in Arusha, Tanzania, said Kübler. But the team, represented by Eileen Eckmeier from the Ludwig Maximilian University of Munich, presented several preliminary results at this month’s EGU2020: Sharing Geoscience Online, a virtual series hosted by the European Geosciences Union.

The site farthest south that the team sampled—within the animals’ springtime grazing grounds—is characterized by soils enriched by a nearby volcano, the researchers found. Ol Doinyo Lengai, roughly 50 kilometers east of Serengeti National Park, holds a unique honor among volcanoes: It produces magma rich in sodium and calcium. (That’s unlike most other volcanoes, which spew out silica-rich magma.)

Ash from Ol Doinyo Lengai rains down on the southeastern part of the park and sprinkles calcium into the soil, Kübler said. “You can see calcium carbonate concretions in the soils.”

This nutrient contributes to soil fertility, which in turn promotes vegetation growth. Calcium also helps animals develop strong bones. It’s probably not a coincidence that wildebeests graze here with their young, said Kübler. “We believe that the activity of Ol Doinyo Lengai, as the calcium source for the southeastern part of the ecosystem, is critical for keeping the migration alive.”

The next site that the team visited was a transitional grazing spot where wildebeests spend late fall and early winter. Chemical analyses are still in progress, but “we believe that the nutrient levels in the soils [here] are probably the lowest,” said Kübler. “Wildebeest can only stay for a limited amount of time until they migrate farther north.” Water-induced erosion likely contributes to the poor soil quality in this region, the team concluded.

The third and final site the scientists analyzed was near the northernmost border of the park, where wildebeests spend the late summer and early fall. Because of high precipitation levels in this area, rocks experience more chemical weathering, the team hypothesized, which releases nutrients into the soil and promotes vegetation growth. Furthermore, there’s a source of fresh rock because tectonic activity and uplift are occurring near this part of the park, said Kübler. “Tectonic processes can expose fresh and unweathered rocks.”

In the future, Kübler and his colleagues plan to study the timescales over which geologically important processes like volcanism and tectonic activity occur. “Climatic signals may be active on shorter timescales,” said Kübler. “The geological system that’s underlying the entire ecosystem might be stable for longer periods of time.”

—Katherine Kornei (@KatherineKornei), Science Writer

We Need to Direct More Science Research Dollars to Rural America

Wed, 05/27/2020 - 11:55

In rural regions of the U.S., distrust in science is relatively high. The consequences of this are huge, because rural votes affect who is elected to govern and whether the decisions they make are consistent with the best available science. If rural residents elect influential politicians who distrust science, U.S. leadership on global issues suffers. Our nation and others are exposed to greater risk around issues like climate change and pandemic response.

As the U.S. House and Senate plan for further pandemic relief funding for America, they should make sure some of that money goes toward supporting science — and in particular, science at rural colleges and universities.

Trust in science will grow if scientists are part of rural communities.Such funding will bring more scientists to rural communities. And trust in science will grow if scientists are part of rural communities. As a scientist at a rural college, I’m often asked if climate change is real. People trust my answer, not because I’ve studied climate change impacts and am a lead author on a 2019 Intergovernmental Panel on Climate Change report, but because they know me as a neighbor and a friend.

Trust is just one reason to steer science funding toward rural colleges and universities. These institutions offer an opportunity to involve local communities in science. I learn from members of my community what they think should be part of my research. Without community involvement, science may not be aligned with community priorities and consequently go unused.

Research at rural institutions also improves our understanding of nature’s benefits in every corner of the country, from providing refuge for diverse species, to nurturing crops, to supporting an abundance of marine life. In Colorado, I study how people affect mountain snowpack and how mountain snowpack affects plants, water supply and people. This natural wealth that surrounds rural schools is our national savings, and without rural science we do not know how much money is in the bank or how fast we are spending it.

We purchase new equipment from the housewares aisle at Walmart and use old equipment donated by retiring scientists.COVID-19 has brought to light yet another advantage of funding science at rural colleges and universities. The pandemic is putting some global change science on hold, leaving critical environmental data sets uncollected. But, as a rural scientist, I can hire students and conduct research in our area, even with travel restricted. With the help of pandemic relief funding, we could study how the mountains in which we live are responding to this great shift in human activity.

Though great science can happen in rural America, rural scientists currently face huge financial limitations. To overcome this, we purchase new equipment from the housewares aisle at Walmart and use old equipment donated by retiring scientists. It’s not an environment conducive to attracting more scientists and students. We need to change that now, by providing funding to grow science in rural regions.

Federal funding for research could be a lifeline for rural schools. Modern equipment and real lab spaces would be game changers. Field stations could ensure that we can study tundra landscapes and deserts and their connection to our well-being. Recovery dollars could help rural institutions hire and retain new faculty by offering the resources they need.

Pandemic-related relief funding that goes to science should not simply support existing research and new positions where science is already thriving. Choices for funding need to be strategic. They need to be equitable. They should support rural science to build a better future for us all.

—Heidi Steltzer (@heidimountains), Mountain Scientist

This opinion originally appeared in Ensia. Heidi Steltzer is a lead author of Intergovernmental Panel on Climate Change’s (IPCC) September 2019 Special Report on the Ocean and Cryosphere in a Changing Climate and a professor of environment and sustainability at Fort Lewis College in Durango, Colo.

Volcano Monitoring Goes Offshore

Tue, 05/26/2020 - 16:13

A significant proportion of the world’s active volcanoes sit offshore. Monitoring them is vital because eruptions can disrupt human activities such as marine traffic. However, offshore volcanoes are not studied as much as their onshore counterparts because technical challenges hinder continuous monitoring.

Recent technological developments have allowed monitoring of some submarine volcanoes. Hefner et al. [2020] present new work on Axial Seamount, an active seafloor volcano off the Oregon coast. They demonstrate that vertical displacements from the 2015 eruption recorded by ocean-bottom pressure gauges are well modeled by a combination of deflation of a shallow magma reservoir and fault slip on the caldera wall. This growing series of studies demonstrates the utility of offshore measurements for monitoring volcanic activity.
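As a rough sketch of the kind of forward model fit to such pressure-gauge data, the snippet below evaluates the classic Mogi point-source approximation for vertical displacement above a deflating magma reservoir. This is a textbook simplification, not the authors’ revised model (which also includes caldera-wall fault slip), and the volume change, depth, and distances are invented for illustration.

```python
import numpy as np

def mogi_uz(delta_v, depth, r):
    """Vertical surface displacement (m) above a Mogi point source.

    delta_v: reservoir volume change in m^3 (negative for deflation);
    depth: source depth in m; r: radial distance from the axis in m.
    Assumes an elastic half-space with Poisson's ratio 0.25.
    """
    return 3.0 * delta_v * depth / (4.0 * np.pi * (depth**2 + r**2) ** 1.5)

# Hypothetical deflation of 0.02 cubic kilometers at 2 kilometers depth,
# sampled at distances where ocean-bottom pressure gauges might sit.
for r_km in (0.0, 1.0, 3.0):
    uz = mogi_uz(delta_v=-0.02e9, depth=2000.0, r=r_km * 1000.0)
    print(f"r = {r_km:3.0f} km: uz = {uz:+.2f} m")
```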

It is not only vertical displacements that are useful in monitoring offshore volcanoes but also horizontal displacements. For example, a new system has been developed using a rigid buoy, attached to the seafloor, that carries a Global Navigation Satellite System antenna. Future technological developments will allow us to monitor submarine and island volcanoes more extensively and at lower cost.

Citation: Hefner, W. L., Nooner, S. L., Chadwick, W. W., & Bohnenstiehl, D. W. R. [2020]. Revised magmatic source models for the 2015 eruption at Axial Seamount including estimates of fault‐induced deformation. Journal of Geophysical Research: Solid Earth, 125, e2020JB019356. https://doi.org/10.1029/2020JB019356

—Yosuke Aoki, Associate Editor, JGR: Solid Earth

The Role of Earth and Space Scientists During Pandemics

Tue, 05/26/2020 - 12:09

We are living in exceptional and difficult times due to the spread of SARS-CoV-2, the virus responsible for the coronavirus pandemic (COVID-19), which has significantly disrupted the usual rhythm of our days. As Earth and space scientists, we believe we can offer insights into the pandemic and work collectively towards solutions.

The coronavirus pandemic is truly a global health crisis that has affected every country around the world. Governments have imposed severe restrictions on their populations in a bid to stop the spread of coronavirus, including strict controls on internal travel, bans on foreign visitors, and orders for people to stay in their homes. At the time of writing in mid-May, the virus has infected over 4.6 million people and killed more than 311,500 across the world in under six months (data from the Center for Systems Science and Engineering at Johns Hopkins University).

While the immediate concern during this public health emergency is focused on the provision of medical care and interventions to reduce transmission, this is not only an issue for biomedical scientists, virologists and public policy experts to tackle. Earth and space scientists have an important role to play too. Soil scientists, climate modelers, hydrologists and people from many other disciplines can offer valuable insights into the coronavirus pandemic and other global health challenges.

Planetary changes

During recent decades, humans have been exploiting the natural resources of our planet – fossil fuels, water, land, timber, minerals, wildlife and much more. The need and desire for more natural resources has led humans to encroach on various natural habitats. This results in expansion of “ecotones”, where species assemblages from different habitats mix, providing new opportunities for “spillover” of pathogens from wild animals and insects into human beings.

What we are experiencing is not the first case of human diseases originating from indiscriminate contacts with infected animals. For example, the emergence of the human immunodeficiency virus (HIV) is believed to have arisen from the hunting of nonhuman primates (chimpanzees) for food in central African forests. The outbreaks of Ebola hemorrhagic fever from 1999 and other coronaviruses such as SARS (severe acute respiratory syndrome coronavirus, SARS-CoV) in 2003 and MERS (Middle East respiratory syndrome coronavirus, MERS-CoV) in 2012, were also triggered by a jump from animal to human in disturbed natural habitats. Phylogenetic analysis of the novel SARS-CoV-2 virus suggests an initial single-locus zoonotic spillover event in December 2019 at a wet market in the large, modern city of Wuhan (Hubei province, central China), where wildlife was being sold, often in unhygienic conditions.

While these examples may have happened in places far from where we live, we must share a collective responsibility for this as 21st century humans in a globalized world. Across all continents, human populations are expanding their footprint as ever-more land is used for settlements, agriculture and natural resource extraction. This penetration into wildlife habitats increases the risk of spillover of zoonotic viruses and the rate of future zoonotic disease emergence will be linked to the evolution of the agriculture–environment nexus.

The changing climate is another factor to consider in the spread of disease. Interannual and interdecadal climate variability interacts with environmental and land-use changes, affecting the survival, reproduction and distribution of disease pathogens and their hosts and contributing to disease emergence. The rise in annual average temperatures has also altered the habitats of pathogen-carrying insects, causing outbreaks such as West Nile, chikungunya, dengue, and Zika to appear in new geographic regions.

A need for convergent research

There is an urgent need for convergent research focusing on the biological, ecological and social drivers of pathogen emergence and distribution. Through holistic, integrated and interdisciplinary studies, and through codesign with, and participation of, relevant stakeholders, we must focus our attention on pathogen dynamics at the wildlife–livestock–human interface, along with the influence of warmer temperatures on disease vectors.

Earth and space science is the science of the 21st century. All disciplines of our science can produce the knowledge and expertise needed to address the grand societal challenges that we collectively face. These challenges transcend borders and politics. We must take this opportunity and broaden our reach by inviting those in other domains of science and relevant stakeholders to work with us in co-designing and co-producing knowledge and solutions. Through this, we will advance knowledge and create solutions that benefit communities, places and environments, and contribute towards the sustainability of our planet.

— Fabio Florindo (fabio.florindo@ingv.it), Editor in Chief, Reviews of Geophysics; and Christine McEntee, former Executive Director and CEO of AGU

Seismic Noise Reveals Landslides in the Gulf of Mexico

Tue, 05/26/2020 - 12:09

Some 5% of signals recorded in the seismic record are caused by earthquakes. The rest is noise: Sensitive instruments pick up signals from ocean waves, storms, ice movement, hard-rock landslides, traffic, and even the occasional football game.

In examining this noise, scientists recently detected some strange signals emanating from the Gulf of Mexico. After a good bit of detective work, they determined the signals were coming from submarine landslides, about 10 per year, triggered by earthquakes hundreds to thousands of kilometers away. These findings have significant implications for the study of submarine slope failure processes and potential implications for paleoseismology, the oil and gas industry, and even coastal hazards assessments.

Seismic Noise

Seismologist Wenyuan Fan of Florida State University in Tallahassee was trying to understand the seismic noise across the United States, using USArray data, when he ran across “a lot of sources in the Gulf of Mexico and got very confused” because the Gulf region is not known for producing many earthquakes.

Fan and his colleagues went back through the data, combining USArray data with seismic data from many regional networks (including the Southern California Seismic Network and the Pacific Northwest Seismic Network) and then making sure the signals showed up on multiple arrays. They also compared the signals to the U.S. Geological Survey (USGS) database of known earthquakes.

Fan and his colleagues, Jeff McGuire of the USGS and Peter Shearer of the Scripps Institution of Oceanography, found 85 seismic events between 2008 and 2015 emanating in the Gulf of Mexico’s Western Planning Area, a region managed for development of oil, gas, and mineral resources by the Bureau of Ocean Energy Management. None of the 85 seismic sources showed up in the USGS earthquake database, suggesting the seismic events, some of which were as strong as a magnitude 3.5 earthquake, were not earthquakes.

The lack of faults and earthquakes in the region plus the abundance of these events “and the signatures of the waveforms led us to think they are likely to be submarine landslides,” Fan said. Furthermore, all but 10 of the sources were preceded by earthquakes of magnitude 4.9 to 7.3 originating 1,000 kilometers or more away, and the coincidence of these events with the passing seismic waves from the remote earthquakes suggested that the landslides were triggered remotely, the team reported in Geophysical Research Letters.
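The authors’ processing is more involved, but the core temporal test can be sketched simply: for each detected Gulf event, ask whether surface waves from a cataloged remote earthquake would have been passing through the Gulf at that moment. In this minimal sketch, the catalog, detection times, and wave speed are all assumed for illustration, not taken from the study.

```python
# Minimal sketch (not the authors' code) of associating detected events
# with passing surface waves from remote earthquakes. Times are seconds
# from an arbitrary reference; all entries are hypothetical.

SURFACE_WAVE_SPEED_KM_S = 3.5  # typical Rayleigh-wave group speed

catalog = [
    # (origin_time_s, distance_to_gulf_km, magnitude)
    (0.0, 1800.0, 6.1),
    (40000.0, 9500.0, 7.3),
]

detections = [520.0, 2750.0, 42650.0]  # detected Gulf event times

def triggering_candidates(detection_t, window_s=120.0):
    """Return catalog quakes whose surface waves reach the Gulf within
    window_s seconds of the detection time."""
    hits = []
    for origin_t, dist_km, mag in catalog:
        arrival = origin_t + dist_km / SURFACE_WAVE_SPEED_KM_S
        if abs(detection_t - arrival) <= window_s:
            hits.append((origin_t, dist_km, mag))
    return hits

for t in detections:
    print(f"event at t={t:.0f} s: candidates {triggering_candidates(t)}")
```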

Remote Triggering

Not all geoscientists are convinced. For one thing, the Western Planning Area is riddled with faults throughout the salt diapirs that make up much of the subsurface, said Chris Goldfinger, a geologist at Oregon State University in Corvallis.

This detailed survey map of the Terrebonne Basin, off the central coast of Louisiana, shows the geologic complexity of the region. Credit: BOEM

Though these don’t create traditional crustal earthquakes, they can shake. It is far more likely that the seismic signals Fan’s team observed came from quakes in the salt diapirs than from submarine landslides, Goldfinger said.

There has been no evidence of triggering of submarine landslides by remote earthquakes in the geologic record, he said, and submarine landslides are not known to produce seismic signals. “Submarine landslides, by nature—failing poorly consolidated materials in a low-gravitation environment—don’t make much noise, like marshmallows rolling down a gentle incline,” Goldfinger said. The submarine landslides would have to be enormous to create seismic signals, and that’s just not a likely scenario, he said.

However, Goldfinger added, the temporal links between seismic signals that Fan’s team found are intriguing. It’s quite possible, he noted, that the authors may have solved a resolution problem in the offshore Gulf: Quakes in the diapirs might have been occurring all along but hadn’t been seen by land-based seismometers until USArray.

Triggering of submarine landslides by local earthquakes is common and frequently used to identify past large earthquakes, such as those along the Cascadia Subduction Zone where Goldfinger works. If you shake the local layers of sediments on steep slopes hard enough, they will fail, producing turbidites, which can be dated to determine when the submarine landslide, and thus earthquake, occurred.

Triggering of one earthquake by another is also quite common. That’s because faults are often “ready to go,” Goldfinger said.

But landslides don’t fail by the same mechanism, Goldfinger explained. They are “different beasts.” If the ground was shaking strongly enough to trigger a landslide, in theory, it should trigger several or even a lot of landslides, not just one, as Fan and his team recorded, Goldfinger said.

Confirming Preconditioning

Michael Strasser, a marine geologist and sedimentologist at the University of Innsbruck in Austria, noted that recent research has indicated that steep slopes with high sediment rates (like the Gulf of Mexico) can be “preconditioned” to fail and may just need a little push.

A 2006 study in Proceedings of the Integrated Ocean Drilling Program, for example, showed that Gulf of Mexico sediments are overpressured, Strasser said. If a seismic wave passes through overpressured sediments, it can cause them to fail. If you accept the preconditioning hypothesis, Strasser said, Fan’s “fascinating” results could make sense and “could indicate that such slopes are close to failure.” Of course, he said, he would really like to see some ground truthing.

Fan’s team did not confirm that any of the 85 submarine landslides they inferred through seismic data actually occurred because the team’s landslide location resolution is “too poor” to make a direct comparison to the high-resolution bathymetric data of the Gulf of Mexico, Fan said. Having an ocean bottom seismometer array in the Gulf would greatly improve the landslide location accuracy, he said.

Scientists could also survey the seafloor with “very, very good multibeam sonar or a submersible” before and after landslides, Strasser said, but it would be hard, given the size of the proposed landslides and the area to be surveyed plus current technology. One could also try drilling cores, said marine geologist Paul Johnson of the University of Washington. But engineers and scientists would need to know just where to drill.

Wide-Ranging Implications

The implications of Fan and his colleagues’ findings are wide-ranging. It’s pretty clear, Fan said, whether the seismic sources are submarine landslides or something else, that “it’s active out there” in the Gulf.

Geologic activity in the northern Gulf of Mexico would be of interest to oil and gas production there. Credit: U.S. Energy Information Administration

Although these landslides are small enough that they probably won’t change the tsunami hazard potential for Gulf Coast communities, they could change the picture for oil and gas operators in the Western Planning Area. Indeed, Goldfinger said, “if I were an oil guy reading this paper, I’d be concerned.”

The study also may have implications for paleoseismology because Fan and his colleagues “made a great case for remote triggering,” Johnson said. But correlating old submarine landslides with past earthquakes is done by correlating numerous sites “to assess the origins of event deposits and to filter out extraneous signals,” Goldfinger said. “You can’t compare just one earthquake and landslide.”

Strasser said it’s probably just going to take time before scientists have any idea how important this paper is: “I’m really looking forward to seeing this method applied elsewhere. If it’s fundamentally relevant, it should be detectable around the world.…It’s a beautifully testable hypothesis—great science.”

—Megan Sever (@MeganSever4), Science Writer

An Iconic Eruption Shaped Careers, as well as Landscapes

Tue, 05/26/2020 - 12:05

In March 1980, Mount St. Helens began to wake up. A series of small earthquakes shook the mountain, and over the next 10 days, hundreds more followed. On 27 March, violent steam explosions blasted through the summit’s ice cap to create a giant 75-meter-wide crater. On 18 May, the volcano’s entire north face collapsed, creating the largest landslide ever recorded.

Just a few years earlier, in 1978, two U.S. Geological Survey (USGS) researchers—Rocky Crandell and Don Mullineaux—had cautioned that the volcano could erupt relatively soon, “perhaps before the end of the century.”

Consequently, volcanologists like Mullineaux, Crandell, and USGS research geologist Don Swanson were quick to arrive in March. Swanson had grown up in southwestern Washington, and the Cascades were familiar territory. Before 1980, however, his research had focused mainly on the geological study of the range’s older volcanic rocks. “Then Mount St. Helens came along,” Swanson said, “and I got back into the active side of volcanism.”

Growing Threat

By early April, the crater had stretched wide across the summit, and a broad area of the volcano’s northern flank appeared to be growing outward. Concerned, Crandell sought the expertise of Barry Voight, a landslide researcher at Pennsylvania State University. On the basis of what he saw, Voight believed that a giant landslide was possible—and that it could “uncork” the volcano and cause a major eruption.

“I remember having dinner at a Portland fish restaurant with a former grad student, making a back-of-envelope calculation of a likely volume of collapse—well over a cubic kilometer—and thinking it could happen very soon and shock a lot of people,” said Voight.

Meanwhile, to quantify movement on the expanding bulge, Swanson and other scientists, with helicopter support, climbed over the mountainside to install surveying targets they had fashioned from plastic highway reflectors.

Only weeks later, at 8:32 a.m. on 18 May, the north slope of Mount St. Helens collapsed. With that release of pressure, the magma pocket inside the volcano exploded and ejected a blast of hot gas and rock. The eruption continued for 9 hours, and further eruptions continued for months.

An ash plume from the eruption towers over Mount St. Helens on 18 May 1980. The plume moved eastward at an average speed of 95 kilometers per hour, and the volcano erupted for 9 hours, eventually reaching 20–25 kilometers above sea level. By early 19 May, the devastating eruption was over. Credit: USGS

Milestone Event for Volcanology in the United States

The eruption and landslide ultimately claimed the lives of 57 people, including USGS geologist David Johnston. It also sparked major changes in the field of volcanology and influenced the careers of many scientists.

For Mullineaux and Crandell, the eruption was the culminating event of their distinguished careers. Both retired officially from the USGS shortly after 1980 but continued to publish articles about volcano hazards, and Mullineaux coedited the definitive 1981 USGS monograph about the catastrophe.

But for many scientists, the disaster marked a new beginning. “The death of Dave Johnston and all the others in 1980 was a, was kind of a spur to me to ask, ‘Can we do better in the future?’” Swanson said.

Swanson spent the next 6 years in the field at Mount St. Helens, painstakingly tracking the growth of new lava domes inside the crater. During that time, the volcano continued to erupt, and scientists were able to start forecasting its behavior.

Voight also worked on the posteruption disaster response and guided his graduate student Harry Glicken’s investigation into the 1980 slope failure at the volcano. Their study triggered a dramatic surge of interest in volcanic landslides and the related explosions caused by volcano collapses, as well as the global realization that such events were not as rare as previously thought.

Don Swanson carries a huge pumice rock in the weeks after the 1980 eruption. Credit: USGS

Glicken went on to become an expert on volcanic avalanches but in 1991 was killed by an explosive eruption of Mount Unzen, Japan. “Harry was a brilliant, insightful field geologist,” Voight said, “and his life was rewarding for science.”

Voight’s analysis of various Mount St. Helens data helped him formulate the failure forecast method, which can help to forecast landslides and volcanic eruptions. His field studies and engineering approach to volcano processes earned him a strong reputation in the community. “Wherever there was a problem that involved a potential slope failure at a volcano, my name would come up,” Voight said.
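One widely cited version of the method can be sketched in a few lines: when a precursor rate (bulge motion, seismicity) accelerates following Voight’s relation with exponent alpha = 2, the inverse of the rate falls linearly with time, and extrapolating that line to zero estimates the failure time. The numbers below are synthetic, not Mount St. Helens measurements.

```python
import numpy as np

# Inverse-rate failure forecast sketch (Voight's relation, alpha = 2):
# 1/rate declines linearly in time, so a linear fit's zero crossing
# estimates the failure time. Synthetic data for illustration only.

t = np.array([0.0, 2.0, 4.0, 6.0, 8.0])        # days of observation
rate = np.array([1.0, 1.25, 1.67, 2.5, 5.0])   # e.g., mm/day of deformation

inv_rate = 1.0 / rate
slope, intercept = np.polyfit(t, inv_rate, 1)  # fit 1/rate = slope*t + b
t_failure = -intercept / slope                 # time at which 1/rate -> 0
print(f"forecast failure near t = {t_failure:.1f} days")  # -> ~10 days
```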

Voight went on to assess the potential for slope collapses and related eruptions at volcanoes around the world, including those in South America, Japan, Russia, Southeast Asia, and the Caribbean. “Without the Mount St. Helens eruption, I wouldn’t have done any of these things,” Voight said. “It completely changed the trajectory of my career.”

Improved Monitoring and Mitigation  

Following the 1980 eruption, the U.S. government boosted funding for research in volcanology more than tenfold, which led to the establishment of the USGS Cascades Volcano Observatory (CVO) in 1981. “The Mount St. Helens eruption really galvanized a lot of people from the federal government side to the university side to start thinking about volcanism in the United States,” said Seth Moran, the scientist-in-charge at CVO. “On a basic level, it’s the reason why I’m doing what I’m doing today.”

A scientist surveys the bulging flank of Mount St. Helens after the 1980 eruption. Credit: Mike Doukas, USGS

The Mount St. Helens eruption also inspired many volcanologists to devote more of their work to mitigating the impacts of volcanic activity. Swanson went to the USGS Hawaiian Volcano Observatory in 1998 with the goal of better understanding the explosive history of Kīlauea. His research there highlighted the complex relationship between caldera collapse and eruptive activity at Kīlauea and formed the basis for much of the assessment of hazards associated with Kīlauea’s 2018 summit collapse.

“I’m not sure I would have had the same interest in the explosive deposits from Kīlauea if I hadn’t had the experience of seeing what the explosion at Mount St. Helens did,” Swanson said. “But there was a greater awareness from the 1980 eruption that volcanoes are hazardous, and that has sharpened our studies of Kīlauea.”

Paying It Forward

Barry Voight (facing camera) and Oscar Ospina install an instrument under the summit ice of Nevado del Ruiz, shortly after the volcano’s fatal eruption in 1985. Credit: Barry Voight

In 1985, Colombian authorities requested assistance from the United States shortly after the eruption of the Nevado del Ruiz volcano, which resulted in 23,000 deaths. Voight was sent to Nevado del Ruiz, and his analysis of the disaster indicated that cumulative human error contributed to the extent of the tragedy and that improved hazard assessment and mitigation measures could have saved many lives.

“I was able to examine why the whole disaster took place and became more acutely aware of the hazard management aspect,” Voight said. “A major general lesson was that scientists, who had the best appreciation of the true hazard, needed to shoulder more social responsibility—to ensure that an appropriate and effectively delivered message reached the people at risk.”

Voight was able to use his experience and skills in physical volcanology and hazard management when he was asked to be part of a team of scientists monitoring the Soufrière Hills volcano at Montserrat, an island in the Lesser Antilles. His research and hazard mitigation efforts there extended over 17 years and multiple eruptions.

In a December 1997 eruption, an anticipated flank collapse at Soufrière Hills led to a severe and devastating lateral blast, closely mimicking that of Mount St. Helens in 1980. Fortunately, islanders had evacuated, and there were no fatalities associated with this dramatic event. “If you look at it in a global context, I think what was learned at St. Helens has probably saved, and will continue to save, tens of thousands of lives,” Swanson said.

—Jane Palmer (@JanePalmerComms), Science Writer

Sensor Network Warns of Stealth Tsunamis

Tue, 05/26/2020 - 12:03

Tsunamis are among the deadliest of natural hazards, having caused about 250,000 fatalities in the past 20 years alone. For locally generated tsunamis, the most effective way to reduce loss of human life is also one of the simplest: People feel long-lasting or strong ground motion or see a wave approaching, and they evacuate the area. However, direct tsunami observation may not be available until hours after a tsunami is generated, whereas some locally generated tsunamis arrive at coastlines within minutes. Quickly and accurately measuring tsunami waves before they reach a coast is the most certain way to forecast tsunami impacts and reduce harm to people, property, infrastructure, and the economy.

Recently, scientists, policy makers, and emergency managers have begun to recognize that in some regions, the warning provided by ground shaking, even from large regional subduction earthquakes (M > 8), might not be sufficient to convince people to move away from coastal areas and/or inland out of tsunami evacuation zones. One example was the 2009 Samoan earthquake and tsunami: Despite the earthquake’s large magnitude (M8.1) and its regional distance of only a few hundred kilometers, it wasn’t felt strongly in Tonga, yet it generated land inundation that caused fatalities. Another region prone to this stealth tsunami effect is New Zealand’s densely populated north coast, where tens of thousands of people, including residents of the country’s largest city, Auckland, are exposed to tsunami hazards [Fry et al., 2018].

To reduce tsunami risks to coastal populations, the New Zealand government recently announced an initiative to support the Pacific Tsunami Warning System to expand the global network of tsunameters in the southwest Pacific. Tsunameters, devices or systems that detect tsunamis, often use Deep-ocean Assessment and Reporting of Tsunamis (DART) real-time monitoring systems. This New Zealand initiative, with its planned deployment of 12 new 4G DART buoys, marks the biggest single growth in the tsunami monitoring network in this region since its inception (Figure 1). This array will soon underpin improved tsunami monitoring and detection for all communities bordering the Pacific Ocean.

Fig. 1. Approximate locations of new (lettered) and existing (numbered) DART sites. Colored lines indicate faults associated with seismic activity; a tsunami generated along one of these faults will be detected by the DART buoy array in the time indicated by color. Detection times for the existing array (left) can be compared to detection times with the full array (right).

Establishing a Monitoring Network

A distant tsunami might take more than 3 hours to travel from its origin to where it reaches a coast. Regional tsunamis arrive at nearby coasts between 1 and 3 hours after they’re initiated, and local tsunamis may arrive in less than an hour. Reducing uncertainty in forecasting any type of tsunami is best done by measuring open-ocean sea heights using tsunameters [Angove et al., 2019].
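The physics behind those travel times is simple enough to sketch: in the open ocean, a tsunami moves at roughly the shallow-water wave speed, the square root of gravitational acceleration times water depth, so depth and distance largely set how much warning a sensor can provide. The distances and depths below are illustrative, not operational values.

```python
import math

def travel_time_minutes(distance_km, depth_m, g=9.81):
    """Tsunami travel time under the shallow-water approximation,
    c = sqrt(g * h): about 200 m/s over 4,000-meter-deep ocean."""
    speed_m_s = math.sqrt(g * depth_m)
    return distance_km * 1000.0 / speed_m_s / 60.0

# Hypothetical source-to-sensor geometries.
for dist_km, depth_m in [(100, 4000), (500, 4000), (500, 1000)]:
    minutes = travel_time_minutes(dist_km, depth_m)
    print(f"{dist_km:>4} km over {depth_m} m depth: {minutes:.0f} min")
```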

After the magnitude 9 Sumatra earthquake and tsunami in 2004, the United Nations Educational, Scientific and Cultural Organization advocated for the installation of a Pacific-wide tsunami monitoring system comprising a network of DART buoys. In the 15 years since its establishment, the system has provided accurate and useful tsunami warning advice for numerous regional and trans-Pacific tsunamis.

This network now includes 39 observational buoys operated by the U.S. National Oceanic and Atmospheric Administration in the Pacific and Atlantic Oceans and 21 buoys operated by countries that partner in the Pacific and Indian Ocean Tsunami Warning System. Improving the density of this array will improve forecasting, especially for regional and local events.

New Sensors Record Close to the Source

Technical limitations in early tsunami buoys required them to be placed far from tsunami sources. Thus, they were best suited to providing early warnings for tsunamis generated by distant earthquakes, with long travel times to land. When these early-generation DARTs were located too close to the earthquakes that generated tsunamis, the signals they measured were ambiguous because the signals contained overlapping readings of both tsunami waves and compressional earthquake waves traveling through Earth’s crust. This was the case for measurements made right after the magnitude 7.9 Gulf of Alaska earthquake of 2018, for example.

Recent advances in DART technology now facilitate separation of earthquake and tsunami contributions to the recorded signals, so the buoys can be placed close to potential earthquake sources (e.g., a subduction trench) to provide effective warning for even relatively short tsunami travel times. Because of this new ability to provide early warnings for local and regional tsunamis, DART deployments are now a viable option for tsunami monitoring in many of the smaller countries that are exposed to tsunamis generated nearby (Figure 2). Consequently, the capability of the Pacific tsunami monitoring network is rapidly expanding to provide faster warning times following future big earthquakes.

Fig. 2. Diagram showing travel times to the nearest DART buoy for a tsunami generated within the indicated regions. Recent advances in DART technology enable better separation of the signals from earthquakes and the tsunamis they generate, so the buoys can be placed closer to potential earthquake sources. Thus, many small countries at risk from tsunami hazards can use DART deployments for tsunami monitoring.

The New Zealand initiative capitalizes on these technological advances and includes the planned deployment of 12 new 4G DART buoys proximal to the Hikurangi, Kermadec, Tonga, and New Hebrides trenches (Figure 1). The array geometry of the network has been designed to detect most regional tsunamis generated by subduction earthquakes within 20 minutes of earthquake onset. The network will also detect tsunamis generated by volcanic eruptions in the Tonga-Kermadec back arc within approximately 30 minutes of the event.

Data from the new array will be open access and made available to monitoring agencies and the scientific community. Data will also be streamed in real time to the Pacific Tsunami Warning System to enhance trans-Pacific tsunami forecasts, providing benefits to all countries surrounding the Pacific Ocean.

From Sensor Data to Tsunami Forecasts

In operational monitoring, DART data are used to calibrate tsunami forecasts on the basis of earthquake source models. Currently, operational tsunami early warning (TEW) is predicated on using seismological methods to detect and describe the tsunami-generating earthquake, which is then taken as a proxy for the tsunami source. On the basis of the assumed tsunami source, tsunami wave arrival times and amplitudes at DARTs are then predicted. Differences between predictions and actual DART observations are used to refine the forecast.
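A minimal sketch of that refinement step, under the simplifying assumption that open-ocean amplitudes scale linearly with the size of the source: rescale the modeled source so predicted peak amplitudes best match what the buoys recorded. The amplitudes below are invented, and operational systems are far more sophisticated.

```python
import numpy as np

# Hypothetical modeled vs. observed peak amplitudes (m) at three DARTs.
predicted = np.array([0.12, 0.30, 0.08])
observed = np.array([0.18, 0.41, 0.13])

# Least-squares scale factor for the source, assuming amplitudes scale
# roughly linearly with slip for small perturbations.
scale = np.dot(observed, predicted) / np.dot(predicted, predicted)
print(f"source scale factor: {scale:.2f}")  # >1 means a bigger tsunami
print("refined forecast at DARTs:", np.round(scale * predicted, 3))
```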

Increasingly dense DART networks, with their greater numbers of detecting and measuring instruments, reduce uncertainty in forecasts of the maximum coastal amplitude of tsunami waves and the duration of significant waves. This increase in precision enables more efficient and effective emergency response, earlier warning cancellations, fewer false alarms, and, ultimately, fewer lost lives and reduced disruption from tsunami warnings.

Dense tsunameter monitoring networks will also open future possibilities for using advanced data science techniques to rapidly forecast tsunamis. Currently, using the generating earthquake as a proxy for tsunami source, rather than determining the tsunami source directly, is still a significant limitation. However, a proliferation of DART buoys or other open-ocean water height observations will eventually support next-generation TEW based on machine learning approaches. Such approaches include data assimilation in which direct observations of tsunamis will drive physics-based forecasts. These approaches require measurement of the tsunami waves at multiple locations prior to their arrival in at-risk coastal communities. The new southwestern Pacific tsunameter network provides a unique opportunity to prototype these tools in an operational setting.

Big Earthquakes, Small Warnings

The lack of sufficient natural warning signs and short tsunami travel times from initiation to coastal impact (between about 45 and 90 minutes) in the southwestern Pacific pose significant risks to coastal communities. Improving tsunami early warning for New Zealand and Pacific Island communities motivates the new network expansion, which has been led by the New Zealand National Emergency Management Agency.

Fig. 3. In one widespread approach to tsunami early warning (top), coastal residents evacuate an area mostly on the basis of natural warning signs when an approaching tsunami takes less than 1 hour to reach the coast. Seismic instruments record regional ground shaking, and for tsunami travel times greater than 3 hours, open-ocean DART observations underpin warnings based on ground shaking and water waves. When seismic attenuation and earthquake geometry dampen ground shaking, this approach produces a gap in tsunami response strategy (middle). The new network will close this gap for much of the southwest Pacific (bottom).

Earthquakes in this region may not produce strong shaking on land because of the distance the seismic waves have to travel and the fact that seismic energy from earthquakes does not radiate and dissipate equally in all directions. Further, seismic waves lose energy as they travel through the hot, weak material of the back arc and mantle wedge of the Tonga-Kermadec Subduction Zone. These effects combine to reduce the shaking at the coastline and thus decrease the effectiveness of ground shaking as a natural warning to keep communities safe.

New Zealand is not alone in this: A lack of adequate natural seismic warning likely applies for major earthquake scenarios in other regions of the Pacific. Thus, for effective tsunami early warning, self-evacuation messaging must be supplemented with scientific monitoring and alerting mechanisms to protect vulnerable coastal populations living close to tsunami sources (Figure 3). The new array of tsunameters provides both a definitive path to improving societal resilience to tsunamis and a test bed for next-generation tsunami forecasting techniques.

Acknowledgments

Underpinning science for this work was funded by the New Zealand Royal Society’s Marsden Fund.

This Week: Our Summer Reading List!

Fri, 05/22/2020 - 06:00

I have a huge stack of books to get through, but I can guarantee you, I will be reading The Beach Club, by Elin Hilderbrand. —Melissa Tribur, Production Specialist

 

Summer is a time for daydreaming, so that’s why I’m turning to fantasy this #quarantinesummer. I’m working my way through The Fellowship of the Ring, by J. R. R. Tolkien, on audiobook—a jovial companion for meandering walks and lazy weekend cooking. I’m also rereading Harry Potter by J. K. Rowling out loud with a friend. We are thoroughly enjoying how villains in children’s books overexplain their motives, as well as the chance to hone our beguiling British accents. —Jenessa Duncombe, Staff Writer

 

I’m guessing I’m not the only one with an e-book holds list that’s maxed out at my local library, since wait times range from 2 to 20 weeks, perfect to distribute through the summer. There are a few I’m most excited to pop up as available, including Fleishman Is in Trouble, the debut novel by Taffy Brodesser-Akner, whose must-read profiles (Val Kilmer, Gwyneth Paltrow) are funny and weird and insightful. This Is How You Lose the Time War by Amal El-Mohtar and Max Gladstone is being called an outstanding entry into the “epic time-traveling love story” genre. Record of a Spaceborn Few is the third book in Becky Chambers’s Wayfarers trilogy, a supremely enjoyable series. —Heather Goss, Editor in Chief

 

My summer reading looks to be heavy on photography and sci-fi, with two books topping my list. First up is The Human Planet: Earth at the Dawn of the Anthropocene, released in April, an aerial photographic tour of Earth highlighting humanity’s impacts on and intersections with the natural world that features George Steinmetz’s stunning camera work and Andrew Revkin’s always-compelling writing. Then I’m hoping to dig into Pierce Brown’s Red Rising series, which—I hear tell—imagines issues of class warfare on a settled Mars hundreds of years in the future in gripping fashion. The first book in the now five-installment series came out in 2014, so no spoilers please if you’ve read it! —Timothy Oleson, Science Editor

 

I’m hoping to get to a couple of books that have been on my wish list for a few years and fully commit to finishing them. I’m going to start with Medical Apartheid, by Harriet Washington, which timelines the history of malpractice in the medical field, and eventually land on Toni Morrison’s collection of essays and speeches, The Source of Self-Regard. —Anaise Aristide, Production and Analytics Specialist

 

My summer reading list is long and only getting longer. On the nonfiction side, once I finish listening to Michelle Obama’s Becoming (she has such a soothing speaking voice), I’ll be starting Amanda Knox’s Love Lives Here; The Craft of Science Writing from Siri Carpenter at The Open Notebook; and The Bible v.2: Genesis, a humorous retelling written by my good friend Aaron Senser. And then to round out my list with some sci-fi and fantasy are Thrawn: Alliances, a Star Wars novel by the legendary Timothy Zahn (aah, Thrawn, gotta love him!); the long, long awaited Dresden Files installments Peace Talks and Battle Ground by Jim Butcher; and my wild card, Starless, by Jacqueline Carey. I like having a wild card book on my list: I know nothing about it, I have no idea if I will like it, but I like the author, so I’m giving it a shot. Seveneves was my last wild card book, and, oh boy, was that a good decision! —Kimberly Cartier, Staff Writer

 

I’m putting my trust in the good folks at the New York Public Library this summer—following recommendations via their Book of the Day email. (Sign up here!) Whenever I finish a book, I’ll check out (literally!) whatever they recommend that day. So far, I’ve enjoyed a Dust Bowl murder mystery and a wonderful, wonderful novel set during the New Zealand Gold Rush. Now I’m in the middle of O Pioneers!, which has been on my to-do list for about 30 years. —Caryl-Sue, Managing Editor

NOAA Predicts Busy Hurricane Season

Thu, 05/21/2020 - 20:57

Coastal states should brace for three to six major Atlantic hurricanes this season, according to a forecast released by the National Oceanic and Atmospheric Administration (NOAA) today. There is a 60% chance of an above-normal storm season, 30% chance of an average season, and 10% chance of a below-normal season.

The agency predicts 13 to 19 named storms, of which 6 to 10 are hurricanes, including 3 to 6 major hurricanes (category 3 or higher). On average, a typical year has 12 named storms, 6 of which become hurricanes, including 3 that grow into major hurricanes, according to NOAA.

“The 2020 Atlantic hurricane season is expected to be a busy one,” Gerry Bell, NOAA’s lead seasonal hurricane forecaster at the Climate Prediction Center, said in a news conference today.

NOAA issues a forecast each spring for the Atlantic hurricane season, which runs from 1 June to 30 November. Among the factors expected to affect the upcoming season are tropical Atlantic and Caribbean sea surface temperatures, Atlantic trade winds, and the phase of the El Niño–Southern Oscillation.

Climate change has made storms more intense, according to research published this week in the Proceedings of the National Academy of Sciences of the United States of America. With each passing decade since 1979, the chance of a storm reaching category 3 or higher has risen by about 6%. Category 3 storms have winds faster than 178 kilometers per hour, strong enough to snap trees, remove roof decking, and cause devastating damage, according to the National Hurricane Center.

“If 2020 becomes an above-normal season, it will make a record of five consecutive above-normal Atlantic hurricane seasons,” Bell said.

A Recipe for a Hurricane

Two climate conditions contributed to the projection of a more active season this year: the warm phase of the Atlantic Multidecadal Oscillation and the neutral to La Niña phase of the El Niño–Southern Oscillation.

The Atlantic Multidecadal Oscillation is a naturally occurring climate phase that lasts about 20–40 years. Its warm phase, which began around 1995, boosts sea surface temperatures, decreases vertical wind shear, and enhances the African monsoon, leading to more favorable storm conditions this year.

The season could become highly active, on the upper end of predictions, if a La Niña develops. La Niña is one phase of the El Niño–Southern Oscillation, and La Niña events typically amp up hurricane seasons by weakening the vertical wind shear over the Atlantic. There is a 50% chance of neutral conditions this year (no El Niño or La Niña) and a 40% chance of a La Niña, according to NOAA’s 14 May forecast. The agency will issue an updated forecast around 10 June.

Bell said that climate change was not a major driver in Atlantic hurricane conditions this year.

Other forecasts by universities, private companies, and governments also predict higher-than-average storm activity this year, with one exception. Across 13 models, the average forecast is 17 named storms and 8 hurricanes, according to CNN. The exception, the European Centre for Medium-Range Weather Forecasts model based in Reading, United Kingdom, has predicted storm counts lower than those observed in the past several years, CNN reported.

Emergency Response Put to the Test

This year may be reminiscent of 2005, the most active hurricane season on satellite record, Michael Mann, an atmospheric scientist and contributor to Pennsylvania State University’s hurricane forecast, told Wired. In 2005, 27 named storms developed, 14 of which became hurricanes. Three storms in 2005 grew to category 5 hurricanes, including Hurricane Katrina, which killed more than a thousand people.

On 20 May, the Federal Emergency Management Agency (FEMA) issued new guidance for storm preparedness during the pandemic caused by the novel coronavirus (COVID-19). Among its recommendations, FEMA urges residents to evacuate to the homes of friends and family instead of emergency shelters and to stay home unless they are in an evacuation zone.

Florida has already accounted for the novel coronavirus in its hurricane preparedness plans. One emergency manager spoke about stockpiling masks, and another laid out procedures for safely housing evacuees: temperature checks and isolation for people with COVID-19 symptoms. Typically, municipalities rely on wide-open spaces like gymnasiums for shelter, but the American Red Cross and other organizations are looking into using hotels, dormitories, and other alternatives.

Only time will tell how many hurricanes will make landfall, because NOAA’s forecasts can’t predict storm tracks months in advance. In 2010, a particularly active hurricane season in the Atlantic, not one of the 12 hurricanes made landfall in the United States. There was a 1-in-70 chance of so few hurricanes coming ashore, according to CNN.

“We always say that it only takes one big hurricane landfall to be a bad season,” said CNN meteorologist Taylor Ward. “So all coastal residents should certainly be paying close attention and have their hurricane plan ready for the upcoming season.”

—Jenessa Duncombe (@jrdscience), Staff Writer

As the Planet Warms, Intense Storms Become More Common

Thu, 05/21/2020 - 15:40

Storms born over water—hurricanes, cyclones, and typhoons—can be downright destructive when they make landfall. What’s more, theory has suggested that intense storms are more likely in a warmer climate. Now researchers have used a nearly 40-year record of storm observations to show that to be true. This finding is cause for concern, the team suggests, because the strongest storms cause disproportionate damage and mortality.

Upping the Speed Limit

Models have consistently demonstrated a link between a warmer climate and stronger storms. That makes sense, said James Kossin, an atmospheric scientist at the National Oceanic and Atmospheric Administration in Madison, Wis. A tropical storm’s potential intensity—its “speed limit”—is set, in part, by the difference between the temperature of the ocean’s surface and the temperature of the upper atmosphere.
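The article doesn’t give a formula, but one widely used expression for potential intensity, due to Emanuel, makes that temperature dependence explicit (shown here purely as an illustration; the study itself isn’t tied to this exact formulation):

\[ V_p^2 = \frac{C_k}{C_D}\,\frac{T_s - T_o}{T_o}\,\left(k_0^* - k\right) \]

Here \(T_s\) is the sea surface temperature, \(T_o\) is the temperature of the storm’s outflow in the upper atmosphere, \(C_k/C_D\) is the ratio of the surface exchange coefficients for enthalpy and momentum, and \(k_0^* - k\) is the enthalpy difference between the ocean surface and the overlying air. Warming the surface relative to the upper atmosphere raises the speed limit \(V_p\).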

To test that hypothesis using real data, Kossin and his colleagues collected infrared satellite imagery of over 4,000 storms observed between 1979 and 2017. Satellite imagery affords the most unbiased observations of storms, said Kerry Emanuel, an atmospheric scientist at the Massachusetts Institute of Technology (MIT) not involved in the research. “Satellites can see every single storm on the planet.”

So Many Data

The data set included more than just a single look at a storm: Some of the tempests were imaged at up to 133 different points in time. In total, Kossin and his collaborators had a lot of data on their hands—about 225,000 observations.

The first step was to ensure that the data were all of roughly equal quality—the images varied significantly in resolution because they were obtained by different instruments. To obtain a more homogeneous data set, Kossin and his colleagues resampled all of the observations to a spatial resolution of 8 kilometers.
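As a rough illustration of that homogenization step, here is a minimal block-averaging sketch in Python (the study doesn’t specify its resampling method, and the function and variable names here are invented):

import numpy as np

def block_average(image, factor):
    """Downsample a 2-D image by averaging over factor-by-factor blocks."""
    h, w = image.shape
    h, w = h - h % factor, w - w % factor  # trim edges so blocks divide evenly
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# e.g., degrade a 2-km-per-pixel image to 8 km per pixel
fine = np.random.rand(512, 512)   # stand-in for an infrared satellite image
coarse = block_average(fine, 4)   # 512 x 512 -> 128 x 128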

The researchers then applied an automatic algorithm known as the Advanced Dvorak Technique to estimate wind speed from the images. This technique basically inputs parameters such as a storm’s structure (including its eye) and infrared intensity into a flowchart to estimate wind speed. “It’s not much more sophisticated than a trained forecaster looking at a photograph,” said MIT’s Emanuel.

Despite its indirect nature, the Advanced Dvorak Technique has been widely used to estimate wind speed from storm images, said Kossin. “It sounds very coarse and rudimentary, but it’s stood the test of time.”

The team recovered wind speeds ranging from 25 to 170 knots (roughly 46–315 kilometers per hour). The scientists then grouped the observations according to the Saffir-Simpson scale, a commonly used metric for categorizing storms based on wind speed. The Saffir-Simpson scale ranges from category 1 (wind speeds between 119 and 153 kilometers per hour) to category 5 (wind speeds above 252 kilometers per hour).
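In code, that grouping amounts to binning wind speeds against the category thresholds. Here is a minimal Python sketch using the standard Saffir-Simpson lower bounds in kilometers per hour (the function name is mine, not the study’s):

KNOTS_TO_KMH = 1.852

def saffir_simpson_category(wind_kmh):
    """Return Saffir-Simpson category 1-5, or 0 below hurricane strength."""
    lower_bounds = [119, 154, 178, 209, 252]  # km/h, categories 1 through 5
    category = 0
    for cat, bound in enumerate(lower_bounds, start=1):
        if wind_kmh >= bound:
            category = cat
    return category

print(saffir_simpson_category(170 * KNOTS_TO_KMH))  # ~315 km/h -> 5
print(saffir_simpson_category(25 * KNOTS_TO_KMH))   # ~46 km/h -> 0, below category 1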

More Common “Monsters”

Kossin and his colleagues limited their analysis to images of category 1–5 storms. They calculated the proportion of images classified as category 3, 4, or 5 and found that these “major storms” increased in prevalence as time went on: Over the 39-year span of the data set, the proportion of major storms increased by 25% (roughly 6% per decade).

The researchers also divided their data in two based on when the images were obtained. When they compared these “early half” and “late half” subsamples, Kossin and his collaborators found that more recent storms were about 15% more likely to be major storms.
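The arithmetic behind both findings is simple to reproduce. Here is a toy Python version with invented data (a sketch of the calculation only, not the study’s actual analysis):

import numpy as np

rng = np.random.default_rng(42)

# Toy stand-ins for the study's ~225,000 classified observations
years = rng.integers(1979, 2018, size=50_000)   # years 1979-2017
winds_kmh = rng.uniform(119, 315, size=50_000)  # category 1-5 wind range

is_major = winds_kmh >= 178  # category 3 or higher

# Fraction of observations that are major storms, year by year
unique_years = np.unique(years)
major_fraction = np.array([is_major[years == y].mean() for y in unique_years])

# Linear trend, expressed as change per decade
slope, _ = np.polyfit(unique_years, major_fraction, 1)
print(f"trend: {10 * slope:+.4f} per decade")

# Early-half versus late-half comparison
early = is_major[years < 1998].mean()
late = is_major[years >= 1998].mean()
print(f"early half: {early:.3f}, late half: {late:.3f}")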

These consistent findings are troubling, said Kossin, because intense storms are the ones that pack the biggest punch. “The major hurricanes by far dominate damages and mortality,” he said. “They’re the monsters.”

It’s difficult to pin down precisely what’s driving this evolution. Natural variability in storm strength might well play a role, the authors noted, but the data’s consistency with simulations of greenhouse warming is telling. “There is a likely human fingerprint on this increase,” the researchers concluded in their study, which was published this week in the Proceedings of the National Academy of Sciences of the United States of America.

In the future, Kossin and his colleagues plan to explore other ways to estimate wind speed from satellite imagery. “We have enough data to start applying machine learning algorithms,” he said. “It’s ultimately a pattern-recognition problem.”

—Katherine Kornei (@KatherineKornei), Science Writer

New Space Telescope Named for Nancy Roman, Astronomy Pioneer

Thu, 05/21/2020 - 15:39

A future NASA powerhouse space telescope has been named after the agency’s first chief of astronomy, Dr. Nancy Grace Roman. Roman (1925–2018) was also the first woman to be a NASA executive and is widely considered the “Mother of the Hubble Space Telescope” for her leadership in making the observatory a reality. NASA officials announced the name on 20 May.

Roman was a pioneer in astronomy, earning her doctorate and conducting early research to map the Milky Way galaxy in the 1950s. A paper she published on the chemical compositions and motions of stars was considered one of the 100 most important astrophysical research papers in 100 years.

“I certainly did not receive any encouragement,” Roman said in a 2018 interview. “I was told from the beginning that women could not be scientists.”

She joined NASA in the agency’s infancy, created its space astronomy program, and then led it for 2 decades. Roman was the driving force in getting Hubble and many other missions approved by the U.S. Congress.

“I think that it’s really important to recognize that large NASA missions take more than being a good scientist or take more than being a good engineer,” Julie McEnery said during the announcement. “You have to have people like Nancy Grace Roman, who really understood the details of all of those things and could bring them together.” McEnery is a deputy project scientist for the Roman Telescope at NASA Goddard Space Flight Center.

Nancy Grace Roman and her legacy as the “Mother of Hubble” were featured in the LEGO Women of NASA set. Credit: Kimberly M. S. Cartier

Chanda Prescod-Weinstein, a theoretical astrophysicist at the University of New Hampshire, commented on Twitter, “I am so thrilled about this news. Nancy Roman, the mother of the Hubble Space Telescope, deserved to hear it while she was alive, but I am glad that her name will forever be imprinted on a space telescope.”

“Not gonna lie, teared up a little over here,” Jessie Christiansen, a research scientist at NASA Exoplanet Science Institute, tweeted. “So glad to see her get this recognition.”

The Nancy Grace Roman Space Telescope, formerly the Wide Field Infrared Survey Telescope (WFIRST), has faced funding challenges for years but was approved for development and launch in 2016. The Roman Telescope will be the same size as the Hubble Space Telescope and have the same sensitivity but will have a field of view 100 times larger. It’s scheduled to launch in the mid-2020s and will observe the structure and expansion of the universe, reveal the first billion years of cosmic history, and hunt for and image extrasolar planets.

Roman “really advocated putting astronomy instruments in space,” said Roman Telescope astrophysicist Elisa Quintana during the announcement. “It’s because of her and people like her that we have missions like WFIRST.”

“NASA doesn’t just build missions for science that we know about today,” Quintana said. “You really have to be a visionary like Nancy Grace Roman to develop instruments that are going to let us explore and really try to understand the unknown mysteries in our universe.”

You can learn more about Nancy Roman’s legacy from the woman herself in one of her last recorded interviews (here).

—Kimberly M. S. Cartier (@AstroKimCartier), Staff Writer

A Whole World View

Thu, 05/21/2020 - 12:27

The Carbon Cycle: A Balancing Act The Future of the Carbon Cycle in a Changing Climate   Microbial Influences on Subduction Zone Carbon Cycling   Basalts Turn Carbon into Stone for Permanent Storage   Mountain Streams Exhale More Than Their Share of CO2   A Whole World View

The livability of our world depends on a healthy circulatory system for carbon. Knowing how carbon transitions among air, land, and sea is critical to understanding the balance that keeps Earth habitable—not to mention understanding how our actions can throw off that balance.

In our June issue, several science teams report on work in this area. In “The Future of the Carbon Cycle in a Changing Climate,” Aleya Kaushik and colleagues discuss the complexities of gaining a whole-world view of that cycle. Achieving this holistic view requires understanding how ecosystems respond to climate change and how those responses will alter Earth’s carbon budget in the future. In particular, these predictions are complicated by the numerous feedbacks involved. Kaushik et al. look to the observation networks—both on the ground and orbiting overhead—that have evolved over the past several decades to help scientists address these critical questions.

In “Microbial Influences on Subduction Zone Carbon Cycling,” Donato Giovannelli and colleagues consider the carbon cycle at subduction zones, specifically at the Costa Rica convergent margin. This multidisciplinary team was part of a Deep Carbon Observatory project called Biology Meets Subduction, which was aimed at answering several questions about the influence of biological activity at the convergent margin and, ultimately, determining whether the team could use that information to improve deep-carbon budget estimates for the area. Like all the best research, their work is yielding promising data and raising many more intriguing questions.

In this month’s news, we report on a fascinating experiment at a geothermal facility in Iceland. Carbon storage trials are showing that 90% of carbon dioxide injected into subsurface basalt rock is transformed into minerals in just 2 years, whereas mineralization in standard carbon storage methods can take thousands of years. Head to “Basalts Turn Carbon into Stone for Permanent Storage” to read more about the potential of mineral carbonation as a means of large-scale carbon sequestration in Iceland and elsewhere and whether it’s up to the task of making a measurable impact on climate change.

On the other side of the carbon cycle, a recent study shows that mountain streams, though comprising only about 5% of the surface area of rivers and streams globally, might account for 10% to 30% of the total flux of carbon dioxide from those waterways. Scientists have also recently discovered that the Arctic Ocean may not be as important a carbon sink as previously thought.

Finally, we’re pleased to share an article from the editor in chief of one of AGU’s newest journals, GeoHealth. Gabriel Filippelli writes in “Geohealth: Science’s First Responders” about how geoscientists, health professionals, and regional leaders are growing into a new community, combining and harnessing their skills to address disasters. Looking back at the Deepwater Horizon disaster and the Tohoku earthquake and tsunami, Filippelli examines lessons learned from those events and how we should be applying them to the current global pandemic.

Our June issue examines the balance of this planet’s systems and the many ways anthropogenic forcing can upset that balance. How can we assess the natural movements of carbon while we keep pumping carbon into the air? How can we prepare to live safely in a world where disasters, even relatively predictable ones, can still be sudden and overwhelming? The scientists featured in this issue are committed to answering those questions and, ultimately, to making our world more knowable so we can better prepare ourselves for change.

—Heather Goss (@heathermg), Editor in Chief
