EOS

Science News by AGU

Determining Dissolved Organic Carbon Flows into the Gulf of Alaska

Fri, 01/15/2021 - 13:45

Amid ongoing climate change, understanding how and where carbon is moving across ecosystems has become a top research priority. This type of “carbon accounting” helps scientists determine where the planet is sequestering and releasing atmosphere-warming carbon compounds and is especially important at the boundaries between different ecosystems.

In a new study, Edwards et al. investigate how rivers in western Canada and southeast Alaska transport dissolved organic carbon and fresh water into the Gulf of Alaska. The study region, which spans from northern British Columbia to the southwestern corner of the Yukon Territory, represents an incredibly complex confluence of glaciers, forests, mountains, and plateaus with river systems that drain into bays, fjords, and channels before reaching the Pacific Ocean.

To build the model of carbon flux, the researchers combined a digital elevation model with estimated shapes of watershed boundaries and glacier extents as well as gridded data representing mean monthly runoff. To calculate the total amount of freshwater runoff, they used a distributed climate water balance model calibrated with measurements taken from watersheds in the study area.

The team calculated that overall, the region exports 430 cubic kilometers of fresh water and 1.17 teragrams of dissolved organic carbon annually to the Pacific Ocean. Their model shows that watershed type, location, and flow rate are important variables that control the spatial and temporal patterns of carbon flux. The scientists say that despite the region’s immense size and importance for both commercial fishing and climate, the Gulf of Alaska has been chronically understudied compared with other sections of the North American coastline. The new results highlight the significance of the region and provide a starting point for unraveling the complexity of the dynamic ecosystems and their effect on climate and humanity. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1029/2020JG005725, 2020)
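
The basic bookkeeping behind such an estimate is straightforward: each watershed’s freshwater discharge is multiplied by a representative dissolved organic carbon concentration and the products are summed. The sketch below illustrates that arithmetic in Python; the watershed names, runoff volumes, and concentrations are hypothetical placeholders, not values from the study.

```python
# Minimal sketch (not the authors' code): estimating an annual dissolved organic
# carbon (DOC) export by combining per-watershed runoff with DOC concentrations.
# The watershed names, runoff volumes, and concentrations below are hypothetical.

runoff_km3_per_yr = {"watershed_A": 120.0, "watershed_B": 210.0, "watershed_C": 100.0}
doc_mg_per_l = {"watershed_A": 3.1, "watershed_B": 2.2, "watershed_C": 3.0}

LITERS_PER_KM3 = 1e12   # 1 km^3 = 1e12 L
MG_PER_TG = 1e15        # 1 Tg = 1e15 mg

total_runoff = sum(runoff_km3_per_yr.values())  # km^3/yr
total_doc_tg = sum(
    runoff_km3_per_yr[w] * LITERS_PER_KM3 * doc_mg_per_l[w] / MG_PER_TG
    for w in runoff_km3_per_yr
)

print(f"Freshwater export: {total_runoff:.0f} km^3/yr")
print(f"DOC export: {total_doc_tg:.2f} Tg/yr")
```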

—David Shultz, Science Writer

How to Turn Our Cities into Treetopias

Fri, 01/15/2021 - 13:44

This is an authorized translation of an Eos article.

The 21st century is the urban century. It has been forecast that urban areas around the world will expand to hold 2.5 billion more people by 2050.

The speed and scale of urbanization have created major environmental and health problems for city dwellers. These problems have often been made worse by a lack of contact with the natural world.

With the Tree Urbanistas research group, I have been considering and debating how to solve these problems. By the year 2119, the only way cities will be able to function, remain viable, and sustain their populations is to reestablish contact with the natural world, particularly with trees.

The Cities of the Future

Creating urban forests will make cities worth living in, able to function and to sustain their populations: Treetopias.

This redesign will involve planting many more urban trees and other kinds of vegetation, as well as using new, more creative methods. Although we did not fully realize it at the time, Vienna’s 1986 Hundertwasserhaus, a building that incorporated 200 trees into its design, was the beginning of innovative urban forestry thinking.

That thinking has been carried forward in Stefano Boeri’s Bosco Verticale apartments in central Milan, which incorporate more than 800 trees as part of the building. Similar structures are being developed around the world, including in Nanjing in China and Utrecht in the Netherlands.

Urban forests must be designed as a priority, as part of the critical infrastructure of the whole city, not just as a cosmetic afterthought. In 2015, for example, the United Kingdom’s urban forests saved the National Health Service more than £1 billion by helping to reduce the impact of air pollutants. In 2119, we may well look back on this moment in history as the equivalent of the Victorian slum.

Trees can create places with the capacity to vastly improve our health and well-being. Our urban forests can offer spaces and places that help us manage our mental health and improve our physical health. Research has indicated, for example, that increasing a neighborhood’s canopy cover by 10% and creating safe, walkable places can reduce obesity by as much as 18%.

Cities Built Among Trees

As rural areas become less productive as a consequence of climate change, cities, which previously consumed goods and services produced in the countryside, will have to become internally productive. Trees will play an essential role in this, contributing to the city’s energy balance by cooling, regulating, and cleaning flows of air and water and ensuring that our previously neglected urban soils function healthily.

Urban forests could also provide timber for construction. We have a history of productive forests in the United Kingdom; however, alternative building materials and the growth of an urban population with less knowledge of forest management mean that the urban forest is rarely considered productive. We are coming to recognize the potential productivity of urban forests as campaigns to stimulate markets for homegrown timber and to achieve more efficient management succeed.

In addition, economic growth is still regarded as the main symbol of a city’s effectiveness, but we need to be equally aware of other, less visible values. This will open up new approaches to governance. Governance must embrace all forms of value in a balanced way and facilitate a new vision, considering how trees can help create livable cities.

New Opportunities

As the urban population grows, we need a better understanding of the breadth and diversity of the values people hold about our urban forests. Each person may hold several distinct values at once, because urban forests can contribute to their well-being in different ways.

The current guardians of our urban forest, mainly local authority tree officers, spend much of their time managing risks rather than maximizing the opportunities trees offer. They often receive complaints about trees and how they are managed, and it can sometimes be hard to remember that people also care about trees. We need to develop workable partnerships among those responsible for tree care, community members, and businesses to support the trees in our cities.

Although tree canopy cover in cities around the world is currently declining, the same is not true in Europe, where it is increasing. Many European countries are recognizing that we have overengineered our cities to accommodate cars, and now is the time to reclaim public space for our people, whether they are pedestrians or cyclists.

Creative projects like the Hundertwasserhaus are not the only answer to creating Treetopias. We are planting, and will continue to plant, more street trees, urban woodlands, and informal clusters of trees in our parks and green spaces. Treetopia has begun.

—Alan Simson, The Conversation (United Kingdom)

This story originally appeared in The Conversation (United Kingdom). It is republished here as part of Eos’s partnership with Covering Climate Now, a global journalism collaboration committed to strengthening coverage of the climate story.

This translation was made possible by a partnership with Planeteando.

Taíno Stilt Houses May Have Been an Adaptation to Climate Change

Fri, 01/15/2021 - 13:43

The first time Christopher Columbus voyaged across the Atlantic, the first people he met were the Taíno, who lived on various islands of the Caribbean. They were a sophisticated agricultural society with large settlements. But their encounter with the Europeans proved fatal, with the civilization vanishing almost completely a few decades later.

In the 1990s, archaeological excavations led to the discovery of a Taíno settlement in Los Buchillones, a shallow lagoon in north central Cuba. Submerged in about a meter of water, Los Buchillones is one of the largest and best-preserved prehistoric settlements discovered so far in the Caribbean. Archaeologists found remains of about 40 dwellings, as well as a variety of wooden and ceramic artifacts.

What surprised scientists, however, was the discovery that Taíno houses were built on stilts. “This was a really important finding,” said Matthew Peros of Bishop’s University, Quebec, Canada, “because up until then, Taíno settlements had not been formally associated with that kind of settlement strategy.”

Solving an Old Puzzle with Sediment

Since the early 2000s, Peros said, archaeologists suspected that the stilt houses may have been an adaptation to climate and environmental change.

With more data available from the region, Peros and his team have been able to test this hypothesis and presented their results in a poster session at AGU’s Fall Meeting 2020. The team reconstructed the past climate of the region using sediment cores from a limestone sinkhole (cenote) called Cenote Jennifer on the island of Cayo Coco, about 16 kilometers north of Los Buchillones. Sediments and organic matter build up in such sinkholes, and geochemical testing can reveal clues that can be used to reconstruct past climate.

The researchers used the sediment core’s oxygen isotope ratios, which reflect evaporation or precipitation, to identify wet and dry periods in the region. They also correlated the oxygen isotope ratios with calcium to titanium ratios, obtained by scanning the sediment core using X-ray fluorescence. This ratio corresponds to deposition of calcium carbonate in the cenote, with drier periods having more calcium deposition. Researchers found the cenote’s oxygen isotope and calcium ratios matched up quite well.
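
A simple way to quantify how well two such proxy records “matched up” is to interpolate them onto a common depth scale and compute a correlation coefficient. The snippet below is a generic illustration of that step, not the authors’ workflow; the depths and proxy values are made up.

```python
# Generic proxy-comparison sketch (hypothetical data, not from Cenote Jennifer):
# interpolate two downcore records onto a common depth scale and correlate them.
import numpy as np

depth_o18 = np.array([0.0, 0.5, 1.0, 1.5, 2.0])        # meters below sediment surface
d18o      = np.array([-2.1, -1.4, -0.8, -1.6, -2.3])    # per mil (hypothetical)

depth_cati = np.array([0.0, 0.4, 0.9, 1.4, 1.9])
ca_ti      = np.array([3.0, 4.2, 5.1, 3.8, 2.9])        # counts ratio (hypothetical)

# Put the Ca/Ti record on the oxygen isotope depth scale.
ca_ti_interp = np.interp(depth_o18, depth_cati, ca_ti)

# Pearson correlation: drier intervals should show heavier d18O and more calcium.
r = np.corrcoef(d18o, ca_ti_interp)[0, 1]
print(f"Correlation between d18O and Ca/Ti: r = {r:.2f}")
```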

Radiocarbon dating of the sediment core gave the ages of the different sediment layers. Researchers correlated the dates with the isotope analyses and thus obtained climate data from Los Buchillones over the past 2,000 years. In addition, the team used radiocarbon dating of the site’s artifacts and structural remains to give them an idea of the times the village was occupied.

Building Climate Resilience

The data from Cenote Jennifer showed the Los Buchillones region endured two major dry periods, the first between 900 and 1200 CE and the second around 1650 CE, the peak of the Little Ice Age. When the authors matched the climate data with the period when the village was occupied, they found that the stilt village flourished in the wet interval, possibly because conditions were favorable for agriculture and fishing. “So there seems to be a link between climate and actual occupation of the site itself,” said Peros.

Paleoclimate reconstruction of hurricanes from the Bahamas indicated this was also a time when hurricane activity increased in the region. Building houses on stilts may have been a deliberate approach to cope with an active hurricane period, which causes increased storm surges and coastal flooding, Peros said. Although the site was well protected by being built behind a coral reef, stilt houses could have been a backup strategy. “There seems to have been a lot of thought put into building a settlement that is well suited to a dynamic coastal environment,” said Peros.

The Taíno stilt houses were undoubtedly more resilient to storm surges caused by hurricanes, said Isabel Rivera-Collazo, an environmental archaeologist at the University of California, San Diego, who was not part of the research. But she is not completely convinced that these types of houses were a purposeful adaptation to climate change. It is certainly plausible, Rivera-Collazo said, but archaeologists will need to study more examples of historic dwellings to definitively say that the Taíno intentionally constructed buildings as a response to climate change.

—Lakshmi Supriya (rlsupriya@gmail.com), Science Writer

Going Down: How Do Cities Carry That Weight?

Thu, 01/14/2021 - 14:30

Most everyone knows that humans impact their environment, but our thinking usually focuses on elements like the atmosphere or oceans. Parsons [2020] provides a fun and intriguing look at an earth-system linkage that gets little attention: the direct impacts of humans on the solid earth. Focusing on the Bay Area in California, he shows that the mass added by building cities can cause modest but appreciable subsidence, something that should be folded into assessment and planning for sea level rise. It will be intriguing to see this sort of analysis done for other tectonic settings, particularly low-lying conurbations where even small changes in relative sea level matter a great deal.

Citation: Parsons, T. [2020]. The Weight of Cities: Urbanization Effects on Earth’s Subsurface. AGU Advances, 1, e2020AV000277. https://doi.org/10.1029/2020AV000277

—Peter Zeitler, Editor, AGU Advances

Deep Decarbonization? Yes We Can!

Thu, 01/14/2021 - 14:30

Staying within a 1.5°C global warming limit will require transformation of our economy to net-zero emissions by 2050, which seems like an enormously ambitious goal. And yet, with new in-depth modeling analysis, Williams et al. [2020] illuminate several technologically and economically feasible pathways to this required deep decarbonization. All pathways require enhanced energy efficiency, decarbonized electricity, electrification, and carbon capture. Interestingly, a modest role for natural gas in 2050 to ensure continuous reliability of electricity supplies is part of the least-cost pathway that still meets the emissions goals. Demonstrating the feasibility of these urgently needed transitions could not come at a more important time, as discussions on appropriate policy instruments to speed the journey to climate stabilization will be front and center as the U.S. rejoins the Paris Climate Accords.

Citation: Williams, J., Jones, R., Haley, B., Kwok, G., Hargreaves, J., Farbes, J. & Torn, M. [2020]. Carbon-Neutral Pathways for the United States. AGU Advances, 1, e2020AV000284. https://doi.org/10.1029/2020AV000284

—Eric A. Davidson, Editor, AGU Advances

Juno Maps Water Ice Across Northern Ganymede

Thu, 01/14/2021 - 13:40

Jupiter’s moon Ganymede is the largest planetary satellite in the solar system. It’s also one of the most intriguing: Ganymede is the only moon with its own magnetic field, it is the most differentiated of all moons, and it likely possesses a subsurface ocean of liquid water. It was studied by the early Jupiter flybys made by the Pioneer and Voyager spacecraft, but our understanding today rests largely on observations made by NASA’s Galileo orbiter from 1995 to 2003.

Mura et al. now report some of the first in situ observations of Ganymede since the end of the Galileo mission. They used the Jovian Infrared Auroral Mapper (JIRAM) on board NASA’s Juno spacecraft to take images and spectra of the moon’s north polar region. On 26 December 2019, Juno passed Ganymede at a distance of about 100,000 kilometers, enabling JIRAM to map this region at a spatial resolution of up to 23 kilometers per pixel.

As Juno flies past Ganymede, the spacecraft can observe physical locations on the moon’s surface from a variety of angles. By comparing the brightness of these regions across a range of observation and illumination geometries, the authors developed a photometric model for Ganymede’s surface reflectance. They observed that wavelength-dependent reflectance relationships sometimes break down in the vicinity of relatively fresh craters, perhaps because of a larger average size of ice grains in these regions.

Combining their model with spectral observations of the 2-micrometer water ice absorption band allowed the authors to map the distribution of water ice in the north polar region. Where these estimates overlapped with maps derived from Earth-based telescopic observations, the researchers found largely good agreement. This congruence enabled them to extend the global water ice map for Ganymede to much more northerly latitudes.
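
Mapping an absorption band like the 2-micrometer water ice feature typically comes down to comparing the reflectance inside the band with the nearby continuum. The sketch below shows that kind of band depth calculation in generic form; the wavelengths and reflectance values are illustrative only and are not JIRAM data.

```python
# Illustrative band-depth calculation for a water ice absorption feature.
# Reflectance spectrum values below are made up, not JIRAM measurements.
import numpy as np

wavelength_um = np.array([1.80, 1.90, 2.00, 2.10, 2.20])
reflectance   = np.array([0.42, 0.38, 0.25, 0.36, 0.41])

# Continuum estimated by interpolating between shoulders on either side of the band.
continuum = np.interp(2.00, [1.80, 2.20], [reflectance[0], reflectance[-1]])
band_reflectance = reflectance[wavelength_um == 2.00][0]

# Band depth: 0 means no absorption; larger values suggest more (or coarser-grained) ice.
band_depth = 1.0 - band_reflectance / continuum
print(f"2-micrometer band depth: {band_depth:.2f}")
```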

Observations in other spectral bands also revealed the presence of nonwater chemical species on the surface of Ganymede, including possible detections of hydrated magnesium salts, ammonia, carbon dioxide, and a range of organic molecules. The authors note that 2020 offered additional opportunities for Juno to make polar observations of Ganymede, as does 2021, and suggest that continuing observations from JIRAM will help set observation strategies in future observing campaigns like the Europa Clipper and Jupiter Icy Moons Explorer (JUICE) missions. (Journal of Geophysical Research: Planets, https://doi.org/10.1029/2020JE006508, 2020)

—Morgan Rehnberg, Science Writer

Modeling Earth’s Ever-Shifting Magnetism

Thu, 01/14/2021 - 13:39


On a day-to-day basis, most of us probably take for granted how much Earth’s deep inner workings affect some of modern life’s conveniences, like the relative ease with which we find our way from place to place by plane, boat, or automobile or on foot. Roughly 2,900 kilometers below the planet’s surface, convection of molten iron and nickel in the outer core generates Earth’s magnetic field, which guides navigation technology from handheld compasses to complex automated systems.

To help these systems make sense of the magnetic field—which constantly shifts about, sometimes gradually and sometimes not—and navigate accurately, they make use of models that provide assessments of the current state of the magnetic field and predictions of how it will change in the future. One such model is the World Magnetic Model (WMM), a geomagnetic reference model representing the main component of the magnetic field—that is, the field produced by Earth’s outer core geodynamo.

The WMM is widely used by government, industry, and the public for orientation and navigation. For example, the U.S. Federal Aviation Administration relies on the WMM to provide accurate magnetic field referencing in the National Airspace System, including for runway numbering. NOAA uses the WMM in nautical charts and for orienting ocean reference station buoys. It is also used by government and industry in antenna tracking, attitude control of aircraft and spacecraft, surveying, and mapping. And as the WMM is embedded in billions of handheld electronic devices, including in navigation apps on smartphones, it is a truly ubiquitous scientific product.

The model, first named the World Magnetic Model in 1990, is a modern successor to magnetic field mapping efforts dating back to 1701, when Edmond Halley first published a magnetic chart. Today’s WMM is developed in a partnership between NOAA’s National Centers for Environmental Information (NCEI) and the British Geological Survey (BGS) and is a joint product of the U.S. National Geospatial-Intelligence Agency (NGA) and the U.K.’s Defence Geographic Centre. Monitoring the magnetic field and maintaining the WMM to support all these applications is a continuous effort for these agencies and one that occasionally poses unexpected and timely challenges.

Under the Hood of the WMM

Mathematically, the WMM is a spherical harmonic model (of degree and order 12) that provides a snapshot of the core-generated magnetic field as well as its time-varying change, known as secular variation, at a given time. The snapshot and time-varying parts of the model each comprise 168 coefficients—yielding a total of 336 coefficients for the complete model—that describe the direction and intensity of the field. The model captures spatial features in the field to a resolution of about 3,000 kilometers at Earth’s surface, and it is updated every 5 years, providing a linear extrapolation of the magnetic field 5 years into the future based on the field’s rate of change at the time the model is updated.
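
The coefficient count quoted above follows directly from the spherical harmonic truncation: a degree and order 12 expansion (excluding degree 0) has a sum over n = 1 to 12 of (2n + 1) = 168 Gauss coefficients, and the WMM carries one such set for the field snapshot and one for its secular variation. The sketch below checks that arithmetic and shows the linear time extrapolation applied to each coefficient; the example coefficient values are invented.

```python
# Coefficient bookkeeping and linear extrapolation as described for the WMM.
# The numeric coefficient values used in the example are hypothetical.

DEGREE = 12
n_coeffs = sum(2 * n + 1 for n in range(1, DEGREE + 1))
print(n_coeffs)          # 168 snapshot coefficients
print(2 * n_coeffs)      # 336 including the secular-variation coefficients

def extrapolate(g_epoch, g_dot, epoch, t):
    """Linearly extrapolate a Gauss coefficient from the model epoch to time t (years)."""
    return g_epoch + g_dot * (t - epoch)

# Example: a hypothetical coefficient of -29404 nT changing by +6 nT/yr from epoch 2020.0.
print(extrapolate(-29404.0, 6.0, 2020.0, 2022.5))   # -29389.0 nT
```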

Fig. 1. Magnetic declination at the surface of the World Geodetic System (WGS 84) ellipsoid on 1 January 2020 as predicted by the most recent World Magnetic Model (WMM2020) is shown in this Miller projection. The contour interval is 2°. Red contours are positive (east), blue are negative (west), and green represent the zero (agonic) line. White stars indicate the 2020.0 positions of the dip poles. Blackout zones are shown as dark shaded areas.

The latest update, WMM2020, was released in December 2019 and is valid until 31 December 2024 [Chulliat et al., 2020]. WMM2020 provides the vector magnetic field everywhere from 1 kilometer below the World Geodetic System (WGS 84) ellipsoid—a standardized approximation of Earth’s surface—to approximately 850 kilometers above it. Figure 1, for example, shows the magnetic declination (the angle between the direction of geographic north and the local direction of the horizontal magnetic field) predicted by WMM2020 on 1 January 2020 at the surface of the WGS 84 ellipsoid.
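
Given the model’s field vector at a location, the declination defined above is simply the angle of the horizontal field relative to geographic north. A minimal sketch, assuming the conventional north (X) and east (Y) components in nanoteslas, with hypothetical values rather than WMM output:

```python
# Declination from horizontal field components (X = north, Y = east), as defined above.
# The component values are hypothetical, not WMM output.
import math

def declination_deg(x_north_nt, y_east_nt):
    """Angle of the horizontal magnetic field east (+) or west (-) of geographic north."""
    return math.degrees(math.atan2(y_east_nt, x_north_nt))

print(declination_deg(20000.0, -1500.0))   # about -4.3 degrees (west declination)
```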

WMM coefficients are inferred from the best magnetic field measurements available at the time of model development. Since 1999, high-quality measurements have been collected almost without interruption by various low-Earth orbit (LEO) scientific satellites such as Ørsted, CHAMP (Challenging Minisatellite Payload), SAC-C (Satélite de Aplicaciones Científicas-C), and the European Space Agency’s ongoing Swarm mission. Because of their global coverage and high accuracy, these data have greatly facilitated the development of successive WMMs over the past 2 decades.

Prior to the satellite era, the global network of magnetic observatories (under the auspices of the International Association of Geomagnetism and Aeronomy and the International Real-time Magnetic Observatory Network) was the primary source of high-quality data for the WMM and other similar reference models, combined with data collected from boats and planes in areas poorly covered by observatories. Magnetic observatories are specifically designed to track geomagnetic secular variation over a long period of time, and some date back to the early stages of continuous magnetic field measurements made in Europe in the 1830s. Today magnetic data from observatories are still heavily used in WMM development, for example, in helping select which satellite data are used on the basis of global geomagnetic activity and to improve local time coverage.

Accounting for Model Omissions

Criteria to be met by the WMM for its operational use are established in a U.S. Department of Defense (DOD) specification [Department of Defense, 2019], which is also referred to by a NATO standard [NATO Standardization Agency, 2011]. The specification was recently updated, but the model format, including the coefficients that describe it, hasn’t changed in decades. The permanence of the model format facilitates easy adoption of regular model updates by users, which is critical in maintaining the model’s performance and in meeting the DOD criteria. However, it also means that the model has limitations compared with other, more sophisticated geomagnetic reference models, such as NOAA’s High Definition Geomagnetic Model and the BGS Global Geomagnetic Model.

Magnetic field features smaller than about 3,000 kilometers are omitted from the model. They include some core field contributions as well as most features generated by magnetism in Earth’s crust. The corresponding error in the WMM from these omissions is highly variable with geography and can reach up to several thousand nanoteslas near intense crustal magnetic anomalies.

The model does not account for magnetic fields generated outside the solid Earth, including, for example, by electric currents in the ionosphere and magnetosphere. Such so-called disturbance fields are only a few nanoteslas at night during geomagnetically quiet times at middle and low latitudes but can reach thousands of nanoteslas at high latitudes during magnetic storms. They are accompanied by induced magnetic fields in electrically conducting layers of the solid Earth, like the mantle and the ocean. Induced fields are generally omitted by the WMM, unless their timescales are greater than a few years, in which case they are indistinguishable from the linear secular variation of the core field.

Although the core field varies slowly in time, it can also display nonlinear changes over periods of a few years. The amplitudes of such variations are generally small (less than a few tens of nanoteslas) but, nonetheless, are detectable both in long-term recordings made by ground-based magnetic observatories and in LEO satellite measurements. With its 5-year predictive outlook, the WMM is not designed to account for such quick changes.

From a practical point of view, these limitations can affect the accuracy of navigation systems and other applications relying on the WMM. Even a seemingly small error of a single degree in the declination described by the model, for example, can lead to large errors in navigation. So it is important to understand and characterize the sources of error in the WMM and to inform users about these limitations.

Recently, NCEI and BGS performed a comprehensive uncertainty analysis of the model [Chulliat et al., 2020]. Both groups independently determined the error associated with omitting the crustal magnetic field by comparing model outputs with data from marine track lines and ground observatories collected since 2000 along with repeat station data collected since 1980. (Repeat stations are permanently marked sites where the geomagnetic field is measured every few years.) Error from omitting disturbance fields was also estimated by comparing model outputs with observatory data since 2000. And the model commission error, defined as the error in model coefficients, was determined by comparing core field models calculated by both groups and by retrospectively comparing the 5-year predictions from old WMM versions with those from recent models calculated using all the data available over the past 20 years.

Fig. 2. The time evolution of the global RMS grid variation error in the Northern Hemisphere for all WMMs since 2000 is shown in this figure. The omission error, assumed to be constant over time, is shown as a gray rectangle at bottom.

The combined error for each magnetic field component (e.g., in X, Y, and Z coordinates, in declination, etc.) in the WMM, obtained by adding all errors from the contributing sources described above, has a characteristic sawtooth-like shape. Figure 2 shows an example of a sawtooth diagram for the grid variation in the Northern Hemisphere. (Grid variation is defined as the difference between declination and local longitude above 55° latitude; its error is the same as that of the declination.) When a new version of the model is released every 5 years, the total error reflects the sum of all omission errors and the very small commission error at the time. Note that this total error was larger in 2000, before high-quality LEO magnetic field measurements became available. Over time, the combined error increases, mostly from the cumulative effects of nonlinear secular variation.
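
The sawtooth behavior in Figure 2 can be pictured as a roughly constant omission error plus a commission error that is small at each release and then grows until the next 5-year update resets it. The toy illustration below uses invented error magnitudes, and the simple sum is only one plausible way to picture how the contributions combine; it is not the published uncertainty budget.

```python
# Toy sawtooth: total error = constant omission error + commission error that grows
# between 5-year model releases. All numbers are illustrative, not WMM statistics.
OMISSION_ERROR = 0.3          # degrees, assumed constant in time
COMMISSION_AT_RELEASE = 0.05  # degrees, right after a model update
GROWTH_PER_YEAR = 0.12        # degrees/year from unmodeled (nonlinear) secular variation

def total_error(years_since_release):
    return OMISSION_ERROR + COMMISSION_AT_RELEASE + GROWTH_PER_YEAR * years_since_release

for t in [0, 1, 2, 3, 4, 5]:
    print(t, round(total_error(t), 2))   # error climbs until the next release resets it
```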

Corrections and Improvements

Performance of the WMM is carefully monitored by NCEI and BGS. Every year, both organizations develop research core field models from the most recent data available to estimate the current error of the WMM and update the “sawtooth” error diagram for each component. In 2018, NCEI, BGS, and NGA projected that the WMM root-mean-square (RMS) error in grid variation would exceed 1°, the maximum level of allowable error set by the DOD specification for the model. The rapidly rising error was mostly a result of the occurrence of intense nonlinear core field variations following the release of WMM2015. The effect of such variations on the declination error was geometrically amplified near magnetic dip poles, locations where the magnetic field is exactly vertical (and therefore where declination is undefined).

To bring the model back into line with the specification, an out-of-cycle updated WMM was released (WMM2015v2) [Chulliat et al., 2019]. The WMM specification was also revised with the introduction of so-called blackout zones [Department of Defense, 2019] (Figure 1), which are areas in the vicinity of magnetic dip poles where WMM declination values are inaccurate and compasses cannot be trusted. Blackout zones are no longer considered in WMM error calculations (e.g., Figure 2), thus making the specification more robust to nonlinear core field variations while providing better guidance to navigators using the WMM at high latitudes.

With every scheduled—and unscheduled—release of the WMM, the developers are looking for ways to improve the model and incorporate data from different sources. For example, NGA recently concluded MagQuest, a three-phase prize challenge open to industry and academia to develop designs for a future system to collect global geomagnetic data for the WMM. A total of $2.1 million was awarded across the phases to companies and universities around the world, including three winning teams whose designs all incorporated CubeSats. The results of MagQuest will inform NGA’s strategy for the WMM going forward, with an expected procurement of a new system that can be operational by 2027.

With continuing efforts like these, future versions of the WMM should prove increasingly accurate and better support technologies that help us get from place to place around the planet or even just within our communities.

A Culinary Silver Lining of Climate Change: More Truffles

Wed, 01/13/2021 - 13:26

A truffle might not be much to look at, but chefs worldwide revere these subterranean-dwelling fungi for their intense, earthy flavors. Now, scientists have looked to the future of truffle cultivation in Europe by modeling three different climate-warming scenarios. They found that climate change will substantially increase the cultivation potential of one species of truffle commonly used in cooking. Given that truffle farming can be lucrative, it appears that climate change has a culinary silver lining, at least for the niche world of truffles, the researchers concluded.

An Expensive Fungus

Tomáš Čejka, a climate change scientist at the Global Change Research Institute of the Czech Academy of Sciences in Brno, and his colleagues focused on two species of truffles: Tuber aestivum (Burgundy truffle) and Tuber melanosporum (Périgord truffle). These truffles are among the most commonly used in kitchens and cost hundreds of dollars per kilogram. (They’re not quite as renowned as Tuber magnatum, however, a species of white truffle that commands even higher prices.)

Truffles grow naturally near the roots of such trees as oaks, hazels, spruces, and pines. They prefer alkaline soils, and unlike most agricultural crops, they cannot be coaxed into production. (“A truffle farmer does not plant truffles, he plants oak trees,” a journalist wrote.) When a truffle is ripe, it exudes an aroma perceptible to sensitive noses—for centuries, people have relied on pigs and dogs to sniff out buried truffles.

A Look into the Future

Čejka and his colleagues began by mining 57 previously published research studies to determine the ecological conditions most conducive to truffle growth. They focused on four primary parameters: temperature, precipitation, elevation, and soil pH. The researchers then used this information to estimate the cultivation potential of Burgundy and Périgord truffles on agricultural land in the Czech Republic given both current and future climate conditions.

The team analyzed the average output of five global climate models assuming low-, medium-, and high-emission Representative Concentration Pathways (RCP 2.6, RCP 4.5, and RCP 8.5). They focused on the time interval 2041–2060 and averaged the outputs they obtained to estimate values representative of the year 2050.
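
In outline, that step amounts to averaging each climate variable across the five models for a given RCP and time window and then feeding the result into the suitability criteria drawn from the literature. The sketch below shows that kind of ensemble averaging with made-up numbers; the model names, values, and threshold are hypothetical, not the study’s actual ensemble or criteria.

```python
# Sketch of ensemble averaging for one scenario (e.g., RCP 4.5) and one grid cell.
# Model names, temperatures, and the suitability threshold are hypothetical.
import statistics

# Mean annual temperature (deg C) for 2041-2060 from five hypothetical climate models.
model_means = {"model_1": 9.8, "model_2": 10.4, "model_3": 9.5, "model_4": 10.9, "model_5": 10.1}

ensemble_mean_2050 = statistics.mean(model_means.values())

# Toy suitability rule: a species is "suitable" if the ensemble mean falls in a preferred range.
PREFERRED_RANGE = (9.0, 12.0)   # hypothetical, not the paper's criterion
suitable = PREFERRED_RANGE[0] <= ensemble_mean_2050 <= PREFERRED_RANGE[1]

print(f"Ensemble mean for 2050: {ensemble_mean_2050:.1f} C, suitable: {suitable}")
```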

It’s important to look to the future when it comes to growing truffles, said Shannon Berch, a mycologist at the University of British Columbia in Vancouver, Canada, not involved in the research. That’s because it takes a while to establish an orchard, she said. “You may be looking at 5, 10, 15 years before you get your first truffles.”

Make Way for More Périgords

Čejka and his collaborators found that Périgord truffles benefited from all three warming scenarios: There were consistent gains in the sizes of moderate- and high-suitability areas for growing this species and consistent losses in the sizes of very low and low-suitability areas. The future of Burgundy truffles, on the other hand, was less clear-cut: The cultivation potential of the species increased for RCP 2.6, but there were no clear trends for RCP 4.5 and RCP 8.5.

Climate change will therefore likely be a boon for Périgord truffle cultivation in the Czech Republic, the researchers concluded. Given the high retail prices of truffles, that’s potentially good news for farmers’ pockets, Čejka and his colleagues suggested. There are ecological benefits to truffle cultivation as well, such as habitat conservation of large trees and land use diversification, the researchers proposed.

These results were published in December in Scientific Reports.

It would make sense to redo this analysis in other areas of the world in which truffles are grown, said Charles Lefevre, the founder of New World Truffieres Inc., a company specializing in truffle cultivation, and a past president of the North American Truffling Society. Good places to focus on include the United States and Australia, Lefevre said. “Australia is already the fourth-largest producer of Périgord truffles and could potentially overtake Italy in the next few years.”

Čejka recently had the opportunity to taste the subject of his research—he dined on Périgord truffles at a scientific conference. They had a strong aroma and an earthy flavor, he said. It was “unique and unlike anything else.”

—Katherine Kornei (@KatherineKornei), Science Writer

Network Connects Indigenous Knowledges in the Arctic and U.S. Southwest

Wed, 01/13/2021 - 13:19

On one level, the Arctic and the U.S. Southwest have little in common: One has kilometers of bone-chilling temperatures, ice, and months of darkness; the other has towering cliffs of red rock, parched soil, and broiling summers.

But Indigenous Peoples in each region face similar challenges to food resilience and sovereignty. Because of the colonization of Native lands, Indigenous Peoples have been restricted from accessing, cultivating, and managing their traditional foods. At the same time, climate change in both regions is rapidly altering the landscape.

The Indigenous Foods Knowledges Network (IFKN) connects Indigenous and non-Indigenous scholars, community members, and leaders from the Alaskan and Canadian Arctic and sub-Arctic and the U.S. Southwest to coproduce food sovereignty solutions. The research coordination network was created in 2017 by the University of Colorado and the University of Arizona and is driven primarily by Indigenous community leaders and scholars.

Members of the network exchange knowledge about ways to maintain traditional ways of life, from river restoration, community gardens, and farming practices to culture camps in which Indigenous Knowledges are shared with future generations.

The COVID-19 pandemic has added urgency to the project, because Indigenous elders, who are often the knowledge carriers, are especially at risk from the coronavirus.

A Threat to Foods Is a Threat to Identity

The network focuses on a cornerstone of culture: food.

“It’s not just something [we] physically eat, but it’s part of our ceremonies….It is our connection to the land, to our nonhuman kin,” said Mary Beth Jäger, a member of the Citizen Potawatomi Nation and a research analyst at the Native Nations Institute at the University of Arizona who serves on the IFKN research coordination team. Jäger spoke about IFKN in December at AGU’s Fall Meeting 2020.

Yet access to traditional foods for Indigenous Peoples is strained.

In the Arctic, ice is thinning dangerously under hunters’ feet. Animals like beavers have strayed from their natural habitats, bringing new diseases, such as giardiasis, to communities unfamiliar with them. In the Southwest, repeated droughts have left crops thirsty, and monsoon rains are changing in intensity.

Commercial pressures threaten food security, too. In the Arctic, the Gwich’in have sued the Trump administration for charging ahead with oil and gas leasing in the Arctic National Wildlife Refuge, which is part of the tribe’s caribou habitat. Commercial interest in the traditional southwestern tepary bean by non-Indigenous customers is driving up prices and reducing access to the food staple.

Communities shouldn’t have to face these issues alone, said IFKN steering committee member and Native Movement deputy director Shawna Larson, who is also the vice chairwoman of the Chickaloon Village Traditional Council. “We can learn from one another, teach each other, and also work together on finding different solutions.”

Innovative solutions abound in communities: Ahtna leaders of Chickaloon Village in Sutton, Alaska, created a camp to share Indigenous Knowledges with younger generations. The youth learn to fillet and smoke salmon, collect wild plants, and scrape moose hides.

In the Southwest, the Gila River Indian Community has experience fighting for—and winning—rights to traditional resources. The community won the largest Native water rights settlement in history in 2004 to restore access to water taken by colonial settlers starting in the late 1800s.

Members of both communities hosted delegates from IFKN to share these success stories.

Braided Knowledge

Even though Indigenous Peoples have cultivated a deep understanding of lands and ecosystems, Western science has often disregarded these ways of knowing or even co-opted them.

“Indigenous Knowledges go through the ultimate peer review process,” said Lydia Jennings of the Pascua Yaqui and Huichol Nations. Jennings is an IFKN steering committee member and recently received her Ph.D. from the University of Arizona.

“The knowledge one generates, say, [about] where an animal lives, where certain plants grow, is more rigorous because it literally means survival for communities who depend on traditional food or subsistence food traditions,” Jennings said. “If you collect inaccurate data, you might not eat or [you might] get sick.”

The network harnesses multiple ways of knowing, which Jennings likens to “braiding knowledge systems together.”

“Instead of focusing on Western science, it’s focusing on the idea of utilizing Indigenous research processes and embracing and respecting Indigenous Knowledge systems,” Jäger said.

Although the network includes some non-Indigenous researchers, those researchers center Indigenous Peoples, their communities, and Knowledges at the forefront and follow the lead of Indigenous members, said Jäger. The network is funded by a National Science Foundation program that emphasizes studying social systems alongside the natural and built environments.

Braiding these knowledge systems together is “very healing, in the sense of passing that knowledge down that’s been tried to be broken and to be removed out of the culture by colonization,” said Jäger.

Meetings on the Land

The backbone of IFKN involves visits to Indigenous lands to share stories, foods, and Knowledges. Issues discussed range from ongoing river restoration projects to getting traditional foods into nursing homes to the effects of colonial mining and extraction on food and medicinal plants.

The first visit was by invitation from the Gila River Indian Community in 2018. “That’s a big, important thing for us, that we’re invited,” said Jäger. The network also compensates its hosts.

Importantly, participants say, meeting on the land provides space for deep connection. “There is a difference in how we act and how we talk,” said Jäger. “We eat a lot when we’re together. And we have really good laughs.” After one trip to Finland, Larson told a fellow attendee, a member of a Skolt Sámi community, that “she was like my sister.”

Members of IFKN have met with communities within Finland, Alaska, and Arizona. COVID-19 dashed plans to gather in person at the Hopi reservation, but the network had already planned a series of online webinars.

Quick to Pivot to COVID-19

The strong bonds of the network made it possible to quickly react to the pandemic.

IFKN members received a National Science Foundation rapid response grant to study the effects of COVID-19 on food access for Indigenous communities in the Arctic, the sub-Arctic, and the U.S. Southwest.

Interviews of Indigenous community members and data analysis will begin this month, and the grant will run for 1 year. Althea Walker, a tribal climate science liaison at the Southwest Climate Adaptation Science Center at the University of Arizona, said the grant is important to understand the immediate vulnerabilities from COVID-19.

“Overall, addressing these vulnerabilities that have become apparent during the COVID-19 pandemic allows us to be better prepared for other crises, like the climate crisis,” Walker said.

—Jenessa Duncombe (@jrdscience), Staff Writer

Fault Related Anisotropy in the Hikurangi Subduction Zone

Wed, 01/13/2021 - 12:30

The magnitude and principal direction of seismic azimuthal anisotropy provide key constraints on the deformation processes occurring within the Earth. Although this technique has been successfully applied to a variety of geological settings, a high-resolution, three-dimensional anisotropic P-wave velocity model of the shallow part of a subduction zone had never before been achieved.

Arai et al. [2020] provide the first such model for the Northern Hikurangi subduction zone, where slow earthquakes are known to occur periodically. Deriving this high-resolution model was made possible by an exceptional dataset that resulted from one of the densest and most targeted 3D ocean-bottom deployments in a subduction zone. The model highlights larger azimuthal anisotropy close to active faults and to the deformation front. Using the magnitude of the anisotropy, the authors attribute this clear fault-related anisotropy to preferentially oriented cracks and/or clay-rich layers along these faults.
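
Azimuthal P-wave anisotropy of this kind is commonly parameterized as a velocity that varies with propagation azimuth, for example V(theta) = V0[1 + A cos 2(theta - phi)], where A is the anisotropy magnitude and phi is the fast direction. The snippet below evaluates that standard form with invented values; it is background illustration, not the parameterization or numbers from Arai et al.

```python
# Standard 2-theta parameterization of azimuthal P-wave anisotropy (illustrative values).
import math

def vp(azimuth_deg, v0_km_s=3.0, anisotropy=0.05, fast_dir_deg=30.0):
    """P-wave speed vs. propagation azimuth: fastest along fast_dir_deg, slowest 90 deg away."""
    theta = math.radians(azimuth_deg - fast_dir_deg)
    return v0_km_s * (1.0 + anisotropy * math.cos(2.0 * theta))

print(round(vp(30.0), 3))    # 3.15 km/s along the fast direction
print(round(vp(120.0), 3))   # 2.85 km/s perpendicular to it
```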

These results improve our understanding of the relationship between the properties of the shallow parts of subduction zones and slip behavior on plate boundaries.

Citation: Arai, R., Kodaira, S., Henrys, S., Bangs, N., Obana, K., Fujie, G., et al. [2020]. Three‐dimensional P wave velocity structure of the northern Hikurangi margin from the NZ3D experiment: Evidence for fault‐bound anisotropy. Journal of Geophysical Research: Solid Earth, 125, e2020JB020433. https://doi.org/10.1029/2020JB020433

―Anne Bécel, Associate Editor, JGR: Solid Earth

Newly Identified Instabilities Enhance Atmospheric Turbulence

Tue, 01/12/2021 - 12:54

Physicists have long known that turbulence is a fundamental process in Earth’s atmosphere, where it facilitates mixing, contributes to energy and momentum transport and deposition, and has important implications for weather and climate prediction. However, the mechanisms driving the transitions from laminar airflow to 3D turbulence are complex and poorly understood. In new research, Hecht et al. describe the first atmospheric observations of a new mechanism of strong turbulence generation involving the interaction of atmospheric gravity waves (GWs) and Kelvin-Helmholtz instabilities (KHIs).

GWs are ubiquitous and have many sources at lower and higher altitudes, including airflow over topography, convection, and wind shears, and they often lead to turbulence as they grow in amplitude with increasing altitude. KHIs occur in fluids like air when flow is strongly sheared, that is, when velocity changes sharply across an interface. At such a strongly sheared interface, small perturbations can grow into instabilities that can be seen in thin cloud layers with distinctive wave crest–like shapes. Such KHIs can lead to turbulence even in the absence of other influences.
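
A standard way to express when such shear instabilities can grow is the gradient Richardson number, the ratio of stabilizing stratification to destabilizing shear; values below about 0.25 indicate that KHI can develop. The sketch below computes it for made-up wind and stratification values and is offered as general background, not as a calculation from these studies.

```python
# Gradient Richardson number Ri = N^2 / (dU/dz)^2, a standard KHI criterion.
# Buoyancy frequency and shear values below are illustrative, not observations.

N = 0.02        # buoyancy (Brunt-Vaisala) frequency, 1/s
dU_dz = 0.05    # vertical wind shear, 1/s

Ri = N**2 / dU_dz**2
print(f"Ri = {Ri:.2f}  ->  KHI possible: {Ri < 0.25}")
```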

In the new work, the researchers used airglow imaging and lidar observations to study a large-scale KHI event that occurred at about 85 to 90 kilometers altitude over Chile on 1 March 2016. The images revealed a series of KHI billows forming as GWs propagated through the region and appeared to perturb the KHI formation. This perturbation led to misalignments along the KHI billows, causing them to interact with one another as they grew in amplitude. The very high spatial resolution of the imaging revealed details of the KHIs evolving to turbulence, including the formation of “knots” and vortex “tubes” where the KHI billows interact.

The interactions observed between GWs and KHIs motivated a companion modeling study by Fritts et al. that considered conditions such as those that would accompany GW modulations along KHI billow cores. The simulated interactions bore a striking resemblance to the observations reported by Hecht et al.: Specifically, misaligned KHI billows induced vortex tubes linking adjacent KHI billow cores, and their subsequent evolution to knots then drove strong turbulence. The researchers note that the results of both studies are strikingly similar to prior laboratory work confirming the expectation of rapid and strong turbulence transitions accompanying such events.

These papers mark the first quantitative observations and modeling of KHI tubes and knots in Earth’s atmosphere and suggest that these processes are common and have significant effects in the upper atmosphere. They also reveal the benefits, the authors say, of combining theory, laboratory experiments, atmospheric observations, and numerical simulations in studying such dynamic processes as significant pathways to atmospheric turbulence. (Journal of Geophysical Research: Atmospheres, https://doi.org/10.1029/2020JD033414 and https://doi.org/10.1029/2020JD033412, 2020).

—Morgan Rehnberg, Science Writer

Modeling the Creation of Cratons, Earth’s Secret Keepers

Tue, 01/12/2021 - 12:53

The continents, the solid blocks of land beneath our feet, weren’t always as strong as they’ve come to be. Now, scientists from Monash University in Australia have devised a new mechanism to explain how the roots of the continents—cratons—came to be. Using numerical models to simulate the conditions of Archean Earth, the researchers show, in findings published in Nature, that a strong base for the continents emerged from the melting and stretching of the cratonic lithospheric mantle.

Cratons form the base of continents and hold the title of the oldest existing portion of the lithosphere. They’re extremely thick and began to form up to 3 billion years ago, in the Archean eon. “They’re the secret keepers of the Earth,” said Catherine Cooper, an associate professor of geophysics in the School of the Environment at Washington State University in Pullman. Cooper was not involved in the new research. By studying cratons, scientists might learn how major components of Earth arose and how plate tectonics began. “If you can understand the role of the secret keepers within [Earth], then we can try to answer some of those questions better.”

Scientists can also use this knowledge to study other planets. “Because these processes are the creators of the continents, they are also the processes that create topography, that create an atmosphere,” said Fabio Capitanio, lead author of the new study and an Australian Research Council Future Fellow in Monash University’s School of Earth Atmosphere and Environment. “In principle, they are related to the way we understand life, evolution of planets.”

The Craton Conundrum

The fact that cratons are so thick and enduring poses a problem for scientists. “To make really thick lithosphere requires a good deal of deformation,” said Cooper. “How do we create long-lived, stable features out of material that was once deformable?”

To figure this out, Capitanio and his colleagues turned to numerical models. To simulate the dynamics of the Archean lithosphere, the researchers modeled these layers’ estimated temperatures, pressure, convection, and viscosity, all variables involved in melting rock.

A Surprising Solution

The model revealed a counterintuitive story for craton formation: Parts of the lithospheric mantle became stronger as parts of it were extracted. “The part that is extracted [from the mantle] is essentially melt,” Capitanio said. “Imagine a volcano taking out the lava from the interior of the Earth.” That melt came up through the lithospheric mantle, where it cooled to form crust, leaving behind a portion of mantle devoid of fluids. This process, called dehydration stiffening, left behind a thicker, stronger, and cooling mantle embedded in the lithosphere, forming the roots of the continents.

This residual mantle acts almost like a pin from which the lithosphere stretches laterally, creating new spaces for deformation (melting) and a new zone of stretching. This stretching, or rifting, brings the warmer, deeper material closer to the surface. “In doing so, then you’re having higher temperatures at lower pressures, which then can cause [further] melting to occur,” said Cooper. While the residual portion of the mantle cools, the whole process—dehydration stiffening, rifting, and cooling—repeats in a new section.

“This is a very nice study that unifies many parts of the complicated story of craton formation,” said Lijun Liu, a geodynamicist at the University of Illinois at Urbana-Champaign not involved in the research. “Because it’s a numerical model, it comprehensively brings together many parts [of craton formation] that were hard to reconcile previously.” But, he added, this mechanism doesn’t explain the entire story of cratonic origins.

“It sets the stage for the right material,” said Cooper. But scientists know that cratons are extremely thick, and she said that this mechanism doesn’t fully explain how that happened. “This is a great way to form the material that needs to be thickened later, or further thickened,” said Cooper.

This mechanism aligns with observations of modern cratons. By studying the composition of xenoliths containing pieces of the Archean cratonic lithosphere (brought to Earth’s surface through volcanic activity), scientists can learn about the composition of cratons. The composition also suggests what kinds of conditions might have existed to form that rock, and Capitanio’s mechanism accounts for the pressure and temperature conditions that scientists know are needed to form material from the Archean cratonic lithosphere.

As scientists gain a firmer grasp of the origins of cratons, they’re better able to understand processes that might be happening within other planets as well as the processes that helped form our own. “[Cratons] have kind of gone along for the ride, picking up all of Earth’s secrets for all this time,” said Cooper. “They’re such an intriguing scientific story.”

—Jackie Rocheleau (@JackieRocheleau), Science Writer

Freshened Groundwater in the Sub-seafloor

Mon, 01/11/2021 - 13:16

Offshore freshened groundwater (OFG) is water hosted in sediments and rocks below the seafloor. It can be found offshore of most continents around the world and could possibly become a source of potable water for human populations living near the coast. A recent article in Reviews of Geophysics describes a range of geochemical, geophysical, and modeling approaches that have been used to investigate OFG systems. Here, the lead author gives an overview of what we know about OFG and where it occurs, and what research questions remain.

What is offshore freshened groundwater?

Schematic figure showing how freshened groundwater was deposited offshore when the seafloor was exposed at lower sea-levels. Credit: MARCAN project

Offshore freshened groundwater (OFG) is water that has a salinity lower than seawater and that is stored in sediments and rocks below the seafloor.

Freshened groundwater ends up at the bottom of the ocean in various ways.

One way is the recharge of aquifers by rainfall, either in the past when sea-levels were lower (upper panel, right) or in the current day where onshore aquifers extend offshore (lower panel, right).

Another way is via glaciers, which can deposit freshened groundwater offshore through basal melting and the development of subglacial streams and lakes.

Other sources of OFG include the release of freshwater during the alteration of sediments (diagenesis) or the decomposition of gas hydrates.

Where does OFG occur and what factors control its characteristics and distribution?

OFG was first reported in the 1960s and has now been documented in most continental margins (see map below). Based on these known locations, we estimate a global volume of 1 million cubic kilometres of OFG.

Map of OFG records and emplacement mechanisms. Credit: Micallef et al. [2020], Figure 1

The USA Atlantic margin features the highest number of OFG records, followed by northwest Europe and Australia. Most of the records occur within 55 kilometers of the coast and down to a water depth of 100 meters and sub-seafloor depth of 200 meters, predominantly in passive continental margins. However, OFG has been reported up to 720 kilometers from the coast and in water depths of 3 kilometers.

What methods are used to map, measure, and analyse OFG?

Incidental discoveries during scientific and industry drilling, in conjunction with analyses of total dissolved solids and chloride anomalies in pore waters, have provided most of the fundamental information on OFG thus far. Coverage of these borehole data is limited, particularly in continental shelves and the shallow sub-seafloor, and spatially biased toward hydrocarbon regions.

Different configurations of marine electromagnetic systems used to map OFG. Credit: Micallef et al. [2020], Figure 6

Numerical models, on the other hand, have provided a cost-effective method for estimating OFG volumes and emplacement (Thomas et al., 2019). These techniques have been employed at only a few sites around the globe, and most of them are still in the experimental phase.

What are some of the unresolved questions where additional research, data or modelling is needed?

There are many key research questions related to OFG waiting to be addressed. These relate to the distribution, extent, and dimensions of OFG bodies; the mechanisms and timing of emplacement; the control that the geological environment exerts on the spatial distribution of OFG; their function (i.e., whether they are actively recharging or recovering from past hydrological conditions); and how they will respond to climate change.

OFG research sits at the nexus of many fields. Addressing these questions will improve our understanding of the role of OFG in the global water cycle and in biogeochemical cycling, and it will allow us to determine whether OFG can be used as an unconventional source of potable water in coastal regions.

Geochemical, geophysical, and numerical modelling methods will allow us to address some of these questions, but the biggest step-change in our understanding will take place with a dedicated scientific drilling campaign.

—Aaron Micallef (amicallef@geomar.de; 0000-0002-9330-0648), Helmholtz Centre for Ocean Research, Germany and Department of Geosciences, University of Malta, Malta

European Colonists Dramatically Increased North American Erosion Rates

Mon, 01/11/2021 - 13:15

Everything wears away in time, but human activities like farming can dramatically accelerate natural erosion rates. The arrival of European colonists in North America, for instance, sped up the rate of erosion and river sediment accumulation on the continent by a factor of 10, according to a new study.

An international team of researchers from China, Belgium, and the United States analyzed 40,000 years of accumulated river sediment from sites across North America to determine the natural background rate of erosion on the continent. They compared this rate to that of the past 200 years, a time when both agriculture and population rapidly increased following European colonization. During the past century alone, humans moved as much material as would be moved by natural processes in 700–3,000 years, the team reported in November in Nature Communications.

“By having this huge compilation [of data] that stretches back many thousands of years, we’re able to contextualize the human impact against that natural geologic variability,” said lead author David Kemp, a geologist with the China University of Geosciences in Wuhan. “It was a surprise to me that the jump was there and that it seemed to be so neatly coincident with European arrival.”

A Widespread Trend

To reach their findings, the team compiled data on sediment accumulated in riverbeds from 126 sites across the United States and Canada. In 94% of the sites surveyed, sediment accumulation rates over the past 200 years were faster than the expected geological rate. Even more dramatic, nearly 40% had a rate of sediment accumulation at least 10 times that of the background rate. This trend was not confined to a certain location or region, according to the researchers.

“What I found particularly interesting in the results is that if you look at human impact on the sedimentation rate, you see it continent-wide,” said study coauthor Veerle Vanacker, a geomorphologist with Université Catholique de Louvain in Louvain-la-Neuve, Belgium. “I think that’s quite important, because it shows that this is something which has been generalized over the entire area.”

The researchers cite intensive farming as the likely culprit in the increased sediment accumulation rate, with forestry, ranching, and river management also playing roles. Sediment accumulation rates shot up around the turn of the 19th century, a time period that coincides with a sharp increase in both the European population in North America and the amount of land dedicated to agriculture. Prior to that time, humans did not have a noticeable impact on erosion rates in North America.

Accounting for the Sadler Effect

To compare the background rate of accumulation over 40,000 years with accumulation rates over more recent timescales, the team had to account for a known complication called the Sadler effect, named after study coauthor Peter Sadler of the University of California, Riverside. According to the Sadler effect, the farther back in time you go, the slower the erosion rate appears to be, even if the rate is the same in reality. That’s because finer-scale changes are smoothed away over longer time spans, and certain layers can be lost altogether.

“With this effect in mind, you can see how a recent increase in sediment accumulation compared to the past 40,000 years may simply be the result of this time bias,” said Gary Stinchcomb, a soil geomorphologist at Murray State University in Murray, Ky., who was not involved in the study. “I think the most exciting find of this study is that they addressed the time span dependence problem and still found [that] humans affected sediment accumulation” going back 200 years.
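
To see why this bias arises, consider a toy model (not the correction method the authors used) in which the sediment surface takes an unbiased random walk of small deposition and erosion events. The preserved net thickness then grows only as roughly the square root of elapsed time, so rates averaged over longer windows look slower even though the process never changes. A minimal sketch:

```python
# Toy illustration of the Sadler effect (not the study's correction method):
# model the sediment surface as an unbiased random walk of 1 mm deposition or
# erosion events each year. Net preserved thickness grows roughly as the
# square root of time, so the apparent rate falls as the window lengthens.
import random
import statistics

random.seed(0)

def apparent_rate(window_years: int, trials: int = 300) -> float:
    """Mean |net thickness| divided by window length, in mm/yr."""
    rates = []
    for _ in range(trials):
        surface = 0.0
        for _ in range(window_years):
            surface += 1.0 if random.random() < 0.5 else -1.0
        rates.append(abs(surface) / window_years)
    return statistics.mean(rates)

for window in (100, 1_000, 10_000):
    print(f"{window:>6}-yr window: apparent rate ~ {apparent_rate(window):.3f} mm/yr")
```

In this toy model the apparent rate drops roughly tenfold for every hundredfold increase in the measurement window, which is why the team had to strip out this bias before comparing 200-year rates against the 40,000-year record.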

Informing Restoration Efforts

According to the researchers, their findings can help inform modern soil and water conservation efforts by providing a benchmark for natural erosion rates. “There are large and costly river valley restoration projects under way all over North America,” Stinchcomb said. “One could argue that the work presented in [this study] shows us that we will need to peer back before 200 years ago if we want to restore these streams to a more ‘natural’ condition.”

The most recent data in the study also help to provide a glimpse at whether ongoing restoration efforts have worked. “There have been huge investments in soil and water conservation techniques, and one of the questions is always the effectiveness of these techniques,” Vanacker said. “I think that the results show these programs can probably be very effective, because you see that during the last decades, there already seems to be a reduction of the sedimentation rates.”

—Rachel Fritts (@rachel_fritts), Science Writer

A Tried-and-True Medium to Broaden the Reach of Science

Mon, 01/11/2021 - 13:09

The importance of communicating science to the public is being increasingly recognized in light of ongoing climate change, the COVID-19 pandemic, and many other issues affecting everyday people. However, most current outreach endeavors—science days, open houses, podcasting, and the like—skew heavily toward middle- and upper-class audiences.

Even the Internet is not universal. As of April 2020, 59% of people around the world actively use the Internet [Statista, 2020]. This figure is higher in the United States, but 27% of Americans still lack high-speed Internet access, and access is not uniform across demographics. Forty-four percent of U.S. households earning less than $30,000 per year lack high-speed Internet compared with just 8% of households making more than $75,000 per year. The disparities are similar among groups of different educational levels and are more pronounced among various ethnic groups, in Black and Hispanic communities, and in rural areas [Pew Research Center, 2019].

Television reaches audiences spanning a broad range of demographics. Here the author interviews Domingo Kerai in the Republic of Palau, an archipelago in the western Pacific ocean basin, on local fishing trends for the episode “Preserving Palau” of Voice of the Sea. Credit: Yalap P. Yalap

To promote diversity, equity, and inclusion in the geosciences, researchers as well as science communicators need to engage in expanding public perceptions of who does research and how research is conducted, provide connections to local role models for people interested in geoscience careers, and pursue outreach geared toward underrepresented segments of the population. Voice of the Sea, a series of half-hour television episodes now in its seventh year, is attempting to do just that, using a tried-and-true medium that continues to attract large audiences over a broad demographic spectrum.

Television: Where the Audience Is

Reaching underrepresented groups requires looking outside the traditional academic sphere to both novel approaches and existing mediums. Television (TV), for example, remains an effective tool. Despite perceptions among most of my academic friends that it is outmoded, TV is still the most used media platform in the United States. In 2019, Nielsen [2019] reported that Americans watched more than 4 hours of TV daily, with viewing times ranging from 22% of daily media time for young adults (ages 18–34) up to 58% for adults over 65 years.

During the current pandemic, TV viewing for entertainment and education has increased, both in the United States [Nielsen, 2020a] and globally [Mueller and Taj, 2020], raising its relative importance in delivering information. Indeed, TV is a primary source of reliable information for vast segments of the population. Some 65% of surveyed Hawaii residents, for example, reported TV to be the most reliable source of information about important issues facing their communities. This percentage was similar to that reported for newspapers (61%) and far outdistanced social media (26%), radio (19%), and word of mouth (15%) [Ward Research, 2014]. This trust in TV media makes it a valuable resource for sharing important science and cultural information.

The Power of a Good Video

Words are powerful conveyors of information. However, when it comes to communicating complex topics like climate change, video is superior to text [Goldberg et al., 2019]. The reasons for video’s effectiveness in science communication are not fully understood but may relate to high production value, imagery that helps familiarize concepts and link ideas to real life, and viewers visually witnessing scientists agreeing with other experts’ findings [Goldberg et al., 2019].

Increasing awareness of science, technology, engineering, and mathematics (STEM) careers through video can improve the accuracy of students’ perceptions and increase their desire to pursue STEM careers [Wyss et al., 2012]. Moreover, once created, video products can be shared directly in classrooms, education centers, and libraries; online via social media, websites, and streaming services; and on TV and can even be adapted for podcasts.

The author joins Samantha Davis at her herbivore study site to inspect a cage designed to protect large macroalgae from fish predation off the coast of Moorea, French Polynesia, for the episode “Sammy’s Reef.” Credit: Thor T. Seraphin

Authentic Science Stories, in Depth

In 2009, when my colleagues suggested that we propose to develop a TV series as a companion to a high school marine science curriculum in Hawaii, I jumped at the chance. I wanted to tell authentic science stories through a pedagogical lens using educational research about the way people learn.

Still, I wondered if my efforts would be better spent working on short, 15- to 30-second pieces focused on interesting facts about marine science. Indeed, some grant reviewers felt that the public would not be willing to watch half-hour episodes devoted to topics like plankton, jellyfish, and aquaponics. The consensus is shifting, however, as the public’s appetite for long-form, informational media, like podcasts, is growing—and projected to double in popularity by 2023 [Nielsen, 2020b]. Thus, even as flashy, attention-grabbing content has become increasingly needed to penetrate social media markets, TV has proven to be a stable platform useful in addressing equity issues in science.

Now in its seventh TV season, Voice of the Sea has aired more than 100 original episodes. A series of surveys with 650 viewer respondents has shown that a broad cross section of them gained significant content knowledge after watching an episode of the show. They also reported increases in interest, understanding, and motivation [Duncan Seraphin et al., 2017].

Each episode of Voice of the Sea is broadcast twice weekly on TV in Hawaii and is viewed by more than 25,000 people [Marshall Marketing, 2019]. Most viewers are homeowners (50%–60%), female (70%–80%), non–college educated (70%–80%), and local (90%–100% have lived in Hawaii for more than 16 years) [Marshall Marketing, 2017].

We also air in other regions of the Pacific: Guam, Palau, Micronesia, and American Samoa. Although viewership in the outer Pacific is less studied, we know there are fewer TV channels in these areas (and thus fewer programs from which to choose), so a higher percentage of the 300,000 people living in these regions probably views our programming.

Since 2014, we have won 30 national Telly Awards for educational TV programming, including a Gold award in 2020 for our episode about Export Processes in the Ocean from Remote Sensing (EXPORTS), the joint National Science Foundation and NASA project to model carbon sequestration in the deep sea. (The publicity trailer for this episode appears below.)

Our online presence is newer: We first started putting episodes online in 2017. Since then, we have built a following of 4,000 YouTube subscribers, a social media presence, and a website, where each episode is paired with additional content and student activities.

Scientists Help Tell Their Stories Well

Researchers and educators typically leave video and television stories about science to production and marketing professionals. The resulting glossy products may have high production value, but they usually fall short in terms of accuracy, authenticity, and complexity. So often, TV producers focus on the story to the exclusion of the science.

My own experience reflects this. While I was in graduate school, my lab studied large predatory fishes, primarily sharks and tunas, and was frequently filmed for national TV programs. I regularly saw my work distorted in these portrayals.

In one documentary, our work tagging sharks was paired with video of people frolicking on a beach to build a sense of danger and fear among viewers, despite our study site being nowhere near the beach. In another documentary, the narrator said that I “cut off the shark’s clasper—that’s shark talk for penis” to perform genetic analysis. In reality, I had taken a small piece of tissue from the neighboring pelvic fin, but that fact did not enter the story—or stop the throng of emails from concerned viewers who had watched the show!

Moreover, typical TV science focuses on a limited range of topics, like charismatic megafauna (e.g., elephants, pandas, and polar bears) and catastrophic natural hazards. While I spent the early 2000s being filmed for my work on hammerhead and tiger sharks, my ichthyologist colleagues studying tidepool blennies and cryptic, sand-dwelling gobies waited for their phones to ring.

Here the author interviews Hanna Mounce about her team’s work to save the kiwikiu, a small forest bird, at Nakula Natural Area Reserve, Maui, Hawaii, for the episode “Saving the Kiwikiu.” By engaging in video and episode development, researchers can shape story narratives, highlighting the meaning and fascination of research on even the smallest creatures. Credit: Bryan Berkowitz

By engaging in video and episode development, researchers can shape story narratives, ensuring that end products are accurate and meaningful. Finding a compelling narrative to begin with is a key early step in this process, and it can be challenging. But honestly, there are so many excellent stories out there—of who scientists are, how they got interested in research, what their work is really like, and what has surprised them—that finding suitable narratives has become one of the more enjoyable aspects of my work. And there are many models, coaches, seminars, and workshops to help hone science storytelling.

I try to approach Voice of the Sea episodes from an educational and communications perspective. I believe that by involving ourselves in the production of media and TV related to our areas of interest and study, we researchers can help showcase diversity in our ranks and promote real-life heroes, stimulating even more diversity in the next generation of scientists.

Nuts and Bolts of TV and Media Production

Make no mistake, however, TV production is hard work. Voice of the Sea ultimately provides free, high-quality, ready-to-air content to TV stations, but for those of us behind the show—and, indeed, for most educational science programming—securing funding is an ongoing issue, on top of the many challenges of planning, filming, and assembling episodes. The main expenses in our production include filming, editing, graphics, and associated travel; other costs include those for closed captioning, translation (e.g., to Hawaiian, Samoan, and Spanish), and subtitling to extend our reach online and in science centers.

Video production is hard work, involving coordinated efforts behind the scenes as well as in front of the camera. Here the Voice of the Sea crew and Maui Forest Bird Recovery Project team plan their filming strategy for the episode “Saving the Kiwikiu” at Nakula Natural Area Reserve, Maui. Credit: Bryan Berkowitz

We have had success winning funds from local foundations and through small grants to produce episodes about specific topics. But most of our funding for the past four seasons has come by way of “broader impacts and outreach” components of research grants awarded to scientists whose work we cover in Voice of the Sea episodes. Although we do not have the capacity to sell commercial spots for the Voice of the Sea series, commercials could be a revenue stream to offset production costs for other such TV projects.

Partnering with a production company or a university media department is arguably the most feasible way to create longer science-focused content for TV (i.e., 30- to 60-minute episodes). Such content can then be broadcast on local, public, or university-run TV stations that are often willing to take one-off specials but that may not have available airtime in their broadcast schedule for a whole series. Finding a home station for a regular (e.g., weekly) TV series is more difficult and usually requires production of a pilot or test episode and the guarantee of at least 12 to 19 new episodes per year.

Local Science, Coming to a TV Near You

Voice of the Sea is in good company on TV. The U.S. Public Broadcasting Service works with K–12 students to produce programming for local channels, like HIKI NŌ in Hawaii, which offer opportunities for researchers to share their science. There are a handful of dedicated, regional science TV series as well. In Puerto Rico, for example, TV viewers can watch GeoAmbiente, a Spanish language series that shares information about local science and conservation efforts.

New series are also being developed. The University of Rhode Island’s Inner Space Center, for example, which produces video and provides telepresence connectivity on oceanographic research cruises, has been working to extend its resources to deliver content in TV-friendly packages. This type of regional TV episode production has the potential to be useful at many universities and research institutions across the country as a means of outreach and a way for communities to learn about science research happening where they live (see video below). In fact, one of the most common requests the Voice of the Sea team receives comes from people asking us to make an episode about what is happening at a particular reef, beach, or stream near them or that they especially care about.

Making this type of connection to place elevates the value of, and the opportunities provided by, regional TV to share local research with local stakeholders—to profile science role models who look and talk like their viewers do. By increasing exposure to potential opportunities and relatable role model connections, we move closer to increased diversity, equity, and inclusion across scientific disciplines.

Acknowledgments

The Voice of the Sea series is produced through the University of Hawaiʻi Sea Grant, in collaboration with Kauai Sound and Cinema Media Corporation, Aberdeen Broadcast Services, and the experts profiled in each episode, who provide critical knowledge, images, and video to tell their stories. Video clips are provided courtesy of Voice of the Sea.

What Causes Centennial Changes in the Indonesian Throughflow?

Mon, 01/11/2021 - 12:30

The Indonesian seas permit the only equatorial link between two major ocean basins and so provide a pathway for inter-ocean exchange of mass, heat, and freshwater between the Pacific and the Indian Oceans that plays a fundamental role in the coupled ocean and climate system. In general, changes in the strength of this Indonesian Throughflow (ITF) are thought to be primarily driven by regional forcing related to winds and rainfall in the Indo-Pacific. However, Sun and Thompson [2020] show that, at least on centennial time scales, remote forcing in the high-latitude North Atlantic is responsible for driving changes in the ITF.

Using idealized models and various coupled-climate models, the study finds that transient changes in the ITF and the Atlantic Meridional Overturning Circulation (AMOC) are dynamically linked (see figure). In particular, the projected centennial-scale weakening of the ITF under enhanced greenhouse gas forcing in the 21st century is attributed to a weakening of the AMOC that is transmitted via oceanic planetary waves. This teleconnection from the high-latitude North Atlantic could thus play a key role in regulating the future tropical climate system of the Indo-Pacific.

Citation: Sun, S., & Thompson, A. F. [2020]. Centennial changes in the Indonesian Throughflow connected to the Atlantic meridional overturning circulation: The ocean’s transient conveyor belt. Geophysical Research Letters, 47, e2020GL090615. https://doi.org/10.1029/2020GL090615  

―Janet Sprintall, Editor, Geophysical Research Letters

Cape Cod: Shipwrecks, Dune Shacks, and Shifting Sands

Fri, 01/08/2021 - 12:47

Flexing into the Atlantic like a boxer’s raised fist, Cape Cod is one of the most distinctive landforms on Earth. The skinny, crooked peninsula was created by glaciers during the last ice age and sculpted into its present-day form by rising sea levels. In geologic timescales, Cape Cod is just a baby, only a few thousand years old. And with rising seas already lapping at the sandy spit, it’ll be gone in a geologic blink.

The Ghosts of Glaciers Past

Cape Cod, Mass., is famous for being a summer beach paradise but owes its entire existence to ice. During the Wisconsin Glaciation, the final period of the last ice age, the Laurentide Ice Sheet covered all of what is now eastern Canada and New England in ice up to 3 kilometers thick. The southern reaches of this ice sheet extended down the Atlantic coast to an archipelago of glacial landforms collectively called the Outer Lands: Long Island, Staten Island, Cape Cod, Martha’s Vineyard, and Nantucket Island.

Cape Cod’s kettle ponds (long home to cranberry bogs) are remnants of the peninsula’s glaciated past. Credit: National Park Service

“At the peak of the last glacial, where I am sitting was under ice,” said Robert Thieler, director of the U.S. Geological Survey’s Woods Hole Coastal and Marine Science Center in Woods Hole, Mass. As it advanced and retreated along the coast, this mass of ice plowed enormous piles of sediment and glacial fill along its front. “These terminal moraines make up the long east-to-west spine of Cape Cod, as well as the north-to-south oriented moraine that runs under Woods Hole,” Thieler said.

Around 23,000 years ago, the Laurentide Ice Sheet reached its maximum extent and then started retreating, leaving a highly malleable landscape in its wake. Radiocarbon dating indicates that the bulbous landmass that would become Cape Cod was likely ice free by around 18,000 years ago.

The pockmarked scars of this retreat can still be seen on the cape in the form of hundreds of kettle ponds. Kettle ponds form when chunks of ice break off from a retreating glacier and get packed in sediment, which insulates the ice. As the ice slowly melts, it creates a round, water-filled depression. On the cape, kettle ponds often host cranberry bogs, where wild cranberry vines take root in the organic matter that collects in the potholes. Archaeological evidence indicates that the Wampanoag people harvested wild cranberries from kettle ponds on the cape as far back as 12,000 years ago.

Drowning Georges Bank

Even after the ice sheet retreated from the coast, so much water was still locked up in ice that global sea levels remained much lower than they are today. Several huge lobes of land were exposed along North America’s eastern seaboard, including two lobes south and east of what is now Cape Cod. At times, these lobes were forested and occupied by megafauna and Indigenous peoples; mammoth bones and ancient artifacts are habitually dredged up far off the current coastline.

As the ice age continued to wane, sea level rose in pulses. During some of these pulses, sea levels rose so rapidly that geologists don’t have a clear picture of what the processes of coastal erosion and inundation would have looked like. “It’s a fascinating problem for geologists because the process we are trying to understand is destroying the record that we need to study to learn about it,” Thieler said. “It’s a very fragmentary record out there on the continental shelf. Finding those fragments and stitching them together into a compelling story is a big challenge.”

Models show that one of the major turning points in the shaping of Cape Cod was the inundation of Georges Bank, said Graham Giese, an oceanographer and sedimentologist at the Provincetown, Mass., Center for Coastal Studies. Now a famous offshore commercial fishing ground, Georges Bank was once the outermost lobe of the Outer Lands. “As sea levels rose and waves started washing over Georges Bank, the increasing wave energy reversed the direction of sediment transport on the cape,” Giese said. Previously, sediment had mainly moved along the outer shore of the cape from north to south, driven by northeast storms coming down from the Gulf of Maine.

“Back then, there was no hook at the end of the Cape,” Giese said, referring to the upraised fist of Cape Cod, where Provincetown is located. However, as more wave energy started crossing over Georges Bank, sediment started moving from south to north, and the Provincetown hook began growing.

Since around 6,000 years ago, when both climate and the existing ice caps stabilized, the planet has experienced a period of fairly stable sea levels. Radiocarbon dating of the marshes on Cape Cod indicates that the cape’s distinctive shape likely emerged between 5,000 and 4,000 years ago. “A few thousand years is a blink of an eye, geologically speaking,” Thieler said.

A Shapeshifting Spit of Sand

Cape Cod’s shapeshifting continues in the present day. “Even as a kid, I understood that Cape Cod is not a permanent place. You see big changes happening all the time,” said Tim Famulare, an environmental planner for Provincetown. These changes are most obvious during storm surges, which can flood the cape and shift huge quantities of sand in a matter of hours, but nuisance flooding during high tides is also becoming more common.

In the winter, Provincetown is a quiet fishing village home to around 3,000 year-round residents. In the summer, however, the community at the hooked end of the cape turns into a completely different beast, hosting as many as 100,000 revelers on weekends. “Provincetown is a very dynamic place that withstands a lot of change between seasons,” said Richard Waldo, the town’s director of public works.

The population of Provincetown, as seen from the top of the Pilgrim Monument, can swell from 3,000 in winter to more than 100,000 on summer weekends. Credit: Mary Caperton Morton

But just how much change the town can withstand remains to be seen. In 2016, Provincetown conducted a coastal resiliency risk assessment. “The most alarming findings were the projections for future sea level rise,” Famulare said. Since 1922, sea levels on the cape have risen by 28 centimeters, and projections for 2100 range up to 3 meters. “Provincetown is very densely developed, and as sea level rises and flooding events become more frequent, we really don’t have anywhere to retreat to,” he said. “Some of our most important assets, like our airport, are located in low-lying areas.”

On 4 January 2018, a higher-than-predicted storm surge delivered by a nor’easter during high tide inundated parts of Provincetown. “The last time we experienced flooding that severe was in 1978, so that kind of flooding was out of the memory of most of our residents,” Famulare said. “It was really a wake-up call for what will happen as sea level rises.” The upshot of that storm, however, is that “our grant work has since been very successful and we’ve been able to undertake several adaptation and mitigation projects to lessen the extent and impact of flooding on some of the areas that were hit during that storm.”

For example, town planners secured a grant for a dune enhancement project to elevate the beach west of MacMillan Pier, a low-lying area near the historic downtown that was a major inundation pathway for flood waters during the January 2018 storm event. But although the Provincetown hook is still growing, the incoming sand isn’t easily accessed for beach nourishment projects, Famulare said. “We have more sand coming in, and that’s a wonderful problem to have, but it comes into the wrong spot, and because of state and environmental regulations we can’t just move it to where it needs to go. Once it starts getting shellfish and beach grass and eel grass growing in it, that sand becomes a sensitive aquatic habitat, and we can’t just dig it up.”

Shipwreck Shacks on the Outer Cape

Provincetown is taking progressive steps to protect itself from future flooding, but the Cape Cod National Seashore, located on the other side of the peninsula on the Outer Cape, subscribes to a very different, hands-off approach, Waldo said. “The philosophy of the national seashore is to leave it alone and let nature take its course.”

This is the wilder side of Cape Cod that I am most familiar with, having spent time in a historic dune shack on the national seashore. In the late 1800s, when shipwrecks were still common on the shoals and sandbars off the coast of Cape Cod, a series of shacks was built along the Outer Cape to provide shelter and supplies to shipwrecked survivors. With better mapping and navigation, shipwrecks became less common, and the shacks began attracting writers and artists, including Henry David Thoreau, Jack Kerouac, and Jackson Pollock.

In 1961, when the Outer Cape became the Cape Cod National Seashore, the dune shacks, many in disrepair, were slated to be destroyed in an effort to return the seashore to its natural state. But the Massachusetts Historical Commission stepped in and recommended that the shacks be listed on the National Register of Historic Places. Today, the National Park Service owns 18 out of the 19 surviving dune shacks, several of which are available for artist residencies and long-term leases.

In the late 1800s, a series of shacks was built along the Outer Cape to provide shelter and supplies to shipwreck survivors. Credit: Mary Caperton Morton

My dune shack was one of the smallest, a one-room shed with an outhouse and a hand-pumped well. Other shacks are more elaborate, but they all fit in with the dunes’ remote and wild character; you can easily imagine drenched and bedraggled shipwreck survivors dragging themselves to your doorstep.

My dune shack is one of the smallest of the 19 historic shipwreck shacks within Cape Cod National Seashore. Credit: Mary Caperton Morton

The year I was in the dunes, a seldom-seen and ghostly relic from the shipwreck era reappeared near Race Point, the northernmost point of the cape. The wreck of the HMS Somerset, a British warship that ran aground on 2 November 1778, resurfaced as it had in 1886 and 1973. The jagged, waterlogged timbers of the ship’s hull always emerge on the beach in exactly the same spot, a reminder that although change is endemic to the cape, some places are actually quite stable, Giese said.

Before a railroad was built in 1873, and then a road in 1877, there was no overland route to the tip of Cape Cod. Credit: top: U.S. Topographical Bureau; bottom: U.S. Geological Survey

The spot where the wreck of the Somerset lies in state actually represents the Outer Cape’s tipping point between erosion and accretion, Giese explained. Although the sandy beaches of the central arm of the cape are rapidly eroding, the dunes on the back of the hand and fist are still growing, as is the Provincetown hook.

The ever-growing dunes will keep Provincetown from becoming an island, cut off from the mainland, for as long as there is beach sand available to feed the dunes, Giese said. But although the wild dunes will endure, the fate of the road that connects Provincetown to the mainland is in doubt. “I do get nervous when I see sand washed up on the highway after a storm,” Famulare said. “As sea level rises and flooding events become more frequent, it won’t take as much of a storm surge to flood the road and cut us off from the rest of the cape.”

Keep Your Dukes Up

How long will Cape Cod keep its pugnacious fist raised against the rising seas of the Atlantic Ocean? “Cape Cod will cease to exist when sea level reaches the highest elevation on the landform,” Giese said. The highest point on the arm of the cape is Scargo Hill at almost 49 meters. Provincetown tops out at 28 meters, and parts of Truro dip as low as 8 meters.

However, because of uncertainties in climate models and the rates of sea level rise, predicting when the Atlantic Ocean will overtop those high points is anybody’s guess. “Current projections for sea level rise are in a range that the cape has not experienced since about 9,000 years ago,” Thieler said. “We simply don’t have good modern analogs for what that rate of change will look like.”

In his 1896 book The Outline of Cape Cod, the father of American geography, William Morris Davis, ventured a guess: “The Truro mainland will soon be destroyed and the sands of Provinceland will be swept away as the oceanic curtain falls on this little one-act geographical drama…10,000 years hence.” Modern projections indicate the curtain may fall even faster, Giese said. “William Morris Davis didn’t know about sea level rise, which is, of course, now the main player in this little geologic drama.”

—Mary Caperton Morton (@theblondecoyote), Science Writer

Living in Geologic Time is a series of personal accounts that highlight the past, present, and future of famous landmarks on geologic timescales.

Very Good Space Boys: Robotic Dogs May Dig Into Martian Caves

Thu, 01/07/2021 - 12:57

A pack of four-legged robotic dogs may rove across regions of the rugged Martian landscape never reached before. Known as “Mars Dogs,” the robots are designed to explore deep lava tube caves on the Red Planet to search for evidence of past or existing life—as well as potential sites for building future human colonies.

Scientists presented the latest research on Mars Dogs at AGU’s annual Fall Meeting in December 2020.

“The Mars caves are places with high astrobiological potential,” said Ana-Catalina Plesa, a planetary scientist at the German Aerospace Center who was not involved in the Mars Dogs research. However, Plesa explained, the caves are not easily accessible to current rovers. “[Mars Dogs] may be more reliable in these kind of areas because they have a different way of traveling.”

Rugged Alien Terrain

With high inclines, large boulders, and deep pits, the surface of Mars is a treacherous obstacle course. To avoid tipping over, traditional wheeled rovers must remain on relatively flat regions. In contrast, lava tubes—formed when subsurface lava flows created tunnels after volcanic eruptions—typically exist at high elevations that are inaccessible to landing. Today’s rovers would need to land in a low-lying region and climb up, a forbidding journey that could take years.

Even if a rover could reach the caves, it would have to descend through a steep entrance. Cave walls would act as a shield against communication with scientists on Earth, leaving the robot to navigate and perform missions on its own.

Enter Mars Dogs, built by scientists at NASA’s Jet Propulsion Laboratory (JPL) and the California Institute of Technology (Caltech). These artificially intelligent, four-legged robots—provided by Boston Dynamics and called “Autonomous Spot,” or “Au-Spot”—are designed to plumb the depths of Martian caves. Engineers tested their creations in lava tubes at Northern California’s Lava Beds National Monument, an environment comparable to what they would face on Mars.

Extreme Exploration

Each robot can walk as fast as 5 kilometers per hour, or 38 times faster than the Curiosity rover that has explored Mars since 2012. Unlike Curiosity, a Mars Dog can autonomously travel dozens of kilometers over extreme terrains. To successfully navigate dark lava tubes, scientists are developing a recovery algorithm to help the robot stay upright and recover if it does topple.

“We are teaching the robot to be able to expect the unexpected while exploring,” said lead researcher Thomas Touma, a robotics engineer at NASA JPL.

Scientists are also working on a robotic arm to interact with the environment and a tethering system to allow a pack of Mars Dogs to work together to lower themselves into caves. While inside, the robots could continue to act in synergy. They could also talk to their counterparts, share what they are seeing in real time, charge each other, and help one another store samples.

“The possibilities are endless because these robots are perceptually aware and actively learning,” Touma said.

The robots recently won the urban circuit of the 2020 Defense Advanced Research Projects Agency’s Subterranean Challenge, the world’s most competitive extreme exploration robotics challenge.



With no substantial atmosphere, Mars is hostile to life at the surface. The lava tube caves, shielded from drastically varying temperatures and bombardment by radiation, may be better suited to preserve biological evidence from the planet’s wetter past. They may also be adaptable for future human colonies, Touma said.

The Red Planet won’t see these four-legged explorers any time soon, however.

“While we are getting really close to having a walking system on the surface of Moon or Mars, there are still a lot of technical questions that need to be answered and a lot of advanced engineering that needs to be done to make these systems flight ready,” said NASA JPL robotics engineer and senior researcher Ali Agha.

—Isabella Backman (@IzzyBackman), Science Writer

Martian Dust Activities Induce Electrochemistry

Thu, 01/07/2021 - 12:30

Homogenized by global storms, Martian dust contains an abundant amorphous (poorly crystalline) component, the source of which has been unknown. Wang et al. [2020] investigate the hypothesis that electrostatic discharge (like lightning or aurora on Earth) during dust activities amorphizes chlorine- and sulfur-bearing salts in dust by disrupting their crystal structure. Dust activities that would generate charged particulates to induce phase changes by electrochemistry include grain saltation, dust devils, and global dust storms.

The authors conducted electrostatic discharge experiments in a Mars chamber and confirmed, using Raman spectroscopy, X-ray diffraction, Mössbauer spectroscopy, and Vis-NIR spectroscopy, that the discharges produced amorphous materials from hydrated sulfur- and chlorine-bearing salts. Other phase changes produced by the experiments involved dehydration (loss of structural water) and oxidation of chlorine, sulfur, and iron. If the prevalent amorphous component in dust is predominantly produced by electrostatic discharge during dust activities, then it must represent a significant, frequent, and ongoing process across the present-day Martian surface.

Citation: Wang, A., Yan, Y., Dyar, D. M., Houghton, J. L., Farrell, W. M., Jolliff, B. L., et al. [2020]. Amorphization of S, Cl‐Salts induced by Martian dust activities. Journal of Geophysical Research: Planets, 125, e2020JE006701. https://doi.org/10.1029/2020JE006701

―Mariek Schmidt, Associate Editor, JGR: Planets

Special Collection on Open Collaboration Across Geosciences

Wed, 01/06/2021 - 13:06

The way we do science is continually changing. There is now greater collaboration across disciplines and an expansion of open science approaches throughout the research lifecycle (see figure below). While some fields are benefitting from this growth and change, others have been slower to capitalize on new opportunities.

Designing Open Science into all aspects of the research cycle is achieved through intentionality. Credit: U.S. Department of Energy

Geosciences encompasses a tremendous breadth of fields, and each discipline approaches its research differently. Some disciplines are strongly interconnected across researchers, methods, and instrumentation (e.g., seismology). Others succeed through more independent or specialized research programs that may focus on a single data type or field site. Applying diverse and complementary scientific approaches forms the landscape of scientific discovery we have today. However, not all geoscience disciplines have had the same opportunities to develop research approaches that enhance cross-site synthesis, cross-disciplinary integration, and/or standardized sharing of information and data.

One approach to doing science that is intentionally focused on synthesis and collaboration was coined ICON science or ICON-FAIR in a 2019 U.S. Department of Energy (DOE) Biological and Environmental Research (BER) workshop report. ICON refers to science that:

Integrates processes across traditional disciplines (i.e., physical, chemical, and biological) and across spatial and/or temporal scales;

Coordinates use of consistent protocols across systems to generate data that is interoperable across systems and researchers, often with a focus on data types needed to inform, develop, and improve models;

Openly exchanges data, software, and models throughout the research lifecycle that are findable, accessible, interoperable, and reusable (FAIR), such that all researchers are enabled to contribute and leverage resources; and

Networks efforts, whereby data generation and/or sample collection are done with and for the scientific community, creating research that is mutually beneficial while providing resources (e.g., data, models, sensors) to contributors that otherwise would be difficult or impossible for them to access.

The goal of ICON science is to enhance synthesis, increase resource efficiency, and create knowledge that transcends individual systems.

The WHONDRS consortium (see figure below) is a river corridor ICON use case. Pursuing ICON science is not an all-or-nothing endeavor, and elements of ICON are used across all of the geosciences. There is, however, significant variation in the degree to which and how ICON principles are implemented. For some geoscience disciplines, ICON principles have been implemented for decades, even if they don’t use the ICON terminology. Other disciplines are trying to find ways to implement ICON principles. In turn, there is an opportunity to grow as a geoscience community by sharing our collective experiences, perspectives, guidance, and lessons learned on how to implement ICON principles.

The Worldwide Hydrobiogeochemistry Observation Network for Dynamic River Systems (WHONDRS) was designed to align with ICON principles. It creates multidisciplinary, model-relevant studies with community input, sends out free sampling kits to the scientific community with video and written protocols, analyzes the samples, and openly publishes all data in a standardized format. Credit: WHONDRS (Shelby Smith, Jackie Wells, Adam Killebrew, Marcy McCall); Google Map data © 2020 INEGI, Imagery © 2020 NASA, TerraMetrics

To capitalize on this opportunity, researchers across all the geosciences are invited to contribute to a special collection hosted by the AGU open access journal, Earth and Space Science. ICON science comes in many forms, and the collection is meant to reflect that breadth. The collection will be a resource derived from diverse voices from which researchers can learn how different disciplines have (or have not) engaged with ICON science. The goal is for researchers to use the knowledge and lessons contained in the collection to find creative ways to implement ICON science that are applicable and make sense for their own disciplines.

The special collection itself aims to be an example of ICON principles in action. It will integrate knowledge across geoscience disciplines through coordinated writing teams operating openly with a network of contributors for mutual benefit of all. Each contribution will be written through collaborative open writing teams and will (1) describe opportunities and challenges of applying ICON science within and across disciplines; (2) demonstrate applications and efficacy of successful tools and methods; (3) elucidate gaps among disciplines in their opportunity and motivation to implement ICON science; and (4) identify opportunities for cross-discipline collaboration, thus reducing the activation energy of applying ICON principles.

The collection, titled “The Power of Many: Opportunities and Challenges of Integrated, Coordinated, Open, and Networked (ICON) Science to Advance Geosciences,” will consist of commentary articles representing different geoscience disciplines, with the aim of encompassing all 25 AGU sections.

A primary goal of the ICON special collection is to provide a venue for a diverse set of voices that have different experiences, opportunities, and challenges. How individuals implement ICON principles in their research will vary depending on these factors, and intentionally building diverse author teams will mitigate bias.

We encourage geoscientists of all levels (undergraduates through senior scientists) and all countries to be involved in this project. We would like to specifically welcome contributions from communities historically underrepresented in science, including women, BIPOC, people with diverse abilities, and LGBTQ+ scientists. The only limitation on involvement is that contributions must fall within the scope of AGU’s science.

Writing teams will be finalized by February 2021. Learn more about the approach to the special collection, sign up to get involved, reach out with questions, and please spread the word.

—Amy E. Goldman (amy.goldman@pnnl.gov; 0000-0003-0490-6451), Pacific Northwest National Laboratory; Sujata R. Emani (0000-0003-1118-8689), USDA Agricultural Research Service; Lina C. Pérez-Angel (0000-0002-2920-7967), University of Colorado Boulder; Josué A. Rodríguez-Ramos (0000-0002-2049-2765), Colorado State University; James C. Stegen (0000-0001-9135-7424), Pacific Northwest National Laboratory; and Peter Fox (0000-0002-1009-7163), Editor in Chief, Earth and Space Science
