Feed aggregator

Monitoring the Agulhas Current Through Maritime Traffic

EOS - Mon, 06/28/2021 - 13:49

As Earth’s climate changes, so too will its oceans. Water temperatures are climbing, sea levels are rising, and ocean currents are shifting. Researchers typically study the ocean surface with radar altimeters, which send a microwave pulse toward the ocean and measure the time it takes to rebound, but altimetry data are useful only over large areas and long timescales. In a new study, Le Goff et al. turn to maritime data to create a more precise picture of ocean currents.
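The altimeter measurement itself reduces to a time-of-flight calculation. Below is a minimal sketch of that principle in Python; the altitude and echo time are invented values, and real processing applies many instrument, atmospheric, and sea-state corrections that this toy version ignores.

```python
# Toy time-of-flight calculation behind radar altimetry (illustrative only).
C = 299_792_458.0  # speed of light, m/s

def sea_surface_height(satellite_altitude_m: float, round_trip_s: float) -> float:
    """Sea surface height below the satellite's reference altitude.

    The microwave pulse travels down and back, so the one-way range is
    c * t / 2; subtracting that range from the satellite's known altitude
    gives the height of the sea surface relative to the reference frame.
    """
    one_way_range_m = C * round_trip_s / 2.0
    return satellite_altitude_m - one_way_range_m

# Hypothetical example: a satellite at 1,347 km whose echo returns in ~8.99 ms.
print(f"{sea_surface_height(1_347_000.0, 8.9856e-3):.1f} m")
```

Because each radar footprint spans kilometers and repeat passes are days apart, height estimates built this way are inherently averaged over large areas and long intervals, which is the limitation the study works around.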

Historically, data on ocean surface currents were based on ships’ logs, which tracked how intense currents affected a vessel’s course or speed. But today’s ships are equipped with much more precise geopositioning technologies. Merchant ships continually transmit their position, bearing, and speed through the Automatic Identification System (AIS), providing mountains of data that are more precise than ever before. Previous studies have shown that surface current velocities from AIS data match well with those predicted by high-frequency radar measurements.
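The principle behind AIS-derived currents can be sketched in a few lines: a ship’s velocity over ground (from satellite positioning) minus its velocity through the water (from heading and speed log) estimates the surface current carrying it. The function below is a hypothetical illustration of that idea, not the authors’ actual inversion scheme.

```python
# Illustrative estimate of surface current from ship velocities (not the
# authors' method): current = velocity over ground - velocity through water.
import math

def surface_current(course_deg: float, sog_ms: float,
                    heading_deg: float, stw_ms: float) -> tuple[float, float]:
    """Return the (east, north) surface current in meters per second.

    course_deg, sog_ms: direction and speed over ground (e.g., from GPS).
    heading_deg, stw_ms: heading and speed through the water.
    """
    def to_en(direction_deg: float, speed: float) -> tuple[float, float]:
        rad = math.radians(direction_deg)  # compass convention: 0 deg = north
        return speed * math.sin(rad), speed * math.cos(rad)

    ge, gn = to_en(course_deg, sog_ms)    # velocity over ground
    we, wn = to_en(heading_deg, stw_ms)   # velocity through the water
    return ge - we, gn - wn

# A ship steaming along the shelf edge, pushed southwest by the current:
east, north = surface_current(196.0, 9.3, 190.0, 8.0)
print(f"current ~ ({east:+.2f} E, {north:+.2f} N) m/s")
```

In practice, a study like this one combines many such noisy single-ship estimates statistically rather than trusting any individual transit.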

Here the research team focused on the northern reaches of the Agulhas Current, a strong current that roughly follows the continental shelf break off the eastern coast of South Africa. The current, which has surface velocities of up to 2 meters per second, passes through a region with heavy maritime traffic. Using AIS data from vessels in transit through the region in 2016 and mathematical modeling, the team was able to reconstruct the surface current. The authors used surface current estimates collected by satellites and drifting buoys to validate the AIS-based observations. The study shows how AIS data could be a critical part of a more comprehensive current monitoring system.

According to the authors, the methods could be applied to other regions with heavy maritime traffic, such as the Mediterranean Sea. Monitoring ocean currents is critical: As Earth’s climate changes, so too will ocean surface currents, leading to changes in sea surface temperature and salinity that will ripple throughout marine ecosystems. (Journal of Geophysical Research: Oceans, https://doi.org/10.1029/2021JC017228, 2021)

—Kate Wheeling, Science Writer

‘Oumuamua May Be an Icy Fragment of a Pluto-like Exoplanet

EOS - Mon, 06/28/2021 - 13:49

This is an authorized translation of an Eos article.

In October 2017, astronomers discovered the first known interstellar object passing through the solar system. The visitor, named ‘Oumuamua, is a flattened, shiny body roughly half a city block long.

Since then, scientists have pursued many clues in an effort to determine ‘Oumuamua’s composition and its origin. Some have hypothesized that it is made of hydrogen ice or water ice, while others have speculated that it might be an alien spacecraft.

Now, Jackson and Desch present a new analysis suggesting that the mysterious object is composed mostly of nitrogen ice and may have been ejected from the surface of a Pluto-like body orbiting a star in another solar system.

The new analysis considers ‘Oumuamua’s size, its brightness, the conditions it was exposed to in interstellar space, and the component of its acceleration not explained by gravity. On the basis of these characteristics, the researchers narrowed the range of potential materials and found that nitrogen ice best fits all the clues.

The authors note that nitrogen is not an exotic material in our solar system. Pluto and Triton, Neptune’s largest moon, for example, are both coated in nitrogen ice. ‘Oumuamua may therefore have originated as one of many icy fragments blasted off the surface of a Pluto-like exoplanet.

Further calculations suggest that ‘Oumuamua was launched into space roughly half a billion years ago from a solar system possibly located in the Milky Way’s Perseus arm. The ejection may have been caused by orbital instabilities, similar to those in our own solar system’s early history. (Journal of Geophysical Research: Planets, https://doi.org/10.1029/2020JE006706, 2021)

—Sarah Stanley, Science Writer

This translation was made by Wiley.


Sea-Level Science Coordination: A U.S. and Global Concern

EOS - Mon, 06/28/2021 - 13:37

Sea-level rise is a global problem and one consequence of the changing climate. Its direct and indirect impacts affect many sectors of society, and it is projected to worsen in the coming decades, lending urgency to making science-based information available to stakeholders to underpin adaptation measures. A commentary by Hamlington et al. [2021] addresses how NASA and NOAA, the two U.S. government agencies that provide sea-level observations and science, can coordinate their efforts to serve the needs of stakeholders. The authors recommend continued monitoring of sea-level change, development of integrated science products, improved collaboration with other organizations that distribute sea-level science and guidance at regional and local levels, and coordinated delivery of sea-level products to stakeholders with diverse needs. Coordination between NOAA and NASA on sea-level science has the added benefit of fostering broader collaboration on issues of coastal hazards and resiliency.

Citation: Hamlington, B., Osler, M., Vinogradova, N. & Sweet, W. [2021]. Coordinated Science Support for Sea-Level Data and Services in the United States. AGU Advances, 2, e2021AV000418. https://doi.org/10.1029/2021AV000418

—Eileen Hofmann Editor, AGU Advances

Aftershocks and Fiber Optics

EOS - Mon, 06/28/2021 - 13:36

Over recent years, technological advances have led to new types of seismological measurement strategies for both academic and industry applications, including those that allow for very dense (“large N”) sensor deployments. In particular, existing optical fiber cables, such as those used for internet communications, can be transformed into strings of thousands of quasi-seismometers along many kilometers of cable. Li et al. [2021] show the promise of doing this in a rapid response setting, where an objective might be to record seismic activity after an earthquake. By installing an interrogation unit on a single strand of fiber near the epicenter of the 2019 magnitude 7.1 Ridgecrest earthquake, the authors were able to dramatically increase the number of recorded aftershocks. This demonstrates the potential of such deployments to complement permanent seismometer networks, allowing researchers to zoom into fault zone structure and dynamics at unprecedented levels of detail.

Citation: Li, Z., Shen, Z., Yang, Y. et al. [2021]. Rapid Response to the 2019 Ridgecrest Earthquake with Distributed Acoustic Sensing. AGU Advances, 2, e2021AV000395. https://doi.org/10.1029/2021AV000395

—Thorsten W. Becker, Editor, AGU Advances

The Gaps in Environmental Networks Across Latin America

EOS - Fri, 06/25/2021 - 12:24

This is an authorized translation of an Eos article.

Environmental monitoring is fundamental both for understanding the world and for developing policies to protect it. Environmental observatory networks (EONs) allow scientists to collect, share, and synthesize data to make new discoveries, as well as to inform policy decisions at regional and global scales. But observatory networks are not always evenly distributed; some regions of the world are better monitored than others. Researchers must therefore assess the representativeness of EONs, not only to increase their numbers in underrepresented regions but also to evaluate their applicability to research and policy questions.

In a new study, Villarreal and Vargas carried out such an assessment of FLUXNET, an EON known as the “network of networks,” which measures the exchange of matter, such as carbon dioxide, water, and methane, and of energy between the land and the atmosphere. Although previous research had evaluated EONs using climate and vegetation parameters, here the authors assessed the representativeness of the eddy covariance sites within FLUXNET using species distribution models. The team focused on Latin America, a biodiverse region with major impacts on carbon and water cycles far beyond its borders.

Despite its enormous ecological impact, the density of FLUXNET sites in Latin America is lower than in the United States or Europe. The team identified 41 eddy covariance sites registered with FLUXNET in Latin America as of 2018 and evaluated the network’s ability to monitor patterns of gross primary productivity (GPP), evapotranspiration, and variability in multiple environmental factors, including climate, topography, and soil. The authors then used a multivariate statistical technique to determine how many more FLUXNET sites are needed in Latin America to improve the network’s representativeness for GPP and evapotranspiration.
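The study’s multivariate technique is not detailed here, so the following is only a schematic of how such a representativeness analysis can work, with made-up data and an assumed similarity threshold: a location counts as represented if it lies close to at least one monitoring site in standardized environmental space.

```python
# Schematic representativeness analysis (not the authors' exact method):
# a grid cell is "represented" if it is close to at least one monitoring
# site in standardized environmental space.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: rows are locations, columns are environmental
# variables (e.g., temperature, precipitation, elevation, soil).
region_cells = rng.normal(size=(5000, 4))  # grid cells covering the region
site_cells = rng.normal(size=(41, 4))      # existing monitoring sites

# Standardize all variables using the regional mean and spread.
mean, std = region_cells.mean(axis=0), region_cells.std(axis=0)
region_z = (region_cells - mean) / std
sites_z = (site_cells - mean) / std

# Distance from each grid cell to its nearest site in environmental space.
dists = np.linalg.norm(region_z[:, None, :] - sites_z[None, :, :], axis=2)
nearest = dists.min(axis=1)

# Fraction of the region within an (assumed) similarity threshold.
threshold = 1.0
print(f"Represented fraction: {(nearest <= threshold).mean():.0%}")
```

Under a scheme like this, placing new sites where the nearest-site distances are largest is what lets a few dozen optimally located sites match the gain from hundreds of arbitrary ones.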

They found that the existing FLUXNET sites represented nearly half of GPP and more than a third of evapotranspiration patterns. For climate, terrain, and soil properties, those numbers were 34%, 36%, and 34%, respectively. Unfortunately, data from these sites are not widely available. Currently, the authors note, models must rely on data from FLUXNET sites outside Latin America to make predictions about patterns within the region.

The multivariate analysis showed that adding 200 study sites across Latin America could nearly double the overall representativeness for both GPP and evapotranspiration. With optimally located sites, however, the same increase could be achieved with only 60 sites, although the uncertainty would be much higher.

In the meantime, the authors call for greater coordination and data sharing among researchers in Latin America and caution against “helicopter research,” in which researchers from institutions in developed countries collect data with little or no involvement from local researchers. Ultimately, local contributions will be critical for increasing the representativeness of FLUXNET sites across the region. (Journal of Geophysical Research: Biogeosciences, https://doi.org/10.1029/2020JG006090, 2021)

—Kate Wheeling, Science Writer

This translation by Daniela Navarro-Pérez (@DanJoNavarro) was made possible by a partnership with Planeteando.

Wheels Down for NASA’s Operation IceBridge

EOS - Fri, 06/25/2021 - 12:23

NASA’s Operation IceBridge (OIB) was an airborne mission to survey changing land and sea ice across the Arctic, Antarctic, and Alaska that operated between 2009 and 2021. OIB flew 15 different aircraft on nearly one thousand flights over 13 years, carrying a variety of instruments to understand how the ice in those regions was changing, why those changes were occurring, and how we might project future changes. It was one of the largest and longest-running scientific airborne surveys of the polar regions ever. A recent article published in Reviews of Geophysics examines the 13-year mission. We asked the authors about the findings from OIB and what these might mean for future missions.

What was OIB’s primary goal?

The mission’s primary goal was to bridge the nine-year gap in observations between two successive NASA satellites: ICESat, which operated from 2003 to 2009, and ICESat-2, which launched in 2018 and continues to operate today. These satellites deployed laser altimeters to survey the whole of the Earth. Without OIB, knowledge of changes in ice mass between 2009 and 2018 would have been substantially limited, and our ability to interpret those changes would have suffered even more.

What kinds of data were collected about land and sea ice, and what methods were used to collect them?

Because we were primarily concerned with the data gap in laser altimetry from these two satellites, we deployed a downward-pointing laser altimeter on every flight. Along with that core instrument, we also deployed radar sounders to measure ice and snow thickness, gravimeters to measure bathymetry, magnetometers to detect geologic boundaries, and cameras to detect changes in surface types (for example, open water leads between sea ice floes).

Summer melt on Greenland’s Steenstrup Glacier, photographed during OIB. Credit: NASA/John Sonntag

What were some of the most significant advances in our understanding of the polar cryosphere as a result of this program?

For glaciers and ice sheets, OIB dramatically improved knowledge of yearly changes in outlet glaciers, their total thickness, snowfall rates and their evolution over time, bathymetry both within fjords and beneath floating ice shelves, and the hydrology of water on the surface, within and beneath ice sheets.

For sea ice, OIB significantly advanced our understanding of the variability in sea ice freeboard, thickness and its snow cover across both space and time. The self-consistent and operational analyses of laser altimetry, imagery and snow radar measurements were especially important in making sense of sea ice properties.

Were there any surprising or unanticipated discoveries on this mission?

So many surprises! For ice sheets, the highlights included a method for densifying OIB thickness measurements within difficult-to-sound deep channels that was self-consistent with satellite data. OIB radar data also helped reveal the extent of a shallow firn aquifer within the Greenland Ice Sheet, how vulnerable many Greenland and Antarctic outlet glaciers were to ocean-driven melting at their grounding zones, and the dominance of surface melting in driving retreat of Alaskan glaciers. For sea ice, OIB data revealed that snow thickness on Arctic sea ice had decreased significantly since data were gathered manually decades ago.

NASA’s P-3 aircraft on the tarmac at Thule Air Base, northwestern Greenland. Credit: NASA/Goddard/Michael Studinger

Although the OIB mission is now complete, how will the data continue to be of use?

While our campaigns are now over, a few more datasets are still trickling in, and most OIB datasets have arguably yet to be fully explored. All the data are archived at the National Snow and Ice Data Center, which is part of a set of NASA-sponsored archiving facilities that make the data available in perpetuity. OIB collected hundreds of terabytes of data, and it will take years if not decades to fully interpret it all. We expect the unexpected: New discoveries using OIB data will continue for years to come.

That expectation formed part of our motivation in writing this review article. We want to enable future scientists who begin exploring OIB datasets – even just a few years from now – to understand why OIB was designed and conducted the way it was, and what its direct participants considered to be its most essential outcomes, so that those future scientists can put what they discover into context.

—Joseph A. MacGregor (joseph.a.macgregor@nasa.gov; 0000-0002-5517-2235), Linette Nicole Boisvert (0000-0003-4778-4765), and Brooke Medley (0000-0002-9838-3665), NASA Goddard Space Flight Center, USA

Higher Education During the Pandemic: Truths and Takeaways

EOS - Fri, 06/25/2021 - 12:20

The changes to teaching and learning at colleges and universities that many of us thought would last a few weeks in the spring of 2020 have turned into more than a year’s worth of disruptions. For both students and instructors, these disruptions have interrupted, set back, and, in some cases, irrevocably altered personal and professional lives and relationships, and they have severely strained mental—if not also physical—health.

The forced adaptations have also exposed unresolved and problematic realities in academia that long predate the pandemic, leading to difficult discussions but also creating welcome space for fresh perspective and growth. We reflect here on some of the negative and positive outcomes we’ve seen over the past year, as informed by our own experiences with students and colleagues.

Displaced and Disrupted

Pandemic disruptions to teaching, learning, and life took many forms. When colleges and universities instituted social distancing measures and turned to remote instruction, undergraduate students were rushed off campus. Many returned to family homes, where they were isolated from friends and lost the autonomy they had been cultivating while living independently. Subjected once again to household rules set by parents or other guardians, these students were essentially infantilized during the pandemic. Yet their institutions expected them to be mature adults and to keep to academic schedules as if nothing had changed. Many of our students, as they relayed to us, felt untethered, overwhelmed, and unable to sift through endless directions and FAQ pages from the schools about rapidly evolving pandemic protocols.

Meanwhile, graduate students lost access to facilities integral to their research, from offices to laboratory analytical equipment to field sites. They were expected to teach and learn online skillfully and to adjust without complaint. Their support networks became frayed as colleagues and mentors dispersed from campus.

For some students, the burdens of these changes were especially acute. We both work at large public institutions, where up to one quarter of our students are the first in their family to attend college and up to 20% are international students. These students’ lives were suddenly and disproportionately upended by displacement, underresourcing, isolation, and, in some cases, repatriation.

At the same time students were doing their best to adjust to the new landscape of higher education, so too were faculty and instructors. Among other challenges, individuals had to adapt in-person course materials, teaching styles, and mentoring duties to fully remote environments on the fly. They had to reconfigure research programs to account for pandemic restrictions. And many faced the added complexity of maintaining professional responsibilities while simultaneously caring full-time for loved ones also displaced from their usual routines. We, like many of our colleagues, often reminded ourselves of the phrase “I am not working at home because of the pandemic; I’m at home due to the pandemic—trying to work.”

Remote teaching brought important changes to student-teacher relationships. Prior to the pandemic, we took for granted the simple joys of greeting students when they arrived at class, helping facilitate discussions around our course content, and getting to know students—their career aspirations, their challenges, and their interests. During the pandemic, we have still held classes and office hours, conducted research, and mentored students—but all virtually. Although Zoom and other such tools are amazing technological innovations that have enabled us to perform our work, they tend to dull the emotional and personal connections that face-to-face contact builds.

Emotional Tolls

Although we hope the vaccines developed to protect against COVID-19 will enable a full return to prepandemic life, the experiences we have shared with students and colleagues throughout the pandemic will remain with us, with some hanging as shadows over the coming years of social and economic recovery. Of course, these experiences have also been influenced strongly by events not directly related to the pandemic, such as the attack on the U.S. Capitol and the murders of George Floyd and others as well as the large-scale demonstrations in support of racial justice. Among other effects, these events have brought heightened attention to systemic racism and injustice in many institutions, including our own, and have added substantially to the emotional and physical stress of the pandemic for many in academia, particularly people of color.

Against this backdrop, the pandemic has forced academia to grapple with declining mental health among students and faculty, a trend that began well before 2020 [National Academies of Sciences, Engineering, and Medicine (NASEM), 2021a], particularly in science. Research has shown that in many cases, student learning and grades have improved during the pandemic, although these successes came with emotional costs. Prior to COVID-19, in spring 2019, three out of five college students reported experiencing extreme anxiety, and two out of five reported debilitating depression sometime in the preceding 12 months [American College Health Association, 2019]. In the past academic year, the trends in mental health have drastically worsened [NASEM, 2021a] as students have been deprived of the ability to engage with others; to participate in educational extracurricular activities and travel; and to pursue many professional and personal opportunities such as internships, fieldwork, and spring break. Moreover, many of our students have contracted COVID-19, including students in our research groups and in all of the courses we teach, and those who have not tested positive themselves have still had to deal with the virus affecting friends and family.

Faculty, who have long faced tremendous dysfunction in career expectations and work-life balance—especially for early-career faculty, women, and faculty of color—are also depressed, stressed, and burned out [NASEM, 2021b]. As of last fall, almost 9 out of 10 faculty surveyed agreed (33%) or strongly agreed (54%) that our jobs had become more difficult, 40% reported considering leaving the profession, and 48% of that number were early-career faculty. These stark figures partly reflect the emotional toll of taking on roles as informal, mostly untrained, and often poorly equipped mental health counselors to our students, which left faculty and students at risk.

Faculty members became critical elements in the support networks for many of our students, requiring us to share additional empathy and to develop new ways to connect, in virtual environments, with students suffering emotionally and physically. We also became critical conduits for sharing university-wide information, from academic schedule changes and new grading modalities to rent relief options in the community and plans for packing up dorms and apartments. This role required keeping up to date with frequently changing policies and information so we could share it clearly and quickly.

Make no mistake: This emotional work, called affective labor, is difficult—and it is labor indeed. Affective labor is the work associated with managing one’s own feelings when things are going to pieces around you—when others are upset, frightened, or angry. We put ourselves through secondary trauma in supporting our students, colleagues, family, and friends while trying to carry on ourselves and maintain our own well-being.

Because this care work has been borne mostly by women faculty and faculty of color, resulting impacts on careers—such as decreased research productivity and delayed promotions—will fall disproportionately on these groups and will affect academia for years to come.

Lights in the Tunnel

Despite the disruptions and added affective burdens placed on students and faculty throughout the pandemic, there have been some positive outcomes to emerge. We speak here not about the learned benefits of technology or of asynchronous learning and other pedagogical adaptations but, rather, of the emotional rewards we experienced during this time.

Students and faculty members bonded like teammates in spring 2020. We struggled with the technology needed for remote instruction, our students struggled with the technology, and we all learned it together. When pets, children, and spouses made visits to our home offices, our students loved seeing us get rattled, because it reminded them of our humanity. They started sharing their pets on screen, and everyone enjoyed getting to know more about one another. Students also shared the stresses of their family situations. We made as many adjustments as we could to help them get through each semester, including changing deadlines, amending or canceling assignments, and just simply listening to them.

Not surprisingly, the camaraderie waned as the pandemic progressed, and by the end of the spring 2021 semester, many students were exhausted from remote learning and the loss of the college environment. This transition only increased the emotional workload for faculty and led to increased frustration and fatigue.

Nonetheless, the door to richer teaching and learning experiences has been opened during the pandemic. Faculty have opportunities to embrace the role we play in helping students transition to adulthood and to recognize that course content is not the only currency of value to our students. It humanizes us, and our students, when we take the time to get to know them, to open up ourselves, and to admit to the stresses, emotions, and frustrations with which we struggle.

Faculty in science, technology, engineering, and mathematics (STEM) disciplines have historically taken a pass on doing this sort of emotional work. In our classrooms, we traffic in content—observations, calculations, and hypotheses—not in personal stories and cultural issues. We have often told ourselves, “Science doesn’t see color or gender,” “I couldn’t possibly deal with racism because I teach science,” and “Science is not driven by society,” although in each case there is much evidence to the contrary. If we have learned nothing else from the pandemic, we have seen that both we and our students value a more personal approach to instruction.

Outside the classroom as well, there are many things that faculty can do to help themselves and each other: foster, renew, and make new connections with mentors, advisers, colleagues, friends, and family; and develop or continue activities that provide a sense of community among instructors. At the University of Michigan, for example, faculty have set up monthly teaching circles—held virtually during the pandemic—at both departmental and college levels. Teaching has often been a lonely endeavor and not the topic of hallway discussions at research universities, so these regular opportunities to meet have enabled needed support networks and chances to learn and grow professionally. The AGU Education section also provides resources and a venue in which to find, connect with, and support other Earth and space science faculty, both professionally as colleagues and personally as friends.

The affective labor of connecting more deeply with students and colleagues takes time and energy—but it matters. It can make a big difference in helping STEM faculty and their students recover from the myriad disruptions of the pandemic and reshape what postsecondary teaching and learning look like.

Podcast: Standing Up for Science During an Epidemic

EOS - Thu, 06/24/2021 - 16:43



Before COVID, before the swine flu, there was the bird flu outbreak of the mid-2000s. An international group of scientists came together to combat the deadly virus, including Ilaria Capua, now director of the One Health Center of Excellence at the University of Florida (UF) in Gainesville. Little did she know that that experience would not be the most trying moment of her career.

In 2013, Capua was elected to national office in Italy, the only scientist running in that election. Her triumph would be short-lived, however, as she was charged in a criminal case in which prosecutors accused her of being the criminal mind behind illegal trafficking of viruses—of profiting off her profession. While the legal process dragged on, she was recruited by UF. A few weeks after moving to the United States, she was cleared of all charges.

In this episode of AGU’s podcast Third Pod from the Sun, we chatted with Capua about her work with viruses, overcoming a smear campaign, and the value of being surrounded by great peers and team members.

This episode was produced by Kelly McCarthy and Shane M. Hanlon and mixed by Kayla Surrey and Shane M. Hanlon.

—Shane M. Hanlon (@EcologyOfShane), Program Manager, Sharing Science, AGU

 

Episode Transcript

Shane Hanlon (00:01): Hi Nanci.

Nanci Bompey (00:02): Hi Shane.

Shane Hanlon (00:03): It’s good seeing you as always.

Nanci Bompey (00:04): Good seeing you via video again, yes.

Shane Hanlon (00:08): I know. I know. So, we’re still in a pandemic but do you mind me asking, are you vaccinated?

Nanci Bompey (00:16): No, fine to ask. Yes. Got my second vaccine last weekend, so by this coming weekend I’ll be good to go I guess.

Shane Hanlon (00:24): That’s very exciting, yeah.

Nanci Bompey (00:24): How about yourself?

Shane Hanlon (00:25): Yes. Yeah. We did a two shot one, my partner and I, and we… Actually, yesterday, as of this recording, was our two weeks. So, we’re officially out in the world. What’s yours? I want to ask you, what are you going to do? What’s on your list, the first thing you’re doing once you can?

Nanci Bompey (00:48): We were talking about that and I think it’s like you’re still kind of hesitant to do things even so.

Shane Hanlon (00:53): Sure.

Nanci Bompey (00:53): Which is kind of funny. But we are planning to go on a trip to see my mom.

Shane Hanlon (00:58): Oh, okay.

Nanci Bompey (00:58): Haven’t seen her in over a year or whatever it is. So, I’m going up in a couple of weeks to see her. So, that will be nice. We can actually go inside and have dinner and hang out.

Shane Hanlon (01:08): Yeah.

Nanci Bompey (01:08): So, yeah. What about you? Were you like, “We’re doing this when we’re good to go?”

Shane Hanlon (01:14): I want to go to the Drafthouse.

Nanci Bompey (01:17): To see a movie?

Shane Hanlon (01:18): Or something. Yeah. Yeah. So Nancy and I lived close to each other outside of DC and there’s this theater that’s a staple in our neighborhood and it’s like a dinner theater type thing. You can see movies or shows or whatever. And I haven’t been there in yeah, like a year and a half and they’re doing like small capacity and all of that, but yeah, I think we’re going to try to do something coming up soon. Try to make the best of it.

Nanci Bompey (01:38): Yeah, we talked about, you know, Dune is coming up in the fall and so Richard’s like the biggest Dune fan. So definitely going to the movies, which was a regular staple of ours, so yeah, same.

Shane Hanlon (01:51): Maybe we’ll make post pan… or at least for us, post-pandemic date of it.

Nanci Bompey (01:55): Yes.

Shane Hanlon (02:00): Welcome to the American Geophysical Union’s Podcast about the scientists and the methods behind the science. These are the stories you won’t read in the manuscript or hear in a lecture. I’m Shane Hanlon.

Nanci Bompey (02:09): And I’m Nanci Bompey.

Shane Hanlon (02:11): And this is Third Pod From the Sun.

Shane Hanlon (02:15): Okay, so we don’t need to regale everyone with our plans Nanci. We can take that, as the corporate folks say, offline. I’m disgusted with myself. I wish people could see the face I just made. But we are talking about the pandemic, not just because we’re in it, but because our story for today is about another type of I guess non-human pandemic. Another type of-

Nanci Bompey (02:41): Outbreak. It was a… Yeah, yeah yeah yeah. Disease outbreak, I guess.

Shane Hanlon (02:45): Right, and so it’s bird flu. So to bring us more on this, we want to bring in the producer for this episode, Kelly McCarthy.

Kelly McCarthy (02:53): Hi Shane. How are you? Hi Nanci.

Nanci Bompey (02:55): Hi Kelly.

Shane Hanlon (02:57): So yeah, why don’t you just let us know what we’re chatting about today.

Kelly McCarthy (03:00): Yeah. So at the European Geosciences Meeting in 2019, we sat down with a virologist turned member of the Italian parliament who’s going to talk about her science and kind of her path today.

Ilaria Capua (03:15): Hello. My name is Ilaria Capua. Ilaria is the Italian for Hillary. It helps people remember. I’m the Director of the One Health Center of Excellence at the University of Florida in Gainesville. My favorite virus and the viruses I would say that I spend most of my career working with are influenza viruses. I was very active during the bird flu crisis, which occurred around the mid 2000s, and actually bird flu is still a significant problem in many parts of the world.

News Anchor (03:57): China mobilizes resources to combat a new strain of bird flu after a third death is reported while fears spread wider and faster than the disease itself.

News Anchor (04:07): Two-thirds of the 400 people who’ve contracted bird flu have died.

Ilaria Capua (04:12): And thanks to European leadership in their research division, my group, which was based in Padova in Italy, became over the years one of the leading groups in influenza viruses that could jump from animals to humans. And we were very active.

Shane Hanlon (04:37): I love this idea. This is like the scientific ideal, this collaboration and people working together coming to solve this giant problem. This is exactly how it’s supposed to work, right?

Kelly McCarthy (04:49): Exactly how it’s supposed to work, except as with any well-intentioned plan, people can misinterpret things and there can be some unintended negative outcomes.

Ilaria Capua (05:00): As it happens in life, you get sometimes unexpected requests and in 2013, I was asked by the Prime Minister in office at the time to run for election in the national elections. The reason why I was asked was because at the time, Mario Monti recognized that there was a very significant need of people coming from different areas of society and who were successful in their field to join the political debate. And so I agreed to do it. I ran for election and I was elected.

Kelly McCarthy (05:48): Were there any other scientists who were running at that time?

Ilaria Capua (05:53): No, I was the only one who was running at that time and I was very flattered that I was elected and I was very, very motivated to do things around a more meritocratic approach to science, around improving the way that funding was allocated. Again, trying to do things from a more meritocratic point of view. And then I was working on topics of relevance to me and of my areas of expertise, so mainly on emerging infections.

Ilaria Capua (06:40): Suddenly I was phoned up by a journalist and I was informed that there was this criminal case and that I was believed to be the criminal mind behind an illegal traffic of viruses and that I was being… I was basically trying to make personal profits out of my scientific profession. And of course this wasn’t… I mean, this wasn’t true and the criminal court case ended two-and-a-half years after the information was leaked to the press with a verdict, which was that the facts never existed and therefore there was no case to answer, and actually that most of the facts that were narrated in the legal documents were non-existent or reality had been transformed.

Shane Hanlon (07:53): I can’t imagine, one being elected as a policymaker, that’s just-

Kelly McCarthy (07:57): The Prime Minister being like you should run for parliament

Nanci Bompey (08:00): And you’re the only scientist on this board on people.

Kelly McCarthy (08:04): Yeah, and you win.

Shane Hanlon (08:06): Yeah, and then during this process being accused of something that you didn’t do.

Kelly McCarthy (08:11): Right, I mean that whole situation is just wow.

Shane Hanlon (08:14): Yeah.

Nanci Bompey (08:15): Very frustrating.

Ilaria Capua (08:17): I had decided to run as a member of parliament, not because I wanted a political career, but because I wanted to do things for science. And so what I did was I was approached by the University of Florida who was looking for a director of their One Health Center. The University of Florida has recently developed to this preeminent recruitment campaign where they recruit scientists from different parts of the world and they were looking for someone with my experience and they were offering me a very interesting job. And so I decided to take it, although I had to say to them that I had this investigation which was pending on my head in Italy. I resigned as a member of parliament. I moved to Florida and after three weeks, the judge for preliminary investigation reviewed the papers and said that the facts were non-existent.

Ilaria Capua (09:23): And so three weeks after I got to the United States, I was completely cleared from all the accusations.

Nanci Bompey (09:34): Well, I mean that’s great. She went to the University of Florida with this stuff kind of hanging over her head, but they, not took a chance on her, but they knew that perhaps that the things weren’t true and they were confident in her.

Kelly McCarthy (09:46): I mean they reached out to her specifically because of her background and she talks a little bit about how grateful she is for the support from that team.

Ilaria Capua (09:53): I have to say that I have great gratitude to the University of Florida, to Jack Payne in particular, and Doug Archer, who are the people who wanted to recruit me. They did due diligence and what I found quite surprising was the fact that they did a few checks and they immediately figured out that it was all fake. And in fact, the investigation was so superficial that they mixed up the name of one virus with the name of another virus. And so it was clear that they didn’t really have a grasp of what was happening. And so for lay people, H7N1 and H7N3 are like similar viruses, but they’re not. They’re completely different viruses in our world.

Ilaria Capua (10:50): These things happen to scientists. Actually, there’s a prize which is awarded every year. It’s called the John Maddox Prize. And it’s in the name of the former editor of Nature and it’s about standing up for science and you would be surprised to see how many people are actually attacked or criminalized for doing their science.

Kelly McCarthy (11:22): How do you influence that world now? Do you feel a responsibility having had that experience yourself to continue to advocate for scientists around the world who might be experiencing this?

Ilaria Capua (11:35): So I know what it’s like. I know what it’s like to have your reputation literally ripped off from you and I think it’s one of the worst things that can happen to you. And that’s why I talk about it. I mean, it is important to share these experiences for how hard it can be, because it’s never easy to talk about this sort of, let’s say bumps in the road that you’ve had in your life. However, you also have to have the guts to behave as a senior scientist. I am a senior scientist and therefore I talk about the difficulties scientists can encounter. Because it is part of my job to inform younger scientists and mentor other faculty on issues like this.

Kelly McCarthy (12:33): So I had the opportunity to watch all of these young scientists come up to Ilaria after this talk she gave at the European Geosciences Union meeting, and it was really cool to see all these people from way outside her field just wanting to talk with her more and share their own stories. And she’s clearly an advocate and a mentor.

Nanci Bompey (12:51): Yeah, that’s great and it’s also interesting that like, obviously on this podcast and AGU, you think of oh we interview geoscientists, earth and space science, but it’s so broad. I guess the point is that all these science issues people have in common, but we also, geoscientists can help people learn about different… You know, it’s not just confined now anymore we realize to just studying one particular thing that has no effect on anything else. We have like stuff like geo health, how climate change is going to affect people’s health. It’s like a big emerging field and things like that.

Kelly McCarthy (13:28): Exactly. Yeah. And she actually shared a really good historical example about how that functions and working across disciplines.

Ilaria Capua (13:36): Let me give you an example. John Snow, who is not the guy of the Game of Thrones, but is the father of epidemiology, was an Englishman who discovered that cholera was transmitted through water. And he was the person who closed the water pump that was collecting water from the infected basin and overnight the deaths from cholera stopped. However, what is amusing is that John Snow, at his time, didn’t know that cholera was caused by a bacterium. I mean, they didn’t even have the tools to see what they were fighting.

Kelly McCarthy (14:24): Okay.

Ilaria Capua (14:26): But he had an intuition and the people who fixed the cholera problem were not scientists. They were the mayor, they were the police officers, they were a series of other people who were not involved in the medical profession and actually fixed the problem. And so where do I see us going? I see us going towards solutions that are not going to be driven only by the scientific community. They’re going to be driven by other people as well. And that’s why we need to engage. And this is something that I think scientists forget to tell their audiences, that we are scientists because we believe in a better world. And that is what motivates most of the scientists. And we should never forget this, regardless of what people out there say.

Ilaria Capua (15:43): So I think that scientists need to reposition themselves as how they are imagined by society. So I would like to launch a call to action to scientists in that okay some of us are nutters. Some of us are a little bit coo-coo. Some of us are nerds and geeks. But we are people who are motivated and are inspired by curiosity and about natural mechanisms of how things work. And so I think that we should actually, even if we haven’t achieved as much as we would’ve wanted, which happens in life, but we still have to be proud about being scientists.

Ilaria Capua (16:43): Of course not all scientists can be super scientists because that’s how distribution works. Some are good. Some are better. Some are super. But still, it’s the critical mass that makes a difference. It’s not the individual.

Nanci Bompey (17:09): So where are you falling on this distribution of scientists Shane? Are you good, better, or super? I’m going to go, you’re-

Shane Hanlon (17:17): I’m what?

Nanci Bompey (17:18): You’re good.

Shane Hanlon (17:19): That’s fine. I actually-

Nanci Bompey (17:21): I’m less than good considering I’m not a scientist anymore, but I know I shouldn’t put myself down like that.

Shane Hanlon (17:26): No, we’re always scientists, we’re just not always practicing.

Nanci Bompey (17:29): Yeah.

Shane Hanlon (17:29): That’s the distinction.

Nanci Bompey (17:30): Yes.

Shane Hanlon (17:30): Yes.

Nanci Bompey (17:31): But in all seriousness, I really like her thought here because it’s actually a lot of what… You know, in terms of the podcast, it’s this critical… she talks about this critical mass of science. Everyone has to be doing this stuff in order to move the science forward. And so you may not be the all-star science, but you are a little piece in this big scientific enterprise.

Shane Hanlon (17:47): Yeah. I mean, yeah, you don’t have to be a big name or whatever else, but there’s a reason we do science and it’s knowledge, right?

Nanci Bompey (17:53): Yeah.

Shane Hanlon (17:53): So who cares who-

Nanci Bompey (17:55): Like this podcast. We’re just a little piece of this podcast enterprise moving the needle forward.

Shane Hanlon (18:00): We are doing our part to advance science communication. All right. That’s all from Third Pod From the Sun.

Nanci Bompey (18:09): Thanks so much to Kelly for bringing us this story. And of course to Ilaria for sharing her work with us.

Shane Hanlon (18:14): This podcast was produced by Kelly and mixed by Kayla Surrey.

Nanci Bompey (18:19): We would love to hear your thoughts. Please rate and review us on Apple podcasts. You can listen to us wherever you get your podcasts and of course always at thirdpodfromthesun.com.

Shane Hanlon (18:28): Thanks all and we’ll see you next time.

Cutting to the Core

EOS - Thu, 06/24/2021 - 13:28


We’ll probably never get a real Jurassic Park—and that’s almost certainly for the best—but we are learning quite a bit about what it was like to live during at least the final period of the dinosaurs.

In China’s Songliao Basin, a research team on a drilling project called SK (initiated in 2006) has recovered 8,200 total meters of sediments spanning the entire Cretaceous. During one phase they drilled as deep as 7,018 meters. Their work will give us a thorough and fascinating look at terrestrial climate change during a time of rapid evolutionary turnover.

The heart of the SK team’s research—and the theme of Eos’s July issue—is the study of cores. After completing the drilling phase last February, the team has now turned to inspecting their core samples. Read more in “An Unbroken Record of Climate During the Age of Dinosaurs,” where Chengshan Wang and colleagues explain what they’ve discovered about “Earth’s most intense greenhouse state of the past 150 million years” and what it could tell us about what humans are in for as our climate continues to rapidly change.

Through sediment cores and ice cores, permafrost cores, and even tree rings, scientists have discovered myriad vehicles that allow us to look into the past. Collecting these time machines can be enormously expensive and time-consuming, and sometimes it is possible only through rare, if terrible, opportunities—such as the chance to collect 9-meter-diameter “cookies” after loggers felled a third of the giant sequoias in what is now Sequoia National Park in California, as Thomas Swetnam explains in our feature story.

Given the investment in collecting them, what do researchers do with all these cores once they’ve completed their initial studies? They put them in core libraries, of course, for the benefit of future research. And much like our traditional community libraries, core libraries need support and funding to make sure they survive. In the feature linked above, we look at how several collection caretakers are “future-proofing” these records, sometimes in dramatic scenarios, such as when Tyler Jones rushed to protect a freezer of ice cores at the Institute of Arctic and Alpine Research, or INSTAAR, in Boulder, Colo., in 2013.

Finally, even the best-protected library can be challenging to use if there is no indexing system. Nikita Kaushal and colleagues write about their modern-day Dewey Decimal System for speleothems. Their clever standardization and categorization are already the basis of many papers by researchers who now have richer access to these paleoclimate cave specimens.

We finish off our look at core research with another delightful crossword puzzle from Russ Colson in the print issue. We hope you can find time to take a break, center yourself, and dig right into our core clues.

—Heather Goss (@heathermg), Editor in Chief

Renato Funiciello, an Inspiration to Modern Geology in Italy

EOS - Thu, 06/24/2021 - 13:27

The Tethyan belt is a zone of tectonic activity and mountain ranges stretching from northwestern Africa and western Europe across Turkey, the Caucasus, and Iran to the southwest Pacific Ocean. It is the longest continuous orogenic belt on Earth. The Mediterranean section is of particular interest to geoscientists because of its ongoing tectonic activity.

A special collection published in Tectonics, “Geodynamics, Crustal and Lithospheric Tectonics, and Active Deformation in the Mediterranean Regions,” presents new and updated research on this section of the Tethyan belt. More than two dozen research papers explore subduction and mantle convection, volcanism and fluid circulation, structural geology and active tectonics, dynamic topography, and geomorphology. The volume is dedicated to Professor Renato Funiciello (1939–2009), who helped develop modern geological studies in Italy.

Geological science has the powerful ability to bridge gaps between different countries and cultures. Renato Funiciello played a decisive role in creating these bridges.

Born in Libya, he was a true Mediterranean geologist, enthusiastic about both the sea and the science of rocks.

Renato’s life and work fit into a long-standing tradition of cross-Mediterranean travel and activity: from the Roman emperor Septimius Severus, a North African from Leptis Magna (a Roman city in Libya) who made his later career in Rome and expanded the Roman Empire to its greatest extent to date, to Enrico Mattei, the post-war Italian industrialist who led Agip (the Italian national oil and gas company), negotiated agreements about the extraction of oil from Tunisia and Morocco, and after whom the trans-Mediterranean gas pipeline between Algeria and Italy is now named.

In the fall of 1980, two severe and deadly earthquakes struck the Mediterranean region. The El Asnam event occurred in Algeria in October: a magnitude 7.1 quake followed by a magnitude 6.2 aftershock, the largest in the Tell Atlas ranges for almost two centuries. The Irpinia earthquake occurred in Italy in November: a magnitude 6.9 quake, with three main shocks and as many as 90 aftershocks. These two events left thousands of people dead, injured, and displaced. For Renato, then a newly appointed lecturer in structural geology at the University of Rome, it was a unique opportunity to explore the significance of earthquake surface faulting. In the following period, he stimulated many lively discussions about it during seminars and meetings all over Europe.

Renato generated a new spirit of research, setting out the fundamentals of crustal and lithospheric deformation in structural geology in collaboration with geophysicists. One could say that modern geology was born in Italy amongst a new group of young (at the time) and dashing geologists who now lead the fields of structural and earthquake geology. In fact, several contributions to this collection are from his former students.

Renato Funiciello, Professor of Geology at Università degli Studi Roma Tre (1993–2009). Photo courtesy of Francesca Funiciello

Renato had a large range of interests and contributed to many subfields of geology, ranging from lunar and planetary sciences, and seismic tomography to spatial geodesy, geo-archeology, and urban geology.

He also embraced the use of modern technologies in Earth sciences and applied them to the study of the Adria microplate (see Kiraly et al., 2018), magmatism in southwest Turkey (see Asti et al., 2019), seismic damage in Rome’s Colosseum area, recent volcanic activity at Albano Lake near Castelgandolfo (the Pope residence), and a geological tour of the Seven Hills of Rome.

His professional contributions to our field were far reaching. He was an author and co-author of more than 100 published scientific articles and was particularly committed to raising public awareness of geoscience and geo-risks.

In addition, he served as PI of a NASA project on lunar geology, President of the Scientific Council of the Italian National Research Council (CNR), Director of the Institute for Technologies Applied to Cultural Heritage (ITABC), Director of the International Institute of Geothermal Research (IIRG), and Vice President of the National Institute of Geophysics and Volcanology (INGV Rome).

This collection is a modest dedication to one of the greatest geologists and a dear colleague whom we remember with great fondness.

—Mustapha Meghraoui (m.meghraoui@unistra.fr;  0000-0002-3479-465X), Institut Terre & Environnement de Strasbourg, France

Cores 3.0: Future-Proofing Earth Sciences’ Historical Records

EOS - Thu, 06/24/2021 - 13:27


In September 2013, a major storm dumped a year’s worth of rain on the city of Boulder, Colo., in just 2 days. Walls of water rushed down the mountainsides into Boulder Creek, causing it to burst its banks and flood nearby streets and buildings.

Instead of trying to escape the flood, Tyler Jones, a biogeochemist at the Institute of Arctic and Alpine Research (INSTAAR) in Boulder, drove directly toward it. His motive? Mere meters from the overflowing creek, a large freezer housed the lab’s collection of precious ice cores.

“We didn’t know if the energy was going to fail in the basement,” Jones said. “So I am scrambling around with a headlamp on, less than a hundred yards from a major flood event, trying to figure out what is going on.”

The INSTAAR scientists were lucky that year, as their collection survived unscathed. But devastating core losses have happened in the past decade. In a 2017 freezer malfunction at the University of Alberta in Edmonton, Canada, part of the world’s largest collection of ice cores from the Canadian Arctic was reduced to puddles. “Thinking of those kinds of instances makes me lose sleep at night,” said Lindsay Powers, technical director of the National Science Foundation Ice Core Facility in Denver.

Collections of cores—including ice cores, tree ring cores, lake sediment cores, and permafrost cores—represent the work of generations of scientists and sometimes investments of millions of dollars in infrastructure and field research. They hold vast quantities of data about the planet’s history ranging from changes in climate and air quality to the incidence of fires and solar flares. “These materials cover anywhere from decades to centuries and even up to millions of years,” said Anders Noren, director of the Facilities for Continental Scientific Drilling and Coring in Minneapolis, which includes a library of core samples. “It’s a natural archive and legacy that we all share and can tap into—it’s a big deal.”

Historically, some individual scientists or groups have amassed core collections, and on occasion, centralized libraries of cores have emerged to house samples. But irrespective of the types of cores stored or their size, these collections have faced a series of growing pains. Consequently, facilities have had to adapt and evolve to keep pace and ensure that their collections are available for equitable scientific research.

“We spend a lot of time in science thinking about open access when it comes to data,” said Merritt Turetsky, director of INSTAAR. Scientists should be having similar conversations about open access to valuable core samples, she said. “It is important to make science fair.”

Cores and Cookies

After 30 years of collecting wood samples for his research, astronomer Andrew Ellicott Douglass founded the Laboratory of Tree-Ring Research (LTRR) in 1937. With its creation at the University of Arizona in Tucson, Douglass formalized the world’s first tree ring library. Its development in the years since is a paradigm for the way core libraries are subject to both luck and strategy.

Dendrochronologists use tools to extract cores from trees to date structures and reconstruct past events such as fire regimes, volcanic activity, and hydrologic cycles. In addition to these narrow cores, they can also saw across tree stumps to get a full cross section of the trunk, called a cookie.
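Dating typically relies on crossdating: matching the pattern of wide and narrow rings in an undated sample against a dated master chronology. The sketch below illustrates the idea with synthetic data; it is a toy correlation search, not LTRR’s actual workflow.

```python
# Toy crossdating sketch (illustrative only): slide an undated ring-width
# series along a dated master chronology and find the offset where the
# two series correlate best.
import numpy as np

def best_match_year(master: np.ndarray, master_start_year: int,
                    sample: np.ndarray) -> tuple[int, float]:
    """Return (calendar year of the sample's first ring, correlation)."""
    best = (master_start_year, -1.0)
    for offset in range(len(master) - len(sample) + 1):
        window = master[offset:offset + len(sample)]
        r = np.corrcoef(window, sample)[0, 1]
        if r > best[1]:
            best = (master_start_year + offset, r)
    return best

# Hypothetical data: a 300-year master chronology and a noisy 50-ring core
# whose true first ring formed in 1880.
rng = np.random.default_rng(1)
master = rng.normal(size=300)                          # master starts in 1700
sample = master[180:230] + 0.3 * rng.normal(size=50)   # 1700 + 180 = 1880
year, r = best_match_year(master, 1700, sample)
print(f"first ring dated to {year} (r = {r:.2f})")     # expect 1880
```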

At the Laboratory of Tree-Ring Research in Tucson, Ariz., curators are cataloging more than a century’s worth of wood samples. Credit: Peter Brewer

Douglass originally collected cores and cookies to study the cycle of sunspots, as astronomers had observed that the number of these patches on the Sun increased and decreased periodically. The number of sunspots directly affects the brightness of the Sun and, in turn, how much plants and trees grow. By looking at the thickness of the tree rings, Douglass hoped to deduce the number of sunspots in a given year and how that number changed over the years. Douglass also went on to date archaeological samples from the U.S. Southwest using his tree ring techniques. On the way, he amassed an impressive volume of wood.

Douglass’s successors at LTRR were equally fervent in their collection. Thomas Swetnam, the director of LTRR between 2000 and 2014, estimated that his collection of cores and cookies gathered in a single decade occupied about 100 cubic meters.

Around the turn of the 20th century, loggers felled a third of the giant sequoias in what is now Sequoia National Park in California. The only upside to the environmental tragedy was that it afforded researchers like Swetnam, who studies past fire regimes, the opportunity to collect cookies. “We were able to go with very large chainsaws and cut slabs of wood out of these sequoia stumps, some of them 30 feet [9 meters] in diameter,” Swetnam said. “Then we would rent a 30-foot U-Haul truck, fill it up, and bring it back to the lab.”

Tree trunks, cores, and cookies are stored in a humidity-controlled environment at the Laboratory of Tree-Ring Research in Tucson, Ariz. Credit: Peter Brewer

The laboratory’s collection catalogs about 10,000 years of history, Swetnam said. It also amounts to a big space issue. “We’re talking about probably on the order of a million samples, maybe more,” Swetnam said. “We’re not even sure exactly what the total count is.”

The tree ring samples had been temporarily stored under the bleachers of Arizona Stadium in Tucson for nearly 70 years, but with generous funding from a private donor, a new structure was built to house the laboratory and its collection in 2013. The building, shaped like a giant tree house, solved the space issue, and in 2017 the lab received further funding to hire its first curator, who was charged with the gigantic task of organizing more than a hundred years of samples.

“It is a very long-term endeavor,” said Peter Brewer, the LTRR curator who now works with a 20-person team on the collection. Brewer set about standardizing the labeling for the samples and is the co-lead on an international effort to produce a universal standard for dendrochronological data. With this in place, LTRR will soon launch a public portal for its collections, where scientists can log on and request a sample loan. This portal will make the collection more accessible to researchers around the world.

Ice Issues

In the early 1900s, around the same time that Douglass was collecting his first wood samples, James E. Church devised a tool to sample ice cores 9 meters below the ground. By the 1950s, scientists were able to extract cores from depths of more than 400 meters in the Greenland Ice Sheet. In the decades since, scientists have drilled deeper and deeper to extract and collect ice cores from glaciers around the world.

Ice cores can reveal a slew of information, including data about past climate change and global atmospheric chemistry. “We’ve learned so much already about environmental challenges from ice cores, and we think that there is so much more to learn,” said Patrick Ginot of the Institute of Research for Development at the Institute of Environmental Geosciences in Grenoble, France.

Some labs, such as INSTAAR, maintain their own collections, but space can quickly become an issue, and there’s constant concern about keeping the samples frozen and safe. Taking into consideration the massive effort involved in securing a single ice core, each sample is akin to an irreplaceable work of art. “Recovering ice from 2 miles [3.2 kilometers] beneath an ice sheet in extreme cold environments is a massive challenge,” Jones said. “You can’t just go back and repeat that…. It’s a one-time deal.”

The National Ice Core Lab in Denver houses many ice cores collected by scientists on National Science Foundation–funded projects. The goal is to provide a fail-safe storage environment and open access to researchers wishing to use the samples. Denver’s altitude and low humidity make running the freezers more efficient, and a rolling rack system in a new freezer will increase storage capacity by nearly a third. The facility also has backups galore: “We have redundancy on everything, and everything is alarmed,” Powers said.

The carbon footprint of running giant freezers at −36°C is high, but the lab is in the process of installing a new freezer that uses carbon dioxide refrigeration, the most environmentally friendly refrigeration system on the market. “We are at work here promoting climate research, so we want to be using the best technology possible to have the lowest impact on our environment,” Powers said.

Science Without Borders

The ice core community has adapted to various challenges that come with sustaining their libraries and working toward making the samples available on an open-access basis. But other parts of the cryosphere community are still catching up, Turetsky said.

Turetsky collects hundreds of northern soil and permafrost cores each year with her INSTAAR team, and scores of other permafrost researchers are amassing equal numbers of cores from across the United States and Canada on a yearly basis. The U.S. permafrost community has more samples than the U.S. ice core community—but still doesn’t have a centralized library.

Turetsky said she is looking to learn from the ice core community while recognizing that the challenges are different for permafrost researchers. Because it is easier and less expensive to collect samples, the community hasn’t needed to join forces and pool resources in the same way the ice core community has, leading to a more distributed endeavor.

Turetsky’s vision is to establish a resource for storing permafrost samples that anyone can tap into, as well as for the U.S. permafrost community to come together to develop guiding principles for the data collected. The University of Alberta’s Permafrost Archives Science Laboratory, headed by Duane Froese, is a great example of a multiuser permafrost archive, Turetsky said. Ultimately, the community may need to think about a regional hub with international connections to propel scientific inquiry.

“We can’t do our best science siloed by national borders,” Turetsky said. “I would love to see sharing of permafrost samples or information be a type of international science diplomacy.”

A Race Against Time

The need for the cryosphere community (encompassing both ice core and permafrost researchers) to come together and collect data in such a way that they can be shared and used in the future has never been greater, Turetsky said. The Arctic is warming faster than anywhere else on the planet, and simultaneously, warming sea ice, ice sheets, and permafrost have great potential to influence Earth’s future climate. “So not only are [ice and permafrost environments] the most vulnerable to change, they also will change and dictate our climate future,” Turetsky said.

In the worst-case scenario, the Arctic may lose all sea ice or permafrost, and scientists will lose the ability to collect core samples. “So it is a race against time to get cores, to learn, and to communicate to the public how dire the situation is,” Turetsky said.

Tree ring researchers are facing their own race against time, Swetnam said. As wildfires rage across the United States, scientists are trying to collect as much as possible from older trees before they are claimed by flames. “The history that’s contained in the rings is not renewable,” Swetnam said. “It’s there, and if it’s lost, it’s lost.”

That scientists may lose the ability to collect some samples makes maintaining core libraries and sharing their resources all the more important, Brewer said. “A good chunk of what we have no longer exists in the forests. All that is left are the representative pieces of wood that are in our archives.”

A Futuristic Vision

Recognizing threats posed by climate change, one group of cryosphere scientists has set out to create a visionary ice core library for future generations. Instead of housing core samples from around the world in one country, the group plans to store them in Antarctica, a continent dedicated to science and peace; the 1959 Antarctic Treaty specifies that “scientific observations and results from Antarctica shall be exchanged and made freely available.”

Ice cores stored in the temporary core storage in the underground ice cave constructed by the East Greenland Ice-Core Project. Credit: Tyler R. Jones/INSTAAR

And the ice cores won’t be stored in a building. They’ll be buried deep in the largest natural freezer of them all: the Antarctic Ice Sheet. This core library will act as a heritage data set, a legacy for future generations of scientists from all over the world. Researchers can access the cores in the interim, especially those taken from glaciers that no longer exist, and the Ice Memory project’s organizers are currently addressing how to grant access to the cores in a way that is equitable, as travel to Antarctica is cost prohibitive for many researchers.

The first stage of the project has focused on how to store the cores in the ice sheet. The plan is to store them about 10 meters deep, where the temperature is a stable −50°C throughout the year. “Even if there are a few degrees of warming in the next decades or centuries, it will still be kept at minus 50° or 45°,” said Ginot, one of the coordinators of the Ice Memory project.

Researchers from the French and Italian polar institutes have already trialed the best storage techniques on Dome Concordia in Antarctica. They dug 8-meter-deep, 100-meter-long trenches and inserted giant sausage-shaped balloons on the ice floors. Then they used the dug-out snow to cover the balloons and allowed the snow to harden. “When they disassembled the sausage, they had a cave under the snow,” Ginot said.

Constructing giant trenches at Dome Concordia in Antarctica. Digging these trenches was the first step in trialing how to store ice cores in underground caves. Credit: Armand Patoir, French Polar Institute IPEV

The project’s models forecast that the cavities will last for 20–30 years, at which time the scientists will create more caves at a minimal cost, Ginot said. The current focus of the team is to collect samples from glaciers that are quickly disappearing, such as the northern ice field near the summit of Mount Kilimanjaro in Tanzania.

Recognizing the Value

Core libraries provide a vital window into events that happened before human records began, a repository for data to better understand Earth systems, and resources to help forecast future scenarios. Researchers believe that as science and technology evolve, they’ll be able to extract even more information from core collections. “We recognize that this is a library of information, and we’ve just read some of the pages of some of the books,” Swetnam said. “But as long as the books are still there, we can go back and interrogate them.”

While the libraries for ice, tree ring, and sediment cores are maintained, scientists are able to access the “books” for further analysis whenever they want.

“We see all kinds of cases where a new analytical technique becomes available, and people can ask new questions of these materials without having to go and collect them in the field,” Noren said. New analytical techniques have led to more accurate reconstruction of past temperatures from lake core sediments, for example, and by integrating several core data sets, scientists have revealed that humans began accelerating soil erosion 4,000 years ago.

The multifaceted value of the core collections has become even more pronounced during the COVID-19 pandemic, Noren said. Core libraries have allowed scientists to continue moving forward with their research even when they can’t do fieldwork. As recently as March 2021, for example, scientists published research on the multimillion-year-old record of Greenland vegetation and glacial history that was based on existing cores, not those collected by the scientists’ field research.

Although some libraries struggle with space constraints, maintaining suitable environmental conditions, cataloging samples, or ensuring open access, every scientist or curator of a core collection shares one concern: sustaining funding.

It costs money to run a core library: money to house samples, money to employ curators, and money to build systems that allow equal and fair access to data. Securing that financial support is a challenge. “Funding priority is about exciting research or a new instrument,” Brewer said. “Updating or maintaining a collection of scientific samples is not such an easy sell.”

Core libraries represent millions of years of history and hold keys to understanding and protecting Earth’s future. They are natural archives of ice-covered continents, forested lands, and ancient cultures. As such, they are a legacy to be preserved and protected for future generations, Noren said. “But if you view it from another lens, they are just storage,” he explained. “So we need to elevate that conversation and make it clear that these materials are essential for science.”

Author Information

Jane Palmer (@JanePalmerComms), Science Writer

How Hospitals Can Respond to Wildfires

EOS - Wed, 06/23/2021 - 13:48

This is an authorized translation of an Eos article.

Wildfires are becoming more severe and more frequent. Plumes of smoke more dangerous than urban pollution spread across entire continents, carrying inhalable particulate matter that causes smoke-related deaths and worsens a range of medical conditions far from the actual burn sites.

In a new study, Sorensen et al. compared concentrations of inhalable smoke particles with admissions to local hospital intensive care units (ICUs) by zip code. They found a small but measurable rise in ICU admissions 5 days after smoke particle levels increased in an area.

The researchers then modeled a severe, weeklong smoke scenario. In that case, ICU admissions were projected to increase by 131%, enough to exceed ICU capacity, especially at smaller hospitals with fewer resources.

Because they care for critically ill patients whose lives are at risk, ICUs are especially resource intensive and must keep the equipment needed to monitor and support each patient’s fragile organ systems on hand at all times. ICUs typically maintain one-to-one nurse-to-patient ratios, for example, and the urgency of care means hospitals must have rapid transport to and from their facilities. A surge in ICU admissions driven by smoke particles could therefore pull resources away from other hospitalized patients. When resources are spread too thin, hospitals cannot meet patients’ needs, and care may suffer.

From hospital records, the authors found that young asthma patients, who respond quickly to smoke pollution, entered ICUs immediately after smoke exposure, whereas older patients with cardiovascular disease tended to be admitted after a delay. Prolonged, severe smoke is most dangerous for children because their lungs are affected soon after exposure, meaning hospitals may not have enough time to secure additional resources. Moreover, because there are fewer pediatric ICUs, children from across a large region are often sent to the same medical center.

Scientists predict that climate change will bring more frequent and more intense wildfires. Fortunately, current systems can forecast inhalable smoke emissions 2 days in advance at fine spatial resolution. With the proper systems in place, hospitals could use this information to allocate resources more effectively and warn areas far from a fire that may not be aware of the risk. (GeoHealth, https://doi.org/10.1029/2021GH000385, 2021)

—Elizabeth Thompson, Science Writer

This translation was made by Wiley.

The Possible Evolution of an Exoplanet’s Atmosphere

EOS - Wed, 06/23/2021 - 13:47

Researchers have long been curious about how atmospheres on rocky exoplanets might evolve. The evolution of our own atmosphere is one model: Earth’s primordial atmosphere was rich in hydrogen and helium, but our planet’s gravitational grip was too weak to prevent these lightest of elements from escaping into space. Researchers want to know whether the atmospheres on Earth-like exoplanets experience a similar evolution.

By analyzing spectroscopic data taken by the Hubble Space Telescope, Mark Swain and his team were able to describe one scenario for atmospheric evolution on Gliese 1132 b (GJ 1132 b), a rocky exoplanet similar in size and density to Earth. In a new study published in the Astronomical Journal, Swain and his colleagues suggest that GJ 1132 b has restored its hydrogen-rich atmosphere after having lost it early in the exoplanet’s history.

“Small terrestrial planets, where we might find life outside of our solar system, are profoundly impacted by atmosphere loss,” said Swain, a research scientist at the NASA Jet Propulsion Laboratory (JPL) in Pasadena, Calif. “We have no idea how common atmospheric restoration is, but it is going to be important in the long-term study of potential habitable worlds.”

The Atmosphere Conundrum

GJ 1132 b closely orbits the red dwarf Gliese 1132, about 40 light-years away from Earth in the constellation Vela. Using Hubble’s Wide Field Camera 3, Swain and his team gathered transmission spectrum data as the planet transited in front of the star four times. They checked for the presence of an atmosphere with a tool called Exoplanet Calibration Bayesian Unified Retrieval Pipeline (EXCALIBUR). To their surprise, they detected an atmosphere on GJ 1132 b—one with a remarkable composition.

“Atmosphere can come back, but we were not expecting to find the second atmosphere rich in hydrogen,” said Raissa Estrela, a postdoctoral fellow at JPL and a contributing author on the paper. “We expected a heavier atmosphere, like the nitrogen-rich one on Earth.”

To explain the presence of hydrogen in the atmosphere, researchers considered the evolution of the exoplanet’s surface, including possible volcanic activity. Like early Earth, GJ 1132 b was likely initially covered by magma. As such planets age and cool, denser substances sink down to the core and mantle and lighter substances solidify as crust and create a rocky surface.

Swain and his team proposed that a portion of GJ 1132 b’s primordial atmosphere, rather than being lost to space, was absorbed by its magmatic sea before the exoplanet’s interior differentiated. As the planet aged, its thin crust would have acted as a cap on the hydrogen-infused mantle below. If tidal heating prevented the mantle from crystallizing, the trapped hydrogen would escape slowly through the crust and continually resupply the emerging atmosphere.

“This may be the first paper that explores an observational connection between the atmosphere of a rocky exoplanet and some of the [contributing] geologic processes,” said Swain. “We were able to make a statement that there is outgassing [that has been] more or less ongoing because the atmosphere is not sustainable. It requires replenishment.”

The Hydrogen Controversy

Not everyone agrees.

“I find the idea of a hydrogen-dominated atmosphere to be a really implausible story,” said Raymond Pierrehumbert, Halley Professor of Physics at the University of Oxford in the United Kingdom, who did not contribute to the study.

Pierrehumbert pointed to a preprint article from a team of scientists led by Lorenzo V. Mugnai, a Ph.D. student in astrophysics at Sapienza University of Rome. Mugnai’s team examined the same data from GJ 1132 b as Swain’s did, but did not identify a hydrogen-rich atmosphere.

According to Pierrehumbert, the devil is in the details of how the data were analyzed. Most notably, Mugnai’s team used different software (Iraclis) to analyze the Hubble transit data. Later, Mugnai and his group repeated their analysis using another set of tools (Calibration of Transit Spectroscopy Using Causal Data, or CASCADe) when they saw how profoundly different their findings were.

“We used two different software programs to analyze the space telescope data,” said Mugnai. “Both of them lead us to the same answer; it’s different from the one found in [Swain’s] work.”

Another preprint article, by a team led by University of Colorado graduate student Jessica Libby-Roberts, supported Mugnai’s findings. That study, which also used the Iraclis pipeline, ruled out the presence of a cloud-free, hydrogen- or helium-dominated atmosphere on GJ 1132 b. The analysis did not negate an atmosphere on the planet, just one detectable by Hubble (i.e., hydrogen-rich). This group proposed a secondary atmosphere with a high metallicity (similar to Venus), an oxygen-dominated atmosphere, or perhaps no atmosphere at all.

Constructive Conflict

The research groups led by Swain and Mugnai have engaged in constructive conversations to identify the reason for the differences, specifically why the EXCALIBUR, Iraclis, and CASCADe software pipelines are producing such different results.

“We are very proud and happy of this collaboration,” said Mugnai. “It’s proof of how different results can be used to learn more from each other and help the growth of [the entire] scientific community.”

“I think both [of our] teams are really motivated by a desire to understand what’s going on,” said Swain.

The Telescope of the Future

According to Pierrehumbert, the James Webb Space Telescope (JWST) may offer a solution to this quandary. JWST will allow for the detection of atmospheres with higher molecular weights, like the nitrogen-dominated atmosphere on Earth. If GJ 1132 b lacks an atmosphere, JWST’s infrared capabilities may even allow scientists to observe the planet’s surface. “If there are magma pools or volcanism going on, those areas will be hotter,” Swain explained in a statement. “That will generate more emission, and so they’ll be looking potentially at the actual geologic activity—which is exciting!”

GJ 1132 b is slated for two observational passes when JWST comes online. Kevin Stevenson, a staff astronomer at Johns Hopkins Applied Physics Laboratory, and Jacob Lustig-Yaeger, a postdoctoral fellow there, will lead the teams.

“Every rocky exoplanet is a world of possibilities,” said Lustig-Yaeger. “JWST is expected to provide the first opportunity to search for signs of habitability and biosignatures in the atmospheres of potentially habitable exoplanets. We are on the brink of beginning to answer [many of] these questions.”

—Stacy Kish (@StacyWKish), Science Writer

Better Subseasonal-to-Seasonal Forecasts for Water Management

EOS - Wed, 06/23/2021 - 13:45

California experiences the largest year-to-year swings in wintertime precipitation (relative to its average conditions) in the United States, along with considerable swings within a given water year (1 October to 30 September). For example, 1977 was one of the driest years on record, whereas 1978 was one of the wettest. In December 2012, California was on pace for its wettest year on record, but starting in January 2013, the next 14 months were drier than any period of the entire 100-year observational record.

The considerable variability of precipitation within given water years and from year to year poses a major challenge to providing skillful long-range precipitation forecasts. This challenge, coupled with precipitation extremes at both ends of the spectrum—extremes that are projected to increase across the state through the 21st century as a result of climate change—greatly complicates smart management of water resources, upon which tens of millions of residents rely.

The predictive skill of long-range precipitation forecasts in this region has historically been weak, meaning scientists have not been able to aid state and local water managers with reliable forecasts of precipitation and drought for lead times longer than a week or two. The marginal success that forecasters have had to date in predicting winter season rainfall deviations, or anomalies, in California has been tied to the state of the El Niño–Southern Oscillation (ENSO). Yet ENSO explains only a fraction of the historical year-to-year variation in precipitation over California [e.g., DeFlorio et al., 2013], and many predictability studies have been limited by insufficient observational data that only recorded several large ENSO events. Further limitations have been imposed by climate models that did not have accurate enough representations of ENSO and its associated impacts on California’s weather and climate.
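To make the idea of ENSO explaining only a fraction of the variance concrete, the minimal sketch below computes the variance explained by a linear ENSO–precipitation relationship as a squared Pearson correlation. The index and anomaly values are invented placeholders, not data from DeFlorio et al. [2013] or any other study.

```python
import numpy as np

# Hypothetical wintertime values for 10 water years (invented for illustration):
# a Nino 3.4 index and California precipitation anomalies (percent of normal).
nino34 = np.array([-1.2, 0.3, 1.8, -0.5, 0.1, 2.2, -1.6, 0.4, -0.2, 1.0])
precip_anom = np.array([-15.0, 5.0, 40.0, -30.0, 10.0, 55.0, -20.0, -5.0, 8.0, 12.0])

# Pearson correlation between the index and the anomalies.
r = np.corrcoef(nino34, precip_anom)[0, 1]

# r**2 is the fraction of year-to-year variance a linear ENSO fit explains;
# the remainder is variability that ENSO alone cannot predict.
print(f"r = {r:.2f}, variance explained = {r**2:.0%}")
```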

These weaknesses have hindered long-range planning and sometimes resulted in reactive or less-than-optimal management decisions. Now, however, California and other states stand to benefit in many ways from emerging research methods that have the potential to improve the skill of subseasonal (2- to 6-week) to seasonal (2- to 6-month) precipitation forecasts. Such forecasts could help, for example, in managing state water supplies during winters with periods of prolonged drought. Long-lasting drought conditions present unique challenges, such as the necessity for drought response activation at the state level.

A cow stands near a dry watering hole on a California ranch during drought conditions in 2014. Improved subseasonal-to-seasonal weather forecasts could benefit agriculture and ranching, among other sectors. Credit: U.S. Department of Agriculture photo by Cynthia Mendoza, CC BY 2.0

Responding to the substantial demand from end users, including water managers, the international research community has been increasingly focused in recent years on improving forecast skill and quantifying forecast uncertainty on subseasonal-to-seasonal (S2S) timescales [National Academies of Sciences, Engineering, and Medicine, 2016; Vitart et al., 2017]. Several collaborative efforts within the applied research community have detailed the potential value of S2S forecasts to a variety of end users, including (but not limited to) water resource management. Additional end user sectors that stand to benefit from improved S2S forecasts include agriculture, insurance and reinsurance, and commodities trading [Mariotti et al., 2020; Merryfield et al., 2020].

Stakeholder Needs Drive Investments in S2S Forecasting

Worldwide, the focus on S2S forecasting is steadily increasing. This impetus is represented in the World Meteorological Organization’s World Weather Research Programme (WWRP) and the S2S Prediction Project under the World Climate Research Programme (WCRP). Nationally, the U.S. Weather Research and Forecasting Innovation Act of 2017 (Public Law 115-25) mandated that NOAA improve S2S forecasts to benefit society.

Accordingly, NOAA’s Modeling, Analysis, Predictions and Projections (MAPP) program has led the development of the Subseasonal Experiment (SubX) over the past several years. This effort aims to improve subseasonal prediction of precipitation and other climate variables and to provide a public data set for the research community to explore in predictability studies [Pegion et al., 2019].

Separately, since 2017, the California Department of Water Resources (CDWR) has funded a partnership to improve S2S prediction of precipitation over the western United States, with a particular focus on California. This partnership includes the Center for Western Weather and Water Extremes (CW3E), the NASA Jet Propulsion Laboratory (JPL), and other institutional collaborators. CDWR’s motivation is largely to support drought preparedness—as long ago as California’s 1976–1977 drought, state water managers recognized that the skill of available operational seasonal precipitation forecasts was insufficient for decisionmaking.

The objective of the CW3E-JPL partnership is to provide water resource managers in the western United States with new experimental tools for S2S precipitation forecasting. One such tool, for example, addresses atmospheric rivers [Ralph et al., 2018], or ARs (e.g., the Pineapple Express, one “flavor” of AR), and ridging events (elongated areas of high atmospheric pressure) [Gibson et al., 2020a], both of which strongly affect wintertime precipitation over the western United States [e.g., Guan et al., 2013; Ralph et al., 2019].

The efforts of the CW3E-JPL team are also a part of the S2S Prediction Project’s Real-Time Pilot Initiative. This initiative includes 16 international research groups, each of which is using real-time forecast data from particular modeling centers, along with the S2S Prediction Project’s hindcast database for applied research efforts with a specific end user. Examples of end users participating in this project include the Kenya National Drought Management Authority, the Italian Civil Protection Department, and the Agriculture and Food Stakeholder Network of Australia’s Commonwealth Scientific and Industrial Research Organisation.

The S2S research and development effort described here is the only project in the pilot initiative that is focused on water in the western United States, and it is helping raise the visibility of the needs of western U.S. water resource managers among the international applied science community.

Different Decisions Require Different Lead Times

Water management in California and across the western United States is a challenging and dynamic operation. In addition to the fundamental influence of rainfall and snowfall in determining water supply, water management is affected by many political and socioeconomic considerations. Such considerations in water management include public health and safety minimum supply requirements for the population, which are particularly relevant during extreme drought conditions. Another consideration relevant during less extreme drought times is the prioritization of water use when there is an insufficient amount of resources to meet all objectives (balancing use for fisheries, agriculture, municipalities, etc.).

Effective management of water supply across the region requires different information at different lead times, in part because a variety of atmospheric and oceanic phenomena influence precipitation over these different timescales (Figure 1).

Fig. 1. Lead times for water management decision support needs vary over daily to decadal/century timescales, as do physical processes that affect the predictability of precipitation over the western United States.

Weather information provided over shorter lead times provides intelligence for operational decisions regarding flood risk management, emergency response, and situational awareness of potential hazards. Precipitation anomalies on the timescales of weather across the western United States are dominated by the presence or absence of ARs and ridging events. ARs are associated with bringing precipitation to the western United States. They can be beneficial or hazardous from a water management perspective, depending on AR intensity, duration, and antecedent drought conditions [Ralph et al., 2019]. Ridging events are areas of extensive high atmospheric pressure anomalies in the midtroposphere. Several different ridge types have been historically linked to drought over California [Gibson et al., 2020a].

Forecasts with lead times of weeks or months are more useful for decisions about asset positioning or about operational plans that can be adapted to weather outcomes as they happen. For example, state regulations associated with the California State Water Project limit water transfer amounts across the Sacramento–San Joaquin Delta. These water transfers occur because most of California’s water supply originates north of the delta, while most of the demand is south of the delta. Development of water resources infrastructure over the past century has made use of natural waterways to move water from the supply-rich region to the demand centers. Regulatory limits on water transfer could be better supported if we had improved precipitation forecasts with a lead time of weeks to months.

In addition, hydropower systems that have a chain of reservoirs could leverage better S2S forecasts to maximize the value gained from knowing which reservoirs are at capacity and which are running low at any given time. Precipitation anomalies on these timescales are influenced by both ARs and ridging, as well as by variations in the magnitude and phase of ENSO and the Madden–Julian Oscillation, a tropical atmospheric disturbance that travels around the planet every 1–2 months, for example.

On seasonal to annual scales, forecasts aid decisionmaking with respect to resourcing and budgeting that allow water managers to be prepared to respond to weather extremes, or to adopt more costly response packages that may involve legal review components such as environmental review or concurrence with regulatory mandates. Precipitation anomalies at these lead times can be influenced by ENSO and the quasi-biennial oscillation, a quasiperiodic oscillation of equatorial zonal wind anomalies in the stratosphere.

Beyond those scales, longer-term projections of climate change are used for planning adaptation and mitigation strategies. Identifying change thresholds in average precipitation or precipitation extremes can be used as triggers for implementing these strategies, which may require negotiated legislation or longer-term investment strategies.

A key goal of CDWR’s investment in near-term experimental forecasting products is to catalyze improvements in precipitation forecasting to fully implement the S2S requirement of Public Law 115-25. The need for such improvements was highlighted in the National Weather Service’s first-ever service assessment for drought, which summarized California’s drought in 2014 and stated, “A majority of the stakeholders interviewed for this assessment noted one of the best services NOAA could provide is improved seasonal predictions with increased confidence and better interpretation.”

Emerging Technologies Provide New Capabilities

In response to the substantial need in the western U.S. water management community for better S2S precipitation forecasts, CW3E and JPL have developed a suite of research projects using emerging technical methods (Figure 2). For example, deep neural network techniques in combination with large ensemble climate model simulations will support the creation of experimental S2S forecast products for the region.

These products combine both dynamical model output from the S2S database and novel statistical techniques, including machine learning methods applied to large ensemble data sets and mathematical methods for discovering patterns and associations, such as extended empirical orthogonal function analysis and canonical correlation analysis. The experimental forecast tools are supported by peer-reviewed hindcast assessments, which test the skill of a model by having it “predict” known events in the past. There is a particular focus on applying these emerging methods to longer lead times, ranging from 1 to 6 months, over the broad western U.S. region.
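As a rough illustration of the pattern-discovery step, the sketch below runs a bare-bones empirical orthogonal function (EOF) analysis via singular value decomposition. The random array stands in for a real seasons-by-gridpoints anomaly data set; nothing here reflects the actual CW3E-JPL pipeline.

```python
import numpy as np

# Stand-in anomaly field: 40 seasons (rows) x 500 grid points (columns).
# A real analysis would use precipitation anomalies from hindcasts or reanalysis.
rng = np.random.default_rng(0)
anomalies = rng.standard_normal((40, 500))

# Remove the time mean at each grid point so the decomposition acts on anomalies.
anomalies -= anomalies.mean(axis=0)

# SVD: rows of vt are the spatial EOF patterns, u * s gives the principal
# component time series, and s**2 is proportional to the variance explained.
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
variance_fraction = s**2 / np.sum(s**2)

print("Variance explained by the three leading EOFs:", variance_fraction[:3])
```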

Fig. 2. Quantities of interest, methods, and lead times investigated by the Center for Western Weather and Water Extremes/Jet Propulsion Laboratory S2S team to benefit water management in the western United States.

Critically, stakeholders at CDWR involved in water supply forecasting, reservoir operations, and interactions with governance for drought response provide not only funding but also direct input on the design of both research methodologies and the accompanying experimental forecast products. This research and operations partnership exemplifies an efficient applied research pipeline: End users of the forecast products ensure that the research supporting the products is designed and implemented in ways that will be useful to meet their needs, while at the same time, scientific peer review assures these end users of the forecasts’ scientific rigor.

Recently, this partnership has yielded two primary new products that are now available online and are focused on forecasting the odds of wet or dry conditions in coming weeks across the western United States. Each of these methods has been described in detail in formal publications that include quantification of their skill and reliability [DeFlorio et al., 2019a, 2019b; Gibson et al., 2020a, 2020b].

As weather across California and the U.S. West becomes increasingly variable and more difficult to prepare for, new science-based research and operations partnerships like these and others (e.g., Forecast Informed Reservoir Operations, which has supported better water supply management through skillful short-range forecasts of ARs and precipitation [Jasperse et al., 2020]) are offering enhanced abilities to see weeks and months into the future, a vital benefit for water management across the region.

The Wildfire One-Two: First the Burn, Then the Landslides

EOS - Tue, 06/22/2021 - 12:26

After the record-breaking 2020 wildfire season in California, the charred landscapes throughout the state faced elevated risks of landslides and other postfire hazards. Wildfires burn away the plant canopy and leaf litter on the ground, leaving behind soil stripped of much of its capacity to absorb moisture. As a result, even unassuming rains pose a risk for substantial surface runoff in the state’s mountainous terrain.

California has a history of fatal landslides, and the steep, burned hillsides are susceptible to flash flooding and debris flows. Fire-prone regions in the state rely on rainfall thresholds to anticipate the conditions for which postfire debris flows are more likely.
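Such thresholds are often expressed as a power law relating rainfall intensity to storm duration. The sketch below illustrates the general logic with invented coefficients, not the calibrated values used for any specific burn area.

```python
def exceeds_debris_flow_threshold(intensity_mm_per_hr, duration_hr, a=12.0, b=-0.5):
    """Return True if a rain burst exceeds an intensity-duration threshold.

    Thresholds of the form I = a * D**b are widely used in postfire debris
    flow warning; a and b here are illustrative placeholders.
    """
    threshold_mm_per_hr = a * duration_hr**b
    return intensity_mm_per_hr >= threshold_mm_per_hr

# A short, intense burst can cross the threshold even when longer,
# gentler rainfall does not.
print(exceeds_debris_flow_threshold(20.0, 0.5))  # True (threshold ~17 mm/hr)
print(exceeds_debris_flow_threshold(4.0, 6.0))   # False (threshold ~4.9 mm/hr)
```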

In a new study, Thomas et al. combined satellite data and hydrologic modeling to develop a predictive framework for landslides. The framework combines inputs such as vegetation reflectance and soil texture with physics-based simulation of water infiltration into the soil to model the hydrologic conditions that trigger landslides. The output offers thresholds to monitor the probability of landslides in the years after a burn.

The researchers tested their model against postwildfire soil moisture and debris flow observations from the San Gabriel Mountains in Southern California. The authors found that their results were consistent with recent debris flow events and previously established warning criteria. Additionally, they suggest that rainfall patterns, soil grain size, and root reinforcement could be critical factors in determining the probability of debris flows as burned landscapes recover.

The results suggest that the model could track soil hydraulic conditions following a fire using widely available rainfall, vegetation, and soil data. Such simulations could eventually support warning criteria for debris flows. The simulation framework, the authors note, could be beneficial for regions that have not historically experienced frequent fires and lack monitoring infrastructure. (Journal of Geophysical Research: Earth Surface, https://doi.org/10.1029/2021JF006091, 2021)

Learning from a Disastrous Megathrust Earthquake

EOS - Tue, 06/22/2021 - 12:24

On 11 March 2011, a magnitude 9.0–9.1 earthquake occurred off the shore of Tohoku, Japan. It was the largest earthquake ever recorded in Japan and one of the five largest in the world since the beginning of instrumental observations. It occurred in one of the best monitored areas in the world and has been extensively studied in the past decade. Research results have provided several surprises to the earthquake research community, including the earthquake’s unexpectedly large slip near the trench, the recognition of significant precursory seismic and geodetic anomalies, and the widespread and enduring changes in deformation rates and seismicity across Japan since the event. A recent article published in Reviews of Geophysics gives an overview of a decade of research on the Tohoku-oki earthquake. We asked the authors to explain the significance of this earthquake and lessons learned from it.

What are megathrust earthquakes?

Megathrust earthquakes are plate boundary ruptures that occur on the contact area of two converging tectonic plates in subduction zones. Megathrust ruptures involve thrusting of subducting oceanic plates (here the Pacific plate) under the overlying plates (here Japan as part of the North America or Okhotsk plate). Due to the unstoppable relative motion of the plates, stress accumulates in the areas where the interface of the two plates is locked and is eventually released in megathrust earthquakes.

The world’s greatest earthquakes occur on megathrusts. Megathrust earthquake sources are usually located beneath the sea, which makes it difficult to make detailed observations based on seismic, geodetic, and geologic measurements.

Megathrusts also have the potential to produce devastating tsunamis because of the large ocean bottom vertical movement occurring during the earthquake.

Two days of aftershock recordings about a month (28 April) after the mainshock at Tono earthquake observatory (Tohoku University) in Iwate prefecture. Credit: Naoki Uchida

Prior to the Tohoku-oki earthquake, what were the gaps in our understanding of megathrust earthquakes?

Despite many studies of the Japan Trench, there was no consensus on the possibility of magnitude 9 earthquakes before the Tohoku-oki earthquake.

The instrumental records indicated a heterogeneous distribution of up to magnitude 8 earthquakes and repeated slips in the subduction zone. However, the records of the past 100 years did not directly address events with much longer recurrence intervals.

Land-based geodetic observations collected in the decades prior to the mainshock showed strong interplate locking offshore Tohoku. However, the resolution of these measurements was poor in the offshore area, and various ways to compensate for the apparent slip deficit, including slow earthquakes, were considered to explain the discrepancies between seismologic and geodetic estimates of megathrust coupling and earthquake potential.

Since the 1980s, geological investigations of coastal tsunami sand deposits have provided clear evidence of large tsunamigenic earthquakes that appeared to be substantially larger than instrumentally recorded events. However, the characterization of the ancient tsunami sources and the utilization of these results in the evaluation of earthquake hazard were slow.

What exactly happened during the Tohoku-oki earthquake?

The earthquake was a megathrust event, which occurred along the Japan Trench where the Pacific plate thrusts below Japan. The mainshock rupture initiated close to a zone of slow fault slip with foreshocks on the plate interface in the previous months and a magnitude 7.3 foreshock two days prior.

Over the course of about three minutes, the fault slip propagated to fill out the rupture area of roughly 300 by 200 kilometers, catching up a slip deficit that had built up since as long ago as the 869 A.D. Jyogan earthquake. A maximum slip of about 60 meters occurred near the trench, and the resultant tsunami and shaking caused almost 20,000 deaths in Japan.
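A back-of-the-envelope calculation makes the scale of that slip deficit concrete. Assuming a fully locked interface and a convergence rate of roughly 8 centimeters per year, an approximate value often quoted for the Japan Trench rather than a result from the review, the accumulated deficit comes out on the same order as the observed slip.

```python
# Rough slip-deficit arithmetic, assuming a fully locked plate interface.
convergence_rate_m_per_yr = 0.08   # ~8 cm/yr, an approximate Japan Trench value
years_elapsed = 2011 - 869         # time since the Jyogan earthquake

potential_deficit_m = convergence_rate_m_per_yr * years_elapsed
print(f"Potential slip deficit: {potential_deficit_m:.0f} m")  # ~91 m

# The observed ~60 m maximum slip is the same order of magnitude, consistent
# with partial locking and with some deficit released in intervening events.
```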

The office in which the first author was sitting at his desk at the time of the earthquake (about 180 kilometers west of the epicenter, Tohoku University, Sendai). Credit: Naoki Uchida

How has this event improved our understanding of the earthquake cycle and rupture processes?

Thanks to lessons learned from a decade of research, our understanding of the megathrust earthquake cycle and rupture process has improved in many aspects.

Detailed models of the earthquake slip suggest rupture occurred in an area with a large interplate slip deficit indicated by the pre-earthquake geodetic data. Knowledge of the coupling state and complex seismicity near the trench was improved by ocean bottom observations.

Additional geological surveys of tsunami deposits along the coast and observations of landslide deposits (turbidites) on the ocean bottom revealed the recurrence history of great tsunamis and earthquakes. They suggest quite frequent recurrence of tsunamigenic earthquakes that affected the Tohoku area.

The geophysical observations also identified various kinds of possible precursors before the mainshock. Understanding the uniqueness of such phenomena is important to understand the earthquake cycle and may eventually allow for issuing shorter-term earthquake forecasts.

What are ocean bottom observations and how can they improve our earthquake monitoring efforts?

Typical land surface observations of ground shaking and deformation, made with seismometers, tiltmeters, GPS, InSAR, and other geodetic techniques that require the transmission of electromagnetic waves or light, are difficult or impossible to make at the ocean bottom.

To complement land-based observations, seafloor systems have been developed to monitor the offshore portion of subduction zones. These include (cabled) ocean-bottom seismometers and pressure gauges, GPS-Acoustic measurements (which use sea-surface GPS and sound measurements between the surface and ocean-bottom for estimating seafloor displacements), and ranging using sound waves from ships or between ocean bottom stations.

The ocean bottom measurements better characterize coseismic and postseismic slip, help more accurately monitor the interplate coupling status, locate smaller earthquakes, and observe seismic and tsunami waves much earlier than the instruments on land.

In addition, observations of seafloor sediments provide evidence of ancient and historical great megathrust earthquakes, and boreholes drilled into the megathrust fault zone far offshore allow for examining the fault-zone materials and properties to improve the characterization of structure and fault behavior.

New offshore seismic and geodetic observation systems. (Top) The time advancement of (left) seismic and (right) tsunami wave detection thanks to the seafloor observation network for earthquakes and tsunami along the Japan trench (S-net, small red circles off NE Japan) and Dense Oceanfloor Network system for Earthquakes and Tsunamis (DONET, small red circles off SW Japan). Credit: Aoi et al. [2020], Figure 13. (Bottom) The S-net (left) and GPS-Acoustic (right) sensors awaiting deployment on ship (July 2014 and July 2012). Credit: National Research Institute for Earth Science and Disaster Resilience (left) and Motoyuki Kido (right).

What additional research, data, or modeling is needed to predict future megathrust events more confidently?

Although post-Tohoku-oki studies have better characterized the hazard and a number of possible precursors have been identified, the confident prediction of such events appears impossible in the near future. More detailed investigations of earthquake cycle behavior and interplate locking from the perspective of multiple research fields will further improve the characterization of the conditions of earthquake occurrence and the associated hazard.

A comprehensive compilation of verifiable observations of long-term and short-term precursory processes, including rigorous statistical evaluation of their validity and physical understanding of the processes underlying such phenomena, is important.

While the prospects for reliable short-term prediction of destructive earthquakes may be low, probabilistic operational earthquake forecasting informed by detailed observations of earthquakes and slow-slip activity in the Japan Trench should be possible in the near future.

Why is it essential for earthquake research to be interdisciplinary?

The ability to characterize the nature and hazard of off-Tohoku earthquakes from each disciplinary perspective was limited before the Tohoku-oki earthquake. It appears that the potential for megathrust events comparable in size to the 2011 Tohoku-oki earthquake could have been recognized if the results from seismic, geodetic, and geological studies had been considered together.

Thanks to a decade of data gathering and research, our understanding of the Japan Trench is much improved compared to what was known before the Tohoku-oki earthquake. However, there are still challenges ahead for each discipline to more fully understand the various facets of megathrust earthquakes and to integrate these findings into a complete picture of the system.

—Naoki Uchida (naoki.uchida.b6@tohoku.ac.jp; 0000-0002-4220-9625), Graduate School of Science and International Research Institute of Disaster Science, Tohoku University, Japan; and Roland Bürgmann (0000-0002-3560-044X), Department of Earth and Planetary Science, University of California, Berkeley, USA

Gap in Exoplanet Size Shifts with Age

EOS - Mon, 06/21/2021 - 13:24

Twenty-six years ago, astronomers discovered the first planet orbiting a distant Sun-like star. Today thousands of exoplanets are known to inhabit our local swath of the Milky Way, and that deluge of data has inadvertently revealed a cosmic mystery: Planets just a bit larger than Earth appear to be relatively rare in the exoplanet canon.

A team has now used observations of hundreds of exoplanets to show that this planetary gap isn’t static but instead evolves with planet age—younger planetary systems are more likely to be missing slightly smaller planets, and older systems are more apt to be without slightly larger planets. This evolution is consistent with the hypothesis that atmospheric loss—literally, a planet’s atmosphere blowing away over time—is responsible for this so-called “radius valley,” the researchers suggested.

Changes with Age

In 2017, scientists reported the first confident detection of the radius valley. (Four years earlier, a different team had published a tentative detection.) Defined by a relative paucity of exoplanets roughly 50%–100% larger than Earth, the radius valley is readily apparent when looking at histograms of planet size, said Julia Venturini, an astrophysicist at the International Space Science Institute in Bern, Switzerland, not involved in the new research. “There’s a depletion of planets at about 1.7 Earth radii.”
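A dip of this kind shows up in a simple histogram of planet radii. The sketch below fabricates a bimodal sample purely to show how the valley is spotted; it implies nothing about the real Kepler distribution beyond the dip’s approximate location.

```python
import numpy as np

# Fabricated planet radii (Earth radii): a super-Earth peak near 1.3 and a
# sub-Neptune peak near 2.4, mimicking only the shape of the reported distribution.
rng = np.random.default_rng(1)
radii = np.concatenate([rng.normal(1.3, 0.2, 600), rng.normal(2.4, 0.4, 800)])

counts, edges = np.histogram(radii, bins=np.arange(0.5, 4.01, 0.25))
for lo, hi, n in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:.2f}-{hi:.2f} R_E |{'#' * (n // 20)}")
# The sparse bins near ~1.7 R_E mark the "radius valley" in this toy sample.
```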

Trevor David, an astrophysicist at the Flatiron Institute in New York, and his colleagues were curious to know whether the location of the radius valley—that is, the planetary size range it encompasses—evolves with planet age. That’s an important question, said David, because finding evolution in the radius valley can shed light on its cause or causes. It’s been proposed that some planets lose their atmospheres over time, which causes them to change size. If the timescale over which the radius valley evolves matches the timescale of atmospheric loss, it might be possible to pin down that process as the explanation, said David.

In a new study published in the Astronomical Journal, the researchers analyzed planets originally discovered using the Kepler Space Telescope. They focused on a sample of roughly 1,400 planets whose host stars had been observed spectroscopically. Their first task was to determine the planets’ ages, which they assessed indirectly by estimating the ages of their host stars. (Because it takes just a few million years for planets to form around a star, these objects, astronomically speaking, have very nearly the same ages.)

The team calculated planet ages ranging from about 500 million years to 12 billion years, but “age is one of those parameters that’s very difficult to determine for most stars,” David said. That’s because estimates of stars’ ages rely on theoretical models of how stars evolve, and those models aren’t perfect when it comes to individual stars, he said. For that reason, the researchers decided to base most of their analyses on a coarse division of their sample into two age groups, one corresponding to stars younger than a few billion years and one encompassing stars older than about 2–3 billion years.

A Moving Valley

When David and his collaborators looked at the distribution of planet sizes in each group, they indeed found a shift in the radius valley: Planets within it tended to be about 5% smaller, on average, in younger planetary systems compared with older planetary systems. It wasn’t wholly surprising to find this evolution, but it was unexpected that it persisted over such long timescales [billions of years], said David. “What was surprising was how long this evolution seems to be.”

These findings are consistent with planets losing their atmospheres over time, David and his colleagues proposed. The idea is that most planets develop atmospheres early on but then lose them, effectively shrinking in size from just below Neptune’s (roughly 4 times Earth’s radius) to just above Earth’s. “We’re inferring that some sub-Neptunes are being converted to super-Earths through atmospheric loss,” David told Eos. As time goes on, larger planets lose their atmospheres, which explains the evolution of the radius valley, the researchers suggested.

Kicking Away Atmospheres

Atmospheric loss can occur via several mechanisms, scientists believe, but two in particular are thought to be relatively common. Both involve energy being transferred into a planet’s atmosphere to the point that it can reach thousands of kelvins. That input of energy gives the atoms and molecules within an atmosphere a literal kick, and some of them, particularly lighter species like hydrogen, can escape.

“You can boil the atmosphere of a planet,” said Akash Gupta, a planetary scientist at the University of California, Los Angeles not involved in the research.
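One standard way to quantify that kick is the Jeans escape parameter, lambda = GMm/(kTR): the smaller it is, the more readily a species of particle mass m escapes a planet of mass M and radius R whose upper atmosphere is at temperature T. The sketch below evaluates it for atomic hydrogen on an Earth-size planet; the temperature is an assumed illustrative value, not one measured for GJ 1132 b.

```python
# Jeans escape parameter: lambda = G * M * m / (k * T * R).
# Small lambda (roughly 10 or less) means thermal escape of that species is efficient.
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
k = 1.381e-23    # Boltzmann constant, J/K
M = 5.97e24      # planet mass, kg (Earth-like, as GJ 1132 b is similar in size and density)
R = 6.37e6       # planet radius, m (Earth-like)
m_H = 1.67e-27   # mass of a hydrogen atom, kg
T = 1000.0       # assumed upper-atmosphere temperature, K (illustrative only)

lam = G * M * m_H / (k * T * R)
print(f"Jeans parameter for atomic hydrogen: {lam:.1f}")  # ~7.6: hydrogen escapes readily
```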

In the first mechanism—photoevaporation—the energy is provided by X-ray and ultraviolet photons emitted by a planet’s host star. In the second mechanism—core cooling—the source of the energy is the planet itself. An assembling planet is formed from successive collisions of rocky objects, and all of those collisions deposit energy into the forming planet. Over time, planets reradiate that energy, some of which makes its way into their atmospheres.

Theoretical studies have predicted that photoevaporation functions over relatively short timescales—about 100 million years—while core cooling persists over billions of years. But concluding that core cooling is responsible for the evolution in the radius valley would be premature, said David, because some researchers have suggested that photoevaporation can also act over billions of years in some cases. It’s hard to pinpoint which is more likely at play, said David. “We can’t rule out either the photoevaporation or core-powered mass loss theories.”

It’s also a possibility that the radius valley might arise because of how planets form, not how they evolve. In the future, David and his colleagues plan to study extremely young planets, those only about 10 million years old. These youngsters of the universe should preserve more information about their formation, the researchers hope.

—Katherine Kornei (@KatherineKornei), Science Writer

Subduction Initiation May Depend on a Tectonic Plate’s History

EOS - Mon, 06/21/2021 - 13:24

Subduction zones are cornerstone components of plate tectonics, with one plate sliding beneath another back into Earth’s mantle. But the very beginning of this process—subduction initiation—remains somewhat mysterious to scientists because most of the geological record of subduction is buried and overwritten by the extreme forces at play. The only way to understand how subduction zones get started is to look at young examples on Earth today.

This schematic shows the tectonic setting of the Puysegur Margin approximately 16 million years ago. Strike-slip motion juxtaposed oceanic crust from the Australian plate with thinned continental crust from the Pacific plate. Collision between the plates near the South Island of New Zealand forced the oceanic Australian plate beneath the continental Pacific plate, giving rise to subduction at the Puysegur Trench. Credit: Brandon Shuck

In a new study, Shuck et al. used a combination of seismic imaging techniques to create a detailed picture of the Puysegur Trench off the southwestern coast of New Zealand. At the site, the Pacific plate to the east overrides the Australian plate to the west. The Puysegur Margin is extremely tectonically active and has shifted regimes several times in the past 45 million years, transitioning from rifting to strike-slip to incipient subduction. The margin’s well-preserved geological history makes it an ideal location to study how subduction starts. The team’s seismic structural analysis showed that subduction zone initiation begins along existing weaknesses in Earth’s crust and relies on differences in lithospheric density.

The conditions necessary for the subduction zone’s formation began about 45 million years ago, when the Australian and Pacific plates started to pull apart from each other. During that period, extensional forces led to seafloor spreading and the creation of new high-density oceanic lithosphere in the south. However, in the north, the thick and buoyant continental crust of Zealandia was merely stretched and slightly thinned. Over the next several million years, the plates rotated, and strike-slip deformation moved the high-density oceanic lithosphere from the south to the north, where it slammed into low-density continental lithosphere, allowing subduction to begin.
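The density contrast at the heart of that argument can be made concrete with rounded textbook values; the numbers below are generic illustrations, not measurements from the Puysegur study.

```python
# Toy buoyancy comparison using rounded textbook densities (illustrative only).
rho_asthenosphere = 3300  # kg/m^3
lithospheres = {"oceanic lithosphere": 3330, "continental lithosphere": 2800}  # kg/m^3

for name, rho in lithospheres.items():
    contrast = rho - rho_asthenosphere
    tendency = "sinks (can subduct)" if contrast > 0 else "floats (resists subduction)"
    print(f"{name}: contrast {contrast:+d} kg/m^3 -> {tendency}")
```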

The researchers contend that the differences in lithospheric density, combined with existing weaknesses along the strike-slip boundary inherited from the previous tectonic phases, facilitated subduction initiation. The team concludes that strike-slip deformation might be a key driver of subduction zone initiation because it can efficiently bring together sections of heterogeneous lithosphere along plate boundaries. (Tectonics, https://doi.org/10.1029/2020TC006436, 2021)

—David Shultz, Science Writer

Juno Detects Jupiter’s Highest-Energy Ions

EOS - Thu, 06/17/2021 - 12:16

Jupiter’s planetary radiation environment is the most intense in the solar system. Since 2016, NASA’s Juno spacecraft has orbited the planet closer than any previous mission, investigating its innermost radiation belts from a unique polar orbit. That orbit has enabled the first complete latitudinal and longitudinal study of Jupiter’s radiation belts. Becker et al. leverage this capability to report the discovery of a new population of heavy, high-energy ions trapped at Jupiter’s midlatitudes.

The authors applied a novel technique for detecting this population: rather than using a particle detector or spectrometer to observe and quantify the ions, they used Juno’s star-tracking camera system. Star trackers, or stellar reference units (SRUs), are high-resolution navigational cameras whose primary task is to compute the spacecraft’s precise orientation from observations of the sky. The SRU on board Juno is among the spacecraft’s most heavily shielded components, afforded 6 times more radiation protection than the other systems in its radiation vault.

https://photojournal.jpl.nasa.gov/archive/PIA24436.mp4

This animation shows the Juno spacecraft’s stellar reference unit (SRU) star camera (left) as it is hit by high-energy particles in Jupiter’s inner radiation belts. The signatures from these hits appear as dots, squiggles, and streaks (right) in the images collected by the SRU. Credit: NASA/JPL-Caltech

Despite this heavy protection, ions and electrons with very high energies still occasionally penetrate the shielding and strike the SRU sensor. This study focuses on 118 unusual events that struck with dramatically higher energy than typical penetrating electrons. Using computer modeling and laboratory experiments, the authors determined that these ions deposited 10 times more energy than penetrating protons and 100 times more than penetrating electrons.

To identify potentially responsible ion species, the authors examined the morphology of the sensor strikes. Although most strikes light up only a few pixels, events arriving at a low incidence angle can create streaks as the particle deposits energy across successive pixels. Simulation software can predict the energy deposition of various particles moving through matter, providing candidates for the ions encountered by Juno. Ion species as light as helium or as heavy as sulfur could account for at least some of the observed strikes, the authors said. Species from helium through oxygen could account for all the strikes, provided they have energies in excess of 100 megaelectron volts per nucleon.
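For a rough sense of scale, multiplying the energy per nucleon by an ion’s mass number gives its total kinetic energy. The short Python sketch below is a back-of-the-envelope illustration using standard mass numbers, not a calculation from the study; it shows how ions above that 100-megaelectron-volt-per-nucleon threshold carry totals in the gigaelectron-volt range.

# Back-of-the-envelope: total kinetic energy = mass number x energy per
# nucleon. Standard mass numbers; the 100 MeV/nucleon threshold is the
# figure quoted in the study.
MASS_NUMBER = {"He": 4, "C": 12, "N": 14, "O": 16, "S": 32}

def total_energy_gev(species, mev_per_nucleon=100.0):
    """Total kinetic energy in GeV for an ion at the given energy per nucleon."""
    return MASS_NUMBER[species] * mev_per_nucleon / 1000.0

for species in ("He", "O", "S"):
    print(f"{species}: {total_energy_gev(species):.1f} GeV")
# He: 0.4 GeV, O: 1.6 GeV, S: 3.2 GeV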

Finally, the study attributes these ions to the inner edge of the synchrotron emission region, located at radial distances of 1.12–1.41 Jupiter radii and magnetic latitudes ranging from 31° to 46°. This region has not been explored by prior missions, and this population of ions was previously unknown. With total energies measured in gigaelectron volts, they represent the highest-energy particles yet observed by Juno. (Journal of Geophysical Research: Planets, https://doi.org/10.1029/2020JE006772, 2021)

—Morgan Rehnberg, Science Writer

Siberian Heat Wave Nearly Impossible Without Human Influence

EOS - Thu, 06/17/2021 - 12:15

Last year was hot. NASA declared that it tied 2016 for the hottest year on record, and the Met Office of the United Kingdom said it was the final year in the warmest 10-year period ever recorded. Temperatures were particularly high in Siberia, with some areas experiencing monthly averages more than 10°C above the 1981–2010 average. Overall, Siberia had the warmest January to June since records began; on 20 June, the town of Verkhoyansk, Russia, hit 38°C, the highest temperature ever recorded in the Arctic Circle.

In a new article in Climatic Change, Andrew Ciavarella from the Met Office and an international team of climate scientists showed that the prolonged heat in Siberia would have been almost impossible without human-induced climate change. Global warming made the heat wave at least 600 times more likely than in 1900, they found.

Ciavarella said that without climate change, such an event would occur less than once in thousands of years, “whereas it has come all the way up in probability to being a one in a 130-year event in the current climate.” Ciavarella and his coauthors are part of the World Weather Attribution initiative, an effort to “analyze and communicate the possible influence of climate change on extreme weather events.”
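Those two figures hang together under simple return-period arithmetic. The sketch below (assuming the annual probability of such an event is just the reciprocal of its return period) shows what a 600-fold increase in likelihood implies for the preindustrial return period.

# Simple return-period arithmetic, assuming annual probability = 1 / return
# period. The inputs are the figures quoted above.
return_period_now = 130    # years: roughly a 1-in-130-year event today
likelihood_ratio = 600     # at least 600 times more likely than in 1900

prob_now = 1.0 / return_period_now
prob_1900 = prob_now / likelihood_ratio
print(f"Implied 1900 return period: ~{1.0 / prob_1900:,.0f} years")
# ~78,000 years, consistent with "less than once in thousands of years"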

According to the Met Office, events leading to Siberia’s prolonged heat began the previous autumn. Late in 2019, the Indian Ocean Dipole—the difference in sea surface temperature between the western and eastern Indian Ocean—hit a record high, supercharging the jet stream and leading to low pressure and extreme late winter warmth over Eurasia. This unseasonably warm weather persisted into spring and reduced ice and snow cover, which exacerbated the warm conditions by increasing the amount of solar energy absorbed by land and sea.

Cataloging the Past, Forecasting the Future

The resulting high temperatures unleashed a range of disasters. Most obvious were wildfires that burned almost 255,000 square kilometers of Siberian forests, leading to the release of 56 megatons of carbon dioxide in June. The heat also drove plagues of tree-eating moths and caused permafrost thaws that were blamed for infrastructure collapses and fuel spills, including one leak of 150,000 barrels of diesel.

The researchers compared the climate with and without global warming using long series of observational data sets and climate simulations. At the beginning of the 20th century, similar extremely warm periods in Siberia would have been at least 2°C cooler, they found. Global warming also made the record-breaking June temperature in Verkhoyansk much more likely, with maximum temperatures at least 1°C warmer than they would have been in 1900.

The team also looked to the future. They found that by 2050 such warm spells could be 2.5°C to 7°C hotter than in 1900 and 0.5°C to 5°C warmer than in 2020. “Events of precisely the magnitude that we saw, they will increase in frequency, and it wouldn’t be unexpected that you would then see also events of an even higher magnitude as well,” Ciavarella said.

Dim Coumou, a climate scientist at Vrije Universiteit Amsterdam, agrees that such an event would not have happened in a preindustrial climate. “With global warming summer temperatures are getting warmer, and therefore, the probability of heat waves and prolonged warm periods are really strongly increasing,” he explained, adding that this pattern is particularly pronounced in Siberia, as the high latitudes are warming faster. Coumou was not involved in the new research.

In addition to local issues (like the health impact of heat exposure, wildfires, and the collapsing of structures built on thawing permafrost), we should also be concerned about the wider impact of heat events in Siberia, said Martin Stendel, a climate scientist at the Danish Meteorological Institute. Stendel was not involved in the new research but has worked on other studies for World Weather Attribution. Thawing permafrost, for example, releases greenhouse gases such as carbon dioxide and methane into the atmosphere.

“We should be aware that things may have global effects,” he said.

—Michael Allen (michael_h_allen@hotmail.com), Science Writer
