Feed aggregator

A geometry-dependent surface Lambertian-equivalent reflectivity product for UV–Vis retrievals – Part 1: Evaluation over land surfaces using measurements from OMI at 466 nm

A geometry-dependent surface Lambertian-equivalent reflectivity product for UV–Vis retrievals – Part 1: Evaluation over land surfaces using measurements from OMI at 466 nm
Wenhan Qin, Zachary Fasnacht, David Haffner, Alexander Vasilkov, Joanna Joiner, Nickolay Krotkov, Bradford Fisher, and Robert Spurr
Atmos. Meas. Tech., 12, 3997-4017, https://doi.org/10.5194/amt-12-3997-2019, 2019
Satellite observations depend on Sun and view angles due to anisotropy of the Earth's atmosphere and surface reflection. However, most ultraviolet and visible cloud, aerosol, and trace-gas algorithms utilize surface reflectivity databases that do not account for surface anisotropy. We create a surface database using the GLER concept, which adequately accounts for surface anisotropy, validate it with independent satellite data, and provide a simple implementation for the current algorithms.

Comparison between the assimilation of IASI Level 2 ozone retrievals and Level 1 radiances in a chemical transport model

Comparison between the assimilation of IASI Level 2 ozone retrievals and Level 1 radiances in a chemical transport model
Emanuele Emili, Brice Barret, Eric Le Flochmoën, and Daniel Cariolle
Atmos. Meas. Tech., 12, 3963-3984, https://doi.org/10.5194/amt-12-3963-2019, 2019
We examine the differences between assimilating ozone profiles retrieved from IASI or the corresponding infrared spectra in a chemical transport model. This allows the impact of the retrieval's prior information on ozone reanalyses to be quantified. We found that significant differences can arise between the two approaches, depending on latitude. Improved O3 variability is obtained by assimilating IASI radiances directly. The implications for coupled Earth system models are discussed.

Rayleigh wind retrieval for the ALADIN airborne demonstrator of the Aeolus mission using simulated response calibration

Rayleigh wind retrieval for the ALADIN airborne demonstrator of the Aeolus mission using simulated response calibration
Xiaochun Zhai, Uwe Marksteiner, Fabian Weiler, Christian Lemmerz, Oliver Lux, Benjamin Witschas, and Oliver Reitebuch
Atmos. Meas. Tech. Discuss., https://doi.org/10.5194/amt-2019-274, 2019
Manuscript under review for AMT (discussion: open, 0 comments)
An airborne prototype called A2D was developed for validating the Aeolus measurement principle based on realistic atmospheric signals. However, the atmospheric and instrumental variability currently limit the reliability and repeatability of the measured Rayleigh response calibration (MRRC), which is a prerequisite for accurate wind retrieval. A procedure for a simulated Rayleigh response calibration is developed and presented to resolve these limitations of the A2D Rayleigh channel MRRC.

Simulating precipitation radar observations from a geostationary satellite

Atmos. Meas. Techniques - Fri, 07/19/2019 - 18:00
Simulating precipitation radar observations from a geostationary satellite
Atsushi Okazaki, Takumi Honda, Shunji Kotsuki, Moeka Yamaji, Takuji Kubota, Riko Oki, Toshio Iguchi, and Takemasa Miyoshi
Atmos. Meas. Tech., 12, 3985-3996, https://doi.org/10.5194/amt-12-3985-2019, 2019
JAXA is surveying the feasibility of a potential satellite mission equipped with a precipitation radar in geostationary orbit, as a successor to the GPM Core Observatory. We investigate what kind of observation data will be available from the radar using simulation techniques. Although the quality of the observation depends on the radar specifications and the position of precipitation systems, the results demonstrate that it would be possible to obtain three-dimensional precipitation data.

Measuring compound flood potential from river discharge and storm surge extremes at the global scale and its implications for flood hazard

Natural Hazards and Earth System Sciences - Fri, 07/19/2019 - 17:49
Measuring compound flood potential from river discharge and storm surge extremes at the global scale and its implications for flood hazard
Anaïs Couasnon, Dirk Eilander, Sanne Muis, Ted I. E. Veldkamp, Ivan D. Haigh, Thomas Wahl, Hessel Winsemius, and Philip J. Ward
Nat. Hazards Earth Syst. Sci. Discuss., https://doi.org/10.5194/nhess-2019-205, 2019
Manuscript under review for NHESS (discussion: open, 0 comments)

The interaction between physical drivers from oceanographic, hydrological, and meteorological processes in coastal areas can result in compound flooding. Compound flood events, like Cyclone Idai and Hurricane Harvey, have revealed the devastating consequences of the co-occurrence of coastal and river floods. A number of studies have recently investigated the likelihood of compound flooding at the continental scale based on simulated variables of flood drivers such as storm surge, precipitation, and river discharge. At the global scale, this has only been performed based on observations, thereby excluding a large extent of the global coastline. The purpose of this study is to fill this gap and identify potential hotspots of compound flooding from river discharge and storm surge extremes in river mouths globally. To do so, we use daily time series of river discharge and storm surge from state-of-the-art global models driven with consistent meteorological forcing from reanalysis datasets. We measure the compound flood potential by analysing both variables with respect to their timing, joint statistical dependence, and joint return period. We find many hotspot regions of compound flooding that could not be identified in previous global studies based on observations alone, such as Madagascar, northern Morocco, Vietnam, and Taiwan. We report possible causal mechanisms for the observed spatial patterns based on existing literature. Finally, we provide preliminary insights on the implications of the bivariate dependence behaviour on flood hazard characterisation, using Madagascar as a case study. Our global and local analyses show that the dependence structure between flood drivers can be complex and can significantly impact the joint probability of discharge and storm surge extremes. These findings emphasise the need to refine global flood risk assessments and emergency planning to account for these potential interactions.
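The two compound-flood metrics named in the abstract (joint statistical dependence and joint return period of paired extremes) can be sketched on synthetic data. This is an illustrative toy, not the study's actual methodology; the distributions, thresholds, and variable names below are all assumptions.

```python
import numpy as np

def kendall_tau(x, y):
    """Kendall's rank correlation: concordant minus discordant pairs,
    normalised by the total number of pairs."""
    n = len(x)
    s = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            s += np.sign(x[i] - x[j]) * np.sign(y[i] - y[j])
    return s / (n * (n - 1) / 2)

rng = np.random.default_rng(0)
n_years = 40
# Hypothetical paired annual maxima at one river mouth, correlated through
# a shared storm driver (all numbers here are invented for illustration).
storm = rng.gamma(2.0, 1.0, n_years)
discharge = 500 + 200 * storm + rng.normal(0, 50, n_years)   # m^3/s
surge = 0.5 + 0.3 * storm + rng.normal(0, 0.1, n_years)      # m

# Joint statistical dependence between the two extremes.
tau = kendall_tau(discharge, surge)

# Empirical joint exceedance of both top-decile thresholds -> joint return
# period; under independence it would be 1 / (0.1 * 0.1) = 100 years.
joint_exceed = np.mean((discharge > np.quantile(discharge, 0.9)) &
                       (surge > np.quantile(surge, 0.9)))
joint_return_period = 1.0 / joint_exceed if joint_exceed > 0 else float("inf")
print(f"tau = {tau:.2f}, joint return period ~ {joint_return_period:.0f} yr")
```

Because the two series share a driver, the empirical joint return period comes out far shorter than the 100 years independence would imply, which is the dependence effect the study quantifies.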

Plasma transport into the duskside magnetopause caused by Kelvin–Helmholtz vortices in response to the northward turning of the interplanetary magnetic field observed by THEMIS

Plasma transport into the duskside magnetopause caused by Kelvin–Helmholtz vortices in response to the northward turning of the interplanetary magnetic field observed by THEMIS
Guang Qing Yan, George K. Parks, Chun Lin Cai, Tao Chen, James P. McFadden, and Yong Ren
Ann. Geophys. Discuss., https://doi.org/10.5194/angeo-2019-103, 2019
Manuscript under review for ANGEO (discussion: open, 0 comments)
We present: (1) K–H vortices in direct response to the northward turning of the IMF without a pre-existing denser boundary layer to facilitate the instability; (2) substantial solar wind transport into the magnetosphere caused by the K–H vortices, involving both ion and electron fluxes; and (3) typical portraits of the ion and electron fluxes in the region of plasma transport. These unique characteristics may reshape our understanding of K–H vortices and the transport process.

Looking Straight at the Sun

EOS - Fri, 07/19/2019 - 12:16

The world’s largest solar telescope—the Daniel K. Inouye Solar Telescope (DKIST), which sits perched atop Mount Haleakalā on Hawaii’s island of Maui—will begin peering at Earth’s closest star later this year.

“First light is this fall,” said Stacey Sueoka, an applied optical systems engineer at the National Solar Observatory (NSO) in Boulder, Colo.

It has been a fraught road to first light. In 2017, a Native Hawaiian group lay down in the road leading to DKIST’s construction site and, hand in hand, delayed the delivery of the telescope’s primary 4-meter mirror.

The primary mirror is one of myriad mirrors that DKIST houses, and recently, a team of researchers that included Sueoka calibrated these mirrors as well as other parts of DKIST so that the telescope can look at the Sun more clearly.

“When you put on eyeglasses and everything suddenly becomes clear, you’re correcting geometric aberrations in your eyeball,” said James Breckinridge, an instrumentalist with the International Society for Optics and Photonics.

Sueoka and her colleagues devised eyeglasses for DKIST, in the form of computer modeling of the mirrors and their protective coatings based on lab measurements, because many things can hamper the ability of a telescope like DKIST to look at the Sun clearly. One of those things is what happens to sunlight when it enters DKIST and bounces off the telescope’s mirrors. That light, as it reflects from mirror to mirror, can lose its original solar signature, according to Sueoka and David Harrington, an astronomer at NSO who led the new research, published in June in the Journal of Astronomical Telescopes, Instruments, and Systems.

That series of reflections and incoming sunlight can heat the mirrors and deform them, further fogging DKIST’s picture of the Sun. Part of the reason has to do with the unprecedented size of the telescope, Harrington said.

“As you push the envelope to bigger apertures, there’s all kinds of optical issues you have to confront in order to make a telescope of high quality,” Harrington said. DKIST is a big telescope, and because of that it receives a lot of light—about 300 watts of energy.

“You need to remove the characteristics of the instrument if you’re going to study the nature of the source,” said Breckinridge, who was not involved in the new research. And that, he added, is what Harrington and his team did: “They have devised a wonderful scheme for calibrating the entire instrument.”

With its new specs, DKIST will be able to examine the atmosphere of the Sun, and it will study the Sun’s magnetic field. This, according to Breckinridge, will help scientists to better grasp things like what triggers solar storms—magnetic maelstroms that can lead to electrical surges and blackouts here on Earth.

After first light this fall, DKIST will take some of the clearest-ever pictures of storm features like solar flares and coronal mass ejections—explosive events that Harrington said astronomers still do not fully understand.

—Lucas Joel, Freelance Journalist

Hearing Garners Bipartisan Support for Scientific Integrity

EOS - Fri, 07/19/2019 - 12:13

The need for scientific integrity policies in federal agencies received support from both sides of the aisle during a 17 July congressional hearing, with Republicans and Democrats alike stressing the importance of protecting scientists and the scientific process.

However, so far there are no Republicans among the 192 cosponsors of legislation that would establish these policies as a safeguard no matter who controls Congress or the White House.

“Allowing political power or special interests to manipulate or suppress federal science hurts, and hurts all of us. It leads to dirtier air, unsafe water, toxic products on our shelves, and chemicals in our homes and environment. And it has driven federal inaction in response to the growing climate crisis,” Rep. Paul Tonko (D-N.Y.) said at the hearing of the House Science, Space, and Technology Committee.

Tonko, a member of the committee, introduced the Scientific Integrity Act (H.R. 1709) in March to establish scientific integrity policies for federal agencies that fund, conduct, or oversee scientific research.

“Scientific integrity is a long-standing concern that transcends any one party or political administration,” Tonko said. “The abuses directed by this president [President Donald Trump] and his top officials have brought a new urgency to the issue, but the fact remains whether a Democrat or Republican sits in the speaker’s chair or the Oval Office, we need strong scientific integrity policies.”

Codifying those policies into law would strengthen current scientific integrity initiatives that have been instituted at a number of federal agencies following 2010 guidelines that were issued during the Obama administration. At the hearing, committee chair Rep. Eddie Bernice Johnson (D-Texas) charged that the current policies have been “proving unable to counter the Trump administration’s manipulation and oppression of science.”

The Trump administration has taken a number of measures that critics have charged run counter to scientific integrity. These measures include altering scientific content, misrepresenting climate science, restricting communication of scientists, creating a hostile environment for scientific staff, and weakening federal advisory committees that provide scientific advice to federal agencies, according to the Union of Concerned Scientists (UCS).

“It is essential to codify these policies precisely because they are vulnerable to repeal, they are vulnerable to being cut back at any moment,” testified Michael Halpern, deputy director for the UCS Center for Science and Democracy. Halpern cited a report by the Los Angeles Times that, for instance, in 2018 the U.S. Geological Survey began requiring scientists to ask for permission before speaking to reporters.

However, Halpern stressed the bipartisan need for codifying scientific integrity policies. “Political interference in science happens under all presidential administrations, although the recent level of attacks on science is unprecedented,” he said.

“I hope that today will serve as an example to all that there can be a bipartisan commitment to promoting responsible conduct in federal scientific agencies regarding the developments and communication of scientific information,” Halpern noted. “There’s not Democratic science, there’s not Republican science. There’s just science. Decision makers and the public want to hear directly from the experts, and they deserve that access.”

Another witness at the hearing was Joel Clement, a former executive at the Department of the Interior (DOI) who was a whistle-blower about the Trump administration’s climate and science policies.

“While every federal scientist hopes to influence policy with their work, it is never guaranteed. What they do expect, however, is the ability to conduct and communicate their research and findings without interference from politicians, to advance their careers with publications and presentations, to engage with peers both within and outside of the federal science enterprise, and to ensure that their findings are available to the American public that paid for the research,” said Clement, who was reassigned to an accounting position by then DOI secretary Ryan Zinke. Clement currently is an Arctic Initiative senior fellow at the Harvard Kennedy School’s Belfer Center for Science and International Affairs.

Clement added, “Unfortunately, some agencies have had difficulty assuring even these fundamental workplace conditions and establishing a culture of scientific integrity.”

The Legislation “Offers a Good Start”

Roger Pielke Jr., a professor of environmental studies at the University of Colorado Boulder, testified at the hearing that the legislation “offers a good start.”

Pielke, who was invited to testify by Republicans, stressed that the legislation would be beneficial for both parties. For Republicans, “this is an investment in your future,” he said. “For Democrats, it’s an investment in today to oversee the Republican administration. But this is where I think the interests of Congress have to outweigh the party affiliation, which makes [the legislation] so difficult.”

At the hearing, several Republican members of the committee expressed strong support for science integrity policies, though they complained that there was no bipartisan deliberation in preparing for the hearing.

“We must have rigorous policies on scientific integrity, research misconduct, conflict of interest, and data transparency. This instills public trust and confidence in taxpayer-funded research,” said Rep. Jim Baird (R-Ind.), ranking member of the Research and Technology Subcommittee.

Baird said, however, that there is a difference between the findings of scientific research and public policy decisions. “Science is science. But politics, as all of us on this side of the dais know, is more complicated. Two people may look at the same scientific data and relevant information and come to two totally different policy conclusions,” he said. “You may disagree with the current administration, but let’s stick with the facts of what is happening with science at our federal agencies, not rumor and exaggeration.”

Looking for Republican Support

At the hearing, Rep. Don Beyer (D-Va.) called for inviting all Republicans on the committee to cosponsor the legislation. Addressing his Republican colleagues, Beyer said, “If you can’t [cosponsor], please tell us why you can’t and what the specific objection is to, because I think this [bill] is something that should unite us as we move forward.”

After the hearing, Tonko told Eos that he looks forward to Republicans joining Democrats as cosponsors on the bill. Scientific integrity is a bipartisan issue, he said. “Throughout history, we have seen where administrations of whatever stripe have shown that there’s a need for this sort of legislation.”

Clement told Eos that there have been concerns in many administrations about supporting scientific integrity “because the temptation if you’re a policy maker is to override the science because it doesn’t coincide with your views.”

Clement said that civil and political discourse generally corrects those problems over time. However, he said that the Trump administration is different. “The only transparent thing they’ve done in the Trump administration is be absolutely anti-science,” he said. “There is no comparison with any previous administration.”

However, Clement drew some hope from the hearing, which he said was an honest exchange that helps to make it clear that there is some bipartisan interest in the validity and the integrity of the federal science enterprise.

“Sometimes you have to hit rock bottom—I hope we’re at rock bottom—in order to start making the changes that need to happen,” he said. “My hope now is that we’re at rock bottom and we can do better.”

—Randy Showstack (@RandyShowstack), Staff Writer

How Satellite Data Improve Earthquake Monitoring

EOS - Fri, 07/19/2019 - 12:12

In the aftermath of magnitude 5.5 or larger earthquakes, the U.S. Geological Survey’s National Earthquake Information Center (NEIC) creates and distributes disaster response guides to decision-makers, search and rescue operations, and other groups. It also creates and circulates these products for smaller-magnitude but “societally important” events, such as quakes that cause fatalities or property damage, as well as ones that are scientifically interesting or show promise for informing future response efforts, said William Barnhart, an Earth scientist at the University of Iowa in Iowa City.

Recently, researchers studied the role of geodetic observations, especially interferometric synthetic aperture radar (InSAR) and satellite optical imagery, in earthquake response efforts. They explored how those observations inform and validate seismically derived source models, independently constrain earthquake impact products, and more, Barnhart and his colleagues noted in a study published in Remote Sensing on 6 June 2019. They also studied how geodetic observations improve ShakeMap, the Prompt Assessment of Global Earthquakes for Response (PAGER) system, and other NEIC earthquake response products.

Traditionally, NEIC products have relied on data from seismic networks because “seismometers are highly sensitive and provide data very rapidly,” equipping them to detect smaller events more quickly than if geodetic methods are used, Jessica Murray, the geodesy topical coordinator for the Earthquake Hazards Program in Menlo Park, Calif., wrote in an email to Eos. Murray wasn’t involved with the recent study, but she collaborated with Barnhart on another study of one of the earthquakes discussed in the paper.

However, geodetic measurements record the “permanent offset of the ground due to the fault motion without ‘clipping’ (loss of information that occurs when the ground shaking amplitude exceeds the range that the seismometer can measure),” Murray wrote. Therefore, geodetic data provide more accurate magnitude estimates for some large earthquakes, she added.
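The clipping effect described in the quote is easy to demonstrate: a synthetic ground-motion pulse whose amplitude exceeds an assumed instrument range saturates at that range, so any amplitude-based estimate from the clipped trace is biased low, while a static-offset (geodetic-style) measurement would be unaffected. The waveform and range below are purely illustrative.

```python
import numpy as np

# Hypothetical ground-motion record: a Gaussian-windowed oscillation whose
# true amplitude exceeds the instrument's measurable range of +/- 1 unit.
t = np.linspace(0, 10, 1000)
true_motion = 3.0 * np.exp(-((t - 5) ** 2)) * np.sin(8 * t)

instrument_range = 1.0
# "Clipping": the recorded trace saturates at the instrument limits.
recorded = np.clip(true_motion, -instrument_range, instrument_range)

true_peak = np.max(np.abs(true_motion))
recorded_peak = np.max(np.abs(recorded))
print(f"true peak {true_peak:.2f}, recorded peak {recorded_peak:.2f}")
```

The recorded peak is pinned at the instrument range regardless of how large the true motion was, which is why magnitude estimates from clipped seismograms can be too small for great earthquakes.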

The recent article includes case studies of four earthquakes to show a wide range of utility for geodetic observations in earthquake response, says Barnhart.

Earthquake Case Studies

In 2013 the 7.7 moment magnitude earthquake in Baluchistan, Pakistan, killed hundreds. It occurred in a region with sparse seismic observations, “which contributed to an initial event mislocation that biased subsequent response products,” the researchers wrote in the study.

Spatially complex and unusual—“rupturing a non-planar fault bilaterally” over about 200 kilometers “at relatively shallow depths for a large crustal earthquake”—the quake’s characteristics were challenging to capture using teleseismic fault models, the researchers noted. However, “the geodetic observations provided the most direct constraint on the spatial characteristics of the Baluchistan earthquake without having to undertake any fault source inversions,” the researchers wrote.

The 6.0 moment magnitude quake in Napa, Calif., in 2014 was, at the time, the largest quake in the San Francisco Bay Area in more than 25 years. InSAR and GPS observations “were critical to imaging the spatially complex distribution of fault slip. Although this capability contributed little new information in densely instrumented portions of California, similar observations and modeling efforts applied to [other] moderate magnitude earthquakes…could prove critical in characterizing the rupture details of the earthquake,” the researchers wrote.

The 7.8 moment magnitude earthquake in Gorkha, Nepal, in 2015 killed approximately 9,000 people, injured thousands more, and damaged or destroyed more than 600,000 structures. Here the geodetic data refined and verified “an already well-constrained earthquake source model,” the researchers wrote. “The spatial constraints from InSAR were used to re-parameterize the teleseismic finite fault model that was then ingested into ShakeMap,” condensing the region with the most shaking and constraining fatality estimates to the 1,000 to 10,000 range, which “ultimately bore true,” they added.

Finally, the 7.5 moment magnitude earthquake in Palu, Indonesia, triggered a tsunami, liquefaction, and landslides, the combination of which killed at least 2,077 people, injured at least 4,438 more, and caused an estimated $911 million worth of damage. As in the Gorkha quake, the Palu quake’s hypocenter was correctly identified using the NEIC data. However, the alert level and impacts were underestimated until “pixel tracking results from Landsat-8 and Sentinel-2 imagery” revealed the earthquake’s approximately 130-kilometer extent, the researchers wrote. This tracking raised the PAGER alert level from yellow to red because of Palu’s expected high shaking exposure.

In the future, researchers hope to automate geodetic image processing and analysis at the NEIC, Barnhart said.

—Rachel Crowell (@writesRCrowell), Science Journalist

Updated Temperature Data Give a Sharper View of Climate Trends

EOS - Fri, 07/19/2019 - 12:11

Government agencies, businesses, academic researchers, and members of the public rely on climate information to support informed decision-making. This information includes data obtained on the ground and at sea, satellite data, and computational models that help with interpreting the data and that allow climate scientists to construct forecasts and scenarios. One key indicator for Earth’s climate system, global surface temperature (GST), is widely used in climate monitoring and assessments.

One of the most widely used GST data sets is the National Oceanic and Atmospheric Administration’s (NOAA) Global Surface Temperature Dataset (NOAAGlobalTemp), formerly known as the Merged Land-Ocean Surface Temperature (MLOST) [Smith et al., 2008]. Version 5 of this data set was released on 18 June 2019. This new version of NOAAGlobalTemp uses more comprehensive data collection and increased spatial coverage over land and ocean surfaces, as well as improved treatment of historical changes in observing practice.

Identifying and Monitoring Anomalies and Trends

Reports on temperature trends and anomalies constructed using this data set provide policy makers, business leaders, and the general public with information that is essential for making decisions associated with climate variability. Hence, it is important for NOAAGlobalTemp to be kept up to date, using the best available observational data.

High-impact applications of this data set include annual climate reports from the World Meteorological Organization and American Meteorological Society and the monthly global climate reports from NOAA’s National Centers for Environmental Information (NCEI) for the previous month, season, and year.

NOAAGlobalTemp enables analyses of temperature anomalies in various ways. For example, global anomaly maps show regions where temperatures are above or below average and by how much. Global percentile maps illustrate how the temperature anomaly for a given grid point ranks in comparison to previous years. This comparison informs users of any grid points where warm or cold temperatures set records or fell into the upper or lower decile.

Global trend maps show the rates at which temperatures are changing for each grid point. Global and continental time series provide the changing trends and fluctuations for regions like North America, South America, Europe, Africa, Asia, and Oceania.
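The three map types just described can be sketched in a toy computation: per-gridpoint anomalies against a 30-year base period, percentile ranks of the latest year, and least-squares decadal trends, all from a synthetic (year, lat, lon) temperature array. The grid size, trend magnitude, and noise level are assumptions, not NOAAGlobalTemp values.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical gridded monthly-mean temperatures for one calendar month,
# 1951-2020, on a coarse illustrative grid with a built-in warming trend.
years = np.arange(1951, 2021)
temps = (14.0 + 0.015 * (years - 1951)[:, None, None]
         + rng.normal(0, 0.3, (years.size, 6, 12)))

# Anomaly map: latest year's departure from a 1981-2010 base-period mean,
# computed per grid point.
base = (years >= 1981) & (years <= 2010)
anomaly = temps[-1] - temps[base].mean(axis=0)

# Percentile map: fraction of earlier years the latest year exceeds,
# per grid point (1.0 would mean a record warm value).
percentile = (temps[-1] > temps[:-1]).mean(axis=0)

# Trend map: least-squares warming rate in deg C per decade, per grid point.
flat = temps.reshape(years.size, -1)            # (years, gridpoints)
slope = np.polyfit(years, flat, 1)[0].reshape(6, 12) * 10
print(anomaly.shape, percentile.shape, slope.shape)
```

Each output array maps directly onto one of the three products: anomaly values for the anomaly map, ranks for the percentile map, and per-decade slopes for the trend map.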

Filling in Gaps over Data-Sparse Regions

Air temperature data over land surfaces in NOAAGlobalTemp version 5 are taken from the Global Historical Climatology Network-Monthly data set (GHCNm), which was updated from version 3.3.0 to version 4 in October 2018 [Menne et al., 2018]. GHCNm version 4 consists of data from approximately 26,000 surface stations, roughly 4 times as many as its predecessor (Figure 1). The increase in the number of stations and the use of estimates for missing base period (30-year) averages expand the geographic coverage of temperature anomalies throughout the record period.

Fig. 1. NOAAGlobalTemp visualization of land and ocean observations for November 2015, version 4 (top left) and version 5 (top right). Surface-drifting buoy positions are subsampled every 5 days for easier readability. NOAAGlobalTemp temperature anomalies reconstructed for November 2015 for version 4 (bottom left) and version 5 (bottom right). CLIMAT data come from land-based meteorological surface observation sites. VOSCLIM is Voluntary Observing Ship–Climate. USHCN is the U.S. Historical Climatology Network. GHCNd is the Global Historical Climatology Network–Daily.

The new version greatly expands spatial coverage in the 5° × 5° gridded field. Figure 2 illustrates the impact of these updates on decadal climate trends. Quality control, bias adjustment, and gridding procedures are largely the same as in the previous version. GHCNm version 4 contains a newly added comprehensive uncertainty budget that broadly follows the approach of Morice et al. [2012] for the land component of the U.K. Met Office’s Hadley Centre Climatic Research Unit Temperature (HadCRUT) product. In GHCNm version 4, major sources of uncertainty, from the station level monthly averages up to the calculation of regional means, are estimated primarily via a 100-member ensemble. This ensemble was produced to quantify random, systematic, and correlated error structures in the monthly temperature data, as well as uncertainties associated with the GHCNm version 4 process [Menne et al., 2018].
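The ensemble approach to uncertainty can be sketched as follows: each member perturbs the station data with draws from random and systematic error terms before averaging to a regional mean, and the spread of the resulting regional means serves as the uncertainty estimate. This is a schematic of the general technique, not GHCNm version 4's actual error model; all magnitudes below are assumed.

```python
import numpy as np

rng = np.random.default_rng(2)
n_members, n_stations = 100, 500
# Hypothetical "true" station temperature anomalies for one region/month.
station_anom = rng.normal(0.8, 0.4, n_stations)

# Each ensemble member re-draws the error terms and recomputes the
# regional mean, propagating station-level uncertainty to the regional scale.
regional_means = np.empty(n_members)
for m in range(n_members):
    random_err = rng.normal(0, 0.2, n_stations)   # independent per station
    systematic_err = rng.normal(0, 0.05)          # shared across all stations
    regional_means[m] = np.mean(station_anom + random_err) + systematic_err

best_estimate = regional_means.mean()
uncertainty = regional_means.std(ddof=1)          # ensemble spread ~ 1-sigma
print(f"regional mean {best_estimate:.3f} +/- {uncertainty:.3f}")
```

Note that the independent station errors average down with the number of stations, so the shared systematic term dominates the regional-mean spread; that is exactly why correlated error structures need explicit treatment in such budgets.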

Fig. 2. Surface air temperature trends over land areas from 1988 to 2017. Trends are based on station records binned into 5° × 5° boxes. The latest version (right) shows greatly increased spatial coverage over the previous version (left).

Argo Floats Improve Coverage in the Southern Ocean and Tropical Regions

Over the ocean, NOAAGlobalTemp version 5 uses updated sea surface temperature (SST) data from version 5 of the Extended Reconstructed Sea Surface Temperature (ERSST) data set [Huang et al., 2017], which has improvements in both data and methods. Specifically, the new version of ERSST incorporates Argo float observations for improvements over the Southern and tropical oceans.

Additionally, ship and buoy data in NOAAGlobalTemp were updated to release 3.0 of NOAA’s International Comprehensive Ocean-Atmosphere Data Set (ICOADS), and sea ice data were updated to version 2 of the Hadley Centre Sea Ice and Sea Surface Temperature (HadISST2) data set.

Improved methods include better quality control, interpolation, and bias adjustments using a new baseline reference from more accurate modern buoy observations. ERSST version 5 has improved the representation of spatial variability over the oceans, the magnitude of El Niño and La Niña events, and the accuracy of absolute SST.

Land and Ocean Data Show Continuing Warming Trends

Using the new land and ocean surface temperature data sets, NOAAGlobalTemp version 5 employs a statistical reconstruction method [Smith et al., 2008] to generate global surface temperature data on a 5° × 5° grid at monthly resolution. Over the global domain, the version 5 trends are statistically consistent with those from the previous version. These trends further support earlier research findings over decadal and longer timescales, demonstrating the robustness of the warming and showing no slowdown or warming hiatus on decadal scales.

For the 1880–2018 centennial scale, the warming rates are roughly 0.07°C/decade in both data sets (Figure 3). Warming rates have increased in recent decades: Both versions 4 and 5 show warming rates of about 0.14°C/decade from 1950 to 2018. This warming has become more rapid since the mid-1970s (about 0.17°C/decade from 1975 to 2018 in version 4 and 0.18°C/decade in version 5).

Warming rates are even higher in the most recent period beginning in the late 1990s (0.18°C and 0.19°C/decade for 1990 to 2018 in versions 4 and 5, respectively) and early 2000s (0.19°C and 0.20°C/decade for 2000 to 2018 in versions 4 and 5). (Numbers are obtained from the version runs for the January 2019 update.)
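The °C/decade rates above are ordinary least-squares trends fitted to the annual-mean anomaly series. A minimal sketch of the calculation (using a made-up anomaly series; the real values are distributed by NCEI):

```python
import numpy as np

# Hypothetical annual-mean anomaly series in degrees C, built from a
# 0.018 degC/yr trend plus noise (illustrative, not NOAAGlobalTemp data).
years = np.arange(1975, 2019)
rng = np.random.default_rng(1)
anomalies = 0.018 * (years - years[0]) + rng.normal(0.0, 0.05, years.size)

# Least-squares slope in degrees C per year, reported per decade to
# match the rates quoted in the article.
slope_per_year, intercept = np.polyfit(years, anomalies, 1)
trend_per_decade = 10.0 * slope_per_year
print(f"trend: {trend_per_decade:.2f} degC/decade")
```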

Fig. 3. Annual mean temperature anomaly time series from NOAAGlobalTemp version 5 (solid line) and version 4 (dotted line). The line for version 4 was shifted down by a constant of 0.07°C for the whole period, largely because of the reference baseline shift from ship sea surface temperature (SST) to buoy SST; for a given location, readings from buoys tend to be lower than readings from ships [Huang et al., 2017]. Global trends are about the same (at the 90% linear regression error confidence level, as in the Intergovernmental Panel on Climate Change’s Fifth Assessment Report), although version 5 shows a very slightly larger warming trend than version 4 for 1880 to 2017.

Worldwide, several internationally recognized organizations actively and continuously improve global surface temperature (GST) data sets. Although these organizations employ somewhat different input data and technical approaches, all of the data sets depict widespread warming over the long term—roughly 1°C since 1901 [Intergovernmental Panel on Climate Change, 2013; Zhang et al., 2016].

Continuing Improvements for Informed Decisions

Further improvements are under development for future releases. The most notable improvement currently in progress addresses the incomplete coverage in the Arctic, where evidence of climate change is greatest; the lack of full coverage has been shown to cause underestimation of the global warming rate. Other potential improvements include increasing the spatial resolution from 5° to 2°, further refining global uncertainty estimates, and incorporating additional observations collected through ongoing collaboration with the international community.

NOAAGlobalTemp is part of the suite of climate products and services that NOAA provides to government, business, academia, and the public to support informed decision-making. This latest release is designed to ensure the best possible representation of historical climate conditions across the globe by leveraging newly available data and the latest peer-reviewed scientific methods. Data are freely available from the National Centers for Environmental Information.

Acknowledgments

This global data set is only possible through international collaboration under the auspices of the World Meteorological Organization and through significant contributions by the NOAA Climate Program Office’s Ocean Observation and Monitoring Division and the Earth System Science and Modeling Division’s networks. We thank Kevin O’Brien for providing some metadata for Figure 1. Reviews from the NCEI internal review process and from Eos reviewers and editors made this article more readable.

Apollo 11 at 50 and Other Things We’re Reading This Week

EOS - Fri, 07/19/2019 - 12:05

Apollo 11 at 50. Ten years ago, “at the 40th anniversary of the Apollo 11 mission, your intrepid reporter caught up with Neil Armstrong.”

—Randy Showstack, Staff Writer

Credit: Randy Showstack

Ed Dwight Was Set to Be the First Black Astronaut. Here’s Why That Never Happened. We must not forget this part of our space exploration history: “It took two decades after Dwight became an astronaut trainee before a black American would go to space.”

—Kimberly Cartier, Staff Writer

A Window into Space at the National Cathedral.

The Moon rock at the center of the “Space Window” of Washington National Cathedral weighs only 7.18 grams. Credit: NASA

Like rocks and glass windows, science and religion don’t always coexist happily. This is a poignant tale of one case where they all just get along, beautifully.

—Timothy Oleson, Science Editor

Teaching Global Warming in a Charged Political Climate. A thought-provoking article about the challenges of teaching climate change in K–12.

—Faith Ishii, Production Manager

Revisiting the Role of the Science Journalist. While #scicomm is thriving, the pursuit of science journalism is in a precarious position.

—Caryl-Sue, Managing Editor

SharkCam!

And here I thought science would be devoid of jump scares.

—Kimberly Cartier, Staff Writer

The Battle to Rebuild Centuries of Science After an Epic Inferno. This story on Brazil’s National Museum 1 year after the fire is heart-wrenching.

—Jenessa Duncombe, Staff Writer

Resurrecting Interest in a “Dead” Planet. With so much attention on the Moon and Mars, Venus tends to get short shrift. Catch up on what scientists have learned about “Earth’s twin” in the past quarter century—and what questions they’re still trying to answer—in this hugely informative and fascinating summary.

—Timothy Oleson, Science Editor

Elephants Boost Carbon Storage in Rain Forests.

Forest elephants splash and play at the forest’s edge in the Dzanga-Sangha Special Reserve in the Central African Republic. Credit: iStock.com/ANDREYGUDKOV

By trampling and eating fast-growing softwood trees, elephants contribute to the growth of slow-growing hardwood trees, which have higher carbon densities. Plus, who doesn’t love these majestic animals?

—Faith Ishii, Production Manager

Viewing Venus from the Space Station.

Sunlight spills over Earth’s beautiful blue horizon, while Venus twinkles as the morning (and evening!) star near the bottom of the photo. Credit: Expedition 59 and the ISS Crew Earth Observations Facility and the Earth Science and Remote Sensing Unit, NASA Johnson Space Center

OK, how cool is this?! “Orbiting Earth approximately every 90 minutes, astronauts living and working on the International Space Station (ISS) see sixteen sunrises and sunsets every 24 hours.”

—Melissa Tribur, Production Specialist

Stocking a Proper Buffet for a Megadiverse Smorgasbord

EOS - Fri, 07/19/2019 - 11:30

Understanding how Earth systems are responding to change across space and time requires distributed measurement samples, a buffet in a smorgasbord, so to speak. Many of these measurements arise organically, as individual research projects identify priorities and syntheses form, while others are more centralized efforts. In either case, these Environmental Observatory Networks (EONs) can only sample a subset of the diversity of ecosystems, soils, watersheds, and species present in any region.

Villarreal et al. [2019] develop a method to evaluate how well the “buffet” of EON sampling sites represents the true variety, using the megadiverse country of Mexico as a case study. The authors show that an optimal sampling strategy would require more than 80 sites to represent a majority of the dynamics of carbon and water cycling, which the current network undersamples. The innovation of the approach is its reliance on public data sets and easily adapted methods, which is critical for use in regions with limited resources.

The paper also introduces the growing EONs in Mexico, the results of which are featured in a number of papers that, like this one, are part of a special issue on MexFlux: advances in ecosystem carbon and water fluxes across Mexico in JGR: Biogeosciences.

Citation: Villarreal, S., Guevara, M., Alcaraz‐Segura, D., & Vargas, R. [2019]. Optimizing an environmental observatory network design using publicly available data. Journal of Geophysical Research: Biogeosciences, 124. https://doi.org/10.1029/2018JG004714

—Ankur Rashmikant Desai, Editor, JGR: Biogeosciences

Response to Comment on "Earth and Moon impact flux increased at the end of the Paleozoic"

Science - Thu, 07/18/2019 - 20:13

Hergarten et al. interpret our results in terms of erosion and uncertain calibration, rather than requiring an increase in impact flux. Geologic constraints indicate low long-term erosion rates on stable cratons where most craters with diameters of ≥20 kilometers occur. We statistically test their proposed recalibration of the lunar crater ages and find that it is disfavored relative to our original calibration.

Small Steps and Giant Leaps

EOS - Thu, 07/18/2019 - 19:50

In partnership with the National Archives Foundation, AGU was proud to copresent a panel discussion about the role of geosciences in the Apollo missions and the future of the space program on 17 July 2019. Coinciding with the 50th anniversary of the first lunar landing and AGU’s Centennial year, the event was introduced by AGU president Robin Bell, AGU CEO/executive director Chris McEntee, and U.S. archivist David Ferriero.

“In 1919, when AGU was founded, the world was a very different place. However, despite the century’s worth of change, the ability of Earth and space science to improve our society—and the desire of scientists to provide those benefits to humanity—has remained the same,” said McEntee. “During times of uncertainty and change to Earth’s climate and the scientific enterprise, all of us—particularly the scientific community—must join together to address these concerns. Like all those who were part of the Apollo 11 mission, we must be creative and passionate; committed and determined. We must advance research and do so with the integrity and transparency that is the foundation of scientific discovery.”

The panel was moderated by NASA chief scientist James L. Green and included Sean Solomon, past president of AGU and director of the Lamont-Doherty Earth Observatory; Sonia Tikoo, assistant professor at Stanford University; Steven Hauck, professor of planetary geodynamics at Case Western Reserve University; and Heather Meyer, postdoctoral fellow at the Lunar and Planetary Institute.

Speaking at the “Small Steps and Giant Leaps” event were (from left) AGU president Robin Bell; panelists Sean Solomon, Sonia Tikoo, Heather Meyer, and Steven Hauck; NASA chief scientist James L. Green; and AGU CEO/executive director Chris McEntee. Credit: AGU

—Joshua Speiser (jspeiser@agu.org), Manager of Strategic Communications, AGU

SWOT and the ice-covered polar oceans: An exploratory analysis

Publication date: Available online 17 July 2019

Source: Advances in Space Research

Author(s): Thomas W.K. Armitage, Ron Kwok

Abstract

The Surface Water Ocean Topography mission (SWOT), scheduled for launch in 2021, is the first space-borne radar interferometer capable of providing wide-swath height maps of water surfaces with centimetric precision. In addition to its primary objectives in oceanography and hydrography, the SWOT instrument offers opportunities for other applications. Here, we explore the feasibility of sea ice freeboard and sea surface height retrievals in the ice-covered oceans from SWOT data. The quality of SWOT height estimates depends on the backscatter strength and number of samples used for multi-looking. We use near-nadir radar backscatter estimates from sea ice and water over the range of SWOT incidence angles to simulate SWOT height maps and assess the retrieval precision under different backscatter, surface type and roughness conditions. Unlike wind-roughened open water, the available observations suggest that backscatter over sea ice has a moderate dependence on look angle (specularity), and the backscatter of younger, flatter sea ice has a greater degree of specularity than older, more deformed and colder sea ice. To achieve a similar freeboard precision to conventional altimeters (∼3 cm) requires averaging over 15–40 km² in the near- to mid-swath and 90–175 km² in the far-swath for lower northern latitudes (<65°N), and 9–18 km² in the near- to mid-swath and 30–50 km² in the far-swath over Southern Hemisphere ice. Compared to a typical altimeter grid cell used for time and area averages (∼25 km × 25 km, or 625 km²), this represents an improvement in resolution of 3- to 70-fold between the near- and far-swath. Overall, the results suggest that SWOT has the potential to provide unique new insights in the high-latitude oceans by providing two-dimensional maps of sea ice thickness and dynamic ocean topography at higher resolution, in both space and time, than has previously been possible.
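The trade-off between averaging area and height precision in the abstract follows from basic noise statistics: assuming uncorrelated single-look height noise (an assumption for illustration, not a result from the paper), precision improves as 1/√N with the number of samples averaged. A back-of-envelope sketch with a hypothetical single-sample noise level:

```python
import math

def samples_needed(sigma0_cm, target_cm):
    """Number of independent samples to reduce noise sigma0 to target,
    assuming precision scales as sigma0 / sqrt(N)."""
    return math.ceil((sigma0_cm / target_cm) ** 2)

# With a hypothetical 30 cm single-sample height noise, reaching the
# ~3 cm freeboard precision quoted for conventional altimeters needs:
n = samples_needed(30.0, 3.0)
print(n)  # 100 samples
```

Larger averaging areas in the far-swath then correspond to regions where per-sample noise is higher (weaker backscatter), so more samples, and hence more area, are required for the same precision.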

A Charge Sharing Study of Silicon Microstrip Detectors with Electrical Characterization and SPICE Simulation

Publication date: Available online 17 July 2019

Source: Advances in Space Research

Author(s): Rui Qiao, Wen-Xi Peng, Xing-Zhu Cui, Guang-Qi Dai, Yi-Fan Dong, Rui-Rui Fan, Min Gao, Ke Gong, Dong-Ya Guo, Xiao-Hua Liang, Ya-Qing Liu, Huan-Yu Wang, Jin-Zhou Wang, Di Wu, Jia-Wei Yang, Fei Zhang, Hao Zhao

Abstract

Silicon microstrip detectors with floating strips have nonuniform charge collection efficiency. This nonuniformity depends on the incident position and incident angle and should be corrected during charge reconstruction. A novel charge reconstruction algorithm, called the charge sharing algorithm, is introduced to correct this nonuniformity. This algorithm assumes that the nonuniformity in charge collection efficiency is due to charge sharing through the capacitors and resistors of silicon microstrip detectors. This charge sharing assumption is tested in this paper using electrical characterization and SPICE simulation.

Comparison-space selection to achieve efficient tracklet-to-object association

Publication date: Available online 17 July 2019

Source: Advances in Space Research

Author(s): J.A. Siminski, T. Flohrer

Abstract

A major challenge when maintaining a space object catalog is the proper association of new measurements to already cataloged objects. Optical observations are typically associated by comparing the modeled observation to the measured one. The modeled observation is generated from cataloged object states by propagating them to the epoch of observation and transforming them from state space, e.g. orbital elements, to the observation space, e.g. right ascension and declination angles. In addition to propagating the states, their propagated uncertainty distribution is transformed to observation space as well. Statistical distance metrics, such as the Mahalanobis distance, are then evaluated to test whether the observation originated from the cataloged object or not. These distance measures often assume that the uncertainty can be represented with a normal distribution. Assuming that the catalog state uncertainty is properly represented by a normal distribution, it can still lose this property during the propagation in time and the transformation to observation space. The uncertainty of the catalog state is typically much larger than the one from new measurements (only a few arc seconds for optical telescopes) and is therefore more affected by transformation distortions. It is therefore beneficial to perform the comparison in a space advantageous for the state representation. This study will present a projection-based transformation of tracklet information into a favorable frame around the cataloged object state. The effect of the comparison-space selection on cataloguing performance is assessed, i.e. it is systematically tested if it is beneficial to directly compare angles and angular rates, or to compare in the newly proposed projected frame.
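The Mahalanobis gating test mentioned in the abstract can be sketched generically (illustrative numbers only, not the authors' implementation): form the residual between the modeled and measured angles, weight it by the combined covariance, and compare the squared distance against a chi-square threshold.

```python
import numpy as np

def mahalanobis_sq(residual, cov):
    """Squared Mahalanobis distance of a measurement residual."""
    return float(residual @ np.linalg.solve(cov, residual))

# Modeled vs. measured angles in radians: right ascension, declination
# (made-up values for illustration).
predicted = np.array([1.2300, 0.4500])
measured = np.array([1.2301, 0.4498])

# Combined predicted-state + measurement covariance (made-up, diagonal).
cov = np.diag([2e-8, 2e-8])

d2 = mahalanobis_sq(measured - predicted, cov)
# Gate at the 99% quantile of a 2-degree-of-freedom chi-square (~9.21).
associated = d2 < 9.21
print(d2, associated)
```

The distortion issue the abstract raises is that this test is only calibrated if the residual is genuinely Gaussian in the comparison space, which motivates choosing that space carefully.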
