Feed aggregator

Tilted Planet System? Maybe It Was Born That Way

EOS - Wed, 09/17/2025 - 13:19

Astronomers have recently found that roughly a third of planet-forming disks around young Sun-like stars are tilted relative to the direction that their star spins.

“All young stars start out with a disk. But the relative orientation between the disk and the star’s spin axis, little was known about that,” explained lead researcher Lauren Biddle, a planetary scientist at the University of Texas at Austin.

This discovery, published in Nature, could help answer the long-standing question of how planets come to orbit their stars at wonky angles: Maybe they were born that way.

Skewed from the Start

There’s a universal truth that when a new star collapses out of a cloud of gas, angular momentum must be conserved. That means that as the nascent star shrinks in size, it also rotates faster, like a spinning ice skater drawing their arms in and speeding up. The surrounding leftover gas and dust flatten out into a disk that spins in the same direction as the star, and that disk may eventually form planets that spin and orbit in that same direction.
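The ice skater analogy can be made quantitative. For a uniformly rotating body of fixed mass, angular momentum conservation (L = Iω, with I ∝ MR²) means the spin rate scales as 1/R². The radii below are illustrative values, not figures from the study:

```python
# Conservation of angular momentum for a collapsing, uniformly rotating sphere:
# L = I * omega, with I = (2/5) * M * R^2, so omega scales as 1/R^2 at fixed mass.

def spin_up_factor(r_initial, r_final):
    """Factor by which the rotation rate increases when the radius shrinks."""
    return (r_initial / r_final) ** 2

# A contracting core shrinking to 1/100 of its size (illustrative numbers):
factor = spin_up_factor(100.0, 1.0)
print(factor)  # 10000.0 -- a hundredfold contraction spins the body up 10,000x
```

This steep scaling is why even a slowly rotating cloud ends up as a rapidly spinning star surrounded by a flattened disk.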

But the universe is rarely so neat and tidy.

Of the thousands of known exoplanets, dozens of them orbit at wonky angles relative to their star’s spin axis. In our own solar system, the plane in which the eight planets orbit is tilted by about 6° from the Sun’s spin axis. Astronomers have theorized that some of these misalignments, or obliquities, result from dynamical events that take place after a planetary system has already formed: A star passes by and disturbs the orbits, or a major collision knocks a planet off course.

Some of those misalignments, however, are baked in from the start. Previous studies have attempted to observe young star systems and their planet-forming disks to see whether those disks start out tilted or aligned. But those studies were limited by the fact that not many protoplanetary disks had yet been discovered, and many of those that were known were part of binary star systems, Biddle explained. Although those studies found some tilted disks, the gravity from the binary star, rather than an intrinsic misalignment, may have been the culprit.

“If systems begin with primordially tilted orbits, then there is no need to invoke other mechanisms—many of which would destabilize neighboring planets—within those systems,” said Malena Rice, a planetary astrophysicist at Yale University in New Haven, Conn. “By understanding the range of primordial tilts and comparing that distribution with more evolved systems, we can piece together the evolutionary sequences of different classes of planetary systems.” Rice was not involved with this study.

“The one-third rate of misalignment stands independent of everything else.”

Biddle and her colleagues compiled a new sample of young star systems by combining observations of protoplanetary disks from the Atacama Large Millimeter/submillimeter Array (ALMA) with measurements of stars’ spins from the Transiting Exoplanet Survey Satellite (TESS) and the retired K2 mission. Biddle explained that because ALMA, TESS, and K2 have released such large datasets, her team could curate their sample to include only Sun-like stars without binary companions.

They found that 16 of the 49 stars in their sample (about a third) had protoplanetary disks with obliquities of at least 10°, the lower limit of what they could measure. The remaining two-thirds of the systems showed no significant evidence of misalignment. This rate of high-obliquity disks is consistent with past studies but more than doubles the number of young, single, Sun-like stars for which astronomers know the degree of disk misalignment.

The 16 stars that host tilted disks did not share any obvious characteristics such as mass, temperature, or size, and the disks themselves also had different sizes, masses, and structures.

“We didn’t find any correlation there,” Biddle said. “At this point, independent of other system parameters, the one-third rate of misalignment stands independent of everything else.”

Oblique Across Space and Time

Past studies have suggested that moderate disk obliquities might arise from imperfections in the nebulous cloud that formed the star system: An odd clump in the right spot might create turbulence that grows stronger as the cloud collapses, or the clump might fall onto the disk late and tip it off its axis.

“Moderate misalignments of a few tens of degrees can be produced naturally by either turbulence in the natal molecular cloud, late-stage disk accretion, or some combination of the two,” Rice said.

However, that doesn’t necessarily mean that every misaligned exoplanet, or even a third of them, started out that way.

“It would be great to take a crack at mapping stellar obliquities across space and time.”

“A misalignment between a planet’s orbital plane and its host star’s spin axis can originate in two broad phases: during the star and planet formation stage…[and] later, during the system’s main sequence lifetime,” explained Simon Albrecht, an astronomer at Aarhus University in Denmark who was not involved with this research. “If we can determine the fraction of systems that are already misaligned right after birth, that helps us distinguish between these two broad possibilities.”

Determining how much of a system’s tilt comes early or late and whether that tilt changes over a planetary system’s lifetime will require observing a lot more misaligned planetary systems at all stages of evolution, Biddle said. She added that the upcoming data release from the now-retired Gaia mission will be key to answering both of those questions.

“It would be great to take a crack at mapping stellar obliquities across space and time,” Biddle said. “Being able to fill in that time parameter space will help quantify how important dynamics is for generating that final [obliquity] distribution that we observe in planetary systems.”

—Kimberly M. S. Cartier (@astrokimcartier.bsky.social), Staff Writer

This news article is included in our ENGAGE resource for educators seeking science news for their classroom lessons. Browse all ENGAGE articles, and share with your fellow educators how you integrated the article into an activity in the comments section below.

Citation: Cartier, K. M. S. (2025), Tilted planet system? Maybe it was born that way, Eos, 106, https://doi.org/10.1029/2025EO250338. Published on 17 September 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

A hard look at geoengineering reveals global risks

Phys.org: Earth science - Wed, 09/17/2025 - 12:56
With CO2 emissions continuing unabated, an increasing number of policymakers, scientists and environmentalists are considering geoengineering to avert a climate catastrophe. Such interventions could influence everything from rainfall to global food supplies, making the stakes enormous.

Rising CO2 and Climate Change Reorganize Global Terrestrial Carbon Cycling

EOS - Wed, 09/17/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: AGU Advances

To help project Earth’s future climate, it is critical to understand how the capacity of ecosystems to take up and store carbon is changing as atmospheric carbon dioxide levels rise and climate change intensifies.

Bilir et al. [2025] integrate satellite data with a model of terrestrial carbon cycling to parse regionally-specific influences of CO2 and climate on carbon storage in living and dead plant material, and the associated residence time of carbon in those pools.

For the specified regions, changes in total carbon storage (left Y axis, solid bars) and percent change in mean residence time of carbon (right Y axis, hatched bars) that can be attributed to atmospheric CO2 (top panel), climate trends (middle panel), and the combined, interacting effects of CO2 and climate (bottom panel). Credit: Bilir et al. [2025], Figure 6

Their work helps untangle the mechanisms driving what they and others have observed: that CO2 increases carbon storage more than climate effects decrease it. They find greater carbon storage in living plants globally and a shift in dead carbon storage from mid- and high latitudes to the tropics. They also demonstrate a reduction in mean carbon residence times across all latitudes. The shift in carbon storage from dead to live pools underscores the sensitivity of terrestrially-mediated carbon cycling and residence times to living plant carbon uptake and storage potentials.

These efforts help us understand, at a global scale, how rising atmospheric CO2 and climate change interact to prompt a latitudinally-specific reorganization of our planet’s terrestrial carbon cycling, and thus its climate.

Citation: Bilir, T. E., Bloom, A. A., Konings, A. G., Liu, J., Parazoo, N. C., Quetin, G. R., et al. (2025). Satellite-constrained reanalysis reveals CO2 versus climate process compensation across the global land carbon sink. AGU Advances, 6, e2025AV001689. https://doi.org/10.1029/2025AV001689

—Sharon Billings, Editor, AGU Advances

Text © 2025. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Climate Change May Have Killed 16,469 People in Europe This Summer

EOS - Wed, 09/17/2025 - 04:01

Climate change caused 16,469 deaths in European cities this summer, new research estimates.

This summer was the fourth hottest in European history, and its effects on the continent’s population have been widely reported. Spain experienced its most intense heat wave in history in August 2025. Türkiye saw its highest recorded temperature ever (50.5°C, or 122.9°F). Finland saw an “unprecedented” three straight weeks of 30°C heat.

A new, rapid-analysis study by researchers at Imperial College London (ICL) and the London School of Hygiene & Tropical Medicine estimated that 24,400 people across 854 European cities and urban centers died from heat-related causes between June and August 2025. Using climate models and a comparison of this figure with how many heat-related deaths would have occurred in a 1.3°C cooler world, the researchers estimated that climate change was responsible for 68% of these deaths.
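The headline numbers are internally consistent, which is worth verifying: 16,469 of 24,400 deaths is about 68%, and the implied counterfactual death toll in a 1.3°C cooler world makes sense of the "tripled" claim quoted below. A quick check:

```python
total_heat_deaths = 24_400  # estimated heat-related deaths, June-August 2025
attributed = 16_469         # deaths the study attributes to climate change

share = attributed / total_heat_deaths
print(round(share, 3))      # 0.675, i.e. the reported ~68%

# Deaths expected in a counterfactual world without anthropogenic warming:
counterfactual = total_heat_deaths - attributed
print(counterfactual)       # 7931
print(round(total_heat_deaths / counterfactual, 1))  # ~3.1: roughly a tripling
```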

“These numbers represent real people who have lost their lives in the last months due to extreme heat.”

“In other words, it could have tripled the death toll,” said Garyfallos Konstantinoudis, a biostatistician at ICL’s Grantham Institute – Climate Change and the Environment.

Though the planet has warmed about 1.3°C overall since preindustrial times, Europe is warming more quickly than the rest of the planet, meaning that temperatures on the continent this summer were about 1.5°C to 2.9°C warmer than they would have been without anthropogenic warming.

In a Tuesday press conference, the researchers explained that their estimate of 16,469 climate-related deaths is likely conservative, in part because climate models are known to underestimate warming in Europe. In addition, their estimate includes only deaths in urban centers with populations above 50,000 people—areas that represent only about 30% of Europe’s population. They focused on these urban areas because those locations had greater data availability, but that means the estimate is only a snapshot.

“These numbers represent real people who have lost their lives in the last months due to extreme heat,” said Friederike Otto, a climatologist at ICL’s Centre for Environmental Policy. “Many of these would not have died if it wasn’t for climate change. And if we continue on the path that we are on now, continue burning fossil fuels, these deaths will only increase.”

The Hidden Costs of Heat

The study also notes that northern Europe experienced a higher proportion of heat-related deaths than southern Europe, despite southern Europe enduring higher heat (some cities in the region have warmed by up to 3.6°C) and more excess mortality overall. The reason is that prior to climate change, heat in northern Europe rarely reached levels that affected human health at all. Now, explained Konstantinoudis, “almost all of the heat-related deaths in northern Europe…are due to climate change.”

“Reducing fossil fuel use is one of the most important public health interventions of our time.”

Courtney Howard, vice-chair of the Global Climate and Health Alliance and an emergency physician in Canada’s Yellowknives Dene Territory, who was not involved in the study, noted that extreme heat can raise the risk of deadly heart attacks and strokes because high heat causes the heart to work harder. It can also fatally worsen respiratory conditions such as asthma because ozone pollution tends to increase during extreme heat events. Thus, many of the deaths that occur during heat waves are not necessarily recorded as heat deaths.

“The result is that heat numbers capture only a small fraction of the real story at the bedside,” she said. “Experts do not believe that we can adapt health systems adequately to cope with the temperatures that we are currently facing. That’s why reducing fossil fuel use is one of the most important public health interventions of our time.”

To estimate heat deaths, researchers turned to an existing dataset that showed relationships between temperature and mortality across the 854 urban areas used in the study. They then estimated the number of daily deaths during the heat wave using historical Eurostat data and information on which days exceeded minimum mortality temperature.
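The general approach described above can be sketched in miniature. This is not the authors' code: a city-specific exposure-response curve gives a relative risk RR(T) relative to the minimum mortality temperature, the attributable fraction on a hot day is (RR − 1)/RR, and attributable deaths are that fraction times observed daily deaths. The curve shape, parameters, and daily data below are all hypothetical:

```python
# Hedged sketch of the standard heat-attribution method (hypothetical numbers).

def relative_risk(temp_c, min_mortality_temp=21.0, pct_per_degree=0.03):
    """Toy log-linear exposure-response curve above the minimum-mortality temp."""
    excess = max(temp_c - min_mortality_temp, 0.0)
    return (1.0 + pct_per_degree) ** excess

def attributable_deaths(daily_temps, daily_deaths):
    """Sum heat-attributable deaths over a run of days in one city."""
    total = 0.0
    for temp, deaths in zip(daily_temps, daily_deaths):
        rr = relative_risk(temp)
        total += deaths * (rr - 1.0) / rr  # attributable fraction * deaths
    return total

# Three hypothetical heat-wave days:
print(round(attributable_deaths([34.0, 36.0, 30.0], [120, 130, 110]), 1))
```

Summing such daily estimates across 854 urban areas, and repeating the calculation with counterfactual temperatures from climate models, yields the total and climate-attributed tolls the study reports.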

It’s Not Just Europe

Credit: Imperial Grantham Institute

Among the countries included in the study, the Baltic nations of Estonia, Latvia, and Lithuania were the only three that did not experience hotter-than-usual summers.

Rome, Athens, and Bucharest saw the highest heat-related death rates per capita among European capital cities, the study found. In general, cities are hotter than surrounding areas because of the urban heat island effect, in which concrete surfaces trap heat and raise city temperatures.

Chris Callahan, a climate scientist at Indiana University Bloomington who was not involved in the study, said that though the study is not peer reviewed, its methods appear to be “standard and based on extensive peer-reviewed research.”

The researchers noted several factors their study did not consider, including cities’ efforts to adapt to climate change and all adverse health effects of heat. It also did not capture changes to baseline populations that occurred post-COVID-19, which might have led to higher numbers for some cities.

“The findings in this study are stark and concerning, as they illustrate that climate change is already the dominant influence on heat-related mortality in Europe,” Callahan told Eos in an email.

“We are warming the world through our fossil fuel emissions and other activities and that…is causing people to die.”

Europe faces particularly high risks related to climate change, he added, both because temperatures are rising more quickly in western Europe than in other parts of the world and because Europe’s aging population is highly vulnerable to heat. In fact, this study found that people over 64 made up 85% of the climate-related deaths in European cities this summer.

However, the study authors noted that the growing toll warming is taking on human health is not unique to Europe.

“The specifics will vary wherever you’re looking in the world, but the basic point of these studies will always be the same: that we are warming the world through our fossil fuel emissions and other activities and that this is causing people to die,” said Clair Barnes, a statistician at ICL’s Centre for Environmental Policy.

—Emily Gardner (@emfurd.bsky.social), Associate Editor

Citation: Gardner, E. (2025), Climate change may have killed 16,469 people in Europe this summer, Eos, 106, https://doi.org/10.1029/2025EO250348. Published on 17 September 2025. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

A spectrum of regularisation approaches to resolving sharp boundaries in complex resistivity inversions

Geophysical Journal International - Wed, 09/17/2025 - 00:00
Summary: Mineral exploration is frequently centred around delineating discrete geological units with, typically, sharp boundaries that could represent economic targets. In the case of complex resistivity (CR) inversions, the choice of regularisation and model parameterisation significantly impacts the inversion’s ability to delineate targets. Initially, however, a prudent researcher may not wish to bias their inversion towards sharp, distinct units without prior justification. Here we explore how a suite of regularisation approaches to the CR inverse problem allows us to encompass different classes of prior beliefs. We present these as a progression as more information becomes available regarding the likelihood of distinct geological units. The most weakly informed approach we consider, with respect to the delineation of geological units, is the classic ℓ2-type regularisation, which tends to produce smeared-out, fuzzy images. However, this is typically not what is expected for distinct geological units, and we compare this with schemes that increasingly resolve sharp boundaries. We test a range of ℓ1-type regularisations, which have been frequently touted in the geophysics and optimisation literature as being well suited for such tasks. We experiment with a so-called overcomplete parameterisation of the CR field, which aims to separate smooth background and sharp foreground features. These ℓ1 schemes are shown to produce generally sharper images than ℓ2. In the most informed case, where strong assumptions can be made about the local geology, we represent the CR field as a foreground ellipse in a homogeneous background. This approach significantly reduces the size of the parameter space and has a simple geometric interpretation. While the anomaly parameterisation has some unique challenges, we show it clearly resolves distinct units compared to both the ℓ2 and ℓ1 regularisations.
Applications, first to synthetic data and then to field data from the Century Zinc Deposit in northern Australia, demonstrate the progression from weakly informed to strongly informed regularisation and parameterisation, and the sharpness of the recovered geological units.
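The core intuition behind the ℓ1-versus-ℓ2 comparison can be shown on a toy model. An ℓ2 roughness penalty on first differences charges a single sharp jump far more than many small steps, so it favours smooth ramps; an ℓ1 penalty charges both equally, so the data alone decide whether a boundary is sharp. A minimal sketch, not the paper's inversion code:

```python
# Why l1 regularisation tolerates sharp boundaries while l2 smears them:
# compare penalties on the first differences of two models with the same
# total rise (0 -> 1): a single sharp step versus a smooth linear ramp.

def first_differences(model):
    return [b - a for a, b in zip(model, model[1:])]

def l1_penalty(model):
    return sum(abs(d) for d in first_differences(model))

def l2_penalty(model):
    return sum(d * d for d in first_differences(model))

n = 10
sharp = [0.0] * (n // 2) + [1.0] * (n // 2)  # one unit jump
ramp = [i / (n - 1) for i in range(n)]       # nine small steps

print(l1_penalty(sharp), l1_penalty(ramp))   # ~1.0 vs ~1.0: l1 is indifferent
print(l2_penalty(sharp), l2_penalty(ramp))   # 1.0 vs ~0.111: l2 prefers the ramp
```

Minimising the ℓ2 penalty therefore pulls the recovered model toward the ramp even when the true geology is blocky, which is the smearing the abstract describes.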

Tomographic Imaging of the Pampean Flat Slab: Evidence of Subduction Erosion and Volatile Migration

Geophysical Journal International - Wed, 09/17/2025 - 00:00
Summary: Following reanalysis of data from 8 seismic networks that operated in the region surrounding the Pampean flat slab during the past several decades, we generated 3D images of Vp, Vs, and Vp/Vs from a combination of arrival times of P and S waves from local earthquakes, and Rayleigh wave dispersion curves from both ambient noise and existing shear wave models. Among the robust features in these images is a low velocity, root-like structure that extends beneath the high Andes to a deflection in the flat slab, which suggests the presence of an overthickened Andean crust rather than a hypothesized continental lithospheric root. Most of the larger scale features observed in both the subducted Nazca plate and the overriding continental lithosphere are related to the intense seismic activity in and around the Juan Fernandez Ridge Seismic Zone (JFRSZ). Vp/Vs ratios beneath, within, and above the JFRSZ are generally lower (∼1.65–1.68) than those in the surrounding Nazca and continental lithosphere (∼1.74–1.80). While the higher continental lithosphere ratios are due to reduced Vs and likely a result of hydration, the lower JFRSZ related ratios are due to reduced Vp and can be explained by increased silica and CO2 originating from beneath the slab, perhaps in concert with supercritical fluid located within the fracture and fault networks associated with the JFR. These and related features such as a region of high Vp and Vs observed at the leading edge of the JFRSZ are consistent with a basal displacement model previously proposed for the Laramide flat-slab event, in which the eroded base of the continental lithosphere accumulates as a keel at the front end of the flat slab while compressional horizontal stresses cause it to buckle.
An initial concave up bend in the slab facilitates the infiltration of silica and CO2-rich melts from beneath the slab in a manner analogous to petit spot volcanism, while a second, concave down bend, releases CO2 and supercritical fluid into the overlying continental lithosphere.

Louisville Ridge Seamount Chain - Vp/Vs investigation of seamount structure and subduction-related deformation

Geophysical Journal International - Wed, 09/17/2025 - 00:00
Summary: Tomographic inversion of traveltime picks from both P-wave and S-wave wide-angle seismic data acquired along and across the Louisville Ridge Seamount Chain (LRSC) provides key insights into its magmatic construction and subsequent subduction-related deformation. Our P-wave velocity-depth models reveal that each seamount along the LRSC comprises an intrusive mafic-ultramafic core that rises within the crust to within 1–2 km of the seabed summit (P-wave velocity, Vp = 5.5–6.5 km s−1; S-wave velocity, Vs < 3.6 km s−1), with each underlain by a crustal root ∼4–5 km thick. Notably, Canopus seamount comprises two adjacent eruptive centres, and our modelling shows that the more northern is currently being internally deformed as it ascends the Tonga-Kermadec Trench (TKT)-related plate bending outer rise. Lateral variation in Vs within models along and across the LRSC also primarily reflects subduction-related deformation, with low-velocity regions corresponding to large-scale faulting constrained within the crust. Comparison of pre- and post-LRSC-TKT collision forearc crustal structure indicates that bulk Vp properties recover within ∼50 kyr, whereas Vs structure retains its fault-related fabric for at least ∼740 kyr. Vp/Vs ratios (1.75–1.85) confirm a magmatic origin for all LRSC seamounts, with evidence of localized water-filled cracks due to seawater infiltration along faults, particularly beneath the TKT-ward side of the Osbourn seamount. Estimated water content within the upper crust ranges from 12–15 per cent by weight, decreasing to < 10 per cent in the mid-lower crust, with no evidence of > 12 per cent water content within the Pacific crust being subducted.
In comparison with post-collision subduction further north, where the observed upper mantle velocity suggests up to 30 per cent water content, our models suggest that, although deformed and faulted as part of subduction, the LRSC appears more resistant to this deformation than the background Pacific crust adjacent. Our findings provide new constraints on the mechanical and compositional evolution of the LRSC, both prior to and during its collision with the overriding Indo-Australian plate.

Correcting low-frequency EM data using inverted IP parameters of regolith clays

Geophysical Journal International - Wed, 09/17/2025 - 00:00
Summary: This research had an initial goal to quantitatively fit and then separate an induced polarization (IP) contribution to extensive ground electromagnetic (EM) data from the Girrilambone area, NSW. A secondary goal identified during the study was to explain why inversion of data from two different EM systems covering the same area each consistently predicted different IP time-constants and chargeabilities. The mineral exploration area was originally surveyed by a 6.25 Hz central loop SIROTEM survey measuring dB/dt. The area was later resurveyed with a 1 Hz base-frequency Slingram survey using a Landtem B-field sensor. The targets were economic sulphides at depth, with expected signatures being slowly decaying EM responses of small amplitude. Most of the data was affected by inductive IP effects of negative sign, with potential late-delay time EM responses of positive sign obscured. The Girrilambone area surveyed includes the Tritton Mine, discovered in 1995 as a result of the 6.25 Hz SIROTEM survey. To enable the subtraction of IP effects from the EM data, our primary goal, we used the EM data to predict Cole-Cole IP parameters that are consistent with documented values associated with extensive in-situ regolith clay resulting from weathering. The data sets were inverted using a polarisable thin-sheet model that estimated regolith conductivity-thickness or conductance S, chargeability m, IP frequency dependence c, and conductivity IP time constant τσ. The thin sheet model was generally able to fit the observed responses, with the fitted IP contribution subtracted from the observed data to produce an ‘IP corrected’ data set of EM data more suitable for the detection of slow decays indicative of sulphide targets. The 6.25 Hz dB/dt data was, however, modelled with quite different parameters to the 1 Hz B-field data. The 6.25 Hz IP conductivity time constant was smaller by a factor of 10, while the chargeability was smaller by a factor of more than 2.
This initial goal of the research was achieved in that subtraction of the fitted IP contributions in either case improved the capability to identify deeper conductive targets. We are confident that the systematic differences in fitted IP conductivity time constant and chargeability are not due to data or system description error, or to inversion constraints. We conclude that TEM systems will not accurately estimate intrinsic IP conductivity time-constants as rigorously defined from wideband laboratory physical property measurements but rather estimate an IP time-constant whose characteristic frequency (inverse of IP time constant) lies within the bandwidth of the TEM system used. Further, the chargeability estimate will reflect only that fraction of polarizable material whose response is within the bandwidth of the system.
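For readers unfamiliar with the parameters (m, c, τ) discussed above, the standard Pelton-style Cole-Cole model of complex resistivity illustrates their roles. The numerical values below are illustrative only, not the fitted Girrilambone parameters:

```python
# Pelton Cole-Cole complex resistivity model (illustrative parameter values):
# rho(omega) = rho0 * (1 - m * (1 - 1 / (1 + (i * omega * tau)**c)))
import cmath

def cole_cole_resistivity(omega, rho0, m, tau, c):
    """Complex resistivity at angular frequency omega (rad/s)."""
    return rho0 * (1.0 - m * (1.0 - 1.0 / (1.0 + (1j * omega * tau) ** c)))

# rho0 in ohm-m, chargeability m, time constant tau in s, frequency dependence c:
rho0, m, tau, c = 100.0, 0.3, 0.01, 0.5

low = cole_cole_resistivity(2 * cmath.pi * 0.001, rho0, m, tau, c)
high = cole_cole_resistivity(2 * cmath.pi * 1e6, rho0, m, tau, c)
print(abs(low), abs(high))  # approaches rho0 at DC and rho0*(1 - m) at high frequency
```

The characteristic frequency of the dispersion is roughly 1/τ, which is why, as the abstract concludes, a TEM system can only recover a time constant whose characteristic frequency lies within the system's own bandwidth.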

The Gutenberg-Richter law strikes back: the exponentiality of magnitudes is confirmed by worldwide seismicity

Geophysical Journal International - Wed, 09/17/2025 - 00:00
Summary: The magnitudes of earthquakes are generally described by an empirical relation called the Gutenberg-Richter law. This relation corresponds to a well-known statistical distribution, i.e. the exponential distribution. In this work, we verify the validity of the Gutenberg-Richter law using a 44-year-long worldwide seismic catalog of strong (Mw ≥ 6.5) events, by testing the exponentiality and the independence of the magnitudes. Moreover, we suggest a new way to visualize the distribution of the magnitudes, which complements the classical magnitude frequency distribution plot.
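The equivalence the abstract relies on is that log10 N(≥M) = a − bM implies magnitudes above a completeness threshold Mc follow an exponential distribution with rate b·ln(10), so b can be estimated by maximum likelihood (Aki's estimator). A sketch on synthetic data, with an illustrative b-value and threshold rather than the paper's catalog:

```python
# Gutenberg-Richter magnitudes above completeness Mc are exponential with
# rate beta = b * ln(10); Aki's estimator recovers b from the mean excess.
import math
import random

def aki_b_value(magnitudes, mc):
    """Aki (1965) maximum-likelihood b-value: log10(e) / mean(M - Mc)."""
    mean_excess = sum(m - mc for m in magnitudes) / len(magnitudes)
    return math.log10(math.e) / mean_excess

random.seed(42)
b_true, mc = 1.0, 6.5  # illustrative values, echoing the Mw >= 6.5 cutoff
beta = b_true * math.log(10)
catalog = [mc + random.expovariate(beta) for _ in range(5000)]

b_hat = aki_b_value(catalog, mc)
print(round(b_hat, 2))  # close to the true b-value of 1.0
```

Testing exponentiality, as the paper does on the real worldwide catalog, amounts to checking that the observed magnitudes are statistically consistent with this distribution.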

Geoengineering Fears on Display at Congressional Hearing

EOS - Tue, 09/16/2025 - 20:56
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

Misunderstandings and disinformation abounded at a 16 September hearing of the Subcommittee on Delivering on Government Efficiency about geoengineering, which encompasses efforts to alter Earth systems for the purpose of mitigating climate change. 

Rep. Marjorie Taylor Greene (R-GA), chairwoman of the subcommittee, called for an outright ban on geoengineering and used the hearing to promote her Clear Skies Act, which would impose fines of up to $100,000 and potentially jail time for anyone conducting “weather modification” activities.

Geoengineering is an amorphous term that can refer to a range of climate intervention activities, including cloud seeding to spur precipitation, management of solar radiation to cool Earth by reflecting sunlight, and carbon capture and sequestration efforts.

“Today’s advocates of geoengineering don’t just want to address droughts or improve conditions for agriculture,” Greene said. “They want to control the Earth’s climate to address the fake climate change hoax and head off global warming. That, of course, requires massive interventions.”

In addition to asserting that climate change is a hoax, Greene implied that climate interventions could remove enough carbon dioxide from the atmosphere to harm plant life. In questioning, Rep. Brian Jack (R-GA) repeated a dubious claim that the release of dry ice into a hurricane in 1947 in an experiment called Project Cirrus caused the hurricane to turn toward Georgia. And Rep. Pat Fallon (R-TX) argued that former Vice President Al Gore’s misrepresentation regarding the melting of the north polar ice cap invalidates decades of climate science. 

One witness during the hearing was Christopher Martz, a policy analyst and meteorologist at the Committee for a Constructive Tomorrow, an environmental policy think tank that has cast doubts on climate science. Martz received an undergraduate degree in meteorology in May and runs a weather blog that questions the influence of climate change in extreme weather events. 

 

Martz asserted that the science behind climate change is uncertain, and therefore that climate intervention is an alarmist reaction: “Warming could be mostly natural and we just don’t know,” he said. It’s not: The vast majority of scientists agree that Earth is warming and human activities are to blame.

The hearing’s only climate scientist witness, former Lawrence Livermore National Laboratory scientist Michael MacCracken, tried to combat the climate denialism in the room. He challenged the ideas that current climate intervention efforts are sufficiently powerful or scalable enough to change a major weather phenomenon, or that they are targeted to harm the public.

Despite the falsehoods raised by Greene and others at the hearing, some of their comments aligned with how many scientists view climate intervention—as a potentially risky endeavor that requires more research before it is considered viable and safe.

AGU’s own Ethical Framework Principles for Climate Intervention Research, developed with the contributions of scientists, policymakers, ethicists, government agencies, nongovernmental organizations, and potentially impacted communities, acknowledges this perspective: “Substantial research and evaluation efforts will be required to determine the effectiveness, risks, and opportunities of climate intervention,” the framework states.

At the hearing, Greene asked “who would control the dial” if scientists managed to reliably alter Earth’s climate.

Such questions are a reason to lean into Earth systems research, said Roger Pielke, Jr., a political scientist at the conservative American Enterprise Institute who spoke at the hearing. Pielke called for Congress to enact legislation to improve oversight of geoengineering and recommended that Congress ask the National Academy of Sciences to assess what scientists do and don’t know about the effects of climate intervention activities.

Rep. Melanie Stansbury (D-NM), ranking member of the subcommittee, closed the hearing with a plea to support science. “Literally all we’re trying to accomplish by climate action is to keep our planet in some sort of balance,” she said, calling the Trump administration’s firing of federal scientists and engineers, the defunding of science agencies, the firing of the EPA science panel, and the deregulation of carbon emissions “dangerous.”

Stansbury and Greene agreed on one thing: “We have one Earth,” they each said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org. Text © 2025. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Q&A: Why we still need ozone research

Phys.org: Earth science - Tue, 09/16/2025 - 19:27
On 16 September, the world marks the international day for the preservation of the ozone layer—a day of action initiated by the United Nations. This year's theme is "from science to global action"—a reference to the fact that scientific findings have underpinned successful political action to protect the ozone layer for decades.

Researchers reveal first complete MDICE signal in Ordovician organic carbon isotope record

Phys.org: Earth science - Tue, 09/16/2025 - 18:38
The Ordovician Period stands as a critical chapter in Earth's geological history, with carbon isotope records serving as both a key tool for stratigraphic correlation and a vital archive to unravel the coevolution of ancient climates and biospheres. For decades, however, prior research has largely focused on carbonate carbon isotope (δ13Ccarb) data, leaving organic carbon isotope (δ13Corg) records significantly understudied.

Tropical rainforest soil may fuel climate change as Earth warms, accelerating global warming

Phys.org: Earth science - Tue, 09/16/2025 - 15:46
A new study led by the U.S. Forest Service, with Chapman University as a key senior collaborator and published in Nature Communications, suggests that Earth's tropical soils may contribute to climate change as global warming continues, releasing vast amounts of carbon dioxide (CO₂) as they warm and potentially accelerating a dangerous feedback loop.

Geologists discover where energy goes during an earthquake

Phys.org: Earth science - Tue, 09/16/2025 - 14:55
The ground-shaking that an earthquake generates is only a fraction of the total energy that a quake releases. A quake can also generate a flash of heat, along with a domino-like fracturing of underground rocks. But exactly how much energy goes into each of these three processes is exceedingly difficult, if not impossible, to measure in the field.

Volcanoes can help us untangle the evolution of humans—here's how

Phys.org: Earth science - Tue, 09/16/2025 - 14:54
How did humans become human? Understanding when, where and in what environmental conditions our early ancestors lived is central to solving the puzzle of human evolution.

Cyclones Affect Heart Health for Months After They Subside

EOS - Tue, 09/16/2025 - 13:15

After a tropical cyclone passes through an area, governments take stock of the damage. NOAA, for instance, lists the costs associated with damaged buildings and roads and reports any injuries or deaths attributed to the storm.

“This research supports the historically overlooked indirect health risk and burden of tropical cyclones.”

However, research suggests that storms can also have hidden, long-term consequences for human health. In a new study published in Science Advances, scientists report that cyclones, also known as hurricanes and typhoons, produce a significant uptick in hospitalizations due to cardiovascular disease for months after they subside. In addition, the populations potentially at risk for such hospitalizations are growing as climate change intensifies cyclones and drives them into temperate regions such as Canada and New Zealand.

“This research supports the historically overlooked indirect health risk and burden of tropical cyclones and suggests the need for extending public health interventions and disaster preparedness beyond the immediate cyclone aftermath,” said Wenzhong Huang, an environmental epidemiologist at Monash University in Australia and the lead author of the new study.

Heart Problems Spike After Storms

Previous studies have examined possible connections between cardiovascular disease and cyclones, but most have focused on a single health center and storm in the United States.

“For our study, we encompassed multiple tropical cyclone events across decades and across multiple countries and territories with diverse socioeconomic contexts,” Huang said. “We also analyzed much longer post-cyclone periods.”

“I didn’t expect that the risk would persist that long.”

The researchers tracked cardiovascular disease–related hospitalizations of more than 6.5 million people across Canada, New Zealand, South Korea, Taiwan, Thailand, and Vietnam from 2000 to 2019. They identified 179 locations that experienced cyclones and documented how many days storms hit each area. The team then examined hospital records to see whether more people were admitted for heart problems after cyclones, tracking patients for up to a year after each storm.

The results revealed that hospitalizations associated with heart health jumped 13% for every additional day a location was hit by a cyclone. The biggest spike in hospitalizations didn’t occur immediately after the cyclones but, rather, came 2 months after they passed, and the increased risk of hospitalizations didn’t subside until 6 months later.
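As a rough illustration of what a 13% per-day increase implies (assuming, purely for illustration, that the per-day increases compound multiplicatively; this is a simplification, not the study's actual statistical model):

```python
# Illustration only: the study reports a 13% rise in cardiovascular
# hospitalizations per additional cyclone-exposure day. Here we assume
# (as a simplification, not the study's model) that per-day increases
# compound multiplicatively.

def relative_risk(cyclone_days, per_day_increase=0.13):
    """Relative hospitalization risk after `cyclone_days` of exposure."""
    return (1 + per_day_increase) ** cyclone_days

for days in (1, 2, 3):
    print(f"{days} day(s): relative risk = {relative_risk(days):.2f}")
```

Under this toy assumption, a location hit for 3 days would see roughly a 44% increase in risk relative to baseline.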

“I didn’t expect that the risk would persist that long,” Huang said.

The health burden also fell unevenly across populations. Men, people in their 20s through 50s, and those in disadvantaged communities had the highest risk. In fact, cardiovascular risks after cyclones fell during the study period in wealthier areas while rising in poorer ones, suggesting that improved health care access and disaster preparedness have benefited only some populations; Thailand and Vietnam saw the most cyclone-related heart problems. Overall, strokes and ischemic heart disease (in which the blood vessels supplying the heart are narrowed) were the most common maladies reported.

“There is not a single disease that’s not touched upon by hurricanes.”

Naresh Kumar, an environmental health scientist at the University of Miami who studies the health effects of cyclones but was not involved in the new study, was not surprised by the findings. According to his own extensive research on hurricanes in Florida and Puerto Rico, “there is not a single disease that’s not touched upon by hurricanes,” Kumar said.

But he would have liked the authors of the new study to narrow down the mechanisms driving up cardiovascular health risk after cyclones. The possible causes are abundant. In the months following a cyclone, people increase their use of generators, which produce pollutants; eat more calorie-dense canned foods; can’t exercise or access prescription medicines as easily; and are under immense psychological stress—all of which can increase the risk of cardiovascular disease. Meanwhile, regular health care services are often disrupted, so preventative care is limited.

Understanding these mechanisms is critical because current disaster response systems vastly underestimate the health burden of tropical storms, researchers say. “We are still scratching the surface in terms of characterizing the health effects of hurricanes,” Kumar said.

Huang said untangling the most significant contributors to increased risk following a cyclone is the next phase of his research. “I want to understand and investigate the candidates underlying this risk pattern,” he said.

As part of this process, Huang also aims to identify the reasons behind the elevated risk in some populations, such as working-age men. The research could help public health officials target interventions to high-risk populations and monitor cardiovascular health in the months following cyclones.

The Worsening Exposure to Storms

Answering the question of why more people suffer from heart problems after cyclones is becoming increasingly important to policymakers as more communities come under threat. Warmer oceans are fueling more intense storms with higher wind speeds and longer durations, while rising sea levels worsen storm surge flooding that can prolong recovery.

Climate change is also pushing tropical cyclones poleward into regions that have historically experienced few severe storms, such as eastern Canada and New Zealand. “Places that historically experienced fewer cyclone events could have much higher risk,” Huang said, suggesting such regions may be inadequately equipped to respond to major storms. “We need to focus on these regions to better prepare for the growing risk.”

—Andrew Chapman (@andrewchapman.bsky.social), Science Writer

Citation: Chapman, A. (2025), Cyclones affect heart health for months after they subside, Eos, 106, https://doi.org/10.1029/2025EO250342. Published on 16 September 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

A Survey of the Kuiper Belt Hints at an Unseen Planet

EOS - Tue, 09/16/2025 - 13:14

It’s been nearly 2 centuries since a planet was discovered in the solar system. But now scientists think they’ve uncovered evidence of a newcomer that just might usurp that honor from Neptune. Following an analysis of the orbits of bodies in the Kuiper Belt, a team has proposed that an unseen planet at least 25 times more massive than Pluto might reside there. These results were published in Monthly Notices of the Royal Astronomical Society.

The Kuiper Belt is loosely defined as a doughnut-shaped swath of space beginning just beyond the orbit of Neptune and extending to roughly 1,000 times the Earth-Sun distance. It’s home to untold numbers of icy, rocky objects, including Pluto and other so-called Kuiper Belt objects such as Arrokoth.

Everything in the Kuiper Belt can be thought of as cosmic debris, said Amir Siraj, an astrophysicist at Princeton University and lead author of the new paper. “It represents some of the leftovers from the formation of our solar system.”

And most of those leftovers are small: Pluto is the most massive known Kuiper Belt object, and it’s just 0.2% the mass of Earth.

But over the past decade, scientists have hypothesized that something substantially larger than Pluto might be lurking in the Kuiper Belt. Evidence of that unseen world—a so-called Planet Nine or Planet X—lies in the fact that six Kuiper Belt objects share curiously similar orbital parameters and are associated in physical space. A nearby, larger planet could have shepherded those worlds into alignment, researchers have proposed.

Planes, Planes, Everywhere

Siraj and his colleagues recently took a different tack to look for a massive resident of the Kuiper Belt: They analyzed a much larger sample of Kuiper Belt objects and focused on their orbital planes. One would naively expect the average orbital plane of Kuiper Belt objects to be the same as the average orbital plane of the planets in the solar system, said Siraj. But a planet-mass body in the Kuiper Belt would exert a strong enough gravitational tug on its neighboring Kuiper Belt objects to measurably alter the average orbital plane of the Kuiper Belt, at least in the vicinity of the planet. Siraj and his collaborators set out to see whether they could spot such a signal.

“Neptune has a really strong grasp on the outer solar system.”

The researchers extracted information about the orbits of more than 150 Kuiper Belt objects from the JPL Small-Body Database managed by NASA’s Jet Propulsion Laboratory in Pasadena, Calif. Of the several thousand known Kuiper Belt objects, the team homed in on that subset because those objects aren’t gravitationally influenced by Neptune. Neptune is the playground bully of the outer solar system, and the orbits of many Kuiper Belt objects are believed to be shoved around by gravitational interactions with the ice giant. “Neptune has a really strong grasp on the outer solar system,” said Siraj.

The team calculated the average orbital plane of their sample of Kuiper Belt objects. At distances of 50 to 80 times the Earth-Sun distance, they recovered a plane consistent with that of the inner solar system. But farther out, at distances between 80 and 200 times the Earth-Sun distance, the researchers found that their sample of Kuiper Belt objects formed a plane that was warped relative to that of the inner solar system. There was only a roughly 4% probability that that signal was spurious, they calculated.
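A mean orbital plane can be computed by averaging the orbital pole vectors implied by each object's inclination and longitude of ascending node. The snippet below is an illustrative sketch, not the paper's actual method or code; the function name and toy inputs are invented for this example:

```python
import numpy as np

# Each object's orbital plane is summarized by its pole unit vector,
# built from inclination i and longitude of ascending node Omega:
#   pole = (sin i sin Omega, -sin i cos Omega, cos i)
# Averaging the poles and renormalizing gives a mean plane; the angle
# between that mean pole and the reference (z) pole is the warp.

def mean_plane_tilt_deg(incl_deg, node_deg):
    i = np.radians(np.asarray(incl_deg, dtype=float))
    om = np.radians(np.asarray(node_deg, dtype=float))
    poles = np.column_stack(
        (np.sin(i) * np.sin(om), -np.sin(i) * np.cos(om), np.cos(i))
    )
    mean = poles.mean(axis=0)
    mean /= np.linalg.norm(mean)
    return np.degrees(np.arccos(mean[2]))  # tilt from the reference pole

# Toy behavior: inclined orbits with clustered nodes keep a nonzero
# mean tilt, while uniformly scattered nodes average the tilt away.
print(mean_plane_tilt_deg([5, 5], [0, 0]))          # tilt survives
print(mean_plane_tilt_deg([5, 5, 5, 5], [0, 90, 180, 270]))  # tilt cancels
```

A detected warp is therefore a statement that the averaged poles of distant objects point somewhere other than the inner solar system's pole.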

Meet Planet Y

Siraj and his collaborators then modeled how planets of different masses at various orbital distances from the Sun would affect a simulated set of Kuiper Belt objects. “We tried all sorts of planets,” said Siraj.

By comparing those model results with the observational data, the researchers deduced that a planet 25–450 times more massive than Pluto with a semimajor axis in the range of 100–200 times the Earth-Sun distance was the most likely culprit. There’s a fair bit of uncertainty in those numbers, but the team’s results make sense, said Kat Volk, a planetary scientist at the Planetary Science Institute in Tucson, Ariz., not involved in the research. “They did a pretty good job of bracketing what kind of object could be causing this signal.”
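For scale, that mass bracket can be converted to Earth masses using the ratio quoted earlier in the article (Pluto is about 0.2% the mass of Earth):

```python
# Quick unit check; the 0.2% figure comes from earlier in this article.
PLUTO_IN_EARTH_MASSES = 0.002

low, high = 25, 450  # proposed planet mass range, in Pluto masses
print(low * PLUTO_IN_EARTH_MASSES, "to", high * PLUTO_IN_EARTH_MASSES)
# roughly 0.05 to 0.9 Earth masses: a sub-Earth-mass world
```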

To differentiate their putative planet from Planet X, Siraj and his colleagues suggested a new name: Planet Y. It’s important to note that these two worlds, if they even exist, aren’t one and the same, said Siraj. “Planet X refers to a distant, high-mass planet, while Planet Y denotes a closer-in, lower-mass planet.”

“This is really expected to be a game changer for research on the outer solar system.”

There’s hope that Planet Y will soon get its close-up. The Legacy Survey of Space and Time (LSST)—a 10-year survey of the night sky that will be conducted by the Vera C. Rubin Observatory in Chile beginning as soon as this fall—will be supremely good at detecting Kuiper Belt objects, said Volk, who is a member of the LSST Solar System Science Collaboration. “We’re going to be increasing the number of known objects by something like a factor of 5–10.”

It’s entirely possible that Planet Y itself could be spotted, said Volk. But even if it isn’t, simply observing so many more Kuiper Belt objects will better reveal the average orbital plane of the Kuiper Belt. That will, in turn, shed light on whether it’s necessary to invoke Planet Y at all.

Even if his team’s hypothesis is proven wrong, Siraj says he’s looking forward to the start of the LSST and its firehose of astronomical data. “This is really expected to be a game changer for research on the outer solar system.”

—Katherine Kornei (@KatherineKornei), Science Writer

Citation: Kornei, K. (2025), A survey of the Kuiper Belt hints at an unseen planet, Eos, 106, https://doi.org/10.1029/2025EO250344. Published on 16 September 2025. Text © 2025. The authors. CC BY-NC-ND 3.0

Where There's Fire, There's Smoke

EOS - Tue, 09/16/2025 - 13:13

This is an authorized translation of an Eos article. Esta es una traducción al español autorizada de un artículo de Eos.

Gale Sinatra and her husband fled their home in Altadena, California, on 7 January with little more than their suitcases, taking only one of their two cars.

"We thought we'd be gone for the night," Sinatra said. "We thought they'd get the fire under control and we'd be let back in."

When the couple returned, weeks later, it was to dig through the rubble of their former home, burned in the Eaton Fire.

Though they escaped with their lives, health risks remained for Sinatra, her husband (who preferred not to be identified for this story), and other residents. The Eaton and nearby Palisades Fires filled the Los Angeles basin with a toxic haze for days, and cleanup efforts threatened to kick up charred particles long after the fires were extinguished.

Teams of scientists from around the country, along with community members, monitored air quality in the weeks after the fires, seeking to learn more about the associated respiratory health risks and to inform the community about how to protect itself.

Urban Fires Versus Wildfires

Inhaling smoke from any fire can be harmful. Smoke contains hazardous components such as volatile organic compounds (VOCs), emitted by burning vegetation and by products such as paint and cleaning supplies, and particulate matter, such as dust and soot.

Roughly 90% of the particulate matter (PM) in wildfire smoke is PM2.5, particles less than 2.5 micrometers in diameter, small enough to penetrate the bloodstream and the deep reaches of the lungs.

Michael Kleeman uses these instruments to monitor air quality from the back of a vehicle in Victory Park, Altadena, as far north as possible without entering the evacuation zone. Credit: Michael Kleeman

Urban wildfires present hazards of their own, because they burn not only trees and other vegetation but also homes and infrastructure.

When Sinatra returned to her former home, she was struck by everything the fire had consumed, from her jewelry to her car. "It was very unsettling to be standing in the kitchen and suddenly ask, Where is my refrigerator?" she said. "How do you completely melt a refrigerator?"

In January 2025, the Palisades and Eaton Fires devastated more than 150 square kilometers of cities and forestland in Los Angeles County. Despite being personally affected, scientists in the Los Angeles area worked diligently to understand how fires at the wildland-urban interface create unique hazards through air, land, and water.

Going forward, hot, dry conditions exacerbated by climate change will keep raising the risk of fires like these. These scientists' work may provide a blueprint for rapid risk assessment, health risk mitigation, and urban planning in other fire-prone communities.

"From mattresses to carpets to paint and electronics, everything burns," said Roya Bahreini, an environmental scientist at the University of California, Riverside (UCR). Bahreini is also a co-principal investigator of the Atmospheric Science and Chemistry Measurement Network (ASCENT), a long-term air quality monitoring project led by the Georgia Institute of Technology, UCR, and the University of California, Davis (UC Davis).

ASCENT, launched in 2021, has stations across the country, including three in Southern California. During the January fires in Los Angeles, which swept through not only Altadena (an inland unincorporated community) but also coastal neighborhoods, those stations detected levels of lead, chlorine, and bromine orders of magnitude above the usual.

Older houses sometimes have lead paint, asbestos roofing, or decks and fences built of wood treated with arsenic-containing preservatives. PVC pipes contain chlorine. And flame retardants often contain brominated organic compounds. In these forms, such materials do not necessarily pose a high risk to human health. Burned and released into the air, however, they can be dangerous.

Smoke plumes from the Palisades Fire (left) and the Eaton Fire, seen from space on 9 January. Credit: ESA, contains modified Copernicus Sentinel data, CC BY-SA 3.0 IGO

Michael Kleeman, a civil and environmental engineer at UC Davis, explained that the short-term mortality associated with high-PM2.5 events such as wildfires usually takes the form of a heart attack. But inhaling urban wildfire smoke, or the particles kicked up from dust and ash during remediation work, can pose risks that are not immediately apparent. "It's not an instant heart attack, the next day or 3 days after exposure. It's a cancer risk that shows up much further down the line," Kleeman said. "Long-term exposure can have an insidious effect."

Air Quality Maps

"Long-term exposure can have an insidious effect."

Southern California is no stranger to wildfires (nor is Sinatra, who has evacuated several times during her 15 years in Altadena). Frequent droughts in the Los Angeles basin leave large swaths of parched vegetation, and the infamous Santa Ana winds, which blow into the basin from the east and northeast, can quickly send fires out of control, as happened with the Palisades and Eaton Fires.

Real-time air quality maps, such as those from the South Coast Air Quality Management District (AQMD) and the U.S. EPA, draw on a variety of sources to provide data year-round. The most detailed data come from sophisticated instruments installed by the agencies themselves; the South Coast AQMD operates 32 permanent air monitoring stations in Los Angeles, Orange, Riverside, and San Bernardino Counties.

Less detailed but more widespread particulate matter data come from networks of commercially available air quality tools, such as PurpleAir monitors and Clarity sensors, installed by residents or community organizations.

The Air Quality Management District maintains permanent air quality monitoring facilities, but after the January 2025 Los Angeles wildfires, it deployed supplementary efforts, collecting real-time air quality data from mobile monitoring vans. Credit: South Coast AQMD

"It turns out that the areas where the fires occurred had [a] very dense network of these low-cost sensors," said Scott Epstein, planning and rules manager at the South Coast AQMD. "Combining that with our regulatory network, we got excellent coverage of fine particle pollution."

That density allowed researchers to watch the smoke plumes from the Eaton and Palisades Fires as they drifted toward the coast.

An AQMD station in Compton, about 37 kilometers (23 miles) south of the Eaton Fire, showed sharply elevated levels of toxic metals such as arsenic and lead between 7 and 11 January as the plume passed over the area. Those levels returned to normal within a few days. ASCENT instruments in Pico Rivera, about 23 kilometers (14 miles) south of the Eaton Fire, recorded a 110-fold increase in lead levels between 8 and 11 January.

Permanent air quality stations like these offer a source of public information that residents like Sinatra can consult when deciding whether to stay home or return to a burned area. But when the Palisades and Eaton Fires broke out, researchers at the AQMD and other institutions set out to supplement those efforts with more detailed monitoring.

Mobilizing Quickly

Melissa Bumstead (left) and Jeni Knack volunteered to collect air and ash samples after the Eaton and Palisades Fires. Credit: Shelly Magier

In January, researchers from Harvard University; the University of California, Los Angeles (UCLA); the University of Texas at Austin; the University of Southern California (USC); and UC Davis launched the Los Angeles Fire Human Exposure and Long-Term Health Study, or LA Fire HEALTH.

While many Los Angeles residents, including Sinatra, were still under evacuation orders, LA Fire HEALTH researchers were heading into evacuation zones.

One of those researchers was Nicholas Spada, an aerosol scientist who traveled to Los Angeles from UC Davis on 14 January to install four cascade impactors in Santa Monica (near the Palisades Fire), Pasadena (near the Eaton Fire), Hollywood, and West Hills. The briefcase-sized instruments act like coin-sorting machines, Spada explained: They draw in an air sample and sort the particles into eight size categories, from 10 micrometers (about 1/9 the average width of a human hair) down to 90 nanometers (about 1/1,000 the width of a hair). The instruments collected eight samples every 2 hours until 10 February.
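The coin-sorter behavior amounts to binning particles by size. A minimal sketch follows; the log-spaced cut points are invented for illustration, since real cascade impactor stage cutoffs are instrument-specific:

```python
import numpy as np

# Eight size stages between the article's quoted extremes (90 nm to
# 10 um). Log spacing is an illustrative assumption, not the actual
# stage design of the instruments described here.
cutpoints_um = np.geomspace(0.09, 10.0, num=9)  # 9 edges -> 8 stages

def stage_counts(diameters_um):
    """Count particles landing in each of the eight size stages."""
    counts, _ = np.histogram(diameters_um, bins=cutpoints_um)
    return counts

# Toy sample: three particles of 0.1, 1.0, and 9.0 micrometers each
# land in a different stage.
print(stage_counts([0.1, 1.0, 9.0]))
```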

The instrument "captures the changes in the smoke plumes as the fire progresses from active to smoldering to extinguished, and then follows the effects of mitigation."

A cascade impactor lets scientists "associate particle size profiles with time," Spada explained. The instrument "captures the changes in the smoke plumes as the fire progresses from active to smoldering to extinguished, and then follows the effects of mitigation."

The measurements showed not only that toxic elements such as lead and arsenic were present in the air throughout the sampling period but also that a high proportion of their mass (around 25%) took the form of ultrafine, nanometer-scale particles. Such particles are not filtered out by N95 masks and can penetrate deep into the body when inhaled, Spada explained.

A team of University of Texas researchers arrived on 2 February in a van that doubled as a mobile laboratory. By then the fires were out, but dust-raising remediation work had already begun. The team found that outdoor air quality in the weeks after the fires had returned to pre-fire levels and met EPA guidelines. Indoor samples, especially those from homes within the burn zones, showed higher VOC levels than outdoor samples.

Neighbors Lend a Hand

Community members joined the air quality monitoring efforts.

Community members in Southern California also joined the air quality monitoring efforts. Melissa Bumstead and Jeni Knack, codirectors of Parents Against Santa Susana Field Lab, worked with researchers to create and distribute flyers on appropriate personal protective equipment, as well as a self-sampling protocol for residents who wanted to collect ash samples from their properties.

Roughly twice a week from 14 January to 19 February, they collected air and ash samples in Pasadena, Altadena, Santa Monica, Topanga, and Pacific Palisades, then sent them to laboratories, including Spada's, for analysis. Arsenic in all of the ash samples, and lead in about a third of them, exceeded EPA regional screening levels. Spada noted in his communications to residents that those screening levels are based on what is safe for a child to ingest and are relatively conservative.

"This will help people in the next iteration of fires know what to do," Bumstead recalled telling residents in the sampling areas.

After the Ashes

Sinatra lost her home in Altadena in the January 2025 Eaton Fire. Returning to dig through the rubble, she walked past chimney after chimney with no house attached. Credit: Gale Sinatra

The next fire, Sinatra said, weighs on her as she and her neighbors consider whether to rebuild.

When rain finally came to Southern California on 26 January, it helped extinguish the fires and control the dust stirred up during remediation work, reducing the risk of inhaling toxins.

Still, those toxins were also present in the soil and water. When Sinatra and her husband returned to the burned site of their home, they took every precaution they had heard about from the news, the EPA, community leaders, and neighbors: Each wore a respirator, a protective suit, goggles, and two pairs of gloves.

Concern about the possible long-term consequences of the air they have already breathed, and of the soil beneath their feet, lingers as they await more data.

"Everyone believes there's a significant chance of a fire in the future," Sinatra said. "We wonder whether it would be safe to live there, given the quality of the soil and the air, and whether it will happen again."

—Emily Dieckman (@emfurd.bsky.social), Associate Writer

This translation by Daniela Navarro-Pérez was made possible by a partnership with Planeteando and GeoLatinas. Esta traducción fue posible gracias a una asociación con Planeteando and GeoLatinas.

Deep Learning Goes Multi-Tasking

EOS - Tue, 09/16/2025 - 12:00
Editors’ Highlights are summaries of recent papers by AGU’s journal editors. Source: Water Resources Research

Deep learning’s (DL’s) promise and appeal lie in the algorithmic amalgamation of all available data to achieve model generalization and prediction of complex systems. Thus, there is a need to design multivariate training and prediction tasks in order to identify all relevant connections between variables across different space and time scales.

Ouyang et al. [2025] propose a multi-task long short-term memory (LSTM) neural network to predict time series of multiple hydrologic variables. By combining different variables in the prediction task and sharing information among them, the approach achieves improved physical consistency and accuracy. The authors demonstrate this in various prediction exercises for streamflow and evapotranspiration, including conditions of data scarcity.

The study is a good example of how innovation within DL can realize the promise of generalizable hydrological models and predictions of complex systems in the future. It also implicitly encourages hydrologists to expand their DL approaches toward multi-tasking. After all, there is a plethora of data and computing resources available to achieve DL’s promise.
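One common way to realize multi-task prediction is a shared recurrent trunk with one output head per variable, so the tasks exchange information through shared weights. The NumPy sketch below is a hedged illustration (forward pass only; the class names, layer sizes, and the two example tasks are assumptions for this sketch, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SharedLSTMTrunk:
    """Minimal LSTM encoder shared by all prediction tasks."""

    def __init__(self, n_in, n_hidden):
        # One weight matrix for the four LSTM gates (input, forget,
        # cell candidate, output), acting on [x_t, h_{t-1}].
        self.W = rng.normal(0, 0.1, (4 * n_hidden, n_in + n_hidden))
        self.b = np.zeros(4 * n_hidden)
        self.n_hidden = n_hidden

    def forward(self, x_seq):
        h = np.zeros(self.n_hidden)
        c = np.zeros(self.n_hidden)
        for x_t in x_seq:
            z = self.W @ np.concatenate([x_t, h]) + self.b
            i, f, g, o = np.split(z, 4)
            c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
            h = sigmoid(o) * np.tanh(c)
        return h  # final hidden state summarizes the forcing sequence

class MultiTaskHeads:
    """One linear readout per hydrologic variable, on the shared state."""

    def __init__(self, n_hidden, tasks=("streamflow", "evapotranspiration")):
        self.heads = {t: rng.normal(0, 0.1, n_hidden) for t in tasks}

    def forward(self, h):
        return {t: float(w @ h) for t, w in self.heads.items()}

# Toy forward pass: 30 time steps of 5 meteorological forcing variables.
trunk = SharedLSTMTrunk(n_in=5, n_hidden=16)
heads = MultiTaskHeads(n_hidden=16)
forcing = rng.normal(size=(30, 5))
preds = heads.forward(trunk.forward(forcing))
print(preds)
```

Because both heads read the same hidden state, gradients from each task would (in training, omitted here) shape a common representation, which is the mechanism behind the information sharing the highlight describes.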

Citation: Ouyang, W., Gu, X., Ye, L., Liu, X., & Zhang, C. (2025). Exploring hydrological variable interconnections and enhancing predictions for data‐limited basins through multi‐task learning. Water Resources Research, 61, e2023WR036593. https://doi.org/10.1029/2023WR036593

—Stefan Kollet, Editor, Water Resources Research

Text © 2025. The authors. CC BY-NC-ND 3.0

Universal relations between parallel and perpendicular spectral power-law exponents in nonaxisymmetric magnetohydrodynamic turbulence

Physical Review E (Plasma physics) - Tue, 09/16/2025 - 10:00

Author(s): Ramesh Sasmal and Supratik Banerjee

Following a general heuristic approach, algebraic constraints are established between the parallel and perpendicular power-law exponents of nonaxisymmetric, highly aligned magnetohydrodynamic turbulence, both with and without a strong imbalance between the Elsässer variables. Such relations are univ…


[Phys. Rev. E 112, 035208] Published Tue Sep 16, 2025
