Atmospheric blocking events drive persistent weather extremes in midlatitudes, but isolating the influence of sea surface temperature (SST) from chaotic internal atmospheric variability on these events remains a challenge. We address this challenge using century-long (1900–2010), large-ensemble simulations with two computationally efficient deep-learning general circulation models. We find these models skillfully reproduce the observed blocking climatology, matching or exceeding the performance of a traditional high-resolution model and representative CMIP6 models. Averaging the large ensembles filters internal atmospheric noise to isolate the SST-forced component of blocking variability, yielding substantially higher correlations with reanalysis than for individual ensemble members. We identify robust teleconnections linking Greenland blocking frequency to North Atlantic SST and El Niño-like patterns. Furthermore, SST-forced trends in blocking frequency show a consistent decline in winter over Greenland and an increase over Europe. These results demonstrate that SST variability exerts a significant and physically interpretable influence on blocking frequency and establish large ensembles from deep-learning models as a powerful tool for separating forced SST signals from internal noise.
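The noise-filtering logic of the ensemble-averaging step, internal variability cancels across members while the common SST-forced signal survives, can be illustrated with a toy sketch. Synthetic data only: the signal shape, noise level, and ensemble size below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
years, members = 100, 50

forced = np.sin(np.linspace(0, 8 * np.pi, years))     # stand-in SST-forced signal
noise = rng.normal(scale=2.0, size=(members, years))  # internal variability per member
ensemble = forced + noise                             # each row: one ensemble member

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

# a single member correlates weakly with the forced signal...
single = np.mean([corr(m, forced) for m in ensemble])
# ...but the ensemble mean, with noise variance shrunk by 1/members, correlates strongly
mean_r = corr(ensemble.mean(axis=0), forced)
print(single, mean_r)
```

The same reasoning explains why the paper's ensemble means reach far higher correlations with reanalysis than any individual member can.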
The FY-4A satellite represents a new generation of geostationary platforms, providing high-temporal-resolution observations over China. However, challenges remain in effectively leveraging the FY-4A satellite data for high-temporal-resolution PM<sub>2.5</sub> concentration estimation, particularly regarding the unclear key parameters required for accurate estimation and the limited interpretability of models. This study utilizes an interpretable deep learning framework that integrates FY-4A Top-of-Atmosphere (TOA) reflectance data, meteorological variables, and auxiliary data to estimate surface high-temporal-resolution PM<sub>2.5</sub> concentrations from 2019 to 2023. A multicollinearity test was applied to optimize feature selection, while the SHapley Additive exPlanations (SHAP) method was used to enhance model interpretability. The results indicate that parameters such as TOA02, TOA03, TOA04, and boundary layer height (BLH) significantly influence model performance across years. The model demonstrates strong predictive ability in the Beijing–Tianjin–Hebei (BTH) region, achieving an average R<sup>2</sup> of 0.83. Root mean square error (RMSE) values remained below 15 µg/m<sup>3</sup>, aligning well with ground-based monitoring data. These findings demonstrate that combining high-temporal-resolution satellite data with interpretable deep learning provides a reliable approach for long-term, high-temporal-resolution PM<sub>2.5</sub> monitoring across such regions.
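A multicollinearity test of the kind described above is commonly done with variance inflation factors (VIFs). A minimal NumPy sketch, where the synthetic feature matrix stands in for the TOA/meteorological predictors (the data and the usual VIF > 10 rule of thumb are illustrative, not the paper's exact procedure):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of feature matrix X.

    VIF_j = 1 / (1 - R^2_j), where R^2_j comes from regressing
    column j on the remaining columns (with an intercept).
    """
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        y = X[:, j]
        A = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        r2 = 1.0 - resid.var() / y.var()
        out[j] = 1.0 / max(1.0 - r2, 1e-12)  # guard against division by zero
    return out

rng = np.random.default_rng(0)
a = rng.normal(size=500)
b = rng.normal(size=500)
X = np.column_stack([a, b, a + 0.01 * rng.normal(size=500)])  # col 2 nearly duplicates col 0
print(vif(X))  # columns 0 and 2 show large VIFs; column 1 stays near 1
```

Features with inflated VIFs would be dropped before training, leaving SHAP to explain the remaining, less redundant inputs.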
Valentino Petrić, Nikolina Račić, Ivana Hrga
et al.
Accurate, high-resolution air quality data are crucial for understanding environmental health risks; however, the cost and complexity of maintaining dense, reference-grade monitoring networks remain a significant barrier. This study presents the first city-wide evaluation of next-generation air quality sensors in Zagreb, Croatia, involving 35 sensor locations, one local reference-grade station, and three national reference stations that measure PM<sub>10</sub> and NO<sub>2</sub>. Sensor performance was evaluated against reference data under various meteorological and temporal conditions. To better understand sensor drift and measurement bias, we developed machine learning (ML) calibration models (XGBoost) using spatiotemporal features, ERA5 meteorological variables, and traffic proxy indicators. The models significantly improved accuracy, reducing the root mean squared error (RMSE) by up to 82%, with the greatest improvements observed during pollution peaks. A rolling RMSE approach was introduced to track model degradation over time, revealing that recalibration was typically needed within 1–6 months. Our findings demonstrate that, with proper calibration and maintenance, sensor networks can serve as reliable and scalable tools for urban air quality monitoring, capable of supporting both public health assessments and informed decision-making.
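The rolling-RMSE idea, recomputing the error over a sliding window so that a rising curve flags calibration drift, can be sketched in a few lines. The synthetic drifting sensor below is a made-up example, not the Zagreb data:

```python
import numpy as np

def rolling_rmse(y_true, y_pred, window):
    """RMSE over a sliding window; a rising curve flags calibration drift."""
    err2 = (np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)) ** 2
    kernel = np.ones(window) / window
    # moving average of squared errors, then square root
    return np.sqrt(np.convolve(err2, kernel, mode="valid"))

# synthetic sensor that slowly drifts away from the reference
t = np.arange(365)
reference = 20 + 5 * np.sin(2 * np.pi * t / 90)  # hypothetical reference series
sensor = reference + 0.02 * t                    # bias grows a little each day
curve = rolling_rmse(reference, sensor, window=30)
print(curve[0], curve[-1])  # error grows markedly by year's end
```

In practice the curve would be monitored against a threshold; crossing it triggers the recalibration the study found necessary every 1–6 months.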
Abstract Globally, the 2022 Pakistan mega-flood displaced over 33 million people and incurred economic losses exceeding $40 billion. By coupling seventy years of historical flood data with advanced machine learning techniques (GeoPINS within FloodCast), this study quantifies the event’s primary drivers and projects future risk under climate change. Results show that the 2022 monsoon, amplified by low-pressure systems, delivered 7–8 times the 1990–2020 mean rainfall, flooding over 2100 streams and breaching 177 check dams. In Balochistan alone, these dam failures caused 80–85% of the province’s economic losses. Spatial–spectral analysis reveals that monsoon intensification, infrastructural vulnerability, and orographic forcing collectively govern inundation patterns. Under the SSP5 scenario, the area of high flood risk zones is projected to expand by 6.62% by 2080, even when modeling data-scarce regions. These findings underscore an urgent need for climate-resilient dam design, strategic sediment management, and adaptive flood-risk governance in similarly vulnerable areas.
Meteorology, Climatology, Disasters and Engineering
Abstract. The year-over-year changes in economic growth across the Caribbean Antilles islands demonstrate sensitivity to climatic conditions. Daily wind and rainfall exceedances from passing storms are negatively related to the gross domestic product (GDP). Field regression of the GDP time series from 1971 to 2022 for Puerto Rico and the neighboring Antilles islands reveals links with eastern Pacific sea temperature. A zonal overturning atmospheric circulation over the equatorial Atlantic emerges in composite analysis. Alternating at an approximate 7-year interval, it modulates weather events and economic prosperity in the Caribbean. A multivariate algorithm is developed to predict changes in the annual GDP growth rate. The most influential predictor is precipitable water in the equatorial Atlantic 1 year earlier. Reduced moisture overlain by westerly winds in a global bottleneck at 5° S–5° N, 20–40° W tends to suppress Caribbean storms, leading to economic prosperity in the following year. Statistical methods and risk-reduction strategies are outlined.
Time series forecasting is a critical task in domains such as energy, finance, and meteorology, where accurate long-term predictions are essential. While Transformer-based models have shown promise in capturing temporal dependencies, their application to extended sequences is limited by computational inefficiency and poor generalization. In this study, we propose KEDformer, a knowledge extraction-driven framework that integrates seasonal-trend decomposition to address these challenges. KEDformer leverages knowledge extraction methods that focus on the most informative weights within the self-attention mechanism to reduce computational overhead. Additionally, the proposed KEDformer framework decouples time series into seasonal and trend components. This decomposition enhances the model's ability to capture both short-term fluctuations and long-term patterns. Extensive experiments on five public datasets from the energy, transportation, and weather domains demonstrate the effectiveness and competitiveness of KEDformer, providing an efficient solution for long-term time series forecasting.
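Seasonal-trend decomposition of the kind KEDformer builds on can be sketched with a simple moving-average split. This is a generic illustration, not the paper's exact decomposition; the period and series are made up, and the plain moving average here skips the usual 2×m centering used for even periods:

```python
import numpy as np

def decompose(x, period):
    """Split a series into a moving-average trend and a periodic seasonal
    profile (plain averaging; production code would centre the window)."""
    x = np.asarray(x, dtype=float)
    trend = np.convolve(x, np.ones(period) / period, mode="same")
    detrended = x - trend
    # average each phase of the cycle across all repetitions
    profile = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(profile, len(x) // period + 1)[: len(x)]
    return trend, seasonal

t = np.arange(240)
series = 0.05 * t + np.sin(2 * np.pi * t / 12)  # linear trend + 12-step cycle
trend, seasonal = decompose(series, period=12)
```

Each component can then be modeled separately, which is what lets the decomposed framework capture short-term fluctuations and long-term patterns with different mechanisms.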
The generalized recurrence plot is a modern tool for the quantification of complex spatial patterns. Its application spans the analysis of trabecular bone structures, Turing patterns, turbulent spatial plankton patterns, and fractals. Determinism is a central measure in this framework, quantifying the level of regularity of spatial structures. We show, using basic examples of fully regular patterns of different symmetries, that this measure underestimates the orderliness of circular patterns resulting from rotational symmetries. We overcome this crucial problem by checking additional structural elements of the generalized recurrence plot, which is demonstrated with the examples. Furthermore, we show the potential of the extended determinism quantity by applying it to more irregular circular patterns, which are generated by the complex Ginzburg-Landau equation and which can often be observed in real spatially extended dynamical systems. Thus, analyzing single snapshots of the real part only, we are able to reconstruct the main separations of the system's parameter space, in contrast to the use of the original quantity. This ability of the proposed method also promises an improved description of other systems with complicated spatio-temporal dynamics typically occurring in fluid dynamics, climatology, biology, ecology, social sciences, etc.
Alberto Bassanoni, Alessandro Vezzani, Raffaella Burioni
We study rare events in the extreme value statistics of stochastic symmetric jump processes with power tails in the distributions of the jumps, using the big-jump principle. The principle states that, for stochastic processes with power-tail statistics, if at a certain time a physical quantity takes on a value much larger than its typical value, this large fluctuation is realised through a single macroscopic jump that exceeds the typical scale of the process by several orders of magnitude. In particular, our estimation focuses on the asymptotic behaviour of the tail of the probability distribution of maxima, a fundamental quantity in a wide class of stochastic models used in chemistry to estimate reaction thresholds, in climatology for earthquake risk assessment, in finance for portfolio management, and in ecology for the collective behaviour of species. We determine the analytical form of the probability distribution of rare events in the extreme value statistics of three jump processes with power tails: Lévy flights, Lévy walks and the Lévy-Lorentz gas. For Lévy flights, we re-obtain recent analytical results through the big-jump approach, extending their validity. For the Lévy-Lorentz gas, we show that the topology of the disordered lattice along which the walker moves induces memory effects in its dynamics, which influence the extreme value statistics. Our results are confirmed by extensive numerical simulations.
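A quick way to see the big-jump principle at work is a Monte Carlo check: simulate symmetric power-tail jump processes and verify that, among the realizations with the most extreme maxima, a single jump supplies almost the whole maximum. A toy sketch only; the tail exponent, walk length, and sample sizes are arbitrary choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n, alpha, runs = 1000, 1.5, 2000

# symmetric power-tail jumps: Pareto magnitudes (tail ~ x^-alpha) with random signs
mags = rng.pareto(alpha, size=(runs, n)) + 1.0
signs = rng.choice([-1.0, 1.0], size=(runs, n))
walks = np.cumsum(mags * signs, axis=1)

maxima = walks.max(axis=1)   # running maximum of each walk
biggest = mags.max(axis=1)   # largest single jump in each walk

# among the 50 most extreme realizations, the single biggest jump
# should account for the bulk of the recorded maximum
top = np.argsort(maxima)[-50:]
ratio = np.median(biggest[top] / maxima[top])
print(ratio)  # close to 1 under the big-jump principle
```

For realizations with typical maxima the ratio is well below 1; it is only in the rare-event tail that one macroscopic jump dominates, which is exactly the regime the principle describes.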
A Rydberg atom, which exhibits a strong response to weak electric (E) fields, is regarded as a promising atomic receiver that could surpass the sensitivity of conventional receivers. However, its sensitivity is strongly limited by noise at both the classical and quantum levels, and enhancing it significantly remains challenging. Here we experimentally prove that the sensitivity of a Rydberg atomic receiver can be increased to 23 nV/cm/Hz<sup>1/2</sup> by combining laser arrays. Theoretically, we demonstrate that multiple beams illuminating a single PD perform better than multiple PDs for laser arrays. In our experiment, a 10 dB SNR enhancement is achieved by utilizing 2 × 2 probe beam arrays, compared to the performance of a single laser beam, and it can be enhanced further simply by adding a resonator. The results could offer an avenue for the design and optimization of ultrahigh-sensitivity Rydberg atomic receivers and promote applications in cosmology, meteorology, communication, and microwave quantum technology.
Mélanie Ngutuka Kinzunga, Daniel M. Westervelt, Daniel Matondo Masisa
et al.
Background: Ambient air pollution remains a major risk factor for population health worldwide. The impact of PM<sub>2.5</sub> air pollution is underestimated in sub-Saharan Africa due to a lack of epidemiological studies. AirQ+ is proposed to reduce these inequalities in research. The aim of this study is to assess, by AirQ+, the impact of prolonged exposure to PM<sub>2.5</sub> on respiratory health in Kinshasa in 2019, and to estimate the health benefits of reducing this air pollution. Methods: Population and mortality data were obtained from the Institut National de la Statistique and the Institut de Métrologie et d’Évaluation en santé, respectively. PM<sub>2.5</sub> concentrations were measured using PurpleAir PA-II-SD sensors, and the average annual concentration was 43.5 µg/m<sup>3</sup> in 2019. AirQ+ was used to estimate the health effects attributable to PM<sub>2.5</sub> in adults aged over 25 in Kinshasa. Results: In 2019, the proportion of deaths attributable to PM<sub>2.5</sub> air pollution was 30.72% for ALRI, 26.55% for COPD and 24.32% for lung cancers. Each 10% reduction in current PM<sub>2.5</sub> levels would prevent 1093 deaths (from all causes) per year in Kinshasa. Life expectancy would increase by 4.7 years (CI 3.5–5.3) if the WHO threshold of 5 µg/m<sup>3</sup> were respected. Conclusions: The results of this study highlight the major respiratory public health problem associated with air pollution by fine particles in Kinshasa. AirQ+ was used to assess the impact of prolonged exposure to PM<sub>2.5</sub> on respiratory deaths among adults in Kinshasa and revealed that many of these deaths could be avoided by improving air quality.
Rural development has the potential to improve the well-being of villagers, but it may also impact local plant diversity. In addition, plant diversity differs across various subareas of rural development efforts in scenic areas, reflecting variations along the center-periphery gradient. To explore the relationship between economic development, plant diversity, and villagers’ well-being in villages within scenic areas during the rural development process, this study analyzed plant species data and satellite remote sensing images between 1984 and 2021, focusing on changes in land-use types in the Hangzhou West Lake Scenic Area and their impacts on plant diversity in the central, middle, and peripheral scenic areas. The results indicated that the area of rural construction land decreased, compared to 1984, while other land-use types showed varying degrees of increase. Specifically, part of the forest in the peripheral scenic area was transformed into tea fields, resulting in an observable increase in the overall area of tea fields. Moreover, the number of plant species decreased along the center-periphery gradient of the scenic area. The greatest differences in the number of plant species and increases in the number of invasive plants were found in the peripheral scenic area, reflecting the greater impact of rural development in this region. Additionally, the number of rare and endangered plant species increased the most in the central scenic area in recent years, which was related to the use of urban green space to protect such plants. Ellenberg’s ecological indicator values (EIVs) indicated an increase in the number of species preferring shady locations and acidic soils in the scenic areas.
The changes in land use and production and business activity strategies in the villages of the West Lake Scenic Area improved the villagers’ well-being, and the resulting factors such as the introduction of non-indigenous plants and environmental filtering changed local plant diversity. Therefore, the administrators should carefully consider the trade-off between conserving biodiversity and enhancing the well-being of the local villagers. The findings offer evidence of how different economic development models over time can influence rural biodiversity and villagers’ well-being, providing a reference for sustainable development in scenic spot villages.
Multivariate time series have many applications, from healthcare and meteorology to life science. Although deep learning models have shown excellent predictive performance for time series, they have been criticised for being "black-boxes" or non-interpretable. This paper proposes a novel modular neural network model for multivariate time series prediction that is interpretable by construction. A recurrent neural network learns the temporal dependencies in the data, while an attention-based feature selection component selects the most relevant features and suppresses redundant features used in the learning of the temporal dependencies. A modular deep network is trained from the selected features independently to show the users how features influence outcomes, making the model interpretable. Experimental results show that this approach can outperform state-of-the-art interpretable Neural Additive Models (NAM) and variations thereof in both regression and classification tasks on time series, achieving predictive performance comparable to the top non-interpretable methods for time series, LSTM and XGBoost.
The increasing frequency and scale of wildfires carry significant ecological, socioeconomic, and environmental implications, prompting the need for a deeper grasp of wildfire characteristics. Essential meteorological factors like temperature, humidity, and precipitation wield a crucial impact on fire behavior and the estimation of burned areas. This study aims to unravel the interconnections between meteorological conditions and fire attributes within the Salmon-Challis National Forest located in east-central Idaho, USA. Through the utilization of remotely sensed data from the Fire Monitoring, Mapping, and Modeling system (Fire M3) alongside meteorological variables recorded between 2010 and 2020, an exploration is conducted into varied meteorological patterns associated with wildfire events. By integrating the computed burned area into the clustering process, valuable insights are gained into the specific influences of fire weather conditions on the extent of burned areas. The Salmon-Challis National Forest, spanning more than 4.3 million acres and containing the largest wilderness area in the Continental United States, emerges as a pivotal research site for wildfire investigations. This work elucidates the data attributes employed for clustering and visualization, along with the algorithms employed. Additionally, the study presents research findings and delineates potential future applications, ultimately contributing to the advancement of fire management and mitigation strategies in regions prone to wildfires.
In this article, we study the test for independence of two random elements $X$ and $Y$ lying in an infinite dimensional space ${\cal{H}}$ (specifically, a real separable Hilbert space equipped with the inner product $\langle ., .\rangle_{\cal{H}}$). In the course of this study, a measure of association is proposed based on the sup-norm difference between the joint probability density function of the bivariate random vector $(\langle l_{1}, X \rangle_{\cal{H}}, \langle l_{2}, Y \rangle_{\cal{H}})$ and the product of marginal probability density functions of the random variables $\langle l_{1}, X \rangle_{\cal{H}}$ and $\langle l_{2}, Y \rangle_{\cal{H}}$, where $l_{1}\in{\cal{H}}$ and $l_{2}\in{\cal{H}}$ are two arbitrary elements. It is established that the proposed measure of association equals zero if and only if the random elements are independent. In order to carry out the test whether $X$ and $Y$ are independent or not, the sample version of the proposed measure of association is considered as the test statistic after appropriate normalization, and the asymptotic distributions of the test statistic under the null and the local alternatives are derived. The performance of the new test is investigated for simulated data sets and the practicability of the test is shown for three real data sets related to climatology, biological science and chemical science.
An earlier study evaluating the dust life cycle in the Energy Exascale Earth System Model (E3SM) Atmosphere Model version 1 (EAMv1) has revealed that the simulated global mean dust lifetime is substantially shorter when higher vertical resolution is used, primarily due to significant strengthening of dust dry removal in source regions. This paper demonstrates that the sequential splitting of aerosol emissions, dry removal, and turbulent mixing in the model's time integration loop, especially the calculation of dry removal after surface emissions and before turbulent mixing, is the primary reason for the vertical resolution sensitivity reported in that earlier study. Based on this reasoning, we propose a simple revision to the numerical process coupling scheme, which moves the application of the surface emissions to after dry removal and before turbulent mixing. The revised scheme allows newly emitted particles to be transported aloft by turbulence before being removed from the atmosphere, and hence better resembles the dust life cycle in the real world. Sensitivity experiments are conducted and analyzed to evaluate the impact of the revised coupling on the simulated aerosol climatology in EAMv1.
Abstract Drought‐related risks pose a threat to the agricultural sector of Guyana despite the country's wealth of freshwater resources. As a result, an advanced understanding of soil moisture deficits as a means of forecasting agricultural drought is needed to aid farmers, extension officers, and other agricultural decision‐makers. Hence, this study has been motivated by the following research question: Can the Tropical Applications of Meteorology using SATellite data—AgriculturaL Early waRning sysTem (TAMSAT‐ALERT) be used to assess the meteorological risk to cultivation at key points in the growing season? Due to the absence of in situ soil moisture data for the area of study, the Joint UK Land and Environment Simulator (JULES) model was used to model historical soil moisture, based on gauge precipitation data and NCEP reanalysis data. A case study of the 1997 rice growing seasons was conducted to determine whether TAMSAT‐ALERT can detect drought‐related risks during the growing season of the crop. The skill of the TAMSAT‐ALERT drought forecasting system was found to be highly dependent on the land surface state at the initialization of the forecast period. Therefore, the meteorological conditions over the area of interest, mainly precipitation, in the months or weeks leading up to the initialization of the forecast have a strong influence on the soil moisture at that time.
Carlos Misael Madrid Padilla, Daren Wang, Zifeng Zhao
et al.
We study the problem of change-point detection and localisation for functional data sequentially observed on a general d-dimensional space, where we allow the functional curves to be either sparsely or densely sampled. Data of this form naturally arise in a wide range of applications such as biology, neuroscience, climatology, and finance. To achieve such a task, we propose a kernel-based algorithm named functional seeded binary segmentation (FSBS). FSBS is computationally efficient, can handle discretely observed functional data, and is theoretically sound for heavy-tailed and temporally-dependent observations. Moreover, FSBS works for a general d-dimensional domain, which is the first in the literature of change-point estimation for functional data. We show the consistency of FSBS for multiple change-point estimations and further provide a sharp localisation error rate, which reveals an interesting phase transition phenomenon depending on the number of functional curves observed and the sampling frequency for each curve. Extensive numerical experiments illustrate the effectiveness of FSBS and its advantage over existing methods in the literature under various settings. A real data application is further conducted, where FSBS localises change-points of sea surface temperature patterns in the south Pacific attributed to El Niño.
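The "seeded" part of seeded binary segmentation refers to a deterministic, multiscale collection of search intervals over which a change-point statistic is later evaluated. A minimal sketch of such an interval generator, a generic construction in the spirit of the method rather than the authors' exact parameterization:

```python
def seeded_intervals(n, decay=2 ** 0.5):
    """Deterministic multiscale search intervals: at each scale, intervals
    of a fixed length are laid out with 50% overlap, then the length
    shrinks geometrically until it reaches the minimal size of 2."""
    intervals = set()
    length = float(n)
    while length >= 2:
        shift = length / 2  # 50% overlap between neighbouring intervals
        pos = 0.0
        while pos + length <= n + 1e-9:
            intervals.add((int(round(pos)), int(round(pos + length))))
            pos += shift
        length /= decay
    return sorted(intervals)

print(len(seeded_intervals(100)))  # O(n) intervals covering all scales
```

Because the interval set has near-linear size yet covers every scale and location, scanning a CUSUM-type statistic over it stays computationally cheap, which is one source of the efficiency claimed for FSBS.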
This paper attempts to illustrate the complexity of thermal infrared (TIR) data analysis for urban heat island studies. While a certain shift toward correct scientific nomenclature (using the term “surface urban heat island”) can be observed, the literature is full of incorrect conclusions and results based on erroneous terminology. This seems to stem from the apparent ease of such studies, which implicitly suggest that “warm surfaces” result in “high air temperatures” and ultimately draw conclusions for urban planning authorities. It seems that the UHI is easy to measure, easy to explain, easy to find, and easy to illustrate: simply take a TIR image. Due to this apparent simplicity, many authors seem to jump into UHI studies without fully understanding the nature of the phenomenon with respect to its time and spatial scales, its physical processes, and the numerous methodological pitfalls inherent to UHI studies. This paper points out some of these pitfalls, beginning with the proper correction of longwave emission data and the consideration of the source area of a thermal signal in an urban system (which is predominantly at roof level), then demonstrating the physics and interactions of radiation and heat fluxes, especially the importance of the urban storage heat flux, and ending with an examination of examples from the Basel study area in Switzerland. Attention is then turned to the analysis of spatially distributed net radiation during the day and at night as a minimum requirement for urban heat island studies. The use of nocturnal TIR images is particularly recommended, as the satellite data and the UHI phenomenon then cover the same time period.