Abstract A gridded wet-bulb globe temperature (WBGT) climatology for the southeastern United States is presented, emphasizing athlete and worker safety thresholds. Hourly WBGT estimates are derived from the land component of the fifth generation European Centre for Medium-Range Weather Forecasts atmospheric reanalysis (ERA5-Land) and validated using U.S. Climate Reference Network estimates from 1991 to 2020. Validation statistics indicate that, despite a nighttime warm bias across the region, ERA5-Land WBGT estimates are commensurate with estimates derived using in situ data. Across the southeastern United States, evergreen forests and moisture-rich locations such as rivers, lakes, and wetlands typically experience higher WBGT values than drier locations, with high-elevation (≥1000 m) areas experiencing the lowest WBGT values. The locations with the most days and the most hours per year with outdoor activity restrictions for athletes and workers shift as WBGT increases, with peak locations transitioning from southern Texas and southern Florida in May to northern Louisiana, southern Arkansas, and eastern Texas by August. An areal analysis of the most dangerous conditions shows a gradual expansion across the southeast, driven primarily by increases in Texas. There was no discernible spatial shift in the location of black flag conditions aside from interannual variability. The results characterize the spatiotemporal extent of heat exposure, provide a basis for delineating regions at risk for unsafe conditions, and support education and mitigation strategies by healthcare professionals, athletic administrators, and local officials.
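As background for readers unfamiliar with the index, the standard outdoor WBGT weighting from ISO 7243 can be sketched in a few lines. Note the abstract derives WBGT from ERA5-Land fields rather than stating this formula; the sketch and its inputs are illustrative only.

```python
def wbgt_outdoor(t_nwb, t_g, t_a):
    """Wet-bulb globe temperature for outdoor (solar) exposure, deg C,
    per the standard ISO 7243 weighting.

    t_nwb : natural wet-bulb temperature
    t_g   : black-globe temperature
    t_a   : dry-bulb (air) temperature
    """
    return 0.7 * t_nwb + 0.2 * t_g + 0.1 * t_a

# Hypothetical hot, humid afternoon
print(round(wbgt_outdoor(28.0, 45.0, 35.0), 1))  # -> 32.1
```

The heavy weight on the natural wet-bulb term is why humid, moisture-rich locations tend to reach restrictive WBGT thresholds before hotter but drier ones.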
The identification of domain sets whose outcomes belong to predefined subsets can address fundamental risk-assessment challenges in climatology and medicine. Existing approaches to inverse domain estimation require restrictive assumptions, including domain density and continuity of the function near thresholds, as well as large-sample guarantees, which limit their applicability. Moreover, the estimation and coverage depend on a fixed threshold level, which is difficult to determine in advance. Recently, Ren et al. (2024) proved that confidence sets at multiple levels can be constructed simultaneously, non-asymptotically and with the desired confidence, by inverting simultaneous confidence bands (SCBs). Here, we present the SCoRES R package, which implements this approach for estimating the inverse region and the corresponding simultaneous outer and inner confidence regions, along with visualization tools. The package also provides functions for constructing SCBs from regression, functional, and geographical data. To illustrate its broad applicability, we present three detailed examples that demonstrate the SCoRES workflow.
Spectral estimation is an important tool in time series analysis, with applications including economics, astronomy, and climatology. The asymptotic theory for non-parametric estimation is well known, but the development of non-asymptotic theory is still ongoing. Our recent work obtained the first non-asymptotic error bounds on the Bartlett and Welch methods for $L$-mixing stochastic processes. The class of $L$-mixing processes contains common models in time series analysis, including autoregressive processes and measurements of geometrically ergodic Markov chains. Our prior analysis assumes that the process has zero mean. While zero-mean assumptions are common, real-world time series data often have an unknown, non-zero mean. In this work, we derive non-asymptotic error bounds for both Bartlett and Welch estimators for $L$-mixing time series data with unknown means. The obtained error bounds are $O(\frac{1}{\sqrt{k}})$, where $k$ is the number of data segments used in the algorithm, and are tighter than our previous results under the zero-mean assumption.
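A minimal sketch of the Bartlett estimator in the unknown-mean setting, with the sample mean removed before segmenting, may help fix ideas. This is illustrative only, not the paper's implementation; numpy is assumed.

```python
import numpy as np

def bartlett_psd(x, k):
    """Bartlett spectral estimate: the average of k non-overlapping
    segment periodograms. The sample mean is subtracted first,
    mirroring the unknown-mean setting analysed in the paper."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                        # estimate and remove the unknown mean
    m = len(x) // k                         # segment length
    segs = x[:k * m].reshape(k, m)          # k non-overlapping segments
    pgrams = np.abs(np.fft.rfft(segs, axis=1)) ** 2 / m
    return pgrams.mean(axis=0)              # averaged periodogram
```

For unit-variance white noise the estimate is approximately flat at 1 across interior frequency bins, and averaging over more segments (larger `k`) reduces the estimator's variance, consistent with the $O(1/\sqrt{k})$ scaling of the error bounds.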
Dana Looschelders, Andreas Christen, Sue Grimmond
et al.
ABSTRACT Characterizing inter-instrument variability of sensors is crucial for assessing uncertainties in observational campaigns and networks, and for data assimilation. Here, we co-locate six high signal-to-noise ratio Vaisala CL61 lidar-ceilometers for a period of 10 days to quantify instrument-related differences in several observed variables: profiles of attenuated backscatter, its components (parallel- and cross-polarized backscatter) and the volume linear depolarisation ratio (δ), as well as derived cloud variables and mixed-layer height. Analysing intervals between 5 and 60 min, median absolute differences between sensors (AD50) and percentiles (e.g., AD75) are used to quantify instrument-related uncertainties. For backscatter and δ, we differentiate between conditions with rain, clear sky, and clouds. Here we address instrument precision rather than accuracy; accuracy is assumed. The agreement detected between instruments suggests that a distributed measurement network should be capable of providing context for interpreting spatial differences. If instruments measure accurately, it is possible to resolve spatial differences (e.g., urban–rural) in attenuated backscatter, derived cloud variables, and layer heights. However, differences exist and vary with signal-to-noise ratio and atmospheric conditions. For 15 min intervals, the inter-sensor AD50 is 1.9% for total cloud-cover fraction (excluding clear-sky and fully overcast conditions) and 7.3 m for cloud base height. Agreement of all cloud variables is better for boundary-layer clouds (first cloud layer < 4 km agl) than for all five cloud layers recorded by the sensor firmware. The 15 min mixed-layer height AD50 is 0 m and the AD75 is 21.5 m. We show that instrument precipitation flags are in good agreement but do not link closely with ground-level rainfall observations, so an alternative algorithm is proposed.
We provide quality control recommendations for data processing to improve inter‐instrument agreement of cloud variables and mixed‐layer height.
Lateef Adesola Afolabi, Takvor Soukissian, Diego Vicinanza
et al.
The exploitation of renewable energy is essential for mitigating climate change and reducing fossil fuel emissions. Wind energy, the most mature renewable technology, is highly dependent on wind speed, and accurate wind speed prediction substantially supports wind power generation. In this work, various artificial neural networks (ANNs) were developed and evaluated for their wind speed prediction ability using ERA5 historical reanalysis data for four potential Offshore Wind Farm Organized Development Areas in Greece, selected as suitable for floating wind installations. The training period for all the ANNs was 80% of the time series length, and the remaining 20% of the dataset was the testing period. Of all the ANNs examined, the hybrid model combining Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks demonstrated superior forecasting performance compared to the individual models, as evaluated by standard statistical metrics, while it also performed very well at high wind speeds, i.e., greater than 15 m/s. The hybrid model achieved the lowest root mean square errors at all the sites: 0.52 m/s (Crete), 0.59 m/s (Gyaros), 0.49 m/s (Patras), 0.58 m/s (Pilot 1A), and 0.55 m/s (Pilot 1B), with an average coefficient of determination (R<sup>2</sup>) of 97%. Its enhanced accuracy is attributed to the integration of the strengths of the LSTM and GRU components, enabling it to better capture temporal patterns in the wind speed data. These findings underscore the potential of hybrid neural networks for improving wind speed forecasting accuracy and reliability, contributing to more effective integration of wind energy into the power grid and better planning of offshore wind farm energy generation.
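The chronological 80/20 split and RMSE scoring described above can be sketched in plain Python. The persistence forecast below is a hypothetical stand-in baseline, not the paper's hybrid network, and the toy wind series is invented for illustration.

```python
import math

def chrono_split(series, train_frac=0.8):
    """Split a time series chronologically (no shuffling),
    as in the paper's 80%/20% train/test protocol."""
    n = int(len(series) * train_frac)
    return series[:n], series[n:]

def rmse(pred, actual):
    """Root mean square error between forecasts and observations."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(actual))

wind = [5.1, 5.3, 5.0, 5.6, 6.2, 6.0, 5.8, 6.4, 6.1, 6.5]  # toy hourly speeds, m/s
train, test = chrono_split(wind)
# Persistence baseline: forecast each step with the previous observation
pred = [train[-1]] + test[:-1]
print(round(rmse(pred, test), 3))  # -> 0.354
```

A chronological split matters here: shuffling before splitting would leak future wind states into training and overstate forecast skill.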
In July 2023, an extreme rainfall event struck North China during the decaying phase of Typhoon Doksuri, inducing a severe flood that caused extensive damage and loss of life. Our results show that this event was mainly caused by the cross-time-scale interaction between an enhanced anticyclone over the Sea of Japan, the decaying Typhoon Doksuri, and the developing Typhoon Khanun. The anomalous anticyclone exhibits significant 10–30-day and 30–90-day time scales. It interacted with Typhoon Doksuri, providing moisture advection via southeasterly flow. The anticyclone was enhanced by the combination of wave energy dispersed from Typhoons Doksuri and Khanun in the lower troposphere and eastward-propagating Rossby waves in the upper troposphere, which jointly produced the local moisture convergence and vertical motion during the flooding. The 10–30-day and 30–90-day intraseasonal oscillations of the anticyclone thus served as an important initial atmospheric condition, bridging the East Asian summer monsoon and Typhoons Doksuri and Khanun, and pointing to a source of subseasonal predictability for extreme heavy rainfall events in North China.
Understanding the combined influences of meteorological and hydrological factors on water levels and flood events is essential, particularly under today's changing climate. The Transformer, a state-of-the-art deep learning architecture, offers an effective approach to modelling intricate nonlinear processes, enabling the extraction of key features and the prediction of water levels. EXplainable Artificial Intelligence (XAI) methods play an important role in improving our understanding of how different factors affect water levels. In this study, we propose a Transformer variant that integrates a sparse attention mechanism and introduces a nonlinear output layer in the decoder module. The variant is used for multi-step water-level forecasting, considering meteorological and hydrological factors simultaneously. It outperforms the traditional Transformer across lead times on various evaluation metrics. Sensitivity analyses based on XAI demonstrate the significant influence of meteorological factors on water-level evolution, with temperature the most dominant among them. Incorporating both meteorological and hydrological factors is therefore necessary for reliable hydrological prediction and flood prevention. XAI also provides insight into individual predictions, which helps in interpreting the forecasts and assessing their plausibility.
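The abstract does not specify which sparse attention pattern the variant uses; a local-window (banded) mask is one common choice, sketched here in numpy purely as an illustration of the general idea.

```python
import numpy as np

def banded_attention(q, k, v, window=2):
    """Scaled dot-product attention restricted to a local window:
    position i attends only to positions j with |i - j| <= window.
    One common sparse-attention pattern; the paper's exact scheme
    is not stated in the abstract."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    i = np.arange(q.shape[0])[:, None]
    j = np.arange(k.shape[0])[None, :]
    scores = np.where(np.abs(i - j) <= window, scores, -np.inf)  # mask far pairs
    w = np.exp(scores - scores.max(axis=1, keepdims=True))       # stable softmax
    w /= w.sum(axis=1, keepdims=True)
    return w @ v, w
```

Masking with `-inf` before the softmax drives out-of-window weights to exactly zero, so the cost of each attention row scales with the window size rather than the full sequence length.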
Sebastiano Stramaglia, Luca Faes, Jesus M. Cortes
et al.
Transfer Entropy (TE), the primary method for determining directed information flow within a network system, can exhibit bias, either deficient or excessive, during both pairwise and conditioned calculations, owing to high-order dependencies between the dynamic processes under consideration and the remaining processes in the system used for conditioning. Here, we propose a novel approach. Instead of conditioning TE on all network processes except the driver and target, as in its fully conditioned version, or not conditioning at all, as in the pairwise approach, our method searches both for the multiplets of variables that maximize information flow and for those that minimize it. This provides a decomposition of TE into unique, redundant, and synergistic atoms. Our approach enables quantification of the relative importance of high-order effects compared to pure two-body effects in information transfer between two processes, while also highlighting the processes that contribute to building these high-order effects alongside the driver. We demonstrate the application of our approach in climatology by analyzing data on El Niño and the Southern Oscillation.
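To make the quantity being decomposed concrete, here is a minimal pairwise TE estimator for discrete series with history length 1. This illustrates the basic definition only; the paper's multiplet search and unique/redundant/synergistic decomposition are not implemented here.

```python
import math
import random
from collections import Counter

def transfer_entropy(src, tgt):
    """Pairwise transfer entropy TE(src -> tgt), in bits, for discrete
    series with history length 1:
    TE = sum p(y1, y0, x0) * log2( p(y1 | y0, x0) / p(y1 | y0) )."""
    triples = Counter(zip(tgt[1:], tgt[:-1], src[:-1]))  # (y_{t+1}, y_t, x_t)
    pairs = Counter(zip(tgt[1:], tgt[:-1]))              # (y_{t+1}, y_t)
    cond_yx = Counter(zip(tgt[:-1], src[:-1]))           # (y_t, x_t)
    cond_y = Counter(tgt[:-1])                           # y_t
    n = len(tgt) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_y1_given_y0x0 = c / cond_yx[(y0, x0)]
        p_y1_given_y0 = pairs[(y1, y0)] / cond_y[y0]
        te += (c / n) * math.log2(p_y1_given_y0x0 / p_y1_given_y0)
    return te
```

When the target is just a lagged copy of an i.i.d. binary driver, TE in the driving direction approaches 1 bit while the reverse direction stays near zero, which is the asymmetry the fully conditioned and multiplet-based variants refine.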
The sustainable operation of ambient air quality monitoring stations in developing countries is not always possible. Intermittent failures and breakdowns at air quality monitoring stations often interrupt the continuous measurement of data as required, resulting in missing data. This study aimed to impute NO<sub>2</sub>, SO<sub>2</sub>, O<sub>3</sub>, and PM<sub>10</sub> to produce complete data sets of daily average exposures from 2010 to 2017. Models were built for (a) an individual pollutant at a monitoring station, (b) a combined model for the same pollutant from different stations, and (c) a data set with all the pollutants from all the monitoring stations. This study sought to evaluate the efficacy of the Multiple Imputation by Chained Equations (MICE) algorithm in imputing air quality data that are missing at random. Classification and regression trees (CART) analysis, using the MICE package in the R statistical programming language, was compared with the predictive mean matching (PMM) method. The CART method performed better, with pooled R<sup>2</sup> statistics of the imputed data ranging from 0.3 to 0.7, compared to a range of 0.02 to 0.25 for PMM. The MICE algorithm successfully resolved the incompleteness of the data, and it was concluded that the CART method produced more reliable data than the PMM method. However, in this study, the pooled R<sup>2</sup> values were accurate for NO<sub>2</sub> but less so for the other pollutants.
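The chained-equations idea underlying MICE can be illustrated with a minimal numpy sketch: initialise missing entries, then cycle through variables, regressing each on the others and refreshing its missing values. This is an illustrative analogue only; the study itself used the R `mice` package with its CART and PMM methods, neither of which is reproduced here.

```python
import numpy as np

def chained_impute(X, n_iter=10):
    """Minimal chained-equations imputation: fill NaNs with column
    means, then iteratively regress each incomplete column on the
    others by least squares and refresh its missing entries.
    Illustrates the MICE idea only (single imputation, linear models)."""
    X = np.array(X, dtype=float)
    miss = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[miss] = np.take(col_means, np.where(miss)[1])   # crude initial fill
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            if not miss[:, j].any():
                continue
            others = np.delete(X, j, axis=1)
            A = np.column_stack([np.ones(len(X)), others])
            obs = ~miss[:, j]
            beta, *_ = np.linalg.lstsq(A[obs], X[obs, j], rcond=None)
            X[miss[:, j], j] = A[miss[:, j]] @ beta   # refresh imputations
    return X
```

Real MICE additionally draws multiple imputations and pools results across them (hence the pooled R<sup>2</sup> statistics reported above), which this single-imputation sketch omits.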
Studying water droplets is a rich lesson in fluid dynamics, nonlinear systems, and differential equations. Understanding the physical aspects of raindrops informs drop dynamics, rainfall-density estimation, and size distributions, granting insights in meteorology, hydrology, and climate science. This work identifies the real-world significance of developing more accurate atmospheric-attenuation correction algorithms that could help overcome the scattering effect of rain on radio and microwave communication. It presents and compiles historical work on the influence of surface tension in droplet nucleation and formation, raindrop oscillation, burst cycles, and the transformation into parachute-like shapes before fragmentation. The interplay between surface tension and air resistance in raindrop dynamics is also explored.
Paritosh Jha, Sona Chinngaihlian, Priyanka Upreti
et al.
The paper examines the direct and indirect implications of climate-change risk factors for various parameters of agricultural production and productivity in India. India, an emerging economy with considerable dependence on agriculture for both food security and employment generation, offers an important case study for understanding the macroeconomic issues and designing the right set of policy approaches for mitigating the implications of climate change. Furthermore, unlike many other central banks, the Reserve Bank of India (RBI) is closely associated with agriculture owing to its continued credit support to the sector under its priority-sector lending policy and its close involvement in addressing climate change. The paper focuses on the 2010s, the warmest decade on record, and adopts a machine learning approach (a sequential multivariate adaptive regression splines model) to assess the interaction between climate risk factors and agriculture. To the best of our knowledge, this modelling approach is the first of its kind applied to assessing the implications of climate change for agriculture. The results indicate that carbon dioxide (CO2) emissions, precipitation, irrigation water use, and rainfall are the most prominent factors affecting different parameters of agricultural production. These factors are taken at the yearly aggregate level, and their interactions, particularly those involving CO2 emissions, affect the productivity of foodgrains and oilseeds, providing detailed insight into the recent decade of climate change in the context of the Indian economy.
S. Vicente‐Serrano, F. Domínguez‐Castro, T. McVicar
et al.
There is a strong scientific debate on how drought will evolve under future climate change. Climate model outputs project an increase in drought frequency and severity by the end of the 21st century. However, there is large uncertainty about the extent of the global land area that will be affected by enhanced climatological and hydrological droughts. Although climate metrics suggest a likely strong increase in future drought severity, hydrologic metrics do not show a similar signal. Numerous attempts have been made in the literature to explain these differences through several physical mechanisms. This study provides evidence that characterizing drought from different statistical perspectives can lead to unreliable detection of climatological/hydrological droughts in model projections and accordingly give a “false alarm” about the impacts of future climate change. In particular, this study analyses future projections based on different drought metrics and stresses that detecting trends in drought behavior in future projections must account for the extreme character of drought events, by comparing the percentage change in drought magnitude relative to a reference climatological period and by relying on the frequency of events in the tail of the distribution. In addition, the autoregressive character of drought indices makes it necessary to use the same temporal scale when comparing different drought metrics in order to maintain comparability. Taking all these factors into consideration, our study demonstrates that climatological and hydrological drought trends are likely to undergo similar temporal evolution during the 21st century, with almost 30% of the global land area experiencing water deficit under future greenhouse gas emission scenarios. A proper characterization of drought using comparable metrics thus yields smaller differences and more consistent projections of future climatic and hydrologic droughts.
Uncertainty quantification (UQ) in computational chemistry (CC) is still in its infancy. Very few CC methods are designed to provide a confidence level on their predictions, and most users still rely improperly on the mean absolute error as an accuracy metric. The development of reliable uncertainty quantification methods is essential, notably if computational chemistry is to be used confidently in industrial processes. A review of the CC-UQ literature shows that there is no common standard procedure to report or validate prediction uncertainty. I consider here analysis tools based on concepts (calibration and sharpness) developed in meteorology and machine learning for the validation of probabilistic forecasters. These tools are adapted to CC-UQ and applied to datasets of prediction uncertainties provided by composite methods, Bayesian ensemble methods, machine learning, and a posteriori statistical methods.
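The calibration idea can be illustrated with a minimal coverage check: a forecaster that reports an uncertainty sigma with each prediction is calibrated if its nominal prediction intervals contain the true errors at the stated rate. The synthetic errors and sigmas below are invented for illustration; the paper's actual diagnostics are richer than this sketch.

```python
import random

def interval_coverage(errors, sigmas, z=1.96):
    """Fraction of prediction errors falling inside the central ~95%
    Gaussian prediction interval implied by each reported sigma.
    For a calibrated forecaster this should be close to 0.95; the
    mean of the sigmas separately measures sharpness."""
    inside = sum(abs(e) <= z * s for e, s in zip(errors, sigmas))
    return inside / len(errors)

random.seed(0)
sigmas = [random.uniform(0.5, 2.0) for _ in range(2000)]
errors = [random.gauss(0.0, s) for s in sigmas]   # well-calibrated by construction
print(round(interval_coverage(errors, sigmas), 3))
```

An overconfident method (true errors systematically larger than its reported sigmas) would show coverage well below the nominal 0.95, which is exactly the failure mode a calibration check is designed to expose.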
Varun Kulkarni, Venkata Yashasvi Lolla, Suhas Tamvada
et al.
For decades, researchers worldwide have investigated phenomena related to natural and artificial oil leakages, such as oil-drop formation within water bodies, their rise, and oil-slick evolution after they breach the water-air interface. Despite this, the event leading to slick formation, the bursting of oil drops at the liquid-air interface, has remained unexamined thus far. In this work, we investigate this event and report a counterintuitive jetting reversal that releases a daughter oil droplet into the bulk, as opposed to the upward-shooting jets observed in bursting air bubbles. We show that the size of the daughter droplet can be correlated with the bulk liquid properties, and that its formation can be suppressed by increasing the bulk viscosity or by adding microparticles. We further demonstrate the significance of our results by synthesizing colloidal Pickering droplets, and show applications of bursting compound drops in double emulsions and in studies of raindrop impact on a slick. These results could prove transformative for diverse areas, including climatology, oceanic and atmospheric sciences, colloidal synthesis, and drug delivery.
Ocean-driven melt of Antarctic ice shelves is an important control on mass loss from the ice sheet, but is complex to study because melt rates vary significantly both spatially and temporally. Here we assess the strengths and weaknesses of satellite and field-based observations as tools for testing models of ice-shelf melt. We discuss how the complementary use of field, satellite, and model data can be a powerful but underutilised tool for studying melt processes. Finally, we identify some community initiatives working to collate and publish coordinated melt-rate datasets, which can be used in future for validating satellite-derived maps of melt and evaluating processes in numerical simulations.
Iva Hůnová, Pavel Kurfürst, Leona Vlasáková
et al.
Benzo[a]pyrene (BaP), an indicator of polycyclic aromatic hydrocarbons (PAHs) in the atmosphere, is an important ambient air pollutant with significant human health and environmental effects. In the Czech Republic (CR), BaP, together with aerosol and ambient ozone, ranks (with respect to limit value exceedances and the resulting population exposure) among the most problematic air pollutants. The aim of this study is to develop atmospheric deposition patterns of BaP for three years, namely 2012, 2015, and 2019, reflecting different ambient BaP levels. Given the available measurements, we accounted for dry deposition fluxes only, neglecting the wet contribution. We assumed, nevertheless, that real atmospheric deposition is dominated by dry pathways under our conditions, which is supported by measurements from the rural site of Košetice. The dry deposition spatial pattern was constructed using an inferential approach with two input layers, i.e., annual mean ambient air BaP concentrations and a deposition velocity of 0.89 cm·s<sup>−1</sup>. Though our results show an overall decrease in BaP loads over the years, the BaP deposition fluxes, in particular in the broader Ostrava region, remain very high. The presented maps can be considered an acceptable approximation of total BaP deposition and are useful for further detailed analysis of airborne BaP impacts on the environment.
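The inferential approach described above multiplies an annual mean air concentration by a deposition velocity, with care needed over units. A minimal sketch using the study's deposition velocity of 0.89 cm·s<sup>−1</sup>; the 1 ng·m<sup>−3</sup> example concentration is hypothetical.

```python
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000 s

def dry_deposition_flux(conc_ng_m3, v_d_cm_s=0.89):
    """Annual dry deposition flux (ng m^-2 yr^-1) from an annual mean
    air concentration (ng m^-3) and a deposition velocity (cm s^-1),
    i.e. the inferential approach F = C * v_d."""
    v_d_m_s = v_d_cm_s / 100.0          # cm/s -> m/s
    return conc_ng_m3 * v_d_m_s * SECONDS_PER_YEAR

# A hypothetical 1 ng/m3 annual mean corresponds to roughly 0.28 mg m^-2 yr^-1
print(dry_deposition_flux(1.0) / 1e6)   # flux in mg m^-2 yr^-1
```

The unit conversion is the main pitfall: forgetting the cm-to-m step inflates the flux by a factor of 100.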
The world has entered the digital age, in which data analysis and visualization have become increasingly important. In analogy to imaging a real object, we demonstrate that computational ghost imaging can image digital data to reveal their characteristics, such as periodicity. Furthermore, our experimental results show that using optical imaging methods to analyse data offers unique advantages, especially in resistance to interference. Data analysis with computational ghost imaging performs well against strong noise and against random amplitude and phase changes in the binarized signals. Such robust data analysis and imaging has important application prospects in big data analysis, meteorology, astronomy, economics, and many other fields.
As the COVID-19 pandemic spread around the world, lockdown measures were implemented, lowering air pollution levels in several countries. In this context, the air quality changes in the highly populated and trafficked Brazilian states of São Paulo (SP) and Rio de Janeiro (RJ) were addressed using a combination of satellite and ground-based daily data analysis. We explored nitrogen dioxide (NO<inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msub><mrow></mrow><mn>2</mn></msub></semantics></math></inline-formula>) and fine particulate matter (PM<inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msub><mrow></mrow><mrow><mn>2.5</mn></mrow></msub></semantics></math></inline-formula>) daily levels for the month of May from 2015–2020. Daily measurements of NO<inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msub><mrow></mrow><mn>2</mn></msub></semantics></math></inline-formula> column concentrations from the Ozone Monitoring Instrument (OMI) aboard NASA’s Aura satellite were analyzed, and decreases of 42% and 49.6% were found for SP and RJ, respectively, during the year 2020 compared to the 2015–2019 average. In addition to NO<inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msub><mrow></mrow><mn>2</mn></msub></semantics></math></inline-formula> column retrievals, ground-based data measured by the Brazilian States Environmental Institutions were analyzed and correlated with satellite retrievals. Correlation coefficients between year-to-year changes in satellite column and ground-based concentrations were 77% and 53% in SP and RJ, respectively.
Ground-based data showed decreases of 13.3% and 18.8% in NO<inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msub><mrow></mrow><mn>2</mn></msub></semantics></math></inline-formula> levels for SP and RJ, respectively, in 2020 compared to 2019. In SP, no significant change in PM<inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msub><mrow></mrow><mrow><mn>2.5</mn></mrow></msub></semantics></math></inline-formula> was observed in 2020 compared to 2019. To further isolate the effect of emission reductions due to the lockdown, meteorological data and the number of wildfire hotspots were analyzed. NO<inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msub><mrow></mrow><mn>2</mn></msub></semantics></math></inline-formula> concentrations showed negative and positive correlations with wind speed and temperature, respectively. PM<inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msub><mrow></mrow><mrow><mn>2.5</mn></mrow></msub></semantics></math></inline-formula> concentration distributions suggested an influence of wildfires in the southeast region of the country. Synergistic analyses of satellite retrievals, surface-level concentrations, and weather data provide a more complete picture of changes in pollutant levels.