Results for "Earthwork. Foundations"
Showing 20 of ~638,108 results · from CrossRef, DOAJ, arXiv, Semantic Scholar
Aravind Gollakota, Parikshit Gopalan, Aayush Karan et al.
Given a predictor and a loss function, how well can we predict the loss that the predictor will incur on an input? This is the problem of loss prediction, a key computational task associated with uncertainty estimation for a predictor. In a classification setting, a predictor will typically predict a distribution over labels and hence have its own estimate of the loss that it will incur, given by the entropy of the predicted distribution. Should we trust this estimate? In other words, when does the predictor know what it knows and what it does not know? In this work we study the theoretical foundations of loss prediction. Our main contribution is to establish tight connections between nontrivial loss prediction and certain forms of multicalibration, a multigroup fairness notion that asks for calibrated predictions across computationally identifiable subgroups. Formally, we show that a loss predictor that is able to improve on the self-estimate of a predictor yields a witness to a failure of multicalibration, and vice versa. This has the implication that nontrivial loss prediction is in effect no easier or harder than auditing for multicalibration. We support our theoretical results with experiments that show a robust positive correlation between the multicalibration error of a predictor and the efficacy of training a loss predictor.
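As a quick numerical illustration of the self-estimate described above (a generic NumPy sketch, not the authors' code): for a cross-entropy loss, a predictor's own loss estimate on an input is the entropy of its predicted label distribution, and a loss predictor is nontrivial only if it improves on this baseline.

```python
import numpy as np

def self_estimated_loss(probs: np.ndarray) -> np.ndarray:
    """Entropy of each predicted label distribution: the predictor's own
    estimate of the cross-entropy loss it expects to incur."""
    eps = 1e-12
    return -(probs * np.log(probs + eps)).sum(axis=-1)

def realized_loss(probs: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Cross-entropy loss actually incurred once the true labels are known."""
    eps = 1e-12
    return -np.log(probs[np.arange(len(labels)), labels] + eps)

# Toy check: for a well-calibrated predictor the self-estimate tracks the
# average realized loss; a systematic gap is what a nontrivial loss predictor
# exploits, and (per the paper) witnesses a failure of multicalibration.
probs = np.array([[0.7, 0.2, 0.1], [0.4, 0.4, 0.2]])
labels = np.array([0, 2])
print(self_estimated_loss(probs), realized_loss(probs, labels))
```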
P. Jedlicka, Šimon Kos, Martin Šmíd et al.
As we approach the centennial anniversary of modern quantum mechanics, this paper revisits the foundational debates through a new poll within the research community. Inspired by the survey by Schlosshauer, Kofler, and Zeilinger at the specialized 2011 Quantum Physics and the Nature of Reality conference, we expanded our recruitment to include a more representative sample of the broader community of physicists, with the aim of revealing potential shifts in scientists' views and comparing our findings with those from several previous polls. While quantum foundations still lack a consensus interpretation, our results indicate a persistent preference for the Copenhagen interpretation. This enduring support likely reflects both the educational emphasis on the Copenhagen interpretation and its pragmatic appeal in avoiding complex metaphysical questions and the introduction of new notions (e.g., other worlds or the pilot wave). Our findings thus underscore the relative stability of interpretational preferences over the past decades.
A. G. Barr, J. Landgraf, M. Martinez-Velarte et al.
Accurately measuring greenhouse gas concentrations to identify regional sources and sinks is essential for effectively monitoring and mitigating their impact on the Earth's changing climate. In this article we present the scientific data products of XCO2 and XCH4, retrieved with RemoTeC from the Greenhouse Gases Observing Satellite-2 (GOSAT-2), spanning a time range of 5 years. GOSAT-2 has the capability to measure total columns of CO2 and CH4 to the requirements set by the Global Climate Observing System (GCOS), which defines these requirements as accuracy of < 10 ppb and < 0.5 ppm for XCH4 and XCO2 respectively, and stability of < 3 ppb yr⁻¹ and < 0.5 ppm yr⁻¹ for XCH4 and XCO2 respectively.

Central to the quality of the XCO2 and XCH4 datasets is the post-retrieval quality flagging step. Previous versions of RemoTeC products have relied on threshold filtering, flagging data using boundary conditions from a list of retrieval parameters. We present a novel quality filtering approach utilising a machine learning technique known as Random Forest Classifier (RFC) models. This method is developed under the European Space Agency's (ESA) Climate Change Initiative+ (CCI+) program and applied to data from GOSAT-2. Data from the Total Carbon Column Observing Network (TCCON) are employed to train the RFC models, where retrievals are categorized as good or bad quality based on the bias between GOSAT-2 and TCCON measurements. TCCON is a global network of Fourier transform spectrometers that measure telluric absorption spectra at infrared wavelengths. It serves as the scientific community's standard for validating satellite-derived XCO2 and XCH4 data. Our results demonstrate that the machine learning-based quality filtering achieves a significant improvement, with data yield increasing by up to 85 % and RMSE improving by up to 30 %, compared to traditional threshold-based filtering. Furthermore, inter-comparison with the TROPOspheric Monitoring Instrument (TROPOMI) indicates that the quality filtering RFC models generalise well to the full dataset, as the expected behaviour is reproduced on a global scale.

Low systematic biases are essential for extracting meaningful fluxes from satellite data products. Through TCCON validation we find that all data products are within the breakthrough bias requirements set, with RMSE for XCH4 < 15 ppb and XCO2 < 2 ppm. We derive station-to-station biases of 4.2 ppb and 0.5 ppm for XCH4 and XCO2 respectively, and linear drifts of 0.6 ppb yr⁻¹ and 0.2 ppm yr⁻¹ for XCH4 and XCO2 respectively.

For XCH4, GOSAT-2 and TROPOMI are highly correlated, with standard deviations less than 18 ppb and globally averaged biases close to 0 ppb. The inter-satellite bias between GOSAT and GOSAT-2 is significant, with an average global bias of −15 ppb. This is comparable to that seen between GOSAT and TROPOMI, consistent with our findings that GOSAT-2 and TROPOMI are in close agreement.
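A minimal sketch of this kind of RFC-based quality flagging, assuming a match-up table of retrieval parameters with co-located GOSAT-2 minus TCCON biases; the file name, feature columns, and bias threshold below are hypothetical placeholders, not the values used for the actual product:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical table: one row per retrieval, columns are retrieval parameters
# plus the co-located GOSAT-2 minus TCCON bias (ppb for XCH4).
df = pd.read_csv("gosat2_tccon_matchups.csv")  # assumed file name
features = ["albedo", "aerosol_optical_depth", "chi2", "solar_zenith_angle"]  # assumed columns
df["good_quality"] = df["xch4_bias_ppb"].abs() < 15.0  # assumed labeling threshold

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["good_quality"], test_size=0.3, random_state=0)

rfc = RandomForestClassifier(n_estimators=300, random_state=0)
rfc.fit(X_train, y_train)
print("held-out accuracy:", rfc.score(X_test, y_test))
# Retrievals flagged as good quality would then replace threshold-based filtering.
```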
Robert Ferydouni, Daniel D. Spiegel
We aim to give a self-contained and detailed yet simplified account of the foundations of the theory of double operator integrals, in order to provide an accessible entry point to the theory. We make two new contributions to these foundations: (1) a new proof of the existence of the product of two projection-valued measures, which allows for the definition of the double operator integral for Hilbert-Schmidt operators, and (2) a variant approach to the integral projective tensor product on arbitrary (not necessarily separable) Hilbert spaces using a somewhat more explicit norm than has previously been given. We prove the Daletskii-Krein formula for strongly differentiable perturbations of a densely-defined self-adjoint operator and conclude by reviewing an application of the theory to quantum statistical mechanics.
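For orientation, the Daletskii-Krein formula mentioned above can be stated informally (precise hypotheses as in the paper) as a double operator integral representation of the derivative along the perturbation, where E is the spectral measure of the self-adjoint operator A and the difference quotient is read as f'(λ) on the diagonal:

```latex
\frac{\mathrm{d}}{\mathrm{d}t}\Big|_{t=0} f(A + tB)
  = \int\!\!\int \frac{f(\lambda) - f(\mu)}{\lambda - \mu}\,
    \mathrm{d}E(\lambda)\, B\, \mathrm{d}E(\mu).
```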
Zhipeng Liu, Jaehyung Ju
A mechanical model of a laminated composite ring on a nonreciprocal elastic foundation is a valuable engineering tool during the early design stages of various applications, such as non-pneumatic wheels, flexible bearings, expandable tubulars in oil wells, and vascular stents interacting with blood vessel linings, especially under non-axisymmetric loadings. Despite its importance, limited research has focused on the interaction between laminated composite rings and nonreciprocal elastic foundations. Moreover, no quantitative studies have yet explored the influence of foundation stiffness on the ring deformation. This work aims to develop an analytical framework for a laminated composite ring supported by a nonreciprocal elastic foundation under non-axisymmetric loading conditions. The model generates a design map that correlates the foundation stiffness with the ring deformation, accounting for ring dimensions, laminate layup architecture, and lamina anisotropy. The closed-form solution provides an efficient design tool for analyzing non-axisymmetric and nonuniform loadings at a low computational cost. The resulting design map provides a valuable resource for exploring the interaction between the nonreciprocal foundation and the laminated ring. The proposed analytical framework and design map hold broad potential for applications in automotive, mechanical, civil, and biomedical engineering.
Ana Carolina Condez, Diogo Tavares, João Magalhães
Recent advances in vision-language models have enabled rich semantic understanding across modalities. However, these encoding methods lack the ability to interpret or reason about the moral dimensions of content, a crucial aspect of human cognition. In this paper, we address this gap by introducing MoralCLIP, a novel embedding representation method that extends multimodal learning with explicit moral grounding based on Moral Foundations Theory (MFT). Our approach integrates visual and textual moral cues into a unified embedding space, enabling cross-modal moral alignment. MoralCLIP is grounded in the multi-label dataset Social-Moral Image Database to identify co-occurring moral foundations in visual content. For MoralCLIP training, we design a moral data augmentation strategy to scale our annotated dataset to 15,000 image-text pairs labeled with MFT-aligned dimensions. Our results demonstrate that explicit moral supervision improves both unimodal and multimodal understanding of moral content, establishing a foundation for morally-aware AI systems capable of recognizing and aligning with human moral values.
Biao Li, Qing-Kai Song, Wen-Gang Qi et al.
Predicting the lateral pile response is challenging due to the complexity of pile-soil interactions. Machine learning (ML) techniques have gained considerable attention for their effectiveness in non-linear analysis and prediction. This study develops an interpretable ML-based model for predicting p-y curves of monopile foundations. An XGBoost model was trained using a database compiled from existing research. The results demonstrate that the model achieves superior predictive accuracy. Shapley Additive Explanations (SHAP) was employed to enhance interpretability. The SHAP value distributions for each variable demonstrate strong alignment with established theoretical knowledge on factors affecting the lateral response of pile foundations.
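A rough sketch of the modeling pipeline described above, using the public xgboost and shap packages; the file name, feature columns, and hyperparameters are placeholders rather than those of the study:

```python
import pandas as pd
import xgboost as xgb
import shap
from sklearn.model_selection import train_test_split

# Hypothetical database: pile/soil descriptors as inputs, soil resistance p
# at a given lateral displacement y as the target (column names are assumed).
df = pd.read_csv("py_curve_database.csv")
features = ["pile_diameter", "embedment_depth", "soil_unit_weight",
            "friction_angle", "depth_below_mudline", "displacement_y"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["soil_resistance_p"], test_size=0.2, random_state=0)

model = xgb.XGBRegressor(n_estimators=500, max_depth=5, learning_rate=0.05)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))

# SHAP values attribute each prediction to the input features, which is how
# such a study checks agreement with known drivers of lateral pile response.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
shap.summary_plot(shap_values, X_test)
```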
Sanjog Misra
Foundation models, and in particular large language models, can generate highly informative responses, prompting growing interest in using these "synthetic" outputs as data in empirical research and decision-making. This paper introduces the idea of a foundation prior, which treats model-generated outputs not as real observations but as draws from the prior predictive distribution induced by the foundation prior. As such, synthetic data reflect both the model's learned patterns and the user's subjective priors, expectations, and biases. We model the subjectivity of the generative process by making explicit the dependence of synthetic outputs on the user's anticipated data distribution, the prompt-engineering process, and the trust placed in the foundation model. We derive the foundation prior as an exponentially tilted, generalized Bayesian update of the user's primitive prior, where a trust parameter governs the weight assigned to synthetic data. We then show how synthetic data and the associated foundation prior can be incorporated into standard statistical and econometric workflows, and discuss their use in applications such as refining complex models, informing latent constructs, guiding experimental design, and augmenting random-coefficient and partially linear specifications. By treating generative outputs as structured, explicitly subjective priors rather than as empirical observations, the framework offers a principled way to harness foundation models in empirical work while avoiding the conflation of synthetic "facts" with real data.
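In generic notation (ours, not necessarily the paper's), such an exponentially tilted, generalized Bayesian update can be written as

```latex
\pi_{F}(\theta \mid \tilde{y}) \;\propto\; \pi_{0}(\theta)\, p(\tilde{y} \mid \theta)^{\tau},
\qquad \tau \in [0, 1],
```

where π0 is the user's primitive prior, ỹ is the synthetic data generated by the foundation model, and τ is the trust parameter: τ = 0 ignores the synthetic data entirely, while τ = 1 would weight it like real observations.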
Rishi Bommasani
Artificial intelligence is humanity's most promising technology because of the remarkable capabilities offered by foundation models. Yet the same technology brings confusion and consternation: foundation models are poorly understood and they may precipitate a wide array of harms. This dissertation explains how technology and society coevolve in the age of AI, organized around three themes. First, the conceptual framing: the capabilities, risks, and the supply chain that grounds foundation models in the broader economy. Second, the empirical insights that enrich the conceptual foundations: transparency created via evaluations at the model level and indexes at the organization level. Finally, the transition from understanding to action: superior understanding of the societal impact of foundation models advances evidence-based AI policy. Viewed together, this dissertation makes inroads into achieving better societal outcomes in the age of AI by building the scientific foundations and research-policy interface required for better AI governance.
Y. Liu, Y. Liu, J.-D. Paris et al.
Methane emissions from natural gas systems are increasingly scrutinized, and accurate reporting requires quantification of site- and source-level measurement. We evaluate the performance of 10 available state-of-the-art CH4 emission quantification approaches against a blind controlled-release experiment at an inerted natural gas compressor station in 2021. The experiment consisted of 17 blind 2 h releases at a single exhaust point or multiple simultaneous ones. The controlled releases covered a range of methane flow rates from 0.01 to 50 kg h⁻¹. Measurement platforms included aircraft, drones, trucks, vans, ground-based stations, and handheld systems. Herewith, we compare their respective strengths, weaknesses, and potential complementarity depending on the emission rates and atmospheric conditions. Most systems were able to quantify the releases within an order of magnitude. The level of errors from the different systems was not significantly influenced by release rates larger than 0.1 kg h⁻¹, with much poorer results for the 0.01 kg h⁻¹ release. It was found that handheld optical gas imaging (OGI) cameras underestimated the emissions. In contrast, the "site-level" systems, relying on atmospheric dispersion, tended to overestimate the emission rates. We assess the dependence of emission quantification performance on key parameters such as wind speed, deployment constraints, and measurement duration. At the low wind speeds encountered (below 2 m s⁻¹), the experiments did not reveal a significant dependence on wind speed. The ability to quantify individual sources degraded during multiple-source releases. Compliance with the Oil and Gas Methane Partnership's (OGMP 2.0) highest level of reporting may require a combination of the specific advantages of each measurement technique and will depend on reconciliation approaches. Self-reported uncertainties were either not available or were based on the standard deviation in a series of independent realizations or fixed values from expert judgment or theoretical considerations. For most systems, the overall relative errors estimated in this study are higher than self-reported uncertainties.
K. Ren, H. Gao, H. Gao et al.
The variation trends and characteristics of polar mesospheric clouds (PMCs) are important for studying the evolution of atmospheric systems and understanding various atmospheric dynamic processes. Through observation and analysis of PMCs, we can gain a comprehensive understanding of the mechanisms driving atmospheric processes, providing a scientific basis and support for addressing climate change. Ultraviolet (UV) imaging technology, adopted by the Cloud Imaging and Particle Size (CIPS) instrument on board the Aeronomy of Ice in the Mesosphere (AIM) satellite, has significantly advanced the research on PMCs. Due to the retirement of the AIM satellite, there is currently no concrete plan for next-generation instruments based on the CIPS model, resulting in a discontinuity in the observation data sequence.

In this study, we propose a compact and cost-effective wide-field-of-view ultraviolet imager (WFUI) that can be integrated into various satellite platforms for future PMC observation missions. A forward model was built to evaluate the detection capability and efficiency of the WFUI. CIPS and Solar Occultation for Ice Experiment (SOFIE) data were fused to reconstruct a three-dimensional PMC scene as the input background. Based on the scattering and extinction characteristics of ice particles and atmospheric molecules, the radiative transfer was calculated using the solar radiation path through the atmosphere and PMCs. The optical system and satellite platform parameters of the WFUI were selected according to CIPS, enabling the calculation of the number of photons received by the WFUI. The actual detection signal is then simulated by photoelectric conversion, and the PMC information can be obtained by removing detector noise. Subsequently, a comparison with the input background field was conducted to compute and analyze the detection efficiency. Additionally, a sensitivity analysis of the instrument and platform parameters was conducted.

Simulations were performed for both individual orbits and entire PMC seasons. The research results demonstrate that the WFUI performs well in PMC detection and has high detection efficiency. Statistical analysis of the detection efficiency using data from 2008 to 2012 revealed an exponential relationship between the ice water content (IWC) of PMCs and detection efficiency. During the initial and final durations of the PMC season, when the IWC was relatively low, the detection efficiency remained limited. However, as the season progressed and the IWC increased, the detection efficiency significantly improved. We note that regions at lower latitudes exhibited a lower IWC and, consequently, lower detection efficiency. In contrast, regions at higher latitudes, with a greater IWC, demonstrated better detection efficiency. Additionally, the sensitivity analysis results suggest that increasing the satellite orbit altitude and expanding the field of view (FOV) of the WFUI both contribute to improving the detection efficiency.
Atilla P. Kiraly, Sebastien Baur, Kenneth Philbrick et al.
Robust medical Machine Learning (ML) models have the potential to revolutionize healthcare by accelerating clinical research, improving workflows and outcomes, and producing novel insights or capabilities. Developing such ML models from scratch is cost prohibitive and requires substantial compute, data, and time (e.g., expert labeling). To address these challenges, we introduce Health AI Developer Foundations (HAI-DEF), a suite of pre-trained, domain-specific foundation models, tools, and recipes to accelerate building ML for health applications. The models cover various modalities and domains, including radiology (X-rays and computed tomography), histopathology, dermatological imaging, and audio. These models provide domain specific embeddings that facilitate AI development with less labeled data, shorter training times, and reduced computational costs compared to traditional approaches. In addition, we utilize a common interface and style across these models, and prioritize usability to enable developers to integrate HAI-DEF efficiently. We present model evaluations across various tasks and conclude with a discussion of their application and evaluation, covering the importance of ensuring efficacy, fairness, and equity. Finally, while HAI-DEF and specifically the foundation models lower the barrier to entry for ML in healthcare, we emphasize the importance of validation with problem- and population-specific data for each desired usage setting. This technical report will be updated over time as more modalities and features are added.
Alexander V. Gheorghiu, David J. Pym
The development of logic has largely been through the 'deductive' paradigm: conclusions are inferred from established premisses. However, the use of logic in the context of both human and machine reasoning is typically through the dual 'reductive' perspective: collections of sufficient premisses are generated from putative conclusions. We call this paradigm 'reductive logic'. This expression of logic encompasses reasoning activities as diverse as proving a formula in a formal system and seeking to meet a friend before noon on Saturday. This paper is a semantical analysis of reductive logic. In particular, we provide mathematical foundations for representing and reasoning about 'reduction operators'. Heuristically, reduction operators may be thought of as 'backwards' inference rules. In this paper, we address their mathematical representation, how they are used in the context of reductive reasoning, and, crucially, what makes them 'valid'.
L. Egli, J. Gröbner, H. Schill et al.
This study presents a new total column ozone (TCO) retrieval from the Koherent system, developed at the Physikalisch-Meteorologisches Observatorium Davos, World Radiation Center (PMOD/WRC). The instrument is based on a small, cost-effective, robust, low-maintenance and state-of-the-art-technology array spectroradiometer. It consists of a BTS-2048-UV-S-F array spectroradiometer from Gigahertz-Optik GmbH, coupled with an optical fibre to a lens-based telescope mounted on a sun tracker for measuring direct UV irradiance in the ultraviolet wavelength band between 305 and 345 nm.

Two different algorithms are developed for retrieving TCO from these spectral measurements: (1) TCO retrieved by a least-squares-fit algorithm (LSF) and (2) a custom-double-ratio (CDR) technique using four specifically selected wavelengths from the spectral measurements. The double-ratio technique is analogous to the retrieval algorithm applied for the Dobson and Brewer instruments but is adopted here for TCO retrieval with Koherent. The instrument was calibrated in two different ways: (a) absolute calibration of the spectra using the portable reference for ultraviolet radiation QASUME for the LSF retrieval and (b) relative calibration of the extraterrestrial constant (ETC) of the CDR retrieval by minimising the slope between air mass and the relative differences of TCO from QASUME and Koherent. This adjustment of the ETC allows the instrument to be calibrated with standard TCO reference instruments during calibration campaigns, such as a double-monochromator Brewer.

A 2-year comparison in Davos, Switzerland, between Koherent and the Brewer 156 (double monochromator) shows that TCO derived from Koherent and the Brewer 156 agree, on average, over the entire period within 0.7 % for all retrievals in terms of offset. The performance in terms of slant path depends on the selected retrieval and the applied corrections. The stray-light-corrected LSF retrieval exhibits a smaller slant path dependency than the CDR retrieval and performs almost as well as a double-monochromator system. The slant path dependency of the CDR is comparable to that of a single Brewer monochromator. The combination of both retrievals leads to performance with an offset close to zero compared to Brewer 156, a seasonal amplitude of the relative difference of 0.08 % and a maximum slant path dependency of 1.64 %, which is similar to other standard TCO instruments such as single Brewer or Dobson.

Applying the double-ratio technique by selecting the wavelengths and slit functions from Brewer and Dobson, respectively, allows for the determination of the effective ozone temperature with an uncertainty of 3 K in terms of daily averages. With the improved TCO retrieval, Koherent serves as a new low-maintenance instrument which could also be used to monitor TCO at remote sites. The TCO retrieval presented here may be applied to other array-based spectroradiometers providing direct spectral measurements in the ultraviolet wavelength band.
Darryl Biggar
In regulatory proceedings, few issues are more hotly debated than the cost of capital. This article formalises the theoretical foundation of cost of capital estimation for regulatory purposes. Several common regulatory practices lack a solid foundation in the theory. For example, the common practice of estimating a single cost of capital for the regulated firm suffers from a circularity problem, especially in the context of a multi-year regulatory period. In addition, the relevant cost of debt cannot be estimated using the yield-to-maturity on a corporate bond. We suggest possible directions for reform of cost of capital practices in regulatory proceedings.
Dylan J. Foster, Alexander Rakhlin
These lecture notes give a statistical perspective on the foundations of reinforcement learning and interactive decision making. We present a unifying framework for addressing the exploration-exploitation dilemma using frequentist and Bayesian approaches, with connections and parallels between supervised learning/estimation and decision making as an overarching theme. Special attention is paid to function approximation and flexible model classes such as neural networks. Topics covered include multi-armed and contextual bandits, structured bandits, and reinforcement learning with high-dimensional feedback.
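As a toy illustration of the exploration-exploitation dilemma in the multi-armed bandit setting covered by these notes (a standard UCB1-style sketch, not taken from the notes themselves):

```python
import numpy as np

rng = np.random.default_rng(0)
true_means = np.array([0.2, 0.5, 0.7])       # hypothetical Bernoulli arms
n_arms, horizon = len(true_means), 5000
counts = np.zeros(n_arms)
sums = np.zeros(n_arms)

for t in range(1, horizon + 1):
    if t <= n_arms:                           # play each arm once to initialize
        arm = t - 1
    else:
        means = sums / counts
        bonus = np.sqrt(2.0 * np.log(t) / counts)   # optimism bonus (exploration)
        arm = int(np.argmax(means + bonus))          # exploit + explore
    reward = rng.random() < true_means[arm]
    counts[arm] += 1
    sums[arm] += reward

print("pulls per arm:", counts, "empirical means:", sums / counts)
```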
Chen Gong, Zhuguang Liu
C. M. Nguyen, M. Wolde, A. Battaglia et al.
The dataset collected during the Radar Snow Experiment (RadSnowExp) presents the first-ever airborne triple-frequency radar observations combined with almost perfectly co-located and coincident airborne microphysical measurements from a single platform, the National Research Council Canada (NRC) Convair-580 aircraft. The potential of this dataset is illustrated using data collected from one flight during an Arctic storm, which covers a wide range of snow habits from pristine ice crystals and low-density aggregates to heavily rimed particles with maximum size exceeding 10 mm. Three different flight segments with well-matched in situ and radar measurements were analyzed, giving a total of 49 min of triple-frequency observations. The in situ particle imagery data for this study include high-resolution imagery from the Cloud Particle Imager (CPI) probe, which allows accurate identification of particle types, including rimed crystals and large aggregates, within the dual-frequency ratio (DFR) plane. The airborne triple-frequency radar data are grouped based on the dominant particle compositions and microphysical processes (level of aggregation and riming). The results from this study are consistent with the main findings of previous modeling studies, with specific regions of the DFR plane associated with unique scattering properties of different ice habits, especially in clouds where the radar signal is dominated by large aggregates. Moreover, the analysis shows close relationships between the triple-frequency signatures and cloud microphysical properties (particle characteristic size, bulk density, and level of riming).
D. Summa, D. Summa, F. Madonna et al.
This paper reports results from an inter-comparison effort involving different sensors and models used to measure the atmospheric boundary layer height (ABLH). The effort took place in the framework of the first Special Observing Period of the Hydrological Cycle in the Mediterranean Experiment (HyMeX-SOP1), with the Raman lidar system BASIL deployed in Candillargues (southern France) and operating in almost continuous mode over the period September–November 2012. ABLH estimates were obtained based on the application of the Richardson number technique to Raman lidar and radiosonde measurements and to ECMWF-ERA5 reanalysis data. In the effort we considered radiosondes launched in the proximity of the lidar site, as well as radiosondes launched from the closest radiosonde station included in the Integrated Global Radiosonde Archive (IGRA). The inter-comparison effort also includes ABLH measurements from the wind profiler, which rely on the turbulence method, as well as measurements obtained from elastic backscatter lidar signals. The Richardson number approach applied to the on-site radiosonde data is taken as the reference. Measurements were carried out throughout the month of October 2012. The inter-comparison covers both daytime and night-time data. Results reveal a very good agreement between the different approaches, with values of the correlation coefficient R² for all compared data pairs in the range 0.94–0.98. Values of the slope of the fitting line in the regression analysis are in the range 0.91–1.08 for daytime comparisons and 0.95–1.03 for night-time comparisons, which testifies to the very small biases of all five ABLH estimates with respect to the reference estimate, with slightly smaller bias values found at night. Results also confirm that the combined application of different methods to the sensor and model data allows us to obtain accurate and cross-validated estimates of the ABL height in a variety of weather conditions. Correlations between the ABLH measurements and other atmospheric dynamic and thermodynamic variables, such as CAPE (convective available potential energy), friction velocity and relative humidity, are also evaluated to infer possible mutual dependences.
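A stripped-down sketch of the bulk Richardson number approach used here as the reference, applied to a synthetic radiosonde-like profile; the critical value of 0.25 is a common choice and not necessarily the one adopted in the paper:

```python
import numpy as np

def ablh_bulk_richardson(z, theta_v, u, v, ri_crit=0.25, g=9.81):
    """Return the lowest height where the bulk Richardson number exceeds ri_crit.

    z        heights above ground (m), ascending
    theta_v  virtual potential temperature profile (K)
    u, v     horizontal wind components (m s-1)
    """
    wind2 = u**2 + v**2
    wind2 = np.where(wind2 < 1e-6, 1e-6, wind2)          # avoid division by zero
    ri = (g / theta_v[0]) * (theta_v - theta_v[0]) * (z - z[0]) / wind2
    above = np.where(ri > ri_crit)[0]
    return z[above[0]] if above.size else np.nan

# Toy profile: well-mixed layer below ~800 m capped by a stable layer aloft.
z = np.arange(0, 3000, 50.0)
theta_v = 300.0 + np.where(z < 800, 0.0, 0.004 * (z - 800))
u = np.full_like(z, 5.0)
v = np.zeros_like(z)
print("ABLH estimate:", ablh_bulk_richardson(z, theta_v, u, v), "m")
```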
Page 12 of 31,906