Results for "Earthwork. Foundations"

Showing 20 of ~389,968 results · from DOAJ, arXiv, Semantic Scholar

DOAJ Open Access 2026
Intercomparison of seven collocated ground-based infrared spectrometer radiance observations and retrieved thermodynamic profiles

D. D. Turner, B. Adler et al.

Thermodynamic profiles, especially in the atmospheric boundary layer (ABL), are essential for many research and operational applications. Ground-based infrared spectrometers (IRS) are commercially available, and thermodynamic profiles in the ABL can be retrieved from these observations at 5 min resolution or better. This study deployed seven IRS systems within 5 m of each other in Boulder, Colorado, USA, in September–October 2023, providing an opportunity to evaluate the relative accuracy of the measured radiances from these systems as well as the retrieved thermodynamic profiles. The analysis demonstrates that the observed radiances from the seven instruments agree within 1 % of the ambient radiance in both opaque and more transparent channels. The differences in the spectral calibration between the instruments were smaller than 0.11 cm⁻¹, relative to the nominal effective wavenumber of the metrology laser of 15 799 cm⁻¹ (i.e., better than 7.1 ppm). Further, the retrieved temperature and humidity profiles agree with each other well within the uncertainty of the retrieved profiles, and quantities derived from these thermodynamic profiles such as precipitable water vapor and height of the convective boundary layer also agree within their uncertainties. These results demonstrate a high degree of repeatability and precision, and that if these instruments were deployed as part of a network, any differences larger than the retrieval uncertainty would be associated with real environmental differences and not an artifact of the instrument calibration or retrieval.
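The quoted calibration figure is just the spectral shift expressed as a fraction of the metrology-laser wavenumber; a quick check of the arithmetic, using the two values given in the abstract:

```python
# Spectral-calibration agreement in parts per million:
# fractional error = (largest inter-instrument shift) / (laser wavenumber).
shift_cm1 = 0.11       # maximum spectral calibration difference, cm^-1
laser_cm1 = 15799.0    # nominal effective wavenumber of the metrology laser, cm^-1
ppm = shift_cm1 / laser_cm1 * 1e6
print(round(ppm, 1))   # → 7.0, consistent with the quoted "better than 7.1 ppm"
```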

Environmental engineering, Earthwork. Foundations
arXiv Open Access 2026
Foundations of Quantum Optics for Quantum Information: Crash Course on Nonclassical States and Quantum Correlations

Jhoan Eusse, Esteban Vasquez, Tom Rivlin et al.

Nonclassical states of light and their correlations lie at the heart of quantum optics, serving as fundamental resources that underpin both the exploration of quantum phenomena and the realisation of quantum information protocols. These lecture notes provide an accessible yet rigorous introduction to the foundations of quantum optics, emphasising their relevance to quantum information science and technology. Starting from the quantisation of the electromagnetic field and the bosonic formalism of Fock space, the notes develop a unified framework for describing and analysing quantum states of light. Key families of states -- thermal, coherent, and squeezed -- are introduced as paradigmatic examples illustrating the transition from classical to nonclassical behaviour. The concepts of convexity, classicality, and quasiprobability representations are presented as complementary tools for characterising quantumness and defining operational notions such as P-nonclassicality. The discussion extends naturally to Gaussian states, composite systems, and continuous-variable entanglement, highlighting how nonclassicality serves as a resource for generating and quantifying quantum correlations. Theoretical developments are complemented by computational and experimental perspectives, including simulations of optical states using the Python library Strawberry Fields and data analysis from simulated data. Together, these notes aim to bridge the foundational concepts of quantum optics and modern quantum information, offering both conceptual insight and practical tools for students and researchers entering the field.
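The notes use the Python library Strawberry Fields for their simulations; as a dependency-free taste of the nonclassicality criteria they discuss, the sketch below evaluates the Mandel Q parameter (a standard sub-Poissonian witness, chosen here for illustration and not taken from the notes) for a coherent state and a single-photon Fock state:

```python
import math

def coherent_pn(mean_n, n):
    """P(n) for a coherent state |alpha> with <n> = |alpha|^2 = mean_n (Poissonian)."""
    return math.exp(-mean_n) * mean_n**n / math.factorial(n)

def mandel_q(probs):
    """Mandel Q = (Var(n) - <n>) / <n>; Q = 0 at the Poissonian (classical)
    boundary, Q < 0 signals sub-Poissonian, nonclassical light."""
    mean = sum(n * p for n, p in enumerate(probs))
    var = sum(n * n * p for n, p in enumerate(probs)) - mean * mean
    return (var - mean) / mean

coherent = [coherent_pn(2.0, n) for n in range(60)]  # truncated Fock expansion
print(mandel_q([0.0, 1.0]))  # single-photon Fock state → -1.0 (maximally sub-Poissonian)
# mandel_q(coherent) ≈ 0 up to truncation/rounding error: the classical boundary
```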

en quant-ph
DOAJ Open Access 2025
Quantifying landcover-specific fluxes over a heterogeneous landscape through coupling UAV-measured mixing ratios with a large-eddy simulation model and Eddy-covariance measurements

T. Yazbeck, M. Schlutow, A. Bolek et al.

Many natural ecosystems are composed of heterogeneous patches differentiated by wetness levels and vegetation composition, resulting in fine-scale flux patterns across the different landcovers that can be challenging to quantify. Here, we present a case study at Stordalen Mire in subarctic Sweden, where we conducted Uncrewed Aerial Vehicle (UAV) measurements of CO₂ mole fractions and combine them with a large-eddy simulation (LES) model through a site-level inversion method to differentiate the flux rate signatures from different patch types. We use the LES model EULAG (EUlerian LAGrangian) to simulate high-resolution flow patterns and benchmark the spatial variability of modelled concentrations with data from UAV-based grid surveys of CO₂ mixing ratio. Coupling the inversion results with eddy-covariance (EC) flux measurements for the time of the UAV flight allows quantifying net CO₂ fluxes for the individual landcover types. Model evaluation showed an R² up to 0.70, with model uncertainties mostly related to the transport model uncertainty and the UAV sampling footprint that does not evenly sample landcover types. The inversion fluxes were subsequently compared to patch-level chamber measurements of carbon dioxide from palsa, bog, and fen, and showed a good agreement in flux patterns across those patch types dominating the UAV-sampled footprint. Different landcover classification schemes were considered, and results showed a consistent improvement in the model performance when further representing the ecological and hydrological heterogeneities. Our novel technique shows promising results in estimating landcover-type flux heterogeneity within eddy-covariance tower footprints, thus providing a basis for upscaling of EC fluxes to a larger domain.
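The site-level inversion described here couples modelled transport with observed mole fractions; in its simplest linear form, observed enhancements c relate to per-patch fluxes f through a footprint/sensitivity matrix H, and f is recovered by least squares. A minimal sketch with made-up numbers (the actual EULAG-based method is far richer):

```python
def solve2(A, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(A[1][1] * b[0] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

def invert_fluxes(H, c):
    """Least-squares flux inversion via the normal equations (H^T H) f = H^T c.
    H[i][j]: modelled sensitivity of observation i to landcover patch j;
    c[i]: observed CO2 enhancement for observation i."""
    m = len(H)
    HtH = [[sum(H[k][i] * H[k][j] for k in range(m)) for j in range(2)]
           for i in range(2)]
    Htc = [sum(H[k][i] * c[k] for k in range(m)) for i in range(2)]
    return solve2(HtH, Htc)

# toy footprints: obs 1 sees patch 1, obs 2 sees patch 2, obs 3 sees both
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
c = [2.0, -1.0, 1.0]
print(invert_fluxes(H, c))  # → [2.0, -1.0]: one patch emits, the other takes up
```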

Environmental engineering, Earthwork. Foundations
arXiv Open Access 2025
Universal and Transferable Attacks on Pathology Foundation Models

Yuntian Wang, Xilin Yang, Che-Yung Shen et al.

We introduce Universal and Transferable Adversarial Perturbations (UTAP) for pathology foundation models that reveal critical vulnerabilities in their capabilities. Optimized using deep learning, UTAP comprises a fixed and weak noise pattern that, when added to a pathology image, systematically disrupts the feature representation capabilities of multiple pathology foundation models. Therefore, UTAP induces performance drops in downstream tasks that utilize foundation models, including misclassification across a wide range of unseen data distributions. In addition to compromising the model performance, we demonstrate two key features of UTAP: (1) universality: its perturbation can be applied across diverse field-of-views independent of the dataset that UTAP was developed on, and (2) transferability: its perturbation can successfully degrade the performance of various external, black-box pathology foundation models - never seen before. These two features indicate that UTAP is not a dedicated attack associated with a specific foundation model or image dataset, but rather constitutes a broad threat to various emerging pathology foundation models and their applications. We systematically evaluated UTAP across various state-of-the-art pathology foundation models on multiple datasets, causing a significant drop in their performance with visually imperceptible modifications to the input images using a fixed noise pattern. The development of these potent attacks establishes a critical, high-standard benchmark for model robustness evaluation, highlighting a need for advancing defense mechanisms and potentially providing the necessary assets for adversarial training to ensure the safe and reliable deployment of AI in pathology.
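The attack itself is the addition of one fixed, weak noise pattern to any input image. The sketch below shows only that final application step under a hypothetical L-infinity budget; the hard part, optimizing delta against feature representations, requires the deep-learning setup the paper describes:

```python
def clip(v, lo, hi):
    """Clamp v into the interval [lo, hi]."""
    return max(lo, min(hi, v))

def apply_universal_perturbation(image, delta, eps=4 / 255):
    """Add the same perturbation delta to any image, with each entry of delta
    clipped to [-eps, eps] and pixels kept in the valid [0, 1] range.
    eps = 4/255 is a hypothetical imperceptibility budget, not the paper's value."""
    return [[clip(px + clip(d, -eps, eps), 0.0, 1.0)
             for px, d in zip(img_row, d_row)]
            for img_row, d_row in zip(image, delta)]

img = [[0.2, 0.9], [0.5, 0.0]]          # toy 2x2 grayscale patch
delta = [[0.5, -0.5], [0.01, -0.01]]    # fixed noise pattern (toy values)
adv = apply_universal_perturbation(img, delta)
# every pixel of adv differs from img by at most eps and stays within [0, 1]
```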

en cs.CV, cs.LG
arXiv Open Access 2025
Foundations and Evaluations in NLP

Jungyeul Park

This memoir explores two fundamental aspects of Natural Language Processing (NLP): the creation of linguistic resources and the evaluation of NLP system performance. Over the past decade, my work has focused on developing a morpheme-based annotation scheme for the Korean language that captures linguistic properties from morphology to semantics. This approach has achieved state-of-the-art results in various NLP tasks, including part-of-speech tagging, dependency parsing, and named entity recognition. Additionally, this work provides a comprehensive analysis of segmentation granularity and its critical impact on NLP system performance. In parallel with linguistic resource development, I have proposed a novel evaluation framework, the jp-algorithm, which introduces an alignment-based method to address challenges in preprocessing tasks like tokenization and sentence boundary detection (SBD). Traditional evaluation methods assume identical tokenization and sentence lengths between gold standards and system outputs, limiting their applicability to real-world data. The jp-algorithm overcomes these limitations, enabling robust end-to-end evaluations across a variety of NLP tasks. It enhances accuracy and flexibility by incorporating linear-time alignment while preserving the complexity of traditional evaluation metrics. This memoir provides key insights into the processing of morphologically rich languages, such as Korean, while offering a generalizable framework for evaluating diverse end-to-end NLP systems. My contributions lay the foundation for future developments, with broader implications for multilingual resource development and system evaluation.
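The published jp-algorithm uses a linear-time alignment; as an illustration of the underlying idea only (not the algorithm itself), the toy sketch below scores system tokens against gold tokens by mapping both to character offsets, so the two sides need not share a tokenization or sentence segmentation:

```python
def char_spans(tokens):
    """Map tokens to (start, end) offsets in the whitespace-free text, giving
    gold and system segmentations a common coordinate system."""
    spans, pos = set(), 0
    for tok in tokens:
        spans.add((pos, pos + len(tok)))
        pos += len(tok)
    return spans

def span_f1(gold_tokens, sys_tokens):
    """Token F1 over character spans: a system token counts as correct only
    if its exact span also appears in the gold segmentation."""
    gold, sys = char_spans(gold_tokens), char_spans(sys_tokens)
    tp = len(gold & sys)
    prec = tp / len(sys) if sys else 0.0
    rec = tp / len(gold) if gold else 0.0
    return 2 * prec * rec / (prec + rec) if prec + rec else 0.0

# gold and system disagree on how to segment "data-set"; F1 is still defined
print(round(span_f1(["the", "data-set", "grew"],
                    ["the", "data", "-", "set", "grew"]), 3))  # → 0.5
```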

en cs.CL
arXiv Open Access 2025
Foundation of Affective Computing and Interaction

Changzeng Fu

This book provides a comprehensive exploration of affective computing and human-computer interaction technologies. It begins with the historical development and basic concepts of human-computer interaction, delving into the technical frameworks and practical applications of emotional computing, visual interaction, voice interaction, brain-computer interfaces, physiological electrical signal analysis, and social robotics. The book covers a wide range of topics, including the psychological and neuroscience foundations of emotion, multimodal emotion recognition, emotional expression mechanisms, and the principles of brain-computer interfaces. Key technologies such as affective computing based on discrete emotion theory and dimensional models, visual perception principles, speech recognition and synthesis, EEG signal acquisition and processing, and multimodal emotion recognition are explained in detail. This book also addresses the technical challenges in the field, including multimodal data fusion, privacy and security, and ethical considerations in human-machine relationships. It discusses the applications of these technologies across various domains such as education, healthcare, entertainment, and intelligent assistance. Looking to the future, the book anticipates trends such as the deep integration of artificial intelligence with emotion recognition, the advancement of multimodal interaction technologies, and the development of more personalized and adaptive emotion recognition systems. It emphasizes the importance of balancing technological innovation with ethical considerations to ensure the responsible development and application of affective computing technologies.

en cs.HC
arXiv Open Access 2025
Geometric Foundations of Tuning without Forgetting in Neural ODEs

Erkan Bayram, Mohamed-Ali Belabbas, Tamer Başar

In our earlier work, we introduced the principle of Tuning without Forgetting (TwF) for sequential training of neural ODEs, where training samples are added iteratively and parameters are updated within the subspace of control functions that preserves the end-point mapping at previously learned samples on the manifold of output labels in the first-order approximation sense. In this letter, we prove that this parameter subspace forms a Banach submanifold of finite codimension under nonsingular controls, and we characterize its tangent space. This reveals that TwF corresponds to a continuation/deformation of the control function along the tangent space of this Banach submanifold, providing a theoretical foundation for its exact preservation of the learned mapping (no forgetting) during sequential training, beyond the first-order approximation.

en cs.LG, math.OC
arXiv Open Access 2025
Hypergraph Foundation Model

Yue Gao, Yifan Feng, Shiquan Liu et al.

Hypergraph neural networks (HGNNs) effectively model complex high-order relationships in domains like protein interactions and social networks by connecting multiple vertices through hyperedges, enhancing modeling capabilities, and reducing information loss. Developing foundation models for hypergraphs is challenging due to their distinct data, which includes both vertex features and intricate structural information. We present Hyper-FM, a Hypergraph Foundation Model for multi-domain knowledge extraction, featuring Hierarchical High-Order Neighbor Guided Vertex Knowledge Embedding for vertex feature representation and Hierarchical Multi-Hypergraph Guided Structural Knowledge Extraction for structural information. Additionally, we curate 11 text-attributed hypergraph datasets to advance research between HGNNs and LLMs. Experiments on these datasets show that Hyper-FM outperforms baseline methods by approximately 13.4%, validating our approach. Furthermore, we propose the first scaling law for hypergraph foundation models, demonstrating that increasing domain diversity significantly enhances performance, unlike merely augmenting vertex and hyperedge counts. This underscores the critical role of domain diversity in scaling hypergraph models.
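For readers new to the setting: a hyperedge joins any number of vertices at once, and HGNN-style models alternate vertex-to-hyperedge and hyperedge-to-vertex aggregation. Below is a generic mean-aggregation round illustrating that message-passing pattern, not Hyper-FM's actual architecture:

```python
def hypergraph_round(features, hyperedges):
    """One vertex -> hyperedge -> vertex mean-aggregation round.
    features: one scalar feature per vertex; hyperedges: sets of vertex ids."""
    # each hyperedge averages the features of its member vertices
    edge_feats = [sum(features[v] for v in e) / len(e) for e in hyperedges]
    out = []
    for v in range(len(features)):
        # each vertex averages the features of its incident hyperedges
        incident = [f for f, e in zip(edge_feats, hyperedges) if v in e]
        out.append(sum(incident) / len(incident) if incident else features[v])
    return out

feats = [1.0, 2.0, 3.0, 10.0]
edges = [{0, 1, 2}, {2, 3}]   # a 3-way interaction and a pairwise one
print(hypergraph_round(feats, edges))  # → [2.0, 2.0, 4.25, 6.5]
```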

en cs.LG
DOAJ Open Access 2024
First evaluation of the GEMS glyoxal products against TROPOMI and ground-based measurements

E. S. Ha, R. J. Park, H.-A. Kwon et al.

The Geostationary Environment Monitoring Spectrometer (GEMS) on board the GEO-KOMPSAT-2B satellite is the first geostationary satellite launched to monitor the environment. GEMS conducts hourly measurements during the day over eastern and southeastern Asia. This work presents glyoxal (CHOCHO) vertical column densities (VCDs) retrieved from GEMS, with optimal settings for glyoxal retrieval based on sensitivity tests involving reference spectrum sampling and fitting window selection. We evaluated GEMS glyoxal VCDs by comparing them to the TROPOspheric Monitoring Instrument (TROPOMI) and multi-axis differential optical absorption spectroscopy (MAX-DOAS) ground-based observations. On average, GEMS and TROPOMI VCDs show a spatial correlation coefficient of 0.63, increasing to 0.87 for northeastern Asia. While GEMS and TROPOMI demonstrate similar monthly variations in the Indochinese Peninsula regions (R > 0.67), variations differ in other areas. Specifically, GEMS VCDs are higher in the winter and either lower or comparable to TROPOMI and MAX-DOAS VCDs in the summer across northeastern Asia. We attributed the discrepancies in the monthly variation to a polluted reference spectrum and high NO₂ concentrations. When we correct GEMS glyoxal VCDs as a function of NO₂ SCDs, the monthly correlation coefficients substantially increase from 0.16–0.40 to 0.45–0.72 in high NO₂ regions. When averaged hourly, GEMS and MAX-DOAS VCDs exhibit similar diurnal variations, especially at stations in Japan (Chiba, Kasuga, and Fukue).

Environmental engineering, Earthwork. Foundations
DOAJ Open Access 2024
Airborne lidar measurements of atmospheric CO₂ column concentrations to cloud tops made during the 2017 ASCENDS/ABoVE campaign

J. Mao, J. B. Abshire et al.

We measured the column-averaged atmospheric CO₂ mixing ratio (XCO₂) to a variety of cloud tops with an airborne pulsed multi-wavelength integrated path differential absorption (IPDA) lidar during NASA's 2017 ASCENDS/ABoVE airborne campaign. Measurements of height-resolved atmospheric backscatter profiles allow this lidar to retrieve XCO₂ to cloud tops, as well as to the ground, with accurate knowledge of the photon path length. We validated these measurements with those from an onboard in situ CO₂ sensor during spiral-down maneuvers. These lidar measurements were 2–3 times better than those from previous airborne campaigns due to our using a wavelength step-locked laser transmitter and a high-efficiency detector for this campaign. Precisions of 0.6 parts per million (ppm) were achieved for 10 s average measurements to mid-level clouds and 0.9 ppm to low-level clouds at the top of the planetary boundary layer. This study demonstrates the lidar's capability to fill in XCO₂ measurement gaps in cloudy regions and to help resolve the vertical and horizontal distributions of atmospheric CO₂. Future airborne campaigns and spaceborne missions with this capability can be used to improve atmospheric transport modeling, flux estimation and carbon data assimilation.

Environmental engineering, Earthwork. Foundations
arXiv Open Access 2024
In 1955, Paul Lorenzen clears the sky in foundations of mathematics for Hermann Weyl

Stefan Neuwirth, Henri Lombardi, Thierry Coquand

In 1955, Paul Lorenzen is a mathematician who devotes all his research to foundations of mathematics, on a par with Hans Hermes, but his academic background is algebra in the tradition of Helmut Hasse and Wolfgang Krull. This shift from algebra to logic goes along with his discovery that his "algebraic works [...] have been concerned with a problem that has formally the same structure as the problem of consistency of the classical calculus of logic" (letter to Carl Friedrich Gethmann dated 14 January 1988). After having provided a proof of consistency for arithmetic in 1944 and published it in 1951, Lorenzen inquires still further into the foundations of mathematics and arrives at the conviction that analysis can also be given a predicative foundation. Wilhelm Ackermann as well as Paul Bernays have pointed out to him in 1947 that his views are very close to those proposed by Hermann Weyl in Das Kontinuum (1918): sets are not postulated to exist beforehand; they are being generated in an ongoing process of comprehension. This seems to be the reason for Lorenzen to get into contact with Weyl, who develops a genuine interest in Lorenzen's operative mathematics and welcomes with great enthusiasm his Einführung in die operative Logik und Mathematik (1955), which he studies line by line. This book's aim is to grasp the objects of analysis by means of inductive definitions; the most famous achievement of this enterprise is a generalised inductive formulation of the Cantor-Bendixson theorem that makes it constructive. This mathematical kinship is brutally interrupted by Weyl's death in 1955; a planned visit by Lorenzen to the Institute for Advanced Study in Princeton takes place only in 1957–1958. As told by Kuno Lorenz, Lorenzen's first Ph.D. student, a discussion with Alfred Tarski during this visit provokes a turmoil in Lorenzen's operative research program that leads to his abandonment of language levels and to a great simplification of his presentation of analysis by distinguishing only between "definite" and "indefinite" quantifiers: the former govern domains for which a proof of consistency is available, which secures the use of the law of excluded middle; the latter govern those for which there isn't, e.g. the real numbers. Lorenzen states in his foreword to Differential und Integral (1965) that he is faithful to Weyl's approach of Das Kontinuum in this simplification. This history motivates a number of mathematical and philosophical issues about predicative mathematics: how does Weyl's interest in Lorenzen's operative mathematics fit with his turn to Brouwer's intuitionism as expressed in "Über die neue Grundlagenkrise der Mathematik" (1921)? Why does Lorenzen turn away from his language levels and how does this turn relate to Weyl's conception of predicative mathematics? What do Lorenzen's conceptions of mathematics reveal about Weyl's conceptions?

en math.HO, math.LO
arXiv Open Access 2024
Continuous and algebraic domains in univalent foundations

Tom de Jong, Martín Hötzel Escardó

We develop the theory of continuous and algebraic domains in constructive and predicative univalent foundations, building upon our earlier work on basic domain theory in this setting. That we work predicatively means that we do not assume Voevodsky's propositional resizing axioms. Our work is constructive in the sense that we do not rely on excluded middle or the axiom of (countable) choice. To deal with size issues and give a predicatively suitable definition of continuity of a dcpo, we follow Johnstone and Joyal's work on continuous categories. Adhering to the univalent perspective, we explicitly distinguish between data and property. To ensure that being continuous is a property of a dcpo, we turn to the propositional truncation, although we explain that some care is needed to avoid needing the axiom of choice. We also adapt the notion of a domain-theoretic basis to the predicative setting by imposing suitable smallness conditions, analogous to the categorical concept of an accessible category. All our running examples of continuous dcpos are then actually examples of dcpos with small bases which we show to be well behaved predicatively. In particular, such dcpos are exactly those presented by small ideals. As an application of the theory, we show that Scott's $D_\infty$ model of the untyped $λ$-calculus is an example of an algebraic dcpo with a small basis. Our work is formalised in the Agda proof assistant and its ability to infer universe levels has been invaluable for our purposes.

en cs.LO, math.LO
arXiv Open Access 2024
Foundation Models for Education: Promises and Prospects

Tianlong Xu, Richard Tong, Jing Liang et al.

With the advent of foundation models like ChatGPT, educators are excited about the transformative role that AI might play in propelling the next education revolution. The developing speed and the profound impact of foundation models in various industries force us to think deeply about the changes they will make to education, a domain that is critically important for the future of humans. In this paper, we discuss the strengths of foundation models, such as personalized learning, education inequality, and reasoning capabilities, as well as the development of agent architecture tailored for education, which integrates AI agents with pedagogical frameworks to create adaptive learning environments. Furthermore, we highlight the risks and opportunities of AI overreliance and creativity. Lastly, we envision a future where foundation models in education harmonize human and AI capabilities, fostering a dynamic, inclusive, and adaptive educational ecosystem.

en cs.CY, cs.LG
arXiv Open Access 2024
FedBaF: Federated Learning Aggregation Biased by a Foundation Model

Jong-Ik Park, Srinivasa Pranav, José M. F. Moura et al.

Foundation models are now a major focus of leading technology organizations due to their ability to generalize across diverse tasks. Existing approaches for adapting foundation models to new applications often rely on Federated Learning (FL) and disclose the foundation model weights to clients when using it to initialize the global model. While these methods ensure client data privacy, they compromise model and information security. In this paper, we introduce Federated Learning Aggregation Biased by a Foundation Model (FedBaF), a novel method for dynamically integrating pre-trained foundation model weights during the FL aggregation phase. Unlike conventional methods, FedBaF preserves the confidentiality of the foundation model while still leveraging its power to train more accurate models, especially in non-IID and adversarial scenarios. Our comprehensive experiments use Pre-ResNet and foundation models like Vision Transformer to demonstrate that FedBaF not only matches, but often surpasses the test accuracy of traditional weight initialization methods by up to 11.4% in IID and up to 15.8% in non-IID settings. Additionally, FedBaF applied to a Transformer-based language model significantly reduced perplexity by up to 39.2%.
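A stripped-down reading of the aggregation idea: the server performs an ordinary size-weighted FedAvg over client updates, then mixes in foundation-model weights that it never transmits to clients. This toy sketch uses a hypothetical fixed mixing factor beta; the paper's actual weighting scheme differs:

```python
def biased_aggregate(client_weights, client_sizes, foundation_weights, beta=0.5):
    """FedAvg, then bias toward privately held foundation-model weights.
    beta is a hypothetical mixing factor chosen for illustration only."""
    total = sum(client_sizes)
    # standard size-weighted federated average over client parameters
    fedavg = [sum(w[i] * s for w, s in zip(client_weights, client_sizes)) / total
              for i in range(len(client_weights[0]))]
    # foundation weights enter only here, server-side: clients never see them
    return [(1 - beta) * a + beta * f for a, f in zip(fedavg, foundation_weights)]

clients = [[1.0, 0.0], [3.0, 2.0]]  # two clients' model parameters (toy)
sizes = [1, 3]                      # local dataset sizes
foundation = [10.0, 10.0]           # held by the server only
print(biased_aggregate(clients, sizes, foundation))  # → [6.25, 5.75]
```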

en cs.LG, cs.CR
arXiv Open Access 2024
Foundations of Quantum Contextual Topos: Integrating Modality and Topos Theory in Quantum Logic

Jesse Werbow

This paper introduces the Quantum Contextual Topos (QCT), a novel framework that extends traditional quantum logic by embedding contextual elements within a topos-theoretic structure. This framework seeks to provide a classically-obedient tool for exploring the logical foundations of quantum mechanics. The QCT framework aims to address the limitations of classical quantum logic, particularly its challenges in capturing the dynamic and contextual nature of quantum phenomena. By integrating modal operators and classical propositional logic within a topos structure, the QCT offers a unified approach to modeling quantum systems. The main result of this work is demonstrating that the internal logic of QCT corresponds to a form of classical propositional polymodal logic. We do this by generalizing Stone's Representation Theorem for a specific case of polymodal algebras and their underlying Stone Spaces.

en math.LO, math-ph
arXiv Open Access 2024
Domain theory in univalent foundations I: Directed complete posets and Scott's $D_\infty$

Tom de Jong

We develop domain theory in constructive and predicative univalent foundations (also known as homotopy type theory). That we work predicatively means that we do not assume Voevodsky's propositional resizing axioms. Our work is constructive in the sense that we do not rely on excluded middle or the axiom of (countable) choice. Domain theory studies so-called directed complete posets (dcpos) and Scott continuous maps between them and has applications in a variety of fields, such as programming language semantics, higher-type computability and topology. A common approach to deal with size issues in a predicative foundation is to work with information systems, abstract bases or formal topologies rather than dcpos, and approximable relations rather than Scott continuous functions. In our type-theoretic approach, we instead accept that dcpos may be large and work with type universes to account for this. A priori one might expect that iterative constructions of dcpos may result in a need for ever-increasing universes and are predicatively impossible. We show, through a careful tracking of type universe parameters, that such constructions can be carried out in a predicative setting. In particular, we give a predicative reconstruction of Scott's $D_\infty$ model of the untyped $λ$-calculus. Our work is formalised in the Agda proof assistant and its ability to infer universe levels has been invaluable for our purposes.
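For orientation, the two order-theoretic notions the abstract builds on, stated in their standard classical form (these are the textbook definitions, not the univalent formalisation itself):

```latex
% S \subseteq D is directed: S is inhabited and any two elements
% of S have an upper bound in S:
\forall x, y \in S.\ \exists z \in S.\ x \sqsubseteq z \wedge y \sqsubseteq z

% D is a dcpo when every directed S \subseteq D has a supremum \bigsqcup S in D.
% A monotone f : D \to E is Scott continuous when it preserves directed suprema:
f\Big(\bigsqcup S\Big) = \bigsqcup_{s \in S} f(s)
```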

en cs.LO, math.LO
S2 Open Access 2023
Part I: Selection and Design of Rigid Inclusions

Miriam Smith

The use of rigid inclusions (RIs) as a ground improvement technology has become widely accepted as a cost-effective solution to support structures and embankments on sites with marginal, compressible ground. Rigid inclusion elements are defined as stiff, columnar, slender, vertical, discrete elements typically installed in regularly spaced groups. The term RI can apply to a variety of materials, but the primary focus of this paper is on cementitious, cast-in-place RIs. Thousands of structures throughout the United States are supported on RIs; however, there currently exists no industry-wide standard for the selection, design, and verification of RIs. When evaluating, recommending, and specifying RI ground improvement methods, practitioners should understand the fundamental mechanics involved, as well as the applicability and limitations of RIs. The purpose of this document is to provide commentary guidelines that address general concerns regarding the selection and design of RI systems for ground improvement. The development of this document is the result of a collaborative effort of the Rigid Inclusion Task Force of the Deep Foundations Institute (DFI) Ground Improvement Committee. This document (Part I) describes the fundamental mechanics of RIs for ground improvement and discusses the state-of-the-practice for the design of RIs in the United States. A subsequent document (Part II) will discuss construction and verification considerations of RIs.

DOAJ Open Access 2023
Use of lidar aerosol extinction and backscatter coefficients to estimate cloud condensation nuclei (CCN) concentrations in the southeast Atlantic

E. D. Lenhardt, L. Gao, J. Redemann et al.

Accurately capturing cloud condensation nuclei (CCN) concentrations is key to understanding the aerosol–cloud interactions that continue to feature the highest uncertainty amongst numerous climate forcings. In situ CCN observations are sparse, and most non-polarimetric passive remote sensing techniques are limited to providing column-effective CCN proxies such as total aerosol optical depth (AOD). Lidar measurements, on the other hand, resolve profiles of aerosol extinction and/or backscatter coefficients that are better suited for constraining vertically resolved aerosol optical and microphysical properties. Here we present relationships between aerosol backscatter and extinction coefficients measured by the airborne High Spectral Resolution Lidar 2 (HSRL-2) and in situ measurements of CCN concentrations. The data were obtained during three deployments in the NASA ObseRvations of Aerosols above CLouds and their intEractionS (ORACLES) project, which took place over the southeast Atlantic (SEA) during September 2016, August 2017, and September–October 2018.

Our analysis of spatiotemporally collocated in situ CCN concentrations and HSRL-2 measurements indicates strong linear relationships between both data sets. The correlation is strongest for supersaturations (S) greater than 0.25 % and dry ambient conditions above the stratocumulus deck, where relative humidity (RH) is less than 50 %. We find CCN–HSRL-2 Pearson correlation coefficients between 0.95–0.97 for different parts of the seasonal burning cycle that suggest fundamental similarities in biomass burning aerosol (BBA) microphysical properties. We find that ORACLES campaign-average values of in situ CCN and in situ extinction coefficients are qualitatively similar to those from other regions and aerosol types, demonstrating overall representativeness of our data set. We compute CCN–backscatter and CCN–extinction regressions that can be used to resolve vertical CCN concentrations across entire above-cloud lidar curtains. These lidar-derived CCN concentrations can be used to evaluate model performance, which we illustrate using an example CCN concentration curtain from the Weather Research and Forecasting Model coupled with physics packages from the Community Atmosphere Model version 5 (WRF-CAM5). These results demonstrate the utility of deriving vertically resolved CCN concentrations from lidar observations to expand the spatiotemporal coverage of limited or unavailable in situ observations.
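The CCN–backscatter regressions described here are, at core, linear fits between a lidar-measured coefficient and collocated in situ CCN. A self-contained least-squares sketch with invented numbers (not ORACLES data):

```python
def linear_fit(x, y):
    """Ordinary least squares y ≈ a*x + b, plus the Pearson correlation r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    a = sxy / sxx
    b = my - a * mx
    return a, b, sxy / (sxx * syy) ** 0.5

# hypothetical backscatter coefficients (Mm^-1 sr^-1) vs CCN at S = 0.3 % (cm^-3)
backscatter = [0.5, 1.0, 1.5, 2.0, 2.5]
ccn = [150.0, 290.0, 460.0, 590.0, 750.0]
a, b, r = linear_fit(backscatter, ccn)
# applying a * beta + b down a lidar curtain yields a vertically resolved CCN proxy
print(round(a), round(b), round(r, 3))  # → 300 -2 0.999
```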

Environmental engineering, Earthwork. Foundations
arXiv Open Access 2023
On the categorical foundations of quantum information theory: Categories and the Cramer-Rao inequality

Florio M. Ciaglia, Fabio Di Cosmo, Laura González-Bravo et al.

An extension of Cencov's categorical description of classical inference theory to the domain of quantum systems is presented. It provides a novel categorical foundation for the theory of quantum information that embraces both classical and quantum information theory in a natural way, while also allowing one to formalise the notion of quantum environment. A first application of these ideas is provided by extending the notion of statistical manifold to incorporate categories, and by investigating a possible uniparametric Cramer-Rao inequality in this setting.

en quant-ph, cs.IT
arXiv Open Access 2023
Minding rights: Mapping ethical and legal foundations of 'neurorights'

Sjors Ligthart, Marcello Ienca, Gerben Meynen et al.

The rise of neurotechnologies, especially in combination with AI-based methods for brain data analytics, has given rise to concerns around the protection of mental privacy, mental integrity and cognitive liberty - often framed as 'neurorights' in ethical, legal and policy discussions. Several states are now looking at including 'neurorights' in their constitutional legal frameworks, and international institutions and organizations, such as UNESCO and the Council of Europe, are taking an active interest in developing international policy and governance guidelines on this issue. However, in many discussions of 'neurorights' the philosophical assumptions, ethical frames of reference and legal interpretation are either not made explicit or are in conflict with each other. The aim of this multidisciplinary work is to provide conceptual, ethical and legal foundations that allow for a common minimalist conceptual understanding of mental privacy, mental integrity and cognitive liberty, in order to facilitate scholarly, legal and policy discussions.

en q-bio.NC, cs.CY

Page 28 of 19,499