N. A. D. Richards, N. A. Kramarova
et al.
<p>The Ozone Mapping and Profiler Suite Limb Profiler (OMPS LP) was launched onboard the Suomi National Polar-orbiting Partnership (SNPP) satellite in 2011 and began routine science operations in April 2012. The OMPS LP uses measurements of scattered solar radiation at ultraviolet, visible, and near-infrared wavelengths to retrieve high-vertical-resolution profiles of ozone from 12 km (or cloud tops) up to 57 km. In mid-2023, version 2.6 of the OMPS LP ozone profile retrievals was released, featuring improvements in calibration, retrieval algorithm, and data quality. We evaluate OMPS LP version 2.6 ozone retrievals using correlative data from other satellite instruments and ground-based data for the period April 2012 to April 2024. Our results show agreement between OMPS LP and all correlative data sources between 20 and 50 km at all latitudes, with differences of less than 10 %; OMPS generally exhibits a negative bias, except between 32 and 38 km in the tropics and southern mid-latitudes, where the bias is positive. In the tropics and southern mid-latitudes the differences between OMPS LP and MLS, and between OMPS LP and SAGE III/ISS, are less than <span class="inline-formula">±5</span> % between 20 and 45 km. Above 50 km, the agreement with MLS is still on the order of <span class="inline-formula">−</span>5 % or better. Larger positive biases, up to <span class="inline-formula">∼</span> 35 %, are seen in the upper troposphere–lower stratosphere layer (<span class="inline-formula">∼</span> 15 to 20 km) between approximately 40° S and 40° N. We find that OMPS LP version 2.6 ozone exhibits the same seasonal cycle as all correlative measurement sources, and our analysis shows that there is no significant seasonal bias in OMPS LP. 
We find drifts relative to correlative observations at all latitude bands of less than <span class="inline-formula">±2</span> % per decade (<span class="inline-formula">±1</span> % per decade between 25 and 50 km) for the 2012–2024 period, with larger drifts above 50 km and below 20 km. These drifts vary between correlative measurements and straddle the zero line; we therefore conclude that there is no significant systematic drift in OMPS LP version 2.6 ozone for the period 2012 to 2024. The drift results represent an improvement in the long-term stability of version 2.6 ozone over that of version 2.5.</p>
The chapter discusses the foundational impact of modern generative AI models on information access (IA) systems. In contrast to traditional AI, the large-scale training and superior data modeling of generative AI models enable them to produce high-quality, human-like responses, opening new opportunities for the development of IA paradigms. In this chapter, we identify and introduce two of them in detail: information generation and information synthesis. Information generation allows AI to create tailored content addressing user needs directly, enhancing the user experience with immediate, relevant outputs. Information synthesis leverages the ability of generative AI to integrate and reorganize existing information, providing grounded responses and mitigating issues like model hallucination, which is particularly valuable in scenarios requiring precision and external knowledge. This chapter delves into the foundational aspects of generative models, including architecture, scaling, and training, and discusses their applications in multi-modal scenarios. Additionally, it examines the retrieval-augmented generation paradigm and other methods for corpus modeling and understanding, demonstrating how generative AI can enhance information access systems. It also summarizes potential challenges and fruitful directions for future studies.
We reassess foundational aspects of Metric-Affine Gravity (MAG) in light of the Dressing Field Method, a tool for systematically building gauge-invariant field variables. To get MAG started, one has to deal with the problem of "gauge translations". We first recall that Cartan geometry is the proper mathematical foundation for gauge theories of gravity and that this problem never arises in that framework, which nonetheless allows us to clarify the geometric status of gauge translations. Then, we show how the MAG kinematics is obtained via dressing in a technically streamlined way, which highlights that it reduces to a Cartan-geometric kinematics.
<p>Fuel-operated auxiliary heaters (AHs) are potentially significant additional sources of particle- and gas-phase pollution from vehicles, but information on their emissions is scarce. In particular, an understanding of secondary aerosol formation originating from AH exhaust is lacking. In this study, we measured the gas and particle emissions, including secondary emissions, of diesel- and gasoline-operated AHs used in passenger cars. Investigation revealed the importance of peak emissions during start and shutdown events of the heaters and differences between emissions of gasoline- and diesel-fuelled AHs: gasoline-operated AHs also produced particles under steady-state operating conditions, while their diesel counterparts did not. Furthermore, ambient air temperature was observed to impact the emission profiles, with, for example, higher nitrogen oxide (NO<span class="inline-formula"><sub><i>x</i></sub></span>) and particle mass emissions but lower particle number (PN) emissions observed in outdoor (<span class="inline-formula">−</span>19 to <span class="inline-formula">−</span>7 °C) measurements compared to laboratory measurements (<span class="inline-formula">+</span>25 °C). However, further quantification is necessary to fully connect the temperature-related effects and AH emissions. Our findings highlight the importance of also characterizing the atmospherically aged aerosols, specifically secondary organic aerosol (SOA) formation, which was simulated here both by an environmental chamber and by an oxidation flow reactor (OFR). The particle mass in photochemically aged aerosols surpassed the fresh exhaust particulate mass emissions by 1 to 3 orders of magnitude, with the increase depending mainly on fuel, combustion conditions, and ageing methods. 
Further research into formation pathways of secondary aerosols from precursors is still needed, along with the quantification of vehicle AH emissions at the fleet level, to enable the estimation of atmospheric and air quality effects of AH usage.</p>
<p>Shipping is an important source of nitrogen oxide (NO<span class="inline-formula"><sub><i>x</i></sub></span>) emissions worldwide, contributing to air pollution and negatively affecting marine environments, ecosystems, and biodiversity. TROPOMI (TROPOspheric Monitoring Instrument) on board the Sentinel-5 Precursor (S5P) has significantly enhanced the ability to detect ship emissions from space due to its low measurement noise levels and high spatial resolution of <span class="inline-formula">5.5×3.5</span> <span class="inline-formula">km<sup>2</sup></span> at nadir. This study uses the TROPOMI tropospheric NO<span class="inline-formula"><sub>2</sub></span> slant column density (tSCD) to identify global shipping routes qualitatively. Preprocessing techniques, including iterative high-pass and Fourier filtering, markedly improve the detection of shipping lanes, revealing previously undetectable routes. Our analysis examines the impact of high-pass-filter box sizes, demonstrating that smaller sizes enhance the visibility of narrow shipping features, whereas larger box sizes increase overall NO<span class="inline-formula"><sub>2</sub></span> signals. Additionally, we investigate various flagging criteria that affect NO<span class="inline-formula"><sub>2</sub></span> signal distribution, highlighting the critical importance of careful selection for accurate emission monitoring. Filtered TROPOMI NO<span class="inline-formula"><sub>2</sub></span> tSCDs over oceans show a strong correlation with shipping activities, as confirmed by comparison with the CAMS-GLOB-SHIP (Copernicus Atmospheric Monitoring Service for Global Shipping) inventory, and also reveal unknown shipping routes in regions such as the Bering Sea. 
Furthermore, TROPOMI effectively captures NO<span class="inline-formula"><sub>2</sub></span> emissions from offshore oil and gas platforms, with NO<span class="inline-formula"><sub>2</sub></span> hotspots in the TROPOMI data aligning well with locations of offshore installations listed in the OSPAR (Oslo and Paris Commission) and BOEM (Bureau of Ocean Energy Management) inventories. Lastly, the filtered TROPOMI NO<span class="inline-formula"><sub>2</sub></span> tropospheric vertical column densities (tVCDs) are compared with the high-pass-filtered NO<span class="inline-formula"><sub>2</sub></span> tVCDs from the CAMS (Copernicus Atmospheric Monitoring Service) model, which has a coarse spatial resolution of 0.4°. While both datasets effectively identify global shipping lanes, the high-pass-filtered CAMS NO<span class="inline-formula"><sub>2</sub></span> tVCDs are significantly higher than the filtered TROPOMI NO<span class="inline-formula"><sub>2</sub></span> tVCDs in the North Atlantic and strongly depend on the masking threshold in the high-pass filtering method in the South Atlantic Ocean.</p>
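The iterative box high-pass filtering described above can be sketched in a few lines: a running-mean background is subtracted from the gridded NO<sub>2</sub> column field, and grid cells with strong residuals (the narrow shipping lanes) are excluded from later background estimates so they are not smoothed away. This is a minimal illustrative re-implementation of the general technique, not the paper's actual code; the function name, box size, iteration count, and masking threshold are all assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def box_highpass(field, box=15, n_iter=3, mask_sigma=2.0):
    """Iterative box high-pass filter for a gridded NO2 column field.

    A boxcar (running-mean) background is subtracted from the field.
    Cells with strong residuals (e.g. shipping lanes) are excluded from
    the background estimate in later iterations, so narrow features are
    preserved rather than partly absorbed into the background.
    All parameter values are illustrative only.
    """
    background = uniform_filter(field, size=box, mode="nearest")
    for _ in range(n_iter):
        resid = field - background
        # mask strong residuals so they do not bias the background
        ok = np.abs(resid) < mask_sigma * np.nanstd(resid)
        masked = np.where(ok, field, background)
        background = uniform_filter(masked, size=box, mode="nearest")
    return field - background

# toy example: a one-pixel-wide "shipping lane" on a smooth background
y, x = np.mgrid[0:100, 0:100]
field = 1e15 * (1 + 0.01 * y)   # smooth large-scale background
field[:, 50] += 5e14            # narrow lane signal
hp = box_highpass(field)        # lane survives, background is removed
```

A smaller `box` sharpens narrow features at the cost of overall signal amplitude, mirroring the box-size trade-off discussed above.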
Pierre Cagne, Ulrik Buchholtz, Nicolai Kraus
et al.
Working in univalent foundations, we investigate the symmetries of spheres, i.e., the types of the form $\mathbb{S}^n = \mathbb{S}^n$. The case of the circle has a slick answer: the symmetries of the circle form two copies of the circle. For higher-dimensional spheres, the type of symmetries again has two connected components, namely the components of the maps of degree plus or minus one. Each of the two components has $\mathbb{Z}/2\mathbb{Z}$ as its fundamental group. For the latter result, we develop an EHP long exact sequence.
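In symbols, the results stated in the abstract read as follows (a schematic rendering; the subscript notation for the degree-$\pm 1$ components is ours, not the paper's):

```latex
% the circle: its self-identity type is two copies of the circle
(\mathbb{S}^1 = \mathbb{S}^1) \;\simeq\; \mathbb{S}^1 + \mathbb{S}^1
% higher spheres: two components (degree +1 and -1),
% each with fundamental group Z/2Z
\pi_1\bigl((\mathbb{S}^n = \mathbb{S}^n)_{\pm 1}\bigr) \;\cong\; \mathbb{Z}/2\mathbb{Z}, \qquad n \ge 2
```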
As the preface to the special issue for the conference ``Quantum Information and Probability: from Foundations to Engineering'' (QIP23), I wrote these notes as a recollection of the Växjö conferences. These conferences covered 25 years of my life (2000–24) and played a crucial role in the evolution of my own views on the basic problems of quantum foundations. I hope to continue this conference series as long as possible. To my understanding, this is the longest conference series on quantum foundations in the history of quantum physics. These notes contain my recollections of conversations with the world's leading experts on quantum foundations, and I think such notes may have historical value. My own views on quantum foundations are specific, and they evolved substantially over these 25 years. Finally, I discovered a practically forgotten pathway in physical foundations developed by von Helmholtz, Hertz, Boltzmann, and Schrödinger and known as the Bild conception. A scientific theory is a combination of two models, observational and causal. The coupling between these models can be tricky: a causal model can operate with hidden quantities which can't be identified with observables. From this viewpoint, Bell's coupling of subquantum and quantum models is very special, and the violation of the Bell inequalities doesn't close off (local) ways beyond quantum mechanics.
In this paper I offer an introduction to group field theory (GFT) and to some of the issues affecting the foundations of this approach to quantum gravity. I first introduce covariant GFT as the theory one obtains by interpreting the amplitudes of certain spin foam models as Feynman amplitudes in a perturbative expansion. However, I argue that it is unclear whether this definition of GFT amounts to anything beyond a computational rule for finding these transition amplitudes, and that GFT doesn't seem able to offer any new insight into the foundations of quantum gravity. Then, I move to another formulation of GFT, which I call canonical GFT and which uses the standard structures of quantum mechanics. This formulation is widely used in cosmological applications of GFT, but I argue that it is only heuristically connected with the covariant version and spin foam models. Moreover, I argue that this approach is affected by a version of the problem of time, which raises worries about its viability. Therefore, I conclude that there are serious concerns about the justification and interpretation of GFT in either of its versions.
We investigate two constructive approaches to defining quasi-compact and quasi-separated schemes (qcqs-schemes), namely qcqs-schemes as locally ringed lattices and as functors from rings to sets. We work in Homotopy Type Theory and Univalent Foundations, but reason informally. The main result is a constructive and univalent proof that the two definitions coincide, giving an equivalence between the respective categories of qcqs-schemes.
In this article, the weakest possible theorem providing a foundation for the Hilbert space formalism of quantum theory is stated. The necessary postulates are formulated, and the mathematics is spelt out in detail. It is argued that, from this approach, a general epistemic interpretation of quantum mechanics is natural. Some applications to the Bell experiment and to decision theory are briefly discussed. The article represents the conclusion of a series of articles and books on quantum foundations.
Itaï Ben Yaacov, Pablo Destic, Ehud Hrushovski
et al.
We present foundations of globally valued fields, i.e., of a class of fields with an extra structure, capturing some aspects of the geometry of global fields, based on the product formula. We provide a dictionary between various data defining such extra structure: syntactic (models of some unbounded continuous logic theory), Arakelov theoretic, and measure theoretic. In particular we obtain a representation theorem relating globally valued fields and adelic curves defined by Chen and Moriwaki.
We characterize the epimorphisms in homotopy type theory (HoTT) as the fiberwise acyclic maps and develop a type-theoretic treatment of acyclic maps and types in the context of synthetic homotopy theory as developed in univalent foundations. We present examples and applications in group theory, such as the acyclicity of the Higman group, through the identification of groups with 0-connected, pointed 1-types. Many of our results are formalized as part of the agda-unimath library.
<p>The Earth's energy imbalance, i.e. the difference between incoming solar radiation and outgoing reflected and emitted radiation, is the one quantity that ultimately controls the evolution of our climate system. However, despite its importance, there is limited knowledge of the exact magnitude of the energy imbalance, and the small net difference of about 1 W m<span class="inline-formula"><sup>−2</sup></span> between two large fluxes (approximately 340 W m<span class="inline-formula"><sup>−2</sup></span>) makes it challenging to measure directly. There has recently been renewed interest in using wide-field-of-view radiometers on board satellites to measure the outgoing radiation, as a possible method for deducing the global annual mean energy imbalance. Here we investigate how to sample in order to correctly determine the global annual mean imbalance and interannual trends, using a limited number of satellites. We simulate satellites in polar (90° inclination), sun-synchronous (98°) and precessing orbits (73, 82°), as well as constellations of these types of satellite orbits. We find that no single satellite provides sufficient sampling, both globally and of the diurnal and annual cycles, to reliably determine the global annual mean. If sun-synchronous satellites are used, at least six satellites are required for an uncertainty below 1 <span class="inline-formula">W m<sup>−2</sup></span>. One precessing satellite combined with one polar satellite results in root-mean-square errors of 0.08 to 0.10 <span class="inline-formula">W m<sup>−2</sup></span>, and a combination of two or three polar satellites results in root-mean-square errors of 0.10 or 0.04 <span class="inline-formula">W m<sup>−2</sup></span>, respectively. In conclusion, at least two satellites that complement each other are necessary to ensure global coverage and achieve a sampling uncertainty well below the current estimate of the energy imbalance.</p>
<p>The term “hotspot” refers to the sharp increase in the reflectance occurring when incident (solar) and reflected (viewing) directions almost coincide in the backscatter direction. The accurate simulation of hotspot directional signatures is important for many remote sensing applications. The RossThick–LiSparse–Reciprocal (RTLSR) bidirectional reflectance distribution function (BRDF) model is widely used in radiative transfer simulations, and the hotspot model most commonly used is that of Maignan–Bréon, but the latter typically requires large numbers of numerical quadrature points and Fourier expansion terms to represent the hotspot accurately when coupled with atmospheric radiative transfer modeling (RTM). In this paper, we have developed a modified version of the Maignan–Bréon hotspot BRDF model that converges much faster numerically, making it more practical for use in RTMs that require a Fourier expansion of the BRDF to simulate top-of-atmosphere (TOA) hotspot signatures, such as RTMs using the Doubling–Adding or discrete ordinate methods. Using the vector linearized discrete ordinate radiative transfer model (VLIDORT), we found that reasonable TOA hotspot accuracy can be obtained with just 23 Fourier terms for a clear atmosphere and 63 Fourier terms for an atmosphere with aerosol scattering.</p>
<p>In order to study the impact of molecular and aerosol scattering on hotspot signatures, we carried out a number of hotspot signature simulations with VLIDORT. We confirmed that (1) atmospheric molecular scattering and the presence of aerosol tend to smooth out the hotspot signature at the TOA and that (2) the hotspot signature at the TOA is larger in the near-infrared than in the visible and is more strongly affected by surface reflectance. As the hotspot amplitude at the TOA with aerosol scattering included is smaller than that with molecular scattering only, the amplitude of the hotspot signature at the surface was likely underestimated in previous analyses based on the POLDER measurements, where the atmospheric correction relied on a single-scatter, Rayleigh-only calculation. The modified model can calculate the amplitude of the hotspot accurately, and, as it agrees very well with the original RossThick model away from the hotspot region, it can be used in conditions both with and without hotspots. However, the modified model differs from the original Maignan–Bréon model at scattering angles close to the hotspot point; thus, it may not be appropriate for applications that need an exact representation of the hotspot angular signature.</p>
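For context, the hotspot-corrected RossThick volumetric kernel of Maignan and Bréon, as commonly written in the literature (reproduced here from the standard formulation, not from this paper's faster-converging variant), multiplies the RossThick kernel by a factor that peaks as the scattering angle $\xi$ between the sun and view directions goes to zero:

```latex
K_{\mathrm{vol}}(\theta_s,\theta_v,\phi) =
  \frac{4}{3\pi}\,
  \frac{(\pi/2-\xi)\cos\xi + \sin\xi}{\cos\theta_s + \cos\theta_v}
  \left[ 1 + \left( 1 + \frac{\xi}{\xi_0} \right)^{-1} \right]
  - \frac{1}{3},
\qquad \xi_0 \approx 1.5^{\circ}
```

As $\xi \to 0$ the bracketed factor tends to 2, doubling the kernel at the hotspot; the sharpness of this peak, set by the small width $\xi_0$, is what forces the large Fourier expansions discussed above.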
Léa Saunier, Nicolas Hoffmann, Marius Preda
et al.
Automation and robotics are destined to play a critical role in the Industry 4.0 revolution, as illustrated by the emergence of autonomous machinery in earthwork operations. Despite rapid progress, autonomous agents will always require human supervision to direct their missions and to guarantee safety when unexpected problems arise. Traditional human supervision requires an operator to physically enter each machine at risk and take control manually. This approach is time-consuming and requires highly qualified personnel capable of operating various machines. The process can be hastened and simplified by means of teleoperated supervision, which itself requires an appropriate interface. In this paper we evaluate a virtual reality (VR)-based interface using hybrid interactions and an immersive digital twin, compared against real-life control. We compare these interfaces through control tasks performed by expert and non-expert operators, analyzing time and precision as well as user feedback. The preliminary results show that the VR interface yields equivalent, satisfactory performance for experts and improves the efficiency of apprentices. Not only does everyone perform well in the virtual environment, but the training time can also be shortened significantly, since non-experts can perform similarly under the same conditions.
Divided into two moments of theoretical-practical artistic research, the article begins with a brief genealogy of the Anthropocene, mapping authors who contributed similar theories throughout history: pioneering ideas that culminated in thinking of the human agent as a force of geological transformation. It presents a synthesis of the debate within the humanities, highlighting discussions around the word anthropos and the alternative figurations proposed by various critics. In the second part, the text takes the liberty of leaping toward poetic writing, moving between personal account and fiction, drawing on the creative process of the work Un-Earthwork. Revisiting landscapes and socio-environmental contexts from a field trip to Santana do Cariri, the narrative embodies the theoretical-practical research through situated experience, which raised reflections and references, among them works by artists such as Ana Mendieta, Nancy Holt, Celeida Tostes, and Robert Smithson.
<p>Continuous long-term ground-based remote-sensing observations combined with vertically pointing cloud radar and ceilometer measurements are well suited for identifying precipitation evaporation fall streaks (so-called virga). Here we introduce the functionality and workflow of a new open-source tool, the <i>Virga-Sniffer</i>, which was developed within the framework of RV <i>Meteor</i> observations during the ElUcidating the RolE of Cloud–Circulation Coupling in ClimAte (EUREC<span class="inline-formula"><sup>4</sup></span>A) field experiment in January–February 2020 in the tropical western Atlantic. The Virga-Sniffer Python package is highly modular and configurable and can be applied to multilayer cloud situations. In the simplest approach, it detects virga from time–height fields of cloud radar reflectivity and time series of ceilometer cloud base height. In addition, optional parameters like lifting condensation level, a surface rain flag, and time–height fields of cloud radar mean Doppler velocity can be added to refine virga event identifications. The netCDF-output files consist of Boolean flags of virga and cloud detection, as well as base and top heights and depth for the detected clouds and virga. The sensitivity of the Virga-Sniffer results to different settings is explored (in the Appendix).
The performance of the Virga-Sniffer was assessed by comparing its results to the CloudNet target classification obtained from the CloudNet processing chain. A total of 86 % of the pixels identified as virga correspond to CloudNet target classifications of precipitation. The remaining 14 % of virga pixels correspond to CloudNet target classifications of aerosols and insects (about 10 %), cloud droplets (about 2 %), or clear sky (2 %). Some discrepancies between the virga identification and the CloudNet target classification can be attributed to the temporal smoothing that was applied. Additionally, it was found that CloudNet mostly classified aerosols and insects at virga edges, which points to a misclassification caused by CloudNet-internal thresholds.
For the RV <i>Meteor</i> observations in the downstream winter trades during EUREC<span class="inline-formula"><sup>4</sup></span>A, about 42 % of all detected clouds with bases below the trade inversion were found to produce precipitation that fully evaporates before reaching the ground.
A proportion of 56 % of the detected virga originated from trade wind cumuli. Virga with depths less than 0.2 km most frequently occurred from shallow clouds with depths less than 0.5 km, while virga depths larger than 1 km were mainly associated with clouds of larger depths, ranging between 0.5 and 1 km. The presented results substantiate the importance of complete low-level precipitation evaporation in the downstream winter trades. Possible applications of the Virga-Sniffer within the framework of EUREC<span class="inline-formula"><sup>4</sup></span>A include detailed studies of precipitation evaporation with a focus on cold pools or cloud organization or distinguishing moist processes based on water vapor isotopic observations. However, we envision extended use of the Virga-Sniffer for other cloud regimes or scientific foci as well.</p>
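The simplest detection mode described above, flagging virga where the cloud radar sees hydrometeors below the ceilometer cloud base while no precipitation reaches the surface, can be sketched as follows. This is an illustrative re-implementation of the idea, not the Virga-Sniffer package's actual code; the function name and array layout are assumptions.

```python
import numpy as np

def detect_virga(ze, height, cloud_base, surface_rain):
    """Minimal virga flag in the spirit of the Virga-Sniffer's simplest mode.

    ze           : (time, height) radar reflectivity, NaN where no signal
    height       : (height,) range-gate heights [m]
    cloud_base   : (time,) ceilometer cloud base height [m]
    surface_rain : (time,) bool, True when rain reaches the ground

    A pixel is virga if the radar detects hydrometeors below the
    ceilometer cloud base while no rain is observed at the surface.
    """
    below_base = height[None, :] < cloud_base[:, None]
    has_signal = np.isfinite(ze)
    return has_signal & below_base & ~surface_rain[:, None]

# toy example: one profile with an echo below cloud base and no surface
# rain (virga), one profile with rain reaching the ground (no virga)
height = np.arange(0.0, 3000.0, 100.0)        # 30 range gates
ze = np.full((2, height.size), np.nan)
ze[:, 10:20] = -20.0                          # echoes from 1.0 to 1.9 km
cloud_base = np.array([1500.0, 1500.0])
surface_rain = np.array([False, True])
virga = detect_virga(ze, height, cloud_base, surface_rain)
```

The optional refinements mentioned above (lifting condensation level, mean Doppler velocity) would enter as further boolean conditions of the same shape.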
I critically discuss a controversial 'trans-Planckian censorship' conjecture, which has recently been introduced to researchers working at the intersection of fundamental physics and cosmology. My focus explicitly avoids any appeals to contingent research within string theory (the sociological origins of the conjecture) or regarding the more general (quantum) gravitational 'swampland'. Rather, I concern myself with the conjecture's foundations in our current, well-trodden physics of quantized fields, spacetime, and (classical) gravity. In doing so, I locate what exactly within trans-Planckian censorship amounts to a departure from current physics -- identifying what is, ultimately, so conjectural about the conjecture.