Broad presence of ferromagnetism in bees and relationship to phylogeny, natural history, and sociality
Laura Russo, Caleb Allen, Cameron S. Jorgensen
et al.
Scientists have long been fascinated by magnetoreception, the innate capacity of many animals to sense and use the Earth's magnetic field for navigation. In eusocial insects like honey bees, magnetoreception has been linked to communication and foraging. However, little is known about magnetoreception's phylogenetic patterns and relationship to species traits and natural history. Here, we demonstrate that putative magnetoreception based on ferromagnetic particles is widespread across a diversity of bee species (72 out of 96 species tested), with no phylogenetic signal. We also detected such putative magnetoreception in non-bee outgroups, suggesting this magnetic capacity predates the evolution of the Anthophila. While magnetic signals were found across a diversity of life history traits, the strength of the magnetic signal varied within and between species, and increased with body size and social behavior.
q-bio.PE, cond-mat.mes-hall
Demystifying the Paradox of Importance Sampling with an Estimated History-Dependent Behavior Policy in Off-Policy Evaluation
Hongyi Zhou, Josiah P. Hanna, Jin Zhu
et al.
This paper studies off-policy evaluation (OPE) in reinforcement learning with a focus on behavior policy estimation for importance sampling. Prior work has shown empirically that estimating a history-dependent behavior policy can lead to lower mean squared error (MSE) even when the true behavior policy is Markovian. However, the question of why the use of history should lower MSE remains open. In this paper, we theoretically demystify this paradox by deriving a bias-variance decomposition of the MSE of ordinary importance sampling (IS) estimators, demonstrating that history-dependent behavior policy estimation decreases their asymptotic variances while increasing their finite-sample biases. Additionally, as the estimated behavior policy conditions on a longer history, we show a consistent decrease in variance. We extend these findings to a range of other OPE estimators, including the sequential IS estimator, the doubly robust estimator and the marginalized IS estimator, with the behavior policy estimated either parametrically or non-parametrically.
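As a minimal, self-contained illustration of the setup (not the paper's estimators or experiments; the toy MDP, target policy `pi_e`, and context functions below are hypothetical), the following sketch estimates a behavior policy from logged episodes by empirical frequencies, conditioning either on the state alone or on the full history, and plugs the estimate into ordinary importance sampling:

```python
import random
from collections import defaultdict

def estimate_policy(episodes, context_fn):
    """Empirical-frequency estimate of the behavior policy, conditioned on
    whatever context context_fn extracts (state only, or state + history)."""
    counts = defaultdict(lambda: defaultdict(int))
    for ep in episodes:
        history = []
        for s, a, r in ep:
            counts[context_fn(s, history)][a] += 1
            history.append((s, a))
    return {ctx: {a: n / sum(acts.values()) for a, n in acts.items()}
            for ctx, acts in counts.items()}

def ordinary_is(episodes, pi_e, b_hat, context_fn):
    """Ordinary importance sampling estimate of the target policy's
    expected (undiscounted) return, using the estimated behavior policy."""
    vals = []
    for ep in episodes:
        w, ret, history = 1.0, 0.0, []
        for s, a, r in ep:
            ctx = context_fn(s, history)
            w *= pi_e[s][a] / b_hat[ctx][a]   # importance weight
            ret += r
            history.append((s, a))
        vals.append(w * ret)
    return sum(vals) / len(vals)

# Tiny two-state example: a Markovian (uniform) behavior policy logs the data.
rng = random.Random(0)
def gen_episode():
    ep, s = [], 0
    for _ in range(3):
        a = rng.choice([0, 1])
        r = 1.0 if (s, a) == (0, 1) else 0.0
        ep.append((s, a, r))
        s = 1 - s if a == 1 else s
    return ep

episodes = [gen_episode() for _ in range(500)]
pi_e = {0: {0: 0.2, 1: 0.8}, 1: {0: 0.5, 1: 0.5}}   # target policy

markov_ctx  = lambda s, h: s              # condition on the state only
history_ctx = lambda s, h: (s, tuple(h))  # condition on the full history

v_markov = ordinary_is(episodes, pi_e, estimate_policy(episodes, markov_ctx), markov_ctx)
v_hist   = ordinary_is(episodes, pi_e, estimate_policy(episodes, history_ctx), history_ctx)
```

In this toy setting both estimates are well defined because the policy is estimated on the same logged episodes it reweights; the paper's point is that the history-conditioned estimate trades a smaller asymptotic variance for a larger finite-sample bias.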
Filtering Learning Histories Enhances In-Context Reinforcement Learning
Weiqin Chen, Xinjie Zhang, Dharmashankar Subramanian
et al.
Transformer models (TMs) have exhibited remarkable in-context reinforcement learning (ICRL) capabilities, allowing them to generalize to and improve in previously unseen environments without re-training or fine-tuning. This is typically accomplished by imitating the complete learning histories of a source RL algorithm over a substantial number of pretraining environments, which, however, may transfer suboptimal behaviors inherited from the source algorithm/dataset. Therefore, in this work, we address the issue of inheriting suboptimality from the perspective of dataset preprocessing. Motivated by the success of weighted empirical risk minimization, we propose a simple yet effective approach, learning history filtering (LHF), to enhance ICRL by reweighting and filtering the learning histories based on their improvement and stability characteristics. To the best of our knowledge, LHF is the first approach to avoid source suboptimality by dataset preprocessing, and it can be combined with the current state-of-the-art (SOTA) ICRL algorithms. We substantiate the effectiveness of LHF through a series of experiments conducted on the well-known ICRL benchmarks, encompassing both discrete environments and continuous robotic manipulation tasks, with three SOTA ICRL algorithms (AD, DPT, DICP) as the backbones. LHF exhibits robust performance across a variety of suboptimal scenarios, as well as under varying hyperparameters and sampling strategies. Notably, the superior performance of LHF becomes more pronounced in the presence of noisy data, indicating the significance of filtering learning histories.
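To make the reweighting-and-filtering idea concrete, here is a deliberately simplified sketch (the scoring function is a hypothetical stand-in, not the LHF criterion from the paper): each learning history is a sequence of episodic returns, scored by net improvement minus an instability penalty, and only the top-scoring fraction is retained for pretraining:

```python
import math

def filter_histories(histories, keep_fraction=0.5):
    """Keep the top-scoring fraction of learning histories. Each history is a
    list of episodic returns; the (illustrative) score rewards improvement
    (net gain in return) and penalizes instability (std. dev. of the
    per-episode changes)."""
    def score(returns):
        diffs = [b - a for a, b in zip(returns, returns[1:])]
        improvement = returns[-1] - returns[0]
        mean = sum(diffs) / len(diffs)
        instability = math.sqrt(sum((d - mean) ** 2 for d in diffs) / len(diffs))
        return improvement - instability
    ranked = sorted(histories, key=score, reverse=True)
    return ranked[:max(1, int(len(ranked) * keep_fraction))]
```

Under this stand-in score, a steadily improving history outranks a flat one, and an oscillating history with no net gain is filtered out entirely.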
Entanglement of temporal sections as quantum histories and their quantum correlation bounds
Marcin Nowakowski
In this paper we focus on the underlying quantum structure of temporal correlations and show their peculiar nature, which differentiates them from spatial quantum correlations. With a growing interest in the representation of quantum states as topological objects, we consider quantum history bundles based on the temporal manifold and show the source of the violation of monogamous temporal Bell-like inequalities. We introduce definitions for the mixture of quantum histories and consider their entanglement as sections over Hilbert vector bundles. As a generalization of temporal Bell-like inequalities, we derive the quantum bound for multi-time Bell-like inequalities.
Star Formation History of the Small Magellanic Cloud: the shell substructure
Joanna D. Sakowska, Noelia E. D. Noël, Tomás Ruiz-Lara
et al.
We present the spatially resolved star formation history (SFH) of a shell-like structure located in the northeastern Small Magellanic Cloud (SMC). We quantitatively obtain the SFH using unprecedentedly deep photometric data (g~24 magnitude) from the SMASH survey and colour-magnitude diagram (CMD) fitting techniques. We consider, for the first time, the SMC's line-of-sight depth and its effects on the optical CMDs. The SFH is recovered with higher accuracy when a line-of-sight depth of ~3 kpc is simulated. We find young star formation enhancements at ~150 Myr, ~200 Myr, ~450 Myr, ~650 Myr, and ~1 Gyr. Comparing the shell's SFH with the Large Magellanic Cloud's (LMC) northern arm SFH, we show strong evidence of synchronicity over at least the past ~2.8 Gyr and, possibly, the past ~3.5 Gyr. Our results place constraints on the orbital history of the Magellanic Clouds which, potentially, have implications for their dynamical mass estimates.
A Summarized History-based Dialogue System for Amnesia-Free Prompt Updates
Hyejin Hong, Hibiki Kawano, Takuto Maekawa
et al.
In today's society, information overload presents challenges in providing optimal recommendations. Consequently, the importance of dialogue systems that can discern and provide the necessary information through dialogue is increasingly recognized. However, there are concerns that existing dialogue systems, which rely on pre-trained models, struggle to cope with real-time or insufficient information. To address these concerns, models that allow the addition of missing information to dialogue robots have been proposed. Yet, maintaining the integrity of previous conversation history while integrating new data remains a formidable challenge. This paper presents a novel system for dialogue robots designed to remember user-specific characteristics by retaining past conversation history even as new information is added.
A Chronological History of X-Ray Astronomy Missions
Andrea Santangelo, Rosalia Madonia, Santina Piraino
In this chapter, we briefly review the history of X-ray astronomy through its missions. We follow a temporal development, from the first instruments onboard rockets and balloons to the most recent and complex space missions. We intend to provide the reader with detailed information and references on the many missions and instruments that have contributed to the success of the exploration of the X-ray universe. We have not included missions that are still operating and providing the worldwide community with high-quality observations; specific chapters for these missions are included in a dedicated section of the handbook.
The impact of varying inhomogeneous reionization histories on metrics of Ly$\alpha$ opacity
Caitlin C. Doughty, Joseph F. Hennawi, Jose Oñorbe
et al.
The epoch of hydrogen reionization is complete by $z=5$, but its progression at higher redshifts is uncertain. Measurements of Ly$\alpha$ forest opacity show large scatter at $z<6$, suggestive of spatial fluctuations in the neutral fraction ($x_\mathrm{HI}$), temperature, or ionizing background, either individually or in combination. However, these effects are degenerate, necessitating that these physics be modeled in tandem in order to properly interpret the observations. We begin this process by developing a framework for modeling the reionization history and associated temperature fluctuations, with the intention of incorporating ionizing background fluctuations at a later time. To do this, we generate several reionization histories using the semi-numerical code AMBER, selecting histories with volume-weighted neutral fractions that adhere to the observed CMB optical depth and dark pixel fractions. Implementing these histories in the \texttt{Nyx} cosmological hydrodynamics code, we examine the evolution of gas within the simulation and the associated metrics of Ly$\alpha$ forest opacity. We find that the pressure smoothing scale within the IGM is strongly correlated with the adiabatic index of the temperature-density relation. We find that while models with 20,000 K photoheating at reionization are better able to reproduce the shape of the observed $z=5$ 1D flux power spectrum than those with 10,000 K, they fail to match the highest wavenumbers. The simulated autocorrelation function and optical depth distributions are systematically low and narrow, respectively, compared to the observed values, but are in better agreement when the reionization history is longer in duration, more symmetric in its distribution of reionization redshifts, or if there are remaining neutral regions at $z<6$. The systematically low variance likely requires the addition of a fluctuating UVB.
History Filtering in Imperfect Information Games: Algorithms and Complexity
Christopher Solinas, Douglas Rebstock, Nathan R. Sturtevant
et al.
Historically applied exclusively to perfect information games, depth-limited search with value functions has been key to recent advances in AI for imperfect information games. Most prominent approaches with strong theoretical guarantees require subgame decomposition - a process in which a subgame is computed from public information and player beliefs. However, subgame decomposition can itself require non-trivial computations, and its tractability depends on the existence of efficient algorithms for either full enumeration or generation of the histories that form the root of the subgame. Despite this, no formal analysis of the tractability of such computations has been established in prior work, and application domains have often consisted of games, such as poker, for which enumeration is trivial on modern hardware. Applying these ideas to more complex domains requires understanding their cost. In this work, we introduce and analyze the computational aspects and tractability of filtering histories for subgame decomposition. We show that constructing a single history from the root of the subgame is generally intractable, and then provide a necessary and sufficient condition for efficient enumeration. We also introduce a novel Markov Chain Monte Carlo-based generation algorithm for trick-taking card games - a domain where enumeration is often prohibitively expensive. Our experiments demonstrate its improved scalability in the trick-taking card game Oh Hell. These contributions clarify when and how depth-limited search via subgame decomposition can be an effective tool for sequential decision-making in imperfect information settings.
Local stellar formation history from the 40 pc white dwarf sample
E. Cukanovaite, P. -E. Tremblay, S. Toonen
et al.
We derive the local stellar formation history from the Gaia-defined 40 pc white dwarf sample. This is currently the largest volume-complete sample of white dwarfs for which spectroscopy is available, allowing for classification of the chemical abundances at the photosphere, and subsequently accurate determination of the atmospheric parameters. We create a population synthesis model and show that a uniform stellar formation history for the last ~10.5 Gyr provides a satisfactory fit to the observed distribution of absolute Gaia G magnitudes. To test the robustness of our derivation, we vary various assumptions in the population synthesis model, including the initial mass function, initial-to-final mass relation, kinematic evolution, binary fraction and white dwarf cooling timescales. From these tests, we conclude that the assumptions in our model have an insignificant effect on the derived relative stellar formation rate as a function of look-back time. However, the onset of stellar formation (age of Galactic disc) is sensitive to a variety of input parameters including the white dwarf cooling models. Our derived stellar formation history gives a much better fit to the absolute Gaia G magnitudes than most previous studies.
astro-ph.SR, astro-ph.GA
Using heritability of stellar chemistry to reveal the history of the Milky Way
Holly Jackson, Paula Jofre, Keaghan Yaxley
et al.
Since chemical abundances are inherited between generations of stars, we use them to trace the evolutionary history of our Galaxy. We present a robust methodology for creating a phylogenetic tree, a biological tool used for centuries to study heritability. Combining our phylogeny with information on stellar ages and dynamical properties, we reconstruct the shared history of 78 stars in the Solar Neighbourhood. The branching pattern in our tree supports a scenario in which the thick disk is an ancestral population of the thin disk. The transition from thick to thin disk shows an anomaly, which we attribute to a star formation burst. Our tree shows a further signature of the variability in stars similar to the Sun, perhaps linked to a minor star formation enhancement creating our Solar System. In this paper, we demonstrate the immense potential of a phylogenetic perspective and interdisciplinary collaboration, where with borrowed techniques from biology we can study key processes that have contributed to the evolution of the Milky Way.
Dynamic Graph Embedding via LSTM History Tracking
Shima Khoshraftar, Sedigheh Mahdavi, Aijun An
et al.
Many real world networks are very large and constantly change over time. These dynamic networks exist in various domains such as social networks, traffic networks and biological interactions. To handle large dynamic networks in downstream applications such as link prediction and anomaly detection, it is essential for such networks to be transferred into a low dimensional space. Recently, network embedding, a technique that converts a large graph into a low-dimensional representation, has become increasingly popular due to its strength in preserving the structure of a network. Efficient dynamic network embedding, however, has not yet been fully explored. In this paper, we present a dynamic network embedding method that integrates the history of nodes over time into the current state of nodes. The key contributions of our work are: 1) generating dynamic network embeddings by combining both dynamic and static node information; 2) tracking the history of the neighbors of nodes using an LSTM; 3) significantly decreasing time and memory requirements by training an autoencoder LSTM model on temporal walks rather than the adjacency matrices of graphs, which is the common practice. We evaluate our method in multiple applications such as anomaly detection, link prediction and node classification on datasets from various domains.
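A sketch of the temporal-walk idea (the function and its parameters are illustrative, not the paper's implementation): instead of feeding adjacency matrices to the LSTM, one can generate time-respecting random walks over timestamped edges and train on those sequences:

```python
import random

def temporal_walks(edges, walk_len=4, walks_per_node=2, seed=0):
    """Generate time-respecting random walks from timestamped edges (u, v, t):
    each step follows an edge whose timestamp is no earlier than the previous
    step's, so walks track the network's evolution. These walk sequences,
    rather than full adjacency matrices, become the LSTM's input."""
    rng = random.Random(seed)
    out_edges = {}
    for u, v, t in edges:
        out_edges.setdefault(u, []).append((v, t))
    walks = []
    for start in out_edges:
        for _ in range(walks_per_node):
            node, t_prev, walk = start, float("-inf"), [start]
            for _ in range(walk_len - 1):
                # only edges that do not go backwards in time are eligible
                nxt = [(v, t) for v, t in out_edges.get(node, []) if t >= t_prev]
                if not nxt:
                    break
                node, t_prev = rng.choice(nxt)
                walk.append(node)
            walks.append(walk)
    return walks
```

Because each walk only traverses edges with non-decreasing timestamps, the resulting sequences respect temporal causality, which is what distinguishes them from ordinary random walks on a static snapshot.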
The History of Digital Spam
Emilio Ferrara
Spam!: that's what Lorrie Faith Cranor and Brian LaMacchia exclaimed in the title of a popular call-to-action article that appeared twenty years ago in Communications of the ACM. And yet, despite the tremendous efforts of the research community over the last two decades to mitigate this problem, the sense of urgency remains unchanged, as emerging technologies have brought new dangerous forms of digital spam under the spotlight. Furthermore, when spam is carried out with the intent to deceive or influence at scale, it can alter the very fabric of society and our behavior. In this article, I will briefly review the history of digital spam: starting from its quintessential incarnation, spam emails, to modern-day forms of spam affecting the Web and social media, the survey will close by depicting future risks associated with spam and abuse of new technologies, including Artificial Intelligence (e.g., Digital Humans). After providing a taxonomy of spam and of its most popular applications to emerge over the last two decades, I will review technological and regulatory approaches proposed in the literature, and suggest some possible solutions to tackle this ubiquitous digital epidemic moving forward.
A Topos Formulation of Consistent Histories
Cecilia Flori
Topos theory has been suggested by Döring and Isham as an alternative mathematical structure with which to formulate physical theories. In particular, it has been used to reformulate standard quantum mechanics in such a way that a novel type of logic is used to represent propositions. In this paper we extend this formulation to include temporally ordered collections of propositions, as opposed to single-time propositions. That is to say, we have developed a quantum history formalism in the language of topos theory, in which truth values can be assigned to temporal propositions. We analyse the extent to which such truth values can be derived from the truth values of the constituent single-time propositions.
Photon: history, mass, charge
L. B. Okun
The talk consists of three parts. ``History'' briefly describes the emergence and evolution of the concept of photon during the first two decades of the 20th century. ``Mass'' gives a short review of the literature on the upper limit of the photon's mass. ``Charge'' is a critical discussion of the existing interpretation of searches for photon charge. Schemes, in which all photons are charged, are grossly inconsistent. A model with three kinds of photons (positive, negative and neutral) seems at first sight to be more consistent, but turns out to have its own serious problems.
Star Formation and Metallicity History of the SDSS galaxy survey: unlocking the fossil record
Benjamin Panter, Alan F. Heavens, Raul Jimenez
Using MOPED we determine non-parametrically the star-formation and metallicity history of over 37,000 high-quality galaxy spectra from the Sloan Digital Sky Survey (SDSS) early data release. We use the entire spectral range, rather than concentrating on specific features, and we estimate the complete star formation history without prior assumptions about its form (by constructing so-called `population boxes'). The main results of this initial study are that the star formation rate in SDSS galaxies has been in decline for ~6 Gyr, and that the metallicity distribution of star-forming gas peaked ~3 Gyr ago at about solar metallicity, inconsistent with closed-box models but consistent with infall models. We also determine the infall rate of gas in SDSS galaxies and show that it has been significant for the last 3 Gyr. We investigate errors using a Markov Chain Monte Carlo algorithm. Further, we demonstrate that recovering star formation and metallicity histories for such a large sample becomes intractable without data compression methods, particularly for the exploration of the likelihood surface. By exploring the whole likelihood surface we show that age-metallicity degeneracies are not as severe as when using only a few spectral features. We find that 65% of galaxies contain a significant old population (with an age of at least 8 Gyr), including recent starburst galaxies, and that over 97% have some stars older than 2 Gyr. This is the first time that the past star formation history has been determined from the fossil record of the present-day spectra of galaxies.
Probing the thermal history of the Intergalactic Medium with Lyman-alpha absorption lines
Martin G. Haehnelt, Matthias Steinmetz
The Doppler parameter distribution of Lyman-alpha absorption is calculated for a set of different reionization histories. The differences in temperature between different reionization histories are as large as a factor three to four depending on the spectrum of the ionizing sources and the redshift of helium reionization. These temperature differences result in observable differences in the Doppler parameter distribution. Best agreement with the observed Doppler parameter distribution between redshift two and four is found if hydrogen and helium are reionized simultaneously at or before redshift five with a quasar-like spectrum.
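The argument above rests on thermal Doppler broadening; as a reminder (a standard kinetic-theory relation, not specific to this paper), the Doppler parameter scales as the square root of the gas temperature, so a factor of three to four in temperature shifts $b$ by a factor of roughly 1.7 to 2:

```latex
b = \sqrt{\frac{2 k_B T}{m}}, \qquad \frac{b_1}{b_2} = \sqrt{\frac{T_1}{T_2}}
```

For hydrogen at $T = 10^4$ K this gives $b \approx 13$ km/s, before any contribution from turbulent or bulk motions.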
The (absence of a) relationship between thermodynamic and logical reversibility
O. J. E. Maroney
Landauer erasure seems to provide a powerful link between thermodynamics and information processing (logical computation). The only logical operations that require the generation of heat are logically irreversible ones, with the minimum heat generation being $kT \ln 2$ per bit of information lost. Nevertheless, it will be shown that logical reversibility neither implies, nor is implied by, thermodynamic reversibility. By examining thermodynamically reversible operations which are logically irreversible, it is possible to show that information and entropy, while having the same form, are conceptually different.
physics.hist-ph, cond-mat.stat-mech
The Star Formation History of NGC 6822
Ted K. Wyder
Images of five fields in the Local Group dwarf irregular galaxy NGC 6822 obtained with the {\it Hubble Space Telescope} in the F555W and F814W filters are presented. Photometry for the stars in these images was extracted using the Point-Spread-Function fitting program HSTPHOT/MULTIPHOT. The resulting color-magnitude diagrams reach down to $V\approx26$, a level well below the red clump, and were used to solve quantitatively for the star formation history of NGC 6822. Assuming that stars began forming in this galaxy from low-metallicity gas and that there is little variation in the metallicity at each age, the distribution of stars along the red giant branch is best fit with star formation beginning in NGC 6822 12-15 Gyr ago. The best-fitting star formation histories for the old and intermediate age stars are similar among the five fields and show a constant or somewhat increasing star formation rate from 15 Gyr ago to the present except for a possible dip in the star formation rate from 3 to 5 Gyr ago. The main differences among the five fields are in the higher overall star formation rate per area in the bar fields as well as in the ratio of the recent star formation rate to the average past rate. These variations in the recent star formation rate imply that stars formed within the past 0.6 Gyr are not spatially very well mixed throughout the galaxy.
The Star Formation History of the Universe
Andrew M. Hopkins
Strong constraints on the cosmic star formation history (SFH) have recently been established using ultraviolet and far-infrared measurements, refining the results of numerous measurements over the past decade. Taken together, the most recent and robust data indicate a compellingly consistent picture of the SFH out to redshift z~6, with especially tight constraints for z < 1. There have also been a number of dedicated efforts to measure or constrain the SFH at z~6 and beyond. It is also possible to constrain the normalisation of the SFH using a combination of electron antineutrino flux limits from Super-Kamiokande measurements and supernova rate density measurements. This review presents the latest compilation of SFH measurements, and summarises the corresponding evolution for stellar and metal mass densities, and supernova rate densities. The constraints on the normalisation of the cosmic SFH, arising from the combination of the supernova rate measurements and the measurement limit on the supernova electron antineutrino flux, are also discussed.