Results for "History of Great Britain"

Showing 20 of ~2,409,803 results · from CrossRef, arXiv

arXiv Open Access 2026
A Portrait of the Cosmic Reionisation History in the Context of the Early Dark Energy Model

Weiyang Liu, Xin Wang, Hu Zhan et al.

Recent JWST observations of Lyman-$\alpha$ emission at $z \sim 11-6$ indicate a rapid reionization of the intergalactic medium within the first $\sim700$ Myr. The required Lyman continuum (LyC) photon budget may naturally arise from the unexpectedly high galaxy number densities revealed by JWST, reducing the need for scenarios invoking very high LyC escape fractions ($f_{\rm esc}\gtrsim0.2$) or dominant contributions from ultra-faint galaxies ($M_{\rm UV}>-15$) in the standard $\Lambda$CDM framework. In this work, we model the reionization history under the Early Dark Energy (EDE) paradigm -- originally proposed to ease the Hubble tension -- which also explains the observed over-abundance of high-$z$ galaxies without extreme star formation efficiencies. The EDE model yields reionization histories consistent with current constraints while requiring only moderate LyC escape fractions and UV luminosity densities ($f_{\rm esc}\sim 0.05-0.1$, $M_{\rm UV}\lesssim -17$ to $-15$). Our results suggest that, once key astrophysical parameters are better constrained, the reionization history could serve as an independent and complementary probe of EDE cosmologies.

en astro-ph.CO, astro-ph.GA
arXiv Open Access 2025
HistoryFinder: Advancing Method-Level Source Code History Generation with Accurate Oracles and Enhanced Algorithm

Shahidul Islam, Ashik Aowal, Md Sharif Uddin et al.

Reconstructing a method's change history efficiently and accurately is critical for many software engineering tasks, including maintenance, refactoring, and comprehension. Despite the availability of method history generation tools such as CodeShovel and CodeTracker, existing evaluations of their effectiveness are limited by inaccuracies in the ground truth oracles used. In this study, we systematically construct two new oracles -- the corrected CodeShovel oracle and a newly developed HistoryFinder oracle -- by combining automated analysis with expert-guided manual validation. We also introduce HistoryFinder, a new method history generation tool designed to improve not only the accuracy and completeness of method change histories but also to offer competitive runtime performance. Through extensive evaluation across 400 methods from 40 open-source repositories, we show that HistoryFinder consistently outperforms CodeShovel, CodeTracker, IntelliJ, and Git-based baselines in terms of precision, recall, and F1 score. Moreover, HistoryFinder achieves competitive runtime performance, offering the lowest mean and median execution times among all the research-based tools. While Git-based tools exhibit the fastest runtimes, this efficiency comes at the cost of significantly lower precision and recall -- leaving HistoryFinder as the best overall choice when both accuracy and efficiency are important. To facilitate adoption, we provide a web interface, CLI, and Java library for flexible usage.

en cs.SE
arXiv Open Access 2025
Decomposing Non-Markovian History Dependence

Matthew P. Leighton, Christopher W. Lynn

Non-Markovian stochastic processes are ubiquitous in biology. Nevertheless, we lack a general framework for quantifying historical dependencies. In this Letter, we propose an information-theoretic approach to decompose history dependence in systems with non-Markovian dynamics, quantifying the information encoded in dependencies of each order. In minimal models of non-Markovian dynamics, we show that this framework correctly captures the underlying historical dependencies, even when autocorrelations do not. In prolonged recordings of fly behavior, we find that the scaling of non-Markovian dependencies is invariant across timescales from fractions of a second to minutes. Despite this invariance, the overall amount of non-Markovian information is non-monotonic, suggesting a unique timescale on which historical dependencies are strongest.

en cond-mat.stat-mech, physics.bio-ph
arXiv Open Access 2025
Comparison of methods used to derive the Galactic star formation history from white dwarf samples

Emily K. Roberts, Pier-Emmanuel Tremblay, Mairi W. O'Brien et al.

We compare three methods of deriving the local Galactic star formation history, using as a benchmark the Gaia-defined 40 pc white dwarf sample, currently the largest volume complete sample of stellar remnants with medium-resolution spectroscopy. We create a population synthesis model to 1) reproduce the observed white dwarf luminosity function, 2) reproduce the observed absolute Gaia G magnitude distribution, and 3) directly calculate the ages of all individual white dwarfs in the 40 pc volume. We then compare the star formation histories determined from each method. Previous studies using these methods were based on different white dwarf samples and as such were difficult to compare. Uncertainties in each method such as the initial mass function, initial-final mass relation, main sequence lifetimes, stellar metallicity, white dwarf cooling ages and binary evolution are accounted for to estimate the precision and accuracy of each method. We conclude that no method is quantitatively better at determining the star formation history and all three produce star formation histories that agree within uncertainties of current external astrophysical relations.

en astro-ph.SR, astro-ph.GA
arXiv Open Access 2024
Towards a Brazilian History Knowledge Graph

Valeria de Paiva, Alexandre Rademaker

This short paper describes the first steps in a project to construct a knowledge graph for Brazilian history based on the Brazilian Dictionary of Historical Biographies (DHBB) and Wikipedia/Wikidata. We contend that large repositories of Brazilian-named entities (people, places, organizations, and political events and movements) would be beneficial for extracting information from Portuguese texts. We show that many of the terms/entities described in the DHBB do not have corresponding concepts (or Q items) in Wikidata, the largest structured database of entities associated with Wikipedia. We describe previous work on extracting information from the DHBB and outline the steps to construct a Wikidata-based historical knowledge graph.

en cs.AI, cs.DL
arXiv Open Access 2024
Relational Graph in Vector Autoregression: A Case Study on the Effect of the Great Recession on Connectivity of Economic Indicators

Arkaprava Roy, Anindya Roy, Subhashis Ghosal

Under a high-dimensional vector autoregressive (VAR) model, we propose a way of efficiently estimating both the stationary graph structure between the nodal time series and their temporal dynamics. The framework is then used to make inferences on the change in interdependencies between several economic indicators due to the impact of the Great Recession, the financial crisis that lasted from 2007 through 2009. There are several key advantages of the proposed framework; (1) it develops a reparametrized VAR likelihood that can be used in general high-dimensional VAR problems, (2) it strictly maintains causality of the estimated process, making inference on stationary features more meaningful and (3) it is computationally efficient due to the reduced rank structure of the parameterization. We apply the methodology to the seasonally adjusted quarterly economic indicators available in the FRED-QD database of the Federal Reserve. The analysis essentially confirms much of the prevailing knowledge about the impact of the Great Recession on different economic indicators. At the same time, it provides deeper insight into the nature and extent of the impact on the interplay of the different indicators. We also contribute to the theory of Bayesian VAR by showing the consistency of the posterior under sparse priors for the parameters of the reduced rank formulation of the VAR process.

en stat.ME, math.ST
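The VAR framework the abstract above builds on can be illustrated with a minimal sketch: this is a plain VAR(1) model $x_t = A x_{t-1} + \varepsilon_t$ simulated and re-estimated by ordinary least squares, not the paper's Bayesian reduced-rank method, and all parameter values are illustrative.

```python
import numpy as np

# Plain VAR(1): x_t = A @ x_{t-1} + noise. A spectral radius below 1
# keeps the process causal/stationary, the property the paper enforces.
rng = np.random.default_rng(0)

A_true = np.array([[0.5, 0.2],
                   [0.0, 0.4]])  # illustrative coefficient matrix
T = 5000
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = A_true @ x[t - 1] + 0.1 * rng.standard_normal(2)

# OLS estimate: regress x_t on x_{t-1}; lstsq solves x[:-1] @ A.T ~ x[1:]
A_hat, *_ = np.linalg.lstsq(x[:-1], x[1:], rcond=None)
A_hat = A_hat.T
```

With 5000 observations the OLS estimate recovers `A_true` to within a few hundredths, which is the baseline against which structured (reduced-rank, sparse) estimators like the paper's are compared.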
arXiv Open Access 2024
Constraints on the Early Luminosity History of the Sun: Applications to the Faint Young Sun Problem

Connor Basinger, Marc Pinsonneault, Sandra T. Bastelberger et al.

Stellar evolution theory predicts that the Sun was fainter in the past, which can pose difficulties for understanding Earth's climate history. One proposed solution to this Faint Young Sun (FYS) problem is a more luminous Sun in the past. In this paper, we address the robustness of the solar luminosity history using the YREC code to compute solar models including rotation, magnetized winds, and the associated mass loss. We present detailed solar models, including their evolutionary history, which are in excellent agreement with solar observables. Consistent with prior standard models, we infer a high solar metal content. We provide predicted X-ray luminosities and rotation histories for usage in climate reconstructions and activity studies. We find that the Sun's luminosity deviates from the standard solar model trajectory by at most 0.5% during the Archean (corresponding to a radiative forcing of 0.849 W m$^{-2}$). The total mass loss experienced by solar models is modest because of strong feedback between mass and angular momentum loss. We find a maximum mass loss of $1.35 \times 10^{-3} M_\odot$ since birth, at or below the level predicted by empirical estimates. The associated maximum luminosity increase falls well short of the level necessary to solve the FYS problem. We present compilations of paleotemperature and CO$_2$ reconstructions. 1-D "inverse" climate models demonstrate a mismatch between the solar constant needed to reach high temperatures (e.g. 60-80 $^{\circ}$C) and the narrow range of plausible solar luminosities determined in this study. Maintaining a temperate Earth, however, is plausible given these conditions.

en astro-ph.SR, astro-ph.EP
arXiv Open Access 2023
Explainable History Distillation by Marked Temporal Point Process

Sishun Liu, Ke Deng, Yan Wang et al.

Explainability of machine learning models is mandatory when researchers introduce these commonly believed black boxes to real-world tasks, especially high-stakes ones. In this paper, we build a machine learning system that automatically generates explanations of observed events from history via counterfactual analysis based on the marked temporal point process (TPP). Specifically, we propose a new task called explainable history distillation (EHD), which requires a model to distill as few events as possible from the observed history such that the event distribution conditioned on the remaining events predicts the observed future noticeably worse. We then regard the distilled events as the explanation for the future. To solve EHD efficiently, we rewrite the task as a 0-1 integer program and directly estimate its solution with a learned model. This work fills the gap between our task and existing works, which only spot the difference between factual and counterfactual worlds after applying a predefined modification to the environment. Experimental results on the Retweet and StackOverflow datasets show that the proposed model significantly outperforms other EHD baselines and can reveal the rationale underpinning real-world processes.

en cs.LG
arXiv Open Access 2022
Acoustic Modeling for End-to-End Empathetic Dialogue Speech Synthesis Using Linguistic and Prosodic Contexts of Dialogue History

Yuto Nishimura, Yuki Saito, Shinnosuke Takamichi et al.

We propose an end-to-end empathetic dialogue speech synthesis (DSS) model that considers both the linguistic and prosodic contexts of dialogue history. Empathy is the active attempt by humans to get inside the interlocutor in dialogue, and empathetic DSS is a technology to implement this act in spoken dialogue systems. Our model is conditioned by the history of linguistic and prosody features for predicting appropriate dialogue context. As such, it can be regarded as an extension of the conventional linguistic-feature-based dialogue history modeling. To train the empathetic DSS model effectively, we investigate 1) a self-supervised learning model pretrained with large speech corpora, 2) a style-guided training using a prosody embedding of the current utterance to be predicted by the dialogue context embedding, 3) a cross-modal attention to combine text and speech modalities, and 4) a sentence-wise embedding to achieve fine-grained prosody modeling rather than utterance-wise modeling. The evaluation results demonstrate that 1) simply considering prosodic contexts of the dialogue history does not improve the quality of speech in empathetic DSS and 2) introducing style-guided training and sentence-wise embedding modeling achieves higher speech quality than that by the conventional method.

en cs.SD, cs.CL
arXiv Open Access 2014
Numerical Implementation of a Cohesive Zone Model in History-Dependent Materials

L. Hakim, S. E. Mikhailov

A non-linear history-dependent cohesive zone model of crack propagation in linear elastic and visco-elastic materials is presented. The viscoelasticity is described by a linear Volterra integral operator in time. The normal stress on the cohesive zone satisfies the history dependent yield condition, given by a non-linear Abel-type integral operator. The crack starts propagating, breaking the cohesive zone, when the crack tip opening reaches a prescribed critical value. A numerical algorithm for computing the evolution of the crack and cohesive zone in time is discussed along with some numerical results.

arXiv Open Access 2013
The IMACS Cluster Building Survey: IV. The Log-normal Star Formation History of Galaxies

Michael D. Gladders, Augustus Oemler, Alan Dressler et al.

We present here a simple model for the star formation history of galaxies that is successful in describing both the star formation rate density over cosmic time, as well as the distribution of specific star formation rates of galaxies at the current epoch, and the evolution of this quantity in galaxy populations to a redshift of z=1. We show first that the cosmic star formation rate density is remarkably well described by a simple log-normal in time. We next postulate that this functional form for the ensemble is also a reasonable description for the star formation histories of individual galaxies. Using the measured specific star formation rates for galaxies at z~0 from Paper III in this series, we then construct a realisation of a universe populated by such galaxies in which the parameters of the log-normal star formation history of each galaxy are adjusted to match the specific star formation rates at z~0 as well as fitting, in ensemble, the cosmic star formation rate density from z=0 to z=8. This model predicts, with striking fidelity, the distribution of specific star formation rates in mass-limited galaxy samples to z=1; this match is not achieved by other models with a different functional form for the star formation histories of individual galaxies, but with the same number of degrees of freedom, suggesting that the log-normal form is well matched to the likely actual histories of individual galaxies. We also impose the specific star formation rate versus mass distributions at higher redshifts from Paper III as constraints on the model, and show that, as previously suggested, some galaxies in the field, particularly low mass galaxies, are quite young at intermediate redshifts. As emphasized in Paper III, starbursts are insufficient ...[abridged]

en astro-ph.CO
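The log-normal star formation history described in the abstract above has a simple closed form, SFR$(t) \propto (1/t)\,\exp[-(\ln t - t_0)^2 / (2\tau^2)]$. A minimal sketch, with illustrative parameter values rather than fits from the paper:

```python
import math

def lognormal_sfr(t, t0=1.5, tau=0.8):
    """Log-normal star formation rate vs. cosmic time t (t > 0, e.g. Gyr).

    Normalized as a log-normal pdf in t, so it integrates to 1 over t > 0;
    t0 and tau set the (log-space) peak time and width. Values here are
    illustrative, not the paper's fitted parameters.
    """
    return math.exp(-(math.log(t) - t0) ** 2 / (2.0 * tau ** 2)) / (
        t * tau * math.sqrt(2.0 * math.pi)
    )
```

The form rises, peaks at $t = e^{t_0 - \tau^2}$, and declines with a long tail, which is what lets a single two-parameter family track both early bursts and the late decline of the cosmic star formation rate density.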
arXiv Open Access 2013
Consistent Histories as Valuations

Yousef Ghazi-Tabatabai

Sorkin's coevent interpretation shifts the focus of quantum logic from the structure of a propositional lattice to the nature of truth valuations thereon. We apply this shift in emphasis to a simple formulation of the consistent histories approach, expressing it in terms of truth valuations which are brought together in a logical framework. We see that these consistent histories valuations are related to Sorkin's multiplicative coevents, and that they can be naturally described by an adaptation of Isham's early topos-theoretic approach.

en quant-ph
arXiv Open Access 2009
Star Formation Histories, Abundances and Kinematics of Dwarf Galaxies in the Local Group

E. Tolstoy, V. Hill, M. Tosi

Within the Local Universe galaxies can be studied in great detail star by star, and here we review the results of quantitative studies in nearby dwarf galaxies. The Color-Magnitude Diagram synthesis method is well established as the most accurate way to determine the star formation history of galaxies back to the earliest times. This approach received a large boost from the exceptional data sets that wide-field CCD imagers on the ground and the Hubble Space Telescope could provide. Spectroscopic studies using large ground-based telescopes such as VLT, Magellan, Keck and HET have allowed the determination of abundances and kinematics for significant samples of stars in nearby dwarf galaxies. These studies have shown how the properties of stellar populations can vary spatially and temporally, placing important constraints on theories of galaxy formation and evolution. The combination of spectroscopy and imaging, and what they have taught us about dwarf galaxy formation and evolution, is the aim of this review.

en astro-ph.CO, astro-ph.GA
CrossRef Open Access 2004
A Commentary on the History of Social Psychiatry and Psychotherapy in Twentieth-Century Germany, Holland and Great Britain

Michael Neve

The detailed essays in this special issue of Medical History provide an opportunity for reflection on common themes as well as on differing medical and historical contexts, specifically examining the organization and practice of European social psychiatry, its various definitions, as well as the history of psychotherapy, in twentieth-century Germany, Holland and Great Britain. The chance has also arisen for one of the two guest editors to comment briefly on various other points that seem pertinent, by way of brief introduction. His fellow guest editor, Harry Oosterhuis, is the author of one essay and co-author of another.

Page 52 of 120,491