Hamiltonian systems lie at the heart of modeling the physical world. Their defining scalar, the Hamiltonian, encodes both energy conservation and symplectic geometry in its phase-space trajectories. Recent deep learning approaches model Hamiltonian systems by embedding their properties either in the architecture or in the loss function. However, they typically ignore that (i) a Hamiltonian carries units of energy and (ii) every integrable Hamiltonian admits a canonical transformation to action-angle coordinates in which the dynamics reduce to a simple rotation on an invariant torus. We propose BuSyNet, a deep learning architecture that combines these two constraints via a dimensionally consistent, symplectic transformation. A symplectic layer maps input trajectories to lower-dimensional latent action-angle variables, which are then combined with system parameters to discover a symbolic Hamiltonian expression in units of energy. Evaluated on the harmonic oscillator and the Kepler two-body problem (in 2D and 3D), BuSyNet recovers concise, closed-form Hamiltonians that outperform state-of-the-art neural architectures in long-term prediction accuracy and stability, while maintaining interpretability.
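As a concrete instance of the action-angle reduction invoked above, the one-dimensional harmonic oscillator (a standard textbook example, not taken from the paper) transforms as

```latex
H(q,p) = \frac{p^2}{2m} + \frac{1}{2} m\omega^2 q^2,
\qquad
I = \frac{H}{\omega},
\qquad
\dot{\theta} = \frac{\partial H}{\partial I} = \omega ,
```

so the dynamics reduce to the uniform rotation $\theta(t) = \theta_0 + \omega t$ on an invariant circle, and the symbolic Hamiltonian $H = \omega I$ manifestly carries units of energy.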
Mario Santos, Margarida Reis Santos, Anna-Lena Zietlow
et al.
Early relational health during the first 24 months of life is a key determinant of child development and wellbeing. During this postnatal period, the parent–infant relationship plays a central role in emotional regulation, bonding, and developmental trajectories. Although the broader early relational health framework encompasses the first 1,000 days of life, this scoping review focuses specifically on the postnatal phase, where parent–infant interactions are directly observable and measurable. However, existing assessment instruments vary widely in their conceptual focus, scope, and characteristics, and no comprehensive review has systematically mapped tools used to assess the parent–infant relationship during early infancy. In response to this gap, a transdisciplinary working group within the COST Action CA22114 – TREASURE collaboratively developed a scoping review protocol to systematically map instruments assessing the parent–infant relationship from birth to 24 months of age. This Brief Report describes the collaborative methodological process underpinning the protocol’s development. The process followed an iterative, consensus-driven approach involving multidisciplinary experts from multiple COST member countries. Through structured online meetings, the group clarified core constructs and established the age range using the Population–Concept–Context (PCC) framework. The JBI methodology for scoping reviews was adopted and aligned with PRISMA-ScR standards to ensure transparency and reproducibility. Progressive drafting, internal peer review, and iterative refinement led to the final protocol, which was registered on the Open Science Framework (DOI: 10.17605/OSF.IO/HRVX9). The resulting protocol provides a replicable methodological framework for mapping instruments that assess the parent–infant relationship in the first two years of life.
This Brief Report presents a framework for collaborative protocol development in international research networks, promoting shared knowledge generation in early relational health research and offering potential applicability to other COST initiatives.
The integration of the history and philosophy of statistics was initiated at least by Hacking (1975) and advanced by Hacking (1990), Mayo (1996), and Zabell (2005), but it has not received sustained follow-up. Yet such integration is more urgent than ever, as the recent success of artificial intelligence has been driven largely by machine learning -- a field historically developed alongside statistics. Today, the boundary between statistics and machine learning is increasingly blurred. What we now need is integration, twice over: of history and philosophy, and of two fields they engage -- statistics and machine learning. I present a case study of a philosophical idea in machine learning (and in formal epistemology) whose root can be traced back to an often under-appreciated insight in Neyman and Pearson's 1936 work (a follow-up to their 1933 classic). This leads to the articulation of an epistemological principle -- largely implicit in, but shared by, the practices of frequentist statistics and machine learning -- which I call achievabilism: the thesis that the correct standard for assessing non-deductive inference methods should not be fixed, but should instead be sensitive to what is achievable in specific problem contexts. Another integration also emerges at the level of methodology, combining two ends of the philosophy of science spectrum: history and philosophy of science on the one hand, and formal epistemology on the other hand.
Hagit Attiya, Michael A. Bender, Martín Farach-Colton
et al.
A history-independent data structure does not reveal the history of operations applied to it, only its current logical state, even if its internal state is examined. This paper studies history-independent concurrent dictionaries, in particular, hash tables, and establishes inherent bounds on their space requirements. This paper shows that there is a lock-free history-independent concurrent hash table, in which each memory cell stores two elements and two bits, based on Robin Hood hashing. Our implementation is linearizable, and uses the shared memory primitive LL/SC. The expected amortized step complexity of the hash table is $O(c)$, where $c$ is an upper bound on the number of concurrent operations that access the same element, assuming the hash table is not overpopulated. We complement this positive result by showing that even if we have only two concurrent processes, no history-independent concurrent dictionary that supports sets of any size, with wait-free membership queries and obstruction-free insertions and deletions, can store only two elements of the set and a constant number of bits in each memory cell. This holds even if the step complexity of operations on the dictionary is unbounded.
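Robin Hood hashing, the basis of the positive result above, maintains the invariant that no element sits behind one that is closer to its own home bucket. A minimal sequential sketch in Python (illustrative only: class and method names are our own, and the paper's lock-free, linearizable LL/SC-based variant is substantially more involved):

```python
# Minimal sequential Robin Hood hash table (insert/lookup only).
class RobinHoodTable:
    def __init__(self, capacity=8):
        self.capacity = capacity
        # Each slot holds (key, probe_distance) or None.
        self.slots = [None] * capacity

    def _home(self, key):
        return hash(key) % self.capacity

    def insert(self, key):
        cand, dist = key, 0
        for _ in range(self.capacity):
            i = (self._home(cand) + dist) % self.capacity
            slot = self.slots[i]
            if slot is None:
                self.slots[i] = (cand, dist)
                return True
            if slot[0] == cand:
                return True  # key already present
            if slot[1] < dist:
                # Robin Hood rule: evict the "richer" resident (the one
                # closer to its home bucket) and keep inserting the evictee.
                self.slots[i], (cand, dist) = (cand, dist), slot
            dist += 1
        return False  # table full

    def contains(self, key):
        for dist in range(self.capacity):
            slot = self.slots[(self._home(key) + dist) % self.capacity]
            if slot is None or slot[1] < dist:
                return False  # the invariant permits early termination
            if slot[0] == key:
                return True
        return False
```

With a fixed tie-breaking rule, the final slot layout depends only on the set of stored keys, not the insertion order, which is why Robin Hood layouts are a natural basis for history independence.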
While Multimodal Large Language Models (MLLMs) have advanced GUI navigation agents, current approaches face limitations in cross-domain generalization and effective history utilization. We present a reasoning-enhanced framework that systematically integrates structured reasoning, action prediction, and history summarization. The structured reasoning component generates coherent Chain-of-Thought analyses combining progress estimation and decision reasoning, which inform both immediate action predictions and compact history summaries for future steps. Based on this framework, we train a GUI agent, \textbf{GUI-Rise}, through supervised fine-tuning on pseudo-labeled trajectories and reinforcement learning with Group Relative Policy Optimization (GRPO). This framework employs specialized rewards, including a history-aware objective, directly linking summary quality to subsequent action performance. Comprehensive evaluations on standard benchmarks demonstrate state-of-the-art results under identical training data conditions, with particularly strong performance in out-of-domain scenarios. These findings validate our framework's ability to maintain robust reasoning and generalization across diverse GUI navigation tasks. Code is available at https://leon022.github.io/GUI-Rise.
Measurements of jet substructure are key to probing the energy frontier at colliders, and many of them use track-based observables which take advantage of the angular precision of tracking detectors. Theoretical calculations of track-based observables require ‘track functions’, which characterize the transverse momentum fraction $r_q$ carried by charged hadrons from a fragmenting quark or gluon. This letter presents a direct measurement of $r_q$ distributions in dijet events from the 140 fb$^{-1}$ of proton–proton collisions at $\sqrt{s}=13$ TeV recorded with the ATLAS detector. The data are corrected for detector effects using machine-learning methods. The scale evolution of the moments of the $r_q$ distribution is sensitive to non-linear renormalization group evolution equations of QCD, and is compared with analytic predictions. When incorporated into future theoretical calculations, these results will enable a precision program of theory-data comparison for track-based jet substructure observables.
The thematization of sacrifice, although important in Adorno's work, is more often than not implicit, developed indirectly through his interpretation of the Odyssey. Our aim is to help reveal its modalities, its historicity and its stakes by comparing it with René Girard's mimetic theory, in which the conversion of the victim mechanism into sociogenetic sacrifice is decisive. After presenting the argument that posits sacrifice as a necessary form of violence because of its social meaning, we return to the place of sacrifice in Girard's work before sketching out how this could clarify Adorno’s historicization and social recontextualization of it.
Christiane Wesarg-Menzel, Mathilde Gallistl, Michael Niconchuk
et al.
Many refugees experience multiple traumatic events, which put them at increased risk of developing post-traumatic stress disorder (PTSD). To refine interventions aimed at improving refugees’ mental health, a better understanding of the factors modulating vulnerability to war-related trauma is needed. In the present study, we focused on stress resonance as a potential vulnerability factor. Stress resonance reflects the empathic sharing of others’ subjective and physiological stress experience. Sixty-seven participants who came from Arabic-speaking countries and had entered Germany as refugees or migrants took part in an empathic stress test, in which they observed a native German speaker undergo a psychosocial laboratory stressor. Meanwhile, different stress markers (subjective stress, heart rate, heart rate variability, and cortisol release) were simultaneously captured in the stressed targets and the passive observers. Moderation analyses did not support our hypothesis that the extent to which someone resonates with others’ stress is a vulnerability factor in the development of PTSD symptoms after trauma exposure. Rather, higher levels of subjective and autonomic stress resonance were directly related to PTSD symptom severity when controlling for sex, age, and trauma exposure. Our findings suggest that heightened stress resonance may constitute a malleable correlate of PTSD symptoms rather than a trait modulating health risk. In the future, efforts should be made to test whether individuals with a history of war-related trauma would benefit from interventions aimed at reducing the tendency to excessively share others’ stress.
Public health diplomacy addresses global challenges impacting societies, economies, the environment, and health by integrating foreign policy and development. The University of Memphis School of Public Health hosted a multistakeholder summit to identify strategies and competencies essential for effective public health diplomacy. The 3-day summit included 29 participants from 15 countries, representing the WHO, the World Federation of United Nations, and seven regional public health associations. An iterative human-centered design (HCD) approach and concept mapping were employed to facilitate discussions and generate actionable recommendations. The summit produced a working definition of public health diplomacy emphasizing cross-disciplinary collaboration, communication, negotiation, and consensus building, along with a 9-point action plan to establish a global framework, launch capacity-building initiatives, and institutionalize public health diplomacy as a public health discipline.
Joan Vedrí, Raquel Niclòs, Lluís Pérez-Planells
et al.
Surface air temperature (SAT) is an essential climate variable (ECV). Models based on remote sensing data allow us to study SAT without the need for a large network of meteorological stations, thereby enabling climate monitoring in remote and extensive areas. Niclos et al. (2014) proposed parametric equations for SAT retrieval over the Spanish Mediterranean basins. In this study, we evaluated those equations over a larger area and study period. In addition, we proposed several linear regression models and nonlinear models based on decision tree methods, non-parametric methods and neural networks. These models relate SAT to land surface temperature, vegetation indices and albedo from MODIS data. Moreover, meteorological reanalysis data from the ERA5-Land database and geographical parameters were used. The accuracy of each model was evaluated against data from meteorological stations operated by AEMET in the Spanish Mediterranean basins during the period 2021–2022. The equations of Niclos et al. (2014) obtained a robust root mean square error (RRMSE) of 3.1 K at daytime and 1.9 K at nighttime. For the linear regression models, the RRMSE decreased to 2.3 K (1.5 K) at daytime (nighttime). Finally, the nonlinear methods, in particular the XGBoost model, showed an RRMSE of 1.5 K at daytime and 1.0 K at nighttime. Therefore, the comparison between methods showed that nonlinear models, in particular those based on decision tree methods, offered the best results for SAT retrieval in our study.
Traditional imitation learning focuses on modeling the behavioral mechanisms of experts, which requires a large amount of interaction history generated by some fixed expert. However, in many streaming applications, such as streaming recommender systems, online decision-makers typically engage in online learning during the decision-making process, meaning that the interaction history generated by online decision-makers includes their behavioral evolution from novice to experienced expert. This poses a new challenge for existing imitation learning approaches, which can only utilize data from experienced experts. To address this issue, this paper proposes an inverse batched contextual bandit (IBCB) framework that can efficiently estimate the environment's reward parameters and the learned policy from the expert's behavioral evolution history. Specifically, IBCB formulates the inverse problem as a simple quadratic programming problem by utilizing the behavioral evolution history of the batched contextual bandit with inaccessible rewards. We demonstrate that IBCB is a unified framework for both deterministic and randomized bandit policies. The experimental results indicate that IBCB outperforms several existing imitation learning algorithms on synthetic and real-world data and significantly reduces running time. Additionally, empirical analyses reveal that IBCB exhibits better out-of-distribution generalization and is highly effective in learning the bandit policy from the interaction history of novice experts.
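The abstract states only that the inverse problem reduces to a quadratic program. As a generic stand-in (not the paper's actual formulation; the names and data below are purely illustrative), here is a tiny non-negative least-squares QP, $\min_w \|Xw - y\|^2$ subject to $w \ge 0$, solved by projected gradient descent:

```python
import numpy as np

def solve_qp_nonneg(X, y, lr=0.005, steps=3000):
    """Minimize ||X w - y||^2 subject to w >= 0 via projected gradient."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ w - y)            # gradient of the quadratic objective
        w = np.maximum(w - lr * grad, 0.0)  # project back onto the feasible set
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))        # illustrative "context" matrix
w_true = np.array([1.0, 0.0, 2.0])  # illustrative reward parameters
y = X @ w_true
w_hat = solve_qp_nonneg(X, y)
```

Projected gradient descent is only one of many QP solvers; the point is that such problems are convex and cheap, which is the efficiency claim the abstract makes.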
We overview the history of primordial black hole (PBH) research from the first papers around 50 years ago to the present epoch. The history may be divided into four periods, the dividing lines being marked by three key developments: inflation on the theoretical front and the detection of microlensing events by the MACHO project and gravitational waves by the LIGO/Virgo/KAGRA project on the observation front. However, they are also characterised by somewhat different focuses of research. The period 1967-1980 covered the groundbreaking work on PBH formation and evaporation. The period 1980-1996 mainly focussed on their formation, while the period 1996-2016 consolidated the work on formation but also collated the constraints on the PBH abundance. In the period 2016-2024 there was a shift of emphasis to the search for evidence for PBHs and - while opinions about the strength of the purported evidence vary - this has motivated more careful studies of some aspects of the subject. Certainly the soaring number of papers on PBHs in this last period indicates a growing interest in the topic.
Ridge and furrow fields are land-use-related surface structures that are widespread in Europe and represent a geomorphological key signature of the Anthropocene. Previous research has identified various reasons for the intentional and unintentional formation of these structures, such as the use of a mouldboard plough, soil improvement and drainage. We used GIS-based quantitative erosion modelling according to the Universal Soil Loss Equation (USLE) to calculate the erosion susceptibility of a selected study area in Southern Germany. We compared the calculated erosion susceptibility for two scenarios: (1) the present topography with ridges and furrows and (2) the smoothed topography without ridges and furrows. The ridges and furrows for the studied site reduce the erosion susceptibility by more than 50% compared to the smoothed surface. Thus, for the first time, we were able to identify lower soil erosion susceptibility as one of the possible causes for the formation of ridge and furrow fields. Finally, our communication paper points to future perspectives of quantitative analyses of historical soil erosion.
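The USLE referred to above is a product of empirical factors, $A = R \cdot K \cdot LS \cdot C \cdot P$. A minimal sketch (all factor values below are made up for illustration, not taken from the study):

```python
def usle_soil_loss(R, K, LS, C, P):
    """Mean annual soil loss A (t ha^-1 yr^-1) via the USLE.

    R  : rainfall-runoff erosivity factor
    K  : soil erodibility factor
    LS : combined slope length and steepness factor
    C  : cover-management factor
    P  : support-practice factor
    """
    return R * K * LS * C * P

# Ridges and furrows enter mainly through the topographic LS factor: a
# smoothed surface with longer, steeper effective slope paths yields a
# higher LS and hence a higher modelled soil loss A.
ridged = usle_soil_loss(R=60, K=0.35, LS=0.4, C=0.1, P=1.0)
smoothed = usle_soil_loss(R=60, K=0.35, LS=0.9, C=0.1, P=1.0)
```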
<p>The Franconian Alb of SE Germany is characterized by large-scale exposures
of Jurassic shallow marine limestones and dolostones, which are frequently
considered to be outcrop analogues for deep geothermal reservoir rocks in the
North Alpine Foreland Basin farther south. However, the burial history of
the Franconian Alb Jurassic strata is not well known as they were affected
by emersion, leading to extensive erosion and karstification with only
remnants of the original Cretaceous and Cenozoic cover rocks preserved. To
estimate the original thicknesses of the post-Jurassic overburden we
investigated the petrophysical properties and the thermal history of Lower
and Middle Jurassic mudstones to constrain their burial history in the
Franconian Alb area. We measured mudstone porosities, densities, and
maturities of organic material and collected interval velocities from
seismic refraction and logging data in shallow mudstone-rich strata.
Mudstone porosities and P-wave velocities perpendicular to bedding were then
related to a normal compaction trend that was calibrated on stratigraphically
equivalent units in the North Alpine Foreland Basin. Our results suggest
maximum burial depths of 900–1700 m, 300–1100 m of which is attributed
to Cretaceous and younger sedimentary rocks overlying the Franconian Alb
Jurassic units. Compared with previous considerations, this implies a more
widespread distribution and increased thicknesses of up to <span class="inline-formula">∼900</span> m for Cretaceous and up to <span class="inline-formula">∼200</span> m for Cenozoic units in
SE Germany. Maximum overburden is critical to understand mechanical and
diagenetic compaction of the dolostones and limestones of the Upper
Jurassic of the Franconian Alb. The results of this study therefore help
to better correlate the deep geothermal reservoir properties of the Upper
Jurassic from outcrop to reservoir conditions below the North Alpine
Foreland Basin. Here, the Upper Jurassic geothermal reservoir can be found
at depths of up to 5000 m.</p>
AbstractThis chapter will focus on the two decades after 1945, the period of the “post-war society” (1945–1967), which in the historical sciences is also characterized as a period of boom (keywords: “Wirtschaftswunder” (“economic miracle”), expansion of the welfare state, expansion of the educational sector, certainty about the future) and which comes to an end in the 1970s. Germany was undergoing a profound process of change: socio-structural changes in an advanced industrial society, structural changes in the family and a retreat into the private sphere, new opportunities in the areas of consumption and leisure due to the “Wirtschaftswunder,” urbanization and changes in communities, “Western Integration” (“Westbindung”), the ban on the KPD (Communist Party of Germany) in 1956, remilitarization, the development of the mass media and mass motorization, and the repression of the Nazi past were central social and sociological issues. At the same time, fascist tendencies were still virulent during the 1950s and 1960s. After 1945, sociology had to be rebuilt. Journals were refounded or newly founded, the German Sociological Association was restored and sociology was re-established as a teaching subject. Different “schools” and regional centers of sociology emerged. The so-called Cologne School centered around René König, the Frankfurt School around Adorno and Horkheimer, and the circle around Helmut Schelsky should be mentioned in particular; but also, Wolfgang Abendroth, Werner Hofmann, and Heinz Maus (Marburg School), Otto Stammer (Berlin), Arnold Bergstraesser (Freiburg i.Br.), and Helmuth Plessner (Göttingen). Despite their theoretical and political differences, up until the 1950s, they all had in common the decisive will for political and social enlightenment regarding the post-war situation. 
Furthermore, the particular importance that empirical social research and non-university research institutions had for the further development of sociology after 1945 is worth mentioning. At the end of the 1950s, field-specific dynamics gained momentum. The different “schools” and groups tried to secure and expand their position in the sociological field and their divergent research profiles became increasingly visible. The so-called civil war in sociology drove the actors further apart. Additionally, disciplinary struggles and camp-building processes during the first 20 years of West German sociology revolved around the debate on role theory and the dispute over positivism. By the end of the 1950s, an institutional and generational change can be observed. The so-called post-war generation, which included Ralf Dahrendorf, Jürgen Habermas, Niklas Luhmann, Erwin K. Scheuch, Heinrich Popitz, Hans Paul Bahrdt, M. Rainer Lepsius, and Renate Mayntz, assumed central positions in organizations, editorial boards of journals, and universities. While the early “schools” and circles (König, Schelsky, Adorno, and Horkheimer) initially focused on the sociology of the family and empirical research, the following generation concentrated foremost on industrial sociology, but also on topics of social structure and social stratification as well as on social mobility.
Sibylle Kautz-Freimuth, Marcus Redaèlli, Kerstin Rhiem
et al.
Background: Women with pathogenic BRCA1 and BRCA2 mutations have a high risk of developing breast and ovarian cancer. They face difficult choices when considering preventive options. This study presents the development process of the first decision aids (DAs) to support this complex decision-making process in the German healthcare system. Methods: A six-step development process based on the International Patient Decision Aid Standards was used, including a systematic literature review of existing decision aids, a topical medical literature review, preparation of the decision aids, focus group discussions with women with BRCA1/2 mutations, internal and external reviews by clinical and self-help experts, and user tests. All reviews were followed by iterative revisions. Results: No existing decision aids were transferable to the German setting. The medical literature review revealed a need to develop separate decision aids for women with BRCA1/2 mutations (A) without a history of cancer (previvors) and (B) with a history of unilateral breast cancer (survivors). The focus group discussions confirmed a high level of approval for the decision aids from both target groups. Additionally, previvors requested more information on risk-reducing breast surgery, risk-reducing removal of both ovaries and Fallopian tubes, and psychological aspects; survivors especially wanted more information on breast cancer on the affected side (e.g. biological parameters, treatment, and risk of recurrence). Conclusions: In a structured process, two target-group-specific DAs for previvors/survivors with BRCA1/2 mutations were developed to support decision-making on risk-adapted preventive options. These patient-oriented tools offer an important addition to existing specialist medical care in Germany.
Computer applications to medicine. Medical informatics
The paper presents an overview of the history and achievements of trans-border cooperation in the Lithuania-Germany-Poland triangle in planning instruments in Construction Management, decision-making theory, the application of Operational Research, and Multiple Criteria Decision Making (MCDM) methods in Civil Engineering and sustainable development. The multidimensional nature of this cooperation and the results of the Colloquiums, with their 35 years of tradition, are underlined. The research instruments, methods and studied phenomena are reviewed, and characteristic applications in engineering and economics are presented. The knowledge and combined efforts of three academic centers have created a synergy which set in motion many original methods and spectacular implementations. The Colloquium calendar and the evolution of organizational forms are presented, along with the inclusion of the informal EURO Working Group on Operations Research in Sustainable Development and Civil Engineering.
Mrigank Rochan, Mahesh Kumar Krishna Reddy, Linwei Ye
et al.
Recently, there has been increasing interest in highlight detection research, where the goal is to create a short-duration video from a longer video by extracting its interesting moments. However, most existing methods ignore the fact that the definition of a video highlight is highly subjective: different users may have different highlight preferences for the same input video. In this paper, we propose a simple yet effective framework that learns to adapt highlight detection to a user by exploiting the user's history in the form of highlights that the user has previously created. Our framework consists of two sub-networks: a fully temporal convolutional highlight detection network $H$ that predicts highlights for an input video, and a history encoder network $M$ for the user history. We introduce a newly designed temporal-adaptive instance normalization (T-AIN) layer in $H$ through which the two sub-networks interact. T-AIN has affine parameters that are predicted from $M$ based on the user history and is responsible for the user-adaptive signal to $H$. Extensive experiments on a large-scale dataset show that our framework makes more accurate and user-specific highlight predictions.
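Instance normalization with externally predicted affine parameters can be sketched as follows. This is a generic AdaIN-style construction in numpy under our own shape assumptions (channels × time features, per-channel affine parameters), not the authors' exact T-AIN layer:

```python
import numpy as np

def t_ain(x, gamma, beta, eps=1e-5):
    """Temporal-adaptive instance normalization sketch.

    x          : (channels, time) feature map from the highlight network H
    gamma, beta: (channels,) affine parameters predicted from the
                 user-history encoder M (here passed in directly)
    """
    mean = x.mean(axis=1, keepdims=True)  # per-channel mean over time
    std = x.std(axis=1, keepdims=True)    # per-channel std over time
    x_norm = (x - mean) / (std + eps)     # instance-normalize the temporal axis
    # The user-adaptive signal enters through the predicted scale and shift.
    return gamma[:, None] * x_norm + beta[:, None]

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 16))
gamma, beta = np.ones(4), np.zeros(4)  # identity affine -> plain normalization
y = t_ain(x, gamma, beta)
```

The design choice mirrors style transfer: the content stream is normalized, and all user-specific information is injected through the affine parameters alone.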
The infection fatality rate (IFR) of the Coronavirus Disease 2019 (COVID-19) is one of the most discussed figures in the context of this pandemic. Using German COVID-19 surveillance data and age-group-specific IFR estimates from multiple international studies, this work investigates time-dependent variations in the effective IFR over the course of the pandemic. Three different methods for estimating (effective) IFRs are presented: (a) population-averaged IFRs based on the assumption that the infection risk is independent of age and time, (b) effective IFRs based on the assumption that the age distribution of confirmed cases approximately reflects the age distribution of infected individuals, and (c) effective IFRs accounting for age- and time-dependent dark figures of infections (i.e., infections not captured in the surveillance data). Results show that effective IFRs in Germany are estimated to vary over time, as the age distributions of confirmed cases and estimated infections changed during the course of the pandemic. In particular, during the first and second waves of infections in spring and autumn/winter 2020, there was a pronounced shift in the age distribution of confirmed cases towards older age groups, resulting in larger effective IFR estimates. The temporary increase in effective IFR during the first wave is estimated to be smaller but still remains when adjusting for age- and time-dependent dark figures. A comparison of effective IFRs with observed case fatality rates (CFRs) indicates that a substantial fraction of the time-dependent variability in observed mortality can be explained by changes in the age distribution of infections. Furthermore, a vanishing gap between effective IFRs and observed CFRs is apparent after the first infection wave, while a moderately increasing gap can be observed during the second wave. Further research is warranted to obtain timely age-stratified IFR estimates.
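Method (b) above amounts to a weighted average of age-group-specific IFRs, with weights given by the age distribution of confirmed cases. A minimal sketch with made-up numbers (the IFR values and case shares below are illustrative, not estimates from the study):

```python
def effective_ifr(age_ifr, case_shares):
    """Weighted average of age-specific IFRs by the case age distribution."""
    assert abs(sum(case_shares) - 1.0) < 1e-9  # shares must sum to one
    return sum(ifr * share for ifr, share in zip(age_ifr, case_shares))

age_ifr = [0.0001, 0.005, 0.08]  # young / middle / old (illustrative)
summer = [0.60, 0.35, 0.05]      # cases skewed towards younger groups
winter = [0.35, 0.40, 0.25]      # case distribution shifted towards the elderly
```

With fixed age-specific IFRs, a shift of the case age distribution towards older groups alone raises the effective IFR, which is the mechanism the abstract describes for the 2020 waves.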