Background: Ethiopia has been implementing a community health extension program (HEP) since 2003. We aimed to assess the successes and challenges of the HEP over time and to develop a framework that may assist the implementation of the program toward universal primary healthcare services.

Methods: We conducted a systematic review and synthesis of the literature on the HEP in Ethiopia between 2003 and 2018. The literature search was carried out in the PubMed, Embase, and Google Scholar databases. Search strategies were developed using medical subject headings (MeSH) and text words related to the aim of the review. We used a three-stage screening process to select the publications. Data extraction was conducted by three reviewers using a pre-prepared data extraction form. We conducted an interpretive (not aggregative) synthesis of the studies.

Findings: The HEP enabled Ethiopia to achieve significant improvements in maternal and child health, communicable diseases, hygiene and sanitation, and health knowledge and care-seeking. The HEP has been a learning organization that adapts itself to community demands, and the program is dynamic enough to shift tasks between health centers and the community. The community has been a key player in the successful implementation of the HEP. In spite of these successes, the program currently faces challenges that remain to be addressed, related to the productivity and efficiency of health extension workers (HEWs); the working and living conditions of HEWs; the capacity of health posts; and the social determinants of health. Addressing them requires a systemic approach that involves the wider health system, the community, and the sectors responsible for social determinants of health. We developed a framework that may assist in the implementation of the HEP.

Conclusion: The HEP has enabled Ethiopia to achieve significant improvements. However, several challenges remain to be addressed. The framework can be utilized to improve community health programs toward universal coverage of primary healthcare services.
Jet substructure provides one of the most exciting new approaches for searching for physics in and beyond the Standard Model at the Large Hadron Collider. Modern jet substructure searches are often performed with neural network (NN) taggers that study the jets' radiation distributions in great detail, far beyond what is theoretically described by parton shower generators. While this represents a great opportunity, as NNs look deeper into the structure of jets they become increasingly sensitive to both perturbative and non-perturbative theoretical uncertainties. It is therefore important to be able to control which aspects of both regimes the networks focus on, and to develop techniques for quantifying these uncertainties. In this paper we take two steps in this direction. First, we introduce EnFNs, a generalization of Energy Flow Networks (EFNs) that directly probes higher-point correlations in jets, as motivated by recent advances in the study of energy correlators. Second, we introduce a number of techniques to quantify and visualize their robustness to non-perturbative corrections. We highlight the importance of such considerations in a toy study that incorporates systematics into a search and maximizes the network's discovery significance, as opposed to its absolute tagging performance. We hope this study furthers interest in understanding the role QCD systematics play in machine learning applications and opens the door to a better interplay between theory and experiment in HEP.
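For readers unfamiliar with the EFN construction, the sketch below illustrates the general shape of such architectures under simplifying assumptions: an energy-weighted sum of per-particle embeddings (the standard EFN form) alongside a hypothetical pairwise-weighted variant, loosely mirroring how higher-point energy correlations could enter. The embedding and weighting here are toy stand-ins, not the EnFN architecture introduced in the paper.

```python
# Schematic illustration only: the standard EFN form O = F(sum_i z_i * Phi(y_i, phi_i))
# and a toy two-point variant weighted by z_i * z_j. Phi is a fixed toy map here,
# whereas in a real (E)FN it would be a learned neural network.
import numpy as np

def phi(y, ph):
    # Toy per-particle embedding standing in for a learned Phi network.
    return np.array([y, ph, y * ph, y**2 + ph**2])

def efn_latent(zs, coords):
    # One-point, EFN-like latent: energy-fraction-weighted sum over particles.
    return sum(z * phi(y, ph) for z, (y, ph) in zip(zs, coords))

def two_point_latent(zs, coords):
    # Hypothetical two-point latent: pairwise weights z_i * z_j applied to an
    # embedding of the angular separation, mimicking a two-point correlation.
    latent = np.zeros(2)
    for i in range(len(zs)):
        for j in range(len(zs)):
            dy = coords[i][0] - coords[j][0]
            dphi = coords[i][1] - coords[j][1]
            dr = np.hypot(dy, dphi)
            latent += zs[i] * zs[j] * np.array([dr, dr**2])
    return latent

# Toy jet: momentum fractions z_i and (rapidity, azimuth) coordinates per particle.
zs = np.array([0.5, 0.3, 0.2])
coords = [(0.1, 0.0), (-0.2, 0.3), (0.05, -0.1)]
print(efn_latent(zs, coords), two_point_latent(zs, coords))
```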
Alexander Held, Elliott Kauffman, Oksana Shadura
et al.
Realistic environments for prototyping, studying, and improving analysis workflows are a crucial element on the way towards user-friendly physics analysis at HL-LHC scale. The IRIS-HEP Analysis Grand Challenge (AGC) provides such an environment. It defines a scalable and modular analysis task that captures relevant workflow aspects, ranging from large-scale data processing and the handling of systematic uncertainties to statistical inference and analysis preservation. By being based on publicly available Open Data, the AGC provides a point of contact for the broader community. Multiple implementations of the analysis task, making use of various pipelines and software stacks, already exist. This contribution presents an updated AGC analysis task. It features a machine learning component and expanded analysis complexity, including the handling of an extended and more realistic set of systematic uncertainties. These changes both align the AGC further with analysis needs at the HL-LHC and allow for probing an increased set of functionality. Another focus is showcasing a reference AGC implementation, which is heavily based on the HEP Python ecosystem and uses modern analysis facilities. The integration of various data delivery strategies is described, resulting in multiple analysis pipelines that are compared to each other.
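As a rough illustration of the workflow shape the AGC exercises, the following framework-free sketch processes one chunk of toy events, applies a selection, and fills one histogram per systematic variation. It is a minimal assumption-laden stand-in; the actual reference implementation relies on the HEP Python ecosystem (e.g. uproot, awkward-array, coffea, cabinetry) rather than this hand-rolled NumPy version.

```python
# Minimal, framework-free sketch of the AGC-style workflow shape (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def process_chunk(jet_pt, weights, systematics):
    """Apply a toy event selection and fill one histogram per systematic variation."""
    hists = {}
    for name, scale in systematics.items():
        varied_pt = jet_pt * scale           # e.g. a jet-energy-scale-like variation
        mask = varied_pt > 50.0              # toy selection cut
        hists[name], _ = np.histogram(varied_pt[mask], bins=25, range=(0, 500),
                                      weights=weights[mask])
    return hists

# Toy data standing in for one chunk of events delivered by the pipeline.
jet_pt = rng.exponential(scale=80.0, size=100_000)
weights = np.ones_like(jet_pt)
systematics = {"nominal": 1.0, "jes_up": 1.05, "jes_down": 0.95}

hists = process_chunk(jet_pt, weights, systematics)
for name, h in hists.items():
    print(name, h.sum())
```

In a full pipeline, the per-variation histograms would then feed a statistical inference step and be preserved alongside the analysis code.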
Searches for Beyond the Standard Model physics require probing the Standard Model with increased precision. One way this can be achieved is by improving the accuracy of event selection classifiers. Recently, Gene Expression Programming (GEP) has been shown to provide complex yet easy-to-interpret classifiers in various fields. Previous attempts to apply GEP to high-energy physics (HEP), though limited by the available computational power, achieved classifier accuracies of up to 95%. In this paper, we demonstrate that a selection algorithm optimized by GEP and applied to the semi-leptonic decay channel of top-quark pair production can further increase data purity for an already highly pure sample. Moreover, we explain how adding penalty cuts to the purity fitness function allows the optimized classifier to be adjusted to the needs of a specific measurement in terms of the size of the selected event sample and the data purity.
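The purity-plus-penalty idea can be sketched as follows; the specific penalty form, threshold, and toy data are illustrative assumptions rather than the fitness function used in the paper.

```python
# Hedged sketch of a purity-based fitness with a penalty term, as one might use
# when evolving selection classifiers with GEP. Penalty shape and numbers are
# illustrative assumptions, not the paper's definition.
import numpy as np

def fitness(selection_mask, is_signal, min_selected=1000, penalty_strength=0.5):
    """Purity of the selected sample, penalised when too few events survive."""
    n_selected = selection_mask.sum()
    if n_selected == 0:
        return 0.0
    purity = is_signal[selection_mask].mean()
    # Penalise selections that keep fewer events than the measurement needs.
    shortfall = max(0.0, 1.0 - n_selected / min_selected)
    return purity - penalty_strength * shortfall

# Toy example: a cut on a discriminating variable, as a GEP individual might encode.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(1.0, 0.5, 5000), rng.normal(0.0, 0.5, 20000)])
is_signal = np.concatenate([np.ones(5000, bool), np.zeros(20000, bool)])
mask = x > 0.8   # candidate selection expressed as a simple cut
print(round(fitness(mask, is_signal), 3))
```

Tightening the cut raises the purity term while the shortfall penalty pushes back once the selected sample becomes too small, which is the trade-off the penalty cuts are meant to control.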
Jim Pivarski, Eduardo Rodrigues, Kevin Pedro
et al.
This paper was prepared by the HEP Software Foundation (HSF) PyHEP Working Group as input to the second phase of the LHCC review of High-Luminosity LHC (HL-LHC) computing, which took place in November 2021. It describes the adoption of Python and data science tools in HEP, discusses the likelihood of future scenarios, and makes recommendations for action by the HEP community.
Gabriele Benelli, Thomas Y. Chen, Javier Duarte
et al.
The growing role of data science (DS) and machine learning (ML) in high-energy physics (HEP) is well established and pertinent given the complex detectors, large data sets, and sophisticated analyses at the heart of HEP research. Moreover, exploiting symmetries inherent in physics data has inspired physics-informed ML as a vibrant sub-field of computer science research. HEP researchers benefit greatly from widely available materials for use in education, training, and workforce development. They are also contributing to these materials and providing software to DS/ML-related fields. Increasingly, physics departments are offering courses at the intersection of DS, ML, and physics, often using curricula developed by HEP researchers and involving open software and data used in HEP. In this white paper, we explore synergies between HEP research and DS/ML education, discuss opportunities and challenges at this intersection, and propose community activities that will be mutually beneficial.
SoLAr is a new concept for liquid-argon neutrino detector technology that extends the sensitivity of these devices to the MeV energy range, expanding the physics reach of next-generation detectors to include solar neutrinos. We propose this novel concept to significantly improve the precision on the solar neutrino mixing parameters and to observe the "hep branch" of the proton-proton fusion chain. The SoLAr detector will achieve flavour-tagging of solar neutrinos in liquid argon. The SoLAr technology will be based on a monolithic light-charge pixel-based readout, which addresses the main requirements for such a detector: a low energy threshold with excellent energy resolution (approximately 7%) and background rejection through pulse-shape discrimination. The SoLAr concept is also timely as a possible technology choice for the DUNE "Module of Opportunity", which could serve as a next-generation multi-purpose observatory for neutrinos from the MeV to the GeV range. The goal of SoLAr is to observe solar neutrinos in a 10 ton-scale detector and to demonstrate that the required background suppression and energy resolution can be achieved. SoLAr will pave the way for a precise measurement of the 8B flux, an improved precision on the solar neutrino mixing parameters, and ultimately the first observation of hep neutrinos in the DUNE Module of Opportunity.
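As background on the pulse-shape discrimination mentioned above, the sketch below computes a prompt-light fraction (often called F90 in liquid argon), the quantity commonly used to separate event classes by scintillation time structure. The decay constants, photon counts, and fast-light fractions are toy values for illustration, not SoLAr specifications.

```python
# Illustrative sketch of pulse-shape discrimination in liquid argon via the
# prompt-light fraction: the share of scintillation photons arriving within a
# short prompt window relative to the total. All numbers here are toy values.
import numpy as np

def prompt_fraction(photon_times_ns, prompt_window_ns=90.0):
    """Fraction of detected photons arriving within the prompt window."""
    photon_times_ns = np.asarray(photon_times_ns)
    return (photon_times_ns < prompt_window_ns).mean()

rng = np.random.default_rng(2)

def toy_event(n_photons, fast_fraction):
    # Toy two-component scintillation: fast (~6 ns) and slow (~1500 ns) decays,
    # with the fast/slow balance depending on the type of energy deposition.
    n_fast = rng.binomial(n_photons, fast_fraction)
    return np.concatenate([rng.exponential(6.0, n_fast),
                           rng.exponential(1500.0, n_photons - n_fast)])

electron_like = toy_event(500, fast_fraction=0.3)
nuclear_like = toy_event(500, fast_fraction=0.7)
print(prompt_fraction(electron_like), prompt_fraction(nuclear_like))
```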
In High Energy Physics (HEP), analysis metadata comes in many forms -- from theoretical cross-sections, to calibration corrections, to details about file processing. Correctly applying metadata is a crucial and often time-consuming step in an analysis, but designing analysis metadata systems has historically received little direct attention. Among other considerations, an ideal metadata tool should be easy to use by new analysers, should scale to large data volumes and diverse processing paradigms, and should enable future analysis reinterpretation. This document, which is the product of community discussions organised by the HEP Software Foundation, categorises types of metadata by scope and format and gives examples of current metadata solutions. Important design considerations for metadata systems, including sociological factors, analysis preservation efforts, and technical factors, are discussed. A list of best practices and technical requirements for future analysis metadata systems is presented. These best practices could guide the development of a future cross-experimental effort for analysis metadata tools.
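To make the metadata categories concrete, here is a hypothetical sketch of how per-dataset metadata (cross-sections, normalisation inputs, calibration pointers) might be grouped and consumed in an analysis; the field names and structure are purely illustrative and are not a scheme proposed by the document.

```python
# Hypothetical sketch of analysis metadata organised per dataset; illustrative only.
from dataclasses import dataclass, field

@dataclass
class DatasetMetadata:
    dataset_name: str
    cross_section_pb: float          # theoretical cross-section, in picobarns
    generator_sum_of_weights: float  # needed for luminosity normalisation
    calibration_tag: str             # e.g. a pointer to a calibration snapshot

@dataclass
class AnalysisMetadata:
    luminosity_ifb: float
    datasets: dict = field(default_factory=dict)

    def normalisation(self, name):
        """Per-event weight scaling a simulated sample to the recorded luminosity."""
        d = self.datasets[name]
        return d.cross_section_pb * 1000.0 * self.luminosity_ifb / d.generator_sum_of_weights

meta = AnalysisMetadata(luminosity_ifb=140.0)
meta.datasets["ttbar"] = DatasetMetadata("ttbar", 832.0, 7.6e10, "calib-v3")
print(meta.normalisation("ttbar"))
```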