Alyyah Malick, Benjamin Lebwohl, Peter H.R. Green et al.
Results for "hep-ex"
Showing 20 of ~757,813 results · from DOAJ, CrossRef, Semantic Scholar, arXiv
Sameera Shuaibi, Ian Tobal, James Gore et al.
Chun-Han Lo, Rahul Pannala, N. Jewel Samadder
Meredith Yellen, Rangesh Modi, Rebecca Yao et al.
Miguel E. Gomez, Harmeet Malhi, Douglas Simonetto
Graeme Andrew Stewart, Sanmay Ganguly, Sattwamo Ghosh, Philippe Gras et al.
Jet reconstruction remains a critical task in the analysis of data from HEP colliders. We describe in this paper a new, highly performant Julia package for jet reconstruction, JetReconstruction.jl, which integrates into the growing ecosystem of Julia packages for HEP. With this package users can run sequential jet reconstruction algorithms. In particular, for LHC events, the anti-$k_\text{T}$, Cambridge/Aachen and inclusive $k_\text{T}$ algorithms can be used. For FCCee studies, alternative algorithms such as the generalised $k_\text{T}$ for $e^+e^-$ and Durham are also supported. The performance of the core algorithms is better than FastJet's C++ implementation for typical LHC and FCCee events, thanks to the Julia compiler's exploitation of single-instruction-multiple-data (SIMD) operations, as well as ergonomic compact data layouts. The full reconstruction history is made available, allowing inclusive and exclusive jets to be retrieved. The package also provides the means to visualise the reconstruction. Substructure algorithms have been added that allow advanced analysis techniques to be employed. The package can read event data from EDM4hep files and reconstruct jets from these directly, opening the door to FCCee and other future collider studies in Julia.
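For readers unfamiliar with the sequential-recombination family the abstract names, here is a rough, language-neutral Python sketch of a generalised-$k_\text{T}$ clustering loop (this is not JetReconstruction.jl's Julia API; the `cluster` helper and its signature below are hypothetical, and the loop is a naive reference implementation):

```python
import math

# Naive O(N^3) generalised-kT sequential recombination.
# p = -1: anti-kT; p = 0: Cambridge/Aachen; p = +1: inclusive kT.
# Particles are (px, py, pz, E) tuples with E > |pz| (physical 4-momenta).

def pt2(v):                      # squared transverse momentum
    return v[0] ** 2 + v[1] ** 2

def rapidity(v):
    return 0.5 * math.log((v[3] + v[2]) / (v[3] - v[2]))

def phi(v):
    return math.atan2(v[1], v[0])

def delta_r2(a, b):              # rapidity-azimuth distance squared
    dphi = abs(phi(a) - phi(b))
    if dphi > math.pi:
        dphi = 2.0 * math.pi - dphi
    return (rapidity(a) - rapidity(b)) ** 2 + dphi ** 2

def cluster(particles, R=0.4, p=-1):
    """Return final jets as (px, py, pz, E) tuples."""
    work = [tuple(v) for v in particles]
    jets = []
    while work:
        # Beam distance diB = pT^(2p); pt2 ** p == pT^(2p).
        i_beam = min(range(len(work)), key=lambda i: pt2(work[i]) ** p)
        d_best = pt2(work[i_beam]) ** p
        pair = None
        # Pairwise distance dij = min(pTi^(2p), pTj^(2p)) * dR^2 / R^2.
        for i in range(len(work)):
            for j in range(i + 1, len(work)):
                dij = (min(pt2(work[i]) ** p, pt2(work[j]) ** p)
                       * delta_r2(work[i], work[j]) / R ** 2)
                if dij < d_best:
                    d_best, pair = dij, (i, j)
        if pair is None:
            jets.append(work.pop(i_beam))     # promote to a final jet
        else:
            i, j = pair                       # merge with E-scheme 4-vector sum
            merged = tuple(a + b for a, b in zip(work[i], work[j]))
            del work[j]                       # j > i, so delete j first
            work[i] = merged
    return jets

# Example: two collinear particles cluster into one jet, one stays separate.
particles = [(20.0, 0.0, 5.0, 21.0), (19.5, 0.5, 5.2, 20.4), (-8.0, 1.0, -2.0, 8.5)]
print(cluster(particles, R=0.4, p=-1))
```

Real implementations replace the O(N^3) scan with tiled nearest-neighbour structures, which is where the SIMD-friendly compact data layouts mentioned in the abstract pay off.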
Alexandre Arbey, Jamie Boyd, Daniel Britzger et al.
Data preservation significantly increases the scientific output of high-energy physics experiments during and after data acquisition. For new and ongoing experiments, the careful consideration of long-term data preservation in the experimental design contributes to improving computational efficiency and strengthening the scientific activity in HEP through Open Science methodologies. This contribution is based on 15 years of experience of the DPHEP collaboration in the field of data preservation and focuses on aspects relevant for the strategic programming of particle physics in Europe: the preparation of future programs using data sets preserved from previous similar experiments (e.g. HERA for EIC), and the use of LHC data long after the end of the data taking. The lessons learned from past collider experiments and recent developments open the way to a number of recommendations for the full exploitation of the investments made in large HEP experiments.
Allen Caldwell, John Farmer, Nelson Lopes et al.
We discuss the main elements of a collider facility based on proton-driven plasma wakefield acceleration. We show that very competitive luminosities could be reached for high energy $e^+e^-$ colliders. A first set of parameters was developed for a Higgs Factory indicating that such a scheme is indeed potentially feasible. There are clearly many challenges to the development of this scheme, including novel RF acceleration modules and high-precision, strong magnets for the proton driver. Challenges in the plasma acceleration stage include the ability to accelerate positrons while maintaining the necessary emittance, and the energy transfer efficiency from the driver to the witness. Since many exciting applications would become available from our approach, its development should be pursued.
Simon Waid, Philipp Gaggl, Andreas Gsponer et al.
Access to high-energy particle beams is key for testing high-energy physics (HEP) instruments. Accelerators for cancer treatment can serve as such a testing ground. However, HEP instrument tests typically require particle fluxes significantly lower than for cancer treatment. Thus, facilities need adaptations to fulfill both the requirements for cancer treatment and the requirements for HEP instrument testing. We report on the progress made in developing a beam monitor with a sufficient dynamic range to allow for the detection of single particles, while still being able to act as a monitor at the clinical particle rates of the MedAustron treatment facility. The beam monitor is designed for integration into existing accelerators.
Alexander Held, Sam Albin, Garhan Attebury et al.
The IRIS-HEP software institute, as a contributor to the broader HEP Python ecosystem, is developing scalable analysis infrastructure and software tools to address the upcoming HL-LHC computing challenges with new approaches and paradigms, driven by our vision of what HL-LHC analysis will require. The institute uses a "Grand Challenge" format, constructing a series of increasingly large, complex, and realistic exercises to show the vision of HL-LHC analysis. Recently, the focus has been demonstrating the IRIS-HEP analysis infrastructure at scale and evaluating technology readiness for production. As a part of the Analysis Grand Challenge activities, the institute executed a "200 Gbps Challenge", aiming to show sustained data rates into the event processing of multiple analysis pipelines. The challenge integrated teams internal and external to the institute, including operations and facilities, analysis software tools, innovative data delivery and management services, and scalable analysis infrastructure. The challenge showcased the prototypes (software, services, and facilities) built to process around 200 TB of data in both the CMS NanoAOD and ATLAS PHYSLITE data formats with test pipelines. The teams were able to sustain the 200 Gbps target across multiple pipelines, and the pipelines focusing on event rate were able to process at over 30 MHz. These target rates are demanding; the activity revealed considerations for future testing at this scale and the changes necessary for physicists to work at such scales in the future. The 200 Gbps Challenge has established a baseline on today's facilities, setting the stage for the next exercise at twice the scale.
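As a back-of-the-envelope consistency check (an illustration, not a figure reported by the challenge): a sustained 200 Gbps is 25 GB/s, so a single pass over the 200 TB dataset at the target rate takes

$$ t = \frac{200 \times 10^{12}\ \text{B}}{25 \times 10^{9}\ \text{B/s}} = 8000\ \text{s} \approx 2.2\ \text{h}, $$

i.e. the sustained target corresponds to turning the full sample around in roughly a couple of hours.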
Jason Aebischer, Atakan Tugberk Akmete, Riccardo Aliberti et al.
The kaon physics programme, long heralded as a cutting-edge frontier by the European Strategy for Particle Physics, continues to stand at the intersection of discovery and innovation in high-energy physics (HEP). With its unparalleled capacity to explore new physics at the multi-TeV scale, kaon research is poised to unveil phenomena that could reshape our understanding of the Universe. This document highlights the compelling physics case, with emphasis on exciting new opportunities for advancing kaon physics not only in Europe but also on a global stage. As an important player in the future of HEP, the kaon programme promises to drive transformative breakthroughs, inviting exploration at the forefront of scientific discovery.
Yuheng Zhang, Kunyan Lu, Li-qiu Yao et al.
Gene transfection, which involves introducing nucleic acids into cells, is a pivotal technology in the life sciences and medical fields, particularly in gene therapy. Surface-mediated transfection, primarily targeting cells adhering to surfaces, shows promise for enhancing cell transfection by localizing and presenting surface-bound nucleic acids directly to the cells. However, optimizing endocytosis for efficient delivery remains a persistent challenge. Additionally, ensuring efficient and non-traumatic cell harvest capability is crucial for applications such as ex vivo cell-based therapy. To address these challenges, we developed a photothermal platform with enzymatic degradation capability for efficient gene transfection and cell harvest. This platform is based on carbon nanotubes (CNTs) doped with poly(dimethylsiloxane) and modified with polyelectrolyte multilayers (PEMs) containing hyaluronic acid and quaternized chitosan, allowing for substantial loading of poly(ethyleneimine)/plasmid DNA (pDNA) complexes through electrostatic interactions. Upon irradiation with a near-infrared laser, the photothermal properties of CNTs enable high transfection efficiency by delivering pDNA into attached cells via a membrane disruption mechanism. The engineered cells can be harvested by treatment with a non-toxic hyaluronidase solution to degrade the PEMs, thus maintaining good viability for further applications. This platform has demonstrated remarkable efficacy across various cell lines (including Hep-G2 cells, Ramos cells and primary T cells), achieving a transfection efficiency exceeding 95 %, cell viability exceeding 90 %, and release efficiency surpassing 95 %, highlighting its potential for engineering living cells.
Yun-Fan Liaw
Samiksha Lamichhane, Kapil Adhikari
Paul Martin, Lawrence S. Friedman
S. Mandal, S. Chatterjee, A. Sen et al.
The Gas Electron Multiplier (GEM) is one of the most widely used gaseous detectors in High Energy Physics (HEP) experiments. GEMs are widely used as tracking devices due to their high-rate handling capability and good position resolution. An initiative has been taken to study the stability of the performance of GEM chamber prototypes in the laboratory, using external radiation and different argon-based gas mixtures. The effects of ambient parameters on the gain and energy resolution are studied. Very recently, some behavioural changes in the performance of an SM GEM chamber have been observed. The details of the experimental setup, methodology and results are reported here.
Ajana Löw, Martina Lotar Rihtarić, Ivana Vrselja
Background: Conservation of resources (COR) theory establishes a link between resource loss and the stress response. The aim of this study was to assess the contribution of resource loss, in the form of home damage, and the choice of active or passive coping strategies to PTSD symptoms in survivors of the 2020 Petrinja (Croatia) earthquake. Methods: A total of 374 adults (29.9% men) aged 18–64 years living in the counties surrounding the epicenter of the Petrinja earthquake participated in an online cross-sectional survey. The questionnaire included the PTSD Checklist for DSM-5 (PCL-5), the Coping Inventory, and a binary item assessing whether or not the participants' home was damaged. Results: Hierarchical regression analysis showed that home damage was a significant predictor of PTSD symptoms. Participants whose homes were damaged by the earthquake were significantly more likely to use passive coping strategies, namely avoidance and emotional venting, and one active coping strategy, action, than those whose homes were spared. Finally, more frequent use of passive coping was associated with a higher risk of PTSD symptoms. Conclusions: The study corroborates the COR-theory link between resource loss and the stress response, as well as the general consensus that passive coping is a less adaptive strategy than active coping. In addition to passive coping, individuals who lacked resources may have been inclined to take some active steps, both because they needed to repair or relocate their homes and because most buildings were only moderately to minimally damaged in the Petrinja earthquake.
Mirna Chehade, Jingwen Tan, Lauren T. Gehman
Navya Sadum, Jordan D. LeGout, Yan Bi
Bernhard Manfred Gruber, Guilherme Amadio, Stephan Hageböck
Particle transport simulations are a cornerstone of high-energy physics (HEP), constituting a substantial part of the computing workload performed in HEP. To boost the simulation throughput and energy efficiency, GPUs as accelerators have been explored in recent years, further driven by the increasing use of GPUs on HPCs. The Accelerated demonstrator of electromagnetic Particle Transport (AdePT) is an advanced prototype for offloading the simulation of electromagnetic showers in Geant4 to GPUs, and is still under continuous development and optimization. Improving memory layout and data access is vital to use modern, massively parallel GPU hardware efficiently, contributing to the challenge of migrating traditional CPU-based data structures to GPUs in AdePT. The low-level abstraction of memory access (LLAMA) is a C++ library that provides a zero-runtime-overhead data structure abstraction layer, focusing on multidimensional arrays of nested, structured data. It provides a framework for defining and switching custom memory mappings at compile time to define data layouts and instrument data access, making LLAMA an ideal tool to tackle the memory-related optimization challenges in AdePT. Our contribution shares insights gained with LLAMA when instrumenting data access inside AdePT, complementing traditional GPU profiler outputs. We demonstrate traces of read/write counts to data structure elements as well as memory heatmaps. The acquired knowledge allowed for subsequent data layout optimizations.
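As a language-neutral illustration of the layout problem LLAMA addresses (LLAMA itself is a C++ library; this NumPy sketch only contrasts array-of-structs with struct-of-arrays storage and is not LLAMA's API):

```python
import numpy as np

# Illustrative sketch only: contrasts array-of-structs (AoS) with
# struct-of-arrays (SoA) layouts for a particle-like record. None of
# these names are LLAMA API; LLAMA switches such mappings at compile
# time in C++ without changing the access code.

N = 1_000_000

# AoS: fields interleaved per particle. Summing `energy` strides across
# the full 16-byte record, pulling in bytes the loop never uses.
aos = np.zeros(N, dtype=[("x", "f4"), ("y", "f4"), ("z", "f4"), ("energy", "f4")])

# SoA: one contiguous array per field. Summing `energy` is a unit-stride
# sweep, the coalesced access pattern massively parallel GPUs reward.
soa = {name: np.zeros(N, dtype="f4") for name in ("x", "y", "z", "energy")}

total_aos = aos["energy"].sum()   # strided access (stride = record size)
total_soa = soa["energy"].sum()   # contiguous access
```

The per-field read/write counts and memory heatmaps described above are exactly the evidence needed to decide, data structure by data structure, which such mapping to compile in.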
Page 20 of 37891