D. Fasshauer, R. B. Sutton, A. Brunger et al.
Results for "q-fin.PM"
Showing 20 of ~1,530,452 results · from CrossRef, Semantic Scholar
Jon C. R. Bennett, Hui Zhang
B. Song, S. Noda, T. Asano et al.
M. Rodahl, F. Höök, A. Krozer et al.
A. Sirunyan, A. Tumasyan, W. Adam et al.
This work summarizes and puts in an overall perspective studies done within the Compact Muon Solenoid (CMS) collaboration concerning the discovery potential for squarks and gluinos, sleptons, charginos and neutralinos, supersymmetric (SUSY) dark matter, the lightest Higgs boson, sparticle mass determination methods and the detector design optimization in view of SUSY searches. It represents the status of our understanding of these subjects as of summer 1997. As a benchmark we used the minimal supergravity-inspired supersymmetric standard model (mSUGRA) with a stable lightest supersymmetric particle (LSP). Discovery of supersymmetry at the Large Hadron Collider should be relatively straightforward. It may occur through the observation of large excesses of events in missing ET plus jets, or with one or more isolated leptons. An excess of trilepton events or isolated dileptons with missing ET, exhibiting a characteristic signature in the l+l− invariant mass distribution, could also be the first manifestation of SUSY production. Squarks and gluinos can be discovered for masses in excess of 2 TeV. Charginos and neutralinos can be discovered from an excess of events in dilepton or trilepton final states. Inclusive searches can give early indications from their copious production in squark and gluino cascade decays. Indirect evidence for sleptons can also be obtained from inclusive dilepton studies. Isolation requirements and a jet veto would allow detection of both the direct chargino/neutralino production and the directly produced sleptons. Squark and gluino production may also represent a copious source of Higgs bosons through cascade decays. The lightest SUSY Higgs boson, via h → bb̄, may be reconstructed with a signal/background ratio of order 1 thanks to hard cuts on ETmiss justified by the escaping LSPs. The LSP of SUSY models with conserved R-parity represents a very good candidate for cosmological dark matter. The region of parameter space where this is true is well covered by our searches, at least for tanβ = 2. If supersymmetry exists at the electroweak scale, it could hardly escape detection in CMS, and the study of supersymmetry will form a central part of our physics program.
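The characteristic dilepton signature mentioned above appears as an edge in the l+l− invariant mass distribution. As a generic illustration (not the CMS analysis code; the four-vectors below are hypothetical), the invariant mass of a lepton pair can be computed from its summed four-momentum:

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of a particle pair from (E, px, py, pz) four-vectors in GeV."""
    E = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Two back-to-back 50 GeV leptons (massless approximation): m_ll = 100 GeV
m_ll = invariant_mass((50.0, 0.0, 0.0, 50.0), (50.0, 0.0, 0.0, -50.0))
```

In cascade decays such as chi2 → l+ l− chi1, the kinematic endpoint of this distribution carries sparticle mass information, which is why the edge is a useful discovery signature.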
Kristine H. Luce, J. Crowther
M. Khashei, M. Bijari
M. Morris, J. Teevan, Katrina Panovich
F. Yeh, V. Wedeen, W. Tseng
Lena Mamykina, Bella Manoim, Manas Mittal et al.
Timothy Erickson, Toni M. Whited
X. Yi, Qifan Yang, K. Yang et al.
Frequency combs are having a broad impact on science and technology because they provide a way to coherently link radio/microwave-rate electrical signals with optical-rate signals derived from lasers and atomic transitions. Integrating these systems on a photonic chip would revolutionize instrumentation, time keeping, spectroscopy, navigation, and potentially create new mass-market applications. A key element of such a system-on-a-chip will be a mode-locked comb that can be self-referenced. The recent demonstration of soliton mode locking in crystalline and silicon nitride microresonators has provided a way to both mode lock and generate femtosecond time-scale pulses. Here, soliton mode locking is demonstrated in high-Q silica resonators. The resonators produce low-phase-noise soliton pulse trains at readily detectable pulse rates—two essential properties for the operation of frequency combs. A method for the long-term stabilization of the solitons is also demonstrated, and is used to test the theoretical dependence of the comb power, efficiency, and soliton existence power on the pulse width. The influence of the Raman process on the soliton existence power and efficiency is also observed. The resonators are microfabricated on silicon chips and feature reproducible modal properties required for soliton formation. A low-noise and detectable pulse rate soliton frequency comb on a chip is a significant step towards a fully integrated frequency comb system.
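The comb's role as a coherent link between microwave-rate and optical-rate signals follows from the standard comb equation f_n = f_ceo + n·f_rep. A minimal sketch, with illustrative parameter values rather than those of the paper:

```python
def comb_line_hz(n, f_rep_hz, f_ceo_hz):
    """Frequency of the n-th comb tooth: f_n = f_ceo + n * f_rep."""
    return f_ceo_hz + n * f_rep_hz

# f-2f self-referencing: frequency-double tooth n and beat it against tooth 2n;
# the beat note isolates the carrier-envelope offset f_ceo.
f_rep, f_ceo = 22e9, 0.5e9  # hypothetical values for illustration
beat = 2 * comb_line_hz(1000, f_rep, f_ceo) - comb_line_hz(2000, f_rep, f_ceo)
```

The detectable pulse (repetition) rate highlighted in the abstract matters precisely because f_rep must be measurable with a photodetector for this locking scheme to work.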
Q. An, R. Asfandiyarov, P. Azzarello et al.
The DAMPE satellite has directly measured the cosmic ray proton spectrum from 40 GeV to 100 TeV and revealed a new feature at about 13.6 TeV. The precise measurement of the spectrum of protons, the most abundant component of the cosmic radiation, is necessary to understand the sources and acceleration of cosmic rays in the Milky Way. This work reports the measurement of the cosmic ray proton fluxes with kinetic energies from 40 GeV to 100 TeV, based on two and a half years of data recorded by the DArk Matter Particle Explorer (DAMPE). This is the first time an experiment has directly measured cosmic ray protons up to ~100 TeV with high statistics. The measured spectrum confirms the spectral hardening at ~300 GeV found by previous experiments and reveals a softening at ~13.6 TeV, with the spectral index changing from ~2.60 to ~2.85. Our result suggests the existence of a new spectral feature of cosmic rays at energies lower than the so-called knee and sheds new light on the origin of Galactic cosmic rays.
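A softening like the one described above is commonly modeled as a smoothly broken power law. The sketch below uses the indicative numbers quoted in the abstract (break near 13.6 TeV, index changing from 2.60 to 2.85) purely for illustration; it is not the DAMPE fit function and ignores the ~300 GeV hardening:

```python
import numpy as np

def broken_power_law(E, phi0=1.0, E_b=13.6e3, g1=2.60, g2=2.85, s=5.0):
    """Smoothly broken power law in kinetic energy E (GeV).

    Falls as E^-g1 below the break E_b and as E^-g2 above it;
    s controls the sharpness of the transition. All parameter
    values here are illustrative, not fitted.
    """
    return phi0 * E**(-g1) * (1.0 + (E / E_b)**s)**((g1 - g2) / s)

def log_slope(E, h=1e-4):
    """Local logarithmic slope d(ln flux)/d(ln E) by finite difference."""
    return (np.log(broken_power_law(E * (1 + h)))
            - np.log(broken_power_law(E))) / np.log(1 + h)
```

Well below the break the local slope is ≈ −2.60 and well above it ≈ −2.85, reproducing the index change reported in the abstract.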
A. Banerjee, A. Pradhan, Takol Tangphati et al.
Following the recent theory of f(Q) gravity, where Q is the non-metricity scalar, we continue to investigate the possible existence of wormhole geometries. Recently, the non-metricity scalar and the corresponding field equations have been studied for some spherically symmetric configurations in Mustafa (Phys Lett B 821:136612, 2021) and Lin and Zhai (Phys Rev D 103:124001, 2021); one can note that the field equations differ between these two studies. Following Lin and Zhai (2021), we systematically study the field equations for wormhole solutions and find a violation of the null energy condition in the throat neighborhood. More specifically, considering specific choices for the form of f(Q) and a constant redshift function with different shape functions, we present a class of solutions for static and spherically symmetric wormholes. Our survey indicates that wormhole solutions could not exist for the specific form f(Q) = Q + αQ². To summarize, exact wormhole models can be constructed with violation of the null energy condition throughout the spacetime while keeping ρ ≥ 0, and vice versa.
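For orientation, the static, spherically symmetric wormhole setup referred to above is conventionally written in Morris–Thorne form. This is a generic sketch of that standard framework, not the paper's specific f(Q) field equations:

```latex
\[
  ds^2 = -e^{2\Phi(r)}\,dt^2 + \frac{dr^2}{1 - b(r)/r} + r^2\,d\Omega^2 ,
\]
% Throat at r = r_0:   b(r_0) = r_0
% Flare-out condition: b'(r_0) < 1
% Radial null energy condition: \rho + p_r \ge 0,
% whose violation near the throat is the standard price of traversability.
```

The abstract's statement that the null energy condition is violated in the throat neighborhood is the f(Q)-gravity analogue of this familiar general-relativistic requirement.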
K. Churruca, Kristiana Ludlow, W. Wu et al.
Background Q-methodology is an approach to studying complex issues of human ‘subjectivity’. Although this approach was developed in the early twentieth century, the value of Q-methodology in healthcare was not recognised until relatively recently. The aim of this review was to scope the empirical healthcare literature to examine the extent to which Q-methodology has been utilised in healthcare over time, including how it has been used and for what purposes. Methods A search of three electronic databases (Scopus, EBSCO-CINAHL Complete, Medline) was conducted. No date restriction was applied. A title and abstract review, followed by a full-text review, was conducted by a team of five reviewers. Included articles were English-language, peer-reviewed journal articles that used Q-methodology (both Q-sorting and inverted factor analysis) in healthcare settings. The following data items were extracted into a purpose-designed Excel spreadsheet: study details (e.g., setting, country, year), reasons for using Q-methodology, healthcare topic area, participants (type and number), materials (e.g., ranking anchors and Q-set), methods (e.g., development of the Q-set, analysis), study results, and study implications. Data synthesis was descriptive in nature and involved frequency counting, open coding and organisation by data items. Results Of the 2,302 articles identified by the search, 289 studies were included in this review. We found evidence of increased use of Q-methodology in healthcare, particularly over the last 5 years. However, this research remains diffuse, spread across a large number of journals and topic areas. In a number of studies, we identified limitations in the reporting of methods, such as insufficient information on how authors derived their Q-set, what types of analyses they performed, and the amount of variance explained. Conclusions Although Q-methodology is increasingly being adopted in healthcare research, it still appears to be relatively novel. This review highlights commonalities in how the method has been used, areas of application, and the potential value of the approach. To facilitate reporting of Q-methodological studies, we present a checklist of details that should be included for publication.
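The "inverted factor analysis" at the core of Q-methodology factors a person-by-person correlation matrix rather than the usual variable-by-variable one. A minimal sketch of that first step, using hypothetical Q-sorts:

```python
import numpy as np

def person_correlations(q_sorts):
    """Person-by-person correlation matrix of Q-sorts.

    Rows are participants, columns are statements (ranked e.g. -2..+2).
    In inverted ('Q') factor analysis this matrix, not a
    variable-by-variable one, is what gets factor-analysed.
    """
    return np.corrcoef(np.asarray(q_sorts, dtype=float))

# Three hypothetical participants ranking five statements from -2 to +2;
# participants 1 and 2 sort similarly, participant 3 sorts in reverse.
C = person_correlations([[-2, -1,  0,  1,  2],
                         [-2,  0, -1,  1,  2],
                         [ 2,  1,  0, -1, -2]])
```

Factors extracted from C then group participants who share a viewpoint, which is the sense of "subjectivity" the abstract refers to.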
A. Sirunyan, A. Tumasyan, W. Adam et al.
M. Kemp, M. Franzi, A. Haase et al.
Very low frequency communication systems (3 kHz–30 kHz) enable applications not feasible at higher frequencies. However, the highest radiation efficiency antennas require size at the scale of the wavelength (here, >1 km), making portable transmitters extremely challenging. Facilitating transmitters at the 10 cm scale, we demonstrate an ultra-low loss lithium niobate piezoelectric electric dipole driven at acoustic resonance that radiates with greater than 300x higher efficiency compared to the previous state of the art at a comparable electrical size. A piezoelectric radiating element eliminates the need for large impedance matching networks as it self-resonates at the acoustic wavelength. Temporal modulation of this resonance demonstrates a device bandwidth greater than 83x beyond the conventional Bode-Fano limit, thus increasing the transmitter bitrate while still minimizing losses. These results will open new applications for portable, electrically small antennas. Designing high radiation efficiency antennas for portable transmitters in low frequency communication systems remains a challenge. Here, the authors report on using piezoelectricity to more efficiently radiate while achieving a bandwidth eighty-three times higher than the passive Bode-Fano limit.
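The scale mismatch motivating this work follows directly from λ = c/f: at VLF the free-space wavelength is kilometres to tens of kilometres, while the device is 10 cm. A trivial sketch:

```python
def wavelength_m(freq_hz, c_m_per_s=299_792_458.0):
    """Free-space wavelength; shows why efficient VLF radiators are km-scale."""
    return c_m_per_s / freq_hz

# At the top of the VLF band (30 kHz) the wavelength is already ~10 km,
# and at 3 kHz it approaches 100 km.
lam_30k = wavelength_m(30e3)
lam_3k = wavelength_m(3e3)
```

A 10 cm piezoelectric element is therefore some five orders of magnitude smaller than the wavelength it radiates, which is what makes the reported efficiency and bandwidth figures notable.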
Florian Kittelmann, Pavel Sulimov, Kurt Stockinger
Classical and learned query optimizers (LQOs) use cardinality estimations as one of the critical inputs for query planning. Thus, accurately predicting the cardinality of arbitrary queries plays a vital role in query optimization. A recent boom in novel deep learning methods stimulated not only the rise of LQOs but also contributed to the appearance of learned cardinality estimators (LCEs). However, the majority of them are based on classical neural networks, ignoring that multivariate correlations between attributes across different tables could be naturally represented via entanglements in quantum circuits. In this paper, we introduce QardEst (Quantum Cardinality Estimator), a novel quantum neural network approach to estimate the cardinality of join queries. Our experiments, conducted with a similar number of trainable parameters, suggest that quantum neural networks executed on a quantum simulator outperform classical neural networks in terms of mean squared error as well as the q-error.
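The q-error reported above is a standard cardinality-estimation metric: the multiplicative deviation between estimate and truth, symmetric under over- and under-estimation. A minimal sketch of the metric itself, not the QardEst implementation:

```python
def q_error(estimate, truth, eps=1e-9):
    """q-error of a cardinality estimate: max(est/true, true/est).

    1.0 means a perfect estimate; eps guards against zero cardinalities.
    """
    est = max(estimate, eps)
    tru = max(truth, eps)
    return max(est / tru, tru / est)
```

Unlike mean squared error, q-error penalises a 2x under-estimate and a 2x over-estimate equally, which matches how join-order planners actually suffer from bad cardinalities.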
S. Hüttel, Sebastian Hess
The scientific production system is crucial in how global challenges are addressed. However, scholars have recently begun to voice concerns about structural inefficiencies within the system, as highlighted, for example, by the replication crisis, the p-value debate and various forms of publication bias. Most suggested remedies tend to address only partial aspects of the system's inefficiencies, but there is currently no unifying agenda in favour of an overall transformation of the system. Based on a critical review of the current scientific system and an exploratory pilot study about the state of student training, we argue that a unifying agenda is urgently needed, particularly given the emergence of artificial intelligence (AI) as a tool in scientific writing and the research discovery process. Without appropriate responses from academia, this trend may even compound current issues around credibility due to limited replicability and ritual-based statistical practice, while amplifying all forms of existing biases. Naïve openness in the science system alone is unlikely to lead to major improvements. We contribute to the debate and call for a system reform by identifying key elements in the definition of transformation pathways towards open, democratic and conscious learning, teaching, reviewing and publishing supported by openly maintained AI tools. Roles and incentives within the review process will have to adapt and be strengthened in relation to those that apply to authors. Scientists will have to write less, learn differently and review more in the future, but need to be trained better in and for AI even today.
C. Carlet, Sihem Mesnager, Chunming Tang et al.
Page 11 of 76523