ABSTRACT High energy densities are vital to satisfying the increasing demand for battery storage systems in electric vehicles. One innovative next-generation battery type is the solid-state battery, which is characterized by its high expected energy density. The polymer-based solid-state battery is notable for its good machinability in production and therefore offers great potential for industrial-scale manufacturing. One component of the polymer-based solid-state battery is the composite cathode, which poses particular challenges in the individual production processes. The calendering process is essential, as it can increase the ionic conductivity by reducing the porosity of the composite cathode. For this reason, this work analyzes in depth the calendering process for polymer-based composite cathodes with different compositions of active material and solid electrolyte, providing a profound understanding of the cause-and-effect relationships.
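The porosity reduction that calendering achieves can be illustrated with a minimal sketch: coating porosity follows from the areal mass, coating thickness, and the true density of the composite. All numeric values below are assumed example figures, not data from the paper.

```python
# Hypothetical illustration: porosity of a composite cathode coating
# before and after calendering, computed from areal mass and thickness.
# All numbers are assumed example values, not taken from the study.

def porosity(areal_mass_mg_cm2: float, thickness_um: float,
             composite_density_g_cm3: float) -> float:
    """Porosity = 1 - (apparent coating density / true composite density)."""
    # apparent density in g/cm^3: (mg/cm^2 -> g/cm^2) / (um -> cm)
    apparent = (areal_mass_mg_cm2 * 1e-3) / (thickness_um * 1e-4)
    return 1.0 - apparent / composite_density_g_cm3

before = porosity(20.0, 80.0, 3.5)   # uncalendered coating
after = porosity(20.0, 60.0, 3.5)    # same areal mass, compacted thickness
print(f"porosity before: {before:.2f}, after: {after:.2f}")
```

Compacting the same coating mass into a thinner layer directly lowers the porosity, which is the mechanism the abstract links to higher ionic conductivity.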
Muhammad Umair Danish, Umair Rehman, Katarina Grolinger
This paper introduces Monotone Delta, an order-theoretic measure designed to enhance the reliability assessment of survey-based instruments in human-machine interactions. Traditional reliability measures, such as Cronbach's Alpha and McDonald's Omega, often yield misleading estimates due to their sensitivity to redundancy and multidimensional constructs and their assumptions of normality and uncorrelated errors. These limitations can compromise decision-making in human-centric evaluations, where survey instruments inform adaptive interfaces, cognitive workload assessments, and human-AI trust models. Monotone Delta addresses these issues by quantifying internal consistency through the minimization of ordinal contradictions and alignment with a unidimensional latent order using weighted tournaments. Unlike traditional approaches, it operates without parametric or model-based assumptions. We conducted theoretical analyses and experimental evaluations on four challenging scenarios (tau-equivalence, redundancy, multidimensionality, and non-normal distributions) and showed that Monotone Delta provides more stable reliability assessments than existing methods. Monotone Delta is a valuable alternative for evaluating questionnaire-based assessments in psychology, human factors, healthcare, and interactive system design, enabling organizations to optimize survey instruments, reduce costly redundancies, and enhance confidence in human-system interactions.
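The abstract does not give Monotone Delta's formula, but the core idea of counting ordinal contradictions against a common latent order can be sketched. The function below is an illustrative assumption, not the paper's definition: it counts respondent pairs that one item orders oppositely to another.

```python
# Hedged sketch (not the paper's exact measure): count the fraction of
# respondent pairs that two orderings rank in opposite directions,
# ignoring ties, as a simple notion of "ordinal contradiction".
from itertools import combinations

def discordance(item_a, item_b):
    """Fraction of respondent pairs ordered oppositely by the two items."""
    pairs = disc = 0
    for i, j in combinations(range(len(item_a)), 2):
        d = (item_a[i] - item_a[j]) * (item_b[i] - item_b[j])
        if d != 0:          # ignore tied pairs
            pairs += 1
            if d < 0:
                disc += 1
    return disc / pairs if pairs else 0.0

# Three respondents on a 5-point scale; item names are hypothetical.
aligned = ([1, 3, 5], [2, 3, 4])    # both items follow one latent order
reversed_ = ([1, 3, 5], [5, 3, 1])  # second item is reverse-coded
print(discordance(*aligned), discordance(*reversed_))
```

An internally consistent scale yields zero discordance between items, while a reverse-coded (contradictory) item is maximally discordant, which is the kind of signal a contradiction-minimizing reliability measure can exploit.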
Cristina Martinez Montes, Daniela Grassi, Nicole Novielli
et al.
The study of well-being, stress, and other human factors has traditionally relied on self-report instruments to assess key variables. However, concerns about potential biases in these instruments, even when thoroughly validated and standardised, have driven growing interest in combining them with more objective methods, such as physiological measures. We aimed to (i) compare psychometric stress measures with biometric indicators and (ii) identify stress-related patterns in biometric data during software engineering tasks. We conducted an experiment in which participants completed a pre-survey, programmed two tasks while wearing biometric sensors, answered brief post-surveys for each task, and finally went through a short exit interview. Our results were mixed: the psychometric instruments indicated no stress, participants in the interviews reported a mix of feeling no stress and experiencing time pressure, and the biometrics showed a significant difference only in electrodermal activity (EDA) phasic peaks. We conclude that our chosen way of inducing stress, imposing a stricter time limit, was insufficient. We offer methodological insights for future studies working with stress, biometrics, and psychometric instruments.
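Counting EDA phasic peaks, the one biometric that differed significantly, typically comes down to finding local maxima above an amplitude threshold. The sketch below is illustrative only; the threshold and the signal are hypothetical, not the study's data or analysis pipeline.

```python
# Illustrative only: a minimal local-maximum counter of the kind used to
# count EDA phasic (skin conductance response) peaks above a threshold.
# Threshold and signal values are hypothetical.

def count_phasic_peaks(signal, threshold=0.05):
    """Count strict local maxima whose amplitude exceeds `threshold`."""
    peaks = 0
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i - 1] < signal[i] > signal[i + 1]:
            peaks += 1
    return peaks

eda = [0.0, 0.02, 0.08, 0.03, 0.01, 0.06, 0.12, 0.04, 0.0]
print(count_phasic_peaks(eda))  # two peaks above 0.05: 0.08 and 0.12
```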
Rahmahtrisilvia Rahmahtrisilvia, Rudi Setiawan, Asep Ahmad Sopandi
et al.
Early Intervention Behavioral Therapy (EIBI) has been shown to aid children diagnosed with autism in adjusting behavior through Applied Behavior Analysis (ABA). While there are three levels of ABA, EIBI does not provide a concrete metric for what separates the individual levels. The current study focuses on differentiating the electrical patterns found in children's EEG and explores how EIBI can serve across the ABA spectrum. The electrodes F3, F4, C3, C4, P3, P4, O1, and O2 were used to capture the EEG signals, and the power spectral density was estimated using the Welch method. Statistical examination revealed differences in power across the frequency bands amongst the groups. The higher levels of Alpha power suggest better emotional management, while the chronic group showed more prominent Delta power, reflecting weakened control. Comparatively, the beginning level's Theta power was higher than that of the other groups, indicating changes during attention-requiring tasks. Because the activity concentrated in the lower frequency range, no noteworthy changes appeared in the Beta and Gamma bands. These findings highlight the role of EIBI in neuromodulation in the Alpha and Delta bands and its application in enhancing emotional and neurological stability. EEG is an effective measure as it quantifies EIBI outcomes. Further studies should examine the long-term effects and enhance curriculum concepts to increase the efficacy of the interventions.
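The Welch band-power pipeline the study describes can be sketched compactly: segment the signal, window each segment, average the periodograms, then integrate the density over a band. This pure-Python version (naive DFT, Hann window, 50% overlap) uses an assumed sampling rate and a synthetic 10 Hz test tone, not the study's recordings.

```python
# A minimal sketch of Welch's method (Hann window, 50% overlap) and band
# power, using a naive DFT. Sampling rate, segment length, and the test
# signal are illustrative assumptions, not the study's parameters.
import cmath, math

def welch_psd(x, fs, nperseg=64):
    step = nperseg // 2
    win = [0.5 - 0.5 * math.cos(2 * math.pi * n / (nperseg - 1))
           for n in range(nperseg)]
    norm = fs * sum(w * w for w in win)
    psd = [0.0] * (nperseg // 2 + 1)
    count = 0
    for start in range(0, len(x) - nperseg + 1, step):
        seg = [x[start + n] * win[n] for n in range(nperseg)]
        for k in range(nperseg // 2 + 1):   # naive DFT, fine for a sketch
            X = sum(seg[n] * cmath.exp(-2j * math.pi * k * n / nperseg)
                    for n in range(nperseg))
            scale = 1.0 if k in (0, nperseg // 2) else 2.0
            psd[k] += scale * abs(X) ** 2 / norm
        count += 1
    freqs = [k * fs / nperseg for k in range(nperseg // 2 + 1)]
    return freqs, [p / count for p in psd]

def band_power(freqs, psd, lo, hi):
    df = freqs[1] - freqs[0]
    return sum(p * df for f, p in zip(freqs, psd) if lo <= f < hi)

fs = 128
x = [math.sin(2 * math.pi * 10 * n / fs) for n in range(512)]  # 10 Hz tone
freqs, psd = welch_psd(x, fs)
alpha = band_power(freqs, psd, 8, 13)   # Alpha band captures the tone
delta = band_power(freqs, psd, 1, 4)    # Delta band stays near zero
print(alpha > 10 * delta)
```

In practice one would use an FFT-based library routine, but the band-power comparison across groups works exactly as above.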
Uma Narayanan, Pavan Prajith, Rijo Thomas Mathew
et al.
Researchers are concentrating on developing technologies to identify and caution drivers against distracted driving because it is a major cause of traffic accidents. According to the National Highway Traffic Safety Administration's report, distracted driving is to blame for roughly one in every five car accidents. Our goal is to create an accurate and dependable method for identifying distracted drivers and alerting them to their lack of focus. To do this, we take inspiration from the success of convolutional neural networks (CNNs) in computer vision. Our strategy entails implementing a CNN-based system that can recognize when a driver is distracted and pinpoint the precise cause of the distraction. Real-time detection, however, imposes three seemingly mutually exclusive requirements on an optimal network: a small number of parameters, high accuracy, and high speed.
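This is not the paper's network, but the basic building block such a CNN stacks many times over camera frames, a 2D convolution followed by a ReLU, can be shown in a few lines. The kernel here is an arbitrary vertical-edge filter chosen for illustration.

```python
# Minimal pure-Python 2D cross-correlation + ReLU: the elementary layer
# a CNN-based distraction detector composes. Kernel values are an
# arbitrary edge filter, not learned weights from the paper.

def conv2d(image, kernel):
    """'Valid' 2D cross-correlation of a 2-D list by a 2-D kernel, with ReLU."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(max(0.0, s))            # ReLU activation
        out.append(row)
    return out

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1]]
vertical_edge = [[-1, 1],
                 [-1, 1]]
print(conv2d(img, vertical_edge))  # responds only at the vertical edge
```

A real detector would learn the kernels from labeled driver images and stack many such layers, trading parameter count against accuracy and speed as the abstract notes.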
This paper proposes three novel test procedures that yield valid inference in an environment with many weak instrumental variables (MWIV). We observe that the t statistic of the jackknife instrumental variable estimator (JIVE) has an asymptotic distribution identical to that of the two-stage least squares (TSLS) t statistic in the just-identified environment. Consequently, test procedures that are valid for the TSLS t statistic are also valid for the JIVE t statistic. Two such procedures, VtF and conditional Wald, are adapted directly. By exploiting a feature of MWIV environments, a third, more powerful, one-sided VtF-based test procedure can be obtained.
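The JIVE point estimate itself is simple to compute: for each observation, fit the first stage leaving that observation out, then use the leave-one-out fitted value as the instrument. The sketch below covers the just-identified single-instrument case on simulated data; the data-generating parameters are illustrative assumptions.

```python
# Sketch of the jackknife IV estimator (JIVE) with one instrument, no
# intercept, on simulated endogenous data. True effect is 2.0; all
# data-generating parameters are illustrative assumptions.
import random

random.seed(0)
n, beta = 2000, 2.0
z = [random.gauss(0, 1) for _ in range(n)]           # instrument
e = [random.gauss(0, 1) for _ in range(n)]           # first-stage error
x = [0.8 * z[i] + e[i] for i in range(n)]            # endogenous regressor
y = [beta * x[i] + 0.8 * e[i] + random.gauss(0, 1)   # shared error -> OLS biased
     for i in range(n)]

szx = sum(z[i] * x[i] for i in range(n))
szz = sum(zi * zi for zi in z)

num = den = 0.0
for i in range(n):
    # leave-one-out first-stage coefficient, then fitted value for obs i
    pi_loo = (szx - z[i] * x[i]) / (szz - z[i] * z[i])
    xhat = pi_loo * z[i]
    num += xhat * y[i]
    den += xhat * x[i]

print(f"JIVE estimate: {num / den:.2f}  (true beta = 2.0)")
```

Removing observation i from its own first stage is what breaks the mechanical correlation between fitted value and error that biases TSLS when instruments are many and weak.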
Ibrahim Khan, Thai Van Nguyen, Chollakorn Nimpattanavong
et al.
This paper presents our work to enhance the background music (BGM) in DareFightingICE by adding an adaptive BGM. The adaptive BGM consists of five different instruments playing a classical music piece called "Air on G-String." The BGM adapts by changing the volume of the instruments. Each instrument is connected to a different element of the game. We then run experiments to evaluate the adaptive BGM by using a deep reinforcement learning AI that only uses audio as input (Blind DL AI). The results show that the performance of the Blind DL AI improves while playing with the adaptive BGM as compared to playing without the adaptive BGM.
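The adaptation mechanism, each instrument's volume tied to a game element, can be sketched with one hypothetical mapping. The function below is an illustrative assumption (linear volume in the player's missing-HP fraction), not the rule set used in DareFightingICE.

```python
# Illustrative only: one possible mapping from a game element (player HP)
# to an instrument's volume. Parameter names and the linear rule are
# assumptions, not the paper's actual adaptation logic.

def instrument_volume(hp: int, max_hp: int,
                      min_vol: float = 0.1, max_vol: float = 1.0) -> float:
    """Volume rises linearly as HP falls, clamped to [min_vol, max_vol]."""
    danger = 1.0 - max(0, min(hp, max_hp)) / max_hp
    return min_vol + (max_vol - min_vol) * danger

print(instrument_volume(400, 400))  # full HP -> quiet
print(instrument_volume(0, 400))    # no HP  -> loud
```

With one such mapping per instrument, an audio-only agent can in principle decode several game variables from the mix, which is consistent with the Blind DL AI improving under the adaptive BGM.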
We formulate a general program for [...] analyzing continuous, differential weak, simultaneous measurements of noncommuting observables, which focuses on describing the measuring instrument autonomously, without states. The Kraus operators of such measuring processes are time-ordered products of fundamental differential positive transformations, which generate nonunitary transformation groups that we call instrumental Lie groups. The temporal evolution of the instrument is equivalent to the diffusion of a Kraus-operator distribution function defined relative to the invariant measure of the instrumental Lie group [...]. This way of considering instrument evolution we call the Instrument Manifold Program. We relate the Instrument Manifold Program to state-based stochastic master equations. We then explain how the Instrument Manifold Program can be used to describe instrument evolution in terms of a universal cover[,] the universal instrumental Lie group, which is independent [...] of Hilbert space. The universal instrument is generically infinite dimensional, in which situation the instrument's evolution is chaotic. Special simultaneous measurements have a finite-dimensional universal instrument, in which case the instrument is considered to be principal and can be analyzed within the [...] universal instrumental Lie group. Principal instruments belong at the foundation of quantum mechanics. We consider the three most fundamental examples: measurement of a single observable, of position and momentum, and of the three components of angular momentum. These measurements limit to strong simultaneous measurements. For a single observable, this gives the standard decay of coherence between inequivalent irreps; for the latter two, it gives a collapse within each irrep onto the canonical or spherical phase space, locating phase space at the boundary of these instrumental Lie groups.
Abstract The supply chain is a dynamic and uncertain system consisting of material, information, and fund flows between different organizations, from the acquisition of raw materials to the delivery of finished products to end customers. Closed-loop supply chains (CLSCs) do not end with the delivery of finished products; the process continues until economic value is recovered from returned products or they are properly disposed of in landfills. Incorporating reverse flows increases uncertainty and complexity and complicates the management of supply chains that are already composed of different actors and have a dynamic structure. Since agent-based modeling and simulation handles the dynamic and complex nature of supply chains more efficiently than traditional analytical methods, this study uses an agent-based modeling methodology to model a generic closed-loop supply chain network design problem, with the aims of integrating customer behavior into the network, coping with the dynamism, and obtaining a more realistic structure by eliminating the assumptions required to solve the model analytically. The actors in the CLSC network are defined as agents with goals, properties, and behaviors. The proposed agent-based architecture handles dynamic customer arrivals, the changing purchasing preferences of customers for new and refurbished products, and the time, quantity, and quality uncertainties of returns. To observe the behavior of the supply chain under several conditions, various scenarios were developed with different parameter settings for supplier capacities, the rate of customers affected by advertising, the market incentive threshold values, and the environmental awareness of customers.
From the scenarios, it was concluded that the system must be supplied with the right amounts of new and refurbished products to increase the effectiveness of factors such as advertising, incentives, and environmental awareness in achieving the desired sales amounts and cost targets.
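A toy version of the agent idea, customers as agents whose purchasing preference between new and refurbished products depends on an environmental-awareness parameter, fits in a few lines. All class names, parameters, and the 0.3 awareness value are illustrative assumptions, not the paper's model.

```python
# Toy agent-based sketch in the spirit of the model: each customer agent
# chooses a new or refurbished product according to an environmental-
# awareness probability. All parameters are illustrative assumptions.
import random

class Customer:
    def __init__(self, awareness: float):
        self.awareness = awareness  # probability of preferring refurbished

    def choose(self) -> str:
        return "refurbished" if random.random() < self.awareness else "new"

random.seed(1)
market = [Customer(awareness=0.3) for _ in range(1000)]
sales = {"new": 0, "refurbished": 0}
for customer in market:
    sales[customer.choose()] += 1
print(sales)  # roughly a 70/30 split between new and refurbished demand
```

Scenario analysis then amounts to sweeping parameters such as awareness or supplier capacity and re-running the simulation, which is how the study's advertising, incentive, and awareness scenarios are compared.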
Cubical type theory provides a constructive justification of homotopy type theory. A crucial ingredient of cubical type theory is a path lifting operation, which is explained computationally by induction on the type and involves several non-canonical choices. We present in this article two canonicity results, both proved by a sconing argument: a homotopy canonicity result, stating that every natural number is path equal to a numeral even if we take away the equations defining the lifting operation on the type structure, and a canonicity result, which uses these equations in a crucial way. Both proofs are done internally in a presheaf model.
Sanjay Gosain, Jack Harvey, Valentin Martinez-Pillet
et al.
Designing compact instruments is key to scientific exploration by smaller spacecraft, such as CubeSats, and by deep space missions, which require instrument designs with minimal mass. Here we present a proof of concept for miniaturization of the Global Oscillation Network Group (GONG) instrument. The GONG instrument routinely obtains full-disk Doppler and magnetic field maps of the solar photosphere using the Ni 676 nm absorption line. A key concept for miniaturizing the GONG optical design is to replace the bulky Lyot filter with a narrow-band interference filter and to reduce the length of the feed telescope. We validate the concept via numerical modeling as well as proof-of-concept observations.
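The Doppler measurement principle behind such maps is the usual first-order relation between a line-of-sight velocity and the resulting wavelength shift of the spectral line. The numbers below are examples, not instrument specifications.

```python
# Illustration of the Doppler principle behind GONG-style velocity maps:
# a line-of-sight velocity v shifts a line at rest wavelength L by
# dL = L * v / c. Values are examples, not instrument specifications.

C = 299_792_458.0          # speed of light, m/s
LAMBDA0 = 676e-9           # rest wavelength of the Ni line, m

def doppler_velocity(delta_lambda_m: float) -> float:
    """Line-of-sight velocity (m/s) from the measured wavelength shift."""
    return C * delta_lambda_m / LAMBDA0

# a 1 km/s photospheric flow shifts the 676 nm line by a few picometres
shift = LAMBDA0 * 1000.0 / C
print(f"{shift * 1e12:.2f} pm -> {doppler_velocity(shift):.0f} m/s")
```

Resolving picometre-scale shifts is why a very narrow-band filter (traditionally a Lyot filter, here a compact interference filter) sits at the heart of the design.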
JavaScript is one of the main programming languages for developing rich, responsive, and interactive Web applications. In these kinds of applications, asynchronous operations that execute callbacks are crucial. However, the dependency among nested callbacks, known as callback hell, can make them difficult to understand and maintain and eventually mixes concerns. Unfortunately, current solutions for JavaScript do not fully address this issue. This paper presents Sync/cc, a JavaScript package that works on modern browsers. This package is a proof of concept that uses continuations and aspects to allow developers to write event handlers that need nested callbacks in a synchronous style, preventing callback hell. Unlike current solutions, Sync/cc is modular, succinct, and customizable because it does not require ad-hoc and scattered constructs, code refactoring, or ad-hoc implementations such as state machines. In practice, our proposal uses (a) continuations to suspend the current handler execution only until the asynchronous operation is resolved, and (b) aspects to apply continuations in a non-intrusive way. We tested Sync/cc with a management information system that administers courses at a university in Chile.
Abstract Spin squeezing is a key resource in quantum metrology, enabling improvements in measurement signal-to-noise ratio. Its generation is a challenging task because the experimental realization of the required squeezing interaction remains difficult. Here, we propose a generic scheme to synthesize spin squeezing in non-squeezing systems. By applying periodic rotation pulses, the original non-squeezing interaction can be transformed into a squeezing interaction with significantly enhanced interaction strength. The sign of the interaction coefficient is also flippable, facilitating time-reversal readout protocols for nonlinear interferometers. The generated spin squeezing is capable of reaching the Heisenberg limit, with measurement precision scaling as 1/N for N particles, and its robustness to noise in pulse areas and separations has been verified as well. This work offers a path to extending the scope of Heisenberg-limited quantum precision measurements to non-squeezing systems.
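The scaling claim can be made concrete with a quick numeric comparison of the two limits: phase precision at the standard quantum limit scales as 1/sqrt(N), while Heisenberg-limited squeezing scales as 1/N, a gain of sqrt(N).

```python
# Numeric illustration of the scaling in the abstract: standard quantum
# limit (SQL) precision 1/sqrt(N) versus Heisenberg limit (HL) 1/N.
import math

for N in (100, 10_000, 1_000_000):
    sql = 1 / math.sqrt(N)   # unsqueezed (coherent-state) precision
    hl = 1 / N               # Heisenberg-limited precision
    print(f"N={N:>9}: SQL={sql:.1e}  HL={hl:.1e}  gain={sql / hl:.0f}x")
```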
In this paper, through research and analysis of the communication network of a physical activity monitoring system, we combine wearable technology and identification technology to design a physical health monitoring bracelet that integrates multifaceted physical data collection with an effective identity matching function. Identity is matched through the chip, and the physical fitness data generated during exercise and centralized tests are collected by the sensors in real time. Finally, data transmission is realized through the Wi-Fi communication function to monitor physical exercise and improve physical fitness. To ensure the continuity and stability of information transmission, a joint transmission method combining direct and indirect transmission is essential. Moreover, considering the energy causality limitations of sensor nodes and relay nodes, a collaborative transmission model of the wireless body area network based on a wireless cognitive network is constructed, and a power allocation algorithm based on maximum ratio combining and the wireless cognitive network is proposed, offering a new direction for future research on wireless body area network resource allocation.
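The benefit of combining a direct link with a relayed link follows from a standard property of maximum ratio combining (MRC): the post-combining SNR is the sum of the branch SNRs. The numbers below are illustrative, not measurements from the paper.

```python
# Illustrative numbers only: under maximum ratio combining (MRC), the
# output SNR equals the sum of the branch SNRs, which is why a joint
# direct + relayed transmission improves link reliability.

def mrc_snr(branch_snrs_linear):
    """Post-combining SNR (linear scale) for maximum ratio combining."""
    return sum(branch_snrs_linear)

direct, relayed = 4.0, 2.0        # linear SNRs of the two paths
combined = mrc_snr([direct, relayed])
print(combined)  # 6.0: better than either branch alone
```

A power allocation algorithm then decides how much of each node's limited (energy-causal) budget goes to each branch to maximize this combined SNR.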
Aleksandr N. Grekov, Nikolay A. Grekov, Evgeniy Sychov
et al.
Based on an analysis of existing acoustic methods and instruments, a prototype of an automated instrument has been developed to perform joint in situ measurements of two parameters: sound speed and ultrasound attenuation. The device is based on existing sound velocity profilers. We proposed replacing the TDC-GP22 converters used in the ISZ-1 sound speed meter with the more advanced, modern TDC-GP30 converters, which significantly improve the accuracy of measuring the amplitude of the reflected acoustic signal. Programs for processing signals from the primary acoustic transducer have been developed. The model of the device passed preliminary tests.
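The two measurement principles such a profiler relies on are simple to state: sound speed from the two-way travel time over a fixed reflector base, and attenuation from the echo-amplitude ratio over the acoustic path. All numeric values here are illustrative, not the instrument's specifications.

```python
# Measurement principles behind a sound-speed/attenuation profiler, with
# example numbers only (base length, travel time, and amplitudes are
# illustrative assumptions, not instrument specifications).
import math

def sound_speed(base_m: float, two_way_time_s: float) -> float:
    """c = 2 * L / t for a pulse reflected over a base of length L."""
    return 2.0 * base_m / two_way_time_s

def attenuation_db_per_m(a_sent: float, a_received: float,
                         path_m: float) -> float:
    """Attenuation from the amplitude ratio over the acoustic path."""
    return 20.0 * math.log10(a_sent / a_received) / path_m

print(sound_speed(0.10, 133.4e-6))           # ~1499 m/s over a 10 cm base
print(attenuation_db_per_m(1.0, 0.5, 0.20))  # ~30 dB/m in this toy case
```

Accurate amplitude capture is exactly where the TDC-GP30 upgrade matters: the attenuation estimate depends directly on the measured echo amplitude, not just on the timing.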
Mendelian randomization (MR) has become a popular approach to studying causal effects by using genetic variants as instrumental variables. We propose a new MR method, GENIUS-MAWII, which simultaneously addresses the two salient phenomena that adversely affect MR analyses: many weak instruments and widespread horizontal pleiotropy. Similar to MR GENIUS (Tchetgen Tchetgen et al., 2021), we achieve identification of the treatment effect by leveraging heteroscedasticity of the exposure. We then derive the class of influence functions of the treatment effect, based on which we construct a continuous updating estimator and establish its consistency and asymptotic normality under a many weak invalid instruments asymptotic regime by developing novel semiparametric theory. We also provide a measure of weak identification, an overidentification test, and a graphical diagnostic tool. We demonstrate in simulations that GENIUS-MAWII has clear advantages over other methods in the presence of directional or correlated horizontal pleiotropy. We apply our method to study the effect of body mass index on systolic blood pressure using UK Biobank.
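For context (this is not GENIUS-MAWII), the classical single-instrument Wald (ratio) estimator that underlies MR can be sketched on simulated data: the causal effect is the gene-outcome association divided by the gene-exposure association. The variant, confounder, and all parameters below are illustrative assumptions.

```python
# Not GENIUS-MAWII: a sketch of the classical single-instrument Wald
# (ratio) estimator underlying MR, on simulated data with an assumed
# valid variant g. All data-generating parameters are illustrative.
import random

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / (len(a) - 1)

random.seed(7)
n, beta = 5000, 0.5                   # true causal effect of x on y
g = [random.choice([0, 1, 2]) for _ in range(n)]   # variant dosage
u = [random.gauss(0, 1) for _ in range(n)]         # unobserved confounder
x = [0.4 * g[i] + u[i] + random.gauss(0, 1) for i in range(n)]
y = [beta * x[i] + u[i] + random.gauss(0, 1) for i in range(n)]

wald = cov(g, y) / cov(g, x)
print(f"Wald estimate: {wald:.2f}  (true effect = 0.5)")
```

GENIUS-MAWII departs from this simple picture precisely where the ratio estimator breaks: when there are many such variants, each weak, and some affect y through paths other than x (horizontal pleiotropy).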
The removal of impulse noise is a crucial pre-processing step in image processing systems. In recent years, numerous noise-removal methods have been proposed to improve denoising performance and reconstruct noise-free images. However, removing high-density impulse noise remains a major challenge. In this paper, to address the image denoising problem associated with high-density noise, we propose a new denoising model, called LD-Net, which can be trained end-to-end and directly reconstructs noise-free images via a lightweight convolutional neural network. LD-Net operates in two stages: a feature augmentation stage and a feature refinement stage. During the feature augmentation stage, the spatial size and dimension of the input image are increased by employing deconvolutional layers for effective feature learning. During the feature refinement stage, the textural details of the image are enhanced for the reconstruction of the noise-free image using a proposed sequence of three convolutional layers. Quantitative and qualitative evaluations on the SN-LABELME dataset indicate that the proposed LD-Net removes high-density impulse noise more effectively and at higher speed than other state-of-the-art denoising methods.
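As context for the learned approach (this is not LD-Net), the problem setup can be shown in pure Python: salt-and-pepper (impulse) noise corrupts random pixels to 0 or 255, and the classical 3x3 median filter is the baseline that learned denoisers aim to beat at high noise densities. The image, density, and seed are illustrative assumptions.

```python
# Context only, not LD-Net: salt-and-pepper impulse noise plus the
# classical 3x3 median-filter baseline. Image size, noise density, and
# seed are illustrative assumptions.
import random

def add_impulse_noise(img, density, seed=0):
    rng = random.Random(seed)
    out = [row[:] for row in img]
    for i in range(len(img)):
        for j in range(len(img[0])):
            if rng.random() < density:
                out[i][j] = rng.choice([0, 255])   # pepper or salt
    return out

def median_filter3(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            window = sorted(img[i + di][j + dj]
                            for di in (-1, 0, 1) for dj in (-1, 0, 1))
            out[i][j] = window[4]                  # median of 9 values
    return out

def count_errors(img):
    """Pixels in the 6x6 interior that differ from the clean value 128."""
    return sum(1 for i in range(1, 7) for j in range(1, 7)
               if img[i][j] != 128)

clean = [[128] * 8 for _ in range(8)]
noisy = add_impulse_noise(clean, density=0.2)
restored = median_filter3(noisy)
print(count_errors(noisy), "->", count_errors(restored))
```

The median filter degrades once most pixels in a window are corrupted, which is exactly the high-density regime where LD-Net's learned feature augmentation and refinement stages are claimed to help.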