Cecilia Borca, Javier Jiménez Peña, David Marckx
et al.
A 2021 study by the ECFA Early-Career Researchers Panel revealed that 71% of 334 respondents used open-source software tools in their instrumentation work, yet 70% reported receiving no training for these tools. In response, the Software and Machine Learning for Instrumentation group was formed in the ECFA Early-Career Researchers Panel to assess the accessibility and quality of training programs in machine learning and software for early-career researchers in experimental and applied physics. This group launched a new survey, reaching 174 participants. This report summarises the survey results in detail, and is intended to serve as a guiding document to improve the training programs that are available to early-career researchers.
Robotic Process Automation (RPA) is a technology that replicates human interactions with user interfaces across various applications. However, testing RPA implementations remains challenging due to the dynamic nature of workflows. This paper presents a novel testing framework that first integrates symbolic execution and concolic testing strategies to enhance RPA workflow validation. Building on insights from these methods, we introduce a hybrid approach that optimizes test coverage and efficiency in specific cases. Our open-source implementation demonstrates that automated testing in the RPA domain significantly improves coverage, reduces manual effort, and enhances reliability. Furthermore, the proposed solution supports multiple RPA platforms and aligns with industry best practices for user interface automation testing. Experimental evaluation, conducted in collaboration with industry, validates the effectiveness of our approach.
Abstract Machine learning (ML) offers considerable promise for the design of new molecules and materials. In real-world applications, the design problem is often domain-specific, and suffers from insufficient data, particularly labeled data, for ML training. In this study, we report a data-efficient, deep-learning framework for molecular discovery that integrates a coarse-grained functional-group representation with a self-attention mechanism to capture intricate chemical interactions. Our approach exploits group-contribution concepts to create a graph-based intermediate representation of molecules, serving as a low-dimensional embedding that substantially reduces the data demands typically required for training. Using a self-attention mechanism to learn the subtle but highly relevant chemical context of functional groups, the method proposed here consistently outperforms existing approaches for predictions of multiple thermophysical properties. In a case study focused on adhesive polymer monomers, we train on a limited dataset comprising only 6,000 unlabeled and 600 labeled monomers. The resulting chemistry prediction model achieves over 92% accuracy in forecasting properties directly from SMILES strings, exceeding the performance of current state-of-the-art techniques. Furthermore, the latent molecular embedding is invertible, enabling the design pipeline to automatically generate new monomers from the learned chemical subspace. We illustrate this functionality by targeting several properties, including high and low glass transition temperatures (Tg), and demonstrate that our model can identify new candidates with values that surpass those in the training set. The ease with which the proposed framework navigates both chemical diversity and data scarcity offers a promising route to accelerate and broaden the search for functional materials.
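The two ingredients highlighted in this abstract, a coarse-grained functional-group representation and self-attention over the groups, can be illustrated with a small numpy sketch. The group vocabulary, the one-hot embeddings, and the random attention weights below are invented for illustration; the paper's actual graph-based representation and trained model are not reproduced here.

```python
import numpy as np

# Hypothetical functional-group vocabulary; a real pipeline would derive
# groups from SMILES via a group-contribution decomposition.
GROUPS = ["CH3", "CH2", "OH", "COO", "C6H5"]

def group_counts(fragments):
    """Coarse-grained molecule representation: counts over the group vocabulary."""
    v = np.zeros(len(GROUPS))
    for f in fragments:
        v[GROUPS.index(f)] += 1
    return v

def self_attention(X, d_k=4, seed=0):
    """Single-head self-attention over a sequence of group embeddings (toy weights)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    Wq, Wk, Wv = (rng.normal(size=(d, d_k)) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d_k)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)   # rows are attention distributions
    return w @ V                        # context-aware group embeddings

# Toy monomer CH3-CH2-OH: count vector plus attention over one-hot group embeddings.
counts = group_counts(["CH3", "CH2", "OH"])
seq = np.eye(len(GROUPS))[[0, 1, 2]]
ctx = self_attention(seq)
```

The attention output gives each group occurrence an embedding informed by its chemical context, which is the role the mechanism plays in the framework described above.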
Materials of engineering and construction. Mechanics of materials, Computer software
Abdulrahman M. Abdulghani, Azizol Abdullah, A. R. Rahiman
et al.
Modern Software-Defined Wide Area Networks (SD-WANs) require adaptive controller placement that jointly optimizes latency, load balancing, and fault tolerance. Traditional static approaches fail under dynamic network conditions with evolving traffic patterns and topology changes. This paper presents a novel hybrid framework integrating Gaussian Mixture Model (GMM) clustering with Multi-Agent Reinforcement Learning (MARL) for dynamic controller placement. The approach leverages probabilistic clustering for intelligent MARL initialization, reducing exploration requirements. Centralized Training with Decentralized Execution (CTDE) enables distributed optimization through cooperative agents. Experimental evaluation using real-world topologies demonstrates a noticeable reduction in latency, improved network balance, and significant computational efficiency gains versus existing methods. Dynamic adaptation experiments confirm superior scalability during network changes. The hybrid architecture achieves linear scalability through problem decomposition while maintaining real-time responsiveness, establishing practical viability.
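The clustering-for-initialization idea can be sketched with a minimal spherical-covariance GMM fitted by EM to toy switch coordinates; the cluster means would then serve as initial controller sites and the soft assignments would seed the MARL agents. This is a hand-rolled illustration, not the paper's framework, and the topology below is synthetic.

```python
import numpy as np

def spherical_gmm(X, k, iters=30, seed=0):
    """Minimal spherical-covariance GMM fitted by EM.
    Returns soft responsibilities and component means; a sketch, not production code."""
    rng = np.random.default_rng(seed)
    mu = X[rng.choice(len(X), k, replace=False)]
    var = np.full(k, X.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities under isotropic Gaussians (log-domain for stability)
        d2 = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
        logp = np.log(pi) - 0.5 * d2 / var - 0.5 * X.shape[1] * np.log(var)
        r = np.exp(logp - logp.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and per-component variances
        nk = r.sum(axis=0)
        mu = (r.T @ X) / nk[:, None]
        var = (r * d2).sum(axis=0) / (nk * X.shape[1]) + 1e-9
        pi = nk / len(X)
    return r, mu

# Toy node coordinates: three spatial clusters of switches.
rng = np.random.default_rng(1)
nodes = np.concatenate([rng.normal(c, 0.3, size=(30, 2)) for c in ((0, 0), (5, 0), (0, 5))])
resp, centers = spherical_gmm(nodes, k=3)
# centers ~ candidate controller sites; resp would seed the MARL agents' regions.
```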
ABSTRACT This paper proposes a controlled signal technique for visible light non‐orthogonal multiple access (VL‐NOMA) communication in an interference‐controlled environment with intelligent reflecting surfaces (IRS) for beyond 5G (B5G) and 6G communication networks. The light‐emitting diode (LED) is used for carrier signal generation to transmit signals to the two users (photodiodes, PDs) due to its advantages, such as its programmable nature and flexibility. The potential challenge is how the signals could be controlled with an IRS approach, which prompted this research. We have used IRS, a cutting‐edge enabling technology that modifies the signal's reflection by utilizing numerous inexpensive passive reflecting elements to improve the signal's performance. Furthermore, deep reinforcement learning (DRL) is deployed to control the reflected signals, simulate the environment, make decisions, and establish the LED‐IRS‐PD links, redirecting the signals. The entire system is successfully synchronized, and then the bit error rate (BER), line of sight (LOS), and non‐line of sight (NLOS) performances are investigated. Furthermore, we place a blocker at the center of the model as an NLOS obstruction to check how the transmitted signals will perform. We observed that the propagated LOS signal improved the BER, whereas the NLOS blocker reduced the signal's performance. Furthermore, we optimized the signals to investigate BER, LOS, and NLOS signal performance. We observed that LOS signals performed better than NLOS signals.
To ensure effective management of medical devices, the devices must be safe, and their management must be evidence-based. Thus, to help enhance the safety of medical devices, a new mechanism for their periodic compliance assessment has been developed. The mechanism involves the assessment of general safety, electrical safety, and performance parameters in line with international best practice. At the same time, effective management of medical devices requires data and information on the devices and their lifecycle events, which can be obtained through a medical device management information system. Establishing and implementing efficient management of medical devices involves strengthening management capacities so as to respond to the current requirements of the health system and to ensure the functionality, safety, and efficient use of the devices. Accordingly, implementing efficient management of medical devices is fundamental to providing high-quality, safe, and efficient medical devices, which contributes to increasing the quality of medical services.
Abstract With the rapid development of information technology, new educational models using virtual reality technology have received widespread attention from researchers. In the field of vocational education, vocational colleges and training institutions can effectively mobilize students' learning initiative and improve their learning efficiency by using virtual reality technology. This study details the development process and system evaluation of a bespoke virtual reality system that addresses the uncertainty regarding hazards, the high teaching expenses, and the spatial constraints inherent in the practical training of elevator maintenance. By establishing a highly reproducible virtual environment and designing abundant interaction methods, this system helps students master the structure of elevators, their operating principles, and the techniques for calibrating elevator governors. The system underwent testing by multiple users; user satisfaction was assessed through a questionnaire study, while the system's effectiveness was evaluated by applying an independent-samples t test to students' performance data. The results of the study indicate that the system gained widespread praise among users and notably enhanced the students' learning drive, practical abilities, and on‐site adaptability.
Educational robots offer a platform for training aspiring engineers and building trust in technology that is envisioned to shape how we work and live. In education, accessibility and modularity are key criteria when choosing such a technological platform. In order to foster continuous development of the robots as well as to improve student engagement in the design and fabrication process, safe production methods with low accessibility barriers should be chosen. In this paper, we present Robotont 3, an open-source mobile robot that leverages Fused Deposition Modeling (FDM) 3D-printing for manufacturing the chassis and a single dedicated system board that can be ordered from online printed circuit board (PCB) assembly services. To promote accessibility, the project follows open hardware practices, such as design transparency, permissive licensing, accessible manufacturing methods, and comprehensive documentation. Semantic Versioning was incorporated to improve maintainability during development. Compared to the earlier versions, Robotont 3 retains all the technical capabilities while featuring an improved hardware setup that enhances ease of fabrication and assembly as well as modularity. The improvements increase the accessibility, scalability and flexibility of the platform in an educational setting.
Mechanical engineering and machinery, Electronic computers. Computer science
Automated verification has become an essential part of the security evaluation of cryptographic protocols. In this context, privacy-type properties are often modelled by indistinguishability statements, expressed as behavioural equivalences in a process calculus. In this paper we contribute both to the theory and practice of this verification problem. We establish new complexity results for static equivalence, trace equivalence and labelled bisimilarity and provide a decision procedure for these equivalences in the case of a bounded number of protocol sessions. Our procedure is the first to decide trace equivalence and labelled bisimilarity exactly for a large variety of cryptographic primitives -- those that can be represented by a subterm convergent destructor rewrite system. We also implemented the procedure in a new tool, DeepSec. We showed through extensive experiments that it is significantly more efficient than other similar tools, while at the same time broadening the scope of the protocols that can be analysed.
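A subterm convergent destructor rewrite system can be illustrated with a toy symmetric-encryption theory: the destructor rule dec(enc(x, k), k) -> x rewrites any term to a unique normal form, and equivalence checks then compare normal forms. The sketch below is a minimal Python illustration of such rewriting, not DeepSec's decision procedure.

```python
# Terms are nested tuples such as ("enc", "m", "k"); atoms are strings.

def normalize(t):
    """Rewrite to normal form under the destructor rule dec(enc(x, k), k) -> x.
    Children are normalized first, so one bottom-up pass reaches a fixpoint."""
    if not isinstance(t, tuple):
        return t
    t = (t[0],) + tuple(normalize(a) for a in t[1:])
    if t[0] == "dec" and isinstance(t[1], tuple) and t[1][0] == "enc" and t[1][2] == t[2]:
        return t[1][1]
    return t

# Decryption with the right key reduces to the plaintext; a wrong key gets stuck.
ok = normalize(("dec", ("enc", "m", "k"), "k"))
stuck = normalize(("dec", ("enc", "m", "k"), "k2"))
```

The "subterm" property shows up in the rule's right-hand side being a subterm of its left-hand side, which is what keeps normal forms small and the equivalence problem decidable.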
A Ruiz-Gonzalez, JR Heredia-Larrubia, FM Perez-Hidalgo
et al.
The objective of this research is to mitigate vibrations in induction motors. To achieve this goal, a discontinuous pulse width modulation (PWM) control strategy based on carrier wave modulation is proposed for multilevel inverters. This study demonstrates a reduction in machine vibrations compared to existing control techniques documented in the technical literature. Additionally, the proposed technique offers the advantage of attenuating the Total Harmonic Distortion of the multilevel inverter's output voltage while simultaneously achieving a higher RMS value for the same DC level. By modifying a parameter of the carrier wave, the control strategy allows for variations in the electrical spectrum while avoiding natural mechanical resonance frequencies, thereby reducing motor vibrations. Laboratory results are provided demonstrating the application of different modulation strategies in a multilevel inverter driving an induction motor, together with a comparison against the presented strategy.
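The THD quantity discussed above can be illustrated numerically: the sketch below synthesizes a three-level phase-disposition PWM waveform by comparing a sinusoidal reference against two stacked triangular carriers, then estimates the fundamental amplitude and THD from an FFT over one fundamental period. The modulation index, carrier frequency, and level count are illustrative choices, not the paper's discontinuous strategy.

```python
import numpy as np

fs, f0, fc, m = 100_000, 50, 2_000, 0.8  # sample rate, fundamental, carrier, mod. index
t = np.arange(0, 1 / f0, 1 / fs)         # exactly one fundamental period
ref = m * np.sin(2 * np.pi * f0 * t)

# Two stacked triangular carriers (phase disposition): upper in [0,1], lower in [-1,0].
tri = 1 - 4 * np.abs((t * fc) % 1 - 0.5)  # triangle wave in [-1, 1]
c_up = (tri + 1) / 2
c_dn = c_up - 1

# Three-level output: +1 above the upper carrier, -1 below the lower one, else 0.
out = (ref > c_up).astype(float) - (ref < c_dn).astype(float)

# Window = one period, so FFT bin 1 is the fundamental (50 Hz).
spec = np.abs(np.fft.rfft(out)) * 2 / len(t)
fund = spec[1]
thd = np.sqrt((spec[2:] ** 2).sum()) / fund
```

Changing the carrier (shape, frequency, or the modulation parameter the abstract mentions) moves the harmonic energy around the spectrum, which is the lever used to steer it away from mechanical resonances.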
Nicolas Apfel, Julia Hatamyar, Martin Huber
et al.
This study introduces a data-driven, machine learning-based method to detect suitable control variables and instruments for assessing the causal effect of a treatment on an outcome in observational data, if they exist. Our approach tests the joint existence of instruments, which are associated with the treatment but not directly with the outcome (at least conditional on observables), and suitable control variables, conditional on which the treatment is exogenous, and learns the partition of instruments and control variables from the observed data. The detection of sets of instruments and control variables relies on the condition that proper instruments are conditionally independent of the outcome given the treatment and suitable control variables. We establish the consistency of our method for detecting control variables and instruments under certain regularity conditions, investigate the finite sample performance through a simulation study, and provide an empirical application to labor market data from the Job Corps study.
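The core identifying condition stated above, that a valid instrument is independent of the outcome given the treatment and suitable controls, can be illustrated with a linear partial-correlation check on synthetic data. This is only a toy analogue of the paper's machine-learning-based test: Z is constructed as an instrument and W as a confounder, so the partial correlation of Y with Z given (D, W) is near zero while that of Y with W given (D, Z) is not.

```python
import numpy as np

def partial_corr(y, x, Z):
    """Correlation of y and x after linearly partialling out Z (OLS residuals)."""
    Z1 = np.column_stack([np.ones(len(y)), Z])
    ry = y - Z1 @ np.linalg.lstsq(Z1, y, rcond=None)[0]
    rx = x - Z1 @ np.linalg.lstsq(Z1, x, rcond=None)[0]
    return np.corrcoef(ry, rx)[0, 1]

# Synthetic data: Z affects Y only through the treatment D; W confounds D and Y.
rng = np.random.default_rng(0)
n = 5_000
Z = rng.normal(size=n)                  # candidate instrument
W = rng.normal(size=n)                  # candidate control (confounder)
D = Z + W + rng.normal(size=n)          # treatment
Y = 2 * D + W + rng.normal(size=n)      # outcome
pc_Z = partial_corr(Y, Z, np.column_stack([D, W]))  # ~0: behaves like an instrument
pc_W = partial_corr(Y, W, np.column_stack([D, Z]))  # clearly nonzero: a control
```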
Missagh Mehdipour, Laura W. Brenneman, Jon M. Miller
et al.
Black hole accretion in active galactic nuclei (AGN) is coupled to the evolution of their host galaxies. Outflowing winds in AGN can play an important role in this evolution through the resulting feedback mechanism. Multi-wavelength spectroscopy is key for probing the intertwined physics of inflows and outflows in AGN. However, with the current spectrometers, crucial properties of the ionized outflows are poorly understood, such as their coupling to the accretion rate, their launching mechanism, and their kinetic power. In this paper we discuss the need for simultaneous X-ray and UV high-resolution spectroscopy for tackling outstanding questions on these outflows in AGN. The instrumental requirements for achieving the scientific objectives are addressed. We demonstrate that these requirements would be facilitated by the proposed Arcus Probe mission concept. The multi-wavelength spectroscopy and timing by Arcus would enable us to establish the kinematics and ionization structure of the entire ionized outflow, extending from the vicinity of the accretion disk to the outskirts of the host galaxy. Arcus would provide key diagnostics on the origin, driving mechanism, and the energetics of the outflows, which are useful benchmarks for testing various theoretical models of outflows and understanding their impact in AGN.
Segmentation of surgical instruments is crucial for enhancing surgeon performance and ensuring patient safety. Conventional techniques such as binary, semantic, and instance segmentation share a common drawback: they do not accommodate the parts of instruments obscured by tissues or other instruments. Precisely predicting the full extent of these occluded instruments can significantly improve laparoscopic surgeries by providing critical guidance during operations and assisting in the analysis of potential surgical errors, as well as serving educational purposes. In this paper, we introduce Amodal Segmentation to the realm of surgical instruments in the medical field. This technique identifies both the visible and occluded parts of an object. To achieve this, we present a new Amodal Instruments Segmentation (AIS) dataset, which was developed by reannotating each instrument with its complete mask, utilizing the 2017 MICCAI EndoVis Robotic Instrument Segmentation Challenge dataset. Additionally, we evaluate several leading amodal segmentation methods to establish a benchmark for this new dataset.
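Amodal annotation can be made concrete with toy masks: the amodal mask covers the instrument's full extent, the visible mask excludes the occluded part, and their difference is the occluded region that conventional segmentation ignores. The grid and masks below are invented for illustration and are unrelated to the AIS dataset's actual annotations.

```python
import numpy as np

# Toy 6x6 masks: the amodal mask is the instrument's full extent; the visible
# mask excludes the part occluded by tissue (here, the right half).
amodal = np.zeros((6, 6), dtype=bool)
amodal[1:5, 1:5] = True
visible = amodal.copy()
visible[1:5, 3:5] = False
occluded = amodal & ~visible            # the region conventional methods miss

def iou(a, b):
    """Intersection over union of two boolean masks."""
    return (a & b).sum() / (a | b).sum()

# Predicting only the visible part scores 0.5 against the amodal ground truth.
score_visible_only = iou(visible, amodal)
```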
Sandra Benítez-Peña, Rafael Blanquero, Emilio Carrizosa
et al.
Feature Selection is a crucial procedure in Data Science tasks such as Classification, since it identifies the relevant variables, thus making the classification procedures more interpretable, cheaper in terms of measurement and more effective by reducing noise and data overfit. The relevance of features in a classification procedure is linked to the fact that misclassification costs are frequently asymmetric, since false positive and false negative cases may have very different consequences. However, off-the-shelf Feature Selection procedures seldom take such cost-sensitivity of errors into account. In this paper we propose a mathematical-optimization-based Feature Selection procedure embedded in one of the most popular classification procedures, namely, Support Vector Machines, accommodating asymmetric misclassification costs. The key idea is to replace the traditional margin maximization with the minimization of the number of features selected, while imposing upper bounds on the false positive and false negative rates. The problem is written as an integer linear problem plus a quadratic convex problem for Support Vector Machines with both linear and radial kernels. The reported numerical experience demonstrates the usefulness of the proposed Feature Selection procedure. Indeed, our results on benchmark data sets show that a substantial decrease in the number of features is obtained, whilst the desired trade-off between false positive and false negative rates is achieved.
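The flavor of minimizing the number of selected features subject to bounds on the false positive and false negative rates can be conveyed with a greedy stand-in: forward selection with a least-squares linear classifier, stopping as soon as both rate bounds hold on training data. The paper's actual formulation is an integer linear plus convex quadratic SVM problem; the sketch below is a much cruder heuristic on synthetic data.

```python
import numpy as np

def rates(y, yhat):
    """Empirical false positive and false negative rates for labels in {-1, +1}."""
    fpr = np.mean(yhat[y == -1] == 1)
    fnr = np.mean(yhat[y == 1] == -1)
    return fpr, fnr

def greedy_select(X, y, max_fpr, max_fnr):
    """Grow the feature set greedily until both error-rate bounds hold.
    A least-squares linear classifier stands in for the SVM."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining:
        best = None
        for j in remaining:
            A = np.column_stack([np.ones(len(y)), X[:, selected + [j]]])
            w = np.linalg.lstsq(A, y, rcond=None)[0]
            fpr, fnr = rates(y, np.sign(A @ w))
            if best is None or fpr + fnr < best[1]:
                best = (j, fpr + fnr, fpr, fnr)
        selected.append(best[0])
        remaining.remove(best[0])
        if best[2] <= max_fpr and best[3] <= max_fnr:
            break                      # bounds satisfied with the fewest features found
    return selected

# Synthetic data where only features 0 and 1 carry signal.
rng = np.random.default_rng(0)
y = np.where(rng.random(400) < 0.5, 1, -1)
X = rng.normal(size=(400, 6))
X[:, 0] += 1.5 * y
X[:, 1] += 1.0 * y
chosen = greedy_select(X, y, max_fpr=0.15, max_fnr=0.15)
```

Asymmetric costs are encoded by setting max_fpr and max_fnr to different values, which is the trade-off the abstract emphasizes.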
Lev Raskin , Larysa Sukhomlyn, Dmytro Sokolov
et al.
The object of research is the technical state of deteriorating systems whose operating conditions depend on a large number of interacting factors. The resulting inhomogeneity of the sample of initial data on the technical state makes it impossible to correctly use traditional methods of assessing the state of a system (that is, methods based on the mathematical tools of regression analysis). The subject of research is the development of a method for constructing a regression polynomial from the results of processing a set of controlled system parameters. The non-linearity of the polynomial describing the evolution of the technical state of real systems increases the number of regression polynomial coefficients to be estimated. The problem is further complicated by the growing number of factors affecting the technical state of the system. In these circumstances, the so-called "small sample effect" occurs. The goal of the research is to develop a method for constructing an approximation polynomial that describes the evolution of the system state in a situation where the volume of the initial data sample is insufficient for correctly estimating the coefficients of this polynomial. The results obtained. The paper proposes a method for solving the given problem based on a two-stage procedure. At the first stage, a functional description of the approximation polynomial coefficients is performed, which radically reduces the number of regression polynomial parameters to be estimated. This polynomial is used for a preliminary estimation of its coefficients with the aim of filtering out insignificant factors and their interactions. At the second stage, the parameters of the truncated polynomial are estimated using standard tools of mathematical statistics. Two approaches to constructing a modified polynomial have been studied: an additive one and a multiplicative one.
It has been shown that the additive approach is, on average, an order of magnitude more effective than the multiplicative one.
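The two-stage idea, a preliminary fit of the full polynomial to screen out insignificant terms followed by re-estimation of the truncated polynomial, can be sketched with ordinary least squares on synthetic data. The second-order basis, the coefficient threshold, and the data-generating model below are illustrative choices, not the paper's functional description of coefficients.

```python
import numpy as np

def design(X):
    """Full second-order polynomial basis in two factors."""
    x1, x2 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 2))
y = 1 + 2 * X[:, 0] + 0.5 * X[:, 0] * X[:, 1] + rng.normal(0, 0.05, 40)

# Stage 1: preliminary fit of the full polynomial to screen factors.
A = design(X)
w1, *_ = np.linalg.lstsq(A, y, rcond=None)
keep = np.abs(w1) > 0.2                 # crude significance filter

# Stage 2: re-estimate only the retained terms of the truncated polynomial.
w2, *_ = np.linalg.lstsq(A[:, keep], y, rcond=None)
```

The payoff is the same as in the abstract: the second-stage problem has far fewer parameters, so a small sample that could not support the full polynomial can still estimate the truncated one.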
The Partitioned Global Address Space (PGAS) library DASH provides C++ container classes for distributed N-dimensional structured grids. This article presents enhancements on top of the DASH library to support stencil operations and halo areas to conveniently and efficiently parallelize structured grids. The improvements include definitions of multiple stencil operators, automatic derivation of halo sizes, efficient halo data exchanges, as well as communication hiding optimizations. The main contributions of this article are two-fold. First, the halo abstraction concept and the halo wrapper software components are explained. Second, the code complexity and the runtime of an example code implemented in DASH and pure Message Passing Interface (MPI) are compared.
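The halo concept can be sketched without MPI: a local grid block is surrounded by a ring of halo cells holding copies of neighboring data, so a stencil operator can be applied uniformly over the whole block. In the numpy toy below the halo is filled with zeros in place of a real neighbor exchange, which is where DASH's halo wrapper (or raw MPI calls) would do the communication.

```python
import numpy as np

def laplacian_with_halo(grid, halo=1):
    """Apply a 5-point Laplacian to a local block after filling its halo area.
    In DASH or raw MPI the halo cells would be exchanged with neighboring
    ranks; here they are zero-filled to sketch the data layout only."""
    p = np.pad(grid, halo)              # local block surrounded by halo cells
    return (p[:-2, 1:-1] + p[2:, 1:-1] +
            p[1:-1, :-2] + p[1:-1, 2:] - 4 * grid)

block = np.ones((4, 4))
lap = laplacian_with_halo(block)        # zero in the interior, nonzero at edges
```

The automatic derivation of halo sizes mentioned above corresponds to inferring `halo` from the stencil operator's maximal offset instead of hard-coding it.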
Navaneethakrishna Makaram, Sarvagya Gupta, Matthew Pesce
et al.
In drug-resistant epilepsy, a visual inspection of intracranial electroencephalography (iEEG) signals is often needed to localize the epileptogenic zone (EZ) and guide neurosurgery. The visual assessment of iEEG time-frequency (TF) images is an alternative to signal inspection, but subtle variations may escape the human eye. Here, we propose a deep learning-based metric of visual complexity to interpret TF images extracted from iEEG data and aim to assess its ability to identify the EZ in the brain. We analyzed interictal iEEG data from 1928 contacts recorded from 20 children with drug-resistant epilepsy who became seizure-free after neurosurgery. We localized each iEEG contact in the MRI, created TF images (1–70 Hz) for each contact, and used a pre-trained VGG16 network to measure their visual complexity by extracting unsupervised activation energy (UAE) from 13 convolutional layers. We identified points of interest in the brain using the UAE values via patient- and layer-specific thresholds (based on extreme value distribution) and using a support vector machine classifier. Results show that contacts inside the seizure onset zone exhibit lower UAE than outside, with larger differences in deep layers (L10, L12, and L13: p < 0.001). Furthermore, the points of interest identified using the support vector machine localized the EZ with 7 mm accuracy. In conclusion, we presented a pre-surgical computerized tool that facilitates the EZ localization in the patient’s MRI without requiring long-term iEEG inspection.
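The notion of activation energy from convolutional layers can be sketched in plain numpy: convolve an image with a filter bank, apply ReLU, and average the absolute activations. The random 3x3 filters and the toy time-frequency image below stand in for the pre-trained VGG16 layers used in the study, so only the shape of the computation is illustrated, not the reported results.

```python
import numpy as np

def conv2d(img, kern):
    """Tiny valid-mode 2-D convolution (reference implementation)."""
    h, w = kern.shape
    out = np.zeros((img.shape[0] - h + 1, img.shape[1] - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + h, j:j + w] * kern).sum()
    return out

def activation_energy(img, kernels):
    """Mean activation after ReLU, averaged over the filter bank (one 'layer')."""
    return np.mean([np.maximum(conv2d(img, k), 0).mean() for k in kernels])

# Random filters stand in for one pre-trained VGG16 layer; the "TF image" is a toy.
rng = np.random.default_rng(0)
kernels = [rng.normal(size=(3, 3)) for _ in range(4)]
tf_img = np.outer(np.sin(np.linspace(0, 3, 32)), np.ones(32))
uae = activation_energy(tf_img, kernels)
```

In the study this scalar is computed per contact and per layer, and contacts whose values fall below patient- and layer-specific thresholds become candidate points of interest.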
We introduce the concepts of dual instruments and sub-observables. We show that although a dual instrument measures a unique observable, it determines many sub-observables. We define a unique minimal extension of a sub-observable to an observable and consider sequential products and conditioning of sub-observables. Sub-observable effect algebras are characterized and studied. Moreover, the convexity of these effect algebras is considered. The sequential product of instruments is discussed. These concepts are illustrated with many examples of instruments. In particular, we discuss Lüders, Holevo, and constant-state instruments. Various conjectures for future research are presented.
Kenta Takatsu, Alexander W. Levis, Edward Kennedy
et al.
Comparative effectiveness research frequently employs the instrumental variable design since randomized trials can be infeasible for many reasons. In this study, we investigate and compare treatments for emergency cholecystitis -- inflammation of the gallbladder. A standard treatment for cholecystitis is surgical removal of the gallbladder, while alternative non-surgical treatments include managed care and pharmaceutical options. As randomized trials are judged to violate the principle of equipoise, we consider an instrument for operative care: the surgeon's tendency to operate. Standard instrumental variable estimation methods, however, often rely on parametric models that are prone to bias from model misspecification. We outline instrumental variable estimation methods based on the doubly robust machine learning framework. These methods enable us to employ various machine learning techniques for nuisance parameter estimation and deliver consistent estimates and fast rates of convergence for valid inference. We use these methods to estimate the primary target causal estimand in an IV design. Additionally, we expand these methods to develop estimators for heterogeneous causal effects, profiling principal strata, and a sensitivity analysis for a key instrumental variable assumption. We conduct a simulation study to demonstrate scenarios where more flexible estimation methods outperform standard methods. Our findings indicate that operative care is generally more effective for cholecystitis patients, although the benefits of surgery can be less pronounced for key patient subgroups.
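The doubly robust IV idea can be sketched for the simplest case of a binary instrument: cross-fit nuisance regressions on held-out folds, form AIPW-style residualized contrasts for the outcome and the treatment, and take their ratio to estimate the LATE. The linear nuisance models and the synthetic complier data below are illustrative simplifications of the doubly robust machine learning framework the abstract describes.

```python
import numpy as np

def dr_late(Y, D, Z, X, folds=2, seed=0):
    """Cross-fitted doubly robust LATE for a binary instrument Z.
    Nuisances are plain linear regressions; a sketch, not the paper's estimator,
    and it assumes the instrument is unconditionally randomized."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(Y)) % folds
    num, den = np.zeros(len(Y)), np.zeros(len(Y))
    for f in range(folds):
        tr, te = idx != f, idx == f

        def fit(target, mask):          # OLS of target on (1, X) over mask
            A = np.column_stack([np.ones(mask.sum()), X[mask]])
            return np.linalg.lstsq(A, target[mask], rcond=None)[0]

        pi = Z[tr].mean()               # instrument propensity
        A_te = np.column_stack([np.ones(te.sum()), X[te]])
        mu = {z: A_te @ fit(Y, tr & (Z == z)) for z in (0, 1)}
        m = {z: A_te @ fit(D, tr & (Z == z)) for z in (0, 1)}
        Zt, Yt, Dt = Z[te], Y[te], D[te]
        num[te] = mu[1] - mu[0] + Zt / pi * (Yt - mu[1]) - (1 - Zt) / (1 - pi) * (Yt - mu[0])
        den[te] = m[1] - m[0] + Zt / pi * (Dt - m[1]) - (1 - Zt) / (1 - pi) * (Dt - m[0])
    return num.mean() / den.mean()      # ratio of robust ITT effects

# Synthetic data with a homogeneous treatment effect of 2, so the LATE is 2.
rng = np.random.default_rng(1)
n = 4_000
X = rng.normal(size=n)
Z = rng.integers(0, 2, n).astype(float)
U = rng.normal(size=n)                  # unobserved confounder of D and Y
D = ((Z + 0.5 * X + U) > 0.5).astype(float)
Y = 2 * D + X + U + rng.normal(size=n)
late = dr_late(Y, D, Z, X)
```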
When multi-dimensional instruments are used to identify and estimate causal effects, the monotonicity condition may not hold due to heterogeneity in the population. Under a partial monotonicity condition, which only requires the monotonicity to hold for each instrument separately holding all the other instruments fixed, the 2SLS estimand can still be a positively weighted average of LATEs. In this paper, we provide a simple nonparametric test for partial instrument monotonicity. We demonstrate the good finite sample properties of the test through Monte Carlo simulations. We then apply the test to monetary incentives and distance from results centers as instruments for the knowledge of HIV status.
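The object being tested can be sketched as a sample analogue: with two binary instruments, partial monotonicity requires each instrument's effect on treatment take-up to have a single sign while the other instrument is held fixed. The check below just compares cell proportions on synthetic data and omits the inference step a real nonparametric test would need; the take-up model is invented for illustration.

```python
import numpy as np

def partial_monotonicity_check(D, Z1, Z2):
    """Sign of each instrument's effect on take-up, other instrument fixed.
    Sample analogue only; a proper test accounts for sampling error."""
    eff = {}
    for z2 in (0, 1):
        m = Z2 == z2
        eff[("Z1", z2)] = D[m & (Z1 == 1)].mean() - D[m & (Z1 == 0)].mean()
    for z1 in (0, 1):
        m = Z1 == z1
        eff[("Z2", z1)] = D[m & (Z2 == 1)].mean() - D[m & (Z2 == 0)].mean()
    # Partial monotonicity (up to noise): all effects of an instrument share one sign.
    return all(v >= 0 for v in eff.values()), eff

# Synthetic take-up where both instruments push treatment in the same direction.
rng = np.random.default_rng(0)
n = 4_000
Z1, Z2 = rng.integers(0, 2, size=(2, n))
D = ((0.8 * Z1 + 0.5 * Z2 + rng.normal(size=n)) > 0.6).astype(float)
ok, eff = partial_monotonicity_check(D, Z1, Z2)
```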