Results for "Instruments and machines"

Showing 20 of ~592,200 results · from arXiv, DOAJ, CrossRef

arXiv Open Access 2026
FlueBricks: A Construction Kit of Flute-like Instruments for Acoustic Reasoning

Bo-Yu Chen, Chiao-Wei Huang, Lung-Pan Cheng

We present FlueBricks, a construction kit for acoustic reasoning via building and customizing flute-like instruments. By assembling generator, resonator, and connector modules that embody various aeroacoustic properties, users gain deeper understanding of how blowhole, tube length, and tone-hole placement alter onset, pitch, and timbre through hands-on experimentation. This forms a designer-player loop of configuring and playing to form, test, and refine acoustic behaviors (acoustic reasoning), shifting acoustic instruments from static artifacts to dynamic systems. To understand how users engage with this system, we conducted an exploratory study with 12 participants ranging from novices to professional musicians. During their explorations, we observed participants fluently switching between designer and player roles, scaffolding designs from familiar instruments, forming and refining their acoustic understanding of length, tone holes, and generator geometry, reinterpreting modules beyond their intended functions, and using their creations for performative acts such as pedagogical showing and musical expression. These collectively demonstrated FlueBricks's potential as a pedagogical tool for embodied acoustic reasoning.

en cs.HC, cs.SD
arXiv Open Access 2026
Empirical Challenges with Peers-of-Peers Instruments in the Linear-In-Means Model

Nathan Canen, Shantanu Chadha

In the linear-in-means model, endogeneity arises naturally due to the reflection problem. A common solution is to use Instrumental Variables (IVs) based on higher-order network links, such as using friends-of-friends' characteristics. We first show that such instruments are unlikely to work well in many applied settings: in very sparse or very dense networks, friends-of-friends may be similar to the original links. This implies that the IVs may be weak or their first-stage estimand may be undefined. For a class of random graphs, we use random graph theory to characterize regimes where such instruments perform well and where they do not. We show how weak-IV-robust inference can be adapted to this environment, and how scaling the network can help. We provide extensive Monte Carlo simulations and revisit empirical applications, showing the prevalence of such issues in empirical practice, and how our results restore valid inference.

en econ.EM
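The friends-of-friends instrument this abstract critiques can be sketched in a few lines. This is an illustrative version only: the function name and the two-hop averaging convention are my own, not taken from the paper.

```python
import numpy as np

def peers_of_peers_instrument(A, x):
    """Average characteristic of each node's friends-of-friends.

    A : (n, n) binary adjacency matrix with zero diagonal
    x : (n,) vector of exogenous characteristics
    Nodes with no friends-of-friends get an instrument value of 0.
    """
    A2 = (A @ A > 0).astype(float)       # nodes reachable in two hops (incl. self)
    np.fill_diagonal(A2, 0.0)            # a node is not its own peer-of-peer
    fof = np.clip(A2 - A, 0.0, 1.0)      # drop direct friends
    deg = fof.sum(axis=1)
    return fof @ x / np.maximum(deg, 1.0)
```

In a dense network the friends-of-friends set overlaps heavily with the direct friends, so the instrument drifts toward a near-constant average of x, which illustrates the weak-instrument concern the paper raises.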
arXiv Open Access 2025
An Instrumental Variables Approach to Testing Firm Conduct under a Bertrand-Nash Framework

Youngjin Hong, In Kyung Kim, Kyoo il Kim

Understanding firm conduct is crucial for industrial organization and antitrust policy. In this article, we develop a testing procedure based on the Rivers and Vuong non-nested model selection framework. Unlike existing methods that require estimating the demand and supply system, our approach compares the model fit of two first-stage price regressions. Through an extensive Monte Carlo study, we demonstrate that our test performs comparably to, or outperforms, existing methods in detecting collusion across various collusive scenarios. The results are robust to model misspecification, alternative functional forms for instruments, and data limitations. By simplifying the diagnosis of firm behavior, our method offers researchers and regulators an efficient tool for assessing industry conduct under a Bertrand oligopoly framework. Additionally, our approach offers a practical guideline for enhancing the strength of BLP-style instruments in demand estimation: once collusion is detected, researchers are advised to incorporate the product characteristics of colluding partners into own-firm instruments while excluding them from other-firm instruments.

en econ.GN
arXiv Open Access 2025
Lead Instrument Detection from Multitrack Music

Longshen Ou, Yu Takahashi, Ye Wang

Prior approaches to lead instrument detection primarily analyze mixture audio, limited to coarse classifications and lacking generalization ability. This paper presents a novel approach to lead instrument detection in multitrack music audio by crafting expertly annotated datasets and designing a novel framework that integrates a self-supervised learning model with a track-wise, frame-level attention-based classifier. This attention mechanism dynamically extracts and aggregates track-specific features based on their auditory importance, enabling precise detection across varied instrument types and combinations. Enhanced by track classification and permutation augmentation, our model substantially outperforms existing SVM and CRNN models, showing robustness on unseen instruments and out-of-domain testing. We believe our exploration provides valuable insights for future research on audio content analysis in multitrack music settings.

en cs.SD, eess.AS
DOAJ Open Access 2025
Students’ Innovation Ability Evaluation in Vocational Colleges Based on the STEAM Education Concept: A Neutrosophic SuperHyper Pentapartitioned Process Model

Xiaodan Kong

Innovation ability is increasingly regarded as a core competence in vocational education, particularly under the STEAM (Science, Technology, Engineering, Arts, Mathematics) paradigm that emphasizes integrative and creative problem-solving. However, evaluating innovation ability is challenging due to its multidimensionality, vagueness, and subjectivity. This paper introduces a novel Neutrosophic SuperHyper Pentapartitioned Process Model (NSPPM) designed to rigorously capture uncertainty, contradiction, and incompleteness in students' innovation assessment. Building on the formalism of Single-Valued Pentapartitioned Neutrosophic Sets (SVPNS), the model decomposes student performance into five dimensions: truth (T), contradiction (C), ignorance (G), uncertainty (U), and falsehood (F). These are then aggregated across multi-level tasks using a superhyperstructure composition that ensures closure, monotonicity, and idempotence. Decision-making is achieved via a dominance relation that compares students across dimensions, preserving the neutrosophic nature of the evaluation without collapsing it into scalar indices. To illustrate, synthetic case studies simulate student performance under STEAM tasks in engineering and arts, demonstrating how NSPPM identifies non-dominated students and reveals nuanced patterns of innovation capacity. The results highlight both the theoretical novelty of pentapartitioned neutrosophic superhyper modeling and its practical utility in educational evaluation.

Mathematics, Electronic computers. Computer science
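The dominance relation the abstract describes can be illustrated with a Pareto-style check over (T, C, G, U, F) tuples. The direction assigned to each dimension here (truth as a benefit, the other four as costs) is an assumption for illustration, not necessarily the paper's exact relation.

```python
def dominates(a, b):
    """Pareto-style dominance over pentapartitioned grades (T, C, G, U, F).

    Hypothetical ordering: truth T is a benefit dimension; contradiction C,
    ignorance G, uncertainty U, and falsehood F are cost dimensions.
    """
    t_a, *costs_a = a
    t_b, *costs_b = b
    no_worse = t_a >= t_b and all(ca <= cb for ca, cb in zip(costs_a, costs_b))
    strictly = t_a > t_b or any(ca < cb for ca, cb in zip(costs_a, costs_b))
    return no_worse and strictly

def non_dominated(students):
    """Students not dominated by any other student (the 'frontier')."""
    return [s for s in students
            if not any(dominates(o, s) for o in students if o is not s)]
```

Keeping the comparison vector-valued, as above, mirrors the paper's point of not collapsing the evaluation into a single scalar index.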
DOAJ Open Access 2025
Evaluation system of college english teaching quality based on fuzzy information of artificial intelligence

Ma Lina

The current evaluation of English teaching quality in colleges and universities faces the problems of information uncertainty and fuzziness. Traditional evaluation methods cannot accurately reflect the complex teaching effects, mainly due to the diversity of data and the fuzziness of evaluation dimensions. To address this issue, this paper proposes a college English teaching quality evaluation system that combines Fuzzy C-Means (FCM) and the Takagi-Sugeno Fuzzy Inference System (TS-FIS). First, the FCM algorithm is utilized to fuzzify various teaching data and convert the evaluation dimensions into fuzzy membership degrees. Then, TS-FIS is used to infer from this fuzzy information and generate comprehensive scores. Finally, a deep neural network (DNN) is employed to train on historical data, dynamically adjusting the evaluation results. The findings demonstrate that the system achieves an evaluation accuracy of more than 91% when dealing with uncertainties in complex teaching environments, and the score fluctuation range is controlled within 5% during the dynamic adjustment process, which proves the effectiveness of the system in improving evaluation accuracy and adaptability. The method proposed in this paper provides an effective solution to the problem of evaluating English teaching quality in colleges and universities using fuzzy information.

Computational linguistics. Natural language processing, Electronic computers. Computer science
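The FCM-to-TS-FIS pipeline the abstract describes can be sketched as follows. The cluster centers and rule outputs below are hypothetical placeholders, and the paper's actual system also adds a DNN adjustment stage not shown here.

```python
import numpy as np

def fcm_memberships(X, centers, m=2.0):
    """Fuzzy C-Means membership degrees for fixed cluster centers.

    Standard FCM formula: u[i, k] = 1 / sum_j (d_ik / d_ij)^(2/(m-1)).
    Rows sum to 1 by construction.
    """
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)  # avoid division by zero at a center
    ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)

def ts_score(u, rule_outputs):
    """Zero-order Takagi-Sugeno inference: membership-weighted rule outputs."""
    return (u * rule_outputs).sum(axis=1) / u.sum(axis=1)
```

A sample sitting exactly on a center gets membership ~1 for that cluster; samples between centers receive graded memberships, which is what lets the TS stage blend rule outputs smoothly.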
DOAJ Open Access 2025
FVM: A Formal Verification Methodology for VHDL Designs

Hipolito Guzman-Miranda, Marcos Lopez Garcia, Alberto Urbon Aguado

With the increasing complexity of digital designs, functional verification is becoming unmanageable. Bugs that survive verification cause a number of issues with functional, performance, security, safety, and economic impact, and are unfortunately prevalent in current FPGA and ASIC designs, manifesting in later stages of development or even after the design has been deployed or manufactured. In this context, Formal Verification stands as a powerful complement to verification by simulation, which is currently the most widespread verification method. By mathematically proving properties of the designs, Formal Verification makes it possible to verify them with high confidence, but it also requires designers to have deep expertise in the methods, techniques, and tools. Thus, the adoption of formal methods for verification is not as widespread as their usefulness may suggest, even less so among VHDL teams. To lower the adoption barriers for formal verification of digital designs, the present article proposes a Formal Verification Methodology, which is complemented by a build-and-test framework and a repository of examples. Applying the Formal Verification Methodology to the repository of examples shows compelling results in both manageable design complexity and verification productivity.

Electronic computers. Computer science, Information technology
DOAJ Open Access 2025
Empirical research on the evolution trend of heat and sentiment for emergencies

Shihong Wu, Wei Yu, Yanxia Zhao et al.

Emergencies inflict heavy casualties, economic losses, ecological damage, and significant social harm. By segmenting information topics and analysing emotional shifts, we can identify corresponding real-world events and their impacts, thereby providing guidance for timely responses to emergencies. In the past, public opinion monitoring of emergencies was based mainly on single-topic detection or emotion analysis, which cannot comprehensively evaluate the evolution of public opinion. In this work, word segmentation is applied to video comments related to various emergency situations. Themes are divided using a co-word network and the Louvain algorithm, and sentiment is assessed with the naive Bayes method through a time-series analysis of sentiment-value changes for various emergencies, so that the evolution of public opinion is comprehensively evaluated. As a result, the pivotal nodes in the evolution of public opinion are identified and the evolution process is divided into stages. Using this method, relevant management departments can effectively address the majority of public opinions for various types of emergencies from the perspectives of prevention, adjustment, and recovery. This approach not only enhances rescue efficiency and strengthens safety management but also actively guides the evolution of public opinion, ultimately providing society with solid and reliable security safeguards.

Electronic computers. Computer science, Science
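The co-word network step can be illustrated with a minimal edge-weight builder. Tokenization is reduced to whitespace splitting here (a real pipeline would apply proper word segmentation and stop-word filtering first), and the Louvain community step would then run on these weighted edges.

```python
from collections import Counter
from itertools import combinations

def coword_edges(comments):
    """Weighted co-word edges from a list of comments.

    Edge weight = number of comments in which a pair of terms co-occurs.
    Terms are deduplicated per comment and pairs are stored in sorted
    order so (a, b) and (b, a) count as the same edge.
    """
    edges = Counter()
    for text in comments:
        terms = sorted(set(text.split()))
        edges.update(combinations(terms, 2))
    return edges
```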
arXiv Open Access 2024
Testing the Exogeneity of Instrumental Variables and Regressors in Linear Regression Models Using Copulas

Seyed Morteza Emadi

We provide a Copula-based approach to test the exogeneity of instrumental variables in linear regression models. We show that the exogeneity of instrumental variables is equivalent to the exogeneity of their standard normal transformations with the same CDF value. Then, we establish a Wald test for the exogeneity of the instrumental variables. We demonstrate the performance of our test using simulation studies. Our simulations show that if the instruments are actually endogenous, our test rejects the exogeneity hypothesis approximately 93% of the time at the 5% significance level. Conversely, when instruments are truly exogenous, it dismisses the exogeneity assumption less than 30% of the time on average for data with 200 observations and less than 2% of the time for data with 1,000 observations. Our results demonstrate our test's effectiveness, offering significant value to applied econometricians.

en stat.ME, econ.EM
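The transformation at the heart of this test, mapping each variable to standard-normal scores sharing its empirical CDF values, can be sketched as follows. The rank-based CDF estimator (ranks divided by n + 1 to stay inside (0, 1)) is an illustrative choice, not necessarily the paper's exact estimator.

```python
import numpy as np
from statistics import NormalDist

def normal_scores(z):
    """Standard-normal scores with the same empirical CDF ranks as z.

    Assumes no ties; ranks / (n + 1) keeps probabilities strictly
    inside (0, 1) so the inverse normal CDF is always defined.
    """
    z = np.asarray(z, dtype=float)
    ranks = z.argsort().argsort() + 1  # 1..n
    u = ranks / (len(z) + 1)
    inv = NormalDist().inv_cdf
    return np.array([inv(p) for p in u])
```

The mapping is monotone, so it preserves the ordering of the data while giving the transformed variable a standard-normal marginal, which is what allows a Wald test to be set up on the transformed instruments.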
DOAJ Open Access 2024
Semantic similarity on multimodal data: A comprehensive survey with applications

Baha Ihnaini, Belal Abuhaija, Ebenezer Atta Mills et al.

Recently, the revival of the semantic similarity concept has been driven by rapidly growing artificial intelligence research, fueled by advanced deep learning architectures that enable machine intelligence over multimodal data. Thus, semantic similarity in multimodal data has gained substantial attention among researchers. However, existing surveys on semantic similarity measures are restricted to a single modality, mainly text, which significantly limits the capability to understand the intelligence of real-world application scenarios. This study critically reviews semantic similarity approaches by shortlisting 223 vital articles from the leading databases and digital libraries to offer a comprehensive and systematic literature survey. Its notable contribution is to illuminate the evolving landscape of semantic similarity and its crucial role in understanding, interpreting, and extracting meaningful information from multimodal data. Primarily, it highlights the challenges and opportunities inherent in different modalities, emphasizing the significance of advancements in cross-modal and multimodal semantic similarity approaches with potential application scenarios. Finally, the survey concludes by summarizing valuable future research directions. The insights provided in this survey improve understanding and pave the way for further innovation by guiding researchers in leveraging the strength of semantic similarity for an extensive range of real-world applications.

Electronic computers. Computer science
DOAJ Open Access 2024
How large language model-powered conversational agents influence decision making in domestic medical triage contexts

Catalina Gomez, Junjie Yin, Chien-Ming Huang et al.

Introduction: Effective delivery of healthcare depends on timely and accurate triage decisions, directing patients to appropriate care pathways and reducing unnecessary visits. Artificial Intelligence (AI) solutions, particularly those based on Large Language Models (LLMs), may enable non-experts to make better triage decisions at home, thus easing the healthcare system's load. We investigate how LLM-powered conversational agents influence non-experts in making triage decisions, further studying different persona profiles embedded via prompting. Methods: We designed a randomized experiment where participants first assessed patient symptom vignettes independently, then consulted one of two agent profiles (rational or empathic) for advice, and finally revised their triage ratings. We used linear models to quantify the effect of the agent profile and confidence on the weight of advice. We examined changes in confidence and accuracy of triage decisions, along with participants' perceptions of the agents. Results: In a study with 49 layperson participants, we found that persona profiles can be differentiated in LLM-powered conversational agents. However, these profiles did not significantly affect the weight of advice. Notably, less confident participants were more influenced by LLM advice, leading to larger adjustments to initial decisions. AI guidance improved alignment with correct triage levels and boosted confidence in participants' decisions. Discussion: While LLM advice improves the accuracy of triage recommendations, confidence plays an important role in its adoption. Our findings raise design considerations for human-AI interfaces, highlighting two key aspects: encouraging appropriate alignment with LLMs' advice and ensuring that people are not easily swayed in situations of uncertainty.

Electronic computers. Computer science
arXiv Open Access 2023
Investigating a domain adaptation approach for integrating different measurement instruments in a longitudinal clinical registry

Maren Hackenberg, Michelle Pfaffenlehner, Max Behrens et al.

In a longitudinal clinical registry, different measurement instruments might have been used for assessing individuals at different time points. To combine them, we investigate deep learning techniques for obtaining a joint latent representation, to which the items of different measurement instruments are mapped. This corresponds to domain adaptation, an established concept in computer science for image data. Using the proposed approach as an example, we evaluate the potential of domain adaptation in a longitudinal cohort setting with a rather small number of time points, motivated by an application with different motor function measurement instruments in a registry of spinal muscular atrophy (SMA) patients. There, we model trajectories in the latent representation by ordinary differential equations (ODEs), where person-specific ODE parameters are inferred from baseline characteristics. The goodness of fit and complexity of the ODE solutions then allow judging the measurement instrument mappings. We subsequently explore how alignment can be improved by incorporating corresponding penalty terms into model fitting. To systematically investigate the effect of differences between measurement instruments, we consider several scenarios based on modified SMA data, including scenarios where a mapping should be feasible in principle and scenarios where no perfect mapping is available. While misalignment increases in more complex scenarios, some structure is still recovered, even if the availability of measurement instruments depends on patient state. A reasonable mapping is also feasible in the more complex real SMA dataset. These results indicate that domain adaptation might be more generally useful in statistical modeling for longitudinal registry data.

en cs.LG, stat.ME
DOAJ Open Access 2023
Applying a unified process kinetic equation to advanced materials process analysis: Characterization of the kinetics of isothermal microwave‐assisted chemical syntheses

Boon Wong

Rate-enhancement of any isothermal, isobaric chemical synthesis conducted under resonant microwave (RM) irradiation versus the same process activated by conventional field-free heating has been attributed to a reduction in activation enthalpy of the process. This report applies a unified process kinetic equation (UPKE) to demonstrate and characterize non-thermal microwave effects (NTME) on kinetics-enhancements observed in isothermal microwave-assisted chemical syntheses (IMACS). The UPKE, derived from a mesoscopic irreversible thermodynamic model, pinpoints that the rate of any high-affinity chemical reaction is effectively independent of the affinity of the process as described by the mass-action rate law. Energetically, activation enthalpy reduction observed in IMACS is considered the major NTME, which causes dominant process-rate enhancements. This NTME results from RM-induced enthalpy variation during the reaction: RM energy-input first promotes the molar enthalpy of the irradiated reactant(s) at temperature, which consequently motivates an activation enthalpy reduction for rate-enhancement. Conversely, frequency coefficient lowering is another common NTME occurring in IMACS, causing an adverse yet compensable setback to process-kinetics as predicted by the UPKE. Applicability of the UPKE-proposed rationale and methodology for IMACS kinetic characterization is fully confirmed by relevant data in the literature.

Engineering (General). Civil engineering (General), Electronic computers. Computer science
DOAJ Open Access 2023
An effective stacked autoencoder based depth separable convolutional neural network model for face mask detection

Sundaravadivazhagan Balasubaramanian, Robin Cyriac, Sahana Roshan et al.

The COVID-19 pandemic has affected the entire world over the past few years. To prevent the spread of COVID-19, people have acclimatised to the new normal, which includes working from home, communicating online, and maintaining personal cleanliness. Numerous tools are required to prepare for combating transmission in the future. One of these elements for protecting individuals from fatal virus transmission is the mask. Studies have indicated that wearing a mask may help to reduce the risk of viral transmission of all kinds. This has led many public places to take measures to ensure that their visitors wear adequate face masks and keep a safe distance from one another. Screening systems need to be installed at the doors of businesses, schools, government buildings, private offices, and other important areas. A variety of face detection models have been designed using various algorithms and techniques. Most previously published articles have not combined dimensionality reduction with depth-wise separable neural networks. The need to determine the identities of people who do not cover their faces in public is the driving factor for the development of this methodology. This research work proposes a deep learning technique to determine whether a person is wearing a mask and whether it is worn properly. The Stacked Auto Encoder (SAE) technique is implemented by stacking the following components: Principal Component Analysis (PCA) and a Depth-wise Separable Convolutional Neural Network (DWSC-NN). PCA is used to reduce the irrelevant features in the images and results in a high true positive rate in mask detection. We achieved an accuracy score of 94.16% and an F1 score of 96.009% by applying the method described in this research.

Computer engineering. Computer hardware, Electronic computers. Computer science
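The parameter savings that motivate depth-wise separable convolutions, as used in DWSC-NN, follow from a simple count; a bias-free comparison:

```python
def conv_params(c_in, c_out, k):
    """Parameter counts (ignoring biases) for a standard k x k convolution
    versus a depth-wise separable one (depth-wise k x k + 1 x 1 point-wise)."""
    standard = c_in * c_out * k * k
    separable = c_in * k * k + c_in * c_out
    return standard, separable
```

For a 3 x 3 layer mapping 64 to 128 channels this gives 73728 versus 8768 parameters, roughly an 8x reduction, which is why the architecture suits lightweight screening deployments.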
CrossRef Open Access 2022
Introducing Robotized Stator Cable Winding to Rotating Electric Machines

Erik Hultman

Following environmental concerns and the rapid digitalization of our society, we are currently experiencing extensive electrification and an industrial revolution. High numbers of electric machines thus need to be assembled for varying applications, including vehicle propulsion and renewable energy conversion. Cable winding is an alternative stator winding technology for electric machines that has been utilized for such applications, so far in smaller series or in prototype machines. The presented work introduces the first concept for automated stator cable winding of rotating electric machines. This concept could enable higher production volumes of cable wound machines and a unique flexibility in handling different machines, in line with Industry 4.0. Robotized stator cable winding is evaluated here for five very different rotating machine designs, through simulations and analytical extrapolation of previous experimental winding results. Potential cycle time and assembly cost savings are indicated compared to manual and lower-volume conventional automation, although the concept cannot in its present form compete with existing very high-volume conventional winding automation for smaller machines. Future experimental work is outlined on handling larger winding cables and special machine designs, and on increased robustness and optimization.

DOAJ Open Access 2022
Stabilizing deep tomographic reconstruction: Part B. Convergence analysis and adversarial attacks

Weiwen Wu, Dianlin Hu, Wenxiang Cong et al.

Summary: Due to a lack of kernel awareness, some popular deep image reconstruction networks are unstable. To address this problem, here we introduce the bounded relative error norm (BREN) property, which is a special case of Lipschitz continuity. Then, we perform a convergence study consisting of two parts: (1) a heuristic analysis of the convergence of the analytic compressed iterative deep (ACID) scheme (with the simplification that the CS module achieves perfect sparsification), and (2) a mathematically denser analysis (with two approximations: (a) the transpose A^T is viewed as an inverse A^(-1) from the perspective of an iterative reconstruction procedure, and (b) a pseudo-inverse is used for the total variation operator H). Also, we present adversarial attack algorithms to perturb the selected reconstruction networks individually and, more importantly, to attack the ACID workflow as a whole. Finally, we show the numerical convergence of the ACID iteration in terms of the Lipschitz constant and the local stability against noise. The bigger picture: For deep tomographic reconstruction to realize its full potential in practice, it is critically important to address the instabilities of deep reconstruction networks, which were identified in a recent PNAS paper. Our analytic compressed iterative deep (ACID) framework has provided an effective solution to this challenge by synergizing deep learning and compressed sensing through iterative refinement. Here, we provide an initial convergence analysis, describe an algorithm to attack the entire ACID workflow, and establish not only its capability of stabilizing an unstable deep reconstruction network but also its stability against adversarial attacks dedicated to ACID as a whole. Although our theoretical results rely on approximations, they shed light on the converging mechanism of ACID, serving as a basis for further investigation.

Computer software
DOAJ Open Access 2022
LPMP: A Bio-Inspired Model for Visual Localization in Challenging Environments

Sylvain Colomer, Sylvain Colomer, Nicolas Cuperlier et al.

Autonomous vehicles require precise and reliable self-localization to cope with dynamic environments. The field of visual place recognition (VPR) aims to solve this challenge by relying on the visual modality to recognize a place despite changes in the appearance of the perceived visual scene. In this paper, we propose to tackle the VPR problem following a neuro-cybernetic approach. To this end, the Log-Polar Max-Pi (LPMP) model is introduced. This bio-inspired neural network allows building a neural representation of the environment via unsupervised one-shot learning. Inspired by the spatial cognition of mammals, visual information in the LPMP model is processed through two distinct pathways: a "what" pathway that extracts and learns the local visual signatures (landmarks) of a visual scene and a "where" pathway that computes their azimuth. These two pieces of information are then merged to build a visuospatial code that is characteristic of the place where the visual scene was perceived. Three main contributions are presented in this article: 1) the LPMP model is studied and compared with NetVLAD and CoHog, two state-of-the-art VPR models; 2) a test benchmark for the evaluation of VPR models according to the type of environment traveled is proposed based on the Oxford car dataset; and 3) the impact of the use of a novel detector leading to an uneven paving of an environment is evaluated in terms of the localization performance and compared to a regular paving. Our experiments show that the LPMP model can achieve comparable or better localization performance than NetVLAD and CoHog.

Mechanical engineering and machinery, Electronic computers. Computer science
DOAJ Open Access 2022
Switchable half-metallicity in A-type antiferromagnetic NiI2 bilayer coupled with ferroelectric In2Se3

Yaping Wang, Xinguang Xu, Xian Zhao et al.

Electrically controlled half-metallicity in antiferromagnets is of great significance for both fundamental research and practical application. Here, by constructing van der Waals heterostructures composed of a two-dimensional (2D) A-type antiferromagnetic NiI2 bilayer (bi-NiI2) and ferroelectric In2Se3 of different thickness, we propose that half-metallicity is realizable and switchable in the bi-NiI2 proximate to an In2Se3 bilayer (bi-In2Se3). The polarization flipping of the bi-In2Se3 drives a transition between half-metallic and semiconducting states in the bi-NiI2. This intriguing phenomenon is attributed to the joint effect of polarization field-induced energy band shift and interfacial charge transfer. Besides, the easy magnetization axis of the bi-NiI2 is also dependent on the polarization direction of the bi-In2Se3. The half-metallicity and magnetic anisotropy energy of the bi-NiI2 in the heterostructure can be effectively manipulated by strain. These findings provide not only a feasible strategy to achieve and control half-metallicity in 2D antiferromagnets, but also a promising candidate for designing advanced nanodevices.

Materials of engineering and construction. Mechanics of materials, Computer software
arXiv Open Access 2021
VFSIE -- Development and Testing Framework for Federated Science Instruments

Anees Al-Najjar, Nageswara S. V. Rao, Neena Imam et al.

Recent developments in the softwarization of networked infrastructures, combined with containerization of computing workflows, promise unprecedented compute-anywhere-and-everywhere capabilities for federations of edge and remote computing systems and science instruments. The development and testing of software stacks that implement these capabilities over physical production federations, however, is neither practical nor cost-effective. In response, we develop a digital twin of the physical infrastructure, called the Virtual Federated Science Instrument Environment (VFSIE). This framework emulates the federation using containers and hosts connected over an emulated network, and supports the development and testing of federation stacks and workflows. We illustrate its use in a case study involving Jupyter Notebook computations and instrument control.

en cs.NI

Page 23 of 29610