Results for "Computer engineering. Computer hardware"

Showing 20 of ~8,510,079 results · from CrossRef, DOAJ, arXiv, Semantic Scholar

arXiv Open Access 2026
Dishonesty Tendencies in Testing Scenarios Among Students with Virtual Reality and Computer-Mediated Technology

Tanja Kojić, Alina Dovhalevska, Maurizio Vergari et al.

Virtual reality (VR) systems have the potential to be an innovation in the field of e-learning. Starting with fully functional e-classes, VR technologies can be used to build entire e-campuses. The power of VR is that it allows for stronger contact with students than computer-mediated technology. Deceptive behaviour, both verbal and nonverbal, refers to intentional activities designed to deceive others. Students often engage in dishonest practices to make progress. Whether it is cheating on an exam, copying another student's essay, or inflating a GPA, the motivation for cheating is rarely a simple lack of preparation. Even though some may see academic dishonesty as an asset, the reality is that it can have major consequences. This poster presents the findings of a study of students' deceitful behaviour during a test in VR and in real-life situations. For this user study, 22 volunteers were invited to participate, with each experiment involving exactly two participants and the examiner present in the room. Students took two tests, one in VR and one on a laptop, each simulating a real-world online exam, with the goal of scoring as many points as possible. Participants were asked to complete questionnaires during and after each experiment, which helped collect additional data for this study. The results indicate that the amount of cheating that happened in VR and on a laptop was exactly the same.

arXiv Open Access 2026
Maintaining the Heterogeneity in the Organization of Software Engineering Research

Yang Yue, Zheng Jiang, Yi Wang

Heterogeneity has historically existed in the organization of software engineering (SE) research, namely the funded research model and the hands-on model, and it has helped make software engineering a thriving interdisciplinary field over the last 50 years. However, the funded research model has recently become dominant in SE research, indicating that this heterogeneity is seriously and systematically threatened. In this essay, we first explain why heterogeneity is needed in the organization of SE research, then present the current trend in SE research, as well as its consequences and potential futures. The choice is in our hands, and we urge our community to seriously consider maintaining the heterogeneity in the organization of software engineering research.

en cs.SE
DOAJ Open Access 2025
Fuzzy clustering with robust learning models for soccer player profiling and resilience analysis

Antonio Pacifico

This study proposes an integrated analytical framework to enhance football performance analytics by combining feature engineering, fuzzy clustering, interpretable machine learning, and topological network analysis. The framework is designed to extract latent offensive profiles and predict high-efficiency scoring profiles across domestic and international competitions. The approach begins by constructing three composite indicators, the Index of Offensive Efficiency, the Competitive Resilience Index, and the Versatility Score, designed to capture multidimensional aspects of a player's offensive productivity, adaptability across competitions, and contribution breadth. These engineered metrics inform a fuzzy clustering algorithm that reveals two core performance profiles: "Seasoned Finishing Specialists" and "Emerging Versatile Contributors". Building on this segmentation, a supervised learning model based on XGBoost is employed to predict the likelihood of surpassing a goals-per-shot efficiency threshold. Model interpretability is ensured via SHAP plots, which highlight the pivotal role of salary, finishing metrics, and competition-specific resilience. Partial dependence plots further expose nonlinear and interactive effects between key predictors. A network-based analysis complements the model by mapping performance similarities and identifying both archetypal and transitional performers via centrality measures. Robustness checks, including alternative winsorization, fuzziness levels, and subgroup-specific clustering, confirm the stability of the results. Overall, the proposed framework bridges segmentation and prediction with transparency and domain relevance, offering a comprehensive toolkit for decision-makers in sports analytics, recruiters, and talent managers.

Computer engineering. Computer hardware, Information technology
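The clustering step in the abstract above can be illustrated with a minimal fuzzy c-means implementation (the standard algorithm; the paper's indicator construction and cluster labels are not reproduced here, and the feature matrix below is synthetic):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: returns (centers, membership matrix U)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)        # memberships sum to 1 per point
    for _ in range(n_iter):
        W = U ** m                           # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2.0 / (m - 1.0))     # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Synthetic "player feature" matrix: two well-separated groups.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (20, 3)), rng.normal(3, 0.3, (20, 3))])
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)
```

Unlike hard k-means, each row of `U` gives graded memberships, which is what allows "transitional" profiles between the two archetypes to be expressed.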
arXiv Open Access 2025
HaLoRA: Hardware-aware Low-Rank Adaptation for Large Language Models Based on Hybrid Compute-in-Memory Architecture

Taiqiang Wu, Chenchen Ding, Wenyong Zhou et al.

Low-rank adaptation (LoRA) is a predominant parameter-efficient finetuning method for adapting large language models (LLMs) to downstream tasks. Meanwhile, Compute-in-Memory (CIM) architectures demonstrate superior energy efficiency due to their array-level parallel in-memory computing designs. In this paper, we propose deploying LoRA-finetuned LLMs on a hybrid CIM architecture (i.e., pretrained weights on energy-efficient Resistive Random-Access Memory (RRAM) and LoRA branches on noise-free Static Random-Access Memory (SRAM)), reducing the energy cost to about 3% of that of the Nvidia A100 GPU. However, the inherent noise of RRAM on the stored weights simultaneously leads to performance degradation. To address this issue, we design a novel Hardware-aware Low-rank Adaptation (HaLoRA) method. The key insight is to train a LoRA branch that is robust to such noise and then deploy it on noise-free SRAM; the extra cost is negligible since LoRA has far fewer parameters than the pretrained weights (e.g., 0.15% for the LLaMA-3.2 1B model). To improve robustness to the noise, we theoretically analyze the gap between the optimization trajectories of the LoRA branch under ideal and noisy conditions and design an extra loss to minimize the upper bound of this gap. Therefore, we can enjoy both energy efficiency and accuracy during inference. Experiments finetuning the Qwen and LLaMA series demonstrate the effectiveness of HaLoRA across multiple reasoning tasks, achieving up to a 22.7-point improvement in average score while maintaining robustness across various noise types and noise levels.

en cs.CL, cs.AR
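The deployment idea described above, a frozen base weight subject to RRAM noise plus an exact low-rank branch held in SRAM, can be sketched as a toy numpy forward pass (the noise model, scales, and dimensions here are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 512, 4                                    # hidden size and LoRA rank
x = rng.normal(size=(1, d))

W = rng.normal(scale=0.02, size=(d, d))          # frozen pretrained weight
noise = rng.normal(scale=0.002, size=W.shape)    # toy stand-in for RRAM noise
A = rng.normal(scale=0.01, size=(d, r))          # LoRA down-projection
B = np.zeros((r, d))                             # LoRA up-projection, zero init

# Hybrid forward pass: noisy base path (RRAM) plus exact low-rank path (SRAM).
y = x @ (W + noise) + (x @ A) @ B

# The LoRA branch adds only a tiny fraction of extra parameters: 2*d*r / d^2.
ratio = (A.size + B.size) / W.size
```

With `B` initialized to zero the low-rank path contributes nothing at step 0; training then shapes `A` and `B` (in HaLoRA, with an extra robustness loss) so the noise-free branch compensates for the noisy base path.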
DOAJ Open Access 2024
Spontaneously formed cellulose-based random micro-textured film for light extraction in organic light-emitting diodes

Sora Han, Baeksang Sung, Hyunjun Jang et al.

This study utilized cellulose-based external light extraction films to enhance the outcoupling efficiency of organic light-emitting diodes (OLEDs). These films were created through a straightforward fabrication process employing the biopolymers hydroxyethyl cellulose (HEC) and tannic acid (TA). We observed that hydrogen bonding between HEC and TA can spontaneously generate a random microtextured surface on the HEC:TA films. Furthermore, the surface morphology of the HEC:TA film can be adjusted by varying the TA content. When applying the HEC:TA films as an external light-extraction layer for OLEDs, we observed a remarkable improvement of up to 54% in the external quantum efficiency and current efficiency of the OLEDs compared to those without the HEC:TA film. Additionally, the HEC:TA film displayed an outstanding ability to absorb ultraviolet light. Therefore, it is anticipated that the HEC:TA film can significantly contribute to prolonging the stability of OLEDs.

Computer engineering. Computer hardware
DOAJ Open Access 2024
Hybrid Filtering Method for Multisource Point Cloud Data of Maglev Tracks

ZHANG Yuxin, ZHANG Lei, OU Dongxiu

In the simulation data processing of maglev tracks, the filtering and extraction of maglev track point cloud data is an important step. Practical applications should therefore adopt an efficient filtering method suited to the characteristics of the maglev data to be extracted. The point cloud data of the maglev track primarily comprise image data obtained by Unmanned Aerial Vehicle (UAV) oblique photography and formed into dense point cloud data after 3D reconstruction, and laser point cloud data obtained by handheld lidar scanning of the maglev track. Based on the characteristics of these point clouds, and considering the complex scenes around the maglev track, the two types of point clouds are mixed and filtered. First, the octree downsampling method is applied to the laser point cloud data, which effectively reduces the order of magnitude of the point cloud data and saves running time. The Cloth Simulation Filtering (CSF) method is then used on the laser point cloud and dense point cloud data to filter out the ground-plane point cloud and retain the non-ground point cloud data. A Statistical Outlier Removal (SOR) filtering method is used to remove a large number of outliers. Finally, based on the characteristics of the maglev track, point clouds outside the coordinate range are removed through pass-through filtering. Experimental results show that, without changing the structure of the maglev track, the filtering rates of the proposed method are 86.15% for the octree-downsampled laser point cloud data and 64.76% for the dense point cloud data without octree downsampling. The two point cloud datasets have similar structural ranges after hybrid filtering and numbers of points of the same order of magnitude, which supports subsequent methods such as feature extraction from point clouds of maglev tracks.

Computer engineering. Computer hardware, Computer software
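Of the pipeline stages listed above, the SOR step is easy to sketch: compute each point's mean distance to its k nearest neighbours and drop points whose mean distance exceeds the global mean by a multiple of the standard deviation (a brute-force numpy sketch on synthetic points; the paper's parameter values are not given):

```python
import numpy as np

def sor_filter(points, k=8, std_mult=1.0):
    """Statistical Outlier Removal: keep points whose mean k-NN distance
    is within (global mean + std_mult * global std)."""
    # Pairwise distances (brute force; fine for small clouds).
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    # Mean distance to the k nearest neighbours (skip column 0 = self).
    knn_mean = np.sort(d, axis=1)[:, 1:k + 1].mean(axis=1)
    threshold = knn_mean.mean() + std_mult * knn_mean.std()
    return points[knn_mean <= threshold]

rng = np.random.default_rng(0)
cloud = rng.normal(size=(200, 3))               # dense cluster
outliers = rng.normal(loc=20.0, size=(5, 3))    # far-away noise points
filtered = sor_filter(np.vstack([cloud, outliers]))
```

Production pipelines would use a KD-tree for the neighbour search (as PCL's SOR filter does) rather than the O(n²) distance matrix used here for clarity.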
DOAJ Open Access 2024
AI-empowered mobile edge computing: inducing balanced federated learning strategy over edge for balanced data and optimized computation cost

Momina Shaheen, Muhammad S. Farooq, Tariq Umer

In Mobile Edge Computing, the federated learning framework enables collaborative learning models across edge nodes without necessitating the direct exchange of data from edge nodes. It addresses significant challenges encompassing access rights, privacy, security, and the utilization of heterogeneous data sources over mobile edge computing. Edge devices generate and gather data across the network in a non-IID (not independent and identically distributed) manner, leading to potential variations in the number of data samples among these edge networks. A method is proposed for federated learning in an edge computing setting that involves AI techniques such as data augmentation and class estimation and balancing during the training process, with minimized computational overhead. This is accomplished through the implementation of data augmentation techniques to refine the data distribution. Additionally, we leverage class estimation and employ linear regression for client-side model training. This strategic approach yields a reduction in computational costs. To validate the effectiveness of the proposed approach, it is applied to two distinct datasets: one consists of image data (FashionMNIST), while the other comprises numerical and textual stock data for predictive analysis of stock values. The approach demonstrates commendable performance across both dataset types, achieving more than 92% accuracy in the federated learning paradigm.

Computer engineering. Computer hardware, Electronic computers. Computer science
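The per-client balancing idea described above, estimating class counts on a client's non-IID shard and augmenting minority classes before local training, can be sketched as follows (simple oversampling with jitter stands in for the paper's augmentation pipeline; all names and scales are illustrative):

```python
import numpy as np

def balance_client_data(X, y, rng=None):
    """Oversample minority classes with small jitter so that every class
    on this client has as many samples as the largest class."""
    if rng is None:
        rng = np.random.default_rng(0)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    X_out, y_out = [X], [y]
    for cls, cnt in zip(classes, counts):
        if cnt < target:
            idx = rng.choice(np.flatnonzero(y == cls), size=target - cnt)
            aug = X[idx] + rng.normal(scale=0.01, size=X[idx].shape)  # jitter
            X_out.append(aug)
            y_out.append(np.full(target - cnt, cls))
    return np.concatenate(X_out), np.concatenate(y_out)

# Non-IID client shard: class 0 dominates 10-to-1.
rng = np.random.default_rng(1)
X = rng.normal(size=(110, 4))
y = np.array([0] * 100 + [1] * 10)
Xb, yb = balance_client_data(X, y, rng)
```

Balancing locally on each client is the point: no raw data leaves the device, so the federated privacy guarantee is preserved while the aggregated model sees a less skewed class distribution.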
DOAJ Open Access 2022
Analog Circuit Fault Classification and Data Reduction Using PCA-ANFIS Technique Aided by K-means Clustering Approach

LAIDANI, I., BOUROUBA, N.

This work aims to effectively extract the fault feature information of analog integrated circuits and to improve the performance of the fault classification process. A fault classification method based on principal component analysis (PCA) and an adaptive neuro-fuzzy inference system (ANFIS) classifier, preprocessed by K-means clustering (KMC), is proposed. To effectively extract and select fault features, traditional signal processing based on a sampling technique yields different signature parameters. A stimulus pulse signal applied to the circuit under test (CUT) provides a reference output response. Respecting a specific sampling interval and step, the fault-free and faulty output responses are sampled to create amplitude sample features that serve the fault classification process. The PCA employed for data reduction lessens computational complexity and yields optimal features; data volume is thus reduced by more than 75% without loss of original information. The principal components extracted by this data reduction method are input into the ANFIS, aided by KMC, to obtain the best fault diagnosis results. The experimental results show 100% diagnostic accuracy for the CUTs. Therefore, our approach achieves better fault classification precision compared to other research works.

Electrical engineering. Electronics. Nuclear engineering, Computer engineering. Computer hardware
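The PCA reduction step described above can be sketched in a few lines: project the sampled response features onto the leading principal components, keeping just enough of them to preserve the original information (synthetic data; the 75% volume figure in the abstract refers to the authors' dataset, not to this sketch):

```python
import numpy as np

def pca_reduce(X, var_keep=0.95):
    """Project X onto the fewest principal components explaining
    at least `var_keep` of the total variance."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = (s ** 2) / (s ** 2).sum()
    k = int(np.searchsorted(np.cumsum(explained), var_keep)) + 1
    return Xc @ Vt[:k].T, k        # scores in the k-dimensional space

# Synthetic 20-dimensional "amplitude samples" living on a 3-D subspace.
rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 3))
X = latent @ rng.normal(size=(3, 20)) + rng.normal(scale=0.01, size=(300, 20))
scores, k = pca_reduce(X)
```

The reduced `scores` matrix would then feed the downstream classifier (ANFIS in the paper), which is why the reduction cuts classification cost without discarding discriminative information.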
DOAJ Open Access 2022
Application of Fast P2P Traffic Recognition Technology Based on Decision Tree in the Detection of Network Traffic Data

Lin Zheng, Junjiao Li

With the rapid development of large-scale enterprise informatization, network scale has become huge and complex, and the data traffic carried by the network is increasing. Accurate network traffic identification is the basis of network management and is of great significance to enterprise informatization construction, operation, and maintenance. In response to the network operation and maintenance requirements of large enterprises, this paper analyzes the network architecture and network traffic distribution of large enterprise groups from the perspective of enterprise network operators and reviews the current operation and maintenance process for enterprise network performance failures. The maintenance process is optimized and reengineered to identify the shortcomings of the current process links and to put forward corresponding solutions. Based on research on traffic identification in recent years, a fast P2P network traffic anomaly identification algorithm based on a decision tree model is proposed, which improves the efficiency and accuracy of network traffic application identification and network traffic anomaly data identification.

Computer engineering. Computer hardware
DOAJ Open Access 2022
Experimental Study of Oxygen Separation in Oxygen- Pressure Swing Adsorption Unit

Radek Šulc, Miroslav Kos

Pressure swing adsorption (PSA) units are widely used as oxygen sources where oxygen is produced in gaseous form. A start-up time of minutes is an undeniable advantage of PSA technology compared to cryogenic air separation, whose start-up takes hours or days. The purity of oxygen produced by PSA using nitrogen-selective zeolites (type A and type X zeolites) is limited to 95% oxygen. The pilot-plant adsorption unit utilizes two-bed pressure swing adsorption technology. The nominal capacity of this unit is 1.4 kg/h of gaseous oxygen with a purity of 95% oxygen. This paper analyzes the process characteristics of oxygen separation from air in the pilot-plant adsorption unit for an adsorption pressure of 5.5 bar and a defined adsorption cycle. The number of cycles needed to obtain relevant results was also investigated.

Chemical engineering, Computer engineering. Computer hardware
DOAJ Open Access 2022
Side Channel Attack On Stream Ciphers: A Three-Step Approach To State/Key Recovery

Satyam Kumar, Vishnu Asutosh Dasu, Anubhab Baksi et al.

Side Channel Attack (SCA) exploits physical information leakage (such as electromagnetic emanation) from a device performing a cryptographic operation and poses a serious threat in the present IoT era. In the last couple of decades, there has been a large body of research dedicated to streamlining/improving the attacks or suggesting novel countermeasures to thwart them. However, a closer inspection reveals that a vast majority of published works in the context of symmetric key cryptography are dedicated to block ciphers (or similar designs). This leaves the problem wide open for stream ciphers. There are a few works here and there, but a generic and systematic framework appears to be missing from the literature. Motivated by this observation, we explore the problem of SCA on stream ciphers in extensive detail. Loosely speaking, our work picks up from the recent TCHES'21 paper by Sim, Bhasin and Jap. We present a framework extending the efficiency of their analysis, bringing it into more practical terms. In a nutshell, we develop an automated framework that works as a generic tool to perform SCA on any stream cipher or similar structure. It combines multiple automated tools (such as machine learning, mixed integer linear programming, and satisfiability modulo theories) under one umbrella, and acts as an end-to-end solution (taking side channel traces and returning the secret key). Our framework efficiently handles noisy data and works even after the cipher reaches its pseudo-random state. We demonstrate its efficacy by taking electromagnetic traces from a 32-bit software platform and performing SCA on a high-profile stream cipher, TRIVIUM, which is also an ISO standard. We show pragmatic key recovery on TRIVIUM during its initialization and also after the cipher reaches its pseudo-random state (i.e., producing key-stream).

Computer engineering. Computer hardware, Information technology
arXiv Open Access 2022
Combining Photogrammetric Computer Vision and Semantic Segmentation for Fine-grained Understanding of Coral Reef Growth under Climate Change

Jiageng Zhong, Ming Li, Hanqi Zhang et al.

Corals are the primary habitat-building life-form on reefs, supporting a quarter of the species in the ocean. A coral reef ecosystem usually consists of reefs, each of which is like a tall building in a city. These reef-building corals secrete hard calcareous exoskeletons that give them structural rigidity, which is also a prerequisite for accurate 3D modeling and semantic mapping using advanced photogrammetric computer vision and machine learning. Underwater videography, a modern underwater remote sensing tool, is a high-resolution coral habitat survey and mapping technique. In this paper, detailed 3D mesh models, digital surface models and orthophotos of the coral habitat are generated from collected coral images and underwater control points. Meanwhile, a novel pixel-wise semantic segmentation of the orthophotos is performed using advanced deep learning. Finally, the semantic map is mapped into 3D space. For the first time, 3D fine-grained semantic modeling and rugosity evaluation of coral reefs have been completed at millimeter (mm) accuracy. This provides a new and powerful method for understanding the processes and characteristics of coral reef change at high spatial and temporal resolution under climate change.

en cs.CV
arXiv Open Access 2022
Revisiting Embodiment for Brain-Computer Interfaces

Barış Serim, Michiel Spapé, Giulio Jacucci

Researchers increasingly explore deploying brain-computer interfaces (BCIs) for able-bodied users, with the motivation of accessing mental states more directly than allowed by existing body-mediated interaction. This motivation seems to contradict the long-standing HCI emphasis on embodiment, namely the general claim that the body is crucial for cognition. This paper addresses this apparent contradiction through a review of insights from embodied cognition and interaction. We first critically examine the recent interest in BCIs and identify the extent to which cognition in the brain is integrated with the wider body as a central concern for research. We then define the implications of an integrated view of cognition for interface design and evaluation. A counterintuitive conclusion we draw is that embodiment per se should not imply a preference for body movements over brain signals. Yet it can nevertheless guide research by 1) providing body-grounded explanations for BCI performance, 2) proposing evaluation considerations that modular views of cognition neglect, and 3) directly transferring its design insights to BCIs. We finally reflect on HCI's understanding of embodiment and identify the neural dimension of embodiment as hitherto overlooked.

arXiv Open Access 2021
Automated Symbolic and Numerical Testing of DLMF Formulae using Computer Algebra Systems

Howard S. Cohl, André Greiner-Petter, Moritz Schubotz

We have developed an automated procedure for symbolic and numerical testing of formulae extracted from the NIST Digital Library of Mathematical Functions (DLMF). For the NIST Digital Repository of Mathematical Formulae, we have developed conversion tools from semantic LaTeX to the Computer Algebra System (CAS) Maple, relying on Youssef's part-of-math tagger. We convert a test data subset of 4,078 semantic LaTeX DLMF formulae to the native CAS representation and then apply an automated scheme for symbolic and numerical testing and verification. Our framework is implemented using Java and Maple. We describe in detail the conversion process which is required so that the CAS can correctly interpret the mathematical representation of the formulae. We describe the improvement of the effectiveness of our automated scheme through incremental enhancement (making more precise) of the mathematical semantic markup for the formulae.

arXiv Open Access 2021
Weakly nonlocal Poisson brackets: tools, examples, computations

Matteo Casati, Paolo Lorenzoni, Daniele Valeri et al.

We implement an algorithm for the computation of the Schouten bracket of weakly nonlocal Hamiltonian operators in three different computer algebra systems: Maple, Reduce and Mathematica. This class of Hamiltonian operators encompasses almost all the examples coming from the theory of (1+1)-dimensional integrable evolutionary PDEs.

en math-ph, cs.SC
DOAJ Open Access 2020
IMPLEMENTATION OF AHP METHOD IN THE DECISION SUPPORT OF SELECTION OF STUDENT ACHIEVEMENT CASE STUDY: SENIOR HIGH SCHOOL

Ari Puspita, Yuyun Yuningsih, Hilda Amalia et al.

This research looks for the strengths of each student in order to identify high-achieving students. Academic and non-academic achievements can serve as benchmarks for finding the best students. The research method used in this study is the survey method: the questionnaire was distributed to one senior high school, targeting students, with observations made through the principal and teachers. This study uses the AHP (Analytical Hierarchy Process) method, assisted by the Expert Choice software, to make decisions regarding high-achieving students at senior high schools in the Tangerang area. The AHP process compares each student with the other candidates based on predetermined criteria, and also compares the criteria themselves to determine which criteria are more favored. The output of the Expert Choice software is a set of charts showing the top students, so that a final choice can be made by decision-makers.

Electronic computers. Computer science, Computer engineering. Computer hardware
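The core AHP computation referenced above, turning a pairwise comparison matrix into a priority (weight) vector and checking its consistency, can be sketched as follows (the comparison values are illustrative, not taken from the study):

```python
import numpy as np

def ahp_priorities(M):
    """Priority vector of a pairwise comparison matrix via the principal
    eigenvector, plus Saaty's consistency ratio (CR < 0.1 is acceptable)."""
    n = M.shape[0]
    vals, vecs = np.linalg.eig(M)
    i = np.argmax(vals.real)                    # principal eigenvalue
    w = np.abs(vecs[:, i].real)
    w /= w.sum()                                # normalize to a weight vector
    lam = vals[i].real
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty random index
    cr = ((lam - n) / (n - 1)) / ri if ri else 0.0
    return w, cr

# Illustrative criteria comparisons (Saaty 1-9 scale), e.g. academic
# achievement vs non-academic achievement vs attendance.
M = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
w, cr = ahp_priorities(M)
```

Tools such as Expert Choice automate exactly this computation, first over the criteria matrix and then over one candidate matrix per criterion, before aggregating into a final ranking.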
DOAJ Open Access 2020
Weak Tie Overlapping Community Detection Based on Time Interaction Bias Influence Propagation Model

XU Xiaoyuan, HUANG Li, LI Haibo

In order to improve the detection and recognition performance of weak tie overlapping communities, this paper proposes a community detection method based on a time interaction bias influence propagation model. The target function for the model segmentation of the community detection graph is designed, and the load balance of the processor is optimized by applying community structure, so as to improve the solution efficiency of the model. Based on the neighborhood edge density, the approximate active edge is redefined and an influence propagation model is established, which can confirm that the users have high interaction frequency and strong recognition performance for weak tie users. On this basis, a time interaction bias community detection method based on overlapping community detection is proposed. Experimental results show that the proposed method has high recognition accuracy and efficiency when detecting overlapping communities.

Computer engineering. Computer hardware, Computer software
arXiv Open Access 2020
Optimal Bayesian experimental design for subsurface flow problems

Alexander Tarakanov, Ahmed H. Elsheikh

Optimal Bayesian design techniques provide an estimate for the best parameters of an experiment in order to maximize the value of measurements prior to the actual collection of data. In other words, these techniques explore the space of possible observations and determine an experimental setup that produces maximum information about the system parameters on average. Generally, optimal Bayesian design formulations result in multiple high-dimensional integrals that are difficult to evaluate without incurring significant computational costs, as each integration point corresponds to solving a coupled system of partial differential equations. In the present work, we propose a novel approach for developing a polynomial chaos expansion (PCE) surrogate model for the design utility function. In particular, we demonstrate how the orthogonality of PCE basis polynomials can be utilized to replace the expensive integration over the space of possible observations by direct construction of a PCE approximation for the expected information gain. This technique enables the derivation of a reasonable-quality response surface for the targeted objective function with a computational budget comparable to several single-point evaluations. Therefore, the proposed technique dramatically reduces the overall cost of optimal Bayesian experimental design. We evaluate this alternative formulation utilizing PCE on a few numerical test cases with various levels of complexity to illustrate the computational advantages of the proposed approach.

en physics.comp-ph, cs.LG
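A polynomial chaos surrogate of the kind described above can be sketched by regressing model evaluations on probabilists' Hermite polynomials of a standard-normal input (a generic 1-D least-squares sketch; the paper's construction of the expected-information-gain surrogate is more involved, and the toy model below is an assumption):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def pce_fit(xi, y, degree=4):
    """Least-squares PCE coefficients in the Hermite (He) basis,
    orthogonal w.r.t. the standard normal input density."""
    Phi = hermevander(xi, degree)            # columns He_0 ... He_degree
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return coef

def pce_eval(coef, xi):
    return hermevander(xi, len(coef) - 1) @ coef

# Toy stand-in for an expensive model: smooth function of a N(0,1) parameter.
rng = np.random.default_rng(0)
xi = rng.normal(size=200)
y = np.sin(xi) + 0.5 * xi ** 2
coef = pce_fit(xi, y, degree=6)
err = np.max(np.abs(pce_eval(coef, xi) - y))
# By He-orthogonality, coef[0] approximates the mean of y over the input
# density, illustrating how PCE coefficients stand in for integrals.
```

This last property is the point exploited in the paper: moments of the surrogate fall out of the coefficients directly, so the nested integrations over possible observations need not be performed by brute force.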
arXiv Open Access 2020
Three Patterns to Support Empathy in Computer-Mediated Human Interaction

Michael J. Lyons, Daniel Kluender

We present three patterns for computer-mediated interaction which we discovered during the design and development of a platform for remote teaching and learning of kanji, the Chinese characters used in written Japanese. Our aim in developing this system was to provide a basis for embodiment in remote interaction, and in particular to support the experience of empathy by both teacher and student. From this study, the essential elements are abstracted and suggested as design patterns for other computer-mediated interaction systems.

en cs.HC, cs.MM

Page 35 of 425504