“I Wonder if my Years of Training and Expertise Will be Devalued by Machines”: Concerns About the Replacement of Medical Professionals by Artificial Intelligence
M. K. K. Rony, Mst. Rina Parvin, Md. Wahiduzzaman
et al.
Background The rapid integration of artificial intelligence (AI) into healthcare has raised concerns among healthcare professionals about the potential displacement of human medical professionals by AI technologies. However, the apprehensions and perspectives of healthcare workers regarding their potential replacement by AI remain poorly understood. Objective This qualitative research aimed to investigate healthcare workers’ concerns about artificial intelligence replacing medical professionals. Methods A descriptive and exploratory research design was employed, drawing upon the Technology Acceptance Model (TAM), Technology Threat Avoidance Theory, and Sociotechnical Systems Theory as theoretical frameworks. Participants were purposively sampled from various healthcare settings, representing a diverse range of roles and backgrounds. Data were collected through individual interviews and focus group discussions, followed by thematic analysis. Results The analysis revealed seven key themes reflecting healthcare workers’ concerns: job security and economic concerns; trust and acceptance of AI; ethical and moral dilemmas; quality of patient care; workforce role redefinition and training; patient–provider relationships; and healthcare policy and regulation. Conclusions This research underscores the multifaceted concerns of healthcare workers regarding the increasing role of AI in healthcare. Addressing job security, fostering trust, addressing ethical dilemmas, and redefining workforce roles are crucial factors to consider in the successful integration of AI into healthcare. Healthcare policy and regulation must be developed to guide this transformation while maintaining the quality of patient care and preserving patient–provider relationships. The study findings offer insights for policymakers and healthcare institutions to navigate the evolving landscape of AI in healthcare while addressing the concerns of healthcare professionals.
Impact of artificial intelligence adoption on students' academic performance in open and distance learning: A systematic literature review
M. D. Adewale, A. Azeta, A. Abayomi-alli
et al.
The role of artificial intelligence (AI) in education has been extensively studied, focusing on its ability to enhance learning and teaching processes. However, the precise impact of AI adoption on academic performance in open and distance learning (ODL) remains largely unexplored. This systematic literature review critically evaluates AI's impact on academic performance within ODL environments. Drawing from a curated selection of 64 papers from an initial pool of 700, spanning from 2017 to 2023 and sourced from Scopus, Google Scholar, and Web of Science, this study delves into the multifaceted role of AI in enhancing learning outcomes. The meta-analysis reveals a diverse methodological landscape: machine learning methods, employed in 29.69 % of the studies, stand out for their ability to predict academic achievement, which is matched in prevalence by classical statistical methods. Although less common at 3.13 %, hybrid methods are a burgeoning area of research, while a significant 40.63 % of works prioritise nonempirical methods, focusing on theoretical analysis and literature reviews. This investigation highlights the critical factors driving AI adoption in education and its tangible benefits for student performance. It identifies a crucial literature gap: the absence of a process-based framework designed to forecast AI's educational impacts with greater precision, especially across gender and regional lines. By proposing this framework, this study contributes to the academic discourse on AI in education. It underscores the urgent need for structured methodologies to navigate the challenges and opportunities of AI integration. This framework, aligned with UNESCO's 2030 educational objectives, promises to bridge educational divides, ensuring equitable access to quality education across diverse demographics. 
The findings advocate for future research to design, refine, and test such a framework, paving the way for more inclusive and effective educational technologies in ODL settings.
Landslide Disaster Vulnerability Assessment and Prediction Based on a Multi-Scale and Multi-Model Framework: Empirical Evidence from Yunnan Province, China
Li Xu, Shucheng Tan, Runyang Li
Against the backdrop of intensifying global climate change and expanding human encroachment into mountainous regions, landslides have increased markedly in both frequency and destructiveness, emerging as a key risk to socio-ecological security and development in mountain areas. Rigorous assessment and forward-looking prediction of landslide disaster vulnerability (LDV) are essential for targeted disaster risk reduction and regional sustainability. However, existing studies largely center on landslide susceptibility or risk, often overlooking the dynamic evolution of adaptive capacity within affected systems and its nonlinear responses across temporal and spatial scales, thereby obscuring the complex mechanisms underpinning LDV. To address this gap, we examine Yunnan Province, a landslide-prone region of China where intensified extreme rainfall and the expansion of human activities in recent years have exacerbated landslide risk. Drawing on the vulnerability scoping diagram (VSD), we construct an exposure–sensitivity–adaptive capacity assessment framework to characterize the spatiotemporal distribution of LDV during 2000–2020. We further develop a multi-model, multi-scale integrated prediction framework, benchmarking the predictive performance of four machine learning algorithms—backpropagation neural network (BPNN), support vector machine (SVM), random forest (RF), and XGBoost—across sample sizes ranging from 2500 to 360,000 to identify the optimal model–scale combination. From 2000 to 2020, LDV in Yunnan declined overall, exhibiting a spatial pattern of “higher in the northwest and lower in the southeast.” High-LDV areas decreased markedly, and sustained enhancement of adaptive capacity was the primary driver of the decline. 
At approximately the 90,000-cell grid scale, XGBoost performed best, robustly reproducing the observed spatiotemporal evolution and projecting continued declines in LDV during 2030–2050, albeit with decelerating improvement; low-LDV zones show phased fluctuations of “expansion followed by contraction”, whereas high-LDV zones continue to contract northwestward. The proposed multi-model, multi-scale fusion framework enhances the accuracy and robustness of LDV prediction, provides a scientific basis for precise disaster risk reduction strategies and resource optimization in Yunnan, and offers a quantitative reference for resilience building and policy design in analogous regions worldwide.
Wear Mechanisms and Service Life of PVD-Coated Collar Dies in Coin Minting
Srirod Sunchai, Kongphan Phanphong
Circulating coins remain indispensable in Thailand despite rapid digitalization, with the 5-baht denomination playing a critical role. Yet, collar dies used in its minting suffer frequent wear and premature failure, increasing production costs. This study systematically evaluated two cold-work tool steels – DIN 2379 (conventional) and Böhler K490 (powder metallurgy) – uncoated and coated with titanium nitride (TiN) or chromium nitride (CrN) via arc-PVD, under industrial high-speed minting. Testing included hardness and scratch adhesion, optical/SEM–EDS wear analysis, regression-based tool life prediction, quantitative coin quality assessment, and economic evaluation. Results demonstrated that substrate–coating synergy was decisive. TiN on DIN 2379 achieved the best overall performance, sustaining ~1.5 million coins with negligible defects, while uncoated DIN 2379 failed at ~0.67 million. CrN provided intermediate life (~0.95 million) but with lower adhesion stability. In contrast, K490 systems terminated early (~0.8–1.0 million coins) due to Cu–Ni debris accumulation, despite groove wear depths remaining below 4 μm. Regression models predicted >6 million coins for K490, but these values were invalid, highlighting the inadequacy of wear-only criteria. The novelty of this work lies in experimentally confirming, under industrial Thai minting conditions, that debris – not wear depth – governs die termination and product quality. TiN-coated DIN 2379 emerged as the most reliable and cost-effective option (~175,000 Baht per million coins), providing a framework for die management that integrates wear progression, debris effects, and economic efficiency.
Machine design and drawing, Engineering machinery, tools, and implements
Regeneration Efficiency Assessment and Predictive Comparison of Government-Led and Market-Driven Models in Historic Districts Via DID and XGBoost
Hong Ni, Jinliu Chen, Pengcheng Li
An Ensemble Method for Data Classification Using State-of-the-art Methodologies
Sujata Ray, Debasmita Pradhan, N. Ray
Over the past few decades, classification has consistently posed a significant computational challenge. This study presents an innovative ensemble classification model designed for data classification, drawing inspiration from Radial Basis Function, Extreme Learning Machine, Functional Linked Artificial Neural Network, and Artificial Neural Network approaches. The study involved experimenting with various combinations of ensemble methods to construct an ensemble classifier. Remarkably, implementing the ensemble model using Radial Basis Function Network (RBFN), Artificial Neural Network (ANN), Extreme Learning Machine (ELM), and Functional Linked Artificial Neural Network (FLANN) yielded superior results when tested on benchmark datasets. The accuracy of the ensemble method ranges from 80% to 98%, which is good performance considering the diverse datasets used.
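As a rough sketch of the ensemble idea (not the authors' implementation — ELM and FLANN have no standard scikit-learn counterparts, so an MLP and an RBF-kernel SVM stand in for the neural and radial-basis members), heterogeneous base learners can be combined by soft voting:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Synthetic benchmark data standing in for the benchmark datasets of the paper.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Heterogeneous base learners combined by soft voting (probability averaging).
ensemble = VotingClassifier(
    estimators=[
        ("mlp", MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                              random_state=0)),
        ("rbf", SVC(kernel="rbf", probability=True, random_state=0)),
    ],
    voting="soft",
)
ensemble.fit(X_tr, y_tr)
acc = ensemble.score(X_te, y_te)
```

Soft voting averages the members' predicted class probabilities, which often smooths out individual models' errors compared with hard majority voting.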
Sociotechnical Transformation: A Systematic Review on the Impact of Artificial Intelligence on Society and Organizations
Marc Selgas-Cors
This article presents a systematic literature review (SLR), conducted in accordance with PRISMA 2020 guidelines, to explore how artificial intelligence (AI) is reshaping the architecture of sociotechnical systems. Drawing from 64 peer-reviewed Q1 publications published between 2023 and early 2025, the review distils four interwoven thematic domains: labor and organizational transformation, social inequality, surveillance and data governance, and the evolving dynamics of human–machine interaction and identity. These themes illuminate a crucial insight: AI is not merely optimizing processes or enhancing efficiency; it is recalibrating social hierarchies, reshaping epistemic authority, and redefining institutional accountability. The studies reviewed span a range of sectors, from credit scoring algorithms and automated hiring systems to predictive policing and AI-mediated educational platforms. What emerges is a consistent finding: these systems are far from neutral. They are entangled with cultural assumptions, political agendas, and economic imperatives that shape both their design and their deployment. To support transparency, the article includes a comprehensive metadata table that categorizes the 64 studies by topic, method, and publication source. Beyond synthesis, the review raises an urgent call for human-centered AI development, participatory design processes, and equitable governance frameworks that address the regulatory asymmetries between the Global North and South. In a world increasingly governed by algorithmic logic, these measures are not optional; they are foundational.
Human Capital for Industrial Innovation: Competence Development, Job Satisfaction and Entrepreneurship at a Mining Institute
Malec Małgorzata, Stańczak Lilianna
The authors draw on their scientific and professional experience in project management and management policy at a research organization, presenting the results of a KOMAG employees’ job satisfaction survey and of two projects conducted in 2022 and 2023 within the framework of the European Fund for Just Transformation. The survey results concerning the KOMAG employees’ job satisfaction revealed an urgent need to improve competences and qualifications. It was particularly interesting to investigate the employees’ level of engagement in trainings and courses and their impact on improving competences and developing professional careers. The literature lacks an analysis of the role of employees’ job satisfaction, competences and entrepreneurship in the management processes at research organizations. The results of the two projects, carried out as components of continuing education and job crafting, can serve as guidelines and recommendations not only for research organizations but also for industrial companies, as they are interdisciplinary and general in character. The article aims to gain knowledge of the specific requirements and expectations of researchers employed at institutes; so far, this aspect has not been investigated in the available literature. The results of the two projects described in the article are in line with the job crafting policy at the KOMAG Institute, and they show explicitly that developing entrepreneurship, competences and job satisfaction is an important component of managerial strategy.
Machine design and drawing, Engineering machinery, tools, and implements
Machine Learning
Javier M. Duarte, Uros Seljak, Kazu Terao
This chapter gives an overview of the core concepts of machine learning (ML) -- the use of algorithms that learn from data, identify patterns, and make predictions or decisions without being explicitly programmed -- that are relevant to particle physics with some examples of applications to the energy, intensity, cosmic, and accelerator frontiers.
physics.data-an, hep-ex
Performance Estimation in Binary Classification Using Calibrated Confidence
Juhani Kivimäki, Jakub Białek, Wojtek Kuberski
et al.
Model monitoring is a critical component of the machine learning lifecycle, safeguarding against undetected drops in the model's performance after deployment. Traditionally, performance monitoring has required access to ground truth labels, which are not always readily available. This can result in unacceptable latency or render performance monitoring altogether impossible. Recently, methods designed to estimate the accuracy of classifier models without access to labels have shown promising results. However, there are various other metrics that might be more suitable for assessing model performance in many cases. Until now, none of these important metrics has received similar interest from the scientific community. In this work, we address this gap by presenting CBPE, a novel method that can estimate any binary classification metric defined using the confusion matrix. In particular, we choose four metrics from this large family: accuracy, precision, recall, and F$_1$, to demonstrate our method. CBPE treats the elements of the confusion matrix as random variables and leverages calibrated confidence scores of the model to estimate their distributions. The desired metric is then also treated as a random variable, whose full probability distribution can be derived from the estimated confusion matrix. CBPE is shown to produce estimates that come with strong theoretical guarantees and valid confidence intervals.
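The core idea can be sketched as follows (an illustrative reconstruction, not the authors' code): each prediction's correctness is treated as a Bernoulli variable whose success probability is the calibrated confidence, and Monte Carlo sampling of the resulting confusion-matrix entries yields a full distribution for any derived metric — here, precision:

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_precision(y_pred, conf, n_draws=5000):
    """Monte-Carlo estimate of the precision distribution from calibrated
    confidences alone: conf[i] = P(y_pred[i] is correct), so the correctness
    of each prediction is modeled as a Bernoulli draw -- no labels needed."""
    y_pred = np.asarray(y_pred)
    conf = np.asarray(conf)
    correct = rng.random((n_draws, len(conf))) < conf  # sampled correctness
    pos = y_pred == 1
    tp = correct[:, pos].sum(axis=1)        # predicted positive and correct
    fp = (~correct[:, pos]).sum(axis=1)     # predicted positive but wrong
    precision = tp / np.maximum(tp + fp, 1)
    return precision.mean(), np.quantile(precision, [0.025, 0.975])

# Toy predictions with their (assumed calibrated) confidence scores.
y_pred = [1, 1, 1, 0, 0, 1]
conf = [0.9, 0.8, 0.95, 0.7, 0.6, 0.85]
mean_prec, ci = estimate_precision(y_pred, conf)
```

The same sampled confusion matrices give recall, F1, or any other confusion-matrix metric, along with confidence intervals from the sample quantiles.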
Feeding Two Birds or Favoring One? Adequacy-Fluency Tradeoffs in Evaluation and Meta-Evaluation of Machine Translation
Behzad Shayegh, Jan-Thorsten Peter, David Vilar
et al.
We investigate the tradeoff between adequacy and fluency in machine translation. We show the severity of this tradeoff at the evaluation level and analyze where popular metrics fall within it. Essentially, current metrics generally lean toward adequacy, meaning that their scores correlate more strongly with the adequacy of translations than with fluency. More importantly, we find that this tradeoff also persists at the meta-evaluation level, and that the standard WMT meta-evaluation favors adequacy-oriented metrics over fluency-oriented ones. We show that this bias is partially attributed to the composition of the systems included in the meta-evaluation datasets. To control this bias, we propose a method that synthesizes translation systems in meta-evaluation. Our findings highlight the importance of understanding this tradeoff in meta-evaluation and its impact on metric rankings.
Augmented Physics: Creating Interactive and Embedded Physics Simulations from Static Textbook Diagrams
Aditya Gunturu, Yi Wen, Nandi Zhang
et al.
We introduce Augmented Physics, a machine learning-integrated authoring tool designed for creating embedded interactive physics simulations from static textbook diagrams. Leveraging recent advancements in computer vision, such as Segment Anything and Multi-modal LLMs, our web-based system enables users to semi-automatically extract diagrams from physics textbooks and generate interactive simulations based on the extracted content. These interactive diagrams are seamlessly integrated into scanned textbook pages, facilitating interactive and personalized learning experiences across various physics concepts, such as optics, circuits, and kinematics. Drawing from an elicitation study with seven physics instructors, we explore four key augmentation strategies: 1) augmented experiments, 2) animated diagrams, 3) bi-directional binding, and 4) parameter visualization. We evaluate our system through technical evaluation, a usability study (N=12), and expert interviews (N=12). Study findings suggest that our system can facilitate more engaging and personalized learning experiences in physics education.
32 citations
Computer Science
Seventh-Degree Polynomial-Based Single Lane Change Trajectory Planning and Four-Wheel Steering Model Predictive Tracking Control for Intelligent Vehicles
Fei Lai, Chaoqun Huang
Single lane changing is one of the typical scenarios in vehicle driving. Planning a suitable single lane changing trajectory and tracking that trajectory accurately is very important for intelligent vehicles. The contribution of this study is twofold: (i) to plan lane change trajectories that cater to different driving styles (including aspects such as safety, efficiency, comfort, and balanced performance) by a 7th-degree polynomial; and (ii) to track the predefined trajectory by model predictive control (MPC) through four-wheel steering. The growing complexity of autonomous driving systems requires precise and comfortable trajectory planning and tracking. While 5th-degree polynomials are commonly used for single-lane change maneuvers, they may fail to adequately address lateral jerk, resulting in less comfortable trajectories. The main challenges are: (i) trajectory planning and (ii) trajectory tracking. Front-wheel steering MPC, although widely used, struggles to accurately track trajectories from point mass models, especially when considering vehicle dynamics, leading to excessive lateral jerk. To address these issues, we propose a novel approach combining: (i) 7th-degree polynomial trajectory planning, which provides better control over lateral jerk for smoother and more comfortable maneuvers, and (ii) four-wheel steering MPC, which offers superior maneuverability and control compared to front-wheel steering, allowing for more precise trajectory tracking. Extensive MATLAB/Simulink simulations demonstrate the effectiveness of our approach, showing improved comfort and tracking performance. Key findings include: (i) improved trajectory tracking: Four-wheel steering MPC outperforms front-wheel steering in accurately following desired trajectories, especially when considering vehicle dynamics. (ii) better ride comfort: 7th-degree polynomial trajectories, with improved control over lateral jerk, result in a smoother driving experience. 
Combining these two techniques enables safer, more efficient, and more comfortable autonomous driving.
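The planning step can be illustrated with a generic sketch (not the authors' code): a 7th-degree polynomial has eight coefficients, exactly enough to impose position, velocity, acceleration, and jerk constraints at both endpoints of the lane change — one more derivative than the common quintic can control:

```python
import numpy as np

def lane_change_poly7(T, y_end, y_start=0.0):
    """Coefficients a0..a7 of y(t) = sum(a_k * t**k) with zero lateral
    velocity, acceleration, and jerk at both endpoints -- eight boundary
    conditions matching the eight degrees of freedom."""
    def rows(t):
        r = np.zeros((4, 8))
        for k in range(8):
            r[0, k] = t**k                                           # position
            if k >= 1: r[1, k] = k * t**(k - 1)                      # velocity
            if k >= 2: r[2, k] = k * (k - 1) * t**(k - 2)            # acceleration
            if k >= 3: r[3, k] = k * (k - 1) * (k - 2) * t**(k - 3)  # jerk
        return r
    A = np.vstack([rows(0.0), rows(float(T))])
    b = np.array([y_start, 0, 0, 0, y_end, 0, 0, 0], dtype=float)
    return np.linalg.solve(A, b)

coeffs = lane_change_poly7(T=4.0, y_end=3.5)   # 3.5 m lateral shift in 4 s
y_mid = sum(c * 2.0**k for k, c in enumerate(coeffs))  # midpoint position
```

Because the boundary conditions are symmetric, the resulting S-curve passes through half the lateral offset at mid-maneuver, with jerk held to zero at both ends.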
Mechanical engineering and machinery, Machine design and drawing
Tactile sensor-less fingertip contact detection and force estimation for stable grasping with an under-actuated hand
Ha Thang Long Doan, Hikaru Arita, Kenji Tahara
Detecting contact when fingers are approaching an object and estimating the magnitude of the force the fingers are exerting on the object after contact are important tasks for a multi-fingered robotic hand to stably grasp objects. However, for a linkage-based under-actuated robotic hand with a self-locking mechanism to realize stable grasping without using external sensors, such tasks are difficult to perform when only analyzing the robot model or only applying data-driven methods. Therefore, in this paper, a hybrid of previous approaches is used to find a solution for realizing stable grasping with an under-actuated hand. First, data from the internal sensors of a robotic hand are collected during its operation. Subsequently, using the robot model to analyze the collected data, the differences between the model and real data are explained. From the analysis, novel data-driven algorithms, which can overcome the noted challenges to detect contact between a fingertip and the object and to estimate the fingertip forces in real time, are introduced. The proposed methods are finally used in a stable grasp controller to control a triple-fingered under-actuated robotic hand to perform stable grasping. The results of the experiments are analyzed to show that the proposed algorithms work well for this task and can be further developed for other future dexterous manipulation tasks.
Technology, Mechanical engineering and machinery
Workplace hazards and safety practices in the small-scale industries
ElSharkawy Mahmoud Fathy
The small-scale industries are considered a major sector of economic investment in the world. Small-scale industries typically suffer from problems such as poor management systems, poor safety training, difficulties in complying with legislation, and weak safety performance. This study aimed to measure the levels of heat stress and noise and to assess the safety performance in small-scale industries. Twenty industrial workshops were selected representing four different types of small-scale industries (foundries, automotive repair, metal processing, and aluminium processing) in Alexandria, Egypt. Inside each selected workshop, both levels of heat stress and noise were measured by calibrated instruments. A pre-designed checklist evaluated the adequacy of the safety performance. Noise levels ranged between 86.4 ± 2.0 and 89.7 ± 2.7 dB, exceeding the recommended value (85 dB). In most of the studied workshops, the levels of heat stress were relatively high, especially in the foundries. Moreover, the safety practices at all these workshops were poor or very poor. The most obvious safety problems included poor housekeeping, lack of personal protective equipment (PPE), inadequate illumination, absence of emergency exits, and insufficient fire extinguishers. The results emphasize the responsibility of the local authorities to pay more attention to this type of industry.
Machine design and drawing, Engineering machinery, tools, and implements
Emerging Transportation Safety and Operations: Practical Perspectives
Deogratias Eustace
Improving transportation traffic safety and operations is a global priority, with efforts focusing on both technological advancements and strategic planning [...]
Mechanical engineering and machinery, Machine design and drawing
Thermal Management of Lithium-Ion Battery Pack Using Equivalent Circuit Model
Muthukrishnan Kaliaperumal, Ramesh Kumar Chidambaram
The design of an efficient thermal management system for a lithium-ion battery pack hinges on a deep understanding of the cells’ thermal behavior. This understanding can be gained through theoretical or experimental methods. While the theoretical study of the cells using electrochemical and numerical methods requires expensive computing facilities and time, the Equivalent Circuit Model (ECM) offers a more direct approach. However, upfront experimental cell characterization is needed to determine the ECM parameters. In this study, the behavior of a cell is characterized experimentally, and the results are used to build a second-order equivalent electrical circuit model of the cell. This model is then integrated with the cooling system of the battery pack for effective thermal management. The Equivalent Circuit Model estimates the internal heat generation inside the cell using instantaneous load current, terminal voltage, and temperature data. By extrapolating the heat generation data of a single cell, we can determine the heat generation of the cells in the pack. With the implementation of the ECM in the cooling system, the coolant flow rate can be adjusted to ensure the attainment of a safe operating cell temperature. Our study confirms that pumping power can be reduced by 14% compared to the conventional constant-flow-rate cooling system, while still maintaining the temperature of the cells within safe limits.
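A minimal sketch of how a second-order ECM yields a heat-generation estimate (the parameter values below are illustrative placeholders, not the characterized cell from the study): the ohmic resistance plus two RC branches give the overpotential, and heat is current times overpotential:

```python
def simulate_ecm_heat(current, dt=1.0, ocv=3.7,
                      r0=0.02, r1=0.015, c1=2000.0, r2=0.01, c2=20000.0):
    """Second-order RC equivalent-circuit model: estimate per-step heat
    generation (W) from the load current.  r0 is the ohmic resistance;
    (r1, c1) and (r2, c2) are fast and slow polarization branches.
    All parameter values are illustrative, not from the paper."""
    v1 = v2 = 0.0
    heat = []
    for i in current:
        v1 += dt * (-v1 / (r1 * c1) + i / c1)   # first RC branch (fast)
        v2 += dt * (-v2 / (r2 * c2) + i / c2)   # second RC branch (slow)
        v_term = ocv - i * r0 - v1 - v2         # terminal voltage
        heat.append(i * (ocv - v_term))         # current x total overpotential
    return heat

q = simulate_ecm_heat([5.0] * 60)   # 5 A discharge held for 60 s
```

A cooling controller can then scale the coolant flow rate with this estimated heat load instead of running the pump at a constant rate.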
Mechanical engineering and machinery, Machine design and drawing
Distributed and Secure Kernel-Based Quantum Machine Learning
Arjhun Swaminathan, Mete Akgün
Quantum computing promises to revolutionize machine learning, offering significant efficiency gains in tasks such as clustering and distance estimation. Additionally, it provides enhanced security through fundamental principles like the measurement postulate and the no-cloning theorem, enabling secure protocols such as quantum teleportation and quantum key distribution. While advancements in secure quantum machine learning are notable, the development of secure and distributed quantum analogues of kernel-based machine learning techniques remains underexplored. In this work, we present a novel approach for securely computing common kernels, including polynomial, radial basis function (RBF), and Laplacian kernels, when data is distributed, using quantum feature maps. Our methodology introduces a robust framework that leverages quantum teleportation to ensure secure and distributed kernel learning. The proposed architecture is validated using IBM's Qiskit Aer Simulator on various public datasets.
Explicit and data-Efficient Encoding via Gradient Flow
Kyriakos Flouris, Anna Volokitin, Gustav Bredell
et al.
The autoencoder model typically uses an encoder to map data to a lower dimensional latent space and a decoder to reconstruct it. However, relying on an encoder for inversion can lead to suboptimal representations, particularly limiting in physical sciences where precision is key. We introduce a decoder-only method using gradient flow to directly encode data into the latent space, defined by ordinary differential equations (ODEs). This approach eliminates the need for approximate encoder inversion. We train the decoder via the adjoint method and show that costly integrals can be avoided with minimal accuracy loss. Additionally, we propose a second-order ODE variant, approximating Nesterov's accelerated gradient descent for faster convergence. To handle stiff ODEs, we use an adaptive solver that prioritizes loss minimization, improving robustness. Compared to traditional autoencoders, our method demonstrates explicit encoding and superior data efficiency, which is crucial for data-scarce scenarios in the physical sciences. Furthermore, this work paves the way for integrating machine learning into scientific workflows, where precise and efficient encoding is critical. The code for this work is available at https://github.com/k-flouris/gfe.
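The encoding-by-gradient-flow idea can be sketched in a toy setting (a fixed linear decoder stands in for a trained network; real use would replace it with the neural decoder and the explicit Euler steps with an adaptive ODE solver, as the abstract describes):

```python
import numpy as np

rng = np.random.default_rng(0)
D = rng.standard_normal((10, 3))    # toy linear decoder: latent 3 -> data 10

def encode_by_gradient_flow(x, steps=2000, lr=0.01):
    """Recover the latent code without an encoder by integrating the
    gradient flow dz/dt = -grad_z ||D z - x||^2 with explicit Euler steps."""
    z = np.zeros(D.shape[1])
    for _ in range(steps):
        z -= lr * 2.0 * D.T @ (D @ z - x)   # one Euler step along the flow
    return z

z_true = rng.standard_normal(3)
x = D @ z_true                      # synthetic observation
z_hat = encode_by_gradient_flow(x)
err = np.linalg.norm(D @ z_hat - x)
```

Because the latent code is found by minimizing the reconstruction loss directly, no approximate encoder inversion is involved, which is the precision advantage the abstract emphasizes.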
Rethinking the Public Space Design Process Using Extended Reality as a Game Changer for 3D Co-Design
Mario Matthys, Laure De Cock, L. Mertens
et al.
Public space design processes are complex. Numerous preconditions and the involvement of stakeholders impede rapid decision making. Two-dimensional drawings remain the norm, although these are difficult for citizen stakeholders to understand. Public space designers rarely use 3D city models, infrastructure building information modeling, digital twins, or extended reality. Usually, 3D images (without animation) are only rendered after decision making for communication purposes. This study consists of an online questionnaire of 102 stakeholders from the Flemish region (Belgium), gauging the appeal of and resistance to the use of 3D and extended reality in public space design processes. In a follow-up experiment, 37 participants evaluated various graphic techniques by their designs and observations. The questionnaire showed that all stakeholders lack experience with the use of virtual reality in design processes. We found that both non-designer stakeholders and designers indicated that virtual reality and interactive online 3D tools built on game engines provided better insight into communication and design. Reusing 3D designs in cycling simulators during the design process results in cost-effective quality optimization, and integration into digital twins or animated spatial time machines paves the way for hybrid, 4D cities. Extended reality supports 3D co-design that has simplicity and clarity from the outset of the design process, a trait that makes it a game changer.