A Course on the Introduction to Quantum Software Engineering: Experience Report
Andriy Miranskyy
Quantum computing is increasingly practiced through programming, yet most educational offerings emphasize algorithmic or framework-level use rather than software engineering concerns such as testing, abstraction, tooling, and lifecycle management. This paper reports on the design and first offering of a cross-listed undergraduate–graduate course that frames quantum computing through a software engineering lens, focusing on early-stage competence relevant to software engineering practice. The course integrates foundational quantum concepts with software engineering perspectives, emphasizing executable artifacts, empirical reasoning, and trade-offs arising from probabilistic behaviour, noise, and evolving toolchains. Evidence is drawn from instructor observations, student feedback, surveys, and analysis of student work. Despite minimal prior exposure to quantum computing, students were able to engage productively with quantum software engineering topics once a foundational understanding of quantum information and quantum algorithms, expressed through executable artifacts, was established. This experience report contributes a modular course design, a scalable assessment model for mixed academic levels, and transferable lessons for software engineering educators developing quantum computing curricula.
Methods for Processing Signal Conversion in Velocity and Acceleration Measurement Considering Transducer Characteristics
Sergii Filonenko, Anzhelika Stakhova
This study presents an innovative approach to processing vibration signals in bridge structures, with a focus on enhancing the accuracy of dynamic response measurements and structural health assessments. It addresses key challenges in signal processing, particularly the uncertainties in selecting filtering parameters for isolating dynamic components from static displacements. A novel method for adaptive filter parameter selection is proposed, which considers variations in resonant frequencies and the non-linearity of quasi-static displacements caused by moving loads. This approach significantly reduces errors in determining forced and natural vibration parameters, leading to more accurate assessments of the bridge’s mechanical characteristics. The study introduces an optimized algorithm for processing acceleration and velocity signals, improving the resolution of natural frequency identification. This method combines traditional Fast Fourier Transform (FFT) techniques with an innovative spectral analysis approach, enabling precise identification of resonant frequencies and damping coefficients. A comprehensive evaluation framework is developed, integrating vibration amplitude, frequency, and damping ratio analyses. This framework enhances structural health assessments, improving the detection and characterization of potential defects and changes in load-bearing capacity. The practical significance of this research lies in its real-world application to bridge diagnostics. The study provides guidelines for sensor selection and configuration, adapted for various bridge types and sizes. The proposed methods demonstrate notable improvements in dynamic coefficient determination and overall structural assessments, offering the potential to reduce maintenance costs and enhance bridge safety.
Engineering machinery, tools, and implements
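The resonant-frequency and damping identification described above can be illustrated with a generic FFT peak-picking and half-power-bandwidth sketch; this is not the authors' algorithm, and the signal parameters below (a 4 Hz mode with 2% damping) are assumed for illustration:

```python
import numpy as np

def natural_frequency_and_damping(signal, fs):
    """Estimate the dominant natural frequency (Hz) and damping ratio
    from a free-decay vibration record, using FFT peak picking and the
    half-power (-3 dB) bandwidth method."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    peak = int(np.argmax(spectrum))
    f_n = freqs[peak]
    half_power = spectrum[peak] / np.sqrt(2.0)
    lo, hi = peak, peak
    while lo > 0 and spectrum[lo] > half_power:
        lo -= 1                      # walk down to the lower -3 dB point
    while hi < len(spectrum) - 1 and spectrum[hi] > half_power:
        hi += 1                      # walk up to the upper -3 dB point
    zeta = (freqs[hi] - freqs[lo]) / (2.0 * f_n)
    return f_n, zeta

# Synthetic free decay: a 4 Hz mode with 2% damping, sampled at 200 Hz.
fs = 200.0
t = np.arange(0.0, 30.0, 1.0 / fs)
x = np.exp(-2 * np.pi * 4.0 * 0.02 * t) * np.sin(2 * np.pi * 4.0 * t)
f_est, zeta_est = natural_frequency_and_damping(x, fs)
```

The half-power estimate is accurate only to within the spectral bin width, which is one reason adaptive parameter selection matters for closely spaced bridge modes.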
TACLA: An LLM-Based Multi-Agent Tool for Transactional Analysis Training in Education
Monika Zamojska, Jarosław A. Chudziak
Simulating nuanced human social dynamics with Large Language Models (LLMs) remains a significant challenge, particularly in achieving psychological depth and consistent persona behavior crucial for high-fidelity training tools. This paper introduces TACLA (Transactional Analysis Contextual LLM-based Agents), a novel Multi-Agent architecture designed to overcome these limitations. TACLA integrates core principles of Transactional Analysis (TA) by modeling agents as an orchestrated system of distinct Parent, Adult, and Child ego states, each with its own pattern memory. An Orchestrator Agent prioritizes ego state activation based on contextual triggers and an agent's life script, ensuring psychologically authentic responses. Validated in an educational scenario, TACLA demonstrates realistic ego state shifts in Student Agents, effectively modeling conflict de-escalation and escalation based on different teacher intervention strategies. Evaluation shows high conversational credibility and confirms TACLA's capacity to create dynamic, psychologically-grounded social simulations, advancing the development of effective AI tools for education and beyond.
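As a rough illustration of the orchestration idea, hypothetical and not TACLA's actual implementation (the trigger words, scoring rule, and life-script bias are all invented):

```python
from dataclasses import dataclass, field

@dataclass
class EgoState:
    name: str                    # "Parent", "Adult", or "Child"
    triggers: set                # contextual cues that activate this state
    memory: list = field(default_factory=list)   # the state's pattern memory

@dataclass
class Orchestrator:
    states: list
    life_script: dict            # per-state bias from the agent's life script

    def activate(self, context):
        """Score each ego state by matched triggers plus life-script bias
        and activate the highest-scoring one (Adult wins ties)."""
        words = set(context.lower().split())
        def score(s):
            return (len(s.triggers & words) + self.life_script.get(s.name, 0.0),
                    s.name == "Adult")
        best = max(self.states, key=score)
        best.memory.append(context)              # update pattern memory
        return best

student = Orchestrator(
    states=[EgoState("Parent", {"should", "must", "rules"}),
            EgoState("Adult", {"why", "how", "explain"}),
            EgoState("Child", {"unfair", "boring", "hate"})],
    life_script={"Child": 0.5},                  # e.g. a rebellious script
)
state = student.activate("this homework is unfair and boring")
```

In the full system each ego state would be an LLM agent; the sketch only shows how contextual triggers and a life-script bias could jointly select which state responds.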
Design of a Microprocessors and Microcontrollers Laboratory Course Addressing Complex Engineering Problems and Activities
Fahim Hafiz, Md Jahidul Hoq Emon, Md Abid Hossain
et al.
This paper proposes a novel curriculum for the microprocessors and microcontrollers laboratory course. The proposed curriculum blends structured laboratory experiments with an open-ended project phase, addressing complex engineering problems and activities. Microprocessors and microcontrollers are ubiquitous in modern technology, driving applications across diverse fields. To prepare future engineers for Industry 4.0, effective educational approaches are crucial. The proposed lab enables students to perform hands-on experiments using advanced microprocessors and microcontrollers while leveraging their acquired knowledge by working in teams to tackle self-defined complex engineering problems that utilize these devices and sensors that are often used in industry. Furthermore, this curriculum fosters multidisciplinary learning and equips students with problem-solving skills that can be applied in real-world scenarios. With recent technological advancements, traditional microprocessors and microcontrollers curricula often fail to capture the complexity of real-world applications. This curriculum addresses this critical gap by incorporating insights from experts in both industry and academia. It provides students with the necessary skills and knowledge to thrive in this rapidly evolving technological landscape, preparing them for success upon graduation. The curriculum integrates project-based learning, where students define complex engineering problems for themselves. This approach actively engages students, fostering a deeper understanding and enhancing their learning capabilities. Statistical analysis shows that the proposed curriculum significantly improves student learning outcomes, particularly in their ability to formulate and solve complex engineering problems, as well as engage in complex engineering activities.
Software Engineering as a Domain to Formalize
Bertrand Meyer
Software engineering concepts and processes are worthy of formal study; and yet we seldom formalize them. This "research ideas" article explores what a theory of software engineering could and should look like. Software engineering research has developed formal techniques of specification and verification as an application of mathematics to specify and verify systems addressing needs of various application domains. These domains usually do not include the domain of software engineering itself. It is, however, a rich domain with many processes and properties that cry for formalization and potential verification. This article outlines the structure of a possible theory of software engineering in the form of an object-oriented model, isolating abstractions corresponding to fundamental software concepts of project, milestone, code module, test and other staples of our field, and their mutual relationships. While the presentation is only a sketch of the full theory, it provides a set of guidelines for how a comprehensive and practical Theory of Software Engineering should (through an open-source community effort) be developed.
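A minimal sketch of what such an object-oriented model might look like; this is an illustration of the idea, not Meyer's actual theory, and the classes and the verification property are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class CodeModule:
    name: str

@dataclass
class Test:
    module: CodeModule
    passed: bool = False

@dataclass
class Milestone:
    name: str
    deliverables: list = field(default_factory=list)

@dataclass
class Project:
    name: str
    milestones: list = field(default_factory=list)
    tests: list = field(default_factory=list)

    def milestone_verified(self, milestone):
        """A milestone is verified when every deliverable module has at
        least one passing test: the kind of property a formal theory of
        software engineering could state and prove."""
        return all(any(t.module is mod and t.passed for t in self.tests)
                   for mod in milestone.deliverables)

parser = CodeModule("parser")
m1 = Milestone("M1", [parser])
proj = Project("demo", [m1], [Test(parser, passed=True)])
```

The point is that once project, milestone, module, and test are first-class abstractions with stated relationships, process properties become checkable rather than implicit.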
Vision-Proprioception Fusion with Mamba2 in End-to-End Reinforcement Learning for Motion Control
Xiaowen Tao, Yinuo Wang, Jinzhao Zhou
End-to-end reinforcement learning (RL) for motion control trains policies directly from sensor inputs to motor commands, enabling unified controllers for different robots and tasks. However, most existing methods are either blind (proprioception-only) or rely on fusion backbones with unfavorable compute-memory trade-offs. Recurrent controllers struggle with long-horizon credit assignment, and Transformer-based fusion incurs quadratic cost in token length, limiting temporal and spatial context. We present a vision-driven cross-modal RL framework built on SSD-Mamba2, a selective state-space backbone that applies state-space duality (SSD) to enable both recurrent and convolutional scanning with hardware-aware streaming and near-linear scaling. Proprioceptive states and exteroceptive observations (e.g., depth tokens) are encoded into compact tokens and fused by stacked SSD-Mamba2 layers. The selective state-space updates retain long-range dependencies with markedly lower latency and memory use than quadratic self-attention, enabling longer look-ahead, higher token resolution, and stable training under limited compute. Policies are trained end-to-end under curricula that randomize terrain and appearance and progressively increase scene complexity. A compact, state-centric reward balances task progress, energy efficiency, and safety. Across diverse motion-control scenarios, our approach consistently surpasses strong state-of-the-art baselines in return, safety (collisions and falls), and sample efficiency, while converging faster at the same compute budget. These results suggest that SSD-Mamba2 provides a practical fusion backbone for resource-constrained robotic and autonomous systems in engineering informatics applications.
Development of a Low-Cost Automated Injection Molding Device for Sustainable Plastic Recycling and Circular Economy Applications
Ananta Sinchai, Kunthorn Boonyang, Thanakorn Simmala
In response to the critical demand for innovative solutions to tackle plastic pollution, this research presents a low-cost, fully automated plastic injection molding system designed to convert waste into sustainable products. Constructed entirely from repurposed materials, the apparatus focuses on processing high-density polyethylene (HDPE) efficiently without hydraulic components, thereby enhancing eco-friendliness and accessibility. Performance evaluations identified an optimal molding temperature of 200 °C, yielding consistent products with a minimal weight deviation of 4.17%. The key operational parameters included a motor speed of 525 RPM, a gear ratio of 1:30, and an inverter frequency of 105 Hz. Further tests showed that processing temperatures of 210 °C and 220 °C, with injection times of 15 to 35 s, yielded optimal surface finish and complete filling. The surface finish, assessed through image intensity variation, had a low coefficient of variation (≤5%), while computer vision evaluation confirmed the full filling of all specimens in this range. A laser-based overflow detection system minimized material waste, proving effective for small-scale community recycling. This study underscores the potential of low-cost automated systems to advance circular economy practices and enhance localized plastic waste management. Future research will focus on automation, temperature precision, material adaptability, and emissions management.
Engineering machinery, tools, and implements, Technological innovations. Automation
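The image-intensity coefficient of variation used above as a surface-finish criterion can be sketched generically; the synthetic image patches and how the ≤5% threshold is applied below are illustrative, not the authors' pipeline:

```python
import numpy as np

def intensity_cv(image):
    """Coefficient of variation (%) of pixel intensities; lower values
    indicate a more uniform surface finish."""
    pixels = np.asarray(image, dtype=float).ravel()
    return 100.0 * pixels.std() / pixels.mean()

rng = np.random.default_rng(0)
smooth = 128 + rng.normal(0.0, 3.0, size=(64, 64))   # nearly uniform patch
rough = 128 + rng.normal(0.0, 15.0, size=(64, 64))   # visibly uneven patch
```

A patch passing the criterion (`intensity_cv(smooth) <= 5.0`) would be accepted as a good finish, while the rough patch would be flagged.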
A Flushing Duration Model for a Campaign against Contamination in Water Distribution Systems
Hao Cao, Pu Li
Contamination poses a significant risk to public health by degrading water quality in water distribution systems (WDSs). As one of the key tasks of a response strategy to contamination incidents in a WDS, pipe system flushing has been widely implemented in practice. However, due to the complexity of the network structure and chemical reactions within the pipe system, determining the flushing duration is still one of the significant challenges for a given network. To address this problem, a model for determining the flushing duration is developed. This model is based on calculating the traveling trajectory of the contaminant inside the network. This is carried out by discretizing the one-dimensional advection equation and calculating the variation of the contaminant concentration from one segment to another over time. As a preliminary study, we focus on simplified scenarios where contaminants exhibit no chemical reaction within the WDS. The proposed model is applied and analyzed through a simulation study and a laboratory testbed. The results demonstrate the efficacy of the model for determining flushing duration, which can offer valuable insights for real-world applications and serve as a crucial reference for water utility companies.
Engineering machinery, tools, and implements
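The segment-by-segment transport idea can be sketched with a first-order upwind discretization of the one-dimensional advection equation for a single pipe; this is a simplified illustration, not the paper's network model, and the pipe length, velocity, and threshold are assumed:

```python
import numpy as np

def flushing_duration(length, velocity, c0, threshold, n_seg=200):
    """Track a conservative (non-reacting) contaminant through one pipe
    using a first-order upwind discretization of the 1-D advection
    equation dc/dt + v*dc/dx = 0, and return the time until the
    concentration everywhere falls below `threshold`."""
    dx = length / n_seg
    dt = dx / velocity            # Courant number 1: the front moves exactly
    c = np.full(n_seg, float(c0)) # pipe initially filled with contaminant
    t = 0.0
    while c.max() >= threshold:
        c[1:] = c[:-1]            # advect one segment downstream per step
        c[0] = 0.0                # clean water enters at the upstream node
        t += dt
    return t

# A 100 m pipe flushed at 0.5 m/s should clear in length/velocity = 200 s.
t_flush = flushing_duration(length=100.0, velocity=0.5, c0=1.0, threshold=0.01)
```

With a Courant number of one the scheme transports the clean-water front without numerical diffusion; in a full network the same bookkeeping must be done per pipe and combined at junctions.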
Stochastic Insights into Leakage Dynamics across Diverse Pipe Materials in Water Distribution Systems
Soheila Beygi, Jakobus E. van Zyl, Brendon Harkness
This study developed more realistic stochastic models of pipe failures that incorporate leak types and dimensions based on different pipe materials. The distributions of pipe failure types and properties were identified by analysing photographic records of failed pipes in Auckland, New Zealand. A stochastic model generated leaks in a typical district metered area (DMA) consisting of different pipe materials, up to different Infrastructure Leakage Index (ILI) levels. After analysing 100 networks for each scenario, the study observed that different pipe materials had distinct leakage exponent distributions. This study provides a tool for better understanding leakage behaviour in different pipe materials and evaluating methods for better water loss management.
Engineering machinery, tools, and implements
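A minimal Monte Carlo sketch of how material-specific leak populations translate into a network-level leakage exponent; the exponent distributions and leak coefficients below are invented placeholders, not the Auckland data:

```python
import numpy as np

# Hypothetical per-material leakage-exponent distributions (mean, std).
# Real values would come from the photographic failure records.
MATERIAL_N1 = {"PVC": (1.3, 0.25), "cast_iron": (0.7, 0.15)}

def network_leakage_exponent(materials, n_leaks=1000, h1=30.0, h2=50.0, seed=1):
    """Generate a random leak population, evaluate total leakage
    q = sum(c * h**N1) at two pressure heads h1 and h2 (m), and back
    out the effective network-level exponent from the two points."""
    rng = np.random.default_rng(seed)
    mats = rng.choice(materials, size=n_leaks)
    n1 = np.array([rng.normal(*MATERIAL_N1[m]) for m in mats])
    c = rng.lognormal(mean=-7.0, sigma=0.5, size=n_leaks)  # leak coefficients
    q1, q2 = (c * h1**n1).sum(), (c * h2**n1).sum()
    return float(np.log(q2 / q1) / np.log(h2 / h1))

n1_pvc = network_leakage_exponent(["PVC"])
n1_ci = network_leakage_exponent(["cast_iron"])
```

Because leaks with larger exponents contribute more flow at higher heads, the effective network exponent sits above the per-leak mean, which is one reason material-specific distributions matter for pressure management.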
Evaluation of Axial Flow Impeller Fabrication Process by Wire Arc Additive Manufacturing and Machining
Shinichiro Ejiri
An evaluation was conducted on the fabrication of an axial flow impeller by a hybrid system of wire arc additive manufacturing and machining. First, a four-bladed stainless steel axial flow impeller was fabricated to measure the amount of chips and the fabrication time. Next, axial flow impellers with different numbers of blades were designed and compared with those fabricated only by machining from a round bar. In both cases, the amount of chips was reduced by approximately 80% by using this system. On the other hand, increasing the number of blades reduced the difference in fabrication time, which became almost identical at six blades. In conclusion, the use of this system is an option from the viewpoint of reducing environmental impact; however, it is not necessarily advantageous in terms of fabrication time.
Engineering machinery, tools, and implements
Aligning Models with Their Realization through Model-based Systems Engineering
Lovis Justin Immanuel Zenz, Erik Heiland, Peter Hillmann
et al.
In this paper, we propose a method for aligning models with their realization through the application of model-based systems engineering. Our approach is divided into three steps. (1) Firstly, we leverage domain expertise and the Unified Architecture Framework to establish a reference model that fundamentally describes some domain. (2) Subsequently, we instantiate the reference model as specific models tailored to different scenarios within the domain. (3) Finally, we incorporate corresponding run logic directly into both the reference model and the specific models. In total, we thus provide a practical means to ensure that every implementation result is justified by business demand. We demonstrate our approach using the example of maritime object detection as a specific application (specific model / implementation element) of automatic target recognition as a service reoccurring in various forms (reference model element). Our approach facilitates a more seamless integration of models and implementation, fostering enhanced Business-IT alignment.
Abstraction Engineering
Nelly Bencomo, Jordi Cabot, Marsha Chechik
et al.
Modern software-based systems operate under rapidly changing conditions and face ever-increasing uncertainty. In response, systems are increasingly adaptive and reliant on artificial-intelligence methods. In addition to the ubiquity of software with respect to users and application areas (e.g., transportation, smart grids, medicine), these high-impact software systems necessarily draw from many disciplines for foundational principles, domain expertise, and workflows. Recent progress in lowering the barrier to entry for coding has led to a broader community of developers, who are not necessarily software engineers. As such, the field of software engineering needs to adapt accordingly and offer new methods to systematically develop high-quality software systems by a broad range of experts and non-experts. This paper looks at these new challenges and proposes to address them through the lens of Abstraction. Abstraction is already used across many disciplines involved in software development, from the time-honored classical deductive reasoning and formal modeling to the inductive reasoning employed by modern data science. The software engineering of the future requires Abstraction Engineering: a systematic approach to abstraction across the inductive and deductive spaces. We discuss the foundations of Abstraction Engineering, identify key challenges, highlight the research questions that help address these challenges, and create a roadmap for future research.
Towards an Engineering Discipline for Resilient Cyber-Physical Systems
Ricardo D. Caldas
Resilient cyber-physical systems are computing systems able to interact continuously with the physical environment in which they operate, despite runtime errors. The term resilience refers to the ability to cope with unexpected inputs while delivering correct service. Examples of resilient computing systems are Google's PageRank and the Bubblesort algorithm. Engineering for resilient cyber-physical systems requires a paradigm shift, prioritizing adaptability to dynamic environments. Software as a tool for self-management is a key instrument for dealing with uncertainty and embedding resilience in these systems. Yet, software engineers encounter the ongoing challenge of ensuring resilience despite dynamic environmental change. My thesis aims to pioneer an engineering discipline for resilient cyber-physical systems. Over four years, we conducted studies, built methods and tools, and delivered software packages and a website offering guidance to practitioners. This paper provides a condensed overview of the problems tackled, our methodology, key contributions, and result highlights. Seeking feedback from the community, this paper serves both as preparation for the thesis defense and as insight into future research prospects.
Chaos Engineering: A Multi-Vocal Literature Review
Joshua Owotogbe, Indika Kumara, Willem-Jan Van Den Heuvel
et al.
Organizations, particularly medium and large enterprises, typically rely heavily on complex, distributed systems to deliver critical services and products. However, the growing complexity of these systems poses challenges in ensuring service availability, performance, and reliability. Traditional resilience testing methods often fail to capture the intricate interactions and failure modes of modern systems. Chaos Engineering addresses these challenges by proactively testing how systems in production behave under turbulent conditions, allowing developers to uncover and resolve potential issues before they escalate into outages. Though chaos engineering has received growing attention from researchers and practitioners alike, we observed a lack of reviews that synthesize insights from both academic and grey literature. Hence, we conducted a Multivocal Literature Review (MLR) on chaos engineering to address this research gap by systematically analyzing 96 academic and grey literature sources published between January 2016 and April 2024. We first used the chosen sources to derive a unified definition of chaos engineering and to identify key functionalities, components, and adoption drivers. We also developed a taxonomy for chaos engineering platforms and compared the relevant tools using it. Finally, we analyzed the current state of chaos engineering research and identified several open research issues.
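The core chaos-engineering loop (establish a steady-state hypothesis, inject a fault, check whether the hypothesis still holds) can be sketched as follows; the toy service, retry behaviour, and tolerance are assumptions for illustration:

```python
import random

def run_chaos_experiment(system, fault, probe, tolerance, trials=200):
    """Minimal chaos-engineering loop: measure a steady-state baseline,
    inject a fault, then check whether the steady-state hypothesis
    (success rate within `tolerance` of baseline) still holds."""
    baseline = sum(probe(system) for _ in range(trials)) / trials
    fault(system)                                   # inject the failure
    degraded = sum(probe(system) for _ in range(trials)) / trials
    return {"baseline": baseline, "under_fault": degraded,
            "hypothesis_holds": baseline - degraded <= tolerance}

# Toy service: requests succeed unless a dependency is down, in which
# case a retry path succeeds 90% of the time.
random.seed(7)
service = {"dependency_up": True}

def probe(s):                                       # 1.0 = request succeeded
    if s["dependency_up"]:
        return 1.0
    return 1.0 if random.random() < 0.9 else 0.0

def kill_dependency(s):
    s["dependency_up"] = False

result = run_chaos_experiment(service, kill_dependency, probe, tolerance=0.2)
```

Production platforms add blast-radius control, abort conditions, and automated rollback around this loop; the sketch only shows the hypothesis-test structure the reviewed tools share.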
Real-Time Field Quality Management System for Asphalt Pavement Using Cloud
Kyu-Dong Jeong, Dong-Hyuk Kim, Jae-Won Kim
et al.
If the production and construction information of asphalt mixtures is tightly coupled and quality control is performed in real time, quality degradation can be minimized and problems can be solved early. To this end, a cloud-based IoT (Internet of Things) PQMS (Pavement Quality Management System) was developed in this study. With it, drivers and managers can monitor construction information and identify problems using monitors and apps. In 2023, the system will be applied to national road construction sites to verify its effectiveness and address potential problems.
Engineering machinery, tools, and implements
Prison Disaster Factors: A Case Study of Taipei Prison
Chi-Jan Huang, Ting-Yi Chiang, Wun-Wu Chen
Prisons have always been considered self-sufficient, and government disaster response plans at all levels rarely mention them. Prisons may face emergencies such as earthquakes, floods, fires, escapes, or riots, and many are located in areas with high disaster potential. If a prison is prepared, the safety of its inmates can be secured; if an emergency is not handled properly, society can be threatened. Through a literature review, the TELES and SESS earthquake loss estimation systems, and other methods, we identified three disaster risk factors: personnel risk, equipment risk, and management risk. On the facility-safety side, old buildings, aging prison walls, insufficient monitoring facilities, and insufficient prison space need to be taken into account. On the management side, the potential for disasters in adjacent areas, shortages of regional disaster-prevention and medical supplies, and weak connections with surrounding resources need to be addressed.
Engineering machinery, tools, and implements
Artificial Intelligence and Optimization Computing to Lead Energy Retrofit Programs in Complex Real Estate Investments
Aurora Greta Ruggeri, Laura Gabrielli, Massimiliano Scarpa
In order to plan and manage low-carbon investments across large real estate portfolios, in this research, a strategic approach is developed to act on building stocks as a whole, with the aim of overcoming the single-building perspective and identifying the energy retrofit level leading to the maximum possible benefit. It is shown how artificial intelligence (AI) and optimization computing are essential to building the decision-making process. In fact, energy improvement is an optimization problem in which conflicting objectives and constraints are balanced, and several techniques are integrated to achieve a unified result, including machine learning, economics, building energy simulation, computer programming, optimization, and risk analysis. This target is achieved by means of Artificial Neural Networks (ANNs) for energy consumption assessment, an Analytic Hierarchy Process for energy retrofit compatibility assessment, and an evolutionary optimization algorithm to achieve the optimal configuration of interventions on the stock, maximizing the energy and economic performance of the investment. The proposed procedure is validated on the case study of a building asset located in Northern Italy. Since the developed model relies on AI-based algorithms, it has a consequent limitation: the developed ANNs can work only for the building types, occupation profiles, and climatic areas that were used in the training phase. In further development of this research, the aim will be to expand the generalization properties of the forecasting tool.
Engineering machinery, tools, and implements
A Novel Approach to Fabricating a Screen-Printed Electrode Based on a Gold Nanorod–Graphene Oxide Composite for the Detection of Uric Acid
Wulan Tri Wahyuni, Hana Safitri, Eti Rohaeti
et al.
In this study, we report the development of a technique to fabricate a screen-printed electrode (SPE) and apply it in uric acid sensing. The SPE was fabricated by printing it on a photo paper substrate using an office printer. In particular, the conductive ink used to print the working electrode (WE) and counter electrode (CE) consisted of graphene oxide (GO) and a gold nanorod (AuNR) material, while the reference electrode (RE) was made by applying a conductive silver paste to the fabricated SPE. The electrochemical measurement of a uric acid solution using the fabricated GO/AuNR SPE provided a higher signal than a commercially available SPE. The electroanalytical performance of the fabricated GO/AuNR SPE, which was used to measure the uric acid solution, exhibited a linear range of 0.8−200 μM, a detection limit of 0.5 μM, a quantitation limit of 1.0 μM, an outstanding repeatability (relative standard deviation) of 4.885%, and good selectivity with ascorbic acid, dopamine, glucose, urea, and sodium as interferents. Furthermore, the SPE fabricated based on GO/AuNR was successfully employed for the determination of uric acid concentration in human urine samples using the standard addition approach.
Engineering machinery, tools, and implements
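The reported detection and quantitation limits can be illustrated with a generic ICH-style calibration sketch (LOD = 3.3·s/slope, LOQ = 10·s/slope); the synthetic calibration data below are not the paper's measurements:

```python
import numpy as np

def calibration_metrics(conc, signal):
    """Fit a least-squares calibration line and derive detection and
    quantitation limits from the ICH-style rules LOD = 3.3*s/slope and
    LOQ = 10*s/slope, where s is the residual standard deviation."""
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    s = residuals.std(ddof=2)            # two fitted parameters
    return {"slope": slope, "lod": 3.3 * s / slope, "loq": 10.0 * s / slope}

# Synthetic calibration over the 0.8-200 uM range (illustrative only).
rng = np.random.default_rng(3)
conc = np.array([0.8, 5.0, 10.0, 50.0, 100.0, 150.0, 200.0])
signal = 0.05 * conc + 0.2 + rng.normal(0.0, 0.01, conc.size)
m = calibration_metrics(conc, signal)
```

The same slope and residual scatter also set the lower end of the usable linear range, which is why the reported LOQ (1.0 μM) sits just above the LOD (0.5 μM).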
Comparative Analysis of Esterification Reaction in Continuous Stirred Tank and Plug-Flow Reactors
Abdulaziz Bakhtiyorov, Abbos Elmanov, Olimjon Maksudov
et al.
In this study, a comparison was conducted between two reactor types (the plug-flow reactor and the continuous stirred tank reactor) employed for the traditional esterification process, to investigate their potential application to the esterification reaction with an ethanol-rich feed. Aspen Plus software was used to conduct a sensitivity analysis on the temperature profiles in the axial and radial directions, focusing in particular on the reactor and feed stream temperatures, operating parameters, and ethyl acetate yields for the reactors. The energy analysis for esterification processes with the different reactor types has also been evaluated. Compared with the continuous stirred tank reactor, the plug-flow reactor process with the ethanol-rich feed exhibited reduced hotspot temperatures. The simulation results show that the hotspot temperatures in the continuous stirred tank reactor can be within the operating temperature range of 90–100 °C. Regarding the comparison of these reactor types for the esterification process, the plug-flow reactor shows advantages in terms of lower hotspot temperatures, within an operating temperature range of 70–75 °C. On the other hand, the yield of ethyl acetate product from the continuous stirred tank reactor is slightly higher than from the alternative esterification process with excess ethanol feed.
Engineering machinery, tools, and implements
Taxing Collaborative Software Engineering
Michael Dorner, Maximilian Capraro, Oliver Treidler
et al.
The engineering of complex software systems is often the result of a highly collaborative effort. However, collaboration within a multinational enterprise has an overlooked legal implication when developers collaborate across national borders: It is taxable. In this article, we discuss the unsolved problem of taxing collaborative software engineering across borders. We (1) introduce the reader to the basic principle of international taxation, (2) identify three main challenges for taxing collaborative software engineering making it a software engineering problem, and (3) estimate the industrial significance of cross-border collaboration in modern software engineering by measuring cross-border code reviews at a multinational software company.