Ultra-fine grooving technology for 4H-SiC substrate using poly-crystalline diamond (PCD) blade
Haruto KONISHI, Takashi FUJITA, Yasuo IZUMI
et al.
A unique poly-crystalline diamond (PCD) blade with three-dimensional, high-density cutting edges was developed to machine ultra-fine grooves on a silicon carbide (SiC) substrate with a mirror-like finish. Focusing on the working surface of the high-density three-dimensional cutting edge of the PCD blade, we aligned the tips of the individual micro-cutting edges on the same plane to enable individual and independent machining in the transverse direction of the working surface of the blade. As a result of machining a groove on the SiC substrate using the PCD blade, an extremely narrow groove of 15.29 µm in average width and 89.04 µm in average depth could be formed. The cross-sectional shape of the groove bottom was also sharply machined, with an extremely small corner radius on the order of 1 µm. The cross-sectional shape of the blade was accurately transferred to a groove shape with sharp edges. In addition, the surface roughness of the groove bottom in the blade traveling direction was in a mirror-like state, with less than Ra 1 nm. Although a single linear crack remained, crack generation can be eliminated by removing the protruding cutting edges.
Engineering machinery, tools, and implements, Mechanical engineering and machinery
Exploring LLMs for User Story Extraction from Mockups
Diego Firmenich, Leandro Antonelli, Bruno Pazos
et al.
User stories are one of the most widely used artifacts in the software industry to define functional requirements. In parallel, the use of high-fidelity mockups facilitates end-user participation in defining their needs. In this work, we explore how combining these techniques with large language models (LLMs) enables agile and automated generation of user stories from mockups. To this end, we present a case study that analyzes the ability of LLMs to extract user stories from high-fidelity mockups, both with and without the inclusion of a glossary of the Language Extended Lexicon (LEL) in the prompts. Our results demonstrate that incorporating the LEL significantly enhances the accuracy and suitability of the generated user stories. This approach represents a step forward in the integration of AI into requirements engineering, with the potential to improve communication between users and developers.
Evaluating the FER of IEEE 802.15.4 Frames Between UAVs and Wireless Sensor Nodes
Christian Tipantuña, Carlos Egas Acosta, Luis Criollo
et al.
Evaluating the noise on the transmitted bits or frames and its effect on connectivity is essential in Wireless Sensor Networks when designing and deploying IoT networks. When the sensor nodes are positioned on the ground, the number of correctly received frames can fluctuate due to obstructions, interference, and the processing capacity of the nodes. Therefore, implementing strategies and systems to monitor and evaluate the rate of lost frames is required. This paper presents the implementation of a system designed to assess the rate of erroneous IEEE 802.15.4 frames within a communication setup that includes a ground control station, an unmanned aerial vehicle, and surface-based sensor nodes.
Engineering machinery, tools, and implements
Innovative Drug Delivery Systems: The Comprehensive Role of Natural Polymers in Fast-Dissolving Tablets
Meet V. Naliyadhara, Riya B. Chovatiya, Shyam R. Vekariya
et al.
Fast-dissolving tablets (FDTs) have arisen as a novel way to tackle issues encountered by patients with dysphagia, including youngsters, older people, and those with neurodegenerative or developmental disabilities. This review emphasises the crucial function of natural polymers as super disintegrants in improving fast-disintegrating tablet formulation. Natural polymers, such as chitosan, guar gum, xanthan gum, and fenugreek seed mucilage, are biocompatible, biodegradable, and offer better affordability than synthetics. Natural polymers can quickly break down and disintegrate oral tablets. They also accelerate drug release and improve bioavailability and patient compliance. This article discusses the benefits of natural polymers, such as environmentally sustainable processing, cost-effectiveness, and patient engagement, as well as their challenges and limitations. A comprehensive comparison between natural and synthetic polymers emphasises the benefits of natural substances in overcoming production challenges and promoting sustainable pharmaceutical practices. Spray drying, freeze-drying, and nanotechnology represent advancements in FDT production technology. Alongside proprietary technologies such as Zydis and Durasolv, combining these techniques supports the creation of adjustable, patient-centred, and effective drug delivery systems. Future studies should focus on expanding natural polymer sourcing and purification processes to broaden the use of FDTs.
Engineering machinery, tools, and implements
Enhancing Orthogonal GPS L1C Signal Acquisition
Ali Albu-Rghaif, Hussein A. Abdulkadhim, Latifah Munirah Kamarudin
The Global Positioning System (GPS) represents a significant leap in global navigation satellite systems, providing continuous localization, reliable navigation, and precise timing for civilian, commercial, and military uses. Among the multiple signals sent by GPS satellites, the L1C signal greatly enhances structure and performance, boosting user reliability (by employing binary offset carrier modulation to reduce multipath effects) and accuracy (by lengthening the ranging code). It features data and pilot components, enhancing resilience against multipath interference and strengthening the signal under challenging conditions. In this work, an orthogonal single-channel acquisition algorithm for the GPS L1C signal is proposed to reduce the complexity of a conventional side-by-side/dual-channel configuration. The proposed scheme mathematically combines the data and pilot portions into one orthogonal channel, an approach shown to achieve a 3 dB gain in signal-to-noise ratio (SNR) and a 34% reduction in computational complexity over the conventional implementation. The MATLAB-Simulink environment was used to simulate the GPS L1C parameters with a sampling frequency of 16.368 MHz and a dwell time of 10 ms. The simulations were carried out across various SNR levels to evaluate detection probability and processing time. The results show that the proposed solution preserves detection probability while dramatically improving resource utilization. This work provides the first single-channel orthogonal design for GPS L1C acquisition and is an efficient step towards low-power, high-performance GNSS receivers.
Engineering machinery, tools, and implements, Mechanics of engineering. Applied mechanics
Modelling and Optimization of the Precision Hot Forging/Extrusion Process of an Asymmetric C45E/1.1191 Carbon Steel Bearing Element
Antonio Nikolov, Anton Mihaylov, Dimiter Yankov
Precision extrusion forging is an innovative manufacturing process for trouble-free production of high-quality components with an accurate shape. The process provides a reduced technological chain and high production efficiency, as only certain surfaces need additional processing. This study used QForm software as an environment for simulating precision extrusion forging. The main goal of this research was to present a brief overview of the latest research on the simulation of precision extrusion forging, with an emphasis on the production cycle rather than on mathematical description. This article examines the processes of simulation modeling of precision extrusion forging with newly designed tooling for the manufacture of a newly introduced asymmetric load-bearing facade element patented by Braykov. With the help of simulation modeling, appropriate modes for specific production were established, and were later implemented. The production process itself is briefly presented at the end of this article.
Engineering machinery, tools, and implements
A Model-Based Analysis of Direct Methanol Production from CO<sub>2</sub> and Renewable Hydrogen
Azizbek Kamolov, Zafar Turakulov, Botir Shukurillaevich Usmonov
et al.
Methanol synthesis from CO<sub>2</sub> is a key strategy for carbon capture and utilization, offering a viable solution to mitigate climate change. The direct synthesis of methanol not only reduces greenhouse gases but also produces valuable chemicals for industrial applications. The aim of this study is to model and optimize the methanol synthesis process from CO<sub>2</sub>, focusing on maximizing methanol yield while minimizing CO<sub>2</sub> content in the product stream. In this work, a detailed methanol synthesis process simulation was developed using the Soave–Redlich–Kwong equation of state in the Aspen Plus V11 commercial software environment. Pure CO<sub>2</sub> streams, which are produced from the post-combustion carbon capture process, and renewable hydrogen streams were used. The results are compared with open literature sources. In addition, a sensitivity analysis was employed to evaluate the effects of the pressure, temperature, and recirculation fraction on process efficiency. The results showed that the highest methanol yield of 76,838 kg/h was obtained at 80 bar, 276 °C, and a recirculation fraction of 0.9. The lowest CO<sub>2</sub> content in the final product (73 kg/h) occurred at 80 bar, 220 °C, and a recirculation fraction of 0.6. These findings demonstrate the trade-off between maximizing methanol output and reducing unreacted CO<sub>2</sub>. In conclusion, optimal operating conditions for both the high yield and low CO<sub>2</sub> content were identified, providing a foundation for further process refinement. Future work will involve developing a more complex multi-reactor model and conducting economic assessments for large-scale industrial implementation.
Engineering machinery, tools, and implements
Understanding Computational Science and Engineering (CSE) and Domain Science Skills Development in National Laboratory Postgraduate Internships
Morgan M. Fong, Hilary Egan, Marc Day
et al.
Background: Harnessing advanced computing for scientific discovery and technological innovation demands scientists and engineers well-versed in both domain science and computational science and engineering (CSE). However, few universities provide access to both integrated domain science/CSE cross-training and Top-500 High-Performance Computing (HPC) facilities. National laboratories offer internship opportunities capable of developing these skills. Purpose: This study presents an evaluation of federally-funded postgraduate internship outcomes at a national laboratory. This study seeks to answer three questions: 1) What computational skills, research skills, and professional skills do students improve through internships at the selected national laboratory? 2) Do students gain knowledge in domain science topics through their internships? 3) Do students' career interests change after these internships? Design/Method: We developed a survey, collected responses from past participants of five federally-funded internship programs, and compared participant ratings of their prior experience with their internship experience. Findings: Our results indicate that participants improve CSE skills and domain science knowledge, and are more interested in working at national labs. Participants go on to degree programs and positions in relevant domain science topics after their internships. Conclusions: We show that national laboratory internships are an opportunity for students to build CSE skills that may not be available at all institutions. We also show growth in domain science skills during the internships through direct exposure to research topics. The survey instrument and approach used may be adapted to other studies to measure the impact of postgraduate internships across disciplines and internship settings.
Analysis of a Newly Developed Afterburner System Employing Hydrogen–Methane Blends
Florin Gabriel Florean, Andreea Mangra, Marius Enache
et al.
A considerable number of Combined Heat and Power (CHP) systems continue to depend on fossil fuels like oil and natural gas, contributing to significant environmental pollution and the release of greenhouse gases. Two V-gutter flame holder prototypes (P1 and P2) with the same expansion angle, fueled with pure hydrogen (100% H<sub>2</sub>) or hydrogen–methane mixtures (60% H<sub>2</sub> + 40% CH<sub>4</sub>, 80% H<sub>2</sub> + 20% CH<sub>4</sub>), intended for use in cogeneration applications, have been designed, manufactured, and tested. Throughout the tests, the concentrations of CO<sub>2</sub>, CO, and NO in the flue gas were monitored, and particle image velocimetry (PIV) measurements were performed. The CO, CO<sub>2</sub>, and NO emissions gradually decreased as the percentage of H<sub>2</sub> in the fuel mixture increased. The NO emissions were significantly lower for prototype P2 than for prototype P1 at all measurement points and for all fuel mixtures used. The shortest recirculation zone was observed for P1, where the axial velocity reaches a negative peak of approximately 12 m/s at roughly 50 mm downstream of the edge of the flame holder, and the recirculation region spans about 90 mm. In comparison, the P2 prototype has a recirculation region spanning about 100 mm, with a negative peak of approximately 14 m/s. The data reveal high gradients in flow velocity near the flow separation point, which gradually smooth out with increasing downstream distance. Despite their similar design, P2 consistently performs better across all measured velocity components. This improvement can be attributed to the larger fuel injection holes, which enhance fuel–air mixing and combustion stability. Additionally, the presence of side walls directing the flow around the flame stabilizer further aids in maintaining a stable combustion process.
Engineering machinery, tools, and implements, Technological innovations. Automation
Design and architecture of the IBM Quantum Engine Compiler
Michael B. Healy, Reza Jokar, Soolu Thomas
et al.
In this work, we describe the design and architecture of the open-source Quantum Engine Compiler (qe-compiler) currently used in production for IBM Quantum systems. The qe-compiler is built using LLVM's Multi-Level Intermediate Representation (MLIR) framework and includes definitions for several dialects to represent parameterized quantum computation at multiple levels of abstraction. The compiler also provides Python bindings and a diagnostic system. An open-source LALR lexer and parser built using Bison and Flex generates an Abstract Syntax Tree that is translated to a high-level MLIR dialect. An extensible hierarchical target system for modeling the heterogeneous nature of control systems at compilation time is included. Target-based and generic compilation passes are added using a pipeline interface to translate the input down to low-level intermediate representations (including LLVM IR) and can take advantage of LLVM backends and tooling to generate machine executable binaries. The qe-compiler is built to be extensible, maintainable, performant, and scalable to support the future of quantum computing.
Higher education assessment practice in the era of generative AI tools
Bayode Ogunleye, Kudirat Ibilola Zakariyyah, Oluwaseun Ajao
et al.
The higher education (HE) sector benefits every nation's economy and society at large. However, its contributions are challenged by advanced technologies such as generative artificial intelligence (GenAI) tools. In this paper, we provide a comprehensive assessment of GenAI tools in relation to assessment and pedagogic practice and, subsequently, discuss the potential impacts. This study experimented with three assessment instruments from the data science, data analytics, and construction management disciplines. Our findings are two-fold: first, GenAI tools exhibit subject knowledge, problem-solving, analytical, critical thinking, and presentation skills, and can thus limit learning when used unethically. Secondly, the design of the assessments in certain disciplines revealed the limitations of the GenAI tools. Based on our findings, we make recommendations on how AI tools can be utilised for teaching and learning in HE.
Observer Backstepping Design for Flight Control
Ben Messaoud Safinaz, Belkheiri Mohammed, Belkheiri Ahmed
This paper presents observer backstepping as a new nonlinear flight control design framework. Flight control laws for general-purpose maneuvering in the presence of nonlinear lift and side forces are designed. The controlled variables are the angle of attack, the sideslip angle, and the roll rate. Stability is proven using Lyapunov stability criteria. The control laws were evaluated using realistic aircraft simulation models, with highly encouraging results.
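As a reminder of the method's core mechanism, the backstepping recursion can be sketched on a textbook two-state integrator chain (a generic illustration only; the paper's aircraft dynamics and observer design are more involved):

```latex
\begin{aligned}
&\dot{x}_1 = x_2, \qquad \dot{x}_2 = u, \qquad
 z_1 = x_1, \quad \alpha_1 = -k_1 z_1, \quad z_2 = x_2 - \alpha_1,\\
&V = \tfrac{1}{2}z_1^2 + \tfrac{1}{2}z_2^2, \qquad
 \dot{V} = z_1\,(z_2 + \alpha_1) + z_2\,(u - \dot{\alpha}_1)
         = -k_1 z_1^2 + z_2\,(z_1 + u + k_1 x_2),\\
&u = -z_1 - k_2 z_2 - k_1 x_2 \;\Longrightarrow\;
 \dot{V} = -k_1 z_1^2 - k_2 z_2^2 \le 0 .
\end{aligned}
```

The virtual control α₁ stabilizes the first state; the actual input u then cancels the cross term and drives z₂ to zero, with V serving as the Lyapunov function certifying stability.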
Engineering machinery, tools, and implements
Applying Model Studies to Support the Monitoring of Methane Hazard during the Process of Underground Coal Mining
Tutak Magdalena, Brodny Jarosław, Małkowski Piotr
et al.
The process of underground mining is one of the most complex and hazardous industrial activities, and methane is among its most serious hazards. In order to maintain the continuity and efficiency of this process, it is necessary to take measures to reduce this hazard. The paper addresses this issue by presenting a developed methodology for using model studies and numerical simulations to support the process of monitoring methane hazards. Its basis is a developed model of the region of underground mining exploitation, together with the ventilation phenomena occurring in it. The model was developed in the ANSYS Fluent program, which is based on the finite volume method and belongs to the field of computational fluid dynamics (CFD). The model reflects both the geometry and the physical and chemical phenomena occurring in the studied area, as well as the auxiliary ventilation equipment used during operation. The research was conducted for two variants of methane emissions from goaf zones: the first concerned the actual state of the mining area, and the second concerned increased methane emissions from these goaf zones. The purpose of the study was to determine the distribution of methane concentrations in the most dangerous part of the studied area, the intersection of the longwall and the tailgate, as well as the distribution of the ventilation air flow velocities affecting them. The studies for both variants made it possible to identify places particularly exposed to dangerous concentrations of methane in this region. The methodology developed represents a new approach to studying the impact of methane emissions from goaf zones into mine workings.
Machine design and drawing, Engineering machinery, tools, and implements
Application of Simulation Technique for Improving Plant Layout in Ceramic Factory
Kasemset Chompoonoot, Opassuwan Takron, Tangsittikhun Thanakit
et al.
This study aims to design and improve the plant layout of a ceramic factory by adopting Systematic Layout Planning (SLP) and the simulation technique. A ceramic company in northern Thailand is selected as a case study. Three ceramic products, namely roof tiles, wall tiles, and dishware, are studied owing to their highest production volumes. Through the SLP approach, information regarding the number of departments and machines, the area of the plant, the frequency of movement, and the distance between each department is collected for the analysis of the relationships between departments. Two plant layout designs are then proposed: the first is derived from the Computerized Relationship Layout Planning (CORELAP) algorithm, and the second is the process layout. For selecting the most appropriate layout design, five criteria are considered: total distance, the average total process time per unit produced, ease of movement, material flow, and safety. To determine the distance and the average total process time per unit, Distance-Based Scoring and simulation techniques are conducted, while the ease of movement, material flow, and safety are rated based on whether the company satisfies each criterion. Employing the weighted scoring technique, the results report that the CORELAP layout is the most suitable for further implementation owing to its highest weighted score of 2.536, while the process layout receives 2.386. Implementing the CORELAP layout can reduce the total distance by 16.76%, while the average total process time per unit of the CORELAP layout is not significantly different from that of the existing layout at the 0.05 significance level.
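The weighted scoring selection can be sketched with hypothetical Python (the criterion weights and ratings below are invented for illustration; the study's actual values are not reported in the abstract):

```python
def weighted_score(ratings, weights):
    # ratings: criterion -> score for one layout alternative
    # weights: criterion -> relative importance; result is normalized
    total_w = sum(weights.values())
    return sum(ratings[c] * w for c, w in weights.items()) / total_w

# hypothetical weights and ratings over the five criteria in the study
weights = {"distance": 3, "process_time": 3, "ease": 2, "flow": 2, "safety": 2}
corelap = {"distance": 3, "process_time": 3, "ease": 2, "flow": 2, "safety": 3}
process = {"distance": 2, "process_time": 3, "ease": 3, "flow": 2, "safety": 2}

# pick the alternative with the highest normalized weighted score
best = max([("CORELAP", corelap), ("process", process)],
           key=lambda kv: weighted_score(kv[1], weights))
```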
Machine design and drawing, Engineering machinery, tools, and implements
AIBugHunter: A Practical Tool for Predicting, Classifying and Repairing Software Vulnerabilities
Michael Fu, Chakkrit Tantithamthavorn, Trung Le
et al.
Many ML-based approaches have been proposed to automatically detect, localize, and repair software vulnerabilities. While ML-based methods are more effective than program analysis-based vulnerability analysis tools, few have been integrated into modern IDEs, hindering practical adoption. To bridge this critical gap, we propose AIBugHunter, a novel ML-based software vulnerability analysis tool for C/C++ that is integrated into Visual Studio Code. AIBugHunter helps software developers achieve real-time vulnerability detection, explanation, and repair during programming. In particular, AIBugHunter scans through developers' source code to (1) locate vulnerabilities, (2) identify vulnerability types, (3) estimate vulnerability severity, and (4) suggest vulnerability repairs. In this article, we propose a novel multi-objective optimization (MOO)-based vulnerability classification approach and a transformer-based estimation approach to help AIBugHunter accurately identify vulnerability types and estimate severity. Our empirical experiments on a large dataset consisting of 188K+ C/C++ functions confirm that our proposed approaches are more accurate than other state-of-the-art baseline methods for vulnerability classification and estimation. Furthermore, we conduct qualitative evaluations, including a survey study and a user study, to obtain software practitioners' perceptions of AIBugHunter and assess the impact it may have on developers' productivity in security aspects. Our survey study shows that AIBugHunter is perceived as useful, with 90% of the participants considering adopting it. Last but not least, our user study shows that AIBugHunter could enhance developers' productivity in combating cybersecurity issues during software development.
Multi-agricultural Machinery Collaborative Task Assignment Based on Improved Genetic Hybrid Optimization Algorithm
Haohao Du
To address the challenges of delayed scheduling information, heavy reliance on manual labour, and low operational efficiency in traditional large-scale agricultural machinery operations, this study proposes a method for multi-agricultural machinery collaborative task assignment based on an improved genetic hybrid optimisation algorithm. The proposed method establishes a multi-agricultural machinery task allocation model by combining the path pre-planning of a simulated annealing algorithm and the static task allocation of a genetic algorithm. By sequentially fusing these two algorithms, their respective shortcomings can be overcome, and their advantages in global and local search can be utilised. Consequently, the search capability of the population is enhanced, leading to the discovery of more optimal solutions. Then, an adaptive crossover operator is constructed according to the task assignment model, considering the capacity, path cost, and time of agricultural machinery; two-segment coding and multi-population adaptive mutation are used to assign tasks, improving the diversity and exploration ability of the population; and, to improve the global optimisation ability of the hybrid algorithm, a 2-Opt local optimisation operator and a Circle modification algorithm are introduced. Finally, simulation experiments were conducted in MATLAB to evaluate the performance of the multi-agricultural machinery collaborative task assignment based on the improved genetic hybrid algorithm. The algorithm's capabilities were assessed through comparative analysis in the simulation trials. The results demonstrate that the developed hybrid algorithm can effectively reduce path costs, and the efficiency of the assignment outcomes surpasses that of the classical genetic algorithm. This approach proves particularly suitable for addressing large-scale task allocation problems.
Generative Artificial Intelligence for Software Engineering -- A Research Agenda
Anh Nguyen-Duc, Beatriz Cabrero-Daniel, Adam Przybylek
et al.
Generative Artificial Intelligence (GenAI) tools have become increasingly prevalent in software development, offering assistance to various managerial and technical project activities. Notable examples of these tools include OpenAI's ChatGPT, GitHub Copilot, and Amazon CodeWhisperer. Although many recent publications have explored and evaluated the application of GenAI, a comprehensive understanding of the current development, applications, limitations, and open challenges remains unclear to many. In particular, we do not have an overall picture of the current state of GenAI technology in practical software engineering usage scenarios. We conducted a literature review and focus groups over a duration of five months to develop a research agenda on GenAI for Software Engineering. We identified 78 open Research Questions (RQs) in 11 areas of Software Engineering. Our results show that it is possible to explore the adoption of GenAI for partial automation and decision-making support across all software development activities. While the current literature is skewed toward software implementation, quality assurance, and software maintenance, other areas, such as requirements engineering, software design, and software engineering education, need further research attention. Common considerations when implementing GenAI include industry-level assessment, dependability and accuracy, data accessibility, transparency, and sustainability aspects associated with the technology. GenAI is bringing significant changes to the field of software engineering. Nevertheless, the state of research on the topic remains immature. We believe that this research agenda holds significance and practical value for informing both researchers and practitioners about current applications and guiding future research.
Recreating Lunar Environments by Fusion of Multimodal Data Using Machine Learning Models
Ana C. Castillo, Jesus A. Marroquin-Escobedo, Santiago Gonzalez-Irigoyen
et al.
The latest satellite infrastructure for data processing, transmission, and reception can be improved by upgrading the tools used to handle the very large amounts of data produced by the many different sensors incorporated in space missions. In order to develop a better technique for processing such data, this paper examines multimodal data fusion using machine learning algorithms. It discusses how machine learning models are used to recreate environments from heterogeneous, multimodal data sets. In particular, for models based on Convolutional Neural Networks (CNNs), the most important difficulty is the vast amount of training data required to avoid overfitting and underfitting of the models.
Engineering machinery, tools, and implements
T4PdM: a Deep Neural Network based on the Transformer Architecture for Fault Diagnosis of Rotating Machinery
Erick Giovani Sperandio Nascimento, Julian Santana Liang, Ilan Sousa Figueiredo
et al.
Deep learning and big data algorithms have become widely used in industrial applications to optimize several tasks in many complex systems. In particular, deep learning models for diagnosing and prognosticating machinery health have made predictive maintenance (PdM) more accurate and reliable in decision-making, thereby avoiding unnecessary interventions, machinery accidents, and environmental catastrophes. Recently, Transformer Neural Networks have gained notoriety and have increasingly become the favorite choice for Natural Language Processing (NLP) tasks. Given their recent major achievements in NLP, this paper proposes an automatic fault classifier model for predictive maintenance based on a modified version of the Transformer architecture, namely T4PdM, to identify multiple types of faults in rotating machinery. Experimental results are presented for the MaFaulDa and CWRU databases. T4PdM achieved overall accuracies of 99.98% and 98% on the two datasets, respectively. In addition, the performance of the proposed model is compared with that of previously published works, demonstrating its superiority in detecting and classifying faults in rotating industrial machinery. The proposed Transformer-based model can therefore improve the performance of machinery fault analysis and diagnostic processes and help companies advance into the Industry 4.0 era. Moreover, this methodology can be adapted to any other time series classification task.
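The core mechanism behind such Transformer models, scaled dot-product self-attention, can be sketched in plain Python (a generic illustration, not the T4PdM architecture; real models apply learned query/key/value projections and stack many attention heads and layers):

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(X):
    # scaled dot-product self-attention over a sequence of feature
    # vectors (e.g. windowed vibration-signal embeddings); projections
    # are identity here for brevity
    d = len(X[0])
    out = []
    for q in X:
        # similarity of this position to every position, scaled by sqrt(d)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in X]
        w = softmax(scores)
        # output is the attention-weighted average of all positions
        out.append([sum(wj * X[j][t] for j, wj in enumerate(w))
                    for t in range(d)])
    return out
```

Each output position is a weighted mixture of the whole input sequence, which is what lets a Transformer relate fault signatures at distant time steps within a vibration window.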
Personalized Federated Learning for Multi-task Fault Diagnosis of Rotating Machinery
Sheng Guo, Zengxiang Li, Hui Liu
et al.
Intelligent fault diagnosis is essential to the safe operation of machinery. However, due to scarce fault samples and data heterogeneity in field machinery, deep learning based diagnosis methods are prone to over-fitting and poor generalization. To solve this problem, this paper proposes a personalized federated learning framework that enables multi-task fault diagnosis across multiple factories in a privacy-preserving manner. Firstly, rotating machines from different factories with similar vibration feature data are categorized into machine groups using a federated clustering method. Then, a multi-task deep learning model based on a convolutional neural network is constructed to diagnose the multiple faults of machinery with heterogeneous information fusion. Finally, a personalized federated learning framework is proposed to handle data heterogeneity across different machines using an adaptive hierarchical aggregation strategy. A case study on data collected from real machines verifies the effectiveness of the proposed framework. The results show that diagnosis accuracy can be improved significantly using the proposed personalized federated learning, especially for machines with scarce fault samples.
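The aggregation idea can be sketched with hypothetical Python (a simplified FedAvg-style average computed within each machine group; the paper's adaptive hierarchical strategy is more elaborate):

```python
def federated_average(client_weights, client_sizes):
    # FedAvg-style aggregation: average model parameters weighted by
    # the number of local samples each client contributed
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [sum(w[i] * s for w, s in zip(client_weights, client_sizes)) / total
            for i in range(n_params)]

def clustered_aggregate(clients):
    # hierarchical step: aggregate only within a machine group (cluster),
    # so clients with similar vibration features share one personalized
    # model instead of a single global one
    groups = {}
    for cluster_id, weights, n_samples in clients:
        groups.setdefault(cluster_id, []).append((weights, n_samples))
    return {cid: federated_average([w for w, _ in ms], [n for _, n in ms])
            for cid, ms in groups.items()}
```

Only parameter vectors (never raw vibration data) leave each factory, which is what makes the scheme privacy-preserving, and per-cluster models give each machine group a personalized starting point.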