Leandro A. Rodriguez-Ortiz, Estefanía A. Tapia Suárez, Santiago Muíños-Landín
et al.
Research on sustainable business models (BMs) rooted in the circular economy is expanding in industry and academia, encouraging companies to enhance their impact on profits, people, and the planet. However, developing sustainable BMs is complex due to the conflicting objectives of sustainability dimensions and competition with traditional models. This work addresses this challenge by proposing a Sustainable Business Guide (SBG) supported by artificial intelligence (AI) to assist in decision-making. The SBG supports the design and assessment of business models by integrating inspection; business opportunity exploration; technical, economic, and environmental analyses; an AI-based decision support system; and Porter’s Five Forces, focusing on the aeronautical and wind energy sectors.
At the final stage of surface processing, the accuracy indicators of part surfaces—dimensional size, shape in cross-section and in longitudinal section, mutual position of surfaces, and roughness—are achieved. This involves the technical and organizational measures and activities that are prescribed or taken into account in the applied technology. This publication constructs cause-and-effect diagrams of the factors influencing the achievement of each of these accuracy indicators and indicates ways to reduce the negative impact of some of them. Errors related to the components of the technological system are analyzed and grouped, tasks related to accurate process design are defined, and guidelines for structural accuracy design are given. The technological conditions for ensuring the accuracy of finishing operations when machining parts by turning are formulated.
Michael Welsh, Julian Lopez-Rippe, Dana Alkhulaifat
et al.
Large language models (LLMs) show promise in enhancing medical research through domain-specific question answering. However, their clinical application is limited by hallucination risk, limited domain specialization, and privacy concerns. Public LLMs like GPT-4-Consensus pose challenges for use with institutional data, due to the inability to ensure patient data protection. In this work, we present a secure, custom-designed retrieval-augmented generation (RAG) LLM system deployed entirely within our institution and tailored for radiology research. Radiology researchers at our institution evaluated the system against GPT-4-Consensus through a blinded survey assessing factual accuracy (FA), citation relevance (CR), and perceived performance (PP) using 5-point Likert scales. Our system achieved mean ± SD scores of 4.15 ± 0.99 for FA, 3.70 ± 1.17 for CR, and 3.55 ± 1.39 for PP. In comparison, GPT-4-Consensus obtained 4.25 ± 0.72, 3.85 ± 1.23, and 3.90 ± 1.12 for the same metrics, respectively. No statistically significant differences were observed (<i>p</i> = 0.97, 0.65, 0.42), and 50% of participants preferred our system’s output. These results validate that secure, local RAG-based LLMs can match state-of-the-art performance while preserving privacy and adaptability, offering a scalable tool for medical research environments.
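To make the architecture concrete, the sketch below shows a minimal retrieval-augmented generation loop of the kind described: embed institutional documents, retrieve the most relevant passages for a question, and pass them to a locally deployed model. The embedding model, in-memory index, and `generate` endpoint are illustrative placeholders, not the system evaluated in the study.

```python
# Minimal, assumption-laden sketch of a locally hosted RAG loop (not the authors' system).
import numpy as np
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # any locally hosted embedder

def build_index(passages):
    """Embed institutional documents once; keep vectors in memory (or a vector DB)."""
    vectors = embedder.encode(passages, normalize_embeddings=True)
    return np.asarray(vectors), passages

def retrieve(query, index, k=5):
    """Return the k passages most similar to the query (cosine similarity)."""
    vectors, passages = index
    q = embedder.encode([query], normalize_embeddings=True)[0]
    scores = vectors @ q
    top = np.argsort(scores)[::-1][:k]
    return [(passages[i], float(scores[i])) for i in top]

def answer(query, index, generate):
    """Assemble a grounded prompt and call a locally deployed LLM via `generate`."""
    context = "\n\n".join(p for p, _ in retrieve(query, index))
    prompt = (
        "Answer the radiology research question using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {query}\nCite the passages you used."
    )
    return generate(prompt)  # e.g., an inference endpoint running inside the institution
```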
Engineering machinery, tools, and implements, Technological innovations. Automation
Kubernetes clusters are increasingly deployed across data centers for geo-redundancy and low-latency access, creating new challenges in scheduling workloads optimally. This paper presents a practical evaluation of network-aware scheduling in a distributed Kubernetes cluster that spans multiple network zones. A custom scheduling plugin is implemented within the Kubernetes scheduling framework to incorporate real-time network telemetry (inter-node ping latency) into pod placement decisions. The evaluation combines this custom scheduler plugin, realistic network latency measurements, and representative distributed benchmarks to assess the impact of scheduling decisions on traffic patterns. The results provide strong empirical confirmation of findings previously established through simulation, offering a validated path toward integrating not only network metrics but also other performance-critical metrics such as energy efficiency, hardware utilization, and fault tolerance.
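The core of such a plugin is a scoring step that turns measured latencies into node scores. An actual plugin implements the Score/NormalizeScore extension points of the scheduling framework in Go; the Python sketch below only illustrates the assumed mapping from per-node ping latency to the scheduler's 0–100 score range.

```python
# Illustrative latency-to-score mapping (lower latency -> higher score), not the plugin itself.
def latency_scores(ping_ms_by_node, max_score=100):
    """Normalize measured inter-node latencies into scheduler scores."""
    worst = max(ping_ms_by_node.values())
    best = min(ping_ms_by_node.values())
    if worst == best:  # all candidate nodes equivalent
        return {node: max_score for node in ping_ms_by_node}
    return {
        node: round(max_score * (worst - ms) / (worst - best))
        for node, ms in ping_ms_by_node.items()
    }

# Example: the zone-local node wins over cross-zone candidates.
print(latency_scores({"node-a": 0.4, "node-b": 7.9, "node-c": 18.2}))
# {'node-a': 100, 'node-b': 58, 'node-c': 0}
```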
The increasing complexity of software has led to the steady growth of vulnerabilities. Vulnerability repair investigates how to fix software vulnerabilities. Manual vulnerability repair is labor-intensive and time-consuming because it relies on human experts, highlighting the importance of Automated Vulnerability Repair (AVR). In this SoK, we present the systematization of AVR methods through the three steps of AVR workflow: vulnerability analysis, patch generation, and patch validation. We assess AVR tools for C/C++ and Java programs as they have been widely studied by the community. Since existing AVR tools for C/C++ programs are evaluated with different datasets, which often consist of a few vulnerabilities, we construct the first C/C++ vulnerability repair benchmark dataset, dubbed Vul4C, which contains 144 vulnerabilities as well as their exploits and patches. We use Vul4C to evaluate seven AVR tools for C/C++ programs and use the third-party Vul4J dataset to evaluate two AVR tools for Java programs. We also discuss future research directions.
Chaos engineering reveals resilience risks but is expensive and operationally risky to run broadly and often. Model-based analyses can estimate dependability, yet in practice they are tricky to build and keep current because models are typically handcrafted. We claim that a simple connectivity-only topological model - just the service-dependency graph plus replica counts - can provide fast, low-risk availability estimates under fail-stop faults. To make this claim practical without hand-built models, we introduce model discovery: an automated step that can run in CI/CD or as an observability-platform capability, synthesizing an explicit, analyzable model from artifacts teams already have (e.g., distributed traces, service-mesh telemetry, configs/manifests) - providing an accessible gateway for teams to begin resilience testing. As a proof by instance on the DeathStarBench Social Network, we extract the dependency graph from Jaeger and estimate availability across two deployment modes and five failure rates. The discovered model closely tracks live fault-injection results; with replication, median error at mid-range failure rates is near zero, while no-replication shows signed biases consistent with excluded mechanisms. These results create two opportunities: first, to triage and reduce the scope of expensive chaos experiments in advance, and second, to generate real-time signals on the system's resilience posture as its topology evolves, preserving live validation for the most critical or ambiguous scenarios.
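As a rough illustration of what a connectivity-only estimate looks like, the sketch below computes availability from a service-dependency graph (as extractable from traces) plus replica counts: a service is up if at least one replica survives, and a request succeeds only if the entry service and everything it transitively depends on are up. Independent fail-stop replica failures are assumed; the discovered model's actual estimator may differ in detail.

```python
# Hedged sketch of a topological availability estimate under fail-stop faults.
def availability(graph, replicas, p_fail, entry):
    """graph: dict service -> list of downstream dependencies (e.g., mined from Jaeger traces)."""
    # Collect every service the entry point transitively depends on.
    required, stack = set(), [entry]
    while stack:
        svc = stack.pop()
        if svc not in required:
            required.add(svc)
            stack.extend(graph.get(svc, []))
    # P(service up) = 1 - p_fail^replicas; multiply over all required services.
    prob = 1.0
    for svc in required:
        prob *= 1.0 - p_fail ** replicas.get(svc, 1)
    return prob

# Toy example: frontend -> {posts, users}, posts -> {storage}.
graph = {"frontend": ["posts", "users"], "posts": ["storage"], "users": [], "storage": []}
print(availability(graph, {"frontend": 2, "posts": 2, "users": 2, "storage": 3}, 0.1, "frontend"))
```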
Recent research suggests that one-third of the global supply of metals is used to produce machinery and industrial equipment (ME), and that ME production causes 8% of global greenhouse gas emissions. Yet our understanding of how much different types of ME contribute is limited. While the energy needed to operate machines usually enters life cycle assessments, the production of the machines themselves is often neglected, mostly because data is lacking. Here we explore the use of detailed economic input-output data for the United States (USEEIO) to produce cradle-to-gate life cycle inventories for machinery for material handling and metalworking, machine tools, dies, fixtures, and industrial molds. The cradle-to-gate GHG emissions of the investigated machinery were 38 million tonnes CO2e (0.5% of US emissions), compared to 330 Mt for all ME. Materials contributed 46-63% to the carbon footprint of the ME in question, while the production of electricity and fuels used in production processes other than materials production contributed 13-28%. Important uses of ME as capital products were in the manufacturing of vehicles, refining, and the metal industries. Important uses as intermediate inputs were in oil and gas production, mining, as well as manufacturing and commercial structures. This manuscript demonstrates the feasibility of using detailed input-output tables for life cycle inventory modelling of the production and use of ME.
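The underlying calculation in environmentally extended input-output modelling is the Leontief relation: total emissions = f · (I − A)⁻¹ · y, where A is the direct requirements matrix, f the sectoral emission intensities, and y the final demand for the machinery sector. The three-sector example below is purely illustrative (the USEEIO model has several hundred sectors and real coefficients).

```python
# Minimal environmentally extended input-output calculation; all numbers are made up.
import numpy as np

A = np.array([            # direct requirements (input per $ of output)
    [0.10, 0.04, 0.02],   # materials sector
    [0.05, 0.08, 0.03],   # electricity & fuels
    [0.02, 0.01, 0.05],   # machinery & equipment
])
f = np.array([0.9, 1.4, 0.2])       # kg CO2e per $ of sectoral output (illustrative)
y = np.array([0.0, 0.0, 1.0e6])     # $1M of final demand for machinery

L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse: total output per $ of final demand
x = L @ y                           # total (direct + indirect) output by sector
print("cradle-to-gate GHG:", f @ x, "kg CO2e")
```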
Gabriel Trujillo-Hernández, Wendy Flores-Fuentes, Luis Roberto Ramírez-Hernández
et al.
Individuals’ lifestyles are affected by valgus and varus deformities of the rearfoot, which cause pain in the joints and plantar surface due to the misalignment between the tibia and the calcaneus. In orthopedics, medical professionals measure this misalignment using X-ray systems and goniometers. X-rays emit ionizing radiation that can cause damage through cumulative exposure over a lifetime, whereas goniometers are prone to measurement errors. This patent review conducted a technological search of systems and methods across various databases using inclusion and exclusion criteria. The thirty-five patents obtained provide valuable information about mechanical, electronic, and mechatronic technologies and non-ionizing radiation for evaluating valgus and varus deformities. The patents are classified into stationary mechanisms, stationary electronic devices, dynamic mechanisms, dynamic electronic devices, stationary mechatronic devices, and dynamic mechatronic devices. They are further categorized by measurement method as either visual or automatic. Additionally, the patents are grouped by usage mode into sitting, standing, and walking. This patent review aims to provide medical professionals with little-known techniques for measuring and evaluating rearfoot alignment.
Engineering machinery, tools, and implements, Technological innovations. Automation
Julian Frattini, Michael Unterkalmsteiner, Davide Fucci
et al.
Tools constitute an essential contribution to natural language processing for requirements engineering (NLP4RE) research. They are executable instruments that make research usable and applicable in practice. In this chapter, we first introduce a systematic classification of NLP4RE tools to improve the understanding of their types and properties. Then, we extend an existing overview with a systematic summary of 126 NLP4RE tools published between April 2019 and June 2023 to ease the reuse and evolution of existing tools. Finally, we provide instructions on how to create, maintain, and disseminate NLP4RE tools to support their more rigorous management and dissemination.
In the last two decades, several researchers provided snapshots of the "current" state and evolution of empirical research in requirements engineering (RE) through literature reviews. However, these literature reviews were not sustainable, as none built on or updated previous works due to the unavailability of the extracted and analyzed data. KG-EmpiRE is a Knowledge Graph (KG) of empirical research in RE based on scientific data extracted from currently 680 papers published in the IEEE International Requirements Engineering Conference (1994-2022). KG-EmpiRE is maintained in the Open Research Knowledge Graph (ORKG), making all data openly and long-term available according to the FAIR data principles. Our long-term goal is to constantly maintain KG-EmpiRE with the research community to synthesize a comprehensive, up-to-date, and long-term available overview of the state and evolution of empirical research in RE. Besides KG-EmpiRE, we provide its analysis with all supplementary materials in a repository. This repository contains all files with instructions for replicating and (re-)using the analysis locally or via executable environments and for repeating the research approach. Since its first release based on 199 papers (2014-2022), KG-EmpiRE and its analysis have been updated twice, currently covering over 650 papers. KG-EmpiRE and its analysis demonstrate how innovative infrastructures, such as the ORKG, can be leveraged to make data from literature reviews FAIR, openly available, and maintainable for the research community in the long term. In this way, we can enable replicable, (re-)usable, and thus sustainable literature reviews to ensure the quality, reliability, and timeliness of their research results.
Natural Language Processing (NLP) is now a cornerstone of requirements automation. One compelling factor behind the growing adoption of NLP in Requirements Engineering (RE) is the prevalent use of natural language (NL) for specifying requirements in industry. NLP techniques are commonly used for automatically classifying requirements, extracting important information, e.g., domain models and glossary terms, and performing quality assurance tasks, such as ambiguity handling and completeness checking. With so many different NLP solution strategies available and the possibility of applying machine learning alongside, it can be challenging to choose the right strategy for a specific RE task and to evaluate the resulting solution in an empirically rigorous manner. In this chapter, we present guidelines for the selection of NLP techniques as well as for their evaluation in the context of RE. In particular, we discuss how to choose among different strategies such as traditional NLP, feature-based machine learning, and language-model-based methods. Our ultimate hope for this chapter is to serve as a stepping stone, assisting newcomers to NLP4RE in quickly initiating themselves into the NLP technologies most pertinent to the RE field.
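As a concrete illustration of one of the strategies the chapter compares, the sketch below applies feature-based machine learning to a common NLP4RE task: classifying requirements as functional or quality requirements. The tiny labelled set and the TF-IDF plus logistic regression choice are illustrative assumptions, not the chapter's recommended pipeline.

```python
# Feature-based ML baseline for requirement classification; data and model choice are illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

requirements = [
    "The system shall export reports as PDF.",
    "The user shall be able to reset the password via email.",
    "The search results shall be returned within 2 seconds.",
    "The application shall be available 99.9% of the time.",
]
labels = ["functional", "functional", "quality", "quality"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(requirements, labels)
print(clf.predict(["The system shall respond to queries in under one second."]))
```

In practice, such a baseline would be evaluated with cross-validation on a much larger labelled corpus and compared against traditional NLP rules and language-model-based methods, as discussed in the chapter.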
Ben Arie Tanay, Lexy Arinze, Siddhant S. Joshi
et al.
Background: Large Language Models (LLMs) such as ChatGPT and CoPilot are influencing software engineering practice. Software engineering educators must teach future software engineers how to use such tools well. As yet, few studies have reported on the use of LLMs in the classroom. It is, therefore, important to evaluate students' perception of LLMs and possible ways of adapting the computing curriculum to these shifting paradigms. Purpose: The purpose of this study is to explore computing students' experiences and approaches to using LLMs during a semester-long software engineering project. Design/Method: We collected data from a senior-level software engineering course at Purdue University. This course uses a project-based learning (PBL) design. The students used LLMs such as ChatGPT and Copilot in their projects. A sample of these student teams was interviewed to understand (1) how they used LLMs in their projects and (2) whether and how their perspectives on LLMs changed over the course of the semester. We analyzed the data to identify themes related to students' usage patterns and learning outcomes. Results/Discussion: When computing students utilize LLMs within a project, their use cases cover both technical and professional applications. In addition, these students perceive LLMs to be efficient tools for obtaining information and completing tasks. However, there were concerns about using LLMs responsibly without being detrimental to their own learning outcomes. Based on our findings, we recommend that future research investigate the use of LLMs in lower-level computer engineering courses to understand whether and how LLMs can be integrated as a learning aid without hurting learning outcomes.
This study investigates the thermal insulation and moisture management of three types of mountaineering boots during simulated hiking activities under controlled environmental conditions with two elite athletes. Temperature and humidity were measured with six wireless probes placed on the most exposed parts of the foot (hallux, middle toe, little toe, dorsum, ankle, and sole). Thermal images were taken to record the thermal insulation of each sample. Methodologically, the study aims to simulate every movement and activity of alpinism in order to realistically evaluate the conditions of use of this kind of footwear (also taking into account the lacing pressure exerted on the foot). Based on the results obtained, a further step will be to define the best combination of materials by creating a comfort scale for hiking boots.
Textile bleaching, dyeing, printing, etc., Engineering machinery, tools, and implements
In this paper, we propose a simulation method to evaluate the packed state of catalyst pellets in a pipe reactor for gas-phase reactions. The method is based on dynamic explicit FEM (finite element method) and consists of three steps. First, the catalyst pellets fall freely under gravity and flow into a packing hose with a hopper-like inlet. Then, after the catalyst pellets reach the bottom of the pipe reactor, the packing hose is pulled up to create a random packing of the pellets in the pipe reactor. Finally, the bulk density and stress distribution of the randomly packed pellets are calculated, and their packed state is evaluated. Using this method, we simulated the packing of cylindrical catalyst pellets into the pipe reactor and evaluated their packed state. The results clarify that the maximum equivalent stress generated at the bottom of the packed pellets approaches a constant value as the ratio of the packed height of the pellets to the diameter of the pipe reactor increases. In addition, the effects of the pellet shape on the bulk density and average equivalent stress of the pellets in the pipe reactor are elucidated, and an optimal pellet shape that increases the bulk density and reduces the stress is found. The optimally shaped catalyst pellets are expected to improve the packed state in the pipe reactor, extend the durability of the pellets, and increase production efficiency. In this paper, the packed state and durability of the catalyst pellets are evaluated on the basis of their bulk density and stress distribution in the pipe reactor.
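To make the evaluation step concrete, the sketch below shows one plausible post-processing of per-pellet masses and equivalent (von Mises) stresses exported from such an FEM run: the bulk density of the packed bed and basic stress statistics. The variable names and the cylindrical-bed volume formula are assumptions for illustration, not the authors' code.

```python
# Assumed post-processing of FEM output: bulk density and stress statistics of the packed bed.
import numpy as np

def packed_state(pellet_mass_kg, pellet_vm_stress_pa, pipe_diameter_m, packed_height_m):
    bed_volume = np.pi * (pipe_diameter_m / 2) ** 2 * packed_height_m   # cylinder up to packed height
    bulk_density = np.sum(pellet_mass_kg) / bed_volume                  # kg/m^3
    return {
        "bulk_density": bulk_density,
        "max_equivalent_stress": float(np.max(pellet_vm_stress_pa)),
        "mean_equivalent_stress": float(np.mean(pellet_vm_stress_pa)),
        "height_to_diameter": packed_height_m / pipe_diameter_m,
    }
```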
Mechanical engineering and machinery, Engineering machinery, tools, and implements
Martin Valica, Tomáš Lempochner, Linda Machalová
et al.
The aim of this work was to evaluate the possibility of applying the dried biomass of <i>E. gracilis</i> var. <i>bacillaris</i> as a biosorbent for the removal of Cd. Experiments were carried out in batch systems with aqueous solutions labelled with <sup>109</sup>CdCl<sub>2</sub>. From the kinetics of Cd biosorption, it can be assumed that Cd removal was a rapid process that reached the concentration equilibrium of [Cd]<sub>biomass</sub>:[Cd]<sub>solution</sub> within the first minutes of the interaction. In individual experiments, the effects of solution pH, initial biosorbent concentration, and initial Cd concentration were evaluated. According to MINEQL+ speciation modelling, Cd biosorption decreased linearly with the decreasing proportion of Cd<sup>2+</sup> in the solution. The biosorption data were fitted better by the Langmuir adsorption isotherm model than by the Freundlich model. The maximum biosorption capacity of the dried biomass of <i>E. gracilis</i> var. <i>bacillaris</i> for the removal of Cd was predicted to be <i>Q<sub>max</sub></i> = 0.13 mmol/g (14.1 mg/g d.w.).
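For readers unfamiliar with isotherm fitting, the sketch below fits the Langmuir model q = Q<sub>max</sub>·b·C<sub>eq</sub>/(1 + b·C<sub>eq</sub>) to equilibrium data by nonlinear least squares. The concentrations and uptakes shown are made up for illustration; they are not the experimental values behind the reported Q<sub>max</sub>.

```python
# Illustrative Langmuir isotherm fit; data points are placeholders, not the study's measurements.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c_eq, q_max, b):
    return q_max * b * c_eq / (1.0 + b * c_eq)

c_eq = np.array([0.01, 0.05, 0.1, 0.3, 0.6, 1.0])       # mmol/L Cd at equilibrium
q    = np.array([0.02, 0.06, 0.08, 0.11, 0.12, 0.125])  # mmol Cd per g dry biomass

(q_max, b), _ = curve_fit(langmuir, c_eq, q, p0=(0.1, 10.0))
print(f"Qmax = {q_max:.3f} mmol/g, b = {b:.1f} L/mmol")
```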
Fluorescent molecules are versatile nanoscale emitters that enable detailed observations of biophysical processes with nanoscale resolution. Because they are well-approximated as electric dipoles, imaging systems can be designed to visualize their 3D positions and 3D orientations, so-called dipole-spread function (DSF) engineering, for 6D super-resolution single-molecule orientation-localization microscopy (SMOLM). We review fundamental image-formation theory for fluorescent dipoles, as well as how phase and polarization modulation can be used to change the image of a dipole emitter produced by a microscope, called its DSF. We describe several methods for designing these modulations for optimum performance, as well as compare recently developed techniques, including the double-helix, tetrapod, crescent, and DeepSTORM3D learned point-spread functions (PSFs), in addition to the tri-spot, vortex, pixOL, raPol, CHIDO, and MVR DSFs. We also cover common imaging system designs and techniques for implementing engineered DSFs. Finally, we discuss recent biological applications of 6D SMOLM and future challenges for pushing the capabilities and utility of the technology.
Aluminum alloys have been extensively used in aerospace, machinery, transportation, and other industries due to their superior strength, relatively low weight, and outstanding thermal conductivity. High-speed cutting (HSC) and minimum quantity lubrication (MQL) techniques have been widely applied for the machining of aluminum alloys. This paper investigates numerically the influence of various tool geometries on the cutting behavior, in particular the cutting forces and temperature, of aluminum alloys. The effects of the rake angle and the friction coefficient are examined through simulations based on the orthogonal cutting mode. The findings of the numerical analysis may guide the design of cutting tools for high-speed MQL cutting of aluminum alloys.
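The paper's results come from FEM simulations; as a back-of-the-envelope complement, the classical Merchant orthogonal-cutting model below shows how rake angle and friction coefficient shift the predicted cutting force. The shear strength and cut dimensions are illustrative values, not taken from the paper.

```python
# Merchant orthogonal-cutting estimate of cutting force vs. rake angle and friction coefficient.
import math

def merchant_cutting_force(rake_deg, mu, shear_strength_mpa=150.0,
                           uncut_chip_mm=0.1, width_mm=2.0):
    alpha = math.radians(rake_deg)
    beta = math.atan(mu)                         # friction angle
    phi = math.pi / 4 + alpha / 2 - beta / 2     # Merchant shear-angle relation
    shear_area = uncut_chip_mm * width_mm / math.sin(phi)        # mm^2
    f_shear = shear_strength_mpa * shear_area                    # N (MPa * mm^2)
    resultant = f_shear / math.cos(phi + beta - alpha)
    return resultant * math.cos(beta - alpha)    # cutting force Fc in N

for rake in (0, 10, 20):
    print(rake, round(merchant_cutting_force(rake, mu=0.3), 1))
# Larger rake angles and lower friction coefficients reduce the predicted cutting force.
```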
This article reviews advances in the additive manufacturing of magnetic ceramics and alloys without rare-earth elements. Near-net-shape permanent magnets of varying shapes and dimensions overcome traditional limitations of cast, sintered, and bonded magnets. The published articles are categorized by material type and 3D printing technique. Selective laser melting and electron beam melting were predominantly used to produce alnico magnets. In addition to electron beam melting, manganese-aluminium-based alloys were successfully printed by fused filament fabrication. By incorporating magnetic powders into polymers and then printing via extrusion, fused filament fabrication was also used to produce strontium ferrite magnets. Moreover, hard ferrites were printed by stereolithography and extrusion free-forming, without drawing composites into filaments. The magnetic properties are in some cases comparable to those of conventional magnets with the same compositions. Currently available software packages can simulate magnetic fields for designing magnets and optimizing their integration into electrical machines. These developments open up opportunities for next-generation permanent magnet applications.
Engineering machinery, tools, and implements, Technological innovations. Automation
The natural environment is of the utmost significance not only for a particular location but also for the entire world, because it provides essential environmental services to the human population. However, the environment is being negatively impacted by human activity and population growth, and the most significant impact is felt in the national capital region. Using the Google Earth Engine (GEE) cloud platform and the QGIS desktop, this research analyzed the land use and land cover (LULC) transformations that have taken place in the Sonipat district of India over the past ten years (2011–2021). Change detection (CD) of an LULC map is a method that examines shifts in LULC over time. Landsat 7 and Sentinel 2 satellite image collections were utilized in this study. The study area was divided into four LULC categories using a maximum likelihood classification approach to quantify the changes over the aforementioned period. The results indicated that between 2011 and 2021, cropland in the study area decreased by about 11%, while built-up and urban areas increased by 3%. This study will help decision-makers make choices that are appropriate to the local situation. The findings emphasize the value of satellite monitoring in reducing the rate of environmental degradation in the Sonipat district.
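Once the 2011 and 2021 maps have been classified (in GEE or QGIS), change detection reduces to cross-tabulating pixel labels into a transition matrix. The class codes and tiny arrays below are placeholders standing in for the exported classified rasters.

```python
# Cross-tabulation of two classified LULC rasters; inputs are illustrative stand-ins.
import numpy as np

CLASSES = ["cropland", "built-up", "water", "vegetation"]

def transition_matrix(lulc_old, lulc_new, n_classes=len(CLASSES)):
    """Rows: old class, columns: new class, values: pixel counts."""
    idx = lulc_old.ravel() * n_classes + lulc_new.ravel()
    return np.bincount(idx, minlength=n_classes ** 2).reshape(n_classes, n_classes)

lulc_2011 = np.array([[0, 0, 1], [0, 3, 3], [2, 0, 0]])
lulc_2021 = np.array([[0, 1, 1], [1, 3, 3], [2, 1, 0]])
m = transition_matrix(lulc_2011, lulc_2021)
print(m)                         # e.g., m[0, 1] = cropland pixels converted to built-up
print("cropland change (%):", 100 * (m[:, 0].sum() - m[0, :].sum()) / lulc_2011.size)
```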
Finding meaningful concepts in engineering application datasets that allow for a sensible grouping of designs is very helpful in many contexts. It allows different groups of designs with similar properties to be determined and provides useful knowledge for the engineering decision-making process. It also opens the route to further refinement of specific design candidates that exhibit certain characteristic features. In this work, an approach to defining meaningful and consistent concepts in an existing engineering dataset is presented. The designs in the dataset are characterized by a multitude of features, such as design parameters, geometrical properties, or performance values of the design under various boundary conditions. In the proposed approach, the complete feature set is partitioned into several subsets called description spaces. The definition of the concepts respects this partitioning, which leads to several desired properties of the identified concepts that cannot be achieved with state-of-the-art clustering or concept identification approaches. A novel concept quality measure is proposed, which provides an objective value for a given definition of concepts in a dataset. The usefulness of the measure is demonstrated on a realistic engineering dataset consisting of about 2500 airfoil profiles, for which the performance values (lift and drag) for three different operating conditions were obtained by computational fluid dynamics simulation. A numerical optimization procedure is employed that maximizes the concept quality measure and finds meaningful concepts for different setups of the description spaces, while also incorporating user preferences. It is demonstrated how these concepts can be used to select archetypal representatives of the dataset that exhibit the characteristic features of each concept.
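The paper defines its own concept quality measure; the proxy below merely illustrates the underlying idea that a good concept assignment should look consistent in every description space (e.g., design parameters versus performance values). Here, quality is taken as the mean adjusted Rand index between candidate concept labels and a clustering computed independently in each description space, which is an assumption for illustration, not the published measure.

```python
# Hypothetical proxy for a per-description-space concept consistency score.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

def concept_quality(description_spaces, concept_labels, n_concepts):
    """description_spaces: list of (n_designs, n_features) arrays, one per description space."""
    scores = []
    for X in description_spaces:
        per_space = KMeans(n_clusters=n_concepts, n_init=10, random_state=0).fit_predict(X)
        scores.append(adjusted_rand_score(concept_labels, per_space))
    return float(np.mean(scores))

# Toy usage: two description spaces for 6 designs and a candidate 2-concept assignment.
geometry = np.array([[0.1, 1.0], [0.2, 1.1], [0.15, 0.9], [0.8, 3.0], [0.9, 3.2], [0.85, 2.9]])
performance = np.array([[10.0], [11.0], [10.5], [30.0], [31.0], [29.0]])
labels = np.array([0, 0, 0, 1, 1, 1])
print(concept_quality([geometry, performance], labels, n_concepts=2))
```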