Yang Liu
Results for "Electrical engineering. Electronics. Nuclear engineering"
Showing 20 of ~8,455,581 results · from CrossRef, arXiv
Alejandro Pradas-Gomez, Arindam Brahma, Ola Isaksson
Engineering analysis automation in product development relies on rigid interfaces between tools, data formats and documented processes. When these interfaces change, as they routinely do while the product evolves in the engineering ecosystem, the automation support breaks. This paper presents DUCTILE (Delegated, User-supervised Coordination of Tool- and document-Integrated LLM-Enabled) agentic orchestration, an approach for developing, executing and evaluating LLM-based agentic automation support for engineering analysis tasks. The approach separates adaptive orchestration, performed by the LLM agent, from deterministic execution, performed by verified engineering tools. The agent interprets documented design practices, inspects input data and adapts the processing path, while the engineer supervises and exercises final judgment. DUCTILE is demonstrated on an industrial structural analysis task at an aerospace manufacturer, where the agent handled input deviations in format, units, naming conventions and methodology that would break traditional scripted pipelines. Evaluation against expert-defined acceptance criteria and deployment with practicing engineers confirm that the approach produces correct, methodologically compliant results across 10 repeated independent runs. The paper discusses the paradigm shift and the practical consequences of adopting agentic automation, including unintended effects on the nature of engineering work when mundane tasks are removed and an exhausting supervisory role is created.
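The separation the abstract describes, adaptive orchestration on top of deterministic, verified execution, can be sketched in miniature. This is purely illustrative: the function names, the load-case format and the rule-based "agent" stand in for the paper's LLM layer and are our assumptions, not DUCTILE's actual implementation.

```python
# Hypothetical sketch (names are illustrative, not from the paper): an adaptive
# layer inspects the input and chooses a processing path, while the numeric
# work stays inside deterministic, verified tool functions.

def parse_load_case(raw: str) -> dict:
    """Deterministic tool: parse a 'name = value unit' load-case line."""
    name, rest = raw.split("=")
    value, unit = rest.split()
    return {"name": name.strip(), "value": float(value), "unit": unit}

def to_newtons(case: dict) -> dict:
    """Deterministic tool: normalize force units to newtons."""
    factors = {"N": 1.0, "kN": 1e3, "lbf": 4.4482216}
    return {**case, "value": case["value"] * factors[case["unit"]], "unit": "N"}

def orchestrate(raw: str) -> dict:
    """Adaptive layer (an LLM agent in the paper; a rule stub here): detect an
    input deviation and adapt the processing path accordingly."""
    case = parse_load_case(raw)
    if case["unit"] != "N":   # deviation detected -> insert a conversion step
        case = to_newtons(case)
    return case

print(orchestrate("axial_load = 2.5 kN"))   # routed through unit conversion
```

In the paper the routing decision is made by the agent from documented practice rather than a hard-coded rule; the point of the sketch is only that the adaptive choice and the verified computation live in separate layers.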
Weixing Zhang, Mario Herb, Martin Armbruster et al.
Despite Domain-Driven Design's proven value in managing complex business logic, a fundamental semantic expressiveness gap persists between generic modeling languages and tactical DDD patterns, causing continuous divergence between design intent and implementation. We envision a constraint-based tactical modeling environment that transforms abstract architectural principles into explicit, tool-enforced engineering constraints. At its core is a DDD-native metamodel where tactical patterns are first-class modeling primitives, coupled with a real-time constraint verification engine that prevents architectural violations during modeling, and bidirectional synchronization mechanisms that maintain model-code consistency through round-trip engineering. This approach aims to democratize tactical DDD by embedding expert-level architectural knowledge directly into modeling constraints, enabling small teams and junior developers to build complex business systems without sacrificing long-term maintainability. By lowering the technical barriers to DDD adoption, we envision transforming tactical DDD from an elite practice requiring continuous expert oversight into an accessible engineering discipline with tool-supported verification.
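A constraint engine over tactical DDD patterns could look roughly like the following. The metamodel fragment and rule set here are our own illustration of the idea (patterns as first-class primitives with machine-checkable constraints), not the envisioned tool's API.

```python
from dataclasses import dataclass
from typing import List

# Illustrative metamodel fragment (names are assumptions, not the paper's):
# tactical DDD stereotypes as first-class elements with checkable rules.

@dataclass
class Element:
    name: str
    stereotype: str            # "Entity", "ValueObject", "AggregateRoot"
    has_identity: bool = False
    mutable: bool = True

def verify(elements: List[Element]) -> List[str]:
    """Constraint engine (sketch): flag tactical-pattern violations
    as they would be reported during modeling."""
    violations = []
    for e in elements:
        if e.stereotype == "Entity" and not e.has_identity:
            violations.append(f"{e.name}: Entity must carry an identity")
        if e.stereotype == "ValueObject" and e.mutable:
            violations.append(f"{e.name}: ValueObject must be immutable")
    return violations

model = [Element("Order", "Entity", has_identity=True),
         Element("Money", "ValueObject", mutable=True)]
print(verify(model))   # -> ['Money: ValueObject must be immutable']
```

The envisioned environment would run such checks in real time inside the modeler and pair them with round-trip synchronization to code; the sketch only shows the constraint-as-data idea.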
Breno Bernard Nicolau de França, Dietmar Pfahl, Valdemar Vicente Graciano Neto et al.
The chapter supports educators and postgraduate students in understanding the role of simulation in software engineering research, based on the authors' experience. To this end, it includes background positioning simulation-based studies in software engineering research, proposes learning objectives for teaching simulation as a research method, and presents our experience of teaching simulation concepts and practice. For educators, it further provides learning objectives for teaching simulation, considering the current state of the art in software engineering research, together with the guidance and recommended learning activities needed to achieve these objectives. For students, it charts a learning path for those who are interested in this method but have had no opportunity to engage in an entire course on simulation in the context of empirical research.
Pragya Verma, Marcos Vinicius Cruz, Grischa Liebel
Neurodiversity describes variation in brain function among people, including common conditions such as Autism spectrum disorder (ASD), Attention deficit hyperactivity disorder (ADHD), and dyslexia. While Software Engineering (SE) literature has started to explore the experiences of neurodivergent software engineers, there is a lack of research that compares their challenges to those of neurotypical software engineers. To address this gap, we analyze existing data from the 2022 Stack Overflow Developer survey that collected data on neurodiversity. We quantitatively compare the answers of professional engineers with ASD (n=374), ADHD (n=1305), and dyslexia (n=363) with neurotypical engineers. Our findings indicate that neurodivergent engineers face more difficulties than neurotypical engineers. Specifically, engineers with ADHD report that they face more interruptions caused by waiting for answers, and that they less frequently interact with individuals outside their team. This study provides a baseline for future research comparing neurodivergent engineers with neurotypical ones. Several factors in the Stack Overflow survey and in our analysis are likely to lead to conservative estimates of the actual effects between neurodivergent and neurotypical engineers, e.g., the effects of the COVID-19 pandemic and our focus on employed professionals.
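Comparisons of survey answers between groups like these typically reduce to tests on proportions. The sketch below shows a standard two-proportion z-test with illustrative numbers only (60% of a 1305-person group vs 50% of a 5000-person group reporting frequent interruptions); the figures are not taken from the Stack Overflow survey or the paper's analysis.

```python
import math

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """z statistic for comparing two independent proportions
    (pooled standard error)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Illustrative counts only, not survey data: compare 783/1305 vs 2500/5000.
z = two_proportion_z(783, 1305, 2500, 5000)
print(z)   # |z| > 1.96 would indicate a difference at the 5% level
```

The paper's own analysis may use different tests and corrections; the point is only that group-difference claims of this kind rest on proportion comparisons with explicit sample sizes.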
Andreas Vogelsang
Generative LLMs, such as GPT, have the potential to revolutionize Requirements Engineering (RE) by automating tasks in new ways. This column explores the novelties and introduces the importance of precise prompts for effective interactions. Human evaluation and prompt engineering are essential in leveraging LLM capabilities.
Jefferson Seide Molleri, Kai Petersen
In the dynamic field of Software Engineering (SE), where practice constantly evolves and adapts to new technologies, conducting research is a daunting quest. This poses a challenge for researchers: how can they stay relevant and effective in their studies? Empirical Software Engineering (ESE) has emerged as a contending force aiming to critically evaluate and provide knowledge that informs practice in adopting new technologies. Empirical research requires a rigorous process of collecting and analyzing data to obtain evidence-based findings. Challenges to this process are numerous, and many researchers, novice and experienced alike, encounter difficulties due to the many complexities involved in designing their research. The core of this chapter is to teach foundational skills in research design, essential for educating software engineers and researchers in ESE. It focuses on developing a well-structured research design, which includes defining a clear area of investigation, formulating relevant research questions, and choosing appropriate methodologies. While the primary focus is on research design, the chapter also covers aspects of research scoping and selecting research methods. This approach prepares students to handle the complexities of the ever-changing technological landscape in SE, making it a critical component of their educational curriculum.
Cynthia C. S. Liem, Andrew M. Demetriou
So far, the relationship between open science and software engineering expertise has largely focused on the open release of software engineering research insights and reproducible artifacts, in the form of open-access papers, open data, and open-source tools and libraries. In this position paper, we draw attention to another perspective: scientific insight itself is a complex and collaborative artifact under continuous development and in need of continuous quality assurance, and as such, has many parallels to software artifacts. Considering current calls for more open, collaborative and reproducible science; increasing demands for public accountability on matters of scientific integrity and credibility; methodological challenges coming with transdisciplinary science; political and communication tensions when scientific insight on societally relevant topics is to be translated to policy; and struggles to incentivize and reward academics who truly want to move into these directions beyond traditional publishing habits and cultures, we make the parallels between the emerging open science requirements and concepts already well-known in (open-source) software engineering research more explicit. We argue that the societal impact of software engineering expertise can reach far beyond the software engineering research community, and call upon the community members to pro-actively help driving the necessary systems and cultural changes towards more open and accountable research.
Ana-Maria Comeagă, Iuliana Marin
The rise of the Internet has brought about significant changes in our lives, and the rapid expansion of the Internet of Things (IoT) is poised to have an even more substantial impact by connecting a wide range of devices across various application domains. IoT devices, especially low-end ones, are constrained by limited memory and processing capabilities, necessitating efficient memory management within IoT operating systems. This paper delves into the importance of memory management in IoT systems, with a primary focus on the design and configuration of such systems, as well as the scalability and performance of scene management. Effective memory management is critical for optimizing resource usage, responsiveness, and adaptability as the IoT ecosystem continues to grow. The study offers insights into memory allocation, scene execution, memory reduction, and system scalability within the context of an IoT system, ultimately highlighting the vital role that memory management plays in facilitating a seamless and efficient IoT experience.
Danilo Monteiro Ribeiro, Alberto Souza, Victor Santiago et al.
In several areas of knowledge, self-efficacy is related to the performance of individuals, including in Software Engineering. However, it is not clear how self-efficacy can be modified by training conducted in industry. Furthermore, we still do not understand how self-efficacy can impact an individual's team and career in the industry. This lack of understanding can negatively affect how companies and individuals perceive the importance of self-efficacy in the field. Therefore, we present a research proposal that aims to understand the relationship between self-efficacy and training in Software Engineering. Moreover, we seek to understand the role of self-efficacy in the software development industry. We propose a longitudinal case study with software engineers at Zup Innovation who participate in our bootcamp training. We expect to collect data to support our assumptions that self-efficacy is related to training in Software Engineering. The other assumption is that self-efficacy at the beginning of training is higher than in the middle, and that self-efficacy at the end of training is higher than in the middle. We expect that the study proposed in this article will motivate a discussion about self-efficacy and the importance of training employees in the software development industry.
Sabah Al-Fedaghi
According to some algorithmicists, algorithmics traditionally uses algorithm theory, which stems from mathematics. The growing need for innovative algorithms has caused increasing gaps between theory and practice. Originally, this motivated the development of algorithm engineering, which is viewed as experimental techniques related to software engineering. Currently, algorithm engineering is a methodology for algorithmic research that combines theory with implementation and experimentation in order to produce better algorithms with high practical impact. Still, researchers have questioned whether the notion of algorithms can be defined in a fully general way and have discussed what kinds of entities algorithms actually are. They have also struggled to maintain a view that formulates algorithms mathematically (e.g., Turing machines and finite-state machines [FSMs]) while adopting a more applied view. Addressing, in particular, the question of how algorithms apply practically in software specifications, this paper proposes a diagrammatic definition of an algorithm based on a new modeling machine called a thinging machine (TM). The machine has five actions (create, process, release, transfer, and receive) that can form a network of machines. The paper explores the application of the definition to Turing machines and FSMs. The results point to the fact that the proposed definition can serve as a middle-ground representation of algorithms, one that sits between formal specification and the commonly used informal definition (e.g., a set of instructions).
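The five TM actions named in the abstract can be made concrete as a tiny executable network. The class and method names below are our illustration of the idea (things flowing through machines via the five actions), not the paper's diagrammatic notation.

```python
# Hedged sketch of the five TM actions (create, process, release, transfer,
# receive) as methods on a machine; machines connect into a network by
# transferring things to each other.

class ThingingMachine:
    def __init__(self, name):
        self.name = name
        self.log = []          # record of actions, in order

    def create(self, thing):
        self.log.append(("create", thing)); return thing

    def process(self, thing):
        self.log.append(("process", thing)); return f"processed({thing})"

    def release(self, thing):
        self.log.append(("release", thing)); return thing

    def transfer(self, thing, target):
        self.log.append(("transfer", thing))
        return target.receive(thing)

    def receive(self, thing):
        self.log.append(("receive", thing)); return thing

# An "algorithm" as a flow of a thing through a network of two machines:
src, dst = ThingingMachine("src"), ThingingMachine("dst")
out = src.transfer(src.release(src.process(src.create("x"))), dst)
print(out)                       # -> processed(x)
print([a for a, _ in src.log])   # -> ['create', 'process', 'release', 'transfer']
```

In the paper the actions are drawn as a diagram rather than executed; the sketch only shows why the five actions suffice to express a processing flow between machines.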
Thierry Lecomte
Industrial applications involving formal methods are still exceptions to the general rule. Lack of understanding, employees without proper education, difficulty integrating with existing development cycles, no explicit requirement from the market, etc., are explanations often heard for not being more formal. This article reports experience with a game changer that is going to seamlessly integrate formal methods into safety-critical systems engineering.
Bao-Xi Sun
The collective excitation of nuclear matter is analyzed in a bosonized Landau Fermi liquid model. When the nonlinear self-interacting terms of scalar mesons are included in the Walecka model, the collective excitation energy of nuclear matter can be obtained self-consistently, and the calculated results are consistent with the corresponding experimental data for the nucleus ${}^{208}$Pb when the quantum number of the orientation of the total spin, $m$, is zero. The cases with nonzero $m$ values are also studied, and it is found that the collective excitation energy of nuclear matter decreases as the absolute value of $m$ increases when the total spin is conserved. Moreover, four kinds of collective excitation modes of nuclear matter are discussed when the isospin and spin of nucleons are taken into account. The direct interaction between two nucleons near the Fermi surface only changes the effective nucleon mass and Fermi velocity, while the exchange interaction plays a critical role in the collective excitation of nuclear matter.
Amit Kumar Mishra
Innovation and entrepreneurship have a very special role to play in creating sustainable development in the world, and engineering design plays a major role in innovation. These are not new facts. What is new is that knowledge now seems to grow at an exponential rate, doubling every few months. This creates a need for newer methods of innovating that leave very little scope to fall short of customers' expectations. For reliable design, system design tools and methodologies have been very helpful and have been in use in most engineering industries for decades. But traditional system design is rigorous and rigid, whereas we need an innovation system that is rigorous and flexible at the same time. We take our inspiration from the biosphere, where some of the most rugged yet flexible plants are creepers that grow to create a mesh. In this thematic paper we explain our approach to system engineering, which we call MeMo (Mesh Model), fusing the rigor of system engineering with the flexibility of agile methods to create a scheme that can give rise to reliable innovation in today's high-risk market.
H. Togashi, K. Nakazato, Y. Takehara et al.
A new table of the nuclear equation of state (EOS) based on realistic nuclear potentials is constructed for core-collapse supernova numerical simulations. Adopting the EOS of uniform nuclear matter constructed by two of the present authors with the cluster variational method starting from the Argonne v18 and Urbana IX nuclear potentials, the Thomas-Fermi calculation is performed to obtain the minimized free energy of a Wigner-Seitz cell in non-uniform nuclear matter. As a preparation for the Thomas-Fermi calculation, the EOS of uniform nuclear matter is modified so as to remove the effects of deuteron cluster formation in uniform matter at low densities. Mixing of alpha particles is also taken into account following the procedure used by Shen et al. (1998, 2011). The critical densities with respect to the phase transition from non-uniform to uniform phase with the present EOS are slightly higher than those with the Shen EOS at small proton fractions. The critical temperature with respect to the liquid-gas phase transition decreases with the proton fraction in a more gradual manner than in the Shen EOS. Furthermore, the mass and proton numbers of nuclides appearing in non-uniform nuclear matter with small proton fractions are larger than those of the Shen EOS. These results are consequences of the fact that the density derivative coefficient of the symmetry energy of our EOS is smaller than that of the Shen EOS.
Markus Kortelainen
Parameters of nuclear density functional theory (DFT) models are usually adjusted to experimental data. As a result they carry a certain theoretical error which, as a consequence, propagates to the predicted quantities. In this work we address the propagation of theoretical error, within nuclear DFT models, from the model parameters to the predicted observables. In particular, the focus is set on the Skyrme energy density functional models.
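The propagation the abstract refers to is commonly done to linear order: given the covariance matrix $C_{ij}$ of the fitted parameters $p_i$ and an observable $A(\mathbf{p})$, the induced statistical uncertainty is the standard quadratic form below. This is the generic scheme; the paper's specific treatment of the Skyrme functionals may differ in detail.

```latex
% Linear (first-order) propagation of parameter uncertainty to an observable A:
\sigma_A^2 \simeq \sum_{i,j}
  \frac{\partial A}{\partial p_i}\, C_{ij}\, \frac{\partial A}{\partial p_j}
```

The derivatives $\partial A/\partial p_i$ are evaluated at the optimal parameter set, so correlated parameters (off-diagonal $C_{ij}$) can either inflate or cancel the propagated error.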
Andrew M. Connor, Jim Buchan, Krassie Petrova
In this paper, we introduce the concept of the research-practice gap as it is perceived in the field of software requirements engineering. An analysis of this gap has identified two key causes: lack of effective communication and the relatively light coverage of requirements engineering material in university programmes. We discuss the design and delivery of a Masters course in Software Requirements Engineering (SRE) that is designed to overcome some of the issues that have caused the research-practice gap. By encouraging students to share their experiences in a peer learning environment, we aim to improve shared understanding between students (many of whom are current industry practitioners) and researchers (including academic staff members) to improve the potential for effective collaborations, whilst simultaneously developing the requirements engineering skill sets of the enrolled students. Feedback from students in the course is discussed and directions for the future development of the curriculum and learning strategies are given.
Thilo Breitsprecher, Mihai Codescu, Constantin Jucovschi et al.
The engineering design process follows a series of standardized stages of development, which have many aspects in common with software engineering. Among these stages, the principle solution can be regarded as an analogue of the design specification, fixing as it does the way the final product works. It is usually constructed as an abstract sketch (hand-drawn or constructed with a CAD system) where the functional parts of the product are identified, and geometric and topological constraints are formulated. Here, we outline a semantic approach where the principle solution is annotated with ontological assertions, thus making the intended requirements explicit and available for further machine processing; this includes the automated detection of design errors in the final CAD model, making additional use of a background ontology of engineering knowledge. We embed this approach into a document-oriented design workflow, in which the background ontology and semantic annotations in the documents are exploited to trace parts and requirements through the design process and across different applications.
B. Diertens
In previous work we described how the process-algebra-based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as the implementation model. We also described this software development process more formally by presenting the tools we use in this process in a CASE setting, leading to the PSF-ToolBus software engineering environment. In this article we summarize that work and describe a similar software development process for the implementation of software systems using a client/server model, presenting it in a CASE setting as well.
P. Danielewicz
Extraction of bulk nuclear properties by comparing reaction observables to results from semiclassical transport-model simulations is discussed. Specific properties include the nuclear viscosity, incompressibility and constraints on the nuclear pressure at supranormal densities.
Page 57 of 422,780