Conspectus
Covalent organic frameworks (COFs) are an emerging class of crystalline porous polymers that have received tremendous attention and research interest. COFs can be classified into two-dimensional (2D) and three-dimensional (3D) analogues. Resembling the architecture of porous graphene, 2D conjugated COFs have shown promise in many fields, such as gas storage and separation, heterogeneous catalysis, sensing, photocatalysis, environmental remediation, drug delivery, and energy storage and conversion. However, efficient structural design for high-throughput production of crystalline 2D COFs remains challenging. In this Account, we summarize our recent contributions to the design, synthesis, and application exploration of 2D conjugated COFs. First, we proposed an efficient "two-in-one" strategy for the facile synthesis of 2D imine COFs with good reproducibility and solvent adaptability. Thanks to this molecular design strategy, we could easily modulate the topology of COFs and fabricate COF films. In addition, we developed two approaches to stabilize 2D conjugated COFs, using planar building blocks and donor-acceptor structures. We also proposed a skeleton engineering strategy to design COFs as electrode materials, through which redox-active ortho-quinone moieties were incorporated stepwise into the skeletons of isostructural 2D imine-linked COFs. This strategy enabled systematic investigation of a series of 2D conjugated COFs with analogous structures but different numbers of active sites for energy storage, providing a good platform to unveil the underlying structure-property relationships. More recently, we developed a new kind of arylamine-linked 2D conjugated COF. The electroactive diphenylamine linkages endowed these 2D conjugated COFs with extended conjugation and improved stability, which also conferred excellent pseudocapacitive energy storage performance.
Moreover, tailor-made sulfur-rich COFs, synthesized by selective introduction of polysulfide or sulfonyl groups onto the COF skeletons, were developed for Li storage and proton conduction. Finally, the key challenges facing 2D conjugated COFs on the way toward practical applications, together with their future prospects, are discussed. We hope that this Account will inspire new ideas and innovative work in the field of 2D conjugated COFs in the near future, especially in burgeoning and interdisciplinary research areas.
The 2009 earthquake in Padang, Indonesia, caused significant damage to heritage assets, posing challenges to preserving their authenticity during recovery. This study examines the implementation of the Nara Protocol's six authenticity attributes (traditions and techniques, form and design, use and function, materials and substance, spirit and feeling, and location and setting) in participatory, community-led heritage recovery efforts. The objective is to assess how the recovery process aligns with the Nara Protocol's principles of contextual authenticity while addressing local cultural values. A mixed-methods approach was employed, including field surveys of damaged heritage structures, stakeholder interviews, and document analysis of pre- and post-disaster records. The results indicate varied implementation of the six authenticity attributes, leading to alterations in authenticity. Religious and community buildings show more changes in materials and techniques owing to the availability of present-day materials and the build-back-better principle, which aims to provide safer buildings and environments for future earthquakes. The lack of sufficient documentation challenges post-disaster restoration efforts, resulting in changes in form and design. Advocacy from experts, universities, ancestors, and donors supports these conservation efforts. This study also highlights that community involvement is crucial to ensuring the authenticity of spirit and feeling post-recovery.
Architecture, Architectural engineering. Structural engineering of buildings
This study explores the potential of using information flow as a spatial generator in urban architecture through an algorithmic approach that emphasises relational thinking and adaptability in shaping the built environment. Bridging computational design and ecological paradigms, it reframes architecture as a medium for information exchange in the second machine age. The proposed method employs algorithmic processes (Cellular Automata and Swarm Intelligence) to generate dynamic spatial systems that address the logistical complexities of urban contexts. Applied to real urban sites, the CA-SI workflow reveals aggregation fields, directional gradients, and distributed trajectories that converge into five recurrent network topologies: annular, co-linear, co-vertex, diffused plane, and intersectional plane. These swarm networks prioritise relationships and interactions over isolated forms and foster the self-organising and metabolist nature of urban networks. The findings indicate that information-driven spatial tendencies interact with and reorganise existing material conditions, resulting in a co-constituted physical–digital field understood as multispace. This interplay highlights both the generative potential and the interpretive limits of computational abstraction. By integrating modular and scalable components that dynamically respond to spatial and informational demands, the study offers a morphogenetic approach to urban architecture and positions information as a driver of adaptive and interconnected urban ecosystems.
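The aggregation-oriented cellular-automaton stage described above can be illustrated with a minimal sketch. The grid size, seed density, and neighbour threshold below are illustrative assumptions, not the authors' actual CA-SI workflow:

```python
import random

def step(grid):
    """One CA step on a toroidal grid: an empty cell becomes occupied
    when at least three of its eight neighbours are occupied
    (a simple aggregation rule); occupied cells persist."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            occ = sum(grid[(i + di) % n][(j + dj) % n]
                      for di in (-1, 0, 1) for dj in (-1, 0, 1)
                      if (di, dj) != (0, 0))
            if occ >= 3:
                new[i][j] = 1
    return new

# Seed a sparse random field and let aggregation clusters emerge.
random.seed(0)
n = 20
grid = [[1 if random.random() < 0.2 else 0 for _ in range(n)]
        for _ in range(n)]
for _ in range(5):
    grid = step(grid)
density = sum(map(sum, grid)) / n**2
print(density)
```

Iterating such a rule produces the kind of "aggregation fields" over which a swarm stage could then trace directional gradients and trajectories.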
Architecture, Architectural engineering. Structural engineering of buildings
With the rapid rise of AI coding agents, the fundamental premise of what it means to be a software engineer is in question. In this vision paper, we re-examine what it means for an AI agent to be considered a software engineer and then critically think about what makes such an agent trustworthy. Grounded in established definitions of software engineering (SE) and informed by recent research on agentic AI systems, we conceptualise AI software engineers as participants in human-AI SE teams composed of human software engineers and AI models and tools, and we distinguish trustworthiness as a key property of these systems and actors rather than a subjective human attitude. Based on historical perspectives and emerging visions, we identify key dimensions that contribute to the trustworthiness of AI software engineers, spanning technical quality, transparency and accountability, epistemic humility, and societal and ethical alignment. We further discuss how trustworthiness can be evaluated and demonstrated, highlighting a fundamental trust measurement gap: not everything that matters for trust can be easily measured. Finally, we outline implications for the design, evaluation, and governance of AI SE systems, advocating for an ethics-by-design approach to enable appropriate trust in future human-AI SE teams.
Juan M. Murillo, Ignacio García Rodríguez de Guzmán, Enrique Moguel
et al.
The first edition of the QuantumX track, held within the XXIX Jornadas de Ingeniería del Software y Bases de Datos (JISBD 2025), brought together leading Spanish research groups working at the intersection of Quantum Computing and Software Engineering. The event served as a pioneering forum to explore how principles of software quality, governance, testing, orchestration, and abstraction can be adapted to the quantum paradigm. The presented works spanned diverse areas (from quantum service engineering and hybrid architectures to quality models, circuit optimization, and quantum machine learning), reflecting the interdisciplinary nature and growing maturity of Quantum Computing and Quantum Software Engineering. The track also fostered community building and collaboration through the presentation of national and Ibero-American research networks such as RIPAISC and QSpain, and through dedicated networking sessions that encouraged joint initiatives. Beyond reporting on the event, this article provides a structured synthesis of the contributions presented at QuantumX, identifies common research themes and engineering concerns, and outlines a set of open challenges and future directions for the advancement of Quantum Software Engineering. This first QuantumX track established the foundation for a sustained research community and positioned Spain as an emerging contributor to the European and global quantum software ecosystem.
Richard O’Hegarty, Aislinn McCarthy, Jack O’Hagan
et al.
This study assesses the embodied carbon credentials of modern methods of construction (MMC) by conducting a critical literature review and synthesis of the findings. While several studies have reviewed the broader impacts of MMC, no other study to date has comprehensively reviewed the embodied carbon credentials of this construction typology. Since MMC is not an internationally recognised term, the assessment is inclusive of other terminology used in different parts of the world – e.g. prefabrication, off-site construction and industrialised construction. The study captures 250 separate studies and distils these to a final sample set of 41 studies and a total of 82 case study comparisons. Although a general perception exists that the adoption of MMC results in embodied carbon savings, the evidence to support this claim is not robust. The results from individual case studies range significantly in both direction and magnitude, and, in the absence of a critical review, considerably different conclusions can be drawn. Upon critique and synthesis of the published studies, it is found that the adoption of MMC has no significant positive, or negative, impact on the embodied carbon of a building. Policy relevance: MMC have been widely cited as the answer to housing shortages and productivity issues in the construction industry more broadly. They have subsequently attracted political attention and implementation in many regions. Embodied carbon is another topic of continued debate in built environment policy. There is a somewhat hopeful assumption that the adoption of MMC will reduce embodied carbon. But, to date, the evidence to arrive at that assumption has been inconsistent. The literature that compares MMC with traditional construction varies considerably. It is found that there is no broad link between MMC and reduced embodied carbon. Reducing the embodied carbon of buildings requires assessment on a case-by-case basis.
Architectural engineering. Structural engineering of buildings
The reconstruction, management, and optimization of gas pipelines is of significant importance for solving modern engineering problems. This paper presents innovative methodologies aimed at the effective reconstruction of gas pipelines under unstable conditions. The research encompasses the application of machine learning and optimization algorithms, targeting the enhancement of system reliability and the optimization of interventions during emergencies. By comparing the performance of various algorithms, the study presents engineering solutions aimed at addressing the challenges of real-world applications. Consequently, this work contributes to the advancement of cutting-edge approaches in the field of engineering and opens new perspectives for future research. A highly reliable and efficient technological scheme has been proposed for managing emergency processes in gas transportation based on the principles of the reconstruction phase. For complex gas pipeline systems, new approaches have been investigated for the modernization of existing control-process monitoring systems. These approaches build on modern achievements in control theory and information technology and are aimed at selecting emergency and technological operating modes. One of the pressing issues is to develop a method for minimizing the transmission time of measured and controlled data on non-stationary flow parameters of gas networks to dispatcher control centers. The reporting schemes obtained for creating a reliable information base for dispatcher centers, using modern methods to efficiently manage the gas-dynamic processes of non-stationary modes, are therefore of particular importance.
Gopi Krishnan Rajbahadur, Keheliya Gallaba, Elyas Rashno
et al.
Modern software engineering increasingly relies on open, community-driven standards, yet how such standards are created in fast-evolving domains like AI-powered systems remains underexplored. This paper presents a detailed experience report on the development of the AI Bill of Materials (AIBOM) specification, an extension of the ISO/IEC 5962:2021 Software Package Data Exchange (SPDX) software bill of materials (SBOM) standard, which captures AI components such as datasets and iterative training artifacts. Framed through the lens of Action Research (AR), we document a global, multi-stakeholder effort involving over 90 contributors and structured AR cycles. The resulting specification was validated through four complementary approaches: alignment with major regulations and ethical standards (e.g., the EU AI Act and IEEE 7000 standards), systematic mapping to six industry use cases, semi-structured practitioner interviews, and an industrial case study. Beyond delivering a validated artefact, our paper documents the process of building the AIBOM specification in the wild, reflects on how it aligns with the AR cycle, and distills lessons that can inform future standardization efforts in the software engineering community.
Jorge Pérez, Jessica Díaz, Ángel González-Prieto
et al.
Context: This work is part of a research project whose ultimate goal is to systematize theory building in qualitative research in the field of software engineering. The proposed methodology involves four phases: conceptualization, operationalization, testing, and application. In previous work, we performed the conceptualization of a theory that investigates the structure of IT departments and teams when software-intensive organizations adopt a culture called DevOps. Objective: This paper presents a set of procedures to systematize the operationalization phase in theory building and their application in the context of DevOps team structures. Method: We operationalize the concepts and propositions that make up our theory to generate constructs and empirically testable hypotheses. Instead of using causal relations to operationalize the propositions, we adopt logical implication, which avoids the problems associated with causal reasoning. Strategies are proposed to ensure that the resulting theory aligns with the criterion of parsimony. Results: The operationalization phase is described from three perspectives: specification, implementation, and practical application. First, the operationalization process is formally defined. Second, a set of procedures for operating both concepts and propositions is described. Finally, the usefulness of the proposed procedures is demonstrated in a case study. Conclusions: This paper is a pioneering contribution in offering comprehensive guidelines for theory operationalization using logical implication. By following established procedures and using concrete examples, researchers can better ensure the success of their theory-building efforts through careful operationalization.
Lovis Justin Immanuel Zenz, Erik Heiland, Peter Hillmann
et al.
In this paper, we propose a method for aligning models with their realization through the application of model-based systems engineering. Our approach is divided into three steps. (1) Firstly, we leverage domain expertise and the Unified Architecture Framework to establish a reference model that fundamentally describes some domain. (2) Subsequently, we instantiate the reference model as specific models tailored to different scenarios within the domain. (3) Finally, we incorporate corresponding run logic directly into both the reference model and the specific models. In total, we thus provide a practical means to ensure that every implementation result is justified by business demand. We demonstrate our approach using the example of maritime object detection as a specific application (specific model / implementation element) of automatic target recognition as a service reoccurring in various forms (reference model element). Our approach facilitates a more seamless integration of models and implementation, fostering enhanced Business-IT alignment.
6G will revolutionize the software world, allowing faster cellular communications and a massive number of connected devices, and will enable a shift towards a continuous edge-to-cloud architecture. Current cloud solutions, where all data is transferred to and computed in the cloud, are not sustainable in such a large network of devices. Current technologies, including development methods, software architectures, and orchestration and offloading systems, are not yet prepared to cope with these requirements. In this paper, we conduct a Systematic Mapping Study to investigate the current research status of 6G Software Engineering. The results show that 18 research papers have been published on software processes, software architectures, and orchestration and offloading methods. Of these, software architecture and software-defined networks are, respectively, the area and the topic that have received the most attention in 6G Software Engineering. The main types of results in these papers are methods, architectures, platforms, frameworks, and algorithms. The five proposed tools/frameworks are new and have not yet been studied by other researchers. The authors of these papers are mainly from China, India, and Saudi Arabia. These results will enable researchers and practitioners to build on and extend work in 6G Software Engineering.
[Context and Motivation]: The quality of requirements specifications impacts subsequent, dependent software engineering activities. Requirements quality defects like ambiguous statements can result in incomplete or wrong features and even lead to budget overrun or project failure. [Problem]: Attempts at measuring the impact of requirements quality have been held back by the vast amount of interacting factors. Requirements quality research lacks an understanding of which factors are relevant in practice. [Principal Ideas and Results]: We conduct a case study considering data from both interview transcripts and issue reports to identify relevant factors of requirements quality. The results include 17 factors and 11 interaction effects relevant to the case company. [Contribution]: The results contribute empirical evidence that (1) strengthens existing requirements engineering theories and (2) advances industry-relevant requirements quality research.
The research was carried out to select optimal options for the heat supply of multi-apartment buildings, taking several new buildings in Yerevan as an example. The purpose of the study is to confirm the choice of the best method of providing heat in apartment complexes. To solve this problem, the energy-economic and operational-technical indicators of individual heating boilers, small centralized systems, and hybrid systems were calculated and analyzed. The calculations considered both natural gas tariffs and fluctuations in the value of the Armenian dram against the US dollar. The cost of 1 kWh of thermal energy (the specific heat cost) was identified as an important criterion for choosing the most efficient method of heat supply, based on careful analysis of almost all variable factors.
In terms of the spatial distribution of architectural masses, the evolution of Moroccan mosques has given rise to two architectural currents:
The first current: the prayer hall with naves parallel to the qibla wall. This tendency was strongly present in the first monumental mosques of Morocco, for example the Al Quaraouiyine and Al-Andalous mosques in Fès. This volumetric composition was later used in the construction of several medieval and post-medieval mosques in Morocco.
The second current: the prayer hall with longitudinal naves; this prayer hall is also called hypostyle, with naves perpendicular to the qibla wall. It is inspired by the Al-Aqsa mosque in Jerusalem. This current is dominant in Almohad mosques and rarely used in Marinid and Saadian mosques. The adoption of the first or the second current affects the interior layout of the mosque: in the first case the arcades run parallel to the qibla wall, and in the second they run perpendicular to it.
Architectural engineering. Structural engineering of buildings
Shape optimization, as one type of structural optimization problem, is an important process in the design of shells, since it contributes to creating structures with good performance characteristics and expands the range of design options and the knowledge base for obtaining high-quality results. To solve problems associated with determining shape and creating more advanced structures, software packages include a special optimization module, which can be based on one or more mathematical methods whose purpose is to provide the best solution in the shortest possible time. The research focuses on the process of shape optimization in three well-known universal software packages, Ansys Mechanical, COMSOL Multiphysics, and Simulia Abaqus, as well as in the Rhinoceros modeling software with its visual Grasshopper plugin. The purpose of the study is to analyze the shape optimization technology in the four software packages and to compare them in terms of the problem-solving process, user interface, fullness of libraries, accessibility for educational purposes, and computer system requirements. The authors specify and describe the characteristic features of each software package. It was found that all the software packages under consideration offer extensive capabilities for shape optimization of structures and varied functionality for solving this type of task. The development of optimization technology in calculation and modeling software packages will make it possible to obtain the most effective solutions when designing shells of complex shapes.
Architectural engineering. Structural engineering of buildings
Tanja Marzi, Clara Bertolini-Cestari, Olivia Pignatelli
The paper presents a significant case study: the Church of San Giovanni Battista in Salbertrand dates back to the 16th century and constitutes one of the most interesting examples of religious architecture in the Susa Valley of the western Italian Alps.
Its historic timber roof structure was once at risk of demolition, but in 2000 finally became the object of necessary preservation and reinforcement works. Here, the interdisciplinary studies carried out for the diagnosis and assessment of the state of conservation are presented, starting with the identification of the wood species used, the geometrical survey, the visual and NDT diagnosis of the timber elements, and the structural evaluation. A special section is dedicated to the dendrochronological analysis, with a comparison of different case studies regarding larch roof structures of other historic architectures located in the northwest of Italy. The tree-ring sequences obtained from the buildings presented have also been used to define a larch chronology of the Susa Valley in Piedmont.
Following the first assessment phase, a second phase involved defining the restoration and reinforcement interventions. The reinterpretation of historic craftsmanship rules and traditions, which already contemplated the use of steel devices, attempted to offer alternative design solutions. This reinterpretation constituted the basis of the reinforcement interventions carried out in Salbertrand in the early 2000s. This paper highlights the importance of learning from historical treatises, showing how, even in modern reinforcement interventions, the application of traditional carpentry rules can achieve the aims of preservation and structural efficiency with overall cost-effectiveness and durability, resulting in a favorable balance between tradition and innovation.
Architectural engineering. Structural engineering of buildings
Rayleigh damping is commonly used in response history analysis (RHA) as an energy dissipation source, but it is difficult to adapt when frequency-independent damping is desired over a wide frequency range. While modal damping produces constant, frequency-independent damping, the damping matrix becomes dense, greatly increasing the computational cost. This study examines the advantages of using extended Rayleigh damping in RHA. First, a multi-spring-mass system is used to assess the basic damping performance of extended Rayleigh damping in the assumed frequency range. A 12-story 3D elastic moment frame model is then used to compare damping performance. As a result, extended Rayleigh damping is demonstrated to be frequency independent over a wider frequency range than Rayleigh damping and less computationally demanding than modal damping.
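For reference, the classical relations behind the comparison can be written out (standard textbook forms; the paper's specific parameterisation of "extended" Rayleigh damping may differ):

```latex
% Classical Rayleigh damping: a two-parameter combination of the mass
% and stiffness matrices, giving a damping ratio that varies with frequency
C = a_0 M + a_1 K, \qquad
\xi(\omega) = \frac{a_0}{2\omega} + \frac{a_1 \omega}{2}

% Caughey-series (extended Rayleigh) damping: p terms allow the damping
% ratio to be matched at p frequencies, flattening \xi(\omega) over a
% wider band while C remains a combination of M and K
C = M \sum_{b=0}^{p-1} a_b \left( M^{-1} K \right)^{b}, \qquad
\xi(\omega_n) = \frac{1}{2} \sum_{b=0}^{p-1} a_b \, \omega_n^{\,2b-1}
```

With $p = 2$ the series reduces to classical Rayleigh damping, while higher-order terms trade sparsity of $C$ for broader frequency independence, which is the trade-off the study examines.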
Architecture, Architectural engineering. Structural engineering of buildings
Ruochun Zhang, Bonaventura Tagliafierro, Colin Vanden Heuvel
et al.
This paper introduces DEM-Engine, a new submodule of Project Chrono designed to carry out Discrete Element Method (DEM) simulations. Based on spherical primitive shapes, DEM-Engine can simulate polydisperse granular materials and handle complex shapes generated as assemblies of primitives, referred to as clumps. DEM-Engine has a multi-tier parallelized structure that is optimized to operate simultaneously on two GPUs. The code uses custom-defined data types to reduce memory footprint and increase bandwidth. A novel "delayed contact detection" algorithm allows the decoupling of contact detection and force computation, thus splitting the workload into two asynchronous GPU streams. DEM-Engine uses just-in-time compilation to support user-defined contact force models. This paper discusses its C++ and Python interfaces and presents a variety of numerical tests, in which impact forces, complex-shaped particle flows, and a custom force model are validated against well-known benchmark cases. Additionally, the full potential of the simulator is demonstrated for the investigation of extraterrestrial rover mobility on granular terrain. The chosen case study demonstrates that large-scale co-simulations (comprising 11 million elements) spanning 15 seconds, in conjunction with an external multi-body dynamics system, can be efficiently executed within a day. Lastly, a performance test suggests that DEM-Engine displays linear scaling up to 150 million elements on two NVIDIA A100 GPUs.
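The decoupling behind "delayed contact detection" (a broad-phase pair search whose results stay valid for several steps thanks to a safety margin, so it can run concurrently with the force kernel) can be sketched in plain Python, with a thread standing in for the second GPU stream. The 1D spring model, cutoff, and margin below are illustrative assumptions, not DEM-Engine's implementation:

```python
from concurrent.futures import ThreadPoolExecutor

CUTOFF = 1.0  # contact distance for the toy 1D particles

def detect_contacts(positions, margin=0.1):
    """Broad phase: collect pairs closer than cutoff + margin.
    The margin keeps the pair list usable for a few subsequent
    steps, which is what lets detection lag behind force updates."""
    pairs = []
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if abs(positions[i] - positions[j]) < CUTOFF + margin:
                pairs.append((i, j))
    return pairs

def compute_forces(positions, pairs):
    """Narrow phase: linear-spring repulsion for overlapping pairs only."""
    k = 10.0
    forces = [0.0] * len(positions)
    for i, j in pairs:
        d = positions[j] - positions[i]
        overlap = CUTOFF - abs(d)
        if overlap > 0:
            f = k * overlap * (1 if d > 0 else -1)
            forces[i] -= f
            forces[j] += f
    return forces

positions = [0.0, 0.9, 3.0]
with ThreadPoolExecutor(max_workers=1) as pool:
    pairs = detect_contacts(positions)                 # pair list for step t
    future = pool.submit(detect_contacts, positions)   # refresh for t+1 in background
    forces = compute_forces(positions, pairs)          # force kernel overlaps detection
    next_pairs = future.result()
print(forces)
```

Because the two phases touch disjoint outputs, the force computation for step t can proceed while the pair list for later steps is rebuilt, mirroring the two asynchronous GPU streams described in the abstract.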
Gunnar Kudrjavets, Aditya Kumar, Jeff Thomas
et al.
Engineers build large software systems for multiple architectures, operating systems, and configurations. A set of inconsistent or missing compiler flags can generate code that catastrophically impacts the system's behavior. In the authors' industry experience, defects caused by an undesired combination of compiler flags are common in nontrivial software projects. We are unaware of any build or CI/CD system that tracks, in a structured manner, how the compiler produced a specific binary. We postulate that a queryable database of how the compiler compiled and linked the software system would help detect defects earlier and reduce debugging time.
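A minimal sketch of what such a queryable record could look like, using an in-memory SQLite table. The schema, flag values, and file names are hypothetical illustrations, not a feature of any existing build system:

```python
import sqlite3

# Hypothetical schema: one row per compiled translation unit,
# recording which compiler and flags produced each object file.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE compilations (
        binary   TEXT,
        source   TEXT,
        compiler TEXT,
        flags    TEXT
    )""")
rows = [
    ("app", "a.c", "gcc-13", "-O2 -fstack-protector-strong"),
    ("app", "b.c", "gcc-13", "-O2"),  # inconsistent: protector missing
]
conn.executemany("INSERT INTO compilations VALUES (?, ?, ?, ?)", rows)

# Query: which translation units of 'app' were built WITHOUT the stack
# protector? Flag inconsistencies like this are exactly the class of
# defect a queryable build database could surface before deployment.
missing = conn.execute(
    "SELECT source FROM compilations "
    "WHERE binary = 'app' AND flags NOT LIKE '%-fstack-protector%'"
).fetchall()
print(missing)
```

A real system would likely also record linker invocations, toolchain versions, and per-configuration variants, but even this flat table makes flag drift answerable with a single query.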
Cancer remains one of the leading health burdens worldwide. One of the challenges hindering cancer therapy development is the substantial discrepancies between the existing cancer models and the tumor microenvironment (TME) of human patients. Constructing tumor organoids represents an emerging approach to recapitulate the pathophysiological features of the TME in vitro. Over the past decade, various approaches have been demonstrated to engineer tumor organoids as in vitro cancer models, such as incorporating multiple cellular populations, reconstructing biophysical and chemical traits, and even recapitulating structural features. In this review, we focus on engineering approaches for building tumor organoids, including biomaterial-based, microfabrication-assisted, and synthetic biology-facilitated strategies. Furthermore, we summarize the applications of engineered tumor organoids in basic cancer research, cancer drug discovery, and personalized medicine. We also discuss the challenges and future opportunities in using tumor organoids for broader applications.