For a class of stochastic dynamical models of exchange economies that we call ``fully connected Cobb-Douglas'', the paper proves convergence of the probability distribution to an equilibrium in the total variation metric as time goes to infinity. The convergence is exponential, and the equilibrium is determined uniquely by the number of agents, their ``exponents'', and the initial amounts of money and goods in the economy.
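Stated schematically (notation assumed here, not taken from the paper), exponential convergence in total variation means
\[
\|\mu_t - \pi\|_{\mathrm{TV}} \;=\; \sup_{A} \left|\mu_t(A) - \pi(A)\right| \;\le\; C e^{-\lambda t}, \qquad t \ge 0,
\]
where $\mu_t$ is the law of the state at time $t$, $\pi$ is the unique equilibrium distribution (fixed by the number of agents, their exponents, and the initial totals of money and goods), and $C, \lambda > 0$ are constants.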
The demand for finite raw materials will keep increasing as they fuel modern society. At the same time, solutions for stopping carbon emissions in the short term are not available, making the net-zero target extremely challenging to achieve at scale. The circular economy (CE) paradigm is gaining attention as a way to address climate change and the uncertainty of critical-material supplies. Hence, in this paper, we introduce CiRL, a deep reinforcement learning (DRL) library of environments focused on the circularity control of both solid and fluid materials. The integration of DRL into the design of material circularity is made possible by the formalism of thermodynamical material networks, which is underpinned by compartmental dynamical thermodynamics. Along with the focus on circularity, the library has three further features: the new CE-oriented environments are written in state-space form, which is typically used in dynamical systems analysis and control design; it is based on a state-of-the-art Python library of DRL algorithms, namely Stable-Baselines3; and it is developed in Google Colaboratory so that it is accessible to researchers from different disciplines and backgrounds, as is often the case for circular economy researchers and engineers. CiRL is intended to be a tool for generating AI-driven actions that optimize the circularity of supply-recovery chains, to be combined with human-driven decisions derived from material flow analysis (MFA) studies. CiRL is publicly available.
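A minimal sketch of how such a state-space environment can be trained with Stable-Baselines3 is given below; the environment class, its dynamics, and all names are illustrative assumptions rather than the actual CiRL API, and only the PPO training calls follow standard Stable-Baselines3 usage.

```python
# Hypothetical sketch: training a PPO agent on a toy circular-economy
# environment written in state-space form, using Stable-Baselines3.
# CircularityEnv and its dynamics are illustrative assumptions, not CiRL.
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import PPO

class CircularityEnv(gym.Env):
    """Toy state-space model x[k+1] = A x[k] + B u[k]; reward favors recirculated stock."""
    def __init__(self):
        self.A = np.array([[0.95, 0.0], [0.05, 0.9]])   # material retention/transfer
        self.B = np.array([[1.0], [0.0]])                # extraction of virgin material
        self.observation_space = spaces.Box(-np.inf, np.inf, shape=(2,), dtype=np.float32)
        self.action_space = spaces.Box(0.0, 1.0, shape=(1,), dtype=np.float32)
        self.x = np.zeros(2, dtype=np.float32)

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.x = np.array([1.0, 0.0], dtype=np.float32)
        return self.x.copy(), {}

    def step(self, action):
        # Linear state update followed by a reward that trades recirculated
        # stock (x[1]) against the cost of extracting virgin material (action).
        self.x = (self.A @ self.x + (self.B @ action).ravel()).astype(np.float32)
        reward = float(self.x[1] - 0.1 * action[0])
        return self.x.copy(), reward, False, False, {}

env = CircularityEnv()
model = PPO("MlpPolicy", env, verbose=0)
model.learn(total_timesteps=10_000)
```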
Agentic AI is poised to usher in a seismic paradigm shift in Software Engineering (SE). As technologists rush headlong to make agentic AI a reality, SE researchers are driven to establish agentic SE as a research area. While early visions of agentic SE focus primarily on code-related activities, early empirical evidence calls for considering a wider range of socio-technical activities and concerns to make it work in practice. This paper contributes to the emerging visions by: (a) recommending an expansion of the scope of agentic SE beyond code, toward a 'whole of process' vision, grounded in SE foundations and evolution and in emerging agentic SE frameworks; (b) proposing a preliminary set of values and principles to guide community efforts; and (c) sharing guidance on designing and using a well-defined vocabulary for agentic SE. It is hoped that these ideas will encourage collaboration and steer the SE community toward laying strong foundations for agentic SE, so that it is not limited to accelerating coding but becomes the next process-level paradigm shift.
Hydro-Science and Engineering (Hydro-SE) is a critical and irreplaceable domain that secures human water supply, generates clean hydropower energy, and mitigates flood and drought disasters. Featuring multiple engineering objectives, Hydro-SE is an inherently interdisciplinary domain that integrates scientific knowledge with engineering expertise. This integration necessitates extensive expert collaboration in decision-making, which makes intelligent automation of such decisions difficult. With the rapid advancement of large language models (LLMs), their potential application in the Hydro-SE domain is being increasingly explored. However, the knowledge and application abilities of LLMs in Hydro-SE have not been sufficiently evaluated. To address this issue, we propose the Hydro-SE LLM evaluation benchmark (Hydro-SE Bench), which contains 4,000 multiple-choice questions. Hydro-SE Bench covers nine subfields and enables evaluation of LLMs in terms of basic conceptual knowledge, engineering application ability, and reasoning and calculation ability. The evaluation results on Hydro-SE Bench show that accuracy ranges from 0.74 to 0.80 for commercial LLMs and from 0.41 to 0.68 for small-parameter LLMs. While LLMs perform well in subfields closely related to the natural and physical sciences, they struggle with domain-specific knowledge such as industry standards and hydraulic structures. Model scaling mainly improves reasoning and calculation abilities, but there remains considerable room for improvement in handling practical engineering application problems. This study highlights the strengths and weaknesses of LLMs for Hydro-SE tasks, providing model developers with clear training targets and Hydro-SE researchers with practical guidance for applying LLMs.
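A hedged sketch of the kind of multiple-choice scoring such a benchmark involves is shown below; the JSON schema, file name, and `ask_model` wrapper are assumptions for illustration, not the released Hydro-SE Bench code.

```python
# Illustrative sketch (not the released Hydro-SE Bench code): scoring an LLM's
# answers on multiple-choice questions and reporting per-subfield accuracy.
import json
from collections import defaultdict

def evaluate(questions, ask_model):
    """questions: list of dicts with 'question', 'choices', 'answer', 'subfield' (assumed schema)."""
    correct, total = defaultdict(int), defaultdict(int)
    for q in questions:
        prompt = q["question"] + "\n" + "\n".join(
            f"{label}. {text}" for label, text in zip("ABCD", q["choices"])
        )
        prediction = ask_model(prompt)          # assumed to return "A", "B", "C", or "D"
        total[q["subfield"]] += 1
        if prediction.strip().upper().startswith(q["answer"]):
            correct[q["subfield"]] += 1
    return {s: correct[s] / total[s] for s in total}

# Hypothetical usage:
# with open("hydro_se_bench.json") as f:
#     accuracy_by_subfield = evaluate(json.load(f), ask_model=my_llm)
```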
Banghua Zhu, Sai Praneeth Karimireddy, Jiantao Jiao, et al.
The creator economy has revolutionized the way individuals can profit through online platforms. In this paper, we initiate the study of online learning in the creator economy by modeling it as a three-party game among the users, the platform, and the content creators: the platform interacts with the content creator under a principal-agent model, through contracts, to encourage better content, and it interacts with the users by recommending new content, receiving an evaluation, and ultimately profiting from the content, which can be modeled as a recommender system. Our study explores how the platform can jointly optimize the contract and the recommender system to maximize its utility in an online learning fashion. We primarily analyze and compare two families of contracts: return-based contracts and feature-based contracts. Return-based contracts pay the content creator a fraction of the reward the platform gains. In contrast, feature-based contracts pay the content creator based on the quality or features of the content, regardless of the reward the platform receives. We show that under smoothness assumptions, the joint optimization of return-based contracts and the recommendation policy achieves regret $\Theta(T^{2/3})$. For feature-based contracts, we introduce a notion of intrinsic dimension $d$ to characterize the hardness of learning the contract and provide a regret upper bound of $\mathcal{O}(T^{(d+1)/(d+2)})$. This upper bound is tight for the linear family.
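As a hedged sketch (the symbols below are assumptions for illustration, not the paper's notation), the two contract families and the stated regret rates can be written as
\[
\text{return-based: } p_t = \alpha_t\, r_t, \qquad \text{feature-based: } p_t = g_t(x_t),
\]
\[
\mathrm{Reg}(T) \;=\; \sum_{t=1}^{T}\big( u^\star - u_t \big), \qquad
\mathrm{Reg}(T) = \Theta\!\big(T^{2/3}\big) \ \text{(return-based)}, \quad
\mathrm{Reg}(T) = \mathcal{O}\!\big(T^{(d+1)/(d+2)}\big) \ \text{(feature-based)},
\]
where $r_t$ is the platform's reward, $x_t$ the content features, $p_t$ the payment to the creator, $u_t$ the platform's realized utility, and $u^\star$ the utility of the best contract and recommendation policy in hindsight.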
Noise: an enemy to be dealt with and a major factor limiting communication system performance. However, what if there is gold in that garbage? In conventional engineering, our focus is primarily on eliminating, suppressing, combating, or even ignoring noise and its detrimental impacts. Conversely, could we exploit it as biology does, using noise-like carrier signals to convey information? In this context, the utilization of noise, or noise-like signals in general, has been put forward as a means to realize unconditionally secure communication systems in the future. In this tutorial article, we begin by tracing the origins of thermal noise-based communication and highlighting one of its significant applications for ensuring unconditionally secure networks: the Kirchhoff-law-Johnson-noise (KLJN) secure key exchange scheme. We then delve into the inherent challenges tied to secure communication and discuss the imperative need for physics-based key distribution schemes in pursuit of unconditional security. Concurrently, we provide a concise overview of quantum key distribution (QKD) schemes and draw comparisons with their KLJN-based counterparts. Finally, extending beyond wired communication loops, we explore the transmission of noise signals over the air and evaluate their potential for stealth and secure wireless communication systems.
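For background, the standard Johnson-Nyquist relation and the resistor-pair logic behind KLJN can be sketched as follows (textbook relations, with notation assumed here rather than taken from the article):
\[
S_V(f) = 4 k_B T R, \qquad
\langle U_{\mathrm{wire}}^2 \rangle \;\propto\; 4 k_B T_{\mathrm{eff}}\, \frac{R_A R_B}{R_A + R_B}\,\Delta f ,
\]
where each party randomly connects a low ($R_L$) or high ($R_H$) resistor to the loop. Because the wire measurement depends only on the parallel combination, the mixed choices $(R_A,R_B)=(R_L,R_H)$ and $(R_H,R_L)$ are indistinguishable to an eavesdropper, and each such round yields one secure key bit.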
The work space is the sanctum sanctorum of the world's economy. It has become increasingly clear that a functional, efficient work space is conducive to optimum output. Therefore, the engineering concepts of comfort and ease of working for long durations have become the mainstay of industries the world over. These concepts are complemented by considerations of safety, hygiene and worker friendliness, especially in the global garment manufacturing industry, and they need to be evaluated scientifically. One such means of evaluation is the OCRA (Occupational Repetitive Actions) method. The basic concept of OCRA is to identify and measure defined technical actions: it examines the series of complex movements necessary to complete a work task, involving the upper limb joints at the shoulder, elbow, wrist and finger level. Various scholars have documented the root causes of musculoskeletal discomfort among seated workers and have identified them as constrained work postures and repetitive tasks. Currently, the researcher is working on three work-aid prototypes that incorporate workstation design principles to improve working posture.
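In broad terms (a simplified summary, not the article's own formulation), the OCRA index compares the actions actually performed with a reference value:
\[
\mathrm{OCRA\ index} \;=\; \frac{\mathrm{ATA}}{\mathrm{RTA}},
\]
where ATA is the number of actual technical actions carried out during the shift and RTA is the reference number of technical actions, obtained by correcting a baseline action frequency with multipliers for force, posture, repetitiveness, additional factors, recovery periods, and shift duration.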
Safety management of hydrogen infrastructure is vital for sustainable progress in the hydrogen economy. Accordingly, this paper presents a dynamic and holistic risk model to address some significant shortcomings of current hydrogen risk analysis models. Hydrogen release scenarios are modeled using the Bow-tie technique integrated with improved D Numbers Theory and the Best-Worst Method, which helps to analyze epistemic uncertainty in the prior probabilities of the causation factors and barriers. Subsequently, a Dynamic Bayesian Network (DBN) model is developed to analyze dynamic risk and deal with aleatory uncertainty. The application of the proposed model is demonstrated on a water electrolysis process. The results of the case study provide a better understanding of the causal modeling of accident scenarios and of the associated evolving risks under uncertainty. The proposed model will serve as a useful tool for the operational safety management of hydrogen infrastructure and other complex engineering systems.
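As generic background (standard DBN relations, with notation assumed rather than taken from the paper), a dynamic Bayesian network factorizes the joint distribution over time slices and supports recursive updating of risk estimates as evidence $e_t$ arrives:
\[
P(X_{1:T}) \;=\; P(X_1)\prod_{t=2}^{T} P(X_t \mid X_{t-1}), \qquad
P(X_t \mid e_{1:t}) \;\propto\; P(e_t \mid X_t)\sum_{x_{t-1}} P(X_t \mid x_{t-1})\, P(x_{t-1} \mid e_{1:t-1}).
\]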
Lorenzo Paolo Ingrassia, Xiaohu Lu, G. Ferrotti, et al.
Nowadays, sustainability and circular economy are two principles to be pursued in all fields. In road pavement engineering, they can be put into practice through the partial substitution of bitumen with industrial residues and by-products derived from renewable materials. Within this framework, this paper presents an extensive investigation of the chemical, morphological and rheological properties of bio-binders obtained by mixing a conventional 50/70 bitumen with different percentages by weight (0%, 5%, 10% and 15%) of a renewable bio-oil generated as a residue in the processing of wood into pulp and paper. Results show that, overall, the bio-oil provides a softening effect, which, in terms of performance, leads to an improvement of the low-temperature behaviour and fatigue resistance with respect to the control bitumen, despite an increased tendency toward permanent deformation. Although no chemical reaction appears to occur after blending, the peculiarities of the bio-oil affect the chemistry of the resulting bio-binders, whereas no phase separation is observed in the microscopic analysis. In addition, a Newtonian behaviour, an unchanged temperature susceptibility and a good fit of the 1S2P1D model to the rheological data are found, regardless of the bio-oil percentage considered. These promising outcomes suggest that such bio-binders can be favourably employed for several applications in road pavements.
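For reference, the 1S2P1D (one spring, two parabolic elements, one dashpot) complex modulus is commonly written in the rheological literature as follows (general form, not reproduced from the paper; for binders the static modulus $G_{00}$ is taken as zero):
\[
G^*(\omega) \;=\; G_{00} + \frac{G_{0} - G_{00}}{1 + \delta (i\omega\tau)^{-k} + (i\omega\tau)^{-h} + (i\omega\beta\tau)^{-1}},
\]
with glassy modulus $G_0$, exponents $0<k<h<1$, dimensionless constant $\delta$, characteristic time $\tau$, and $\beta$ linked to the Newtonian viscosity of the dashpot.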
The oil and gas industry faces many corrosion problems. Pipelines and machine components are exposed to contaminants such as H2S and CO2, which deteriorate them, and over time corrosion can develop on their inner surfaces. Pipelines must transport large amounts of crude oil and must therefore withstand high pressures. Storage containers for oil and gas are made of aluminium and steel, which must be protected because their susceptibility to corrosion impacts the economy directly or indirectly. Steel and aluminium are important metals used in almost every part of the oil and gas industry, from manufacturing to the distribution of final products. This paper reviews the effect of corrosion on metals and some of the approaches to corrosion control in engineering sectors.
Circular Economy has emerged as a popular research topic that is shaping public policy in Europe, China, America and elsewhere. It complements the conceptual basis of the Industrial Ecology framework, but places more emphasis on business models to create closed-loop material systems. However, the Circular Economy concept currently lacks robust engineering design methods, leading many researchers to question its effectiveness. In contrast, Process Integration is less well known publicly, despite being widely applied in industry to achieve substantial reductions in industrial energy, water, and utility use. These three areas have developed in parallel with minimal cross-pollination, yet they are potentially complementary. This contribution proposes the new unified concept of Circular Integration for greater transdisciplinary research cohesiveness in the sustainable development of processes, industries, and economies. Circular Integration combines elements from Process Integration, Industrial Ecology, and Circular Economy into a multi-dimensional, multi-scale approach to the minimisation of resource and energy consumption. Circular Integration thus provides an engineering toolbox for planning a Sustainable and Circular Economy. To demonstrate its potential, the paper presents a case study on sustainable biofuel production. The possible solution leverages existing industrial resources to potentially produce enough fuel to fulfil 37% of current demand for global air and marine transport. The Circular Integration framework can also be generalised to systems other than transport and energy, aiming to catalyse greater transdisciplinary research for the analysis, design, and implementation of sustainable and circular systems.
Background: Software development results in the production of various types of artifacts: source code, version control system metadata, bug reports, mailing list conversations, test data, etc. Empirical software engineering (ESE) has thrived on mining those artifacts to uncover the inner workings of software development and improve its practices. But which artifacts are studied in the field is a moving target, which we study empirically in this paper. Aims: We quantitatively characterize the most frequently mined and co-mined software artifacts in ESE research and the research purposes they support. Method: We conduct a meta-analysis of artifact mining studies published in 11 top conferences in ESE, for a total of 9621 papers. We use natural language processing (NLP) techniques to characterize the types of software artifacts that are most often mined and their evolution over a 16-year period (2004-2020). We analyze the combinations of artifact types that are most often mined together, as well as the relationship between study purposes and mined artifacts. Results: We find that: (1) mining happens in the vast majority of analyzed papers, (2) source code and test data are the most mined artifacts, (3) there is an increasing interest in mining novel artifacts, together with source code, and (4) researchers are most interested in the evaluation of software systems and use all possible empirical signals to support that goal.
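A much-simplified illustration of the artifact tagging and co-mining analysis such a meta-analysis involves is sketched below; the keyword lists and function names are illustrative assumptions, not the authors' NLP pipeline.

```python
# Simplified illustration (not the authors' pipeline): tagging papers with the
# artifact types they mine, via keyword matching over titles/abstracts, and
# counting which artifact types are co-mined in the same study.
from collections import Counter
from itertools import combinations

ARTIFACT_KEYWORDS = {
    "source code":   ["source code", "repository", "commit"],
    "test data":     ["test case", "test suite", "test data"],
    "bug reports":   ["bug report", "issue tracker", "defect report"],
    "mailing lists": ["mailing list", "email thread"],
    "vcs metadata":  ["version control", "git history", "commit log"],
}

def tag_artifacts(text):
    """Return the set of artifact types mentioned in a paper's text."""
    text = text.lower()
    return {a for a, kws in ARTIFACT_KEYWORDS.items() if any(k in text for k in kws)}

def co_mining_counts(abstracts):
    """Count how often pairs of artifact types are mined together."""
    pair_counts = Counter()
    for abstract in abstracts:
        tags = sorted(tag_artifacts(abstract))
        pair_counts.update(combinations(tags, 2))
    return pair_counts
```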
B. Woolston, Jason R King, Michael A. Reiter, et al.
Due to volatile sugar prices, the food vs fuel debate, and recent increases in the supply of natural gas, methanol has emerged as a promising feedstock for the bio-based economy. However, attempts to engineer Escherichia coli to metabolize methanol have achieved limited success. Here, we provide a rigorous systematic analysis of several potential pathway bottlenecks. We show that regeneration of ribulose 5-phosphate in E. coli is insufficient to sustain methanol assimilation, and overcome this by activating the sedoheptulose bisphosphatase variant of the ribulose monophosphate pathway. By leveraging the kinetic isotope effect associated with deuterated methanol as a chemical probe, we further demonstrate that under these conditions overall pathway flux is kinetically limited by methanol dehydrogenase. Finally, we identify NADH as a potent kinetic inhibitor of this enzyme. These results provide direction for future engineering strategies to improve methanol utilization, and underscore the value of chemical biology methodologies in metabolic engineering. Engineering E. coli to metabolize methanol into fuels and chemicals has not yet been fully achieved. Here, the authors combine metabolic engineering and chemical inhibition to improve methanol assimilation and to distinguish the roles of kinetics and thermodynamics under various culture conditions.
Ghislain H. Demeze-Jouatsa, Roland Pongou, Jean-Baptiste Tondji
Frequent violations of fair principles in real-life settings raise the fundamental question of whether such principles can guarantee the existence of a self-enforcing equilibrium in a free economy. We show that elementary principles of distributive justice guarantee that a pure-strategy Nash equilibrium exists in a finite economy where agents freely (and non-cooperatively) choose their inputs and derive utility from their pay. Chief among these principles are the following: 1) your pay should not depend on your name, and 2) a more productive agent should not earn less. When these principles are violated, an equilibrium may not exist. Moreover, we uncover an intuitive condition -- technological monotonicity -- that guarantees equilibrium uniqueness and efficiency. We generalize our findings to economies with social justice and inclusion, implemented in the form of progressive taxation and redistribution, and guaranteeing a basic income to unproductive agents. Our analysis uncovers a new class of strategic-form games by incorporating normative principles into non-cooperative game theory. Our results rely on no particular assumptions, and our setup is entirely non-parametric. Illustrations of the theory include applications to exchange economies, surplus distribution in a firm, contagion and self-enforcing lockdown in a networked economy, and bias in the academic peer-review system. Keywords: Market justice; Social justice; Inclusion; Ethics; Discrimination; Self-enforcing contracts; Fairness in non-cooperative games; Pure strategy Nash equilibrium; Efficiency. JEL Codes: C72, D30, D63, J71, J38
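For reference, the standard notion of a pure-strategy Nash equilibrium used here is a profile $x^\star$ from which no agent can gain by unilaterally deviating (standard definition; notation assumed):
\[
u_i(x_i^\star, x_{-i}^\star) \;\ge\; u_i(x_i, x_{-i}^\star) \quad \text{for all agents } i \text{ and all } x_i \in X_i .
\]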
Quantum technology is exploding. Computing, communication, and sensing are just a few areas likely to see breakthroughs in the next few years. Worldwide, national governments, industries, and universities are moving to create a new class of workforce - the Quantum Engineers. Demand for such engineers is predicted to be in the tens of thousands within a five-year timescale. However, how best to train this next generation of engineers is far from obvious. Quantum mechanics - long a pillar of traditional physics undergraduate degrees - must now be merged with traditional engineering offerings. This paper discusses the history, development, and first year of operation of the world's first undergraduate degree in quantum engineering. The main purpose of the paper is to inform the wider debate, now being held by many institutions worldwide, on how best to formally educate the Quantum Engineer.
[Context and motivation:] For realistic self-adaptive systems, multiple quality attributes need to be considered and traded off against each other. These quality attributes are commonly encoded in a utility function, for instance, a weighted sum of relevant objectives. [Question/problem:] The research agenda for requirements engineering for self-adaptive systems has raised the need for decision-making techniques that consider the trade-offs and priorities of multiple objectives. Human stakeholders need to be engaged in the decision-making process so that the relative importance of each objective can be correctly elicited. [Principal ideas/results:] This research preview paper presents a method that supports multiple stakeholders in prioritizing relevant quality attributes, negotiating priorities to reach an agreement, and giving input to define utility functions for self-adaptive systems. [Contribution:] The proposed method constitutes a lightweight solution for utility function definition. It can be applied by practitioners and researchers who aim to develop self-adaptive systems that meet stakeholders' requirements. We present details of our plan to study the application of our method using a case study.
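As a minimal sketch of the weighted-sum encoding mentioned above (symbols are assumptions for illustration, not the paper's notation):
\[
U(s) \;=\; \sum_{i=1}^{n} w_i\, q_i(s), \qquad w_i \ge 0, \;\; \sum_{i=1}^{n} w_i = 1,
\]
where $q_i(s)$ is the normalized satisfaction of quality attribute $i$ in system configuration $s$ and $w_i$ is its stakeholder-elicited priority weight.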
The paper presents a new apparatus for managing the information security of the digital economy using social networks. A general problem of optimizing information security management for participants in the digital economy is formulated. The results obtained make it possible to develop a new class of programs for analysis and decision support in the fields of information security and economic management. The development of this line of research will be especially important in the financial field.