J. Tirole, J. Laffont
Results for "Engineering economy"
Showing 20 of ~10,905,795 results · from CrossRef, DOAJ, arXiv, Semantic Scholar
Ian A. Cosden, Elizabeth Holtz, Joel U. Bretheim
Research Software Engineers (RSEs) have become indispensable to computational research and scholarship. The rapid rise of RSEs in higher education, combined with universities' slowness in creating or adopting models for new technology roles, has left a lack of structured career pathways that recognize technical mastery, scholarly impact, and leadership growth. In response to immense demand for RSEs at Princeton University, and with dedicated funding to at least double the size of the RSE group, Princeton had to strategize how to define job descriptions cohesively enough to match the rapid pace of RSE hiring while retaining the flexibility to recognize the unique nature of each position. This case study describes our design and implementation of a comprehensive RSE career ladder spanning Associate through Principal levels, with parallel team-lead and managerial tracks. We outline the guiding principles, competency framework, Human Resources (HR) alignment, and implementation process, including engagement with external consultants and mapping to a standard job-leveling framework using market benchmarks. We share early lessons learned and outcomes, including improved hiring efficiency, clearer promotion pathways, and positive reception among staff.
T. Graedel
Material flow analysis (MFA), a central methodology of industrial ecology, quantifies the ways in which the materials that enable modern society are used, reused, and lost. Sankey diagrams, termed the "visible language of industrial ecology", are often employed to present MFA results. This Perspective assesses the history and current status of MFA, reviews the development of the methodology, presents current examples of metal, polymer, and fiber MFAs, and demonstrates that MFAs have been responsible for creating related industrial ecology specialties and stimulating connections between industrial ecology and a variety of engineering and social science fields. MFA approaches are now being linked with environmental input-output assessment, scenario development, and life cycle assessment, and these increasingly comprehensive assessments promise to be central tools for sustainable development and circular economy studies in the future. Current shortcomings and promising innovations are also presented, as are the implications of MFA results for corporate and national policy.
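The mass-balance accounting at the heart of MFA can be illustrated in a few lines of code. The sketch below checks a steady-state balance for a single hypothetical process node; the flow names and values are invented for illustration and are not taken from any real material cycle.

```python
# Minimal steady-state mass balance of the kind MFA quantifies.
def mass_balance(inflows, outflows, stock_change=0.0):
    """Residual of one process node; zero means the node balances."""
    return sum(inflows) - sum(outflows) - stock_change

# Hypothetical flows for a metal fabrication node, kt/yr (illustrative values only).
mining = 500.0
imports = 120.0
recycled_scrap = 80.0       # internal scrap loop: both an inflow and an outflow
products_to_use = 600.0
processing_losses = 20.0

residual = mass_balance(
    inflows=[mining, imports, recycled_scrap],
    outflows=[products_to_use, processing_losses, recycled_scrap],
)
print(f"fabrication node residual: {residual:.1f} kt/yr")
```

In a full MFA, every process node in the system (and the system boundary itself) must satisfy such a balance, which is what makes the resulting Sankey diagrams internally consistent.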
C. Breyer, M. Fasihi, C. Bajamundi et al.
Christian Breyer is Professor for Solar Economy at LUT University, Finland. His major expertise is research on the technological and economic characteristics of renewable energy systems, specializing in highly renewable energy systems on both a local and a global scale. His research includes integrated sector analyses with power, heat, transport, desalination, industry, NETs, CCU, and Power-to-X. He worked previously for Reiner Lemoine Institut, Berlin, and Q-Cells (now Hanwha Q Cells). He is a member of ETIP PV and IEA-PVPS, a member of the scientific committees of the EU PVSEC and IRES, chairman at the Energy Watch Group, and a reviewer for the IPCC. Mahdi Fasihi, M.Sc., is a Research Assistant at LUT University, Finland. His focus area is CO2 direct air capture and techno-economic assessment of renewable electricity-based Power-to-X fuels and chemicals production and global trading. Highly resolved energy system modeling is a key method for his potential assessments. He received his M.Sc. degree in Energy Technology at LUT University and his B.Sc. in Mechanical Engineering at Guilan University, Iran. Cyril Bajamundi, PhD, is Chief Technology Officer of Soletair Power Oy, a Finnish start-up company focused on CO2 direct air capture and Power-to-X fuel conversion. He was previously a Senior Scientist with VTT Technical Research Centre of Finland, working on direct air capture of CO2 to support power-to-gas and power-to-liquid technologies for energy storage. Before that, he worked as an Assistant Professor in the Department of Chemical Engineering at the University of the Philippines, where he received his M.Sc. in Chemical Engineering. Felix Creutzig leads a working group at the Mercator Research Institute on Global Commons and Climate Change, Berlin, and is Chair of Sustainability Economics of Human Settlements at Technical University Berlin. Educated as a physicist, he holds a PhD in Computational Neuroscience.
He coordinates the chapter on “demand, services, and social aspects of mitigation” in the 6th assessment report of the IPCC. Research interests include data science and machine learning approaches for designing low-carbon cities, and demand-side solutions for climate change mitigation.
G. Ischia, L. Fiori
Hydrothermal carbonization (HTC) is an emerging path to give new life to organic waste and residual biomass. Fulfilling the principles of the circular economy, HTC can transform “unpleasant” organics into useful materials and possibly energy carriers. The potential applications of HTC are tremendous, and the recent literature is full of investigations. In this context, models capable of predicting, simulating, and optimizing the HTC process, reactors, and plants are engineering tools that can significantly shift HTC research towards innovation by boosting the development of novel enterprises based on HTC technology. This review paper addresses a key issue: where do we stand regarding the development of these tools? The literature presents many simplified models describing the reaction kinetics, some dealing with process simulation, and few focused on the heart of an HTC system, the reactor. Statistical investigations and some life cycle assessment analyses also appear in the current state of the art. This work examines and analyzes these predictive tools, highlighting their potential and limits. Overall, current models suffer in many respects, from the lack of data to the intrinsic complexity of HTC reactions and HTC systems. Therefore, emphasis is given to what is still needed to make the HTC process properly simulated and therefore implementable on an industrial scale with sufficient predictive margins.
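As a toy illustration of the lumped first-order kinetic models this review surveys, the sketch below integrates a hypothetical two-step scheme (biomass → intermediate → hydrochar) with forward Euler. The rate constants and time step are placeholders for illustration, not fitted HTC parameters.

```python
# Lumped first-order HTC kinetics sketch: biomass -> intermediate -> hydrochar.
# Rate constants and time step are illustrative placeholders, not fitted values.
k1, k2 = 0.05, 0.02      # reaction rates, 1/min (hypothetical)
dt, t_end = 0.1, 120.0   # Euler step and horizon, min

B, I, C = 1.0, 0.0, 0.0  # mass fractions of biomass, intermediate, hydrochar
t = 0.0
while t < t_end:
    dB = -k1 * B
    dI = k1 * B - k2 * I
    dC = k2 * I
    B, I, C = B + dB * dt, I + dI * dt, C + dC * dt
    t += dt

# By construction dB + dI + dC = 0, so total mass is conserved at every step.
print(f"after {t_end:.0f} min: biomass={B:.3f}, intermediate={I:.3f}, char={C:.3f}")
```

Real HTC kinetic models in the literature add temperature dependence (Arrhenius rates), parallel liquid- and gas-phase products, and parameter fitting to experimental yield data, but they share this lumped-compartment structure.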
Yican Wu, Jing Song, Huaqing Zheng et al.
The Monte Carlo (MC) method has distinct advantages for simulating complicated nuclear systems and is envisioned as a routine method for nuclear design and analysis in the future. High-fidelity MC simulation coupled with multi-physics simulation has a significant impact on the safety, economy, and sustainability of nuclear systems. However, great challenges for current MC methods and codes prevent their application in real engineering projects. SuperMC is a CAD-based Monte Carlo program for integrated simulation of nuclear systems developed by the FDS Team, China, making use of hybrid MC-deterministic methods and advanced computer technologies. The design aims, architecture, and main methodology of SuperMC are presented in this paper. SuperMC 2.1, the latest version for neutron, photon, and coupled neutron-photon transport calculation, has been developed and validated using a series of benchmark cases, such as the fusion reactor ITER model and the fast reactor BN-600 model. SuperMC is still evolving toward a general and routine tool for nuclear system simulation.
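The flavor of the particle transport that MC codes such as SuperMC perform at scale can be conveyed with a toy example. The sketch below estimates uncollided transmission through a 1-D slab by sampling exponential flight distances; the cross-section and thickness are illustrative numbers, and a production code adds geometry, scattering physics, and variance reduction on top of this basic sampling loop.

```python
import math
import random

# Toy Monte Carlo estimate of uncollided transmission through a 1-D slab.
# The analytic answer is exp(-sigma_t * thickness); values are illustrative.
random.seed(42)
sigma_t = 0.5     # total macroscopic cross-section, 1/cm (hypothetical)
thickness = 2.0   # slab thickness, cm (hypothetical)
n = 200_000       # number of particle histories

transmitted = 0
for _ in range(n):
    # Sample the distance to first collision from the exponential distribution.
    d = -math.log(1.0 - random.random()) / sigma_t
    if d > thickness:
        transmitted += 1

mc_estimate = transmitted / n
analytic = math.exp(-sigma_t * thickness)
print(f"MC: {mc_estimate:.4f}  analytic: {analytic:.4f}")
```

With 200,000 histories the statistical error is on the order of 0.1%, which is why high-fidelity MC simulation of full reactor models is so computationally demanding.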
Bin Yu, Yong Chen, Dawei Luo et al.
Logistics operations demand real-time visibility and rapid response, yet minute-level traffic speed forecasting remains challenging due to heterogeneous data sources and frequent distribution shifts. This paper proposes a Deep Operator Network (DeepONet)-based framework that treats traffic prediction as learning a mapping from historical states and boundary conditions to future speed states, enabling robust forecasting under changing scenarios. We project logistics demand onto a road network to generate diverse congestion scenarios and employ a branch–trunk architecture to decouple historical dynamics from exogenous contexts. Experiments on both a controlled simulation dataset and the real-world Metropolitan Los Angeles (METR-LA) benchmark demonstrate that the proposed method outperforms classical regression and deep learning baselines in cross-scenario generalization. Specifically, the operator learning approach effectively adapts to unseen boundary conditions without retraining, establishing a promising direction for resilient and adaptive logistics forecasting.
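A minimal sketch of the branch–trunk idea behind DeepONet, using untrained random-weight networks purely to show the data flow: the branch net encodes the input function sampled at sensor points, the trunk net encodes the query coordinate, and the prediction is their inner product. Sensor values, layer sizes, and weights here are all hypothetical.

```python
import random

random.seed(0)

def mlp(sizes):
    """Random-weight MLP as a list of (weights, biases) layers (untrained, illustrative)."""
    return [
        ([[random.uniform(-0.5, 0.5) for _ in range(m)] for _ in range(n)], [0.0] * n)
        for m, n in zip(sizes, sizes[1:])
    ]

def forward(net, x):
    for W, b in net:
        x = [max(0.0, sum(w * v for w, v in zip(row, x)) + bi)  # ReLU units
             for row, bi in zip(W, b)]
    return x

m_sensors, p = 8, 16
branch = mlp([m_sensors, 32, p])  # encodes the sampled input function u
trunk = mlp([1, 32, p])           # encodes the query coordinate y

u_sensors = [0.1 * i for i in range(m_sensors)]  # hypothetical sampled history
y = [0.5]                                        # hypothetical query coordinate
b_out = forward(branch, u_sensors)
t_out = forward(trunk, y)
prediction = sum(bi * ti for bi, ti in zip(b_out, t_out))  # G(u)(y)
print(f"G(u)(y) ~ {prediction:.4f}")
```

Because only the branch input changes when the history or boundary conditions change, the same trained operator can be queried under new scenarios without retraining, which is the property the paper exploits.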
Abdelhamid Khelifi, Messaouda Boumaaza, Ahmed Belaadi et al.
In today's construction projects, considering the environment and the economy is essential. Alternatives to cement or natural aggregates lead to natural resource conservation and reduced carbon dioxide emissions. In this context, dune sand (DS) and Washingtonia waste (WW), which are abundant in the Algerian desert, were investigated as potential eco-friendly substitutes for cement in mortar manufacture. The initiative aims to create innovative, ecological, cleaner building materials: lightweight, eco-friendly cement mortars. To this end, the response surface methodology (RSM) was applied to predict and optimize the physical characteristics of mortars composed of DS and WW, with WW varying from 1 % to 3 % and treated for 4–24 h with sodium hydroxide solution (NaOH) at a 1–5 % concentration. The purpose of this study was to evaluate the suitability of the material for civil engineering, focusing on properties such as slump (S), specific gravity (ρ), water absorption (WA), bending displacement (Yb), and dynamic elastic modulus (Edyn). Furthermore, the mortar's chemical composition and high-temperature behavior were investigated. To maximize mechanical qualities and minimize physical drawbacks, an analysis of variance using the Box-Behnken design was performed to optimize and forecast the factors and outcomes. The results revealed that WW content, NaOH concentration, and immersion time significantly influenced the physico-mechanical properties of the mortar. The optimal formulation obtained was 1.3 % WW, 5 % NaOH concentration, and 14.8 h of immersion time, leading to values of 1972.77 kg/m³ for ρ, 16.62 cm for S, 2.63 % for WA, 0.26 mm for Yb, and 20.46 GPa for Edyn. The strong correlation between the RSM model and experimental data confirms the model's reliability. 
These findings demonstrate the potential of this sustainable mortar for eco-construction applications, such as bending elements and repairing damaged structures in buildings, highways, and bridges.
Liyuan Chen, Shuoling Liu, Jiangpeng Yan et al.
The advent of foundation models (FMs), large-scale pre-trained models with strong generalization capabilities, has opened new frontiers for financial engineering. While general-purpose FMs such as GPT-4 and Gemini have demonstrated promising performance in tasks ranging from financial report summarization to sentiment-aware forecasting, many financial applications remain constrained by unique domain requirements such as multimodal reasoning, regulatory compliance, and data privacy. These challenges have spurred the emergence of financial foundation models (FFMs): a new class of models explicitly designed for finance. This survey presents a comprehensive overview of FFMs, with a taxonomy spanning three key modalities: financial language foundation models (FinLFMs), financial time-series foundation models (FinTSFMs), and financial visual-language foundation models (FinVLFMs). We review their architectures, training methodologies, datasets, and real-world applications. Furthermore, we identify critical challenges in data availability, algorithmic scalability, and infrastructure constraints and offer insights into future research opportunities. We hope this survey can serve as both a comprehensive reference for understanding FFMs and a practical roadmap for future innovation.
Davide Venturelli, Erik Gustafson, Doga Kurkcuoglu et al.
We review the prospects to build quantum processors based on superconducting transmons and radiofrequency cavities for testing applications in the NISQ era. We identify engineering opportunities and challenges for implementation of algorithms in simulation, combinatorial optimization, and quantum machine learning in qudit-based quantum computers.
Nazanin Ahmadi, Qianying Cao, Jay D. Humphrey et al.
Physics-informed machine learning (PIML) is emerging as a potentially transformative paradigm for modeling complex biomedical systems by integrating parameterized physical laws with data-driven methods. Here, we review three main classes of PIML frameworks: physics-informed neural networks (PINNs), neural ordinary differential equations (NODEs), and neural operators (NOs), highlighting their growing role in biomedical science and engineering. We begin with PINNs, which embed governing equations into deep learning models and have been successfully applied to biosolid and biofluid mechanics, mechanobiology, and medical imaging among other areas. We then review NODEs, which offer continuous-time modeling, especially suited to dynamic physiological systems, pharmacokinetics, and cell signaling. Finally, we discuss deep NOs as powerful tools for learning mappings between function spaces, enabling efficient simulations across multiscale and spatially heterogeneous biological domains. Throughout, we emphasize applications where physical interpretability, data scarcity, or system complexity make conventional black-box learning insufficient. We conclude by identifying open challenges and future directions for advancing PIML in biomedical science and engineering, including issues of uncertainty quantification, generalization, and integration of PIML and large language models.
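The physics-residual idea that PINNs build on can be sketched without any deep learning machinery. Below, a candidate solution of the ODE u'(x) = -u(x) with u(0) = 1 is scored by a boundary-condition term plus the mean squared equation residual at collocation points, with central finite differences standing in for the automatic differentiation a real PINN would use; the problem and candidate functions are illustrative stand-ins, not examples from the survey.

```python
import math

def residual_loss(u, xs, h=1e-4):
    """Boundary-condition loss plus mean squared ODE residual at collocation points."""
    data_term = (u(0.0) - 1.0) ** 2
    physics_term = sum(
        ((u(x + h) - u(x - h)) / (2 * h) + u(x)) ** 2 for x in xs
    ) / len(xs)
    return data_term + physics_term

def good(x):
    return math.exp(-x)   # the exact solution: residual vanishes up to O(h^2)

def bad(x):
    return 1.0 - x        # satisfies u(0) = 1 but not the ODE away from x = 0

xs = [0.1 * i for i in range(1, 11)]  # collocation points on (0, 1]
print(f"good loss: {residual_loss(good, xs):.2e}")
print(f"bad  loss: {residual_loss(bad, xs):.2e}")
```

A PINN replaces the fixed candidate function with a neural network and minimizes exactly this kind of composite loss by gradient descent, which is why it can train with sparse or no solution data.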
Fabian C. Peña
Large Language Models (LLMs) are revolutionizing software engineering (SE), with special emphasis on code generation and analysis. However, their applications to broader SE practices, including conceptualization, design, and other non-code tasks, remain only partially explored. This research aims to augment the generality and performance of LLMs for SE by (1) advancing the understanding of how LLMs with different characteristics perform on various non-code tasks, (2) evaluating them as sources of foundational knowledge in SE, and (3) effectively detecting hallucinations in SE statements. The expected contributions include a variety of LLMs trained and evaluated on domain-specific datasets, new benchmarks on foundational knowledge in SE, and methods for detecting hallucinations. Initial results in terms of performance improvements on various non-code tasks are promising.
Fernando Ayach, Vitor Lameirão, Raul Leão et al.
Proto-personas are commonly used during early-stage Product Discovery, such as Lean Inception, to guide product definition and stakeholder alignment. However, the manual creation of proto-personas is often time-consuming, cognitively demanding, and prone to bias. In this paper, we propose and empirically investigate a prompt engineering-based approach to generate proto-personas with the support of Generative AI (GenAI). Our goal is to evaluate the approach in terms of efficiency, effectiveness, user acceptance, and the empathy elicited by the generated personas. We conducted a case study with 19 participants embedded in a real Lean Inception, employing a qualitative and quantitative methods design. The results reveal the approach's efficiency by reducing time and effort and improving the quality and reusability of personas in later discovery phases, such as Minimum Viable Product (MVP) scoping and feature refinement. While acceptance was generally high, especially regarding perceived usefulness and ease of use, participants noted limitations related to generalization and domain specificity. Furthermore, although cognitive empathy was strongly supported, affective and behavioral empathy varied significantly across participants. These results contribute novel empirical evidence on how GenAI can be effectively integrated into software Product Discovery practices, while also identifying key challenges to be addressed in future iterations of such hybrid design processes.
Zeng Meng, Gang Li, Xuan Wang et al.
Le Zhang, Zicheng Jiang, To-Hung Tsui et al.
In the context of a circular economy, bioplastic production using biodegradable materials such as poly(3-hydroxybutyrate) (PHB) has been proposed as a promising solution to fundamentally solve the disposal issue of plastic waste. PHB production techniques through fermentation of PHB-accumulating microbes such as Cupriavidus necator have been revolutionized over the past several years with the development of new strategies such as metabolic engineering. This review comprehensively summarizes the latest PHB production technologies via Cupriavidus necator fermentation. The mechanism of the biosynthesis pathway for PHB production was first assessed. PHB production efficiencies of common carbon sources, including food waste, lignocellulosic materials, glycerol, and carbon dioxide, were then summarized and critically analyzed. The key findings in enhancing strategies for PHB production in recent years, including pre-treatment methods, nutrient limitations, feeding optimization strategies, and metabolic engineering strategies, were summarized. Furthermore, technical challenges and future prospects of strategies for enhanced production efficiencies of PHB were also highlighted. Based on this overview of current enhancing technologies, more pilot-scale and larger-scale tests are essential for the future implementation of enhancing strategies in full-scale biogas plants. Critical analyses of various enhancing strategies would facilitate the establishment of more sustainable microbial fermentation systems for better waste management and greater efficiency of PHB production.
K. Mondal, Luis Nuñez, C. Downey et al.
Today’s competitive world economy is creating an indispensable demand for increased efficiency of engineering components that operate in harsh environments (i.e., very high-temperature, corrosive, ...
R. Buyya
Computational Grids, emerging as an infrastructure for next-generation computing, enable the sharing, selection, and aggregation of geographically distributed resources for solving large-scale problems in science, engineering, and commerce. Because Grid resources are heterogeneous and geographically distributed, with varying availability, diverse usage and cost policies for different users, and priorities and goals that change over time, the management of resources and application scheduling in such a large, distributed environment is a complex task. This thesis proposes a distributed computational economy as an effective metaphor for resource management and application scheduling. It proposes an architectural framework that supports resource trading and quality-of-service-based scheduling. It enables the regulation of supply and demand for resources, provides an incentive for resource owners to participate in the Grid, and motivates users to trade off between deadline, budget, and the required level of quality of service. The thesis demonstrates the capability of economics-based systems for peer-to-peer distributed computing by developing scheduling strategies and algorithms driven by users' quality-of-service requirements, and it demonstrates their effectiveness through scheduling experiments on the World-Wide Grid solving parameter sweep applications.
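A minimal sketch of the deadline-and-budget trade-off the thesis describes, under invented resources, prices, and jobs: each job greedily takes the cheapest resource that still meets the deadline, and the plan is accepted only if the total cost stays within budget. All names and numbers are hypothetical, not taken from the thesis's experiments.

```python
# Sketch of deadline-and-budget constrained job dispatch in the spirit of
# economic Grid scheduling. Resources, rates, and jobs are hypothetical.
resources = [  # (name, price per CPU-second, speed factor)
    ("cheap-cluster", 0.01, 1.0),
    ("fast-cluster", 0.05, 4.0),
]

def dispatch(jobs, deadline, budget):
    """Run each job on the cheapest resource that meets the deadline."""
    plan, cost = [], 0.0
    for name, length in jobs:
        # Try resources in order of increasing price; faster ones are fallbacks.
        for rname, rate, speed in sorted(resources, key=lambda r: r[1]):
            runtime = length / speed
            if runtime <= deadline:
                cost += rate * runtime
                plan.append((name, rname))
                break
        else:
            return None, None  # deadline unattainable on any resource
    return (plan, cost) if cost <= budget else (None, None)

jobs = [("sweep-a", 50.0), ("sweep-b", 300.0)]  # job lengths in CPU-seconds
plan, cost = dispatch(jobs, deadline=100.0, budget=10.0)
print(plan, cost)
```

A real economic scheduler would also let users choose an optimization strategy (e.g., minimize cost within deadline versus minimize time within budget) and renegotiate as resource prices and availability change.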
H. Kuo, Y. Tseng, Y. Yang
Abstract In recent years, STEM (Science, Technology, Engineering, and Mathematics) has been extensively advocated and implemented in education, as it is suggested to have a strong impact on students' interdisciplinary learning, which can be seen as a significant driving force for a country's advancement in scientific and technical knowledge, innovation, economy, and international competitiveness. Developing a human-computer interaction (HCI) system to solve real-world problems requires inventors to have interdisciplinary STEM knowledge and skills. Thus, a STEM Interdisciplinary Project-based Learning (IPBL) approach was applied to teach a total of 45 college students from the departments of engineering and design. Inspired by Design Thinking, the 18-week STEM IPBL course was delivered in four phases: discover, define, develop, and deliver. All the finished HCI projects applied interdisciplinary knowledge and skills from the STEM domains. Evidence drawn from the 6-point Likert ‘Motivated Strategies for Learning Questionnaire (MSLQ)’ indicated that the STEM IPBL course was very impactful on students' learning, improving the participants' (a) overall learning motivation (Pre M = 4.4, Post M = 4.64; p = .012), (b) self-efficacy of learning (Pre M = 4.03, Post M = 4.43; p = .003), (c) enjoyment of learning STEM (Pre M = 4.68, Post M = 4.75; p = .556), and (d) recognition of the significance of learning STEM for future career development (Pre M = 4.73, Post M = 4.94; p = .077). It was also found that, compared with design-major students, the course had a stronger effect on engineering-major students. Evidence collected from the ‘Abbreviated Torrance Test for Adults (ATTA)’ indicated that the students' overall creativity was significantly improved (Pre M = 63.36, Post M = 68.44; p = .000). 
More specifically, among the four facets of creativity, the improvements were as follows: fluency (Pre M = 14.89, Post M = 16.2; p = .001), elaboration (Pre M = 16.69, Post M = 18.62; p = .000), flexibility (Pre M = 14.82, Post M = 16.04; p = .009), and originality (Pre M = 16.96, Post M = 17.58; p = .136). The course had a differential impact on students' originality: while the originality of engineering-major students significantly improved (p = .006), that of design-major students did not change. Some educational implications are also provided in the article.
Dongmin Kong, Bohui Zhang, Jian Zhang
This paper investigates the impact of higher education on corporate innovation. To establish causality, we exploit a policy-induced exogenous shock in the supply of Chinese college-educated labor starting in 2003. Using a difference-in-differences approach, we find that Chinese firms in skilled industries generate better innovation outcomes as measured by patents and citations than those in unskilled industries. This effect is more pronounced among firms headquartered in a province with more science and engineering college graduates, young firms that are more likely to hire young graduates, and firms located near universities. Moreover, higher education expansion increases a firm’s innovative human capital in terms of the number of educated employees and inventors. Finally, we show that technological innovation is a mechanism through which higher education affects productivity growth and, thus, the economy.
M. Asyraf, A. Syamsir, N. Zahari et al.
This new product development review article aims to consolidate the principles and current literature on design for sustainability and to identify the field's future direction. From this point of view, design for sustainability methods can be established under the idea of sustainability in its ecological, economic, and social pillars. The design for sustainability concept is implemented in concurrent engineering, spanning the concept, embodiment, and detail design processes. Integrating sustainability into engineering design is crucial to producing greener products, system innovations, and services aligned with current market demand. Currently, many concurrent engineering studies related to natural fibre-reinforced polymer composites associated with sustainability enhance the application of design for sustainability techniques by professional designers. However, the current literature is scarce in bridging the design for sustainability concept with concurrent engineering during the design development stage, and these areas should be further developed. Several other future research directions, such as the need to align principles with applications and to explore the relationships between design for sustainability techniques and views of sustainability, are presented in this review paper.
Page 19 of 545,290