Integrating sustainability into entrepreneurship education is becoming increasingly prevalent. Nonetheless, there has been limited empirical research on how higher education institutions (HEIs) operationalize this approach in the context of industry collaboration, and existing frameworks overlook the dynamics of tension highlighted by prior research. This study explores how HEIs and industry partners jointly develop and sustain entrepreneurship programs focused on sustainability. The study outlines a process model consisting of four stages: Sustainability Foundation, Synergy Hub and Pedagogy, Strategic Partnerships, and Enduring Ecosystems. According to the model, HEI-industry collaboration requires specific elements arranged in an appropriate sequence, with active tension management at each stage. Theoretically, the study contributes a dynamic, processual framework for integrating sustainability principles into entrepreneurship education; practically, it offers guidance for HEIs and industry partners on developing a sustainable entrepreneurship program through staged, adaptive collaboration.
Lorenz Brehme, Benedikt Dornauer, Thomas Ströhle
et al.
Retrieval-Augmented Generation (RAG) is a well-established and rapidly evolving field within AI that enhances the outputs of large language models by integrating relevant information retrieved from external knowledge sources. While industry adoption of RAG is now beginning, there is a significant lack of research on its practical application in industrial contexts. To address this gap, we conducted a semi-structured interview study with 13 industry practitioners to explore the current state of RAG adoption in real-world settings. Our study investigates how companies apply RAG in practice, providing (1) an overview of industry use cases, (2) a consolidated list of system requirements, (3) key challenges and lessons learned from practical experiences, and (4) an analysis of current industry evaluation methods. Our main findings show that current RAG applications are mostly limited to domain-specific QA tasks, with systems still in prototype stages; industry requirements focus primarily on data protection, security, and quality, while issues such as ethics, bias, and scalability receive less attention; data preprocessing remains a key challenge, and system evaluation is predominantly conducted by humans rather than automated methods.
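For readers unfamiliar with the retrieve-then-generate pattern the abstract refers to, a minimal sketch is below. This is a toy illustration only: the bag-of-words retriever, the cosine-similarity choice, and all function names are our own assumptions, not drawn from the study or from any interviewed company's system.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    common = set(a) & set(b)
    num = sum(a[t] * b[t] for t in common)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def retrieve(query, docs, k=2):
    """Rank documents by similarity to the query; return the top k."""
    q = Counter(query.lower().split())
    scored = sorted(docs,
                    key=lambda d: cosine(q, Counter(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query, docs, k=2):
    """Assemble the augmented prompt a generator LLM would receive."""
    context = "\n".join(retrieve(query, docs, k))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
```

In a real deployment the retriever would typically be a dense vector index and the prompt would be sent to an LLM; the sketch only shows how retrieved context augments the query.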
COVID-19 made online teaching and learning acceptable, and students, faculty, and industry professionals are all comfortable with this mode. This comfort can be leveraged to offer an online multi-institutional research-level course in an area where individual institutions may lack the requisite faculty to teach it and/or research students to enroll. If the subject is of interest to industry, an online offering also allows industry experts to contribute and participate with ease. Advanced topics in software engineering are ideally suited to experimenting with this approach, as industry, which is often looking to incorporate advances in software engineering into its practices, is likely to agree to contribute and participate. In this paper we describe an experiment in teaching a course titled "AI in Software Engineering" jointly between two institutions with active industry participation, and share our and our students' experiences. We believe this collaborative teaching approach can be used to offer research-level courses in any applied area of computer science by small institutions that find it difficult to offer such courses on their own.
Gauthier Roussilhe, Thibault Pirson, David Bol
et al.
Growing attention is given to the environmental impacts of the digital sector, exacerbated by the increase of digital products and services in our globalized societies. The materiality of the digital sector is often presented through the environmental impacts of mining activities to point out that digitization does not mean dematerialization. Despite its importance, such a narrative is often restricted to a few minerals (e.g., cobalt, lithium) that have become the symbols of extractive industries. In this paper, we further explore the materiality of the digital sector with an approach based on the diversity of elements and their purity requirements in the semiconductor industry. The semiconductor industry manufactures the key building blocks of the digital sector, i.e., microchips. Given that the need for ultra-high-purity materials is very specific to the semiconductor industry, a few companies around the world have been studied, revealing new critical actors in complex supply chains. This highlights strong dependencies on other industrial sectors with mass production and the need for a deeper investigation of interactions with the chemical industry, complementary to the mining industry.
The foundation model industry exhibits unprecedented concentration in critical inputs: semiconductors, energy infrastructure, elite talent, capital, and training data. Despite extensive sectoral analyses, no comprehensive framework exists for assessing overall industrial vulnerability. We develop the Artificial Intelligence Industrial Vulnerability Index (AIIVI) grounded in O-Ring production theory, recognizing that foundation model production requires simultaneous availability of non-substitutable inputs. Given extreme data opacity and rapid technological evolution, we implement a validated human-in-the-loop methodology using large language models to systematically extract indicators from dispersed grey literature, with complete human verification of all outputs. Applied to six state-of-the-art foundation model developers, AIIVI equals 0.82, indicating extreme vulnerability driven by compute infrastructure (0.85) and energy systems (0.90). While industrial policy currently emphasizes semiconductor capacity, energy infrastructure represents the emerging binding constraint. This methodology proves applicable to other fast-evolving, opaque industries where traditional data sources are inadequate.
This paper introduces IGGA, a dataset of 160 industry guidelines and policy statements for the use of Generative AIs (GAIs) and Large Language Models (LLMs) in industry and workplace settings, collected from official company websites, and trustworthy news sources. The dataset contains 104,565 words and serves as a valuable resource for natural language processing tasks commonly applied in requirements engineering, such as model synthesis, abstraction identification, and document structure assessment. Additionally, IGGA can be further annotated to function as a benchmark for various tasks, including ambiguity detection, requirements categorization, and the identification of equivalent requirements. Our methodologically rigorous approach ensured a thorough examination, with a selection of reputable and influential companies that represent a diverse range of global institutions across six continents. The dataset captures perspectives from fourteen industry sectors, including technology, finance, and both public and private institutions, offering a broad spectrum of insights into the integration of GAIs and LLMs in industry.
In this paper, in order to reduce the energy leakage caused by the discretized representation in sparse channel estimation for Orthogonal Frequency Division Multiplexing (OFDM) systems, we systematically analyze the optimal locations of atoms with discrete delays for reconstructing each path, from the perspective of linear fitting theory. We then investigate the adverse effects of a non-ideal inner product function on the iterations of one of the most widely used channel estimation methods, Orthogonal Matching Pursuit (OMP). The study shows that the distance between the atoms selected for each path in OMP can exceed the sampling interval, which prevents OMP-based methods from achieving better performance. To overcome this drawback, we propose an image-deblurring-based channel estimation method, in which the channel estimation problem is cast as one-dimensional image deblurring, to reduce the large compensation distance of traditional OMP. The advantage of the proposed method is validated by numerical simulations and by decoding sea-trial data.
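The greedy atom-selection loop of classical OMP, whose inner-product step the abstract analyzes, can be sketched as follows. This is a generic textbook OMP, not the paper's proposed deblurring-based variant; the dictionary `A` and the recovery setup are our own illustration.

```python
import numpy as np

def omp(A, y, n_atoms, tol=1e-6):
    """Greedy Orthogonal Matching Pursuit: recover a sparse x with
    y ~= A @ x by repeatedly picking the dictionary atom (column of A)
    most correlated with the current residual."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1], dtype=A.dtype)
    for _ in range(n_atoms):
        # Inner products of all atoms with the residual -- the step
        # whose non-ideal behavior the paper studies.
        correlations = np.abs(A.conj().T @ residual)
        support.append(int(np.argmax(correlations)))
        # Least-squares fit of y on the selected atoms.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
        if np.linalg.norm(residual) < tol:
            break
    x[support] = coeffs
    return x
```

With an incoherent dictionary and a truly sparse signal, the loop typically recovers the exact support; the paper's point is that with discretized delay atoms the selected atoms can land farther apart than the sampling interval, degrading reconstruction.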
Sorena Vahedipour-Dahraie, Younes Zahedi, Mir Daryoush Shakouri
SUMMARY: Because of the side effects of the growth-stimulant antibiotics employed in poultry nutrition, the poultry industry is attempting to substitute them with safer alternatives such as phytogenics or organic acids. Thus, the objective of this study was to determine the influence of single and combined supplementation of broiler chicken diets with eugenol (0, 500, and 1,000 ppm) and butyric acid glycerides (BAG) (0 and 0.2% w/w) on the chemical, technological, and sensory traits of chicken breast during 60 d of storage. The biological trial was carried out on a total of 300 mixed-sex one-day-old Ross 308 chicks, randomly distributed into 6 dietary treatments, with 5 replicates of 10 birds each. The results revealed that the L* and b* color values of the fillet samples changed significantly (p < 0.05). The pH values decreased significantly from 5.79 to 5.69 as an effect of dietary eugenol supplementation (p < 0.05). The water-binding ability of the fillet samples, evaluated by drip loss, cooking loss, and water-holding capacity assays, was not influenced by the addition of BAG and eugenol to the broilers' diet. The sensory traits of the meat samples were not negatively affected by the dietary supplementation. Thiobarbituric acid reactive substances (TBARS) and total volatile basic nitrogen (TVBN) values of the fillets were not meaningfully influenced by the treatments. Overall, dietary supplementation of broiler chickens with eugenol and BAG did not result in important modifications of the physicochemical characteristics of chicken breast.
Quentin Raillard-Cazanove, Thibaut Knibiehly, Robin Girard
The decarbonisation of the energy system is crucial for achieving climate goals and is inherently linked to the decarbonisation of industry. Despite this, few studies explore the simultaneous impacts of decarbonising both sectors. This paper aims to examine how industrial decarbonisation in Europe affects the energy system and vice versa. To address this, an industry model incorporating key heavy industry sectors across six European countries is combined with an energy system model for electricity and hydrogen covering fifteen European regions, referred to as the EU-15, divided into eleven zones. The study evaluates various policy scenarios under different conditions. The results demonstrate that industrial decarbonisation leads to a significant increase in electricity and hydrogen demand. This additional demand for electricity is largely met through renewable energy sources, while hydrogen supply is predominantly addressed by blue hydrogen production when fossil fuels are authorized and the system lacks renewable energy. This increased demand results in higher prices with considerable regional disparities. Furthermore, the findings reveal that, regardless of the scenario, the electricity mix in the EU-15 remains predominantly renewable, exceeding 85%. A reduction in carbon taxes lowers the prices of electricity and hydrogen but does not increase consumption, as the lower carbon tax makes the continued use of fossil fuels more attractive to industry. In scenarios that enforce a phase-out of fossil fuels, electricity prices rise, leading to a greater reliance on imports of low-carbon hydrogen and methanol. Results also suggest that domestic hydrogen production benefits from synergies between electrolytic hydrogen and blue hydrogen, helping to maintain competitive prices.
Recently, Industry 5.0 has been gaining attention as a novel paradigm, defining the next concrete steps toward increasingly intelligent, green-aware, and user-centric digital systems. In an era in which the smart devices typically adopted in the industrial domain are ever more sophisticated and autonomous, the Internet of Things and its evolution, known as the Internet of Everything (IoE, for short), which also brings people, robots, processes, and data into the network, represents the main driver allowing industries to put the experiences and needs of human beings at the center of their ecosystems. However, due to the extreme heterogeneity of the involved entities, their intrinsic need and capability to cooperate, and the aim of adapting to a dynamic user-centric context, special attention is required for the integration and processing of the data produced by such an IoE. This is the objective of the present paper, in which we propose a novel semantic model that formalizes the fundamental actors, elements, and information of an IoE, along with their relationships. In our design, we focus on state-of-the-art design principles, in particular reuse and abstraction, to build ``SemIoE'', a lightweight ontology inheriting and extending concepts from well-known and consolidated reference ontologies. The defined semantic layer represents a core data model that can be extended to embrace any modern industrial scenario. It forms the base of an IoE Knowledge Graph, on top of which, as an additional contribution, we analyze and define some essential services for an IoE-based industry.
Valentina Zaccaria, Chiara Masiero, David Dandolo
et al.
While Machine Learning has become crucial for Industry 4.0, its opaque nature hinders trust and impedes the transformation of valuable insights into actionable decisions, a challenge exacerbated in the evolving Industry 5.0 with its human-centric focus. This paper addresses this need by testing the applicability of AcME-AD in industrial settings. This recently developed framework facilitates fast and user-friendly explanations for anomaly detection (AD). AcME-AD is model-agnostic, offering flexibility, and prioritizes real-time efficiency, making it suitable for seamless integration with industrial Decision Support Systems. We present the first industrial application of AcME-AD, showcasing its effectiveness through experiments. These tests demonstrate AcME-AD's potential as a valuable tool for explainable AD and feature-based root cause analysis within industrial environments, paving the way for trustworthy and actionable insights in the age of Industry 5.0.
Given the prevalence of surveillance cameras in our daily lives, human action recognition from videos holds significant practical applications. A persistent challenge in this field is to develop more efficient models capable of real-time recognition with high accuracy for widespread implementation. In this research paper, we introduce a novel human action recognition model named Context-Aware Memory Attention Network (CAMA-Net), which eliminates the need for computationally intensive optical flow extraction and 3D convolution. By removing these components, CAMA-Net achieves superior computational efficiency compared to many existing approaches. A pivotal component of CAMA-Net is the Context-Aware Memory Attention Module, an attention module that computes the relevance score between key-value pairs obtained from the 2D ResNet backbone. This process establishes correspondences between video frames. To validate our method, we conduct experiments on four well-known action recognition datasets: ActivityNet, Diving48, HMDB51 and UCF101. The experimental results convincingly demonstrate the effectiveness of our proposed model, surpassing the performance of existing 2D-CNN based baseline models.
Article Highlights
Recent human action recognition models are not yet ready for practical applications due to their high computation needs. We propose a 2D-CNN-based human action recognition method to reduce the computation load. The proposed method achieves competitive performance compared to most state-of-the-art 2D-CNN-based methods on public datasets.
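The relevance-score computation at the heart of attention modules like the one described above can be illustrated with a generic scaled dot-product attention sketch. This is a standard textbook formulation for illustration only; the exact CAMA-Net module is defined in the paper, and the matrices `Q`, `K`, `V` here are our own assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Generic scaled dot-product attention: relevance scores between
    queries and keys (softmax-normalized) weight the values."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                  # pairwise relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted values
```

In a frame-correspondence setting, queries and keys would be feature vectors from different video frames, so high relevance scores indicate corresponding content across frames.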
While system identification methods have developed rapidly, modeling the process of batch polymerization reactors still poses challenges. Therefore, designing an intelligent modeling approach for these reactors is important. This paper focuses on identifying actual models for batch polymerization reactors, proposing a novel recursive approach based on the expectation-maximization algorithm. The proposed method pays special attention to unknown inputs (UIs), which may represent modeling errors or process faults. To estimate the UIs of the model, the recursive expectation-maximization (EM) technique is used. The proposed algorithm consists of two steps: the E-step and the M-step. In the E-step, a Q-function is recursively computed based on the maximum likelihood framework, using the UI estimates from the previous time step. The Kalman filter is utilized to calculate the estimates of the states using the measurements from sensor data. In the M-step, analytical solutions for the UIs are found through local optimization of the recursive Q-function. To demonstrate the effectiveness of the proposed algorithm, a practical application of modeling batch polymerization reactors is presented. The performance of the proposed recursive EM algorithm is compared to that of the augmented state Kalman filter (ASKF) using root mean squared errors (RMSEs). The RMSEs obtained from the proposed method are at least 6.52% lower than those from the ASKF method, indicating superior performance.
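The state-estimation role the Kalman filter plays in the E-step above can be sketched generically. This is a minimal standard Kalman filter under assumed linear-Gaussian matrices `A`, `C`, `Q`, `R`; it illustrates only the filtering step, not the paper's full recursive EM algorithm or its unknown-input handling.

```python
import numpy as np

def kalman_filter(y, A, C, Q, R, x0, P0):
    """One-pass Kalman filter returning filtered state estimates for
    the linear-Gaussian model x_{k+1} = A x_k + w, y_k = C x_k + v."""
    x, P = x0, P0
    xs = []
    for yk in y:
        # Predict step: propagate state and covariance through the model.
        x = A @ x
        P = A @ P @ A.T + Q
        # Update step: correct with the measurement yk.
        S = C @ P @ C.T + R                    # innovation covariance
        K = P @ C.T @ np.linalg.inv(S)         # Kalman gain
        x = x + K @ (yk - C @ x)
        P = (np.eye(len(x)) - K @ C) @ P
        xs.append(x)
    return np.array(xs)
```

In an EM setting, these filtered (and typically smoothed) state estimates feed the Q-function, which the M-step then optimizes with respect to the unknown quantities.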
John P. Hansen, J V Hansen, Jonathan Hansen
et al.
Many oil/gas fields in Kazakhstan contain high levels of highly corrosive H2S and CO2, sometimes at very high pressures. Corrosion management is essential for maintaining the safety and integrity of the processing facility. This paper describes the development of a non-destructive testing (NDT) method that improves the reliability of air-cooled heat exchangers by reducing downtime related to corrosive and erosive failure of fin-fan tubes. The project goal was to maximize the output of oil and gas plants and refineries while reducing plant operating costs. The work first identified the NDT requirements for air-cooled heat exchanger damage assessment that would provide the greatest economic benefit for Kazakhstan's industry. The main objective was to develop a state-of-the-art NDT method for air-cooled heat exchanger tubes capable of: a) detecting any damage mechanism while testing from the tube's internal diameter, b) accurately determining the damage in terms of wall loss, c) performing inspection quickly and expediently, and d) requiring minimal tube cleaning. Consequently, a method specially adapted to Kazakhstan's conditions was developed, based on a combination of the Magnetic Flux Leakage (MFL) technique for flaw detection with Hall-effect measurement of wall thickness and gradual corrosion in tubes. The method has been tested in both laboratory and field conditions, and the results were compared with the accurate but slow ultrasonic IRIS method. High correlation was obtained, which proved that the developed technology can deliver similar results at almost 10 times the speed and less than half the cost.
In the modern economy, digitalization has become one of the key components of the socio-economic development of the regions of the Russian Federation. Enterprises in various industries face the need to process large amounts of data, which greatly complicates data management and increases the relevance of analyzing artificial intelligence technologies. Training employees for industrial processes is a major challenge in any industry. Effective human resource management requires an accurate assessment and representation of available competencies, as well as an effective mapping of the competencies required for specific positions. Competencies enable a company to achieve strong production and economic results. The aim of the study is to develop a structural model of a predictive expert system for managing data on the competencies of a modern manager by combining artificial and human intelligence; such a system can serve as a decision support tool for managers in real conditions to improve the efficiency of a particular enterprise. The study of the demand for managers and the requirements for candidates in the Russian Federation and the Republic of Tatarstan was conducted on data from HeadHunter, the largest Russian Internet recruitment company. To develop the structural model of the proposed expert system, information from specialized scientific publications in the Russian and international literature indexed in the Web of Science and Scopus databases was used. The expert system will allow managers to find the best options for deploying employees and to predict the development of the enterprise as a whole and of its individual divisions, which will significantly increase the key performance indicators of any company.
Purpose
This study aims to investigate the relationship between market power and firms’ performance in the Indonesian manufacturing industry.
Design/methodology/approach
Using the Statistik Industri Besar dan Sedang from BPS, we extract data on the market share and productivity of each firm, which represent market power and firm performance, respectively. The dataset also allows us to apply dynamic panel data methods that address the endogeneity and reverse-causality problems which could occur in the estimation.
Findings
The results suggest that market power has an inverted U-shaped relationship with firm productivity. Further analysis shows that similar conditions also occur in all selected industries except automotive.
Research limitations/implications
This study could help policymakers if they want to influence firms’ performance based on their market share.
Originality/value
This paper applies the dynamic panel data (DPD) method to address the endogeneity problem that might have occurred in previous studies.
The goal of our research was to investigate the effects of collaboration between academia and industry on Natural Language Processing (NLP). To do this, we created a pipeline to extract affiliations and citations from NLP papers and divided them into three categories: academia, industry, and hybrid (collaborations between academia and industry). Our empirical analysis found a trend towards an increase in industry and academia-industry collaborative publications, and that these publications tend to have a higher impact than those produced solely within academia.
Andrés Felipe Posada-Moreno, Kai Müller, Florian Brillowski
et al.
Industry 4.0 leverages digital technologies and machine learning techniques to connect and optimize manufacturing processes. Central to this idea is the ability to transform raw data into human-understandable knowledge for reliable data-driven decision-making. Convolutional Neural Networks (CNNs) have been instrumental in processing image data, yet their ``black box'' nature complicates the understanding of their prediction process. In this context, recent advances in the field of eXplainable Artificial Intelligence (XAI) have proposed the extraction and localization of concepts, i.e., the visual cues that intervene in the prediction process of CNNs. This paper tackles the application of concept extraction (CE) methods to Industry 4.0 scenarios. To this end, we modify a recently developed technique, ``Extracting Concepts with Local Aggregated Descriptors'' (ECLAD), improving its scalability. Specifically, we propose a novel procedure for calculating concept importance, utilizing a wrapper function designed for CNNs, aimed at decreasing the number of times each image needs to be evaluated. Subsequently, we demonstrate the potential of CE methods by applying them to three representative industrial use cases in the context of quality control for material design (tailored textiles), manufacturing (carbon fiber reinforcement), and maintenance (photovoltaic module inspection). In these examples, CE successfully extracted and located concepts directly related to each task: the visual cues associated with each concept coincided with those human experts would use to perform the task themselves, even when the cues were entangled between multiple classes. Through empirical results, we show that CE can be applied for understanding CNNs in an industrial context, yielding useful insights that relate to domain knowledge.
Recommender systems (RSs) are an established technology with successful applications in social media, e-commerce, entertainment, and more. RSs are indeed key to the success of many popular apps, such as YouTube, TikTok, Xiaohongshu, Bilibili, and others. This paper explores the methodology for improving modern industrial RSs. It is written for experienced RS engineers who are diligently working to improve their key performance indicators, such as retention and duration. The experiences shared in this paper have been tested in real industrial RSs and are likely to generalize to other RSs as well. Most of the content in this paper is industry experience without publicly available references.