Abstract Industry 4.0 ushers in the era of digitalization. Everything becomes digital: business models, environments, production systems, machines, operators, products, and services, all interconnected in a digital scene through corresponding virtual representations. Physical flows are continuously mapped onto digital platforms. At a higher level of automation, many systems and software tools enable factory communication using the latest information and communication technologies, leading to the state-of-the-art factory, not only inside but also outside its walls, and engaging all elements of the value chain in real time. Everything is smart. This disruptive impact on manufacturing companies enables the smart manufacturing ecosystem paradigm, and Industry 4.0 marks a turning point away from conventional centralized applications. This paper surveys the Industry 4.0 environment, describing the so-called enabling technologies and systems across the manufacturing landscape.
Technological revolutions mark profound transformations in socio-economic systems. They are associated with the development and diffusion of general-purpose technologies (GPTs) that display very high degrees of pervasiveness, dynamism and complementarity. This article provides an in-depth examination of the technologies underpinning the “factory of the future” as profiled by the Industry 4.0 paradigm. It contains an exploratory comparative analysis of the technological bases and the emergent patterns of development of Internet of Things, big data, cloud, robotics, artificial intelligence, and additive manufacturing. We qualify the “enabling” nature of these technologies. We then test whether, taken together and individually, they display the characteristics of generality, originality, and longevity associated with GPTs. Finally, we discuss key themes for future research on this topic from an industrial structural change perspective.
Md. Jahidul Islam Razin, Md. Abdul Karim, M. F. Mridha et al.
Business sentiment analysis (BSA) is a significant and popular topic in natural language processing: sentiment analysis applied for business purposes. Various categories of sentiment analysis techniques, such as lexicon-based techniques and different machine learning algorithms, have been applied to languages such as English, Hindi, and Spanish. In this paper, a long short-term memory (LSTM) recurrent neural network is applied to business sentiment analysis. The LSTM model is used in a modified approach to prevent the vanishing gradient problem that affects the conventional recurrent neural network (RNN). The modified RNN model is applied to a product review dataset, with 70% of the data used to train the LSTM and the remaining 30% used for testing. The results of this modified RNN model are compared with those of other conventional RNN models, and it is noted that the proposed model performs better, achieving around 91.33% accuracy. Using this model, any business company or e-commerce site can identify from customer feedback which types of products customers like or dislike, and based on these customer reviews it can evaluate its marketing strategy.
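The vanishing-gradient mitigation the abstract refers to comes from the LSTM cell's gated, additive state update. A minimal scalar sketch of one cell step follows; the scalar weights, dictionary layout, and function names are illustrative, not the paper's actual model:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One scalar LSTM cell step. W maps each gate to (w_x, w_h, b)."""
    i = sigmoid(W["i"][0] * x + W["i"][1] * h_prev + W["i"][2])  # input gate
    f = sigmoid(W["f"][0] * x + W["f"][1] * h_prev + W["f"][2])  # forget gate
    o = sigmoid(W["o"][0] * x + W["o"][1] * h_prev + W["o"][2])  # output gate
    g = math.tanh(W["g"][0] * x + W["g"][1] * h_prev + W["g"][2])  # candidate
    # Additive cell update: gradients flow through f without the repeated
    # squashing that makes a plain RNN forget long-range sentiment cues.
    c = f * c_prev + i * g
    h = o * math.tanh(c)
    return h, c

def lstm_forward(xs, W):
    """Run the cell over a sequence (e.g. an encoded review) and return the final h."""
    h = c = 0.0
    for x in xs:
        h, c = lstm_step(x, h, c, W)
    return h
```

With the forget gate saturated open and the input gate closed, the cell state is carried across steps nearly unchanged, which is the mechanism that preserves long-range information.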
Next activity prediction represents a fundamental challenge for optimizing business processes in service-oriented architectures such as microservices environments, distributed enterprise systems, and cloud-native platforms; accurate prediction enables proactive resource allocation and dynamic service composition. Despite the prevalence of sequence-based methods, these approaches fail to capture non-sequential relationships that arise from parallel executions and conditional dependencies. Graph-based approaches preserve structure, but they rely on homogeneous representations and static structures that apply uniform modeling strategies regardless of individual process complexity. To address these limitations, we introduce RLHGNN, a novel framework that transforms event logs into heterogeneous process graphs with three distinct edge types grounded in established process mining theory. Our approach creates four flexible graph structures by selectively combining these edges to accommodate different process complexities, and employs reinforcement learning, formulated as a Markov decision process, to automatically determine the optimal graph structure for each specific process instance. RLHGNN then applies heterogeneous graph convolution with relation-specific aggregation strategies to effectively predict the next activity. This adaptive methodology enables precise modeling of both sequential and non-sequential relationships in service interactions. Comprehensive evaluation on six real-world datasets demonstrates that RLHGNN consistently outperforms state-of-the-art approaches. Furthermore, it maintains an inference latency of approximately 1 ms per prediction, making it a practical solution for real-time business process monitoring applications. The source code is available at https://github.com/Joker3993/RLHGNN.
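As a rough illustration of turning an event-log trace into a typed graph, the sketch below uses two hypothetical relation types (direct succession and repetition of the same activity). The abstract does not name RLHGNN's actual three edge types, and the RL-based structure selection is not reproduced here:

```python
def build_process_graph(trace):
    """Build a toy heterogeneous graph from one event-log trace.

    Nodes are event positions; the edge types below are illustrative
    stand-ins for relation types derived from process mining theory.
    """
    edges = {"forward": set(), "repeat": set()}
    # Direct succession: each event points to the one that follows it.
    for i in range(len(trace) - 1):
        edges["forward"].add((i, i + 1))
    # Repetition: link an activity's previous occurrence to its next one,
    # a non-sequential relationship a plain sequence model would flatten.
    last_seen = {}
    for i, activity in enumerate(trace):
        if activity in last_seen:
            edges["repeat"].add((last_seen[activity], i))
        last_seen[activity] = i
    return edges
```

A heterogeneous GNN would then aggregate neighbor messages per edge type, which is what "relation-specific aggregation" refers to.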
This research aims to examine the effect of digitalization on the sustainability of accounting practices in the financial industry. Using a qualitative approach, this research conducted case studies on several leading companies in the financial sector. We collected data on the implementation of digital technologies in accounting practices through in-depth interviews, participatory observation, and document analysis. The findings show that digitalization brings significant changes in accounting practices, especially with regard to improving efficiency, transparency, and data accuracy. The research shows that the integration of technologies such as cloud-based accounting information systems, big data, and artificial intelligence enhances the quality of financial decision-making. In addition, digitalization has a positive impact on environmental sustainability through reduced paper usage and energy efficiency. However, the research also reveals challenges, such as the need to upskill accountants, address data security issues, and adapt to regulatory changes. The findings provide important insights for practitioners and policymakers in responding to the dynamics brought about by digitalization in accounting. This research contributes to the literature on the interaction between technology and accounting practices by providing an in-depth understanding of how financial firms adapt and utilize technological innovations for their business sustainability. The practical implications include recommendations for developing policies that support the integration of technology in accounting, as well as for developing the appropriate human resources.
Scientific and objective evaluation of education quality is an important demand of the current education industry, and artificial intelligence empowering all walks of life has become an inevitable trend of future social development. This study introduces an artificial intelligence scheme for education quality assessment. The evaluation system combines big data and artificial intelligence technology: it constructs an AI model, analyzes the collected teaching quality evaluation index data, and generates objective education quality evaluation results. Evaluating teaching quality in higher education is a complex systems engineering task, characterized by many levels, strong structure, a large amount of information to be processed, and considerable demands on manpower and on material and financial resources. Traditional manual processing cannot meet the requirements of accurate, fast, and reliable evaluation. This study therefore helps realize AI-based teaching evaluation, combined with computer network technology, so that the evaluation better suits the requirements of higher education teaching quality assessment.
Recent research advances in Artificial Intelligence (AI) have yielded promising results for automated software vulnerability management. AI-based models are reported to greatly outperform traditional static analysis tools, indicating a substantial workload relief for security engineers. However, the industry remains very cautious and selective about integrating AI-based techniques into their security vulnerability management workflow. To understand the reasons, we conducted a discussion-based study, anchored in the authors' extensive industrial experience and keen observations, to uncover the gap between research and practice in this field. We empirically identified three main barriers preventing the industry from adopting academic models, namely, complicated requirements of scalability and prioritization, limited customization flexibility, and unclear financial implications. Meanwhile, research works are significantly impacted by the lack of extensive real-world security data and expertise. We proposed a set of future directions to help better understand industry expectations, improve the practical usability of AI-based security vulnerability research, and drive a synergistic relationship between industry and academia.
With the rapid growth and increasing complexity of industrial big data, traditional data processing methods are facing many challenges. This article takes an in-depth look at the application of cloud computing technology in industrial big data processing and explores its potential impact on improving data processing efficiency, security, and cost-effectiveness. The article first reviews the basic principles and key characteristics of cloud computing technology, and then analyzes the characteristics and processing requirements of industrial big data. In particular, this study focuses on the application of cloud computing in real-time data processing, predictive maintenance, and optimization, and demonstrates its practical effects through case studies. At the same time, this article also discusses the main challenges encountered during the implementation process, such as data security, privacy protection, performance and scalability issues, and proposes corresponding solution strategies. Finally, this article looks forward to the future trends of the integration of cloud computing and industrial big data, as well as the application prospects of emerging technologies such as artificial intelligence and machine learning in this field. The results of this study not only provide practical guidance for cloud computing applications in the industry, but also provide a basis for further research in academia.
Industry 4.0 (I4.0) is revolutionizing manufacturing and maintenance processes, with various implications for workers in these sectors. This work examines the effects of I4.0 on production and maintenance, particularly through automation, connectivity, data analysis, and artificial intelligence, which together enable the integration of more automated systems with greater accuracy and production speed. I4.0 also minimizes workplace accidents while providing the potential for round-the-clock production. It introduces a higher level of connectivity through intelligent factories, where workers, machines, and products interact and communicate with each other in real time, enabling better product tracking, real-time inventory management, and predictive maintenance. This improves manufacturing processes and increases efficiency. AI combined with big data supports predictive maintenance by identifying faults early, allowing maintenance professionals to track potential equipment issues and carry out repairs before they become critical. Despite these benefits, maintenance personnel will require re-skilling and re-adjustment to meet the demands of the new industry. Overall, the introduction of I4.0 has brought significant progress to manufacturing, revolutionizing production and maintenance processes with increased efficiency, output, and predictive maintenance.
Abstract Delivering on digitalization for a large multinational business, in the contemporary context of global operations and real-time delivery, is a significant opportunity. Operating localised facilities independently of global operations can compromise global synergies. Centralised functions such as research and development, asset optimisation, corporate planning (strategy, investment planning, financial), and supply chain, together with other functions, deliver significant business value, and integrating them via Industry 4.0 yields significant strategic and operational benefits. This research proposes a global system approach to this challenge, as defined by Industry 4.0 (vertical, horizontal and total business integration), from ERP through manufacturing systems down to instrumentation. The proposed work resolves inter-site challenges together with global standardization and inter-functional integration. The proposed architecture is reinforced by a simulation illustrating the benefits of the integrated business.
Industry 4.0 factories are complex and data-driven. Data is yielded from many sources, including sensors, PLCs, and other devices, but also from IT systems such as ERP or CRM. We ask how to collect and process this data so that it includes metadata and can be used for industrial analytics or to derive intelligent support systems. This paper describes a new, query-model-based approach that uses a big data architecture to capture data from various sources with OPC UA as a foundation. It buffers and preprocesses the information in order to harmonize it and provide a holistic state space of a factory, together with mappings to the current state of a production site. That information can be made available to multiple processing sinks, decoupled from the data sources, which enables them to work with the information without interfering with production devices, disturbing the networks they operate in, or negatively influencing the production process. Metadata and connected semantic information are kept throughout the process, allowing algorithms to be fed with meaningful data that can be accessed in its entirety for time series analysis, machine learning, and similar evaluations, as well as replayed from the buffer for repeatable simulations.
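The decoupling of processing sinks from data sources described above can be sketched as a small in-memory buffer that fans each reading, together with its metadata, out to independently consuming sinks. The class and field names are illustrative; a production system would sit behind an actual OPC UA stack and use a persistent buffer:

```python
import queue

class FactoryBuffer:
    """Buffers source readings (e.g. arriving via OPC UA) and fans them
    out to decoupled sinks, preserving metadata with every sample."""

    def __init__(self):
        self.sinks = []

    def register_sink(self):
        # Each sink gets its own queue, so consumers cannot slow down
        # or interfere with the data sources or with each other.
        q = queue.Queue()
        self.sinks.append(q)
        return q

    def publish(self, value, **metadata):
        # Keep metadata attached to the sample throughout the pipeline.
        record = {"value": value, **metadata}
        for q in self.sinks:
            q.put(record)
```

Because sinks only ever read from their queue, an analytics job or a replayed simulation can consume the same records without touching the production network.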
The implementation of the disruptive technologies of Industry 4.0 is being carried out in all segments of society, but we still do not fully understand the breadth and speed of their application. We are currently witnessing major changes in all industries: new business methods are emerging, along with the transformation of production systems and new forms of consumption, delivery, and transport. All this is happening due to the implementation of disruptive technological discoveries that include the Internet of Things (IoT), advanced robotics, smart sensors, big data, analytics, cloud computing, 3D printing, machine learning, virtual and augmented reality (AR), artificial intelligence, and productive maintenance. Advanced robotics is one of the most important technologies in Industry 4.0. The application of robotics to the automation of production processes, with the support of information technology, leads us to "smart automation", i.e., the "smart factory". The changes are so profound that, from the perspective of human history, there has never been a time of greater promise or potential danger. New-generation robots have many advantages over first-generation industrial robots: they work alongside workers, workers perform their tasks in a safe environment, the robots take up less space, they do not need to be separated by fences, and they are easier to manipulate and cheaper to implement. The paper analyzes the trend of implementing collaborative and service robots for logistics, which make the automation of production processes more flexible. Robotic technology is a basic technology of Industry 4.0, because without it the implementation of Industry 4.0 would not be possible. The application of new-generation robots will only grow in the future, because the goals of the fourth industrial revolution cannot be achieved without collaborative robots. In other words, the objective is to achieve a "smart production process" or "smart factory".
In the current era, many disciplines are devoted to ontology development for their domains, with the intention of creating, disseminating, and managing resource descriptions of their domain knowledge in a machine-understandable and processable manner. Ontology construction is a difficult group activity that involves many people with different expertise: domain experts are generally not familiar with ontology implementation environments, and implementation experts do not have all the domain knowledge. We have designed the Collaborative Business Intelligence Ontology (CBIOnt) for the BI4People project. In this paper, we present CBIOnt, an OWL 2 DL ontology for describing collaborative sessions between different collaborators working together on a business intelligence platform. As a collaborative session between collaborators belongs to some collaborative form, phase, and research aspect, CBIOnt captures this knowledge along with the collaborative session content (comments, questions, answers, etc.), so that various types of information stored in the ontology can be inferred when required. In addition, it stores location and spatio-temporal information about the collaboration held between collaborators. We believe CBIOnt serves as a formal framework for dealing with collaborative sessions taking place among collaborators on the Semantic Web.
With the rapid development of big data technology applications, big data has an increasing impact on the core competitiveness of automotive enterprises. This paper briefly introduces the development history and current status of big data, describes the role big data technology plays in the automotive industry, analyzes problems in the automotive sales link and the improvements big data can bring, and proposes optimization strategies for the current automotive aftermarket. It also discusses the limitations of current big data technology and the difficulties and pain points of the research, while looking to the future and affirming the development prospects of big data. In the future, big data technology will become an indispensable part of the automotive industry chain and play an important role in promoting management optimization in the automotive industry. These results shed light on further exploration of big data analysis in supply chain management.
Taimor Naseri, Mohammad Bagher Arayesh, Marjan Vahedi
Despite the rich literature on entrepreneurial marketing, research on cooperative businesses, with their distinct cultural and economic contexts, has undergone major changes, and existing patterns and models of entrepreneurial marketing do not explain the entrepreneurial marketing methods of these businesses well. The main purpose of this study is therefore to identify an implementation model for strategic entrepreneurial marketing in production cooperatives. In terms of purpose, this study is applied research; in terms of data collection, it is qualitative. To identify the pattern of strategic entrepreneurial marketing implementation in production cooperatives, 124 articles were reviewed, of which 26 were selected for final analysis using the Critical Appraisal Skills Programme. First, 298 indicators of strategic entrepreneurial marketing were identified and classified into 46 concepts and 12 categories; the Shannon entropy method was used to determine the weights of the indicators. Based on the findings, the main dimensions of entrepreneurial marketing implementation include strategic marketing thinking, intra-organizational factors, strategic planning, target market selection, a network of communication and organizational capabilities, entrepreneur-centered marketing strategies, strategic requirements, entrepreneurial requirements, systematic support for cooperative management, market entry methods, market research and consumer behavior analysis, and market control and evaluation. Managers of production cooperatives can use these results to identify new customers and their diverse needs, increase market share, and create a competitive advantage.
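The Shannon entropy weighting step mentioned above follows the standard entropy weight method: indicators whose values are spread almost uniformly across items carry little discriminating information and receive low weight. A minimal sketch on a hypothetical decision matrix (the paper's actual indicator data is not reproduced):

```python
import math

def entropy_weights(matrix):
    """Entropy weight method. Rows are items, columns are indicators
    (non-negative values). Returns one normalized weight per indicator."""
    m = len(matrix)        # number of items
    n = len(matrix[0])     # number of indicators
    k = 1.0 / math.log(m)  # scales entropy into [0, 1]
    raw = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        # Shannon entropy of the indicator's value distribution.
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        raw.append(1.0 - e)  # high entropy -> low information -> low weight
    s = sum(raw)
    return [w / s for w in raw]
```

An indicator with identical values for every item has entropy 1 and thus weight 0; a strongly skewed indicator absorbs the remaining weight.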
There is a growing interest in product aesthetics analytics and design. However, the lack of available large-scale data that covers various variables and information is one of the biggest challenges faced by analysts and researchers. In this paper, we present our multidisciplinary initiative of developing a comprehensive automotive dataset from different online sources and formats. Specifically, the created dataset contains 1.4 million images from 899 car models and their corresponding model specifications and sales information over more than ten years in the UK market. Our work makes significant contributions to: (i) research and applications in the automotive industry; (ii) big data creation and sharing; (iii) database design; and (iv) data fusion. Apart from our motivation, technical details and data structure, we further present three simple examples to demonstrate how our data can be used in business research and applications.
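A toy illustration of the kind of key-based fusion such a dataset requires: joining image records with specification and sales records on a shared model identifier. All records and field names below are hypothetical, not the dataset's actual schema:

```python
# Hypothetical miniature of three sources being fused on a model key.
images = [{"model": "astra", "img": "astra_001.jpg"},
          {"model": "astra", "img": "astra_002.jpg"},
          {"model": "fiesta", "img": "fiesta_001.jpg"}]
specs = {"astra": {"doors": 5}, "fiesta": {"doors": 3}}
sales = {"astra": 1200, "fiesta": 900}

# Inner join: keep only images whose model has matching spec/sales records,
# and attach those attributes to each image record.
fused = [{**rec, **specs[rec["model"]], "units_sold": sales[rec["model"]]}
         for rec in images
         if rec["model"] in specs and rec["model"] in sales]
```

At the dataset's real scale this join would run in a database, but the key-matching logic is the same.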