The rise of motivational information systems: A review of gamification research
Jonna Koivisto, Juho Hamari
Abstract Today, our reality and lives are increasingly game-like, not only because games have become a pervasive part of our lives, but also because activities, systems and services are increasingly gamified. Gamification refers to designing information systems to afford similar experiences and motivations as games do, and consequently, attempting to affect user behavior. In recent years, the popularity of gamification has skyrocketed, manifesting in a growing number of gamified applications as well as a rapidly increasing body of research. However, this vein of research has mainly advanced without an agenda, theoretical guidance or a clear picture of the field. To make the picture more coherent, we provide a comprehensive review of the gamification research (N = 819 studies) and analyze the research models and results in empirical studies on gamification. While the results in general lean towards positive findings about the effectiveness of gamification, the number of mixed results is remarkable. Furthermore, education, health and crowdsourcing, as well as points, badges and leaderboards, persist as the most common contexts and ways of implementing gamification. Concurrently, gamification research still lacks coherence in research models and consistency in variables and theoretical foundations. As a final contribution of the review, we provide a comprehensive discussion, consisting of 15 future research trajectories, on the future agenda for the growing vein of literature on gamification and gameful systems within the information systems science field.
1479 citations
Computer Science
A Design Science Research Methodology for Information Systems Research
K. Peffers, T. Tuunanen, M. Rothenberger
et al.
7039 citations
Computer Science
Management information systems
Christine Urquhart, Mohan Ravindranathan
Abstract Technology is the science that studies the processes, methods and operations run on or applied to raw materials, matter or data in order to obtain a certain product. Information is the material signal able to launch a material reaction of a dynamic auto-tuning system, for which the system is conditioned and finalized. Information Technology is the technology needed for handling (procuring, processing, storing, converting and transmitting) information, in particular with the use of computers [Longley, D. & Shain, M. (1985), p. 164]. The importance of IT for economic growth and development is widely recognized, given the impact that technology can have on the success, survival, or failure of the economic activity of enterprises and organizations; IT offers various management information systems (MIS), executive and feedback segments, all of which have important and beneficial implications for management and control.
1722 citations
Computer Science
Survey Research Methodology in Management Information Systems: An Assessment
A. Pinsonneault, K. Kraemer
1465 citations
Computer Science, Sociology
A Survey on Blockchain for Information Systems Management and Security
David Berdik, Safa Otoum, Niko Schmidt
et al.
Abstract Blockchain technologies have grown in prominence in recent years, with many experts citing the potential applications of the technology in regard to different aspects of any industry, market, agency, or governmental organization. In the brief history of blockchain, an incredible number of achievements have been made regarding how blockchain can be utilized and the impacts it might have on several industries. The sheer number and complexity of these aspects can make it difficult to address blockchain potentials and complexities, especially when trying to address its purpose and fitness for a specific task. In this survey, we provide a comprehensive review of applying blockchain as a service for applications within today's information systems. The survey gives the reader a deeper perspective on how blockchain helps to secure and manage today's information systems. The survey contains comprehensive reporting on different instances of blockchain studies and applications proposed by the research community and their respective impacts on blockchain and its use across other applications or scenarios. Some of the most important findings this survey highlights include the fact that blockchain's structure and modern cloud- and edge-computing paradigms are crucial in enabling the widespread adoption and development of blockchain technologies for new players in today's unprecedentedly vibrant global market. Ensuring that blockchain is widely available through public and open-source code libraries and tools will help to ensure that the full potential of the technology is reached and that further developments can be made concerning the long-term goals of blockchain enthusiasts.
477 citations
Computer Science
A Framework for Management Information Systems
G. Gorry, M. Morton
547 citations
Computer Science
Prediction of bank transaction fraud using TabNet—an adaptive deep learning architecture
B.S. Prashanth, Manoj Kumar, Ariful Hoque
et al.
The development of online banking has brought about an increase in fraudulent operations, which is a major problem for banks. This study addresses the urgent requirement for interpretable, scalable, and high-quality fraud detection systems by applying TabNet, an adaptable deep learning architecture, to a Kaggle dataset consisting of actual bank transactions in India. The goal is to maximize operational risk management by improving the accuracy of transaction anomaly detection and ensuring regulatory compliance through transparent models. We utilize a supervised learning pipeline that incorporates the Synthetic Minority Over-sampling Technique (SMOTE) to ensure that classes are balanced. Subsequently, we conduct thorough exploratory data analysis (EDA) to identify patterns of fraud, both during specific times and across behaviors. On this dataset, five different deep learning architectures are tested: DNN, GRU, LSTM, CNN1D, and TabNet. Predictive performance was assessed using a 3-fold cross-validation framework. With a ROC-AUC of 0.9739 and an accuracy of 97.39%, TabNet considerably outperformed the competition. Its sparse feature selection improved interpretability, generalized better on tabular data, and produced fewer false positives and negatives. The findings offer critical insights for operational fraud detection systems and contribute to the broader literature on explainable AI (XAI) in financial decision-making. The study supports Goals 8 and 16 of the Sustainable Development Agenda by promoting inclusive economic growth and institutional transparency. It also offers practical value for real-time implementation in banking infrastructure, supporting strong, policy-compliant, and interpretable decision-support systems.
Finance, Economics as a science
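To make the pipeline described in this abstract concrete, the following is a minimal sketch of a class-balanced fraud-detection workflow with SMOTE and 3-fold cross-validation. It uses scikit-learn and imbalanced-learn with a random forest standing in for TabNet; the file name and label column are illustrative assumptions, not the study's actual data layout.

```python
# Minimal sketch of a class-balanced fraud-detection pipeline with SMOTE and
# 3-fold cross-validation, as described in the abstract. Column names and the
# RandomForest stand-in for TabNet are illustrative assumptions.
import pandas as pd
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold

df = pd.read_csv("bank_transactions.csv")          # hypothetical file name
X = df.drop(columns=["is_fraud"]).to_numpy()       # hypothetical label column
y = df["is_fraud"].to_numpy()

cv = StratifiedKFold(n_splits=3, shuffle=True, random_state=42)
aucs = []
for train_idx, test_idx in cv.split(X, y):
    # Oversample only the training fold to avoid leaking synthetic samples
    # into the evaluation data.
    X_res, y_res = SMOTE(random_state=42).fit_resample(X[train_idx], y[train_idx])
    clf = RandomForestClassifier(n_estimators=200, random_state=42)
    clf.fit(X_res, y_res)
    scores = clf.predict_proba(X[test_idx])[:, 1]
    aucs.append(roc_auc_score(y[test_idx], scores))

print(f"mean ROC-AUC over 3 folds: {sum(aucs) / len(aucs):.4f}")
```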
An Information-Theoretic Method for Dynamic System Identification With Output-Only Damping Estimation
Marios Impraimakis, Feiyu Zhou, Andrew Plummer
The system identification capabilities of a novel information-theoretic method are examined here. Specifically, this work uses information-theoretic metrics and vibration-based measurements to enhance damping estimation accuracy in mechanical systems. The method addresses a key limitation in system identification, signal processing, monitoring, and alert systems. These systems integrate various components, including sensors, data acquisition devices, and alert mechanisms, and are designed to operate in an environment to calculate key parameters such as peak accelerations and the duration of high acceleration values. Current operational modal identification methods, however, tend to produce poor damping estimates due to their empirical nature. This has a significant impact on alert warning systems when alert duration is misestimated, particularly when vibration amplitudes are used as an indicator of danger for monitoring systems in damage or anomaly detection scenarios. To this end, approaches based on Shannon entropy and the Kullback-Leibler divergence are proposed. The primary objective is to monitor vibration levels in near real-time and provide immediate alerts when predefined thresholds are exceeded. In evaluating the proposed approach, both new real-world data from the multi-axis simulation table at the University of Bath and the benchmark International Association for Structural Control-American Society of Civil Engineers (IASC-ASCE) structural health monitoring problem are considered. Importantly, the approach is shown to select the optimal model, which accurately captures the correct alert duration, providing a powerful tool for system identification and monitoring.
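As an illustration of the information-theoretic quantities the abstract names (not the authors' identification method itself), the sketch below computes window-wise Shannon entropy and Kullback-Leibler divergence of a vibration signal against a reference amplitude distribution and raises an alert when an assumed threshold is exceeded.

```python
# Illustrative sketch: window-wise Shannon entropy and Kullback-Leibler
# divergence of an acceleration signal against a reference (healthy) amplitude
# distribution, with a simple alert threshold. Sampling rate, signals, and the
# threshold are assumed values.
import numpy as np
from scipy.stats import entropy

def amplitude_hist(signal, bins):
    """Normalized amplitude histogram used as an empirical distribution."""
    hist, _ = np.histogram(signal, bins=bins, density=True)
    hist = hist + 1e-12                 # avoid zero bins in the KL term
    return hist / hist.sum()

rng = np.random.default_rng(0)
fs = 200                                      # assumed sampling rate [Hz]
reference = rng.normal(0.0, 0.1, 60 * fs)     # healthy-state record
stream = rng.normal(0.0, 0.25, 10 * fs)       # incoming (more energetic) data

bins = np.linspace(-1.0, 1.0, 64)
p_ref = amplitude_hist(reference, bins)

window, kl_threshold = 2 * fs, 0.5            # 2 s windows, illustrative threshold
for start in range(0, len(stream) - window, window):
    seg = stream[start:start + window]
    p_seg = amplitude_hist(seg, bins)
    h = entropy(p_seg)                        # Shannon entropy of the window
    kl = entropy(p_seg, p_ref)                # KL divergence from the reference
    if kl > kl_threshold:
        print(f"t={start / fs:.1f}s  entropy={h:.3f}  KL={kl:.3f}  -> alert")
```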
Agentic Business Process Management Systems
Marlon Dumas, Fredrik Milani, David Chapela-Campa
Since the early 90s, the evolution of the Business Process Management (BPM) discipline has been punctuated by successive waves of automation technologies. Some of these technologies enable the automation of individual tasks, while others focus on orchestrating the execution of end-to-end processes. The rise of Generative and Agentic Artificial Intelligence (AI) is opening the way for another such wave. However, this wave is poised to be different because it shifts the focus from automation to autonomy and from design-driven management of business processes to data-driven management, leveraging process mining techniques. This position paper, based on a keynote talk at the 2025 Workshop on AI for BPM, outlines how process mining has laid the foundations on top of which agents can sense process states, reason about improvement opportunities, and act to maintain and optimize performance. The paper proposes an architectural vision for Agentic Business Process Management Systems (A-BPMS): a new class of platforms that integrate autonomy, reasoning, and learning into process management and execution. The paper contends that such systems must support a continuum of processes, spanning from human-driven to fully autonomous, thus redefining the boundaries of process automation and governance.
Leveraging the Potential of Novel Data in Power Line Communication of Electricity Grids
Christoph Balada, Max Bondorf, Sheraz Ahmed
et al.
Electricity grids have become an essential part of daily life, even if they often go unnoticed. We usually only become aware of this dependence when the electricity grid is no longer available. However, significant changes, such as the transition to renewable energy (photovoltaics, wind turbines, etc.) and an increasing number of energy consumers with complex load profiles (electric vehicles, home battery systems, etc.), pose new challenges for the electricity grid. At the same time, these challenges are usually too complex to be solved with traditional approaches. In this gap, where traditional approaches are reaching their limits, machine learning has become a popular tool to bridge the shortcoming through data-driven approaches. To enable novel ML implementations, we propose the FiN-2 dataset, the first large-scale real-world broadband powerline communications (PLC) dataset. FiN-2 was collected during real practical use in a part of the German low-voltage grid that supplies energy to over 4.4 million people and comprises well over two billion data points collected by more than 5100 sensors. In addition, we present different use cases in asset management, grid state visualization, forecasting, predictive maintenance, and novelty detection to highlight the benefits of this type of data. For these applications, we particularly highlight the use of novel machine learning architectures to extract rich information from real-world data that cannot be captured using traditional approaches. By publishing the first large-scale real-world dataset, we also aim to shed light on the previously largely unrecognized potential of PLC data and to encourage machine-learning-based research in low-voltage distribution networks by presenting a variety of different use cases.
Electrical engineering. Electronics. Nuclear engineering
Integrating BIM, Machine Learning, and PMBOK for Green Project Management in Saudi Arabia: A Framework for Energy Efficiency and Environmental Impact Reduction
Maher Abuhussain, Ali Hussain Alhamami, Khaled Almazam
et al.
This study introduces a comprehensive framework combining building information modeling (BIM), project management body of knowledge (PMBOK), and machine learning (ML) to optimize energy efficiency and reduce environmental impacts in Riyadh’s construction sector. The suggested methodology utilizes BIM for dynamic energy simulations and design visualization, PMBOK for integrating sustainability into project-management processes, and ML for predictive modeling and real-time energy optimization. Implementing an integrated model that incorporates building-management strategies and machine learning for both commercial and residential structures can offer stakeholders a thorough solution for forecasting energy performance and environmental impact. This is particularly essential in arid climates owing to specific conditions and environmental limitations. Using a simulation-based methodology, the framework was evaluated based on two representative case studies: (i) a commercial complex and (ii) a residential building. The neural network (NN), reinforcement learning (RL), and decision tree (DT) were implemented to assess performance in energy prediction and optimization. Results demonstrated notable seasonal energy savings, particularly in spring (15% reduction for commercial buildings) and fall (13% reduction for residential buildings), driven by optimized heating, ventilation, and air conditioning (HVAC) systems, insulation strategies, and window configurations. ML models successfully predicted energy consumption and greenhouse gas (GHG) emissions, enabling targeted mitigation strategies. GHG emissions were reduced by up to 25% in commercial and 20% in residential settings. Among the models, NN achieved the highest predictive accuracy (R<sup>2</sup> = 0.95), while RL proved effective in adaptive operational control. This study highlights the synergistic potential of BIM, PMBOK, and ML in advancing green project management and sustainable construction.
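As a minimal illustration of the energy-prediction step described above, the sketch below fits a decision tree regressor to a few building and weather features; the feature set and synthetic data are assumptions rather than the study's BIM-derived inputs.

```python
# Minimal sketch of the ML energy-prediction step described above: a decision
# tree regressor mapping simple building/weather features to energy use.
# Feature names and the synthetic data are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.uniform(20, 45, n),      # outdoor temperature [deg C]
    rng.uniform(0.1, 0.5, n),    # window-to-wall ratio
    rng.uniform(0.5, 3.0, n),    # insulation R-value proxy
])
# Synthetic energy use [kWh/m^2] loosely tied to the features.
y = 5 * X[:, 0] + 80 * X[:, 1] - 30 * X[:, 2] + rng.normal(0, 5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
model = DecisionTreeRegressor(max_depth=5, random_state=1).fit(X_tr, y_tr)
print(f"R^2 on held-out data: {r2_score(y_te, model.predict(X_te)):.2f}")
```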
Dynamic Sign Language Recognition in Bahasa using MediaPipe, Long Short-Term Memory, and Convolutional Neural Network
Ivana Valentina Lemmuela, Mewati Ayub, Oscar Karnalim
Background: Communication is important for everyone, including individuals with hearing and speech impairments. For this demographic, sign language is widely used as the primary medium of communication with others who share similar conditions or with hearing individuals who understand sign language. However, communication difficulties arise when individuals with these impairments attempt to interact with those who do not understand sign language.
Objective: This research aims to develop models capable of recognizing sign language movements in Bahasa and converting the detected gesture into corresponding words, with a focus on vocabularies related to religious activities. Specifically, the research examined dynamic sign language in Bahasa, which comprised gestures requiring motion for proper demonstration.
Methods: In accordance with the research objective, a sign language recognition model was developed using a MediaPipe-assisted extraction process. Recognition of dynamic sign language in Bahasa was achieved through the application of Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) methods.
Results: The sign language recognition model developed using bidirectional LSTM showed the best result with a testing accuracy of 100%. However, the best result for CNN alone was 86.67%. The integration of CNN and LSTM was observed to improve performance compared to CNN alone, with the best CNN-LSTM model achieving an accuracy of 95.24%.
Conclusion: The bidirectional LSTM model outperformed the unidirectional LSTM by capturing richer temporal information, considering both past and future time steps. Based on the observations made, CNN alone could not match the effectiveness of the bidirectional LSTM, but a combination of CNN with LSTM produced better results. It is also important to state that normalized landmark data was found to significantly improve accuracy. Accuracy within this context was also influenced by shot type variability and specific landmark coordinates. Furthermore, the dataset containing straight-shot videos with x and y coordinates provided more accurate results than those composed of videos with shot variation, which typically require x, y, and z coordinates for optimal accuracy.
Keywords: Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM), MediaPipe, Sign Language
Management information systems
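The bidirectional LSTM recognizer described in this entry can be sketched roughly as follows in Keras, with fixed-length sequences of MediaPipe landmark coordinates as input; the sequence length, landmark count, and vocabulary size are assumed values, not the study's configuration.

```python
# Minimal Keras sketch of a bidirectional-LSTM recognizer over fixed-length
# sequences of MediaPipe landmark coordinates. Sequence length, landmark count,
# and vocabulary size are assumptions, not the study's settings.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

SEQ_LEN = 30          # frames per gesture clip (assumption)
N_FEATURES = 42 * 2   # 21 landmarks per hand x 2 hands x (x, y) (assumption)
N_WORDS = 20          # size of the target vocabulary (assumption)

model = keras.Sequential([
    keras.Input(shape=(SEQ_LEN, N_FEATURES)),
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
    layers.Bidirectional(layers.LSTM(32)),
    layers.Dense(64, activation="relu"),
    layers.Dense(N_WORDS, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder data standing in for normalized MediaPipe landmark sequences.
X = np.random.rand(200, SEQ_LEN, N_FEATURES).astype("float32")
y = np.random.randint(0, N_WORDS, size=200)
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
model.summary()
```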
Requirements for Vitiligo Registry Design in Iran: A Qualitative Content Analysis Study
Zahra Arabkermani, Roxana Sharifian, Peivand Bastani
et al.
Background: Vitiligo is a prevalent skin disorder that has significant biological and social consequences for the affected individuals. Therefore, appropriate measures should be taken to diagnose this disease and treat patients, and powerful information and monitoring systems, such as a registry, are required. This study aimed to identify the design requirements for a vitiligo registry in Iran.
Methods: This qualitative study was conducted using a content analysis approach in 2020. In total, 9 dermatologists and health information management and medical informatics specialists working in Tehran, Shiraz, and Mashhad universities of medical sciences were interviewed. The participants were selected by a non-random purposive sampling method. The data were analyzed manually using a thematic analysis approach.
Results: In this study, 7 major themes and 14 sub-themes were obtained regarding vitiligo registry design requirements. The major themes included registry objectives, structure, data sources, inclusion criteria, classification system, data quality control, and data reporting.
Conclusion: In total, 7 major themes and 14 sub-themes were identified to design the vitiligo registry. Developing a vitiligo registry based on these requirements could provide a better understanding of this disease, deliver high-quality services to patients across the country, and facilitate research on this disease.
Public aspects of medicine
Enhancing Forecast Verification in National Meteorological and Hydrological Services
Thomas C. Pagano, Elizabeth E. Ebert, Mohammadreza Khanarmuei
ABSTRACT Forecast verification is an essential function of National Meteorological and Hydrological Services (NMHSs), underpinning their ability to deliver accurate, reliable, and actionable weather, climate, and water-related information. As NMHSs face increasing demands for transparency, accountability, and continuous improvement, they require robust systems to assess and enhance the quality of their forecasts. This article presents a holistic forecast verification capability development framework, built from over a decade of focused effort at the Australian Bureau of Meteorology. The framework integrates best practices in governance, data management, verification metrics, and communication. It acknowledges the importance of user-centered approaches and highlights areas where verification practices can align with user needs. To support NMHSs in adopting this framework, the article introduces two practical tools: a Verification Planning Template for establishing new verification activities and systems, and a Gap Analysis and Maturity Assessment (GAMA) tool for benchmarking and advancing existing practices. These tools provide structured guidance for planning, evaluating, and improving verification within an NMHS, with the ultimate goal of delivering higher quality forecasts that meet diverse stakeholder needs. The Bureau's progress in implementing this framework demonstrates significant benefits, including improved forecast quality, enhanced coordination across verification efforts, and greater trust among users. However, challenges such as data availability, system integration, and resourcing remain pervasive, both within the Bureau and globally. The tools and insights shared in this article offer a pathway for NMHSs to overcome these obstacles, enabling them to better respond to evolving user expectations and operational demands. This work highlights the value of fostering a strong verification culture, supported by collaboration and knowledge sharing across the international meteorological community. By applying the principles and tools presented here, and customizing them to their circumstances, NMHSs can advance toward resilient, evidence-based verification practices and capabilities that enhance forecast reliability and stakeholder confidence worldwide.
From product to system network challenges in system of systems lifecycle management
Vahid Salehi, Josef Vilsmeier, Shirui Wang
Today, products are no longer isolated artifacts, but nodes in networked systems. This means that traditional, linearly conceived life cycle models are reaching their limits: interoperability across disciplines, variant and configuration management, traceability, and governance across organizational boundaries are becoming key factors. This collective contribution classifies the state of the art and proposes a practical frame of reference for SoS lifecycle management: model-based systems engineering (MBSE) as the semantic backbone, product lifecycle management (PLM) as the governance and configuration level, CAD-CAE as model-derived domains, and digital thread and digital twin as continuous feedback. Based on current literature and industry experience in mobility, healthcare, and the public sector, we identify four principles: (1) referenced architecture and data models, (2) end-to-end configuration sovereignty instead of tool silos, (3) curated models with clear review gates, and (4) measurable value contributions along time, quality, cost, and sustainability. A three-step roadmap shows the transition from product- to network-centric development: piloting with a reference architecture, scaling across variant and supply chain spaces, and organizational anchoring (roles, training, compliance). The results are increased change robustness, shorter throughput times, improved reuse, and informed sustainability decisions. This article is aimed at decision-makers and practitioners who want to make complexity manageable and design SoS value streams to be scalable.
Robust blue-green urban flood risk management optimised with a genetic algorithm for multiple rainstorm return periods
Asid Ur Rehman, Vassilis Glenis, Elizabeth Lewis
et al.
Flood risk managers seek to optimise Blue-Green Infrastructure (BGI) designs to maximise return on investment. Current systems often use optimisation algorithms and detailed flood models to maximise benefit-cost ratios for single rainstorm return periods. However, these schemes may lack robustness in mitigating flood risks across different storm magnitudes. For example, a BGI scheme optimised for a 100-year return period may differ from one optimised for a 10-year return period. This study introduces a novel methodology incorporating five return periods (T = 10, 20, 30, 50, and 100 years) into a multi-objective BGI optimisation framework. The framework combines a Non-dominated Sorting Genetic Algorithm II (NSGA-II) with a fully distributed hydrodynamic model to optimise the spatial placement and combined size of BGI features. For the first time, direct damage cost (DDC) and expected annual damage (EAD), calculated for various building types, are used as risk objective functions, transforming a many-objective problem into a multi-objective one. Performance metrics such as Median Risk Difference (MedRD), Maximum Risk Difference (MaxRD), and Area Under Pareto Front (AUPF) reveal that a 100-year optimised BGI design performs poorly when evaluated for other return periods, particularly shorter ones. In contrast, a BGI design optimised using composite return periods enhances performance metrics across all return periods, with the greatest improvements observed in MedRD (22%) and AUPF (73%) for the 20-year return period, and MaxRD (23%) for the 50-year return period. Furthermore, climate uplift stress testing confirms the robustness of the proposed design to future rainfall extremes. This study advocates a paradigm shift in flood risk management, moving from single maximum to multiple rainstorm return period-based designs to enhance resilience and adaptability to future climate extremes.
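At the core of an NSGA-II-style search like the one described above is Pareto dominance over the two risk objectives named in the abstract (direct damage cost and expected annual damage). The sketch below filters a set of candidate BGI layouts down to the non-dominated ones; the layouts and cost figures are toy values, not results from the study.

```python
# Illustrative sketch of the Pareto-dominance selection underlying an
# NSGA-II-style search: each candidate BGI layout is scored on two risk
# objectives (direct damage cost, expected annual damage), both minimized,
# and only non-dominated layouts are kept. Values are toy numbers.
from typing import Dict, Tuple

def dominates(a: Tuple[float, float], b: Tuple[float, float]) -> bool:
    """True if solution a is at least as good as b on both objectives and
    strictly better on at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Candidate BGI layouts -> (direct damage cost, expected annual damage).
candidates: Dict[str, Tuple[float, float]] = {
    "rain_gardens_north":   (820.0, 41.0),
    "permeable_paving_cbd": (790.0, 47.0),
    "swales_riverside":     (905.0, 38.0),
    "mixed_small_features": (990.0, 52.0),   # dominated by the first entry
}

pareto_front = {
    name: objs
    for name, objs in candidates.items()
    if not any(dominates(other, objs)
               for other_name, other in candidates.items() if other_name != name)
}
print(pareto_front)   # 'mixed_small_features' is filtered out
```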
Farm management information systems: Current situation and future perspectives
S. Fountas, Giacomo Carli, C. Sørensen
et al.
Highlights: farm management information systems centered around the farm manager in open-field crop production; prevailing differences between academic and commercial farm management information systems; grouping of farm management information systems based on cluster analysis. Farm Management Information Systems (FMIS) in agriculture have evolved from simple farm recordkeeping into sophisticated and complex systems to support production management. The purpose of current FMIS is to meet the increased demands to reduce production costs, comply with agricultural standards, and maintain high product quality and safety. This paper presents current advancements in the functionality of academic and commercial FMIS. The study focuses on open-field crop production and centers on farm managers as the primary users and decision makers. Core system architectures and application domains, adoption and profitability, and FMIS solutions for precision agriculture as the most information-intensive application area were analyzed. Our review of commercial solutions involved the analysis of 141 international software packages, categorized into 11 functions. Cluster analysis was used to group current commercial FMIS as well as to examine possible avenues for further development. Academic FMIS involved more sophisticated systems covering compliance-to-standards applications, automated data capture, as well as interoperability between different software packages. Conversely, commercial FMIS applications targeted everyday farm office tasks related to budgeting and finance, such as recordkeeping, machinery management, and documentation, with emerging trends showing new functions related to traceability, quality assurance and sales.
331 citations
Engineering, Computer Science
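The cluster analysis described in the FMIS review can be illustrated with a small sketch: each software package is encoded as a binary vector over function categories and grouped with k-means. The package names, the reduced function list, and the choice of three clusters are illustrative assumptions, not the review's actual data.

```python
# Minimal sketch of clustering FMIS packages: each package is a binary vector
# over function categories, grouped with k-means. Package names, the reduced
# function list, and k=3 are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

functions = ["recordkeeping", "budgeting", "machinery_mgmt",
             "traceability", "quality_assurance", "precision_ag"]

# Rows: packages; columns: 1 if the package offers the function, else 0.
packages = {
    "FarmOffice A":   [1, 1, 1, 0, 0, 0],
    "FarmOffice B":   [1, 1, 0, 0, 0, 0],
    "AgriSuite C":    [1, 0, 1, 1, 1, 0],
    "PrecisionPro D": [0, 0, 1, 1, 0, 1],
    "PrecisionPro E": [0, 1, 1, 1, 0, 1],
}
X = np.array(list(packages.values()))

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for (name, _), cluster in zip(packages.items(), labels):
    print(f"{name}: cluster {cluster}")
```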
Management accountants and strategic management accounting: The role of organizational culture and information systems
Wael Hadid, M. Al-Sayed
Abstract This study aims to contribute to the scant contingency theory literature on the determinants of strategic management accounting (SMA) practices and the role management accountants play. We develop and test a more complex theoretical model than in prior studies to simultaneously examine the role of three variables: management accountant networking, information systems (IS) quality and organizational culture. These have not been examined in a single model before in the SMA literature. Using data from 149 UK manufacturing business units and partial least squares structural equation modeling, our findings document a positive relationship between management accountant networking and the implementation of SMA practices. Moreover, this relationship is positively moderated by IS quality, which further enables management accountants to implement SMA practices. Unlike IS quality, we do not find empirical support for similar moderating effects by the outcome-oriented culture and innovation-oriented culture. Instead, the innovation-oriented culture has a significant indirect positive effect on SMA implementation through management accountant networking but not a direct one. In contrast, we find a direct positive impact of outcome-oriented culture on SMA implementation but not an indirect one through management accountant networking. These results suggest that in outcome-driven business units, the implementation of SMA practices may not be limited to the accounting function. Managers in other functions may be motivated to implement SMA practices even when management accountants are not part of the process.
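The moderation findings above come from PLS-SEM; as a simpler illustration of what a positive moderating effect means, the sketch below fits an ordinary least squares model with an interaction term between management accountant networking and IS quality on synthetic data. It is not the paper's model or data.

```python
# Illustration only (not the paper's PLS-SEM model): a moderating effect can be
# expressed as an interaction term. Here IS quality strengthens the effect of
# management-accountant networking on SMA implementation in synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 149                                      # matches the sample size reported above
networking = rng.normal(0, 1, n)
is_quality = rng.normal(0, 1, n)
sma = (0.3 * networking + 0.2 * is_quality
       + 0.25 * networking * is_quality      # positive moderation built in
       + rng.normal(0, 1, n))

df = pd.DataFrame({"sma": sma, "networking": networking, "is_quality": is_quality})
fit = smf.ols("sma ~ networking * is_quality", data=df).fit()
print(fit.params[["networking", "is_quality", "networking:is_quality"]])
```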
The impact of knowledge management processes on information systems: A systematic review
M. Al-Emran, V. Mezhuyev, Adzhar Kamaludin
et al.
Abstract Knowledge Management (KM) processes play a significant role in the implementation of various Information Systems (IS). Several review studies have been carried out to afford a better understanding of the current research trend of KM processes. However, this issue still needs to be examined from other perspectives. It is observed that previous research neglects the examination of KM processes studies with regard to ISs. The current study systematically reviews and sheds light on KM processes studies related to ISs, aiming to provide a comprehensive analysis of 41 research articles published in peer-reviewed journals from 2001 to 2018. The main findings of this study indicate that knowledge sharing is the most frequently studied KM process, followed by knowledge acquisition and knowledge application. Besides, questionnaire surveys were found to be the research method primarily relied upon for data collection in the context of KM processes. In addition, 78% of the analyzed studies reported positive research outcomes. In terms of IS type, most of the analyzed studies focused on investigating the impact of KM processes on e-business systems, knowledge management systems, and IS outsourcing, respectively. Additionally, in terms of data collection, the majority of the analyzed studies focused primarily on participants who are IS executives/managers. Furthermore, most of the analyzed studies that achieved positive outcomes were carried out in China. To that end, this review study attempts to demonstrate and detail the recent increase in interest and the advancement made in KM processes research considering IS studies, which forms an essential reference for scholars in the KM field.
209 citations
Computer Science
A Scalable Real-Time SDN-Based MQTT Framework for Industrial Applications
E. Shahri, P. Pedreiras, L. Almeida
The increasing prominence of concepts such as Smart Production and the Industrial Internet of Things (IIoT) within the context of Industry 4.0 has introduced a new set of requirements for the engineering of industrial systems, including support for dynamic environments, timeliness guarantees, support for heterogeneity, interoperability and reliability. These requirements are further exacerbated at the network level by the notable rise in the number and variety of devices involved. To stay competitive in this ever-changing industrial landscape while boosting productivity, it is vital to meet those requirements, combining established protocols with emerging technologies. Software-Defined Networking (SDN) is a leading traffic management paradigm that offers flexibility for complex industrial networks, enabling efficient resource allocation and dynamic reconfiguration. Message Queuing Telemetry Transport (MQTT) is a low-overhead application-layer protocol that is gaining popularity in the scope of the IoT and IIoT. However, its Quality-of-Service (QoS) policies do not support timeliness requirements. This article presents a framework that seamlessly integrates SDN and MQTT, enhancing network management flexibility while satisfying real-time requirements found in industrial environments. It leverages the User Properties of MQTTv5 to allow specifying real-time requirements. MQTT traffic is intercepted by a Network Manager that extracts real-time information and instructs an SDN controller to deploy corresponding network reservations. MQTT traffic across multiple edge networks is propagated by selected brokers using multicasting. Extensive experiments validate the proposed approach, demonstrating its superiority over MQTT and Direct Multicast-MQTT (DM-MQTT) in latency reduction. A response time analysis, validated experimentally, emphasizes robust performance across metrics.
Electronics, Industrial engineering. Management engineering
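One way a publisher could attach timing requirements to MQTT traffic via MQTTv5 User Properties, for an SDN-side network manager to read, is sketched below with paho-mqtt (1.x-style client construction is assumed; newer versions also require a callback API version argument). The property names, broker address, and topic are assumptions, not the conventions of the framework described above.

```python
# Sketch of attaching real-time requirements to MQTT traffic via MQTTv5 User
# Properties using paho-mqtt. Property names, broker address, and topic are
# illustrative assumptions.
import paho.mqtt.client as mqtt
from paho.mqtt.properties import Properties
from paho.mqtt.packettypes import PacketTypes

client = mqtt.Client(client_id="sensor-42", protocol=mqtt.MQTTv5)
client.connect("broker.example.local", 1883)   # hypothetical broker

props = Properties(PacketTypes.PUBLISH)
# Hypothetical keys a network manager could parse to set up SDN reservations.
props.UserProperty = [("deadline_ms", "10"), ("period_ms", "50")]

client.publish("plant/cell3/pressure", payload=b"101.3",
               qos=1, properties=props)
client.disconnect()
```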