Results for "iot"

Showing 20 of ~487,500 results · from CrossRef, DOAJ, Semantic Scholar

S2 Open Access 2015
A review on Internet of Things (IoT), Internet of Everything (IoE) and Internet of Nano Things (IoNT)

M. Miraz, Maaruf Ali, P. Excell et al.

The current prominence and future promises of the Internet of Things (IoT), Internet of Everything (IoE) and Internet of Nano Things (IoNT) are extensively reviewed and a summary survey report is presented. The analysis clearly distinguishes between IoT and IoE which are wrongly considered to be the same by many people. Upon examining the current advancement in the fields of IoT, IoE and IoNT, the paper presents scenarios for the possible future expansion of their applications.

421 citations · en · Computer Science
CrossRef Open Access 2026
Intelligent Railway Wagon Health Assessment Using IoT Sensors and Predictive Analytics for Safety-Critical Applications

Shiva Kumar Mysore Gangadhara, Krishna Alabhujanahalli Neelegowda, Anitha Arekattedoddi Chikkalingaiah et al.

The safety and reliability of railway wagon operations largely depend on the timely detection of degradation in safety-critical components such as axle bearings, wheelsets, and braking systems. Conventional maintenance strategies based on fixed inspection intervals are often inadequate for capturing the actual operating conditions of wagon components, leading to delayed fault detection or unnecessary maintenance actions. To address these limitations, this paper proposes a sensor-based health assessment framework for the continuous monitoring of railway wagons under operational conditions. The proposed framework integrates multi-sensor data acquisition, systematic signal preprocessing, feature-based health indicator construction, and temporal degradation analysis to evaluate component health in real time. A safety-oriented decision logic is employed to classify operating conditions and generate reliable alerts while minimizing false detections caused by transient disturbances. The effectiveness of the proposed approach is validated using a publicly available run-to-failure bearing dataset that exhibits degradation characteristics similar to those observed in railway wagon axle bearings. Experimental results demonstrate that the proposed framework achieves improved classification accuracy, higher detection reliability, reduced false alarm rates, and lower detection latency compared to representative existing condition monitoring approaches. In addition, the computational efficiency of the proposed model confirms its suitability for real-time deployment. The results indicate that the proposed health assessment framework provides a practical and reliable solution for safety-critical railway wagon monitoring and forms a strong foundation for future extensions toward predictive maintenance and remaining useful life estimation.

CrossRef Open Access 2025
Utilization of Solar Panels: An Efficiency Study of IoT-Based Chili Plant Irrigation

Fahrul Apriansyah, Fery Antony, Hastha Sunardi

This study aims to help farmers water their crops automatically and on schedule, using solar panels as the primary energy source. The system is developed for chili plants, an important commodity in the public food supply. The solar panel absorbs solar energy, which is converted into electricity and stored in a battery through a Solar Charge Controller (SCC) in order to power the entire irrigation system. The system is controlled by an ESP32 microcontroller integrated with a soil moisture sensor and an air temperature and humidity sensor (DHT11). Watering is triggered automatically based on soil moisture readings and a predefined watering schedule. Chili plants grow best at soil moisture between 60% and 80%, so the pump activates when moisture falls below 60% and stops once the optimal threshold is reached. Test results show that the system works well, watering accurately and effectively according to actual field conditions. With this system, farmers are expected to save time and labor while supporting more environmentally friendly agricultural practices through the use of renewable energy.
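The watering rule this abstract describes (pump on below 60% soil moisture, off once the optimal range is reached) can be sketched as a small hysteresis controller. Treating 80% as the switch-off point and the function names below are illustrative assumptions, not the authors' firmware:

```python
# Hysteresis controller for the soil-moisture rule in the abstract:
# pump on below 60%, off once the top of the 60-80% optimal band is
# reached, otherwise keep the current state. Thresholds and names are
# illustrative assumptions, not taken from the paper's code.

LOW_THRESHOLD = 60.0   # pump activates below this soil moisture (%)
HIGH_THRESHOLD = 80.0  # assumed switch-off point at the top of the band (%)

def update_pump(moisture: float, pump_on: bool) -> bool:
    """Return the new pump state for a soil-moisture reading in percent."""
    if moisture < LOW_THRESHOLD:
        return True        # too dry: start watering
    if moisture >= HIGH_THRESHOLD:
        return False       # optimal band reached: stop watering
    return pump_on         # inside the band: keep the current state
```

On an ESP32 this logic would run in a loop over periodic sensor reads; the hysteresis keeps the pump from chattering around a single threshold.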

DOAJ Open Access 2025
IoT-Based Off-Grid Solar Power Supply: Design, Implementation, and Case Study of Energy Consumption Control Using Forecasted Solar Irradiation

Marijan Španer, Mitja Truntič, Darko Hercog

This article presents the development and implementation of an IoT-enabled, off-grid solar power supply prototype designed to power a range of electrical devices. The developed system comprises a Photovoltaic panel, a Maximum Power Point Tracking (MPPT) charger, a 2.5 kWh/24 V high-performance LiFePO4 battery bank with a Battery Management System, an embedded controller with IoT connectivity, and DC/DC and DC/AC converters. The PV panel serves as the primary energy source, with the MPPT controller optimizing battery charging, while the DC/DC and DC/AC converters supply power to the connected electrical devices. The article includes a case study of a developed platform for powering an information and advertising system. The system features a predictive energy management algorithm, which optimizes the appliance operation based on daily solar irradiance forecasts and real-time battery State-of-Charge monitoring. The IoT-enabled controller obtains solar irradiance forecasts from an online meteorological service via API calls and uses these data to estimate energy availability for the next day. Using this prediction, the system schedules and prioritizes the operations of connected electrical devices dynamically to optimize the performance and prevent critical battery discharge. The IoT-based controller is equipped with both Wi-Fi and an LTE modem, enabling communication with online services via wireless or cellular networks.

Technology, Engineering (General). Civil engineering (General)
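The predictive energy-management loop described in the off-grid solar abstract above (estimate tomorrow's harvest from a forecast, then gate appliance operation on the energy budget and battery State-of-Charge) could look roughly like this. The panel parameters, device list, and minimum-SoC figure are hypothetical; only the 2.5 kWh battery size comes from the abstract:

```python
# Sketch of a forecast-driven scheduler in the spirit of the paper:
# estimate tomorrow's harvest from an hourly irradiance forecast, then
# admit appliances in priority order while protecting a minimum battery
# State-of-Charge. All numbers below are illustrative assumptions except
# the 2.5 kWh battery capacity mentioned in the abstract.

PANEL_AREA_M2 = 1.6
PANEL_EFFICIENCY = 0.20
BATTERY_WH = 2500.0    # 2.5 kWh LiFePO4 bank, as in the abstract
MIN_SOC = 0.20         # never plan below 20% State-of-Charge (assumed)

def forecast_harvest_wh(hourly_irradiance_w_m2):
    """Energy expected tomorrow from an hourly W/m^2 irradiance forecast."""
    return sum(g * PANEL_AREA_M2 * PANEL_EFFICIENCY
               for g in hourly_irradiance_w_m2)

def schedule(devices, soc, harvest_wh):
    """Admit (name, wh_per_day) devices in list order (priority) until the
    budget -- forecast harvest plus usable battery energy -- runs out."""
    budget = harvest_wh + max(0.0, soc - MIN_SOC) * BATTERY_WH
    plan = []
    for name, need_wh in devices:
        if need_wh <= budget:
            plan.append(name)
            budget -= need_wh
    return plan
```

In the real system the irradiance series would come from the meteorological API the abstract mentions; here it is just a list of numbers.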
DOAJ Open Access 2025
Tuomo Sipola, Janne Alatalo, Monika Wolfmayr, and Tero Kokkonen, eds., Artificial Intelligence for Security: Enhancing Protection in a Changing World, Cham: Springer Nature, 2024, 307 p., ISBN 978-3031574511

Lokesh Swami

This review critically examines Artificial Intelligence for Security: Enhancing Protection in a Changing World, edited by Tuomo Sipola, Janne Alatalo, Monika Wolfmayr, and Tero Kokkonen, published by Springer Nature in 2024. Structured in three parts, “Methodological Fundamentals”, “Critical Infrastructure Protection”, and “AI for Anomaly Detection”, the volume brings together theoretical insights and applied studies to explore how AI enhances security systems. Part I engages with foundational concepts such as differential privacy, explainable AI, and adversarial robustness. Part II presents sector-specific applications ranging from logistics and smart grids to healthcare systems. Part III demonstrates AI’s utility in real-time anomaly detection, providing empirical results on web-attack detection, log analysis, and Internet of Things (IoT) intrusion modeling. The book’s strengths lie in its interdisciplinary approach, ethical framing, and strong emphasis on real-world applications. Case studies such as the use of fuzzy logic in smart grids and hybrid models for IoT defense underscore the book’s practical relevance. However, limitations include a narrow domain scope (with less attention to areas like financial or defense security), minimal engagement with geopolitical dynamics, and overly technical vocabulary that may limit accessibility for non-specialist readers. Despite these constraints, the volume makes a substantial contribution to the field, integrating technically precise analysis with policy-aligned ethics, offering a valuable resource for cybersecurity researchers, practitioners, and policy architects concerned with building reliable AI-enabled systems.

Political science
DOAJ Open Access 2025
Energy Consumption Modeling for Wi-Fi HaLow Networks

Zhiqiang Xu, Luke Kane, Vicky Liu et al.

Wi-Fi HaLow (IEEE 802.11ah) has emerged as a promising solution for Internet of Things (IoT) applications where energy efficiency and extended coverage are important. A key feature of Wi-Fi HaLow is the Target Wake Time (TWT) mechanism, which allows devices to schedule wake-up times, significantly reducing idle listening and energy consumption. However, there is currently no energy consumption model for such devices, leaving a gap in calculating how much energy a device actually consumes in a real network. This study aims to bridge this gap by developing a forecast model to accurately predict the energy consumption of devices with TWT enabled. The proposed model is then validated through experimental measurements using real Wi-Fi HaLow-compatible devices, ensuring an accurate representation of practical energy consumption. This research provides empirical insights and recommendations for optimizing network configurations in battery-constrained environments. In particular, the proposed energy consumption model can assist businesses in accurately estimating and managing energy usage, which is essential for cost-effective planning and improving operational efficiency in real-world IoT deployments.

Telecommunication, Transportation and communications
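The TWT mechanism described in the Wi-Fi HaLow abstract above suggests, at its simplest, a duty-cycle energy model: the station is awake for a short service period each TWT interval and sleeps otherwise. The current draws and timings below are placeholder assumptions, not the paper's measurements:

```python
# Duty-cycle energy model for a TWT station: awake for `awake_s` seconds
# per `twt_interval_s` TWT interval, asleep the rest of the time.
# The current figures are illustrative placeholders, not measured values.

I_AWAKE_MA = 60.0   # active rx/tx current in mA (assumed)
I_SLEEP_MA = 0.05   # deep-sleep current in mA (assumed)

def avg_current_ma(twt_interval_s: float, awake_s: float) -> float:
    """Time-weighted average current over one TWT interval."""
    sleep_s = twt_interval_s - awake_s
    return (I_AWAKE_MA * awake_s + I_SLEEP_MA * sleep_s) / twt_interval_s

def battery_life_days(capacity_mah: float,
                      twt_interval_s: float, awake_s: float) -> float:
    """Estimated battery lifetime at the modeled average current."""
    return capacity_mah / avg_current_ma(twt_interval_s, awake_s) / 24.0
```

For example, a 1% duty cycle (0.1 s awake every 10 s) keeps the average draw well under 1 mA in this toy model, which is the kind of saving the TWT mechanism targets.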
CrossRef Open Access 2024
GTBTL-IoT: An Approach of Curtailing Task Offloading Time for Improved Responsiveness in IoT-MEC Model

Eram Fatima Siddiqui, Tasneem Ahmed

INTRODUCTION: The Internet of Things (IoT) has transformed daily life by interconnecting digital devices via integrated sensors, software, and connectivity. Although IoT devices excel at real-time data collection and decision-making, their performance on complex tasks is hindered by limited power, resources, and time. To address this, IoT is often combined with cloud computing (CC) to meet time-sensitive demands. However, the distance between IoT devices and cloud servers can result in latency issues. OBJECTIVES: To mitigate latency challenges, Mobile Edge Computing (MEC) is integrated with IoT. MEC offers cloud-like services through servers located near network edges and IoT devices, enhancing device responsiveness by reducing transmission and processing latency. This study aims to develop a solution to optimize task offloading in IoT-MEC environments, addressing challenges like latency, uneven workloads, and network congestion. METHODS: This research introduces the Game Theory-Based Task Latency (GTBTL-IoT) algorithm, a two-way task offloading approach employing Game Matching Theory and Data Partitioning Theory. Initially, the algorithm matches IoT devices with the nearest MEC server using game-matching theory. Subsequently, it splits the entire task into two halves and allocates them to both local and MEC servers for parallel computation, optimizing resource usage and workload balance. RESULTS: GTBTL-IoT outperforms existing algorithms, such as the Delay-Aware Online Workload Allocation (DAOWA) Algorithm, Fuzzy Algorithm (FA), and Dynamic Task Scheduling (DTS), by an average of 143.75 ms with a 5.5 s system deadline. Additionally, it significantly reduces task transmission, computation latency, and overall job offloading time by 59%. Evaluated in an ENIGMA-based simulation environment, GTBTL-IoT demonstrates its ability to compute requests in real-time with optimal resource usage, ensuring efficient and balanced task execution in the IoT-MEC paradigm. 
CONCLUSION: The Game Theory-Based Task Latency (GTBTL-IoT) algorithm presents a novel approach to optimize task offloading in IoT-MEC environments. By leveraging Game Matching Theory and Data Partitioning Theory, GTBTL-IoT effectively reduces latency, balances workloads, and optimizes resource usage. The algorithm's superior performance compared to existing methods underscores its potential to enhance the responsiveness and efficiency of IoT devices in real-world applications, ensuring seamless task execution in IoT-MEC systems.

3 citations · en
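The data-partitioning step in the GTBTL-IoT abstract above (split a task between the device and the matched MEC server and run the halves in parallel) implies the task finishes when the slower half completes. A toy model of that step, with all rates assumed for illustration:

```python
# Toy model of the data-partitioning step: a task of `size` units is
# split into a local share and an offloaded share, and completes when
# the slower of the two parallel halves finishes. All rates below are
# illustrative assumptions, not parameters from the paper.

def completion_time(size: float, local_rate: float, mec_rate: float,
                    link_rate: float, split: float = 0.5) -> float:
    """Parallel finish time for a task split between device and MEC server.

    split      -- fraction executed locally (the paper splits in halves)
    local_rate -- device compute rate (units/s)
    mec_rate   -- MEC server compute rate (units/s)
    link_rate  -- uplink transfer rate for the offloaded share (units/s)
    """
    local = split * size / local_rate
    offload = (1 - split) * size * (1 / link_rate + 1 / mec_rate)
    return max(local, offload)
```

With a fast server and link, the even 50/50 split leaves the device as the bottleneck, which is why offloading schemes often tune the split rather than fix it.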
DOAJ Open Access 2024
Edge-Enhanced TempoFuseNet: A Two-Stream Framework for Intelligent Multiclass Video Anomaly Recognition in 5G and IoT Environments

Gulshan Saleem, Usama Ijaz Bajwa, Rana Hammad Raza et al.

Surveillance video analytics encounters unprecedented challenges in 5G and IoT environments, including complex intra-class variations, short-term and long-term temporal dynamics, and variable video quality. This study introduces Edge-Enhanced TempoFuseNet, a cutting-edge framework that strategically reduces spatial resolution to allow the processing of low-resolution images. A dual upscaling methodology based on bicubic interpolation and an encoder–bank–decoder configuration is used for anomaly classification. The two-stream architecture combines the power of a pre-trained Convolutional Neural Network (CNN) for spatial feature extraction from RGB imagery in the spatial stream, while the temporal stream focuses on learning short-term temporal characteristics, reducing the computational burden of optical flow. To analyze long-term temporal patterns, the extracted features from both streams are combined and routed through a Gated Recurrent Unit (GRU) layer. The proposed framework (TempoFuseNet) outperforms the encoder–bank–decoder model in terms of performance metrics, achieving a multiclass macro average accuracy of 92.28%, an F1-score of 69.29%, and a false positive rate of 4.41%. This study presents a significant advancement in the field of video anomaly recognition and provides a comprehensive solution to the complex challenges posed by real-world surveillance scenarios in the context of 5G and IoT.

Information technology
DOAJ Open Access 2024
A Survey on Green Enablers: A Study on the Energy Efficiency of AI-Based 5G Networks

Zeinab Ezzeddine, Ayman Khalil, Besma Zeddini et al.

In today’s world, the significance of reducing energy consumption globally is increasing, making it imperative to prioritize energy efficiency in 5th-generation (5G) networks. However, it is crucial to ensure that these energy-saving measures do not compromise the Key Performance Indicators (KPIs), such as user experience, quality of service (QoS), or other important aspects of the network. Advanced wireless technologies have been integrated into 5G network designs at multiple network layers to address this difficulty. The integration of emerging technology trends, such as machine learning (ML), which is a subset of artificial intelligence (AI), and AI’s rapid improvements have made the integration of these trends into 5G networks a significant topic of research. The primary objective of this survey is to analyze AI’s integration into 5G networks for enhanced energy efficiency. By exploring this intersection between AI and 5G, we aim to identify potential strategies and techniques for optimizing energy consumption while maintaining the desired network performance and user experience.

Chemical technology
DOAJ Open Access 2024
Web-based platform to collect, share and manage technical data of historical systemic architectures: the Telegraphic Towers along the Madrid-Valencia path

Margherita Lasorella, Pasquale de-Dato, Elena Cantatore

Considering the variety of architectural Cultural Heritage typologies, systemic architectures require specific attention in the recovery process. The dimensions of "extension" and "recurrence" at geographic and technological levels affect the complexity of their knowledge process; they require systematic ways for their categorisation and comprehension to guarantee correct diagnosis and suitable rehabilitation. Recent applications involving Internet of Things (IoT) for the built Cultural Heritage have demonstrated the potentialities of three-dimensional (3D) geographic information system (GIS) models and structured databases in supporting complex degrees of knowledge for technicians, as well as management for administrators. Starting from such experiences, the work presents the setting up of a web-based platform to support the knowledge and management of systemic architectures, considering the geographical distribution of fabrics, natural and anthropic boundary conditions, and technical and administrative details. The platform takes advantage of digital models, machine and deep learning procedures and relational databases, in a GIS-based environment, for the recognition and categorisation of prevalent physical and qualitative features of systemic architectures, the recognition and qualification of dominant and recurrent decays and the management of recovery activities in a semi-automatic way. Specifically, the main digital objects used for testing the applied techniques and setting up the platform are based on Red-Green-Blue (RGB) and mapped point clouds of the historical Telegraphic Towers located along the Madrid-Valencia path, resulting from the on-site investigations. Their choice is motivated by the high level of knowledge about the cases reached in the last years by the authors, allowing them to test rules within the decision support systems and innovative techniques for their decay mapping. 
As the experience has demonstrated, the systematisation of technical details and of the operative pipeline of methods and tools allows the normalisation and standardisation of the intervention-selection process; this offers policymakers an innovative tool, based on traditional procedures, for conservation plans consistent with a priority-based practice.

Museums. Collectors and collecting, Archaeology
DOAJ Open Access 2024
Research on multi-layer network topology optimization strategy for railway internet of things based on game theory benefits

Fang Wang, Kaixuan Su, Bo Liang et al.

In the railway system environment, the interconnection of a vast array of intelligent sensing devices has brought about revolutionary changes in the management and monitoring of railway transportation. However, this also poses challenges to the communication service quality within the railway Internet of Things (IoT). Through collective intelligence and collaboration, the nodes within the railway IoT can not only share data and information but also work synergistically to raise the overall intelligence level and improve the decision-making quality of the network. Therefore, this paper proposes a reconnection mechanism based on the computation of node game-theoretic benefits and optimizes this process with the concept of swarm intelligence collaboration. Initially, the game-theoretic benefit values of the nodes in the railway IoT network are calculated. Subsequently, based on the weight priority of the edges, the two edges with the largest weights are selected, and connections are established between nodes with similar game-theoretic benefit values to enhance the network’s robustness. This approach enables rapid networking and efficient communication transmission within the railway IoT, providing robust assurance for the safe and stable operation of the railway.
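The reconnection rule sketched in this abstract (keep the two highest-weight edges, then link nodes whose game-theoretic benefit values are close) might be prototyped as follows; the benefit values, edge representation, and similarity threshold are invented for illustration:

```python
# Sketch of the benefit-similarity reconnection rule from the abstract:
# keep the two largest-weight edges, then add links between node pairs
# whose benefit values differ by less than a threshold. Data layout and
# threshold are illustrative assumptions, not the paper's formulation.

def reconnect(benefits, edges, threshold=0.1):
    """benefits: {node: game-theoretic benefit value}
    edges:    [(u, v, weight), ...]
    Returns (two heaviest edges, new similarity-based links)."""
    kept = sorted(edges, key=lambda e: e[2], reverse=True)[:2]
    nodes = sorted(benefits)
    new_links = [
        (u, v)
        for i, u in enumerate(nodes)
        for v in nodes[i + 1:]
        if abs(benefits[u] - benefits[v]) < threshold
    ]
    return kept, new_links
```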

DOAJ Open Access 2023
P4-HLDMC: A Novel Framework for DDoS and ARP Attack Detection and Mitigation in SD-IoT Networks Using Machine Learning, Stateful P4, and Distributed Multi-Controller Architecture

Walid I. Khedr, Ameer E. Gouda, Ehab R. Mohamed

Distributed Denial of Service (DDoS) and Address Resolution Protocol (ARP) attacks pose significant threats to the security of Software-Defined Internet of Things (SD-IoT) networks. The standard Software-Defined Networking (SDN) architecture faces challenges in effectively detecting, preventing, and mitigating these attacks due to its centralized control and limited intelligence. In this paper, we present P4-HLDMC, a novel collaborative secure framework that combines machine learning (ML), stateful P4, and a hierarchical logically distributed multi-controller architecture. P4-HLDMC overcomes the limitations of the standard SDN architecture, ensuring scalability, performance, and an efficient response to attacks. It comprises four modules: the multi-controller dedicated interface (MCDI) for real-time attack detection through a distributed alert channel (DAC), the MSMPF, a P4-enabled stateful multi-state matching pipeline function for analyzing IoT network traffic using nine state tables, the modified ensemble voting (MEV) algorithm with six classifiers for enhanced detection of anomalies in P4-extracted traffic patterns, and an attack mitigation process distributed among multiple controllers to effectively handle larger-scale attacks. We validate our framework using diverse test cases and real-world IoT network traffic datasets, demonstrating high detection rates, low false-alarm rates, low latency, and short detection times compared to existing methods. Our work introduces the first integrated framework combining ML, stateful P4, and SDN-based multi-controller architecture for DDoS and ARP detection in IoT networks.
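The modified ensemble voting step in this abstract (six classifiers voting on P4-extracted traffic features) reduces, at its simplest, to a majority vote over the classifiers' outputs. This is a plain-majority stand-in, not the paper's MEV weighting:

```python
# Plain majority vote over binary classifier outputs, as a simplified
# stand-in for the paper's modified ensemble voting (MEV) step.
# Convention (assumed): 1 = attack, 0 = benign traffic.
from collections import Counter

def majority_vote(predictions):
    """Return the label predicted by most of the ensemble's classifiers."""
    (label, _count), = Counter(predictions).most_common(1)
    return label
```

With six classifiers, four flagging a flow as an attack outvote two that do not; the paper's MEV variant would additionally weight or modify this rule.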

DOAJ Open Access 2023
Telemedicine - Application in cardiology

Krishnam P Raju, Prasad G Sistla

Background: As countries around the globe enforce social distancing and self-isolation to fight the COVID-19 pandemic, telemedicine is emerging as a critical tool to connect physicians and other healthcare professionals with patients dealing with chronic cardiovascular conditions. Technology-assisted healthcare delivery is virtually imperative, especially in India, where a large part of the population lives in rural and remote regions. Information and Communication Technology (ICT), the fundamental part of this approach, provides the ability to connect locally to a global network. The current coronavirus pandemic has highlighted the importance of this technology even more, with patients showing apprehension about going to hospitals for routine check-ups. The emergence of the Internet of Things (IoT) has further ensured that a continuum of care can be maintained, with patients having the opportunity to use wearable devices at home and the telemedicine platform to transmit medical data from these devices for consultations. Methods: Literature search on the various applications of telemedicine in healthcare, with specific reference to cardiology. Results: This article highlights our experience in the utilization of this technology for various cardiac conditions, the challenges of this technology at the practical level, and its impact in making healthcare delivery accessible and cost-effective. Conclusions: Information and Communication Technology (ICT) and the advent of the Internet of Things for Medical Devices (IoT-MD) have empowered telemedicine as a powerful model for effective healthcare delivery. The immense data generated from these devices have further encouraged the development of algorithms based on Artificial Intelligence, thereby improving clinical effectiveness and ensuring continuity of care.
Though possibilities of improving clinical efficacy and healthcare outcomes through AI are enormous, we need to be aware of the associated risks and challenges and try to minimize those through multidisciplinary research, and renewed legal and ethical policies.

Diseases of the circulatory (Cardiovascular) system

Page 25 of 24375