Lorenzo Chicchi, Duccio Fanelli, Diego Febbe
et al.
The continuous-variable firing rate (CVFR) model, widely used in neuroscience to describe the complex dynamics of excitatory biological neurons, is here trained and tested as a dynamical classifier. To this end, the model is supplied with a set of attractors embedded a priori in the inter-node coupling matrix via its spectral decomposition. Learning amounts to tuning the residual parameters so as to shape a non-equilibrium path that bridges the input (the data to be classified) and the output (the target memory slot). The imposed attractors are unaltered by the training, which enables ex post comparisons to be drawn, e.g. of the size of their associated basins of attraction. A stochastic variant of the CVFR model is also studied and found to be robust to non-targeted adversarial attacks, which corrupt the items to be classified with a random perturbation. Taken as a whole, we show that a family of biologically plausible models written in terms of coupled ODEs can efficiently cope with a non-trivial classification task.
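The spectral embedding of attractors described above can be illustrated with a minimal sketch: build a coupling matrix whose eigenvectors are chosen "memory" patterns, then run a generic firing-rate dynamics on it. The toy dynamics and eigenvalues below are assumptions for illustration, not the authors' exact CVFR equations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Embed K orthonormal "memory" patterns as eigenvectors of the coupling
# matrix W = sum_k lambda_k * xi_k xi_k^T (spectral decomposition).
N, K = 8, 3
Q, _ = np.linalg.qr(rng.standard_normal((N, K)))  # orthonormal patterns xi_k
lams = np.array([1.5, 1.2, 1.1])                  # hand-picked eigenvalues
W = (Q * lams) @ Q.T

# By construction, each stored pattern is an eigenvector of W:
for k in range(K):
    assert np.allclose(W @ Q[:, k], lams[k] * Q[:, k])

# A generic firing-rate style dynamics (illustrative only):
#   dx/dt = -x + W tanh(x), integrated with forward Euler.
def step(x, dt=0.01):
    return x + dt * (-x + W @ np.tanh(x))

x = 0.1 * rng.standard_normal(N)
for _ in range(5000):
    x = step(x)
```

Because the eigenvalues along the stored patterns exceed 1 while the orthogonal complement has eigenvalue 0, small perturbations grow along the pattern directions and saturate under the tanh nonlinearity, which is the sense in which the patterns act as attractors.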
Abstract With the advancement of text summarization technology, the issue of hallucinations in summaries has garnered increasing attention. Pretrained models often incorporate additional factual information to minimize the occurrence of hallucinations. In research on text summarization with LLMs, accurate samples are typically provided via a chain-of-thought approach, enabling the model to learn the implicit relationship between the source text and the target summary. To address the hallucination problem in text summarization, this paper proposes CLSeq and Nscp, designed for pre-trained models and LLMs, respectively. CLSeq integrates the strengths of human-generated and model-generated summaries to produce high-quality positive samples as targets, while refining the loss function to handle negative samples more effectively. Nscp provides negative examples and explanatory information through the chain-of-thought mechanism. These strategies aim to enhance the model's understanding of the characteristics and causes of hallucinations, thereby reducing the likelihood of factual inconsistencies in the summaries. Experimental results demonstrate that both methods effectively mitigate the hallucination problem in text summarization and exhibit a certain degree of robustness.
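A loss that rewards positive samples and penalizes negative ones, as CLSeq's refinement suggests, is commonly realized with a contrastive (InfoNCE-style) objective. The sketch below is a generic assumption of that idea, not the paper's actual loss: it scores one faithful-summary embedding against hallucinated-summary embeddings.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def contrastive_loss(anchor, positive, negatives, tau=0.1):
    """InfoNCE-style loss: small when the anchor embedding is closer to
    the positive sample than to any of the negative samples."""
    logits = [dot(anchor, positive) / tau] + [dot(anchor, n) / tau for n in negatives]
    m = max(logits)  # subtract the max for numerical stability
    log_z = m + math.log(sum(math.exp(l - m) for l in logits))
    return -(logits[0] - log_z)

# Toy embeddings (hypothetical): the anchor is the source document.
anchor = [1.0, 0.0]
good = [0.9, 0.1]                    # faithful summary
bad = [[-1.0, 0.2], [0.0, -1.0]]     # hallucinated summaries
loss_good = contrastive_loss(anchor, good, bad)
loss_bad = contrastive_loss(anchor, bad[0], [good, bad[1]])
```

Training with such a loss pushes the encoder to separate faithful from hallucinated summaries in embedding space; here `loss_good` is far smaller than `loss_bad`.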
Computer engineering. Computer hardware, Information technology
Alonso Ingar Romero, Qianru Jin, Kevin Kit Parker
et al.
Studying the behavior of electroactive cells, such as firing dynamics and chemical secretion, is crucial for developing human disease models and therapeutics. Following recent advances in cell-culture technology, traditional monolayers are being optimized to better resemble 3D, organ-like structures. The biological and electrochemical complexity of these structures requires devices with adaptive shapes and novel features, such as precise electrophysiological mapping and stimulation in the case of brain- and heart-derived tissues. However, conventional organ-on-chip platforms often fall short, as they do not recreate the native environment of the cells and lack the functional interfaces necessary for long-term monitoring. Origami-on-a-chip platforms offer a solution to this problem, as they can flexibly adapt to the structure of the desired biological sample and can be integrated with functional components enabled by the chosen materials. In this review, the evolution of origami-on-a-chip biointerfaces is discussed, with emphasis on folding stimuli, materials, and key findings. The outlook covers microfluidic integration, functional tissue-engineering scaffolds, and multi-organoid networks, enabling patient-specific diagnoses and therapies through computational and in vitro disease modeling.
Computer engineering. Computer hardware, Control engineering systems. Automatic machinery (General)
This editorial introduces the first issue of the third volume of the Journal of Edge Computing (JEC). It provides an overview of the five articles featured in this issue, which cover diverse applications of edge computing technologies in domains such as cybersecurity, healthcare, and distributed systems. The first article summarizes the 4th Edge Computing Workshop (doors 2024), highlighting research advances in edge computing. The second article proposes an LSTM-based model for detecting cyber attacks in IoT systems using the CIC-IoT2023 dataset. The third article presents a machine-learning model for classifying respiratory system sounds to aid in the early diagnosis of respiratory diseases. The fourth article describes an IoT system that analyzes environmental data using geolocation to generate alerts about potential health risks. The fifth article explores the use of telemetry for dynamic analysis of distributed systems to identify architectural smells and anomalies. The editorial highlights the potential of edge computing technologies in addressing various challenges and expresses gratitude to the authors, reviewers, and editorial team for their contributions.
Abstract As digital transformation progresses across industries, digital twins have emerged as an important technology. In healthcare, digital twins are created by digitizing patient parameters, medical records, and treatment plans to enable personalized care, assist diagnosis, and improve planning. Data is the core of a digital twin, originating from physical and virtual entities as well as services; once processed and integrated, it drives the twin's various components. Medical records are critical healthcare data but present unique challenges for digital twins: storing them in plaintext risks privacy leaks, while encrypting them hinders retrieval. To address this, we present a cloud-based solution combining post-quantum searchable encryption with key generation based on Physical Unclonable Functions (PUFs). The system encrypts medical records in cloud storage, verifies them using a blockchain, and retrieves them via the cloud. By integrating cloud encryption, blockchain verification, and cloud retrieval, we propose a secure and efficient cloud-based medical-records system for digital twins. Our implementation demonstrates that the system provides users with efficient and secure medical-record services compared to related designs, highlighting digital twins' potential to transform healthcare through secure, data-driven personalized care, diagnosis, and planning.
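The core idea of searchable encryption, matching a query against encrypted records without revealing the keyword, can be sketched with a simple HMAC-trapdoor index. This toy is a classical symmetric scheme, not the paper's post-quantum construction, and the PUF-derived key is mocked with random bytes.

```python
import hashlib
import hmac
import os

# In the paper's design the key would be derived from a PUF response;
# os.urandom is a stand-in for illustration.
key = os.urandom(32)

def tag(keyword: str) -> bytes:
    """Deterministic keyword trapdoor: HMAC-SHA256 under the secret key."""
    return hmac.new(key, keyword.encode(), hashlib.sha256).digest()

# Server-side index: encrypted-record id -> set of keyword tags.
# The server never sees the plaintext keywords, only their tags.
index = {
    "rec-001": {tag("diabetes"), tag("mri")},
    "rec-002": {tag("cardiology")},
}

def search(trapdoor: bytes):
    """Return ids of records whose tag set contains the trapdoor."""
    return [rid for rid, tags in index.items() if trapdoor in tags]
```

A client holding `key` computes `tag("diabetes")` and sends only the opaque trapdoor; the cloud can match it yet learns nothing about keywords it has not been queried for.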
Abstract Data are often correlated in real-world datasets. Existing data-privacy algorithms have not treated data correlation as an inherent property of datasets, and this correlation causes privacy leakages that most researchers have left unnoticed. Such leakages are often caused by homogeneity, background-knowledge, and linkage attacks, and the probability of such attacks increases with the magnitude of correlation among the data. The problem is further magnified by the large size of real-world datasets, which we refer to as 'Big Data'. Several researchers have proposed algorithms using machine learning models, correlation analysis, and data-privacy algorithms to prevent privacy leakages due to correlation in large datasets. The proposed work first analyses the correlation among the data. We studied the mutual-information and distance-correlation techniques for data correlation analysis, and found the distance-correlation technique to be more accurate for high-dimensional data. The method then divides the data into blocks using the computed correlation and applies a differential privacy algorithm to meet the data-privacy expectations. The results are derived with respect to multiple parameters, such as data utility, mean average error, variation with data size, and privacy-budget values. The results show that the proposed methodology provides better data utility than the works of other researchers, while offering data-privacy guarantees comparable to the other results. Thus, the proposed methodology gives better data utility while maintaining the required data-privacy commitments.
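The two building blocks named above, distance correlation and the differential-privacy noise step, can both be sketched compactly. The implementation below follows the standard Székely-Rizzo definition of distance correlation; the Laplace-mechanism parameters are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def distance_correlation(x, y):
    """Szekely-Rizzo distance correlation between two 1-D samples:
    1 for exact linear dependence, 0 only under independence."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])  # pairwise distance matrices
    b = np.abs(y[:, None] - y[None, :])
    # Double centering of each distance matrix.
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

rng = np.random.default_rng(1)
x = rng.standard_normal(200)
# Perfect linear dependence yields distance correlation exactly 1.
assert abs(distance_correlation(x, 2 * x + 3) - 1.0) < 1e-9

# Differential-privacy step: release a statistic of a correlated block
# with Laplace noise calibrated to sensitivity/epsilon (toy values).
sensitivity, epsilon = 1.0, 0.5
noisy_mean = x.mean() + rng.laplace(scale=sensitivity / epsilon)
```

Grouping strongly correlated attributes into one block before applying the mechanism is what lets the privacy budget account for correlation rather than treating records as independent.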
Computer engineering. Computer hardware, Information technology
Abstract Beyond the explosively successful applications of deep learning (DL) in natural language processing, computer vision, and information retrieval, numerous Deep Neural Network (DNN) based alternatives have emerged for common security-related scenarios, with malware detection among the more popular. Recently, adversarial learning has gained much attention. However, unlike computer vision applications, a malware adversarial attack is expected to preserve the malware's original malicious semantics. This paper proposes DeepMal, a novel adversarial instruction learning technique for static malware detection. To the best of our knowledge, DeepMal is the first practical and systematic adversarial learning method that can directly produce adversarial samples and effectively bypass static malware detectors powered by DL and machine learning (ML) models while preserving attack functionality in the real world. Moreover, our method conducts small-scale attacks, which can evade typical malware-variant analysis (e.g., duplication checks). We evaluate DeepMal on two real-world datasets, six typical DL models, and three typical ML models. Experimental results demonstrate that, on both datasets, DeepMal can attack typical malware detectors, decreasing the mean F1-score by up to 93.94% and 82.86%, respectively. Three typical types of malware samples (Trojan horses, backdoors, ransomware) are shown to preserve their original attack functionality, and the mean duplication-check ratio of the malware adversarial samples is below 2.0%. In addition, DeepMal can evade dynamic detectors and be easily enhanced by learning more dynamic features with specific constraints.
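The constraint that distinguishes malware attacks from image attacks, perturbing only features that do not affect malicious behavior, can be illustrated on a toy linear detector. The feature split, detector, and step sizes below are hypothetical, not DeepMal's actual instruction-level procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
w = rng.standard_normal(6)  # weights of a toy linear "detector" score w.x
x = rng.standard_normal(6)  # feature vector of a "malware" sample
# First three features stand for semantics-bearing content and are frozen;
# the rest stand for freely editable content (e.g., appended padding).
frozen = np.array([1, 1, 1, 0, 0, 0], dtype=bool)

def attack(x, w, frozen, step=0.5, iters=20):
    """FGSM-style descent on the detection score, restricted to
    non-semantic features so maliciousness is preserved."""
    x = x.copy()
    for _ in range(iters):
        g = w.copy()        # gradient of the linear score w.x
        g[frozen] = 0.0     # never modify semantics-bearing features
        x -= step * np.sign(g)
    return x

adv = attack(x, w, frozen)
assert np.allclose(adv[frozen], x[frozen])  # semantics untouched
assert w @ adv < w @ x                      # detection score reduced
```

Real attacks operate on discrete instructions rather than continuous features, but the masking trick, zeroing the gradient on protected components, is the same idea.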
To reduce changes to the initial community structure of a network during robustness optimization, the influence of the edge-rewiring strategy on community structure is analyzed, and a community-structure-based robustness optimization strategy for complex networks is proposed. The strategy employs the Louvain algorithm to determine the network's community structure, and uses the Simulated Annealing (SA) algorithm to improve the internal robustness of each community. An improved Smart Rewiring strategy is then used to enhance the robustness of connections between communities. On this basis, the Normalized Mutual Information (NMI) indicator is used to evaluate how much of the community structure is retained during robustness optimization. Experimental results on the BA, WS and WU-PowerGrid networks show that, compared with the Smart Rewiring strategy and the MA strategy, the proposed strategy can improve network robustness while retaining the initial community structure of the network as much as possible.
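The NMI indicator used to score community-structure retention compares the partition before and after rewiring. The small implementation below follows the standard definition (mutual information normalized by the geometric mean of the entropies); the example partitions are illustrative.

```python
import math
from collections import Counter

def nmi(part_a, part_b):
    """Normalized mutual information between two node partitions,
    each given as a list of community labels (one per node)."""
    n = len(part_a)
    ca, cb = Counter(part_a), Counter(part_b)
    joint = Counter(zip(part_a, part_b))
    mi = sum(c / n * math.log((c / n) / ((ca[a] / n) * (cb[b] / n)))
             for (a, b), c in joint.items())
    ha = -sum(c / n * math.log(c / n) for c in ca.values())
    hb = -sum(c / n * math.log(c / n) for c in cb.values())
    return mi / math.sqrt(ha * hb) if ha and hb else 1.0

before = [0, 0, 0, 1, 1, 2, 2, 2]         # Louvain partition pre-rewiring
after_shuffled = [1, 1, 1, 2, 2, 0, 0, 0]  # same grouping, labels permuted
assert abs(nmi(before, before) - 1.0) < 1e-12
assert abs(nmi(before, after_shuffled) - 1.0) < 1e-12
```

NMI is invariant under relabeling, so a rewiring step that preserves the grouping scores 1 even if community ids change, which is exactly the retention property the strategy optimizes for.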
The PXIe reconfigurable instrument supports multi-channel parallel testing, which provides a good solution to the problems of test-resource competition and deadlock in shared-resource test systems. To enable smooth operation of the PXIe reconfigurable instrument under a domestic operating system, a PXIe device driver is developed for the Deepin operating system to solve the communication problem between the host computer and the instrument. The Linux character device driver is introduced, and the development process of the PXIe device driver is designed based on the Linux character device driver structure. On this basis, shared-memory mapping is used to improve the efficiency of data interaction between the application program and the driver, and Direct Memory Access (DMA) transfers are implemented based on blocking and interrupt mechanisms. A graphical test program built with Qt Creator is used to verify the driver, and the test results show that the device driver runs stably and the data transmission is accurate and reliable, meeting the communication requirements of the PXIe reconfigurable instrument.
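The shared-memory mapping idea, letting the application map the driver's DMA buffer instead of copying data through read(), can be sketched from user space. Here a temporary file stands in for the device node (in the real driver this would be the character device exposing the DMA buffer via mmap), which is an assumption for illustration.

```python
import mmap
import os
import tempfile

# A regular file plays the role of the driver's DMA buffer; a real
# application would open the PXIe character device node instead.
path = os.path.join(tempfile.mkdtemp(), "fake_dma_buffer")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)  # 4 KiB buffer, zero-initialized

with open(path, "r+b") as f, mmap.mmap(f.fileno(), 4096) as buf:
    # The "device" side fills the buffer (in the driver this is a DMA
    # completion signalled by interrupt); the application then reads the
    # mapped memory directly, with no extra copy through the kernel.
    buf[0:4] = b"\xde\xad\xbe\xef"
    sample = bytes(buf[0:4])
```

The benefit over a read()-based interface is that the mapping is set up once, after which each transfer costs only the DMA itself plus an interrupt, not an additional kernel-to-user copy.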
Yor Alex Remond Recio, Rosa María Figueredo Rodríguez
Computing professionals and those in related fields receive professional-development activities through distance learning. Distance education using Information and Communication Technologies (ICT) is a necessity for the continuing education of professionals. As part of the third educational improvement process, transformations are being carried out in the curricula of the different teaching levels. One of these changes is teaching students to program using the Scratch programming language. The teachers responsible for directing the teaching-learning process are prepared neither technically nor methodologically to use the Scratch application to stimulate students' logical-algorithmic thinking, a fact confirmed by the diagnostic study carried out to characterize their professional-development needs. The objective of this work is to offer teachers an alternative that allows them to strengthen the teaching-learning process of computer science in the face of the new challenges and requirements demanded by the current transformations in education. A professional-development course was designed covering the fundamentals of programming and the use of the Scratch application; this knowledge feeds back into the quality of the teaching-learning process. The experience developed at the institution during the 2020 International Distance Summer School (EVD2020) is shared.
Abstract Diabetes is a chronic disease, or group of metabolic diseases, in which a person suffers from an elevated level of blood glucose, either because insulin production is inadequate or because the body's cells do not respond properly to insulin. The persistent hyperglycemia of diabetes is associated with long-term damage, dysfunction, and failure of various organs, particularly the eyes, kidneys, nerves, heart, and blood vessels. The objective of this research is to make use of significant features, design a prediction algorithm using machine learning, and find the optimal classifier whose results are closest to clinical outcomes. The proposed method focuses on selecting the attributes that aid in the early detection of Diabetes Mellitus using predictive analysis. The results show that the decision tree and random forest algorithms have the highest specificity, 98.20% and 98.00% respectively, making them best suited for the analysis of diabetic data, while Naïve Bayes achieves the best accuracy of 82.30%. The research also generalizes the selection of optimal features from the dataset to improve classification accuracy.
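The two metrics the abstract reports, specificity and accuracy, come straight from the confusion matrix. The small helpers below show the definitions; the confusion-matrix counts are toy numbers, not the paper's data.

```python
def specificity(tn, fp):
    """Specificity (true-negative rate) = TN / (TN + FP): the fraction of
    non-diabetic cases correctly classified as non-diabetic."""
    return tn / (tn + fp)

def accuracy(tp, tn, fp, fn):
    """Accuracy = (TP + TN) / all cases."""
    return (tp + tn) / (tp + tn + fp + fn)

# Toy counts chosen for illustration only.
assert abs(specificity(tn=491, fp=9) - 0.982) < 1e-9
assert abs(accuracy(tp=60, tn=40, fp=10, fn=10) - 100 / 120) < 1e-12
```

Reporting specificity alongside accuracy matters for screening tasks like this one: a classifier can score high accuracy on an imbalanced dataset while still flagging many healthy patients as diabetic.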
Computer engineering. Computer hardware, Information technology
LIU Dongdong, LI Yong, XU Dong, RUAN Chiguang, LU Yakai, LIU Jiangbing
The Routing Protocol for Low-Power and Lossy Networks (RPL) suffers from unbalanced energy consumption due to unbalanced load. To solve this problem, an RPL multipath data transmission mechanism is proposed. During network-topology construction, the optimal multi-parent node set of each node is selected according to the data transmission cost. Combining wireless link quality, node residual energy, node cache occupancy and the number of relay nodes, a data-flow allocation metric is proposed. Based on this metric, a flow allocation strategy is proposed to maximize the load balance of the network and obtain the optimal data transmission scheme. Simulation results show that, compared with RPL and ELT-RPL, this mechanism improves load balance and node energy-consumption balance, prolongs network lifetime and improves routing reliability.
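A composite metric combining link quality, residual energy, cache occupancy and relay count can be sketched as a weighted cost, with traffic steered toward the lowest-cost parent. The weights, normalization, and parent values below are illustrative assumptions, not the paper's exact formula.

```python
# Hypothetical composite transmission-cost metric for multi-parent
# selection in an RPL-like network. Lower cost is better.
def transmission_cost(etx, residual_energy, cache_occupancy, relay_hops,
                      w=(0.4, 0.3, 0.2, 0.1)):
    """Combines link quality (ETX, lower is better), residual energy
    (in [0, 1], higher is better), cache occupancy (in [0, 1], lower is
    better) and the number of relay hops into one scalar cost."""
    return (w[0] * etx
            + w[1] * (1.0 - residual_energy)
            + w[2] * cache_occupancy
            + w[3] * relay_hops)

# Candidate parents with toy state; the flow-allocation strategy would
# split traffic in inverse proportion to these costs.
parents = {
    "p1": transmission_cost(etx=1.2, residual_energy=0.9,
                            cache_occupancy=0.2, relay_hops=2),
    "p2": transmission_cost(etx=2.5, residual_energy=0.4,
                            cache_occupancy=0.8, relay_hops=3),
}
best = min(parents, key=parents.get)
```

Sending a larger share of the flow to `p1` (good link, high energy, empty buffer) is what balances load and delays the death of the first node, extending network lifetime.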
Eduard-Florin PREDESCU, Alexandru STEFAN, Alexis-Valentin ZAHARIA
Software effort estimation has been a hot research topic for decades. The biggest challenge for project managers is to meet their goals within the given time limit, and machine learning can take project management software to a whole new level. The objective of this paper is to show the applicability of neural network algorithms to software effort estimation for project management. To prove the concept we use two machine learning algorithms: the Multilayer Perceptron (MLP) and Long Short-Term Memory (LSTM). To train and test these algorithms we use the Desharnais dataset, which consists of 77 sample projects. Our results show that the Multilayer Perceptron performs better than Long Short-Term Memory, achieving a better coefficient of determination for software effort estimation. Implementing machine learning that can estimate software effort brings real benefits to computer-assisted project management, further enhancing a manager's ability to organize tasks within the project's time limit. However, we must take into account that the dataset available to us was limited, so a real advancement would be to implement and test these algorithms using a real-life company as the subject of testing.
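The comparison criterion named above, the coefficient of determination (R²), measures how much of the variance in actual effort a model's predictions explain. The helper below shows the standard formula; the effort values are toy numbers, not the Desharnais data.

```python
def r_squared(actual, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot.
    1.0 means perfect prediction; 0 means no better than the mean."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot

# Toy effort values in person-hours (illustrative only).
actual = [100.0, 250.0, 400.0, 650.0]
assert r_squared(actual, actual) == 1.0
assert r_squared(actual, [150.0, 200.0, 500.0, 600.0]) < 1.0
```

Comparing MLP and LSTM by R² on a held-out split of the 77 projects is then a matter of calling this function on each model's predictions.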
Computer engineering. Computer hardware, Bibliography. Library science. Information resources
Data synchronization is the key technology for implementing geographically separated active-active data centers. As current remote database synchronization mechanisms have low efficiency, and there is no good solution for synchronizing heterogeneous databases, this paper presents a new remote database synchronization mechanism. It analyzes the procedure by which an application manipulates the database, researches a method of capturing SQL statements from the database connection driver, and designs a consistency algorithm for data verification. Experimental results show that this mechanism improves the efficiency of remote database synchronization and supports the synchronization of heterogeneous databases, providing technical support for the construction of multi-data-center and disaster-recovery systems in the age of big data.
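A consistency check between a source table and its remote replica is commonly built on per-row digests: hash each row, compare digests, and re-synchronize only the keys that differ. The sketch below is a generic version of that idea under an assumed table layout, not the paper's exact verification algorithm.

```python
import hashlib

def row_digest(row):
    """Stable digest of one row; works regardless of the backing DBMS,
    which is what makes it usable across heterogeneous databases."""
    return hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()

def diff_tables(source, replica):
    """source/replica: dict mapping primary key -> row tuple.
    Returns the keys whose rows are stale or missing on the replica."""
    return sorted(
        k for k in source
        if k not in replica or row_digest(source[k]) != row_digest(replica[k])
    )

# Toy tables (hypothetical layout): row 2 is stale, row 3 is missing.
source = {1: ("alice", 10), 2: ("bob", 20), 3: ("carol", 30)}
replica = {1: ("alice", 10), 2: ("bob", 99)}
assert diff_tables(source, replica) == [2, 3]
```

Because only digests cross the wire, the verification cost grows with row count rather than row size, which is what keeps remote synchronization efficient.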