{"results":[{"id":"ss_4246d60507e8164c55a8a5c0b402c5e786ddc70c","title":"A Survey of Neuromorphic Computing and Neural Networks in Hardware","authors":[{"name":"Catherine D. Schuman"},{"name":"T. Potok"},{"name":"R. Patton"},{"name":"J. Birdwell"},{"name":"Mark E. Dean"},{"name":"G. Rose"},{"name":"J. Plank"}],"abstract":"Neuromorphic computing has come to refer to a variety of brain-inspired computers, devices, and models that contrast the pervasive von Neumann computer architecture. This biologically inspired approach has created highly connected synthetic neurons and synapses that can be used to model neuroscience theories as well as solve challenging machine learning problems. The promise of the technology is to create a brain-like ability to learn and adapt, but the technical challenges are significant, starting with an accurate neuroscience model of how the brain works, to finding materials and engineering breakthroughs to build devices to support these models, to creating a programming framework so the systems can learn, to creating applications with brain-like capabilities. In this work, we provide a comprehensive survey of the research and motivations for neuromorphic computing over its history. We begin with a 35-year review of the motivations and drivers of neuromorphic computing, then look at the major research areas of the field, which we define as neuro-inspired models, algorithms and learning approaches, hardware and devices, supporting systems, and finally applications. We conclude with a broad discussion on the major research topics that need to be addressed in the coming years to see the promise of neuromorphic computing fulfilled. 
The goals of this work are to provide an exhaustive review of the research conducted in neuromorphic computing since the inception of the term, and to motivate further work by illuminating gaps in the field where new research is needed.","source":"Semantic Scholar","year":2017,"language":"en","subjects":["Computer Science"],"url":"https://www.semanticscholar.org/paper/4246d60507e8164c55a8a5c0b402c5e786ddc70c","is_open_access":true,"citations":772,"published_at":"","score":84.16},{"id":"crossref_10.1016/b978-0-443-15892-6.00001-8","title":"Hardware and Software","authors":[{"name":"Marilyn Wolf"}],"abstract":"","source":"CrossRef","year":2026,"language":"en","subjects":null,"doi":"10.1016/b978-0-443-15892-6.00001-8","url":"https://doi.org/10.1016/b978-0-443-15892-6.00001-8","is_open_access":true,"published_at":"","score":70},{"id":"doaj_A+Systematic+Literature+Review+on+Modern+Cryptographic+and+Authentication+Schemes+for+Securing+the+Internet+of+Things","title":"A Systematic Literature Review on Modern Cryptographic and Authentication Schemes for Securing the Internet of Things","authors":[{"name":"Tehseen Hussain"},{"name":"Fraz Ahmad"},{"name":"Dr. Zia  Ur Rehman"}],"abstract":"\nThe rapid integration of the Internet of Things (IoT) into healthcare ecosystems has revolutionized patient monitoring and data accessibility; however, it has simultaneously expanded the cyber-attack surface, leaving sensitive medical data vulnerable to sophisticated breaches. This systematic literature review (SLR) addresses the critical challenge of balancing high-level security with the severe resource constraints of medical sensors and edge devices. By synthesizing evidence from 80 high-impact studies including 18 primary research articles published between 2022 and 2025 this paper evaluates the quality and efficacy of emerging cryptographic frameworks. 
The methodology utilizes a rigorous quality assessment framework to categorize research into \"Strong,\" \"Moderate,\" and \"Weak\" tiers. Key findings reveal a significant paradigm shift toward lightweight symmetric ciphers, such as GIFT and PRESENT, and certificateless authentication protocols like ELWSCAS, which reduce communication overhead in narrow-band environments. The analysis further explores the role of blockchain-assisted decentralization and DNA-based encryption in mitigating Single Point of Failure risks and providing high entropy. While decentralized models significantly enhance data integrity, they frequently encounter a scalability wall regarding transaction latency. Furthermore, the review assesses quantum readiness, noting that while lattice-based standards are being ported to microcontrollers, memory footprints remain a barrier for simpler sensors. Ultimately, this SLR maps the current technical frontiers and provides a strategic roadmap for future research, emphasizing the transition toward lightweight, quantum-resistant architectures as the next essential step in securing the global healthcare IoT infrastructure.\n","source":"DOAJ","year":2026,"language":"","subjects":["Systems 
engineering","Engineering design","Mechanics of engineering. Applied mechanics","Computer engineering. Computer hardware","Computer software"],"url":"https://polyglot.numl.edu.pk/index.php/nijec/article/view/113","is_open_access":true,"published_at":"","score":70},{"id":"doaj_10.1088/2632-2153/ae1f05","title":"Towards instance-wise calibration: local amortized diagnostics and reshaping of conditional densities (LADaR)","authors":[{"name":"Biprateep Dey"},{"name":"David Zhao"},{"name":"Brett H Andrews"},{"name":"Jeffrey A Newman"},{"name":"Rafael Izbicki"},{"name":"Ann B Lee"}],"abstract":"Key science questions, such as galaxy distance estimation and weather forecasting, often require knowing the full predictive distribution of a target variable Y given complex inputs X . Despite recent advances in machine learning and physics-based models, it remains challenging to assess whether an initial model is calibrated for all x , and when needed, to reshape the densities of y toward ‘instance-wise’ calibration. This paper introduces the local amortized diagnostics and reshaping of conditional densities (LADaR) framework and proposes a new computationally efficient algorithm ( Cal-PIT ) that produces interpretable local diagnostics and provides a mechanism for adjusting conditional density estimates (CDEs). Cal-PIT learns a single interpretable local probability–probability map from calibration data that identifies where and how the initial model is miscalibrated across feature space, which can be used to morph CDEs such that they are well-calibrated. We illustrate the LADaR framework on synthetic examples, including probabilistic forecasting from image sequences, akin to predicting storm wind speed from satellite imagery. 
Our main science application involves estimating the probability density functions of galaxy distances given photometric data, where Cal-PIT achieves better instance-wise calibration than all 11 other literature methods in a benchmark data challenge, demonstrating its utility for next-generation cosmological analyses.","source":"DOAJ","year":2025,"language":"","subjects":["Computer engineering. Computer hardware","Electronic computers. Computer science"],"doi":"10.1088/2632-2153/ae1f05","url":"https://doi.org/10.1088/2632-2153/ae1f05","is_open_access":true,"published_at":"","score":69},{"id":"doaj_10.1007/s43926-025-00100-0","title":"Explainable and perturbation-resilient model for cyber-threat detection in industrial control systems Networks","authors":[{"name":"Urslla Uchechi Izuazu"},{"name":"Cosmas Ifeanyi Nwakanma"},{"name":"Dong-Seong Kim"},{"name":"Jae Min Lee"}],"abstract":"Abstract Deep learning-based intrusion detection systems (DL-IDS) have proven effective in detecting cyber threats. However, their vulnerability to adversarial attacks and environmental noise, particularly in industrial settings, limits practical application. Current IDS models often assume ideal conditions, overlooking noise and adversarial manipulations, leading to degraded performance when deployed in real-world environments. Additionally, the black-box nature of DL model complicates decision-making, especially in industrial control systems (ICS) network, where understanding model behavior is crucial. This paper introduces the eXplainable Cyber-Threat Detection Framework (XC-TDF), a novel solution designed to overcome these challenges. XC-TDF enhances robustness against noise and adversarial attacks using regularization and adversarial training respectively, and also improves transparency through an eXplainable Artificial Intelligence (XAI) module. 
Simulation results demonstrate its effectiveness, showing resilience to perturbation by achieving commendable accuracy of 100% and 99.4% on the Wustl-IIoT2021 and Edge-IIoT datasets, respectively.","source":"DOAJ","year":2025,"language":"","subjects":["Computer engineering. Computer hardware","Computer software"],"doi":"10.1007/s43926-025-00100-0","url":"https://doi.org/10.1007/s43926-025-00100-0","is_open_access":true,"published_at":"","score":69},{"id":"ss_a6c330d9d239ee5503e094217cdfbfd02dacde3e","title":"Large-scale experiment in STEM education for high school students using artificial intelligence kit based on computer vision and Python","authors":[{"name":"Meechai Lohakan"},{"name":"C. Seetao"}],"abstract":"This study proposes an artificial intelligence (AI) kit for high school students in science, technology, engineering, and mathematics (STEM). The AI kit includes an edge AI machine and electronic components. A compact, purpose-built kit resembling a laptop was designed for ease of replication and portability. Utilizing pre-trained convolutional neural network models and computer vision algorithms, five Thai schools participated in on-site instructions. A quasi-experimental study assessed the students' learning outcomes using a paired sample t-test. Results revealed improved knowledge and reduced score variation. Additionally, gender analysis confirmed that both male and female students met the learning criteria. The students expressed satisfaction with the distinctive hardware and learning method employed during the class activities. 
Notably, the test results demonstrated that the AI kit enhanced students’ enthusiasm and facilitated comprehension.","source":"Semantic Scholar","year":2024,"language":"en","subjects":["Medicine"],"doi":"10.1016/j.heliyon.2024.e31366","url":"https://www.semanticscholar.org/paper/a6c330d9d239ee5503e094217cdfbfd02dacde3e","pdf_url":"http://www.cell.com/article/S2405844024073973/pdf","is_open_access":true,"citations":19,"published_at":"","score":68.57},{"id":"ss_b0e1b35ef9418a02d8b4f87f926a29f8a6cad2d5","title":"HamLib: A Library of Hamiltonians for Benchmarking Quantum Algorithms and Hardware","authors":[{"name":"Nicolas P. D. Sawaya"},{"name":"Daniel Marti-Dafcik"},{"name":"Y. Ho"},{"name":"Daniel P. Tabor"},{"name":"D. E. B. Neira"},{"name":"Alicia B. Magann"},{"name":"S. Premaratne"},{"name":"P. Dubey"},{"name":"A. Matsuura"},{"name":"Nathan L. Bishop"},{"name":"W. A. Jong"},{"name":"S. Benjamin"},{"name":"Ojas D. Parekh"},{"name":"N. Tubman"},{"name":"Katherine Klymko"},{"name":"Daan Camps"}],"abstract":"For a considerable time, large datasets containing problem instances have proven valuable for analyzing computer hardware, software, and algorithms. One notable example of the value of large datasets is ImageNet [1], a vast repository of images that has been instrumental in testing numerous deep learning packages. Similarly, in the domain of computational chemistry and materials science, the availability of extensive datasets such as the Protein Data Bank [2], the Materials Project [3], and QM9 [4] has greatly facilitated the evaluation of new algorithms and software approaches, while also promoting standardization within the field. These well-defined datasets and problem instances, in turn, serve as the foundation for creating benchmarking suites like MLPerf [5] and LINPACK [6], [7]. 
These suites enable fair and rigorous comparisons of different methodologies and solutions, fostering continuous advancements in various areas of computer science and beyond.","source":"Semantic Scholar","year":2023,"language":"en","subjects":["Computer Science","Physics"],"doi":"10.22331/q-2024-12-11-1559","url":"https://www.semanticscholar.org/paper/b0e1b35ef9418a02d8b4f87f926a29f8a6cad2d5","pdf_url":"https://doi.org/10.22331/q-2024-12-11-1559","is_open_access":true,"citations":38,"published_at":"","score":68.14},{"id":"doaj_10.1016/j.array.2024.100345","title":"A comprehensive review of explainable AI for disease diagnosis","authors":[{"name":"Al Amin Biswas"}],"abstract":"Nowadays, artificial intelligence (AI) has been utilized in several domains of the healthcare sector. Despite its effectiveness in healthcare settings, its massive adoption remains limited due to the transparency issue, which is considered a significant obstacle. To achieve the trust of end users, it is necessary to explain the AI models' output. Therefore, explainable AI (XAI) has become apparent as a potential solution by providing transparent explanations of the AI models' output. In this review paper, the primary aim is to review articles that are mainly related to machine learning (ML) or deep learning (DL) based human disease diagnoses, and the model's decision-making process is explained by XAI techniques. To do that, two journal databases (Scopus and the IEEE Xplore Digital Library) were thoroughly searched using a few predetermined relevant keywords. The PRISMA guidelines have been followed to determine the papers for the final analysis, where studies that did not meet the requirements were eliminated. Finally, 90 Q1 journal articles are selected for in-depth analysis, covering several XAI techniques. Then, the summarization of the several findings has been presented, and appropriate responses to the proposed research questions have been outlined. 
In addition, several challenges related to XAI in the case of human disease diagnosis and future research directions in this sector are presented.","source":"DOAJ","year":2024,"language":"","subjects":["Computer engineering. Computer hardware","Electronic computers. Computer science"],"doi":"10.1016/j.array.2024.100345","url":"http://www.sciencedirect.com/science/article/pii/S2590005624000110","is_open_access":true,"published_at":"","score":68},{"id":"ss_156658d76e7c22c30bf31e3111b85caaef3da089","title":"Engineering Challenges for AI-Supported Computer Vision in Small Uncrewed Aerial Systems","authors":[{"name":"Muhammed Tawfiq Chowdhury"},{"name":"J. Cleland-Huang"}],"abstract":"Computer Vision (CV) is used in a broad range of Cyber-Physical Systems such as surgical and factory floor robots and autonomous vehicles including small Unmanned Aerial Systems (sUAS). It enables machines to perceive the world by detecting and classifying objects of interest, reconstructing 3D scenes, estimating motion, and maneuvering around objects. CV algorithms are developed using diverse machine learning and deep learning frameworks, which are often deployed on limited resource edge devices. As sUAS rely upon an accurate and timely perception of their environment to perform critical tasks, problems related to CV can create hazardous conditions leading to crashes or mission failure. In this paper, we perform a systematic literature review (SLR) of CV-related challenges associated with CV, hardware, and software engineering. We then group the reported challenges into five categories and fourteen sub-challenges and present existing solutions. 
As current literature focuses primarily on CV and hardware challenges, we close by discussing implications for Software Engineering, drawing examples from a CV-enhanced multi-sUAS system.","source":"Semantic Scholar","year":2023,"language":"en","subjects":["Computer Science"],"doi":"10.1109/CAIN58948.2023.00033","url":"https://www.semanticscholar.org/paper/156658d76e7c22c30bf31e3111b85caaef3da089","is_open_access":true,"citations":7,"published_at":"","score":67.21000000000001},{"id":"doaj_Introducing+the+Walkability+Index%2C+an+Index+That+Measures+the+Walkability+of+Public+Spaces","title":"Introducing the Walkability Index, an Index That Measures the Walkability of Public Spaces","authors":[{"name":"Viktória Hideg"},{"name":"Emese Makó"}],"abstract":"In recent years an increasing number of cities and transport planning documents (such as Sustainable Urban Mobility Plan) aim to reduce car traffic and promote active modes of transport – walking and cycling. The development of active modes of transport is increasingly becoming a focus of urban planning. However, detailed information on the needs of pedestrians and aspects of the assessment of a pedestrian-friendly environment are usually not available. In most cases, the only indicator of the effectiveness of improvements is the modal split and the rate of pedestrians. An objective assessment method is needed to help identify areas that need to be developed for walking.\nThe various planning regulations and legislation provide a framework for the design of pedestrian infrastructure, but many aspects that make public spaces attractive and pedestrian-friendly (green spaces, aesthetics, sense of safety, etc.) are not included in the regulations.\nThis problem can be addressed by the walkability index, which can provide an objective, data-based measure of how pedestrian-friendly an area is. It can also be a tool for analysing and monitoring. 
It can show areas where walking conditions are inadequate and intervention is needed. Regularly carrying out the survey can also serve to analyse the impact of measures taken in the meantime. This article describes the methodology and application of the walkability index.","source":"DOAJ","year":2023,"language":"","subjects":["Chemical engineering","Computer engineering. Computer hardware"],"url":"https://www.cetjournal.it/index.php/cet/article/view/14387","is_open_access":true,"published_at":"","score":67},{"id":"ss_f16a5862d8ef9d725d1d51b893dbd6e269d642b2","title":"Case of Study in Online Course of Computer Engineering during COVID-19 Pandemic","authors":[{"name":"P. Lamo"},{"name":"Mikel Perales"},{"name":"Luis de-la-Fuente-Valentín"}],"abstract":"Practical activities and laboratories, where the students handle hardware devices, are an important part of the curriculum in STEAM degrees. In face-to-face learning, the students go to a specific classroom where the hardware is available. However, laboratories are one of the challenges of distance education, due to the impossibility of the students attending classes at a certain place. This is especially relevant since the COVID-19 pandemic, thanks to the increase of students enrolled in distance education. Different approaches to tackle this problem have been adopted, ranging from mixed models where lectures are online but physical attendance to laboratories is required to purely virtual models where virtual worlds or augmented reality have been used to simulate the real hardware. This paper presents the case of study of a flexible laboratory for the use of Arduino in a Computer Technology course with 153 students, geographically distributed in Spain and Latin America. The goal of the case study is to study the impact of such a flexible laboratory in the course, based on four fundamental parameters: student access to the online lectures, participation in the course and marks obtained, and satisfaction surveys. 
The results show that students have increased their marks by 28.8% and their class attendance by 247.18%, doing more elaborate and complex work than in previous courses. Therefore, it is considered that they have satisfactorily integrated the knowledge acquired during the subject, and the projects with Arduino in Computer Technology have an impact on the flexibility and personalization of the education, motivate students and increase its educational productivity and effect on the quality of education, influencing the learning experience.","source":"Semantic Scholar","year":2022,"language":"en","subjects":["Medicine"],"doi":"10.3390/electronics11040578","url":"https://www.semanticscholar.org/paper/f16a5862d8ef9d725d1d51b893dbd6e269d642b2","pdf_url":"https://www.mdpi.com/2079-9292/11/4/578/pdf?version=1645018159","is_open_access":true,"citations":11,"published_at":"","score":66.33},{"id":"ss_ca739ecfa29254f478edfca6367989883ab32cd9","title":"A Survey on Quaternion Algebra and Geometric Algebra Applications in Engineering and Computer Science 1995–2020","authors":[{"name":"E. Bayro-Corrochano"}],"abstract":"Geometric Algebra (GA) has proven to be an advanced language for mathematics, physics, computer science, and engineering. This review presents a comprehensive study of works on Quaternion Algebra and GA applications in computer science and engineering from 1995 to 2020. After a brief introduction of GA, the applications of GA are reviewed across many fields. We discuss the characteristics of the applications of GA to various problems of computer science and engineering. In addition, the challenges and prospects of various applications proposed by many researchers are analyzed. We analyze the developments using GA in image processing, computer vision, neurocomputing, quantum computing, robot modeling, control, and tracking, as well as improvement of computer hardware performance. 
We believe that up to now GA has proven to be a powerful geometric language for a variety of applications. Furthermore, there is evidence that this is the appropriate geometric language to tackle a variety of existing problems and that consequently, step-by-step GA-based algorithms should continue to be further developed. We also believe that this extensive review will guide and encourage researchers to continue the advancement of geometric computing for intelligent machines.","source":"Semantic Scholar","year":2021,"language":"en","subjects":["Computer Science"],"doi":"10.1109/ACCESS.2021.3097756","url":"https://www.semanticscholar.org/paper/ca739ecfa29254f478edfca6367989883ab32cd9","pdf_url":"https://ieeexplore.ieee.org/ielx7/6287639/9312710/09488174.pdf","is_open_access":true,"citations":37,"published_at":"","score":66.11},{"id":"doaj_10.19678/j.issn.1000-3428.0060237","title":"Group-Strategy-Proof Virtual Traffic Light under V2V Environment","authors":[{"name":"SONG Wei, ZHAO Huifen, CAI Wenqin, ZHOU Wanqiang"}],"abstract":"The Virtual Traffic Light (VTL) in a Vehicle-to-Vehicle (V2V) environment can negotiate the right-of-way allocation through the information directly exchanged between vehicles. When the equipment obtains relevant information, the vehicle can strategically provide information to obtain the priority right of way. To apply to a scene where unmeasurable factors affect the right of way, a virtual traffic light with group strategy protection characteristics is proposed. By abstracting the real information provided by vehicles into a cost allocation and cooperative game, a group strategy protection auction mechanism is designed, and the Shapley value is used to calculate the cost allocation of each vehicle as the payment of vehicles. On this basis, the green light signal is established according to the real evaluation value in the auction results, and the green light signal generated by multiple auctions is integrated through the signal merging 
algorithm to produce a reasonable right-of-way allocation. The experimental results show that the virtual traffic light has the characteristics of group strategy protection, which can prevent vehicles from forming an alliance of false information to obtain benefits and can also prevent vehicles from obtaining the right-of-way priority through false information. Compared with the virtual traffic light with a fixed threshold of the number of green lights, the virtual traffic lights protected by the group strategy show some improvement in the overall average driving time and the average driving time of high-value vehicles.","source":"DOAJ","year":2022,"language":"","subjects":["Computer engineering. Computer hardware","Computer software"],"doi":"10.19678/j.issn.1000-3428.0060237","url":"https://www.ecice06.com/fileup/1000-3428/PDF/20220437.pdf","pdf_url":"https://www.ecice06.com/fileup/1000-3428/PDF/20220437.pdf","is_open_access":true,"published_at":"","score":66},{"id":"doaj_10.1186/s13677-022-00286-6","title":"Agent-based multi-tier SLA negotiation for intercloud","authors":[{"name":"Lin Li"},{"name":"Li Liu"},{"name":"Shalin Huang"},{"name":"Shibiao Lv"},{"name":"Kaibiao Lin"},{"name":"Shunzhi Zhu"}],"abstract":"Abstract The evolving intercloud enables idle resources to be traded among cloud providers to facilitate utilization optimization and to improve the cost-effectiveness of the service for cloud consumers. However, several challenges are raised for this multi-tier dynamic market, in which cloud providers not only compete for consumer requests but also cooperate with each other. To establish a healthier and more efficient intercloud ecosystem, in this paper a multi-tier agent-based fuzzy constraint-directed negotiation (AFCN) model for a fully distributed negotiation environment without a broker to coordinate the negotiation process is proposed. 
The novelty of AFCN is the use of a fuzzy membership function to represent imprecise preferences of the agent, which not only reveals the opponent’s behavior preference but can also specify the possibilities prescribing the extent to which the feasible solutions are suitable for the agent’s behavior. Moreover, this information can guide each tier of negotiation to generate a more favorable proposal. Thus, the multi-tier AFCN can improve the negotiation performance and the integrated solution capacity in the intercloud. The experimental results demonstrate that the proposed multi-tier AFCN model outperforms other agent negotiation models and demonstrates the efficiency and scalability of the intercloud in terms of the level of satisfaction, the ratio of successful negotiation, the average revenue of the cloud provider, and the buying price of the unit cloud resource.","source":"DOAJ","year":2022,"language":"","subjects":["Computer engineering. Computer hardware","Electronic computers. Computer science"],"doi":"10.1186/s13677-022-00286-6","url":"https://doi.org/10.1186/s13677-022-00286-6","is_open_access":true,"published_at":"","score":66},{"id":"ss_5e71ba88770f81508e6f2c3f3cf783aaadc82dc6","title":"A virtual reality experiment system for an introductory computer hardware course","authors":[{"name":"Xinrun Chen"},{"name":"Hengxin Chen"},{"name":"Songtao Guo"},{"name":"Jie Li"},{"name":"Jie Zhang"},{"name":"Zihao Li"}],"abstract":"The University Computer Foundation (UCF) course, which is a compulsory theoretical course for freshmen, especially science and engineering department students, lays the foundation for more advanced courses. However, the UCF course covers a wide variety of concepts, including the computer hardware, which are difficult for freshmen to comprehend. In recent years, virtual reality (VR) has become prevalent in many educational settings. 
To improve the teaching of the UCF course, we developed a VR program about introductory computer hardware and invited freshmen from two universities in China to take part in the experiment as participants. Four aspects of computer contents, namely the evolutionary history of computers, the computer components, the computer assembly, and the workflow of computer hardware, are displayed in the program. During the experiment, students' behavioral data were recorded. Then, the data were analyzed and the reasons were speculated for the differences among groups of various categories. From the experiment, the relationships between the students' behavior and their groups were found, and we demonstrated that our VR program is effective at attracting the students' curiosity and increasing their understanding of computer hardware in the UCF course. The study concludes with suggestions for practitioners and researchers in the field of VR for university education.","source":"Semantic Scholar","year":2021,"language":"en","subjects":["Computer Science"],"doi":"10.1002/cae.22418","url":"https://www.semanticscholar.org/paper/5e71ba88770f81508e6f2c3f3cf783aaadc82dc6","is_open_access":true,"citations":8,"published_at":"","score":65.24000000000001},{"id":"ss_3e698bd33e438570a171de6b78e01506a1245110","title":"Rigorous engineering for hardware security: Formal modelling and proof in the CHERI design and implementation process","authors":[{"name":"Kyndylan Nienhuis"},{"name":"Alexandre Joannou"},{"name":"Thomas Bauereiss"},{"name":"A. Fox"},{"name":"M. Roe"},{"name":"B. Campbell"},{"name":"Matthew Naylor"},{"name":"Robert M. Norton"},{"name":"S. Moore"},{"name":"P. Neumann"},{"name":"I. Stark"},{"name":"R. Watson"},{"name":"Peter Sewell"}],"abstract":"The root causes of many security vulnerabilities include a pernicious combination of two problems, often regarded as inescapable aspects of computing. 
First, the protection mechanisms provided by the mainstream processor architecture and C/C++ language abstractions, dating back to the 1970s and before, provide only coarse-grain virtual-memory-based protection. Second, mainstream system engineering relies almost exclusively on test-and-debug methods, with (at best) prose specifications. These methods have historically sufficed commercially for much of the computer industry, but they fail to prevent large numbers of exploitable bugs, and the security problems that this causes are becoming ever more acute. In this paper we show how more rigorous engineering methods can be applied to the development of a new security-enhanced processor architecture, with its accompanying hardware implementation and software stack. We use formal models of the complete instruction-set architecture (ISA) at the heart of the design and engineering process, both in lightweight ways that support and improve normal engineering practice - as documentation, in emulators used as a test oracle for hardware and for running software, and for test generation - and for formal verification. We formalise key intended security properties of the design, and establish that these hold with mechanised proof. This is for the same complete ISA models (complete enough to boot operating systems), without idealisation. We do this for CHERI, an architecture with hardware capabilities that supports fine-grained memory protection and scalable secure compartmentalisation, while offering a smooth adoption path for existing software. CHERI is a maturing research architecture, developed since 2010, with work now underway on an Arm industrial prototype to explore its possible adoption in mass-market commercial processors. 
The rigorous engineering work described here has been an integral part of its development to date, enabling more rapid and confident experimentation, and boosting confidence in the design.","source":"Semantic Scholar","year":2020,"language":"en","subjects":["Computer Science"],"doi":"10.1109/SP40000.2020.00055","url":"https://www.semanticscholar.org/paper/3e698bd33e438570a171de6b78e01506a1245110","pdf_url":"https://ieeexplore.ieee.org/ielx7/9144328/9152199/09152777.pdf","is_open_access":true,"citations":41,"published_at":"","score":65.22999999999999},{"id":"doaj_10.1186/s42400-020-00067-1","title":"Efficient functional encryption for inner product with simulation-based security","authors":[{"name":"Wenbo Liu"},{"name":"Qiong Huang"},{"name":"Xinjian Chen"},{"name":"Hongbo Li"}],"abstract":"Abstract Functional encryption (FE) is a novel paradigm for encryption scheme which allows tremendous flexibility in accessing encrypted information. In FE, a user can learn specific function of encrypted messages by restricted functional key and reveal nothing else about the messages. Inner product encryption (IPE) is a special type of functional encryption where the decryption algorithm, given a ciphertext related to a vector x and a secret key related to a vector y, computes the inner product x·y. In this paper, we construct an efficient private-key functional encryption (FE) for inner product with simulation-based security, which is much stronger than indistinguishability-based security, under the External Decisional Linear assumption in the standard model. Compared with the existing schemes, our construction is faster in encryption and decryption, and the master secret key, secret keys and ciphertexts are shorter.","source":"DOAJ","year":2021,"language":"","subjects":["Computer engineering. Computer hardware","Electronic computers. 
Computer science"],"doi":"10.1186/s42400-020-00067-1","url":"https://doi.org/10.1186/s42400-020-00067-1","is_open_access":true,"published_at":"","score":65},{"id":"doaj_10.1186/s40537-021-00418-w","title":"CaReAl: capturing read alignments in a BAM file rapidly and conveniently","authors":[{"name":"Yoomi Park"},{"name":"Heewon Seo"},{"name":"Kyunghun Yoo"},{"name":"Ju Han Kim"}],"abstract":"Abstract Some of the variants detected by high-throughput sequencing (HTS) are often not reproducible. To minimize the technical-induced artifacts, secondary experimental validation is required but this step is unnecessarily slow and expensive. Thus, developing a rapid and easy to use visualization tool is necessary to systematically review the statuses of sequence read alignments. Here, we developed a high-performance alignment capturing tool, CaReAl, for visualizing the read-alignment status of nucleotide sequences and associated genome features. CaReAl is optimized for the systematic exploration of regions of interest by visualizing full-depth read-alignment statuses in a set of PNG files. CaReAl was 7.5 times faster than IGV ‘snapshot’, the only stand-alone tool which provides an automated snapshot of sequence reads. This rapid user-programmable capturing tool is useful for obtaining read-level data for evaluating variant calls and detecting technical biases. The multithreading and sequential wide-genome-range-capturing functionalities of CaReAl aid the efficient manual review and evaluation of genome sequence alignments and variant calls. CaReAl is a rapid and convenient tool for capturing aligned reads in BAM. CaReAl facilitates the acquisition of highly curated data for obtaining reliable analytic results.","source":"DOAJ","year":2021,"language":"","subjects":["Computer engineering. Computer hardware","Information technology","Electronic computers. 
Computer science"],"doi":"10.1186/s40537-021-00418-w","url":"https://doi.org/10.1186/s40537-021-00418-w","is_open_access":true,"published_at":"","score":65},{"id":"doaj_10.1002/aisy.202100012","title":"Photoresponsive Biomimetic Soft Robots Enabled by Near‐Infrared‐Driven and Ultrarobust Sandwich‐Structured Nanocomposite Films","authors":[{"name":"Yi Yu"},{"name":"Ran Peng"},{"name":"Zihe Chen"},{"name":"Li Yu"},{"name":"Jinhua Li"},{"name":"Jianying Wang"},{"name":"Xinyu Liu"},{"name":"Qian Wang"},{"name":"Xianbao Wang"}],"abstract":"Soft robots, intelligent structures built up of smart soft materials, are capable of being programmed to perform delicate work. Recently, plenty of biomimetic soft robots with functionalities of grasping, sensing, searching, and transporting have been exploited by emulating activities of living creatures adapting to ecological environments. However, mass production of biomimetic soft robots has remained a grand challenge while maintaining stable pre‐engineered functionalities under distinct circumstances, which significantly constrains their practical applications. To this end, a facile and scalable approach that can be utilized for mass‐producing sandwich‐structured photoresponsive polyimide (PI)/Au/low‐density polyethylene (LDPE) nanocomposite films is reported. Attributed to the remote and precise‐driven mode, reversible and stable actuation behavior, and the ultrarobust mechanical properties of the sandwich‐structured PI/Au/LDPE nanocomposite films, it was possible to devise a variety of photoresponsive biomimetic soft robots such as artificial flytrap, directionally moveable caterpillar‐inspired walker, and dolphin‐like cruisable and loadable swimmer via simply tailoring them into predesigned geometries.","source":"DOAJ","year":2021,"language":"","subjects":["Computer engineering. Computer hardware","Control engineering systems. 
Automatic machinery (General)"],"doi":"10.1002/aisy.202100012","url":"https://doi.org/10.1002/aisy.202100012","is_open_access":true,"published_at":"","score":65},{"id":"ss_8b5ded5ecf459a5caf11aae32d7b828c40e61ef2","title":"Neuromorphic Engineering for Hardware Computational Acceleration and Biomimetic Perception Motion Integration","authors":[{"name":"Shuiyuan Wang"},{"name":"Xiaozhang Chen"},{"name":"Xiaohe Huang"},{"name":"David-Wei Zhang"},{"name":"P. Zhou"}],"abstract":"In the tide of artificial intelligence evolution, the demand for data computing has exploded, and the von Neumann architecture computer with separate memory and computing units require cumbersome data interaction, which leads to serious degradation in performance and efficiency. Biologically inspired neuromorphic engineering performs digital/analog computations in memory, with massive parallelism and high energy efficiency, making it a promising candidate to get out of the woods. Memristive device‐based artificial synapses and neurons are building blocks to form hardware neural networks for computing acceleration. In addition, it enables the implementation of integrated bionic perception and motion systems to mimic the human peripheral nervous system for information sensing and processing. Herein, the biological basis and inspiration are described first, and the memristive synapses and circuit‐emulation neurons used for neuromorphic engineering are addressed and evaluated as well as the mechanisms. The computational acceleration and bionic perception motion integration of neuromorphic systems are discussed. 
Finally, the challenges and opportunities for neuromorphic engineering to accelerate computation and enrich biomimetic perception motion functions are prospected, and it is hoped that light is shed on future advances.","source":"Semantic Scholar","year":2020,"language":"en","subjects":["Computer Science"],"doi":"10.1002/aisy.202000124","url":"https://www.semanticscholar.org/paper/8b5ded5ecf459a5caf11aae32d7b828c40e61ef2","pdf_url":"https://onlinelibrary.wiley.com/doi/pdfdirect/10.1002/aisy.202000124","is_open_access":true,"citations":27,"published_at":"","score":64.81}],"total":8495952,"page":1,"page_size":20,"sources":["CrossRef","DOAJ","Semantic Scholar"],"query":"Computer engineering. Computer hardware"}