Matthew Jouny, Wesley Luc, F. Jiao
Results for "Technology (General)"
Showing 20 of ~22,207,491 results · from arXiv, DOAJ, Semantic Scholar, CrossRef
A. Granić, Nikola Marangunic
A respectable body of work dealing with the Technology Acceptance Model (TAM) clearly indicates the popularity of TAM in the field of technology acceptance in general. Nevertheless, there is still a gap in existing knowledge regarding the representative academic literature that underlies research on TAM in the educational context. The main objective of this systematic literature review is to provide an overview of the current state of research on TAM application in the field of learning and teaching, across a variety of learning domains, learning technologies and types of users. Through a systematic search using the EBSCO Discovery Service, the review identified 71 relevant studies published between 2003 and 2018. The main findings indicate that TAM and its many different versions represent a credible model for facilitating assessment of diverse learning technologies. TAM's core variables, perceived ease of use and perceived usefulness, have been shown to be antecedent factors affecting acceptance of learning with technology. The paper identifies some gaps in current work and suggests areas for further investigation. The results of this systematic review provide a better understanding of TAM acceptance studies in the educational context and create a firm foundation for advancing knowledge in the field.

Practitioner Notes. What is already known about this topic: Technology acceptance research in the teaching and learning context has become an attractive trend. A number of reviews and meta-analyses focused on specific topics related to technology acceptance in education have been conducted. The Technology Acceptance Model (TAM) is the key model for understanding predictors of human behaviour towards potential acceptance or rejection of a technology. What this paper adds: The current state of research on TAM application in the educational context lacks comprehensive reviews addressing the variety of learning domains, learning technologies and types of users. The paper presents a systematic review of relevant academic literature on TAM in the field of learning and teaching, and provides empirical evidence on the predictive validity of the TAM-based models presented in the selected literature. The findings reveal that TAM, along with its many different versions known as TAM++, is a leading scientific paradigm and a credible model for facilitating assessment of diverse technological deployments in the educational context. Implications for practice and/or policy: The systematic review adds to the body of knowledge and creates a firm foundation for advancing knowledge in the field. By following the most common research objectives and/or by filling current gaps in applied research methods, chosen sample groups and types of result analysis, researchers can design their own studies. Future research may well focus on identifying additional external factors that could further explain acceptance and usage of various learning technologies.
June-Goo Lee, Sanghoon Jun, Younghoon Cho et al.
The artificial neural network (ANN), a machine learning technique inspired by the human neuronal synapse system, was introduced in the 1950s. However, the ANN was previously limited in its ability to solve actual problems, due to the vanishing gradient and overfitting problems with training of deep architectures, a lack of computing power, and primarily the absence of sufficient data to train the computer system. Interest in this concept has lately resurfaced, due to the availability of big data, enhanced computing power with current graphics processing units, and novel algorithms to train deep neural networks. Recent studies on this technology suggest its potential to perform better than humans in some visual and auditory recognition tasks, which may portend its applications in medicine and healthcare, especially in medical imaging, in the foreseeable future. This review article offers perspectives on the history, development, and applications of deep learning technology, particularly regarding its applications in medical imaging.
Brandon Amos, Bartosz Ludwiczuk, M. Satyanarayanan
Eleni Adamopoulou, Lefteris Moussiades
This literature review presents the history, technology, and applications of natural dialog systems, or simply chatbots. It aims to organize critical information that is a necessary background for further research activity in the field of chatbots. More specifically, while tracing the historical evolution from the generative idea to the present day, we point out possible weaknesses of each stage. After presenting a complete categorization system, we analyze the two essential implementation technologies, namely the pattern matching approach and machine learning. Moreover, we compose a general architectural design that gathers critical details, and we highlight crucial issues to take into account before system design. Furthermore, we present chatbot applications and industrial use cases while pointing out the risks of using chatbots and suggesting ways to mitigate them. Finally, we conclude by stating our view regarding the direction of technology so that chatbots will become really smart.
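Of the two implementation technologies named in this abstract, pattern matching is the simplest to illustrate. Below is a minimal sketch of a rule-based chatbot in the spirit of early systems such as ELIZA; the rules and responses are invented for illustration and are not taken from the review.

```python
import re

# A minimal rule-based (pattern-matching) chatbot: each rule pairs a
# regular expression with a response template. Patterns and responses
# here are illustrative examples only.
RULES = [
    (re.compile(r"\bhello\b|\bhi\b", re.I), "Hello! How can I help you?"),
    (re.compile(r"\bi need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bmy name is (\w+)", re.I), "Nice to meet you, {0}."),
]

def reply(message: str) -> str:
    """Return the response of the first matching rule, else a fallback."""
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(*match.groups())
    return "I'm not sure I understand. Could you rephrase that?"

if __name__ == "__main__":
    print(reply("Hi there"))          # Hello! How can I help you?
    print(reply("I need a holiday"))  # Why do you need a holiday?
```

The weakness the review points out for this stage follows directly from the sketch: any input not anticipated by a rule falls through to the fallback, which is what motivated the shift to machine learning approaches.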
Marc G. Bellemare, Yavar Naddaf, J. Veness et al.
In this article we introduce the Arcade Learning Environment (ALE): both a challenge problem and a platform and methodology for evaluating the development of general, domain-independent AI technology. ALE provides an interface to hundreds of Atari 2600 game environments, each one different, interesting, and designed to be a challenge for human players. ALE presents significant research challenges for reinforcement learning, model learning, model-based planning, imitation learning, transfer learning, and intrinsic motivation. Most importantly, it provides a rigorous testbed for evaluating and comparing approaches to these problems. We illustrate the promise of ALE by developing and benchmarking domain-independent agents designed using well-established AI techniques for both reinforcement learning and planning. In doing so, we also propose an evaluation methodology made possible by ALE, reporting empirical results on over 55 different games. All of the software, including the benchmark agents, is publicly available.
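As a flavor of what interacting with ALE looks like, here is a minimal random-agent episode loop. It assumes the modern ale-py packaging of ALE (the 2013 release shipped its own C++/Python bindings), and the ROM path is a placeholder for a legally obtained game file.

```python
import random
from ale_py import ALEInterface  # assumption: the modern ale-py packaging of ALE

ale = ALEInterface()
ale.setInt("random_seed", 123)
ale.loadROM("roms/breakout.bin")   # placeholder path to a legally obtained ROM

actions = ale.getLegalActionSet()  # the full Atari 2600 action set
total_reward = 0.0

# One episode under a uniformly random policy: the simplest baseline
# agent for the kind of cross-game evaluation the article describes.
while not ale.game_over():
    total_reward += ale.act(random.choice(actions))

print(f"episode return: {total_reward}")
ale.reset_game()
```

Because the same loop runs unchanged against any of the hundreds of supported games, agents cannot be tuned to a single environment, which is the domain-independence property the benchmark is built around.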
K. Tamilmani, Nripendra P. Rana, S. Wamba et al.
The extended unified theory of acceptance and use of technology (UTAUT2) is less than ten years old and has already garnered more than 6,000 citations, with extensive usage in information systems and beyond. This research employed a cited-reference search to systematically review studies that cited the originating UTAUT2 article. Based on UTAUT2 usage, the downloaded articles were classified into four categories: 1) general citation, 2) UTAUT2 application, 3) UTAUT2 integration, and 4) UTAUT2 extension. Weber's (2012) theory evaluation framework revealed UTAUT2 as a robust theory on most dimensions, except for parsimony arising from the complexity of the model. UTAUT2 extension emerged as the most popular utilization category, as researchers extended the model with context-specific variables. Finally, UTAUT2 extensions were mapped to Johns' (2006) context dimensions to identify various limitations of existing technology adoption research and to provide a multi-level framework, with libraries of context dimensions, for future researchers.
A. Stirling
Robert M. Bernard, Eugene F. Borokhovski, Richard F. Schmid et al.
V. Venkatesh
Much previous research has established that perceived ease of use is an important factor influencing user acceptance and usage behavior of information technologies. However, very little research has been conducted to understand how that perception forms and changes over time. The current work presents and tests an anchoring and adjustment-based theoretical model of the determinants of system-specific perceived ease of use. The model proposes control (internal and external, conceptualized as computer self-efficacy and facilitating conditions, respectively), intrinsic motivation (conceptualized as computer playfulness), and emotion (conceptualized as computer anxiety) as anchors that determine early perceptions about the ease of use of a new system. With increasing experience, it is expected that system-specific perceived ease of use, while still anchored to general beliefs regarding computers and computer use, will adjust to reflect objective usability, perceptions of external control specific to the new system environment, and system-specific perceived enjoyment. The proposed model was tested in three different organizations among 246 employees, using three measurements taken over a three-month period. The proposed model was strongly supported at all points of measurement, and explained up to 60% of the variance in system-specific perceived ease of use, twice as much as existing models explain. Important theoretical and practical implications of these findings are discussed.
K. Al-Saedi, M. Al-Emran, Thurasamy Ramayah et al.
To determine the most frequent factors extending the Unified Theory of Acceptance and Use of Technology (UTAUT) in the context of mobile payment (M-payment) adoption, a quantitative meta-analysis of 25 studies was undertaken. The results indicated that perceived risk, perceived trust, perceived cost, and self-efficacy were the most frequent factors that achieved significant results in the surveyed studies. Accordingly, this study extends the UTAUT model with these factors, proposing a general extended UTAUT model for M-payment adoption. The proposed model is validated using the partial least squares-structural equation modeling (PLS-SEM) approach. The data were collected from a total of 436 M-payment users in Oman. The results indicated that the best predictor of M-payment users' intention to use the M-payment system is performance expectancy, followed by social influence, effort expectancy, perceived trust, perceived cost, and self-efficacy, respectively. Nonetheless, perceived risk was found to have an insignificant negative impact on the behavioral intention to use M-payment systems. The conclusions derived from this study enhance the understanding of the factors determining the adoption of M-payment systems in Oman.
Xuan Guo, Jianlong Wang
Adsorption technology has been widely applied for water and wastewater treatment due to its low cost and ease of operation. Understanding adsorption kinetics helps in designing the adsorption system. The pseudo-first-order (PFO) and pseudo-second-order (PSO) kinetic models are commonly used; however, their specific theoretical meanings and conditions of application remain unclear. In addition, for a practical adsorption system the kinetic process is complex and may include both PFO and PSO behaviour. It is therefore necessary to develop a general kinetic model that describes the whole adsorption process more accurately. In this paper, based on Langmuir kinetics and a theoretical analysis of the PFO and PSO models, a general form of adsorption kinetic model was developed. Experimental data from our previous studies and the literature were used to fit this mixed-order (MO) model. The results showed that it is capable of describing both PFO and PSO kinetics and is suitable for the whole adsorption process. The MO kinetic model can be solved numerically with a fourth/fifth-order Runge-Kutta method; the solving program, together with an illustration of the method, is provided in the appendices for readers interested in this model.
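A mixed-order rate law combining the PFO and PSO terms can be integrated exactly as the abstract describes, with a fourth/fifth-order Runge-Kutta scheme. The sketch below uses SciPy's RK45 solver; the parameter values are illustrative, not the paper's fitted values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (not the paper's fitted values):
q_e = 50.0   # equilibrium adsorption capacity, mg/g
k1 = 0.05    # pseudo-first-order rate constant, 1/min
k2 = 0.001   # pseudo-second-order rate constant, g/(mg*min)

def mixed_order(t, q):
    """Mixed-order rate law: dq/dt = k1*(qe - q) + k2*(qe - q)**2,
    i.e. the PFO and PSO driving terms combined."""
    driving = q_e - q[0]
    return [k1 * driving + k2 * driving**2]

# RK45 is a 4th/5th-order Runge-Kutta pair, matching the solution
# strategy the abstract describes.
sol = solve_ivp(mixed_order, t_span=(0.0, 120.0), y0=[0.0],
                method="RK45", dense_output=True)

for t in (10, 30, 60, 120):
    print(f"t = {t:4d} min, q = {sol.sol(t)[0]:6.2f} mg/g")
```

Setting k2 to zero recovers pure PFO behaviour and setting k1 to zero recovers pure PSO, which is why the mixed form can span both limiting cases within one fit.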
Jeremiah Mugambi Ananga, Tobia Mwalili, Samson Nyang’au Paul
The general objective of the study was to examine the role of technology processes in the business performance of commercial banks in Kenya. The research was guided by the positivism philosophy and adopted a correlational research design. The target population was the commercial banks in Kenya registered with the Central Bank of Kenya, comprising all 42 commercial banks. The respondent population comprised five top managers from each bank, translating to 210 top managers. Slovin's formula was used to calculate the sample size, and a purposive sampling technique was used to select 138 top managers from the 42 commercial banks. The study used a self-administered questionnaire with closed- and open-ended items to obtain primary data, and a pilot study was conducted to test the validity and reliability of the data collection instrument. Quantitative data were analyzed by calculating the response rate and descriptive statistics such as the mean, standard deviation, median and proportions using the Statistical Package for the Social Sciences (SPSS) version 24. Regression and correlation analyses were used for inferential analysis to determine the direction and strength of the relationship between the independent and dependent variables. To test the influence of information technology capability on the business performance of commercial banks in Kenya, the study employed hierarchical regression analysis with moderation. The results were presented in tables and figures. The study concludes that technology processes have a positive and significant effect on the business performance of commercial banks in Kenya: idea generation, technology acquisition and technology implementation all influence business performance, implying that improvement in these information technology processes would lead to improved business performance. Based on the findings, the study recommends that the management of commercial banks in Kenya ensure they have an effective plan in place for idea generation, technology acquisition and technology implementation.
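The hierarchical regression with moderation reported above can be sketched as follows. The study itself used SPSS 24; the variable names and synthetic data here are hypothetical stand-ins. Moderation shows up as a significant interaction coefficient and an R-squared increase from step 1 to step 2.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data; variable names are hypothetical.
rng = np.random.default_rng(0)
n = 138  # sample size reported in the abstract
df = pd.DataFrame({
    "tech_process": rng.normal(0, 1, n),   # idea generation, acquisition, implementation
    "it_capability": rng.normal(0, 1, n),  # hypothesised moderator
})
df["performance"] = (0.5 * df.tech_process + 0.3 * df.it_capability
                     + 0.2 * df.tech_process * df.it_capability
                     + rng.normal(0, 1, n))

# Step 1: main effects only.
step1 = smf.ols("performance ~ tech_process + it_capability", df).fit()
# Step 2: add the interaction term; a significant coefficient on
# tech_process:it_capability together with a rise in R-squared
# indicates moderation.
step2 = smf.ols("performance ~ tech_process * it_capability", df).fit()

print(f"R2 step 1: {step1.rsquared:.3f}, R2 step 2: {step2.rsquared:.3f}")
print(step2.params)
```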
Pengcheng Cao, Kai Wang, Ting Zhang et al.
Accurate extrapolation of multiphase flow behaviour in offshore pipelines is hindered by limited field data, simulator bias, and strong nonlinearities. A multi-fidelity surrogate approach with stacking ensemble is proposed to address these challenges, in which a field-trained high-fidelity expert and a simulation-trained expert are adaptively fused through a k-nearest-neighbours (k-NN) competence metric and a Lipschitz-continuous convex combiner. This design ensures mean-squared-error dominance, such that the fused predictor never underperforms the better expert and variance is suppressed in transitional regimes. Data efficiency is further enhanced by a hybrid active learning strategy (ZECR Sampling) that integrates geometric coverage with uncertainty-driven refinement. When applied to a real offshore pipeline dataset containing more than 5,700 samples, the proposed method achieves an R2 of 0.740 and reduces RMSE by over 20% compared with the best baseline. These results indicate that the framework functions not only as a fast surrogate but also as a spatially aware risk controller, enabling reliable extrapolative prediction and supporting automated, real-time decision-making in multiphase flow pipeline systems.
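A rough sketch of the fusion idea follows: two experts trained on different fidelities are combined through a convex weight derived from a k-NN estimate of each expert's local error. The model choices, k, and the weighting rule are assumptions for illustration; the paper's actual design additionally enforces a Lipschitz-continuous combiner and uses the ZECR active learning strategy, neither of which is reproduced here.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)

def truth(X):
    """Hidden ground-truth response used to fabricate the demo data."""
    return np.sin(3 * X[:, 0]) + X[:, 1] * X[:, 2]

# Scarce, accurate "field" data vs. abundant, biased "simulation" data.
X_hi = rng.uniform(0, 1, (200, 3));  y_hi = truth(X_hi) + rng.normal(0, 0.05, 200)
X_lo = rng.uniform(0, 1, (2000, 3)); y_lo = truth(X_lo) + 0.2 * X_lo[:, 0]

expert_hi = GradientBoostingRegressor().fit(X_hi, y_hi)
expert_lo = GradientBoostingRegressor().fit(X_lo, y_lo)

# Local competence: each expert's absolute error on held-out points,
# averaged over the k nearest neighbours of a query.
X_val = rng.uniform(0, 1, (300, 3)); y_val = truth(X_val)
err_hi = np.abs(expert_hi.predict(X_val) - y_val)
err_lo = np.abs(expert_lo.predict(X_val) - y_val)
knn = NearestNeighbors(n_neighbors=15).fit(X_val)

def fused_predict(X):
    """Convex combination weighted by local k-NN competence."""
    _, idx = knn.kneighbors(X)
    e_hi = err_hi[idx].mean(axis=1)
    e_lo = err_lo[idx].mean(axis=1)
    w = e_lo / (e_hi + e_lo + 1e-12)  # weight on the high-fidelity expert, in [0, 1]
    return w * expert_hi.predict(X) + (1 - w) * expert_lo.predict(X)

X_test = rng.uniform(0, 1, (100, 3))
rmse = np.sqrt(np.mean((fused_predict(X_test) - truth(X_test)) ** 2))
print(f"fused RMSE: {rmse:.3f}")
```

Because the weight is bounded in [0, 1] and shifts toward whichever expert errs less in a query's neighbourhood, the fused predictor tends not to underperform the locally better expert, which is the intuition behind the mean-squared-error dominance claim.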
Rebecca Eynon, Nabeel Gillani
As belief around the potential of computational social science grows, fuelled by recent advances in machine learning, data scientists are ostensibly becoming the new experts in education. Scholars engaged in critical studies of education and technology have sought to interrogate the growing datafication of education yet tend not to use computational methods as part of this response. In this paper, we discuss the feasibility and desirability of the use of computational approaches as part of a critical research agenda. Presenting and reflecting upon two examples of projects that use computational methods in education to explore questions of equity and justice, we suggest that such approaches might help expand the capacity of critical researchers to highlight existing inequalities, make visible possible approaches for beginning to address such inequalities, and engage marginalised communities in designing and ultimately deploying these possibilities. Drawing upon work within the fields of Critical Data Studies and Science and Technology Studies, we further reflect on the two cases to discuss the possibilities and challenges of reimagining computational methods for critical research in education and technology, focusing on six areas of consideration: criticality, philosophy, inclusivity, context, classification, and responsibility.
Isabel Pedersen, Ann Hill Duin
This short paper provides a means to classify augmentation technologies to reconceptualize them as sociotechnical, discursive and rhetorical phenomena, rather than only through technological classifications. It identifies a set of value systems that constitute augmentation technologies within discourses, namely, the intent to enhance, automate, and build efficiency. This short paper makes a contribution to digital literacy surrounding augmentation technology emergence, as well as the more specific area of AI literacy, which can help identify unintended consequences implied at the design stages of these technologies.
Mengshang Liang, Changxin Xu, Mingxian Li et al.
With the deceleration of China's economic growth, the crude growth model will progressively lose its competitive edge, posing challenges for state-level economic and technological development zones (ETDZs) in transitioning their development model and grappling with low levels of total factor productivity (TFP). This study aims to evaluate the TFP of prominent cities in China, examine the influence of the establishment of state-level ETDZs on urban TFP, and investigate the moderating effect of transportation infrastructure on this relationship. The results show that the aggregate TFP of Chinese urban areas declined from 1999 to 2020, a trend linked to structural economic adjustments and persistent underutilization of capital in several regions. The establishment of state-level ETDZs exerts a notable positive influence on regional TFP, and transportation infrastructure moderates this effect; among the various modes of transportation, highways and railways are particularly prominent in this regard. These conclusions provide a theoretical basis and decision-making reference for further unleashing the policy potential of development zones in China.
Ehsan Latif, Gengchen Mai, Matthew Nyaaba et al.
Xinyue Wang, Weifan Lin, Weiting Zhang et al.
In this paper, the Merkle-Transformer model is introduced as an innovative approach designed for financial data processing, which combines the data integrity verification mechanism of Merkle trees with the data processing capabilities of the Transformer model. A series of experiments on key tasks, such as financial behavior detection and stock price prediction, were conducted to validate the effectiveness of the model. The results demonstrate that the Merkle-Transformer significantly outperforms existing deep learning models (such as RoBERTa and BERT) across performance metrics, including precision, recall, accuracy, and F1 score. In particular, in the task of stock price prediction, the performance is notable, with nearly all evaluation metrics scoring above 0.9. Moreover, the performance of the model across various hardware platforms, as well as the security performance of the proposed method, were investigated. The Merkle-Transformer exhibits exceptional performance and robust data security even in resource-constrained environments across diverse hardware configurations. This research offers a new perspective, underscoring the importance of considering data security in financial data processing and confirming the superiority of integrating data verification mechanisms in deep learning models for handling financial data. The core contribution of this work is the first proposition and empirical demonstration of a financial data analysis model that fuses data integrity verification with efficient data processing, providing a novel solution for the fintech domain. It is believed that the widespread adoption and application of the Merkle-Transformer model will greatly advance innovation in the financial industry and lay a solid foundation for future research on secure financial data processing.
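The Merkle-tree half of the combination can be illustrated on its own: a Merkle root computed over financial records changes if any record is tampered with, which is the integrity property the model relies on. This is a generic SHA-256 sketch, not the paper's implementation; the sample records are invented.

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(records: list[bytes]) -> bytes:
    """Compute the Merkle root of a list of records.

    Leaves are hashed individually; each internal node hashes the
    concatenation of its two children. An odd node is paired with itself.
    """
    level = [_h(r) for r in records]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last node if the level is odd
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Hypothetical market-data records for illustration.
ticks = [b"AAPL,2024-01-02,185.64", b"AAPL,2024-01-03,184.25"]
root = merkle_root(ticks)

# Any single-byte change in a record yields a different root, which is
# how the integrity of the model's input data can be verified.
assert merkle_root([ticks[0], b"AAPL,2024-01-03,184.26"]) != root
print(root.hex())
```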
J. de Curtò, I. de Zarzà, Juan-Carlos Cano et al.
This paper presents a novel approach to enhancing the security and reliability of drone communications through the integration of Quantum Random Number Generators (QRNG) in Frequency Hopping Spread Spectrum (FHSS) systems. We propose a multi-drone framework that leverages QRNG technology to generate truly random frequency hopping sequences, significantly improving resistance against jamming and interception attempts. Our method introduces a concurrent access protocol for multiple drones to share a QRNG device efficiently, incorporating robust error handling and a shared memory system for random number distribution. The implementation includes secure communication protocols, ensuring data integrity and confidentiality through encryption and Hash-based Message Authentication Code (HMAC) verification. We demonstrate the system's effectiveness through comprehensive simulations and statistical analyses, including spectral density, frequency distribution, and autocorrelation studies of the generated frequency sequences. The results show a significant enhancement in the unpredictability and uniformity of frequency distributions compared to traditional pseudo-random number generator-based approaches. Specifically, the frequency distributions of the drones exhibited a relatively uniform spread across the available spectrum, with minimal discernible patterns in the frequency sequences, indicating high unpredictability. Autocorrelation analyses revealed a sharp peak at zero lag and a linear decrease to zero for other lags, confirming a general absence of periodicity or predictability in the sequences, which enhances resistance to predictive attacks. Spectral analysis confirmed a relatively flat power spectral density across frequencies, characteristic of truly random sequences, thereby minimizing vulnerabilities to spectral-based jamming. Statistical tests, including Chi-squared and Kolmogorov-Smirnov, further confirm the unpredictability of the frequency sequences generated by QRNG, supporting enhanced security measures against predictive attacks. While some short-term correlations were observed, suggesting areas for improvement in QRNG technology, the overall findings confirm the potential of QRNG-based FHSS systems in significantly improving the security and reliability of drone communications. This work contributes to the growing field of quantum-enhanced wireless communications, offering substantial advancements in security and reliability for drone operations. The proposed system has potential applications in military, emergency response, and secure commercial drone operations, where enhanced communication security is paramount.
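Two of the ingredients described above, a hopping sequence drawn from a non-deterministic entropy source and HMAC-based frame verification, can be sketched generically. Python's `secrets` module stands in for the hardware QRNG here; the channel count, pre-shared key, and frame format are hypothetical.

```python
import hashlib
import hmac
import secrets

# `secrets` draws from the OS entropy source and stands in for the
# hardware QRNG; channel count, key, and frame format are hypothetical.
NUM_CHANNELS = 64
KEY = secrets.token_bytes(32)  # pre-shared between drone and ground station

def hop_sequence(length: int) -> list[int]:
    """Draw `length` channel indices from a non-deterministic source,
    so the hopping pattern cannot be reproduced from a seed."""
    return [secrets.randbelow(NUM_CHANNELS) for _ in range(length)]

def send_frame(payload: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so the receiver can check integrity."""
    tag = hmac.new(KEY, payload, hashlib.sha256).digest()
    return payload + tag

def verify_frame(frame: bytes) -> bytes | None:
    """Return the payload if the tag verifies, else None."""
    payload, tag = frame[:-32], frame[-32:]
    expected = hmac.new(KEY, payload, hashlib.sha256).digest()
    return payload if hmac.compare_digest(tag, expected) else None

seq = hop_sequence(16)  # the radio retunes along this channel sequence
frame = send_frame(b"telemetry:alt=120m")
assert verify_frame(frame) == b"telemetry:alt=120m"
print("hop sequence:", seq)
```

A pseudo-random generator in place of `secrets` would make the whole sequence predictable from its seed, which is precisely the attack surface the QRNG-based design is meant to remove.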
Page 2 of 1110375