The growth rate of scientific publication has been studied from 1907 to 2007 using available data from a number of literature databases, including the Science Citation Index (SCI) and the Social Sciences Citation Index (SSCI). Traditional scientific publishing, that is, publication in peer-reviewed journals, is still increasing, although there are large differences between fields. There are no indications that the growth rate has decreased in the last 50 years. At the same time, publication through new channels, for example conference proceedings, open archives and home pages, is growing fast. The growth rate for SCI up to 2007 is smaller than for comparable databases, which means that SCI covered a decreasing share of the traditional scientific literature. There are also clear indications that the coverage by SCI is especially low in some of the scientific areas with the highest growth rates, including computer science and the engineering sciences. The role of conference proceedings, open access archives and publications published on the net is increasing, especially in scientific fields with high growth rates, but this has only partially been reflected in the databases. These new publication channels challenge the use of the big databases for measuring scientific productivity or output and the growth rate of science. Because of the declining coverage and this challenge, it is problematic that SCI has been, and still is, used as the dominant source for science indicators based on publication and citation counts. The limited data available for the social sciences show that the growth rate in SSCI was remarkably low and indicate that the coverage by SSCI was declining over time. National Science Indicators from Thomson Reuters is based solely on SCI, SSCI and the Arts and Humanities Citation Index (AHCI); the declining coverage of the citation databases therefore calls the use of this source into question.
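Growth rates of the kind discussed above are typically modeled as exponential growth in annual publication counts. As an illustrative sketch (the function names and the sample counts below are hypothetical, not figures from the study), the annual growth rate and the corresponding doubling time can be estimated from counts at two points in time:

```python
import math

def annual_growth_rate(count1: float, year1: int, count2: float, year2: int) -> float:
    """Continuous annual growth rate r, assuming N(t) = N0 * exp(r * t)."""
    return math.log(count2 / count1) / (year2 - year1)

def doubling_time(rate: float) -> float:
    """Years needed for the literature to double at the given growth rate."""
    return math.log(2) / rate

# Hypothetical counts: a database indexing 100,000 papers in 1990
# and 200,000 in 2005 has doubled over 15 years.
r = annual_growth_rate(100_000, 1990, 200_000, 2005)
print(f"growth rate: {r:.3%} per year, doubling time: {doubling_time(r):.1f} years")
```

Comparing doubling times estimated this way across databases is one simple way to make the "SCI grows slower than comparable databases" claim concrete.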
Time is an exceptional dimension that is common to many application domains such as medicine, engineering, business, or science. Due to the distinct characteristics of time, appropriate visual and analytical methods are required to explore and analyze time-oriented data. This book starts with an introduction to visualization and historical examples of visual representations. At its core, the book presents and discusses a systematic view of the visualization of time-oriented data along three key questions: what is being visualized (data), why something is visualized (user tasks), and how it is presented (visual representation). To support visual exploration, interaction techniques and analytical methods are required; these are discussed in separate chapters. A large part of this book is devoted to a structured survey of 101 different visualization techniques as a reference for scientists conducting related research as well as for practitioners seeking information on how their time-oriented data can best be visualized.
Eva B. Aamodt, Martin Røvang, Mona K. Beyer
et al.
Background Measures of white matter hyperintensities (WMHs) represent a crucial part of post-stroke outcome prediction. Automatic WMH segmentation has proven particularly challenging in stroke cases. Using an improved method for WMH segmentation that incorporates stroke lesions, we set out to explore factors associated with higher WMH burden, as well as the association between WMH burden and post-stroke dependency, across two different countries that may demonstrate significant variation in radiological presentation. Methods A total of 384 acute ischemic stroke (AIS) survivors from the Norwegian Cognitive Impairment After Stroke (Nor-COAST; NO) study and the Houston Methodist Registry of Neurological Endpoint Assessments among Patients with Ischemic and Hemorrhagic Stroke (REINAH; US) database were analyzed. MRI and clinical data were collected upon acute care hospital admission. WMHs were measured automatically using the nnU-Net methodology, taking the acute stroke lesion into account. Results No significant difference in WMH percentage was found between sites. In NO, only age was associated with higher WMH burden, while in the US, very high age (≥ 85), smoking, and being underweight were key factors. The two sites showed significant differences in demographics and clinical characteristics: the US cohort exhibited greater racial heterogeneity, higher body mass index (BMI) with more extremely obese patients, higher National Institutes of Health Stroke Scale (NIHSS) scores, and more thrombectomies, whereas the NO cohort exhibited more tobacco use, more hypercholesterolemia, and longer hospital stays.
Post-stroke dependency was initially associated with higher WMH percentage overall, but after adjustment it remained significant only in Norwegians aged ≥ 85, while in the US, dependency was driven by stroke severity and treatment after adjustment. Conclusion Cohorts from the US and Norway exhibit no significant difference in WMH burden but differ in the factors associated with WMHs.
Machine learning systems deployed in real-world environments frequently encounter data imperfections such as noise, missing values, class imbalance, and distribution shifts. Despite substantial progress in model development, most evaluation protocols rely on clean benchmark datasets, creating a gap between laboratory performance and operational reliability. Existing robustness studies often focus on isolated perturbation types or single model families, lacking a unified benchmarking framework. This study proposes a structured and reproducible benchmarking methodology to systematically evaluate model robustness under controlled data degradation scenarios. Multiple classical machine learning algorithms and deep learning models were assessed across diverse benchmark datasets. Controlled perturbations, including feature noise, label corruption, missingness mechanisms, imbalance ratios, and covariate shifts, were introduced at progressive levels. Performance was evaluated using predictive metrics, robustness degradation rate (RDR), and computational efficiency, with statistical validation across repeated experimental runs. Results indicate that ensemble-based methods consistently achieved the strongest robustness, maintaining degradation rates below 10% under moderate noise and imbalance conditions. Deep neural networks demonstrated superior clean-data accuracy but experienced sharper degradation under structured corruption and distribution shifts. Mitigation strategies such as regularization and resampling reduced degradation by 5–12% under moderate perturbations but showed limited effectiveness under extreme conditions. The findings demonstrate that robustness is multidimensional and dependent on alignment between model inductive bias and data imperfection type. The proposed benchmarking framework provides practical guidance for selecting machine learning models suited to imperfect data environments, advancing reliable and deployment-ready AI systems.
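The abstract does not give a formula for the robustness degradation rate (RDR). A common operationalization, assumed here for illustration rather than taken from the study, is the relative drop from clean-data performance:

```python
def robustness_degradation_rate(clean_score: float, perturbed_score: float) -> float:
    """Relative performance drop under perturbation.

    Assumed definition: (clean - perturbed) / clean, so 0.0 means no
    degradation and 0.10 means a 10% relative drop -- the kind of
    threshold used above to characterize robust behavior.
    """
    if clean_score <= 0:
        raise ValueError("clean_score must be positive")
    return (clean_score - perturbed_score) / clean_score

# Hypothetical example: an ensemble drops from 0.92 to 0.85 accuracy
# under moderate label noise.
rdr = robustness_degradation_rate(0.92, 0.85)
print(f"RDR = {rdr:.1%}")
```

Sweeping this metric over progressive perturbation levels yields a degradation curve per model, which is the comparison the benchmarking framework formalizes.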
Abstract
Background The adoption of common data models (CDMs) has transformed pharmacoepidemiologic research by enabling standardized data formatting and shared analytical tools across institutions. These models facilitate large-scale, multicenter studies and support timely real-world evidence generation. However, no comprehensive global evaluation of CDM applications in pharmacoepidemiology has been conducted.
Objective This study aimed to conduct a systematic review and bibliometric analysis to map the landscape of CDM usage in pharmacoepidemiology, including publication trends, institutional authors and collaborations, and citation impacts.
Methods In total, 5 English databases (PubMed, Web of Science, Embase, Scopus, and Virtual Health Library) and 4 Chinese databases (CNKI, Wan-Fang Data, VIP, and SinoMed) were searched for studies applying CDMs in pharmacoepidemiology from database inception to January 2024. Two reviewers independently screened studies and extracted information about basic publication details, methodological details, and exposure and outcome information. The studies were categorized into 2 groups according to their Total Citations per Year (TCpY), and a comparative analysis was conducted to examine the differences in characteristics between the 2 groups.
Results A total of 308 studies published between 1997 and 2024 were included, involving 1580 authors across 32 countries and 140 journals. The United States led in both publication volume and citation counts, followed by South Korea. Among the 10 most cited studies, 7 used the Vaccine Safety Datalink, 2 used Sentinel, and 1 used the Observational Medical Outcomes Partnership. Studies were stratified by TCpY to reduce citation bias from publication timing. Comparative analysis showed that high-TCpY studies were significantly more associated with multicenter collaboration.
Conclusions This study presents the first bibliometric overview of CDM-based pharmacoepidemiologic research. The consistent output from United States institutions and the increasing engagement from South Korea underscore their central roles in this field. High-TCpY studies tend to be multicenter, collaborative, and vaccine-focused, reflecting structural factors linked to research visibility and influence. Stratified citation analysis supports the value of real-world data integration and international cooperation in producing impactful studies. The dominance of high-income countries in collaboration networks highlights a need for broader inclusion of underrepresented regions. These findings can help researchers identify key contributors, guide partner selection, and target appropriate journals. As CDM-based methods continue to expand, fostering diverse and collaborative research efforts will be crucial for advancing pharmacoepidemiologic knowledge globally.
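Total Citations per Year (TCpY), the stratification variable used in this review, normalizes a paper's citation count by its age. The exact operationalization below (an inclusive calendar-year count) is an assumption for illustration, not necessarily the authors' computation:

```python
def tcpy(total_citations: int, pub_year: int, current_year: int) -> float:
    """Total Citations per Year: citations divided by inclusive publication
    age, so a paper published in the current year counts as one year old."""
    age_years = current_year - pub_year + 1
    if age_years < 1:
        raise ValueError("publication year is in the future")
    return total_citations / age_years

# A 2014 paper with 110 citations counted in 2024 spans 11 calendar years.
print(tcpy(110, 2014, 2024))  # 10.0
```

Normalizing by age in this way is what lets the review compare a 1997 paper with a 2023 one without the older paper winning on accumulated citations alone.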
Computer applications to medicine. Medical informatics
Abstract Background Retinal vein occlusion (RVO) is a significant retinal vascular disorder that has been hypothesized to increase the risk of cerebrovascular accidents (CVA). Given the shared vascular pathology between the retina and the cerebral circulation, understanding the association between RVO and stroke incidence is critical for early intervention and risk management. This systematic review and meta-analysis aims to evaluate the risk of CVA, including ischemic and hemorrhagic subtypes, in patients with RVO. Methods This study followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines and was registered in PROSPERO (CRD42024557820). A systematic search of PubMed, Cochrane Library, Scopus, Web of Science, and Embase was conducted from inception to February 2025. Studies assessing the incidence of CVA post-RVO in adult patients (≥ 18 years) were included. Two independent reviewers performed study selection and data extraction; quality assessment of the observational cohort studies used the Cochrane Risk of Bias tool for Non-Randomized studies (ROBINS-I). Meta-analysis was conducted using Comprehensive Meta-Analysis (CMA) software version 3.7, applying a fixed-effects model given the low heterogeneity. Subgroup and sensitivity analyses were performed based on RVO type (BRVO vs. CRVO) and stroke subtype (ischemic vs. hemorrhagic CVA). Publication bias was evaluated using Egger’s test and funnel plots. Results A total of 14 studies (n = 97,812 patients) were included. The pooled event rate for CVA post-RVO was 37.5% (95% CI: 37.3%–37.8%), with no significant heterogeneity (I2 = 0%, p = 0.97). Subgroup analysis showed that both ischemic CVA (37.8%; 95% CI: 37.3%–38.3%) and hemorrhagic CVA (32.7%; 95% CI: 32.3%–33.1%) occurred at similar rates across branch retinal vein occlusion (BRVO) and central retinal vein occlusion (CRVO).
The mortality rate post-CVA in RVO patients was 69.0% (95% CI: 68.4%–69.5%), highlighting the severity of stroke outcomes in this population. The incidence of ischemic cardiovascular events, including myocardial infarction, was 15.7% (95% CI: 15.4%–16.0%), reinforcing the need for cardiovascular monitoring in RVO patients. The incidence of deep vein thrombosis (DVT) was relatively low (0.05%) but still warrants clinical attention in high-risk populations. Publication bias was minimal, as confirmed by Egger’s test (p > 0.24) and funnel plot symmetry. Sensitivity analyses confirmed the robustness of the pooled estimates. Conclusion This meta-analysis provides strong evidence linking RVO to an increased risk of CVA and mortality. Given the high incidence of stroke (37.5%) and mortality post-CVA (69%), early cardiovascular risk assessment and intervention are crucial. Patients with RVO should undergo comprehensive vascular risk evaluation, including blood pressure control, lipid regulation, and anticoagulation therapy when indicated. The findings support a multidisciplinary approach involving ophthalmologists, neurologists, and cardiologists for proactive stroke prevention strategies in RVO patients. Future research should explore genetic predispositions, inflammatory markers, and AI-based predictive models to improve early risk stratification and intervention.
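The pooled event rates reported above come from a fixed-effects meta-analysis run in CMA. As a rough sketch of the underlying computation (an inverse-variance pooling of logit-transformed proportions, a standard approach; the study counts below are made up and are not the actual data from the 14 included studies):

```python
import math

def pooled_event_rate(studies: list[tuple[int, int]]) -> tuple[float, float, float]:
    """Fixed-effect inverse-variance pooling of event proportions.

    Each study is (events, total). Proportions are logit-transformed,
    weighted by inverse variance, pooled, then back-transformed.
    Returns (pooled_rate, ci_low, ci_high) with a 95% CI.
    """
    weighted_sum = total_weight = 0.0
    for events, total in studies:
        p = events / total
        logit = math.log(p / (1 - p))
        variance = 1 / events + 1 / (total - events)  # var of logit(p)
        weight = 1 / variance
        weighted_sum += weight * logit
        total_weight += weight
    pooled_logit = weighted_sum / total_weight
    se = math.sqrt(1 / total_weight)
    expit = lambda x: 1 / (1 + math.exp(-x))
    return (expit(pooled_logit),
            expit(pooled_logit - 1.96 * se),
            expit(pooled_logit + 1.96 * se))

# Hypothetical studies with event rates near the reported 37.5%.
rate, lo, hi = pooled_event_rate([(375, 1000), (1130, 3000), (745, 2000)])
print(f"pooled rate {rate:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

A fixed-effects model like this is appropriate precisely because the abstract reports no significant heterogeneity (I2 = 0%); with substantial heterogeneity a random-effects model would be used instead.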
Oddvar Uleberg, Sarah King, Lars Petter Bjørnsen
et al.
Introduction Delirium is commonly observed in older patients who are admitted to the emergency department (ED). Previous systematic reviews have identified poor outcomes associated with delirium in surgical, intensive care and other hospital settings, yet none have specifically considered the ED. This systematic review aims to examine associations between older patients who present with or develop delirium in the ED and adverse outcomes within the hospital and after discharge. Methods and analysis Searches will be conducted in MEDLINE, Embase, Web of Science, Cumulative Index to Nursing and Allied Health Literature, and the Cochrane Library. There will be no date or language restrictions. Key terms will include concepts related to delirium, the ED and older adults. Observational studies or non-intervention clinical studies will be included that compare outcomes in older patients (ie, ≥65 years) with and without delirium. Outcomes of interest will include length of hospital stay, non-home discharge (eg, nursing home/residential aged care facility), cognitive impairment, decreased physical function, mortality, readmission to hospital and quality of life measures. Two reviewers will independently screen the studies. Data extraction and quality assessment will be performed by one reviewer and checked by a second reviewer, with any disagreements resolved by discussion or by a third reviewer. Where appropriate, data will be combined in a meta-analysis and a GRADE assessment will be made for each outcome. All methods will be guided by the Cochrane Handbook and the Centre for Reviews and Dissemination and reported following the Preferred Reporting Items for Systematic Review and Meta-Analysis statement as well as the recommendations set out by the Meta-analysis Of Observational Studies in Epidemiology group. Ethics and dissemination As this systematic review will use published data, ethical approval is not required.
The results will be disseminated through a peer-reviewed publication and conference presentations. PROSPERO registration number: CRD42024594975.
Abstract Blood Flow Restriction Training (BFRT) is a low-load training technique that involves applying pressure to partially restrict arterial blood flow while occluding venous return. Despite its growing popularity, there is still no consensus on how combining BFRT with resistance or aerobic training influences hemodynamic responses, or on the safest and most effective methods for implementing it. This review aims to systematically identify the effects of BFRT on hemodynamic parameters. A systematic review was conducted following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement guidelines. The Chinese literature search was performed in the China National Knowledge Infrastructure (CNKI) database, and the English literature search in the Web of Science, PubMed, and Google Scholar databases. The included studies involved human subjects, their outcome indicators included hemodynamic evaluation indicators, and only randomized controlled trials and randomized crossover trials were considered. Literature in languages other than Chinese or English, duplicate studies, and studies with missing data were excluded. An adapted STROBE checklist was used to assess the risk of bias, and 44 articles were included in this review. Results indicated that BFRT increases heart rate and blood lactate levels, while its effect on blood oxygen saturation varies. Additionally, BFRT significantly enhances cardiac output but may either have no significant effect on stroke volume or cause it to decrease. Furthermore, BFRT improves pulse wave velocity from the femoral to the posterior tibial artery, suggesting a positive influence on cardiovascular function. BFRT induces changes in arterial structure and function, with these indicators interacting to produce both positive and negative effects on cardiovascular health.
The primary mechanisms by which BFRT influences hemodynamics include the activation of the sympathetic and vagus nerves, as well as the regulation of chemical mediators in body fluids that modulate cardiovascular function. Convenient, economical, non-invasive, and easily measurable hemodynamic indicators are expected to become an efficient tool for evaluating the effects of exercise training. Further research is needed to establish the optimal compression thresholds and durations for different populations and exercise types, as well as to assess the long-term impact of BFRT on hemodynamic parameters.
Bilge Gök, Elçin Çetinkale, Zehra Kurşun - Şen
et al.
This research aims to obtain the opinions of academics on the use of artificial intelligence applications such as ChatGPT in the teaching-learning process and in classroom applications in primary schools. The case study method, one of the qualitative research approaches, was used. The study group consists of 10 academics working at state and foundation universities. A semi-structured interview form prepared by the researchers was used as the data collection tool. The data obtained were analyzed using the content analysis method. The findings reveal that artificial intelligence tools make significant contributions to the teaching-learning process in areas such as measurement and evaluation and material preparation; to teachers in terms of saving time and understanding the curriculum; to professional development in terms of following the literature and improving field knowledge; and to students in areas such as increasing motivation and supporting individual learning. However, negative aspects such as ethical and security issues, the weakening of higher-order thinking skills in students, and mislearning were also pointed out. In addition, it was stated that artificial intelligence tools can be used effectively in various classroom applications in social studies, Turkish, mathematics, and science courses. The study emphasizes the importance of the balanced and conscious integration of artificial intelligence in education.
Shuvechha Chakraborty, Indumathi Palanikumar, Yash Gune
et al.
Abstract Candida albicans (CAL), responsible for approximately 70% of all Candida infections, is a leading cause of invasive candidiasis and poses a significant global health threat. With the emergence of drug-resistant strains, mortality rates have reached a staggering 63.6% in severe cases, complicating treatment options and demanding the discovery of novel therapeutic targets. To address this pressing need, using a unique multidisciplinary approach, we attempted to identify some of the critical metabolic pathways that can be targeted to modulate the virulence of CAL. Condition-specific genome-scale metabolic models (GSMMs), along with a novel integrated host-CAL model developed in this study, highlighted the central role of arginine (Arg) metabolism and uncovered ALT1, an arginine biosynthesis enzyme, as a critical metabolic vulnerability in CAL virulence. Heightened expression of arginine biosynthesis genes indicated that increased arginine synthesis occurred mainly through proline intermediates during host interaction. The significantly impaired virulence and in vivo pathogenicity of ALT1-deleted CAL highlighted the potential of targeting arginine metabolism as a novel strategy to combat antifungal resistance and underscored the power of integrating systems biology with experimental approaches in identifying new therapeutic targets.
The rapid evolution of communication networks has been significantly influenced by advancements in data science, artificial intelligence (AI), and machine learning (ML). Data-driven approaches enhance network performance, optimize bandwidth allocation, and improve security in telecommunications. This paper explores the role of data science in enhancing communication networks, covering key applications such as AI-driven network traffic prediction, fault detection, and 5G network optimization. Additionally, challenges such as data privacy, latency issues, and cybersecurity threats are discussed, along with emerging trends in AI-powered smart networking.