Understanding the dynamics of passenger interactions and their epidemiological impact throughout public transportation systems is crucial for both service efficiency and public health. High passenger density and close physical proximity have been shown to accelerate the spread of infectious diseases. During the COVID-19 pandemic, many public transportation companies took measures to slow and minimize disease spread; one of these measures was introducing spacing and capacity constraints in public transit vehicles. Our objective is to explore the effects of demand changes and transportation measures from an epidemiological point of view, offering public transportation companies alternative measures that keep the system running while minimizing epidemiological risk as much as possible.
Yuri Gardinazzi, Roger Gonzaléz March, Suprabhath Kalahasti
et al.
Comorbidity networks, which capture disease-disease co-occurrence usually based on electronic health records, reveal structured patterns in how diseases cluster and progress across individuals. However, how these networks evolve across different age groups and how this evolution relates to properties like disease prevalence and mortality remains understudied. To address these issues, we used publicly available comorbidity networks extracted from a comprehensive dataset of 45 million Austrian hospital stays from 1997 to 2014, covering 8.9 million patients. These networks grow and become denser with age. We identified groups of diseases that exhibit similar patterns of structural centrality throughout the lifespan, revealing three dominant age-related components with peaks in early childhood, midlife, and late life. To uncover the drivers of this structural change, we examined the relationship between prevalence and degree. This allowed us to identify conditions that were disproportionately connected to other diseases. Using betweenness centrality in combination with mortality data, we further identified high-mortality bridging diseases. Several diseases show high connectivity relative to their prevalence, such as iron deficiency anemia (D50) in children, nicotine dependence (F17), and lipoprotein metabolism disorders (E78) in adults. We also highlight structurally central diseases with high mortality that emerge at different life stages, including cancers (C group), liver cirrhosis (K74), subarachnoid hemorrhage (I60), and chronic kidney disease (N18). These findings underscore the importance of targeting age-specific, network-central conditions with high mortality for prevention and integrated care.
Recent advancements in speech-to-text translation have led to the development of multilingual models capable of handling multiple language pairs simultaneously. However, these unified models often suffer from large parameter sizes, making it challenging to balance inference efficiency and performance, particularly in local deployment scenarios. We propose an innovative Parasitic Dual-Scale Approach, which combines an enhanced speculative sampling method with model compression and knowledge distillation techniques. Building on the Whisper Medium model, we enhance it into whisperM2M for multilingual speech translation and integrate our novel KVSPN module, achieving state-of-the-art (SOTA) performance across six popular languages with improved inference efficiency. KVSPN enables a 40% speedup with no BLEU score degradation; combined with distillation methods, it yields a 2.6× speedup over the original Whisper Medium with superior performance.
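The speculative sampling idea underlying this family of methods can be illustrated in a few lines. The sketch below is a generic toy version (it is not the paper's KVSPN module, and the draft/target distributions are made up): a cheap draft distribution proposes a token, and the target distribution accepts it with probability min(1, p/q) or resamples from the normalized residual, which provably reproduces the target distribution exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

def speculative_accept(p_target, q_draft, token, rng):
    """Accept a draft token with probability min(1, p/q); on rejection,
    resample from the normalized residual max(p - q, 0)."""
    p, q = p_target[token], q_draft[token]
    if rng.random() < min(1.0, p / q):
        return token
    residual = np.maximum(p_target - q_draft, 0.0)
    residual /= residual.sum()
    return rng.choice(len(p_target), p=residual)

# Toy vocabulary of 4 tokens: target and draft distributions differ.
p_target = np.array([0.5, 0.2, 0.2, 0.1])
q_draft = np.array([0.25, 0.25, 0.25, 0.25])

# Drawing many draft tokens and filtering them this way recovers the
# target distribution, which is the key correctness property.
samples = [speculative_accept(p_target, q_draft,
                              rng.choice(4, p=q_draft), rng)
           for _ in range(20000)]
freq = np.bincount(samples, minlength=4) / len(samples)
print(np.round(freq, 2))
```

Because acceptance only needs one forward pass of the target model per batch of draft tokens, speedups like the reported 40% come from the draft model being much cheaper than the target.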
Background. Infectious diseases, particularly COVID-19, continue to be a significant global health issue. Although many countries have reduced or stopped large-scale testing measures, the detection of such diseases remains a priority. Objective. This study aims to develop a novel, lightweight deep neural network for efficient, accurate, and cost-effective detection of COVID-19 using nasal breathing audio data collected via smartphones. Methodology. Nasal breathing audio from 128 patients diagnosed with the Omicron variant was collected. Mel-Frequency Cepstral Coefficients (MFCCs), a widely used feature in speech and sound analysis, were employed to extract important characteristics from the audio signals. Additional feature selection was performed using Random Forest (RF), and Principal Component Analysis (PCA) was applied for dimensionality reduction. A Dense-ReLU-Dropout model was trained with K-fold cross-validation (K=3), and performance metrics such as accuracy, precision, recall, and F1-score were used to evaluate the model. Results. The proposed model achieved 97% accuracy in detecting COVID-19 from nasal breathing sounds, outperforming state-of-the-art methods such as those of [23] and [13]. Our Dense-ReLU-Dropout model, using RF and PCA for feature selection, achieves high accuracy with greater computational efficiency than existing methods that require more complex models or larger datasets. Conclusion. The findings suggest that the proposed method holds significant potential for clinical implementation, advancing smartphone-based diagnostics for infectious diseases. The Dense-ReLU-Dropout model, combined with innovative feature processing techniques, offers a promising approach for efficient and accurate COVID-19 detection, showcasing the capabilities of mobile device-based diagnostics.
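The feature-processing pipeline described above (RF-based feature selection, then PCA, then a dense ReLU network with cross-validation) can be sketched as follows. This is an illustrative sketch on synthetic stand-in features, not the authors' implementation: real MFCCs would come from an audio library, and scikit-learn's MLP uses L2 regularization rather than dropout.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Stand-in for MFCC feature vectors (e.g. 40 coefficients averaged per
# recording); labels are synthetic for demonstration only.
X = rng.normal(size=(128, 40))
y = (X[:, 0] + X[:, 1] + 0.5 * rng.normal(size=128) > 0).astype(int)

# Step 1: rank features by Random Forest importance, keep the top 20.
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:20]

# Step 2: PCA for further dimensionality reduction.
X_red = PCA(n_components=10, random_state=0).fit_transform(X[:, top])

# Step 3: dense ReLU network scored with 3-fold cross-validation (K=3).
mlp = MLPClassifier(hidden_layer_sizes=(32,), activation="relu",
                    alpha=1e-3, max_iter=2000, random_state=0)
scores = cross_val_score(mlp, X_red, y, cv=3)
print(round(scores.mean(), 2))
```

The appeal of this pipeline for mobile deployment is that each stage shrinks the input, so the final classifier stays small.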
Shaheer Ahmad Khan, Muhammad Usamah Shahid, Ahmad Abdullah
et al.
This study addresses a critical gap in the healthcare system by developing a clinically meaningful, practical, and explainable disease surveillance system for multiple chronic diseases, utilizing routine EHR data from multiple U.S. practices integrated with CureMD's EMR/EHR system. Unlike traditional systems, whose AI models rely on features from patients' lab results, our approach focuses on routinely available data, such as medical history, vitals, diagnoses, and medications, to preemptively assess the risk of chronic diseases in the next year. For each chronic disease, we trained three distinct prediction models that forecast the risk of a disease 3, 6, and 12 months before a potential diagnosis. We developed Random Forest models, which were internally validated using F1 scores and AUROC as performance metrics and further evaluated by a panel of expert physicians for clinical relevance based on inferences grounded in medical knowledge. Additionally, we discuss our implementation of integrating these models into a practical EMR system. Beyond using Shapley attributes and surrogate models for explainability, we also introduce a new rule-engineering framework to enhance the intrinsic explainability of Random Forests.
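The per-horizon setup, one Random Forest for each of the 3-, 6-, and 12-month windows, validated with F1 and AUROC, can be sketched as below. The features and labels here are synthetic placeholders; the study's actual feature set comes from CureMD EHR data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Synthetic stand-in for routine EHR features (vitals, history flags,
# medication indicators) and one label per prediction horizon.
X = rng.normal(size=(500, 12))
labels = {h: (X[:, i] + 0.3 * rng.normal(size=500) > 0.5).astype(int)
          for i, h in enumerate(["3m", "6m", "12m"])}

# One Random Forest per horizon, internally validated with F1 and AUROC.
results = {}
for horizon, y in labels.items():
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_tr, y_tr)
    results[horizon] = (f1_score(y_te, clf.predict(X_te)),
                        roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))

for horizon, (f1, auroc) in results.items():
    print(horizon, round(f1, 2), round(auroc, 2))
```

Training separate models per horizon, rather than one model with a horizon input, keeps each forest's decision paths interpretable for the rule-engineering step.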
Urinary tract infections (UTIs) are among the most common infections, second only to respiratory tract infections. Millions of UTI cases are reported each year, affecting in- and outpatients. The most frequent causative agents of UTIs are the enteric Gram-negative bacteria, among which Escherichia coli (E. coli) dominates. While most strains of E. coli are harmless and indeed play a beneficial role in gut health, some strains (uropathogenic Escherichia coli, UPEC) can cause infections when they are translocated to generally sterile body areas, such as the urinary tract. This review presents the wide range of virulence factors of UPEC involved in urinary tract colonization, infection development, and host tissue invasion. Cell-associated and extracellular key virulence factors such as adhesins, invasins, iron acquisition factors, factors mediating serum resistance, toxins, and structural components are discussed in detail. The review also focuses on the process of biofilm formation, another crucial virulence factor in UPEC, responsible for UTI persistence, recurrence, and antimicrobial therapy failure. The regulatory mechanisms involved in biofilm production are also discussed.
Intravenous immunoglobulin (IVIg) was used for the first time in the late seventies for the treatment of patients with primary and secondary immunodeficiencies. The first observations opened a wide field for basic and clinical research, leading to a rapidly expanding use of IVIg for the treatment of patients with multiple diseases. The immunoglobulin preparations contain a large amount of intact IgG molecules, with subclass proportions comparable to those in native plasma. These properties account for the normal half-life of injected immunoglobulin of three weeks and for its ability to react normally with the complement components and with the specific Fcγ-receptors on the surface of phagocytes and lymphocytes. IVIg is produced from plasma pooled from several thousand donors, which leads to a wide spectrum of variable regions of IgG molecules in the product. Some of the antibodies in the product recognize bacterial, viral, and fungal antigens and are essential in replacement therapy in patients with antibody deficiency. A good understanding of the molecular and cellular basis of the immunoregulatory actions of intravenous immunoglobulin preparations is important for optimizing their use in inflammatory diseases and for conducting new clinical observations.
Bimarsha Khanal, Paras Poudel, Anish Chapagai
et al.
Plant diseases significantly impact our food supply, causing problems for farmers, economies reliant on agriculture, and global food security. Accurate and timely plant disease diagnosis is crucial for effective treatment and minimizing yield losses. Despite advancements in agricultural technology, precise and early diagnosis remains a challenge, especially in underdeveloped regions where agriculture is crucial and agricultural experts are scarce. However, adopting Deep Learning applications can assist in accurately identifying diseases without needing plant pathologists. In this study, we evaluate the effectiveness of various computer vision models for detecting paddy diseases and propose the best deep learning-based disease detection system. Both classification and detection are tested and evaluated on the Paddy Doctor dataset, which contains over 20,000 annotated images of paddy leaves for disease diagnosis. A YOLOv8-based model was used for paddy disease detection, while CNN models and the Vision Transformer were used for disease classification. The detection task achieved an average mAP50 of 69%, and the Vision Transformer reached a classification accuracy of 99.38%. It was found that detection models are effective at identifying multiple diseases simultaneously with less computing power, whereas classification models, though computationally expensive, exhibit better performance for classifying single diseases. Additionally, a mobile application was developed to enable farmers to identify paddy diseases instantly. Experiments with the app showed encouraging results in utilizing the trained models for both disease classification and treatment guidance.
Yi Jiang, Kristin M. Kurianski, Jane HyoJin Lee
et al.
We develop a mechanistic model that classifies individuals both in terms of epidemiological status (SIR) and vaccination attitude (willing or unwilling), with the goal of discovering how disease spread is influenced by changing opinions about vaccination. Analysis of the model identifies existence and stability criteria for both disease-free and endemic disease equilibria. The analytical results, supported by numerical simulations, show that attitude changes induced by disease prevalence can destabilize endemic disease equilibria, resulting in limit cycles.
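A minimal numerical version of such a model can be sketched as follows. This is an illustrative Euler integration with made-up parameters, not the paper's calibrated system: willing susceptibles vaccinate at rate nu, and opinions flow between willing and unwilling groups at prevalence-dependent rates, with vaccinated individuals folded into the removed class for simplicity.

```python
# SIR model with vaccination attitudes: Sw (willing susceptibles),
# Su (unwilling susceptibles), I (infectious), R (removed/vaccinated).
# All parameters are illustrative, not fitted values.
beta, gamma, nu = 0.3, 0.1, 0.05   # transmission, recovery, vaccination
a, b = 2.0, 0.5                    # attitude switching: u->w and w->u
Sw, Su, I, R = 0.3, 0.69, 0.01, 0.0
dt, T = 0.1, 400

history = []
for _ in range(int(T / dt)):
    infections_w = beta * Sw * I
    infections_u = beta * Su * I
    # Prevalence-dependent opinion flow: high prevalence pushes the
    # unwilling toward willingness; low prevalence does the reverse.
    switch = a * I * Su - b * (1 - I) * Sw
    dSw = -infections_w - nu * Sw + switch
    dSu = -infections_u - switch
    dI = infections_w + infections_u - gamma * I
    dR = gamma * I + nu * Sw
    Sw, Su, I, R = Sw + dt * dSw, Su + dt * dSu, I + dt * dI, R + dt * dR
    history.append(I)

print(max(history))  # epidemic peak prevalence
```

Because the switching term couples attitudes to prevalence, richer versions of this feedback are exactly what can destabilize the endemic equilibrium and produce the limit cycles the analysis identifies.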
Vaccines utilizing modified messenger RNA (mRNA) technology have shown robust protective efficacy against SARS-CoV-2 in humans. As the virus continues to evolve in both human and non-human hosts, risk remains that the performance of the vaccines can be compromised by new variants with strong immune escape abilities. Here we present preclinical characterizations of a novel bivalent mRNA vaccine, RQ3025, for its safety and effectiveness in animal models. The mRNA sequence of the vaccine is designed to incorporate common mutations on the SARS-CoV-2 spike protein that have been discovered along the evolutionary paths of different variants. Broad-spectrum, high-titer neutralizing antibodies against multiple variants were induced in mice (BALB/c and K18-hACE2), hamsters, and rats upon injection of RQ3025, demonstrating advantages over the monovalent mRNA vaccines. Effectiveness in protection against several newly emerged variants is also evident in RQ3025-vaccinated rats. Analysis of splenocyte-derived cytokines in BALB/c mice suggested that a Th1-biased cellular immune response was induced by RQ3025. Histological analysis of multiple organs in rats following injection of a high dose of RQ3025 showed no evidence of pathological changes. This study demonstrates the safety and effectiveness of RQ3025 as a broad-spectrum vaccine against SARS-CoV-2 variants in animal models and lays the foundation for its potential clinical application in the future.
Mandy Swann, Amy Lucas, Christian Ostrowski
et al.
Background: Patients without urinary tract infection (UTI) symptoms but with a positive urine culture are considered to have asymptomatic bacteriuria (ASB). This often represents colonization, and treatment is neither recommended nor clinically beneficial. Treatment of ASB can promote antimicrobial resistance and increased rates of Clostridioides difficile infections. Many cases of ASB are incorrectly assigned as CAUTIs due to over-culturing practices. We hypothesized that a urine culture algorithm, embedded within a best practice alert (BPA) in the electronic medical record (EMR), would reduce urine culturing practices for ASB. Methods: From Feb 2022 through May 2023, a multidisciplinary team implemented an Inpatient Urine Culturing Stewardship Guideline. A BPA fired when a provider placed a urinalysis with reflex to culture (UACC) or urine culture (UC) order for patients who met criteria (Image 1). The BPA directed providers to remove the order, select the appropriate pathway from the guideline, or provide a rationale for placing the order. The intervention was piloted on three intensive care units and two progressive care units, containing both medical and surgical patients. Monthly ordering practices, CAUTI rates, and gram-negative rod (GNR) bacteremia rates from a 13-month pre-intervention baseline period were compared to a 16-month intervention period. Over the same time periods, we also assessed changes in ordering practices for comparison units which did not implement the intervention. Pre- and post-intervention cohorts were analyzed using median two-sample tests and the Exact Poisson Method, as appropriate. Results: On intervention units there was a 41.0% reduction in the median number of UACC and UC orders per 1000 patient days, from 16.31 during the baseline period to 9.62 in the intervention period (p=0.0036). Pan cultures per 1000 patient days in which one of the orders was a UACC or UC fell by 42.2%, from a median of 10.20 per 1000 patient days to 5.90 (p=0.0008).
The comparison units saw no significant reductions in UACC and UC orders (p=0.21) or pan cultures (p=1.0). On the intervention units, the CAUTI rate for the baseline period was 1.31 per 1000 catheter days versus 0.79 in the intervention period (IRR = 1.65; p=0.44). GNR bacteremias remained stable on the intervention units between the baseline and intervention periods (p=0.82). Conclusion: This multidisciplinary intervention, leveraging EMR clinical decision support, reduced urine and pan culturing practices while demonstrating a trend towards a reduced CAUTI rate. The prevalence of GNR bacteremias remained consistent with baseline levels, suggesting the intervention did not cause harm.
Infectious and parasitic diseases, Public aspects of medicine
Jirarat Songsri, Moragot Chatatikun, Sueptrakool Wisessombat
et al.
Background: Burkholderia pseudomallei, a Gram-negative pathogen, causes melioidosis. Although various clinical laboratory identification methods exist, culture-based techniques lack comprehensive evaluation. Thus, this systematic review and meta-analysis aimed to assess the diagnostic accuracy of culture-based automation and non-automation methods. Methods: Data were collected via PubMed/MEDLINE, EMBASE, and Scopus using specific search strategies. Selected studies underwent bias assessment using QUADAS-2. Sensitivity and specificity were computed, generating pooled estimates. Heterogeneity was assessed using I2 statistics. Results: The review encompassed 20 studies with 2988 B. pseudomallei samples and 753 non-B. pseudomallei samples. Automation-based methods, particularly with updating databases, exhibited high pooled sensitivity (82.79%; 95% CI 64.44–95.85%) and specificity (99.94%; 95% CI 98.93–100.00%). Subgroup analysis highlighted superior sensitivity for updating-database automation (96.42%, 95% CI 90.01–99.87%) compared to non-updating (3.31%, 95% CI 0.00–10.28%), while specificity remained high at 99.94% (95% CI 98.93–100%). Non-automation methods displayed varying sensitivity and specificity. In-house latex agglutination demonstrated the highest sensitivity (100%; 95% CI 98.49–100%), followed by commercial latex agglutination (99.24%; 95% CI 96.64–100%). However, API 20E had the lowest sensitivity (19.42%; 95% CI 12.94–28.10%). Overall, non-automation tools showed sensitivity of 88.34% (95% CI 77.30–96.25%) and specificity of 90.76% (95% CI 78.45–98.57%). Conclusion: The study underscores automation's crucial role in accurately identifying B. pseudomallei, supporting evidence-based melioidosis management decisions. Automation technologies, especially those with updating databases, provide reliable and efficient identification.
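Pooled sensitivity and specificity of the kind reported above are computed from per-study 2x2 counts. The sketch below uses hypothetical counts and a simple size-weighted fixed-effect pooling with Cochran's Q and I² for heterogeneity; the meta-analysis itself likely used a more elaborate random-effects model.

```python
# Hypothetical per-study 2x2 counts (TP, FN, TN, FP) for one
# identification method; real counts come from the 20 reviewed studies.
studies = [(95, 5, 50, 1), (80, 12, 40, 0), (110, 8, 60, 2)]

# Size-weighted fixed-effect pooling of proportions.
sens = (sum(tp for tp, fn, tn, fp in studies)
        / sum(tp + fn for tp, fn, tn, fp in studies))
spec = (sum(tn for tp, fn, tn, fp in studies)
        / sum(tn + fp for tp, fn, tn, fp in studies))

# Cochran's Q and I^2 across the per-study sensitivities: I^2 is the
# fraction of variability beyond what sampling error alone explains.
ps = [tp / (tp + fn) for tp, fn, tn, fp in studies]
ws = [tp + fn for tp, fn, tn, fp in studies]
q = sum(w * (p - sens) ** 2 / (sens * (1 - sens)) for w, p in zip(ws, ps))
i2 = max(0.0, (q - (len(studies) - 1)) / q) if q > 0 else 0.0

print(round(sens, 3), round(spec, 3), round(i2, 2))
```

Subgroup estimates (e.g. updated vs. non-updated databases) follow by applying the same pooling to each subgroup's studies separately.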
Ritesh Chandra, Sadhana Tiwari, Sonali Agarwal
et al.
Vector-borne diseases (VBDs) are infections caused by pathogens (parasites, bacteria, and viruses) transmitted through the bites of infected vectors such as ticks, mosquitoes, triatomine bugs, blackflies, and sandflies. If these diseases are not properly treated within a reasonable time frame, the mortality rate may rise. In this work, we propose a set of ontologies that will help in the diagnosis and treatment of vector-borne diseases. To develop the VBD ontology, electronic health records taken from the Indian Health Records website, text data generated from Indian government medical mobile applications, and doctors' handwritten prescription notes for patients are used as input. After pre-processing, this data is converted into correct text using Optical Character Recognition (OCR) and a spelling checker. Natural Language Processing (NLP) is applied for entity extraction from the text data to build Resource Description Framework (RDF) medical data with the help of the Patient Clinical Data (PCD) ontology. Afterwards, Basic Formal Ontology (BFO), National Vector Borne Disease Control Program (NVBDCP) guidelines, and the RDF medical data are used to develop ontologies for VBDs, and Semantic Web Rule Language (SWRL) rules are applied for diagnosis and treatment. The developed ontology helps in the construction of decision support systems (DSS) for the NVBDCP to control these diseases.
Thamires Souza Pires, Wemerson de Oliveira Freitas, Geser Mascarenhas de Barros
et al.
Introduction/Objective: In Brazil, the BCG (Bacillus Calmette-Guérin) vaccine is recommended by the National Immunization Program after birth and is the main means of preventing severe tuberculosis in children under 5 years of age. With the advent of COVID-19, Brazil recorded an increase in the number of tuberculosis deaths for the first time in a decade. This may, among other causes, reflect the decline in BCG vaccination coverage in the country. Accordingly, our study aims to comparatively analyze the profile of BCG vaccination coverage across Brazilian capitals between 2018 and 2022. Methods: This is an ecological study using data extracted from TABNET/DATASUS, collected in May 2023, on BCG vaccination coverage in Brazilian capitals from 2018 to 2022. The data were tabulated in Excel 2019, where the percentage change in vaccination coverage over the study period was calculated. Results: Comparing 2018 and 2019, a downward trend in BCG immunization is seen in some capitals, such as São Luís (-57.8%), Cuiabá (-75.2%), and Florianópolis (-86.44%). In 2020, compared with 2018, coverage fell in the capitals São Paulo (-36.52%), Recife (-46.87%), and Campo Grande (-83.17%). In 2021, vaccination coverage decreased in most capitals compared with 2018, including Salvador (-24.22%) and Porto Velho (-34.69%), most notably Florianópolis (-95.6%). In 2022, an increase was seen in most capitals compared with 2021, such as Rio de Janeiro (+19.85%), Aracaju (+85.1%), and São Luís (+144.16%). In 2022, coverage in Salvador remained in decline (24.4%) compared with 2021. During the study period, Brasília, Porto Alegre, and Manaus showed no significant variation in BCG vaccination coverage rates.
Conclusion: Although BCG vaccination coverage between 2018 and 2022 did not decline in some capitals, it followed a downward trend in most Brazilian capitals, especially in 2020-2021 (a period coinciding with the COVID-19 pandemic), with Florianópolis showing the lowest coverage. Coverage is likely to increase in the coming years, given that in 2022 many capitals saw a substantial jump, as in São Luís. It can be inferred that the pandemic contributed to the drop in BCG vaccination uptake and that encouraging immunization is essential for achieving good public health outcomes.
Reinaldo Perez, Michael Yarrington, Connor Deri
et al.
Background: Antimicrobial stewardship strategies must be tailored to effectively engage prescribers with distinct training, experiences, and career paths. Advanced practice providers (APPs) have taken on increasing roles as primary team members in acute-care hospitals, but the impact of this practice shift on antimicrobial prescribing is unknown. We describe longitudinal trends in antimicrobial days of therapy (DOT) by attributed provider type in 3 hospitals. Methods: We performed a retrospective time-series analysis of antimicrobial use for the 7-year period of July 2015–June 2022 to investigate the changes by provider type at 3 hospitals: a major university hospital and 2 community hospitals. DOT, antibacterial, and antifungal agent groups were defined using National Healthcare Safety Network methods. We included anti-influenza and antiherpesvirus agents in the antiviral group. We defined protected agents as those targeted by hospital antimicrobial stewardship program policy (eg, requiring preauthorization). Provider type was defined by electronic health record user profiles in 3 categories: physician, trainees (residents, fellows and medical students), and APPs (nurse practitioners, physician assistants, and nurse anesthetists). We evaluated DOT per 1,000 days present over time by agent group to assess quarterly rate trends. Then, we calculated the percentage of total DOT by provider group. We used multinomial logistic regression to measure changes in percentage DOT across the clinician groups over time using physicians as the referent. Results: Across hospitals and provider groups, we observed an overall decrease in use rates for antibacterial and protected agents (17% each) and increased use rates for antiviral agents (38%) and antifungal agents (4%) (Table 1). Baseline distribution of DOT by provider group and change in distribution over time varied by hospital and agent group (Fig. and Table 2). 
The largest increases in percentage DOT attributed to APPs compared with physicians occurred in the university hospital with the following average increases per quarter: 1.5% for antibacterials, 3.9% for antivirals, 3.3% for antifungals, and 3.8% for protected agents (Table 3). Community hospitals had higher initial percentage DOT attributed to physicians, but both hospitals experienced increased percentage DOT attributed to APPs. Percentage DOT attributed to trainees varied significantly across agent groups and hospitals. Conclusions: Hospitals had differing baseline patterns of DOT attributed to provider groups, but all experienced increases in DOT attributed to APPs. APPs have increasing involvement in antimicrobial use decisions and should be engaged in future antimicrobial stewardship initiatives.
According to the World Health Organization, infectious and parasitic diseases account for about a quarter of all deaths in the world. In Belarus, 2.0-2.5 million cases of infectious diseases are registered annually. An analysis of the levels and trends of mortality from infectious and parasitic diseases is important for improving the practice of anti-epidemic work. Aims: To analyze mortality from infectious and parasitic diseases in the Gomel region for the period 2009-2019. For the study, official data of the National Statistical Committee of the Republic of Belarus on the number of deceased persons and estimated data (mortality rates) of the «Gomel Regional Clinical Hospital» organizational and methodological department on infectious and parasitic diseases for 2009-2019 were used. The mortality rates of the Gomel region population from infectious and parasitic diseases were analyzed using epidemiological and statistical methods of information processing. To assess the statistical significance of differences, Student's t-test was used. During the analyzed period in the Gomel region, there was a statistically significant trend towards a decrease in the mortality rate of the population from infectious and parasitic diseases, among both urban and rural residents. A statistically significant downward trend in mortality rates was also noted among people of working age. Among the population older than working age, there was a statistically significant decrease in mortality from 2009 to 2015 and a statistically significant increase from 2016 to 2019. In order to further reduce mortality from infectious and parasitic diseases, the main emphasis should be placed on two aspects: fostering the population's commitment to preventive measures and improving the quality of medical care for the population.
In stochastic modeling of infectious diseases, it has been established that variations in infectivity affect the probability of a major outbreak, but not the shape of the curves during a major outbreak, which is predicted by deterministic models [Diekmann et al., 2012]. However, such conclusions are derived under idealized assumptions, such as the population size tending to infinity and the individual degree of infectivity depending only on variations in the infectious period. In this paper we show that the same conclusions hold true in a finite population representing a medium-size city, where the degree of infectivity is determined by the offspring distribution, which we try to make as realistic as possible for SARS-CoV-2. In particular, we consider distributions with fat tails, to incorporate the existence of super-spreaders. We also provide new theoretical results on the convergence of stochastic models, which allow us to incorporate any offspring distribution with a finite variance.
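The offspring-distribution setup can be simulated directly. Below is an illustrative generation-based branching epidemic in a finite population with a negative-binomial (fat-tailed) offspring law, estimating the probability of a major outbreak; the parameters R0 and k are made up for illustration, not the paper's SARS-CoV-2 fit.

```python
import numpy as np

rng = np.random.default_rng(1)

# Offspring distribution: negative binomial with mean R0 and dispersion k.
# Small k produces a fat tail (super-spreaders). Illustrative values only.
R0, k = 2.5, 0.1
n_pop, n_runs, threshold = 10_000, 1_000, 500

def final_size(rng):
    """One generation-based epidemic in a finite population of n_pop."""
    susceptible, infected, total = n_pop - 1, 1, 1
    while infected > 0 and susceptible > 0:
        # Each infectious individual draws an offspring count; the
        # parameterization (k, k/(R0+k)) gives mean R0.
        contacts = rng.negative_binomial(k, k / (R0 + k), size=infected).sum()
        # Each contact lands on a still-susceptible individual w.p. S/N.
        new = min(rng.binomial(contacts, susceptible / n_pop), susceptible)
        susceptible -= new
        infected = new
        total += new
    return total

sizes = [final_size(rng) for _ in range(n_runs)]
prob_major = sum(s > threshold for s in sizes) / n_runs
print(round(prob_major, 3))
```

With a fat-tailed offspring law, most introductions die out quickly (the index case often infects nobody), so the major-outbreak probability is far below 1 even though R0 > 1, which is precisely the effect of infectivity variation on outbreak probability discussed above.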
Hanung Adi Nugroho, Rizki Nurfauzi, E. Elsa Herdiana Murhandarwati
et al.
Indonesia has the second-highest number of malaria cases in Southeast Asia. Malaria parasite semantic segmentation based on a deep learning approach is an alternative that reduces the limitations of traditional methods. However, the main problem with semantic segmentation arises because large parasites dominate while tiny parasites are suppressed. In addition, the amount and variance of data strongly influence how well such models can be established. In this study, we make two contributions. First, we collected 559 microscopic images of thin blood smears containing 691 malaria parasites. The dataset, named PlasmoID, comes mostly from rural Indonesia and provides ground truth for parasite detection and segmentation purposes. Second, we propose a malaria parasite segmentation and detection scheme that combines Faster RCNN with a semantic segmentation technique. The proposed scheme was evaluated on the PlasmoID dataset and compared with recent semantic segmentation techniques, namely UNet, ResFCN-18, DeepLabV3, DeepLabV3plus, and ResUNet-18. The results show that our proposed scheme improves malaria parasite segmentation and detection performance compared to the original semantic segmentation techniques.