Objective: Very low birth weight (VLBW) infants (those with birth weights <1500 g) account for only 1.2% of births but 46% of infant deaths. Large improvements in neonatal technology over the last 2 decades have significantly improved survival prospects for infants with low birth weights, but at a high cost. Due largely to a lack of data, the costs of medical care during the period in which infant mortality is measured (the first year of life), as well as the cost-effectiveness of that care for VLBW infants, have not been quantified. Despite this, public policies both to provide insurance coverage for their care and to deny payment for their treatment have been proposed or implemented on cost-effectiveness grounds. Patients: The study includes all VLBW single live births in the state of California during 1986 and 1987 that were continuously eligible (through traditional channels) for the state’s Medicaid program. Main Outcome Measures: Treatment costs were measured for all medical care received during the first year of life, including all inpatient and outpatient care. The cost-effectiveness of care is measured as aggregate treatment costs for all singleton VLBW live births divided by the number of first-year survivors. Results: Average treatment costs per first-year survivor vary sharply with birth weight. For infants >750 g, significant gains can accrue from even a small shift in the birth weight distribution: a shift of 250 g at birth saves an average of $12,000 to $16,000 in first-year medical costs, and a shift of 500 g generates $28,000 in savings. However, there is a threshold effect on birth weight. For infants <750 g, increases in birth weight may increase medical expenditures; for instance, a shift in birth weight to the 750 to 999 g range increases costs by $29,000.
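The cost-effectiveness measure defined in this abstract is a simple ratio: aggregate first-year treatment costs for a birth-weight group divided by the number of first-year survivors in that group. As a minimal sketch (the figures below are purely hypothetical placeholders, not values from the study):

```python
def cost_per_survivor(total_costs: float, survivors: int) -> float:
    """Aggregate first-year treatment costs divided by first-year survivors.

    This mirrors the abstract's cost-effectiveness definition; inputs here
    are illustrative, not study data.
    """
    if survivors <= 0:
        raise ValueError("ratio is undefined when there are no survivors")
    return total_costs / survivors

# Hypothetical example: $1,000,000 in aggregate costs, 10 survivors
print(cost_per_survivor(1_000_000, 10))  # 100000.0 per survivor
```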
ABSTRACT We investigated the presence of viral DNA and RNA in cutaneous squamous cell carcinoma (cSCC) tumor and normal tissues from nine individuals with a history of hematopoietic stem cell transplantation (HCT). Microbiome quantification through DNA and RNA sequencing (RNA-seq) revealed the presence of 18 viruses in both tumor and normal tissues. DNA sequencing (DNA-seq) identified Torque teno virus, Saimiriine herpesvirus 1, Merkel cell polyomavirus, Human parvovirus B19, Human gammaherpesvirus-4, Human herpesvirus-6, and others. RNA-seq revealed additional viruses such as Tobamovirus, Pinus nigra virus, Orthohepadnavirus, Human papillomavirus-5, Human herpesvirus-7, Human gammaherpesvirus-4, Gammaretrovirus, and others. Notably, DNA-seq indicated that tumor samples exhibited low levels of Escherichia virus in three out of nine subjects and elevated levels of Human gammaherpesvirus-4 in one subject, while normal samples frequently contained Gammaretrovirus and occasionally Escherichia virus. A comparative analysis using both DNA- and RNA-seq captured three common viruses: Abelson murine leukemia virus, Murine type C retrovirus, and Human gammaherpesvirus-4. These findings were corroborated by an independent data set, supporting the reliability of the viral detection methods utilized. The study provides insights into the viral landscape in post-HCT patients, emphasizing the need for comprehensive viral monitoring in this vulnerable population. IMPORTANCE This study is important because it explores the potential role of viruses in the development of cSCC in individuals who have undergone allogeneic HCT. cSCC is common in this population, particularly in those with chronic graft-versus-host disease on long-term immunosuppression. By using advanced metagenomic and metatranscriptomic next-generation sequencing, we aimed to identify viral pathogens present in tumor and adjacent normal tissue.
The results could lead to targeted preventive or therapeutic interventions for these high-risk people, potentially improving their outcomes and management of cSCC.
Nancy Keller, Julian Midgley, Ehtesham Khalid
et al.
Abstract Background Sphingosine-1-phosphate lyase insufficiency syndrome (SPLIS) is a recently recognized inborn error of metabolism associated with steroid-resistant nephrotic syndrome as well as adrenal insufficiency and immunological, neurological, and skin manifestations. SPLIS is caused by inactivating mutations in SGPL1, encoding the pyridoxal 5′-phosphate-dependent enzyme sphingosine-1-phosphate lyase, which catalyzes the final step of sphingolipid metabolism. Some SPLIS patients have undergone kidney transplantation, and others have been treated with vitamin B6 supplementation. In addition, targeted therapies including gene therapy are in preclinical development. In anticipation of clinical trials, it will be essential to characterize the full spectrum and natural history of SPLIS. We performed a retrospective analysis of 76 patients in whom the diagnosis of SPLIS was established in a proband with at least one suggestive finding and biallelic SGPL1 variants identified by molecular genetic testing. The main objective of the study was to identify factors influencing survival in SPLIS subjects. Results Overall survival at last report was 50%. Major influences on survival included: (1) age and organ involvement at first presentation; (2) receiving a kidney transplant; and (3) SGPL1 genotype. Among 48 SPLIS patients with nephropathy who had not received a kidney transplant, two clinical subgroups were distinguished. Of children diagnosed with SPLIS nephropathy before age one (n = 30), less than 30% were alive 2 years after diagnosis, and 17% were living at last report. Among those diagnosed at or after age one (n = 18), ~70% were alive 2 years after diagnosis, and 72% were living at time of last report. SPLIS patients homozygous for the SPL R222Q variant survived longer than patients with other genotypes. Kidney transplantation significantly extended survival.
Conclusion Our results demonstrate that SPLIS is a phenotypically heterogeneous condition. We find that patients diagnosed with SPLIS nephropathy in the first year of life and patients presenting with prenatal findings represent two high-risk subgroups, whereas patients harboring the R222Q SGPL1 variant fare better than the rest. Time to progression from onset of proteinuria to end-stage kidney disease varies from less than one month to five years, and kidney transplantation may be lifesaving.
Andrew D. Leavitt, Johnny Mahlangu, Priyanka Raheja
et al.
Background: Valoctocogene roxaparvovec, an adeno-associated virus-mediated gene therapy for severe hemophilia A, enables endogenous factor (F)VIII expression and provides bleed protection. Objectives: Determine valoctocogene roxaparvovec durability, efficacy, and safety 4 years after treatment. Methods: In the phase 3 GENEr8-1 trial, 134 adult male persons with severe hemophilia A without inhibitors and previously using FVIII prophylaxis received a 6 × 10¹³ vg/kg infusion of valoctocogene roxaparvovec. Efficacy endpoints included annualized bleed rate, annualized FVIII infusion rate, FVIII activity, and the Haemophilia-Specific Quality of Life Questionnaire for Adults. Adverse events and immunosuppressant use were assessed. Change from baseline was assessed after participants discontinued prophylaxis (scheduled for week 4). Results: Median follow-up was 214.3 weeks; 2 participants discontinued since the previous data cutoff. Declines from baseline in mean treated annualized bleed rate (−82.6%; P < .0001) and annualized FVIII infusion rate (−95.5%; P < .0001) were maintained from previous years in the primary analysis population of 112 participants who enrolled from a noninterventional study. During year 4, 81 of 110 rollover participants experienced 0 treated bleeds. Week 208 mean and median chromogenic FVIII activity were 16.1 IU/dL and 6.7 IU/dL, respectively, in 130 modified intention-to-treat participants. Seven participants resumed prophylaxis since the previous data cutoff. Mean change from baseline to week 208 in Haemophilia-Specific Quality of Life Questionnaire for Adults Total Score (P < .0001) remained clinically meaningful for modified intention-to-treat participants. Alanine aminotransferase elevation was the most common adverse event during year 4 (56/131 participants); none required immunosuppressants.
Conclusion: Valoctocogene roxaparvovec provides persistent FVIII expression, hemostatic control, and health-related quality of life improvements with no new safety signals.
Abstract Study design Systematic review of randomized controlled trials. Objectives With the increasing incidence of back pain among children and its untold implications for their future, back education delivered in an effective way would be indicated; however, the literature appears unsettled. This study aims to review the available literature to determine the effect of school-based back education in preventing and managing low back pain in school children. Methods Randomized controlled trials carried out on elementary and secondary school children aged 6 to 18 years and published in the English language were included. Back education taught in hospitals or other settings was excluded. The primary outcome was back pain prevalence, and secondary outcomes were drawn from the characteristics of the selected studies, which include back behavior, knowledge, postural habits, physical activity, fear-avoidance beliefs, backpack carriage, pain intensity, skills, and self-efficacy. Databases searched were PEDro, HINARI, PubMed, Cochrane, and Google Scholar. Available studies from 2000 to March 2022 were retrieved. Quality of studies was assessed using the PEDro scale. Obtained studies were descriptively analyzed. Results A total of 8420 studies were retrieved and 8 studies (with 1239 participants) were included in this review. Four studies each assessed back knowledge and back behavior, and two assessed back pain prevalence. There were improvements in back knowledge and back behavior, but the effectiveness of back care education on back pain prevalence was not conclusive. Forms of education used involved the indirect method of conditioning the environment and the direct method, which made use of theory, practical lessons, and educational books and materials. Conclusion Back care education programmes in schools are effective in improving back care knowledge and behavior and in reducing low back pain frequency. Reduction in back pain prevalence is not conclusive.
Back care education could be incorporated as part of schools’ education programmes. Limitations include the exclusion of non-English-language studies and inconsistent outcome measures. Funding source None. Registration This review protocol was registered on the International Platform of Registered Systematic Review and Meta-analysis Protocols (INPLASY), registration number INPLASY202310044, DOI https://doi.org/10.37766/inplasy2023.1.0044
Larisa S. Kruglova, Anna G. Stenko, Lyubov A. Rubtsova
et al.
Background. Post-burn scars are common among pediatric patients. Pathological scarring is a clear indication for conservative or surgical management in pediatric patients, who continue to grow and develop after the resolution of burn injuries. Such lesions can significantly reduce patients' quality of life and, moreover, cause significant functional and aesthetic discomfort. Clinical case descriptions. The results of observation of two children (aged 2 years 7 months and 12 years) with developing post-burn scars are presented. A successful management method using a physiotherapeutic complex (including monopolar radiofrequency treatment combined with ultrasound therapy, photodynamic therapy, and close-focus X-ray therapy) is described. Conclusion. Modern trends in post-burn scar management are based on the timely implementation of effective and safe methods at early stages of rehabilitation and tissue restoration after burn injury to prevent pathological scarring and achieve control over its activity.
Advances in genetic research promise great strides in the diagnosis and treatment of many childhood diseases. However, emerging genetic technology often enables testing and screening before the development of definitive treatment or preventive measures. In these circumstances, careful consideration must be given to testing and screening of children to ensure that use of this technology promotes the best interest of the child. This statement reviews considerations for the use of genetic technology for newborn screening, carrier testing, and testing for susceptibility to late-onset conditions. Recommendations are made promoting informed participation by parents for newborn screening and limited use of carrier testing and testing for late-onset conditions in the pediatric population. Additional research and education in this developing area of medicine are encouraged.
Mansour S Aljabry, Fahad Alabbas, Ghaleb Elyamany
et al.
BACKGROUND: Rare bleeding disorders (RBDs) encompass deficiencies of one or more of the clotting factors FXIII, FXI, FX, FVII, FV, FII, and FI, leading to bleeding disorders with variable presentations and outcomes ranging from none or minimal to life-threatening events. RBDs are still underdiagnosed and underreported, especially in the Saudi population, which has a high prevalence of consanguinity.
OBJECTIVES: The study aimed to determine the frequency of RBDs, grade their bleeding severity, and assess the clinical manifestations and management of RBDs in tertiary Saudi Arabian hospitals.
DESIGN AND SETTINGS: This retrospective study of RBDs describes the clinicopathological features of cases referred to Prince Sultan Military Medical City and King Khaled University Hospital in Riyadh, Saudi Arabia, from September 2018 to September 2021. Any patient who had already been diagnosed with or was suspected of having an RBD was enrolled in the study.
PATIENTS AND METHODS: Patients' medical records were reviewed for demographic data, clinical presentations, bleeding and family history, consanguinity, treatment outcomes, and molecular testing. Samples were run in specialized coagulation laboratories. Patients with liver dysfunction or acquired factor deficiency were excluded. Patients were categorized into four groups according to the severity of bleeding episodes: asymptomatic, Grade I, Grade II, and Grade III.
RESULTS: A total of 26 cases with RBDs were identified during the study period. Most of the included patients were male (15; 57.7%) and pediatric (<14 years; 14; 53.8%). FVII deficiency was the most common, encountered in 9 (35%) patients, followed by FXIII in 5 (19%), FXI in 4 (15%), FX in 3 (11.5%), FV in 3 (11.5%), and combined factor deficiency in 2 (8%) patients. Seventeen (65.4%) RBD patients presented with bleeding manifestations, either Grade I (9%), Grade II (39%), or Grade III (15%), whereas 47% were asymptomatic.
CONCLUSION: The study emphasizes the importance of establishing a national registry of RBDs in Saudi Arabia and the need for further genetic studies to clarify genotype/phenotype relationships.
Diseases of the circulatory (Cardiovascular) system
Julia R. Varshavsky, Swati D. G. Rayasam, Jennifer B. Sass
et al.
Abstract A key element of risk assessment is accounting for the full range of variability in response to environmental exposures. Default dose-response methods typically assume a 10-fold difference in response to chemical exposures between average (healthy) and susceptible humans, despite evidence of wider variability. Experts and authoritative bodies support using advanced techniques to better account for human variability due to factors such as in utero or early life exposure and exposure to multiple environmental, social, and economic stressors. This review describes: (1) sources of human variability and susceptibility in dose-response assessment; (2) existing US frameworks for addressing response variability in risk assessment; (3) key scientific inadequacies necessitating updated methods; (4) improved approaches and opportunities for better use of science; and (5) specific and quantitative recommendations to address evidence and policy needs. Current default adjustment factors do not sufficiently capture human variability in dose-response and thus are inadequate to protect the entire population. Susceptible groups are not appropriately protected under current regulatory guidelines. Emerging tools and data sources that better account for human variability and susceptibility include probabilistic methods, genetically diverse in vivo and in vitro models, and the use of human data to capture underlying risk and/or assess combined effects from chemical and non-chemical stressors. We recommend using updated methods and data to improve consideration of human variability and susceptibility in risk assessment, including the use of increased default human variability factors and separate adjustment factors for capturing age/life stage of development and exposure to multiple chemical and non-chemical stressors.
Updated methods would result in greater transparency and protection for susceptible groups, including children, infants, people who are pregnant or nursing, people with disabilities, and those burdened by additional environmental exposures and/or social factors such as poverty and racism.
Industrial medicine. Industrial hygiene, Public aspects of medicine
Aneta D. Krakowski, Peter Szatmari, Peter Szatmari
et al.
Background: Many phenotypic studies have estimated the degree of comorbidity between Autism Spectrum Disorder (ASD) and Attention Deficit Hyperactivity Disorder (ADHD), but few have examined the latent, or unobserved, structure of combined ASD and ADHD symptoms. This is an important prerequisite toward better understanding the overlap between ASD and ADHD. Methods: We conducted a scoping review of studies that examined the factor or latent class structure of ASD and ADHD symptoms within the same clinical or general population sample. Results: Eight studies met final inclusion criteria. Four factor analysis studies found that ASD and ADHD domains loaded separately, and one found that some ASD and ADHD domains loaded together. In the three latent class studies, there was evidence of profiles with high levels of co-occurring ASD and ADHD symptoms. Conclusions: Our scoping review provides some evidence of phenotypic overlap between ASD and ADHD at the latent, or unobserved, level, particularly when using a “person-centered” (latent class analysis) vs. a “variable-centered” (factor analysis) approach.
Massimiliano Leigheb, Alessandro de Sire, Matteo Colangelo
et al.
Sarcopenia is a skeletal muscle disorder characterized by reduced muscle mass, strength, and performance. Muscle ultrasound can be helpful in assessing muscle mass, quality, and architecture, and thus possibly useful for diagnosing or screening sarcopenia. The objective of this study was to evaluate the reliability of ultrasound assessment of the tibialis anterior muscle in sarcopenia diagnosis. We included subjects undergoing total or partial hip replacement, comparing measures with a healthy control group. We measured the following parameters: tibialis anterior muscle thickness, echogenicity, architecture, stiffness, skeletal muscle index (SMI), hand grip strength, and sarcopenia-related quality of life evaluated through the SarQoL questionnaire. We included 33 participants with a mean age of 54.97 ± 23.91 years. In the study group we found reduced tibialis anterior muscle thickness compared to the healthy control group (19.49 ± 4.92 vs. 28.94 ± 3.63 mm, p < 0.05), with significant correlations with SarQoL values (r = 0.80, p < 0.05), dynamometer hand strength (r = 0.72, p < 0.05), and SMI (r = 0.76, p < 0.05). Moreover, we found reduced stiffness (32.21 ± 12.31 vs. 27.07 ± 8.04 kPa, p < 0.05). AUC measures of ROC curves were 0.89 for predicting reduced muscle strength and 0.97 for predicting reduced SMI using tibialis anterior muscle thickness, while they were 0.73 and 0.85, respectively, for muscle stiffness. Our findings showed that ultrasound assessment of the tibialis anterior muscle might be considered a reliable measurement tool to evaluate sarcopenia.