B. David, J. Wolfender, D. Dias
Results for "Pharmaceutical industry"
Showing 20 of ~5,211,158 results · from arXiv, DOAJ, CrossRef, Semantic Scholar
Silvia Lomartire, A. Gonçalves
Nowadays, seaweeds are widely used in biotechnological applications. Owing to the variety of bioactive compounds in their composition, species of the phylum Ochrophyta (class Phaeophyceae) and of the phyla Rhodophyta and Chlorophyta are valuable to the food, cosmetic, pharmaceutical and nutraceutical industries. Seaweeds have been consumed as whole foods since ancient times and used to treat several diseases, even though the mechanisms of action were unknown. Over recent decades, research has demonstrated that these unique compounds have beneficial properties for human health. Each compound has particular properties (e.g., antioxidant, antimicrobial or antiviral activity) that can be exploited to enhance human health. Polysaccharides extracted from seaweeds are already used in the pharmaceutical industry, with the aim of replacing synthetic compounds with components of natural origin. This review aims at a better understanding of recent uses of algae in drug development, with a focus on replacing synthetic compounds, and of the multiple biotechnological applications that underpin seaweeds' industrial potential. Further research is needed to better understand the mechanisms of action of seaweed compounds and to broaden the use of seaweeds in pharmaceutical companies and other applications, with the ultimate goal of producing sustainable and healthier products.
Baoyang Ding
Abstract The exploitation of the emerging technologies of Pharma Industry 4.0 facilitates sustainable value creation, leads to a more agile, smart and personalised pharma industry and thereby, in the long run, enables pharma companies to obtain competitive advantages. A more sustainable pharmaceutical supply chain (PSC) should be implemented to match future operations and management of pharmaceutical products across the entire life cycle. The main purpose of this study is to identify the potential sustainability barriers of the PSC and to investigate how Industry 4.0 can be applied in sustainable PSC paradigms. This paper systematically reviews 33 relevant articles concerning the sustainable PSC and Industry 4.0, taken from peer-reviewed academic journals over a decade (2008–2018). Based on content analysis, we find that the major challenges inhibiting the inclusion of sustainability in PSCs are: high costs and time consumption, limited expertise and training, enforcement of regulations, a paucity of business incentives, ineffective collaboration and coordination across the PSC, a lack of objective benchmarks, and poor end-customer awareness. The technologies and innovations based on Industry 4.0 can address these barriers in four respects: enhancing the flexibility of the PSC for patient-centric drug supplies; improving the effectiveness of coordination and communication across different entities within the PSC; mitigating waste and pollution at different stages; and enabling a more autonomous decision-making process for supply chain managers. Our analysis reveals that future research should focus on: cross-linking coordination and cooperation, eco-friendly end-of-life product disposal, proactive product recall management, new benchmarks and measures of sustainable performance, new regulation system design, and the effects of incentives for sustainable activities.
Ioannis Balatsos, Athanasios Liakos, Panagiotis Karakostas et al.
This paper develops a data-driven, constraint-based optimization framework for a complex industrial job shop scheduling problem variant in pharmaceutical manufacturing. The formulation captures fixed routings and designated machines, explicit resource calendars with weekends and planned maintenance, and campaign sequencing through sequence-dependent cleaning times derived from site tables. The model is implemented with an open source constraint solver and evaluated on deterministic snapshots from a solid oral dosage facility under three objective formulations: makespan, makespan plus total tardiness, and makespan plus average tardiness. On three industrial instances of increasing size (10, 30, and 84 jobs) the proposed schedules dominate reference plans that solve a simplified variant without the added site rules. Makespan reductions reach \(88.1\%\), \(77.6\%\), and \(54.9\%\) and total tardiness reductions reach \(72.1\%\), \(58.7\%\), and \(18.2\%\), respectively. The composite objectives further decrease late job counts with negligible makespan change on the smaller instances and a modest increase on the largest instance. Optimality is proven on the small case, with relative gaps of \(0.77\%\) and \(14.92\%\) on the medium and large cases under a fixed time limit. The results show that a compact constraint programming formulation can deliver feasible, transparent schedules that respect site rules while improving adherence to due dates on real industrial data.
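The abstract's core ingredients — fixed job data, sequence-dependent cleaning times, and a composite makespan-plus-tardiness objective — can be illustrated on a toy single-machine instance. All job data below are hypothetical, and the paper uses a constraint solver rather than the brute-force enumeration sketched here:

```python
from itertools import permutations

# Hypothetical single-machine instance: processing times, due dates, and
# sequence-dependent cleaning times between consecutive campaigns.
proc = {"A": 4, "B": 3, "C": 6}
due = {"A": 8, "B": 5, "C": 15}
clean = {("A", "B"): 1, ("A", "C"): 2, ("B", "A"): 1,
         ("B", "C"): 3, ("C", "A"): 2, ("C", "B"): 1}

def evaluate(seq):
    """Return (makespan, total tardiness) for a job sequence."""
    t, tardiness, prev = 0, 0, None
    for job in seq:
        if prev is not None:
            t += clean[(prev, job)]      # sequence-dependent cleaning time
        t += proc[job]
        tardiness += max(0, t - due[job])
        prev = job
    return t, tardiness

# Composite objective: makespan plus total tardiness, over all sequences.
best = min(permutations(proc), key=lambda s: sum(evaluate(s)))
```

For three jobs, exhaustive enumeration is trivial; the point of a constraint programming formulation is that the same feasibility logic (calendars, designated machines, cleaning tables) scales to the paper's 84-job industrial instances.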
Muhammad Abdurrahman Munir, David Fernando, Imam Shofid Alaih et al.
This study investigates the structural, thermal, morphological, and electrical properties of bio-based polyurethane (PU) composites reinforced with carbon nanotubes (CNTs). PU was synthesized using methylene diphenyl diisocyanate (MDI) and palm kernel oil-derived polyol, while CNTs were incorporated in varying concentrations (1 %, 2 %, 5 %, and 10 %) via a sonication-assisted solution casting method. The chemical structure and successful incorporation of CNTs were confirmed using Fourier Transform Infrared Spectroscopy (FTIR), revealing the preservation of the PU backbone and the presence of non-covalent interactions such as hydrogen bonding. Principal Component Analysis (PCA) of FTIR data demonstrated effective differentiation between PU and PU/CNT composites based on subtle changes in the fingerprint region. Field Emission Scanning Electron Microscopy (FESEM) confirmed a uniform and well-integrated dispersion of CNTs in the PU matrix, with minimal aggregation, supporting effective nanofiller incorporation. Thermal analyses using Thermogravimetric Analysis (TGA) and Differential Scanning Calorimetry (DSC) revealed that CNTs improved the thermal stability, delayed decomposition onset, and increased residual char content, particularly at 5–10 wt% CNT. These enhancements were attributed to CNTs' barrier effect and high thermal conductivity. Electrochemical Impedance Spectroscopy (EIS) further demonstrated a significant reduction in bulk resistance with increasing CNT concentration, confirming enhanced electrical conductivity and the formation of conductive networks. The PU/CNT composites exhibited characteristic impedance behavior in line with the Randles circuit model, supporting their potential for electrochemical applications. 
Overall, the results indicate that CNT-reinforced PU composites possess enhanced thermal, structural, and electrochemical properties, making them promising candidates for flexible electronics, electrochemical sensors, and anti-corrosion coatings.
Ramona Rubini, Siavash Khodakarami, Aniruddha Bora et al.
Accurate time-series forecasting for complex physical systems is the backbone of modern industrial monitoring and control. While deep learning models excel at capturing complex dynamics, their deployment is currently limited by physical inconsistency and a lack of robustness, which constrains their reliability in regulated environments. We introduce process-informed forecasting (PIF) models for temperature in pharmaceutical lyophilization. We investigate a wide range of models, from classical ones such as the Autoregressive Integrated Moving Average (ARIMA) and Exponential Smoothing (ETS) models to modern deep learning architectures, including Kolmogorov-Arnold Networks (KANs). We compare three loss function formulations that integrate a process-informed trajectory prior: a fixed-weight loss, a dynamic uncertainty-based loss, and a Residual-Based Attention (RBA) mechanism. We evaluate all models not only for accuracy and physical consistency but also for robustness to sensor noise. Furthermore, we test the practical generalizability of the best model in a transfer learning scenario on a new process. Our results show that PIF models outperform their data-driven counterparts in accuracy, physical plausibility and noise resilience. This work provides a roadmap for developing reliable and generalizable forecasting solutions for critical applications in the pharmaceutical manufacturing landscape.
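The fixed-weight variant of a process-informed loss is the simplest of the three formulations mentioned: a data-fit term plus a weighted penalty for straying from the process prior. A minimal NumPy sketch, where the weight `lam` and the arrays are illustrative and not taken from the paper:

```python
import numpy as np

def pif_loss(pred, obs, prior, lam=0.5):
    """Fixed-weight process-informed loss: fit the sensor data while
    penalizing departures from a process-informed trajectory prior."""
    data_term = np.mean((pred - obs) ** 2)     # agreement with measurements
    prior_term = np.mean((pred - prior) ** 2)  # agreement with the process prior
    return data_term + lam * prior_term
```

The dynamic uncertainty-based and residual-attention variants mentioned in the abstract replace the constant `lam` with weights that adapt during training.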
Viviana Schisa, Matteo Farnè
Climate change is increasingly recognized as a driver of health-related outcomes, yet its impact on pharmaceutical demand remains largely understudied. As environmental conditions evolve and extreme weather events intensify, anticipating their influence on medical needs is essential for designing resilient healthcare systems. This study examines the relationship between climate variability and the weekly demand for respiratory prescription pharmaceuticals in Greece, based on a dataset spanning seven and a half years (390 weeks). Granger causality spectra are employed to explore potential causal relationships. Following variable selection, four forecasting models are implemented: Prophet, a Vector Autoregressive model with exogenous variables (VARX), Random Forest with Moving Block Bootstrap (MBB-RF), and Long Short-Term Memory (LSTM) networks. The MBB-RF model achieves the best performance in relative error metrics while providing robust insights through variable importance rankings. The LSTM model leads on most of the remaining metrics, highlighting its ability to capture nonlinear dependencies. The VARX model, which includes Prophet-based exogenous inputs, balances interpretability and accuracy, although it is slightly less competitive in overall predictive performance. These findings underscore the added value of climate-sensitive variables in modeling pharmaceutical demand and provide a data-driven foundation for adaptive strategies in healthcare planning under changing environmental conditions.
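Of the four models, the Moving Block Bootstrap component is the easiest to sketch: rather than resampling individual observations, one resamples contiguous blocks so that short-range temporal dependence survives in each replicate. A minimal version, where the block length and seed are arbitrary choices:

```python
import numpy as np

def moving_block_bootstrap(series, block_len, rng=None):
    """Resample a time series by concatenating randomly drawn overlapping
    blocks, preserving short-range dependence within each block."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = len(series)
    n_blocks = -(-n // block_len)                    # ceil(n / block_len)
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    blocks = [series[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]                # trim to original length
```

In an MBB-RF setup, each bootstrap replicate would be used to fit one Random Forest, and forecasts are aggregated across replicates.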
Guojing Li, Zhihao Du, Xin Gao et al.
This paper aims to improve the synthetic process of molnupiravir based on previously reported synthetic routes. The route begins with uracil (ML-2), which is protected with an isopropyl group to yield ML-3 (Step 1), followed by an esterification and a triazolation reaction (Steps 2 and 3) to produce ML-5, which, via a hydroxylation reaction and deprotection (Steps 4 and 5), gives the target product ML-1. Nuclear magnetic resonance (1H NMR) and mass spectra were used for chemical structure identification. The main improvements are: (1) replacing the separate addition of acetone and concentrated H2SO4 with 2,2-dimethoxypropane and catalytic p-toluenesulfonic acid monohydrate in Step 1, simplifying the workup and reducing the amount of reaction solvent; (2) optimizing the synthesis conditions of ML-5, reducing the formation of impurities and improving the purity of the crude product from 43.12% to 85.21%; and (3) isolating three impurities, two of which are new compounds. This article lays a foundation for obtaining molnupiravir with controllable quality via a stable process for the treatment of coronavirus disease 2019.
Seyed Amin Tabatabaei, Sarah Fancher, Michael Parsons et al.
We address the task of hierarchical multi-label classification (HMC) of scientific documents at an industrial scale, where hundreds of thousands of documents must be classified across thousands of dynamic labels. The rapid growth of scientific publications necessitates scalable and efficient methods for classification, further complicated by the evolving nature of taxonomies--where new categories are introduced, existing ones are merged, and outdated ones are deprecated. Traditional machine learning approaches, which require costly retraining with each taxonomy update, become impractical due to the high overhead of labelled data collection and model adaptation. Large Language Models (LLMs) have demonstrated great potential in complex tasks such as multi-label classification. However, applying them to large and dynamic taxonomies presents unique challenges as the vast number of labels can exceed LLMs' input limits. In this paper, we present novel methods that combine the strengths of LLMs with dense retrieval techniques to overcome these challenges. Our approach avoids retraining by leveraging zero-shot HMC for real-time label assignment. We evaluate the effectiveness of our methods on SSRN, a large repository of preprints spanning multiple disciplines, and demonstrate significant improvements in both classification accuracy and cost-efficiency. By developing a tailored evaluation framework for dynamic taxonomies and publicly releasing our code, this research provides critical insights into applying LLMs for document classification, where the number of classes corresponds to the number of nodes in a large taxonomy, at an industrial scale.
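The retrieval step described above — shortlisting candidate labels so the prompt stays within an LLM's input limit — reduces to a nearest-neighbour search over label embeddings. A toy sketch with made-up 3-d vectors standing in for real encoder embeddings (label names and dimensions are illustrative only):

```python
import numpy as np

def top_k_labels(doc_vec, label_vecs, labels, k=2):
    """Shortlist the k taxonomy labels with highest cosine similarity
    to a document embedding, for downstream zero-shot LLM assignment."""
    a = doc_vec / np.linalg.norm(doc_vec)
    b = label_vecs / np.linalg.norm(label_vecs, axis=1, keepdims=True)
    order = np.argsort(-(b @ a))[:k]     # indices of most similar labels
    return [labels[i] for i in order]

labels = ["pharmacology", "astrophysics", "medicinal chemistry"]
label_vecs = np.array([[1.0, 0.1, 0.0], [0.0, 1.0, 0.0], [0.8, 0.3, 0.1]])
doc_vec = np.array([0.9, 0.2, 0.0])
shortlist = top_k_labels(doc_vec, label_vecs, labels)
```

Because only label embeddings change when the taxonomy evolves, re-embedding new or merged categories avoids any model retraining, which is the abstract's central point.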
Shahin Mirshekari, Mohammadreza Moradi, Hossein Jafari et al.
This research employs Gaussian Process Regression (GPR) with an ensemble kernel, integrating Exponential Squared, Revised Matérn, and Rational Quadratic kernels to analyze pharmaceutical sales data. Bayesian optimization was used to identify optimal kernel weights: 0.76 for Exponential Squared, 0.21 for Revised Matérn, and 0.13 for Rational Quadratic. The ensemble kernel demonstrated superior performance in predictive accuracy, achieving an \( R^2 \) score near 1.0, and significantly lower values in Mean Squared Error (MSE), Mean Absolute Error (MAE), and Root Mean Squared Error (RMSE). These findings highlight the efficacy of ensemble kernels in GPR for predictive analytics in complex pharmaceutical sales datasets.
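In scikit-learn terms, such an ensemble kernel is a weighted sum of kernel objects. The sketch below uses the abstract's weights with synthetic data in place of the sales series; note two assumptions: "Exponential Squared" is taken to mean the RBF kernel, and "Revised Matérn" is approximated by the standard Matérn kernel.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern, RationalQuadratic

# Weighted kernel sum with the abstract's initial weights; the constant
# factors and length scales are refined during hyperparameter fitting.
kernel = 0.76 * RBF() + 0.21 * Matern(nu=1.5) + 0.13 * RationalQuadratic()

X = np.linspace(0, 10, 40).reshape(-1, 1)
y = np.sin(X).ravel()                    # synthetic stand-in for sales data
gpr = GaussianProcessRegressor(kernel=kernel, alpha=1e-6,
                               normalize_y=True).fit(X, y)
r2 = gpr.score(X, y)                     # in-sample R^2
```

A sum of kernels with different smoothness assumptions lets the GP capture both smooth trend and rougher local variation, which is the motivation for ensembling in the first place.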
Xinyi Zheng, Chen Wei, Shenao Wang et al.
The exponential growth of open-source package ecosystems, particularly NPM and PyPI, has led to an alarming increase in software supply chain poisoning attacks. Existing static analysis methods struggle with high false positive rates and are easily thwarted by obfuscation and dynamic code execution techniques. While dynamic analysis approaches offer improvements, they often suffer from capturing non-package behaviors and employing simplistic testing strategies that fail to trigger sophisticated malicious behaviors. To address these challenges, we present OSCAR, a robust dynamic code poisoning detection pipeline for NPM and PyPI ecosystems. OSCAR fully executes packages in a sandbox environment, employs fuzz testing on exported functions and classes, and implements aspect-based behavior monitoring with tailored API hook points. We evaluate OSCAR against six existing tools using a comprehensive benchmark dataset of real-world malicious and benign packages. OSCAR achieves an F1 score of 0.95 in NPM and 0.91 in PyPI, confirming that OSCAR is as effective as the current state-of-the-art technologies. Furthermore, for benign packages exhibiting characteristics typical of malicious packages, OSCAR reduces the false positive rate by an average of 32.06% in NPM (from 34.63% to 2.57%) and 39.87% in PyPI (from 41.10% to 1.23%), compared to other tools, significantly reducing the workload of manual reviews in real-world deployments. In cooperation with Ant Group, a leading financial technology company, we have deployed OSCAR on its NPM and PyPI mirrors since January 2023, identifying 10,404 malicious NPM packages and 1,235 malicious PyPI packages over 18 months. This work not only bridges the gap between academic research and industrial application in code poisoning detection but also provides a robust and practical solution that has been thoroughly tested in a real-world industrial setting.
Dennis Junger, Max Westing, Christopher P. Freitag et al.
Progressing digitalization and the increasing demand for and use of software are driving up energy and resource consumption from information and communication technologies (ICT). This raises the issue of sustainability in ICT, which increasingly includes the sustainability of software products themselves and the art of creating sustainable software. To this end, we conducted an analysis to gather and present existing literature on three research questions relating to the production of ecologically sustainable software ("Green Coding") and to provide orientation for stakeholders approaching the subject. We compile the approaches to Green Coding and Green Software Engineering (GSE) that have been published since 2010. Furthermore, we consider ways to integrate the findings into existing industrial processes and higher-education curricula to influence future development in an environmentally friendly way.
Naseela Pervez, Alexander J. Titus
Biotechnology Industry 5.0 is advancing with the integration of cutting-edge technologies like Machine Learning (ML), the Internet of Things (IoT), and cloud computing. It is no surprise that an industry that utilizes customer data and can alter customers' lives is the target of a variety of attacks. This chapter provides a perspective on how Machine Learning Security Operations (MLSecOps) can help secure biotechnology Industry 5.0. The chapter analyses the threats in biotechnology Industry 5.0 and how ML algorithms can help counter them in line with industry best practices. It explores the scope of MLSecOps in biotechnology Industry 5.0, highlighting how crucial it is to comply with current regulatory frameworks. With biotechnology Industry 5.0 developing innovative solutions in healthcare, supply chain management, biomanufacturing, the pharmaceuticals sector, and more, the chapter also discusses the MLSecOps best practices that industry and enterprises should follow while also considering ethical responsibilities. Overall, the chapter discusses how to integrate MLSecOps into the design, deployment, and regulation of processes in biotechnology Industry 5.0.
John Ting Zhi Zhang, Jeng Shiun Lim
The management of product quality in palm oil crystallisation poses a formidable challenge. Although various model-based optimisation control strategies have been widely applied, their effectiveness hinges on understanding the intricate and highly nonlinear dynamic behavior of crystallisation. Notably, existing research has predominantly focused on diverse applications, such as wastewater treatment, sugar cane crystallisation, and the pharmaceutical industry, leaving a notable research gap in the crystallisation processes specific to the palm oil industry. This research attempts to fill this gap by investigating the impact of an optimisation tool that combines an artificial neural network and a genetic algorithm (ANN-GA) to optimise the crystallisation recipe, specifically the cooling segments of palm oil, for three different cloud points of palm olein (CP 6, CP 8, and CP 10). The artificial neural network (ANN), which uses the Levenberg-Marquardt algorithm, serves as an internal model for predicting process output, whereas the genetic algorithm (GA) investigates a wide range of recipe combinations to maximise yield. Using MATLAB for optimisation, the ANN-GA approach goes through training, testing, and validation steps with industry-derived datasets. The results show root mean square errors (RMSE) of 0.8411 for CP 6, 0.4317 for CP 8, and 0.4105 for CP 10, indicating that the ANN is sensitive to dataset volume. The GA, used as an optimisation tool, then generates optimal input variables for industrial validation. Validation results reveal an enhanced yield of 63 % for CP 6 palm olein, 74 % for CP 8 palm olein, which is within the industrial range (66-76 %), and 77.26 % for CP 10 palm olein, which is within the range of 76-79 %. Overall, the ANN-GA technique is effective in predicting complicated systems such as palm olein and palm stearin crystallisation processes.
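The ANN-GA loop can be miniaturized to show the mechanics: a smooth surrogate function stands in for the trained ANN yield model, and a small elitist GA searches over two cooling-rate inputs. Every function and number here is made up for illustration; the paper's actual model is a Levenberg-Marquardt-trained ANN in MATLAB.

```python
import random

random.seed(42)

def surrogate_yield(r1, r2):
    """Made-up smooth yield surface standing in for the trained ANN;
    peak yield of 77% at cooling rates (1.5, 0.8)."""
    return 77 - (r1 - 1.5) ** 2 - (r2 - 0.8) ** 2

def evolve(pop_size=30, gens=40):
    """Elitist GA: keep the top half, refill with mutated recombinations."""
    pop = [(random.uniform(0, 3), random.uniform(0, 3)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda p: -surrogate_yield(*p))
        parents = pop[: pop_size // 2]
        children = [(random.choice(parents)[0] + random.gauss(0, 0.1),
                     random.choice(parents)[1] + random.gauss(0, 0.1))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=lambda p: surrogate_yield(*p))

best = evolve()
```

Because the top half of each generation is carried over unchanged, the best recipe found never degrades across generations, which is what makes this toy loop converge reliably.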
Ecevit Bilgili
The pharmaceutical and biotechnology industry continues to be one of the most important industry sectors for various reasons [...]
Faizan Khan, Chandra Shekhar, Tarak Mondal et al.
We show that rough particles studded with platinum nanoparticles can be fabricated straightforwardly, in a single step, at room temperature. These rough particles displayed good catalytic power (100% removal efficiency) against a model industrial dye (methylene blue) and a pharmaceutical residue (tetracycline) within a reasonable time scale. Further, we illustrate the effects of particle size, concentration, and contact patterns on the performance of the rough catalytic particles. Semi-batch conditions favoured the complete decomposition of tetracycline within 40 min, whereas batch-wise operation offered a good contacting pattern for methylene blue, yielding maximal output within 10 min. The kinetics of the heterogeneous catalytic process, modelled by Langmuir-Hinshelwood kinetics, predict that the methylene blue decomposition reaction induced by the rough particles follows pseudo-first-order kinetics. The rate constants for the reaction catalyzed by 0.6 and 1.0 µm-sized rough particles are 0.048 and 0.032 min^-1, respectively. Furthermore, we established a proof of concept using magnetically responsive rough particles for real-time applications, including decontamination and recovery of the catalyst particles via an externally applied magnetic field in one cycle. Our proposed method achieves near-100% degradation efficiency within 10 to 40 min at a minimal catalytic particle concentration of 200 ppm. Since the rough particles can be rendered superparamagnetic, they can be recovered and reused for several wastewater treatment cycles without incurring running costs.
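Pseudo-first-order kinetics means the pollutant concentration decays as C(t) = C0·e^(−kt), so removal fractions and target times follow directly from the reported rate constants:

```python
import math

def removal_fraction(k, t):
    """Fraction removed after time t (min) under pseudo-first-order
    decay C(t) = C0 * exp(-k * t), with k in min^-1."""
    return 1.0 - math.exp(-k * t)

# Time needed for 99% removal with the 0.6 um particles (k = 0.048 min^-1):
t99 = math.log(100) / 0.048
```

Note that a strict first-order decay never quite reaches 100%; the "near-100%" wording in the abstract is consistent with that asymptotic behaviour.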
Tingting Chen, Feng Chu, Jiantong Zhang et al.
The rapid growth of pharmaceutical refrigerated logistics poses sustainability challenges, including elevated costs, energy consumption, and resource inefficiency. Collaborating multiple depots can enhance logistics efficiency when standalone distribution centers have limited transport resources, i.e., refrigerated vehicles. However, the sustainable benefits and performance across different strategies remain unexplored. This study fills this research gap by addressing a refrigerated pharmaceutical routing problem. While many collaborative strategies prioritize economic and environmental benefits, our approach highlights a vital social indicator: maintaining vehicle flow equilibrium at each depot during collaboration. This ensures the stability of transport resources for all stakeholders, promoting sustainable collaborative logistics. The problem is formulated as a multi-depot vehicle routing problem with time windows (MDVRPTW). Three collaborative strategies using Clustering VRP (CLUVRP) and improved Open VRP (OVRP) are proposed and compared. We develop two approaches to address traditional OVRP limitations in ensuring vehicle flow equilibrium at each depot. Our models consider perishable pharmaceuticals and time-dependent travel speeds. Three hybrid heuristics based on Simulated Annealing and Variable Neighborhood Search (SAVNS) are proposed and evaluated for efficacy. Computational experiments and a case study demonstrate distinct sustainable benefits across various strategies, offering valuable insights for decision-makers in the refrigerated logistics market.
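The vehicle flow equilibrium condition highlighted above is easy to state concretely: across all routes, every depot must see as many vehicles arrive as depart, even when individual routes are "open" (start and end at different depots). A toy check with hypothetical routes:

```python
# Routes are hypothetical: (start_depot, [customers...], end_depot).
routes = [
    ("D1", ["c1", "c2"], "D1"),   # closed route, stays at D1
    ("D1", ["c3"], "D2"),         # open route: a vehicle migrates to D2
    ("D2", ["c4"], "D1"),         # compensating open route back to D1
]

def depot_flow(routes):
    """Net vehicle balance per depot: arrivals minus departures."""
    flow = {}
    for start, _, end in routes:
        flow[start] = flow.get(start, 0) - 1   # vehicle departs
        flow[end] = flow.get(end, 0) + 1       # vehicle arrives
    return flow

balanced = all(v == 0 for v in depot_flow(routes).values())
```

A plan satisfying this check leaves each depot's fleet size unchanged after collaboration, which is the social sustainability indicator the study emphasizes.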
Gisèle KOUAKOU SIRANSY
Background: The development of phytomedicines, or improved traditional medicines, in sub-Saharan Africa is meeting with growing success. In Côte d'Ivoire, various artisanal units manufacturing phytomedicines are developing, but their efficacy, safety and quality remain little evaluated. Rationale: Among all the pathologies affecting the sub-Saharan population, malaria occupies an important place, being the leading cause of parasitic infectious disease and the third leading cause of infectious disease. The health products of traditional healers remain little evaluated for efficacy, safety and quality. Researchers and academic staff within universities have taken up this work in several countries; in Côte d'Ivoire, however, no university has yet moved to production at the scale of a pilot industrial unit. Objective: The objective of this work was to select plants for the development of quality WHO category 2 antimalarial phytomedicines. Methodology: Plant selection considered species that had been the subject of research at universities in Côte d'Ivoire, and among these, studies evaluating the effect of aqueous extracts of the plants' aerial parts. Of these extracts, those showing the best inhibition of Plasmodium growth according to the Wilcox criteria were retained for the development of WHO category 2 phytomedicines. Pre-formulation and galenic formulation trials at laboratory scale made it possible to establish a suitable manufacturing process. Scale-up to pilot scale was then carried out to demonstrate the reproducibility of industrial manufacture of the dosage form developed. Results: The work of researchers at Ivorian universities identified 58 medicinal plants studied for antiplasmodial activity since 1996.
Of these plants, 38 had been studied as aqueous extracts, decoctions or infusions. Seven aqueous extracts with IC50 < 5 µg/ml were retained, among others. However, the majority of scientific studies on potentially antimalarial medicinal plants were carried out in in vitro models; few were carried out in vivo in murine models. The results of the pharmacological, formulation and pilot-scale transfer trials yielded capsules based on plant granules obtained by wet granulation. Conclusion: The scientific work of the universities of Côte d'Ivoire offers a wide range of medicinal plants with antimalarial potential for the design of quality WHO category 2 phytomedicines. Preliminary in vivo trials led to the granting of a patent.
Gaurab Aryal, Federico Ciliberto, Leland E. Farmer et al.
We propose a methodology to estimate the market value of pharmaceutical drugs. Our approach combines an event study with a model of discounted cash flows and uses stock market responses to drug development announcements to infer the values. We estimate that, on average, a successful drug is valued at \$1.62 billion, and its value at the discovery stage is \$64.3 million, with substantial heterogeneity across major diseases. Leveraging these estimates, we also determine the average drug development costs at various stages. Furthermore, we explore applying our estimates to design policies that support drug development through drug buyouts and cost-sharing agreements.
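The discounting logic behind the gap between the \$1.62 billion approval-stage value and the \$64.3 million discovery-stage value can be sketched as risk-adjusted backward induction: push the approval-stage value backwards through each development phase, multiplying by that phase's success probability and discounting over its duration. Only the \$1.62 billion figure below comes from the abstract; every other number is an illustrative stand-in, not the paper's estimate.

```python
approved_value = 1.62e9                 # avg value of a successful drug (abstract)
phase_probs = [0.6, 0.5, 0.3, 0.6]      # hypothetical per-phase success rates
phase_years = [1, 2, 3, 2]              # hypothetical phase durations (years)
r = 0.10                                # hypothetical annual discount rate

v = approved_value
for p, years in zip(reversed(phase_probs), reversed(phase_years)):
    v = p * v / (1 + r) ** years        # expected, discounted value one phase earlier
discovery_value = v
```

Even generous per-phase success rates compound to a small overall probability, which is why early-stage values are one to two orders of magnitude below the approval-stage value.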
Guido Thömmes, Martin Oliver Sailer, Nicolas Bonnet et al.
The pharmaceutical industry has experienced increasing costs and sustained high attrition rates in drug development in recent years. One proposal that addresses this challenge from a statistical perspective is the use of quantitative decision-making (QDM) methods to support a data-driven, objective appraisal of the evidence that forms the basis of decisions at different development levels. Growing awareness among statistical leaders in the industry has led to the creation of the European EFSPI/PSI special interest group (ESIG) on quantitative decision making to share experiences, collect best practices, and promote the use of QDM. In this paper, we introduce key components of QDM and present examples of QDM methods at the trial, program, and portfolio levels. The ESIG created a questionnaire to learn how and to what extent QDM methods are currently used in the different development phases. We present the main questionnaire findings, showing where QDM is already used today and where areas for future improvement can be identified. In particular, statisticians should increase their visibility, involvement, and leadership in cross-functional decision-making.
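A typical QDM building block at the trial level is a probabilistic go/no-go rule. The sketch below uses a normal approximation and arbitrary thresholds, as one common pattern rather than the ESIG's specific recommendation:

```python
from math import erf, sqrt

def prob_effect_exceeds(est, se, target):
    """P(true effect > target) for an estimate with standard error se,
    under a normal approximation."""
    z = (est - target) / se
    return 0.5 * (1 + erf(z / sqrt(2)))    # standard normal CDF at z

def decision(est, se, target=0.0, go_threshold=0.8):
    """Declare 'go' when the probability that the true effect exceeds
    the target value passes a preset threshold."""
    return "go" if prob_effect_exceeds(est, se, target) >= go_threshold else "no-go"
```

Writing the rule down before unblinding the data is what makes the appraisal objective: the go/no-go boundary is fixed in advance rather than argued after the results are in.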
Page 5 of 260,558