Abstract Centrality metrics quantify a node’s importance within a network based on its connectivity, path position, proximity to other nodes, or influence from neighbors. All of these properties derive from the network structure alone and do not take a node’s features into account. To overcome this limitation, two novel centrality metrics, termed inflow and outflow centrality, are introduced here. The metrics are derived from the aggregation approach used in graph convolutional networks, which allows node features to be incorporated directly alongside the graph structure. The metrics were contrasted against unweighted betweenness centrality and four node-weighted centrality metrics (weighted degree, weighted closeness, personalized PageRank, and alpha centrality) for an airport network, an airplane trade network, and a protein-protein interaction network. By emphasizing the contribution of otherwise weakly connected neighbor nodes, the new metrics prioritize nodes that are crucial for maintaining a graph’s connectivity.
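The abstract does not give the formal definitions of inflow and outflow centrality; the sketch below is only a plausible illustration of the underlying idea, a GCN-style aggregation in which each node scores the degree-normalised features flowing in from its neighbours. The function name, the normalisation, and the toy graph are all assumptions, not the paper's exact construction.

```python
def inflow_score(adj, features):
    """Hypothetical feature-aware centrality: each node receives the
    degree-normalised sum of its neighbours' features, mimicking the
    aggregation step of a graph convolutional network.

    adj: dict node -> set of neighbour nodes (undirected toy graph)
    features: dict node -> scalar node feature
    """
    scores = {}
    for i, nbrs in adj.items():
        total = 0.0
        for j in nbrs:
            deg_j = len(adj[j])
            if deg_j:  # normalise by the sending node's degree
                total += features[j] / deg_j
        scores[i] = total
    return scores

# Toy graph: hub node 0 connected to three degree-1 leaves.
adj = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
feats = {0: 1.0, 1: 2.0, 2: 2.0, 3: 2.0}
s = inflow_score(adj, feats)
# Each leaf passes its whole feature to the hub (degree 1), so s[0] = 6.0,
# while each leaf receives only 1.0/3 from the hub.
```

Note how weakly connected neighbours divide their features among few recipients and therefore contribute strongly, consistent with the abstract's emphasis on little-connected neighbour nodes.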
Annesha Sarmah, Kaushik Dehingia, Purnendu Sardar
et al.
In this paper, we develop a novel five-compartment terrorism dynamics model that explicitly incorporates a terror funding class, thereby capturing the critical role of financial resources in sustaining recruitment, logistics, and operational activities. To better reflect real-world processes, the model introduces two discrete time delays: τ1, representing the indoctrination period required for susceptible individuals to become terrorists, and τ2, denoting the lag associated with transferring terrorists to the recovered or quarantined classes. The main contributions of this work include: (i) the formulation of a funding-integrated terrorism model with dual delays; (ii) a complete mathematical analysis of positivity, boundedness, and equilibrium stability; (iii) derivation of the basic reproduction number ℛ0 and a sensitivity analysis identifying the parameters that most strongly influence terrorism persistence; and (iv) a rigorous investigation of delay-induced destabilisation and Hopf bifurcation. For the non-delayed system, we establish conditions ensuring the existence and local stability of the terror-free equilibrium when ℛ0<1 and the terror-persistent equilibrium when ℛ0>1. For the delayed system, we demonstrate that increasing either τ1 or τ2 beyond their respective critical thresholds leads to Hopf bifurcations and sustained oscillations, representing recurrent waves of terrorist activity. Numerical simulations are provided to validate the analytical results. Overall, the study offers insight into how the speed of radicalisation, operational delays, and financial resources interact to shape terrorism dynamics, with potential implications for the design of more effective counter-terrorism policies.
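A numerical sketch may help illustrate how a discrete delay such as τ1 enters a simulation. The code below is not the paper's five-compartment model; it integrates a toy scalar equation dT/dt = βT(t − τ) − γT(t) with a fixed-step Euler scheme, reading the delayed term from a history buffer. All parameter values are arbitrary.

```python
def simulate_delayed_decay(beta=0.3, gamma=0.5, tau=2.0, dt=0.01, t_end=50.0):
    """Euler integration of dT/dt = beta*T(t - tau) - gamma*T(t),
    with constant pre-history T(t) = 1 for t <= 0."""
    lag = int(round(tau / dt))       # delay expressed in time steps
    T = [1.0] * (lag + 1)            # indices 0..lag cover t in [-tau, 0]
    steps = int(round(t_end / dt))
    for n in range(lag, lag + steps):
        delayed = T[n - lag]         # T(t - tau) read from the history buffer
        T.append(T[n] + dt * (beta * delayed - gamma * T[n]))
    return T

traj = simulate_delayed_decay()
# With beta < gamma this toy state decays toward zero; in richer delayed
# models, pushing tau past a critical threshold yields sustained oscillations.
```

The same history-buffer pattern extends directly to systems with two delays by keeping two lag indices.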
To address the limited defect-detection capability of existing performance testing methods for switching power supplies under varying operating conditions, this paper proposes a defect identification approach based on an enhanced Dung Beetle Optimizer. The algorithm integrates multi-strategy improvements—including piecewise chaotic mapping, Lévy flight perturbation, hybrid sine–cosine updating, and an alert sparrow mechanism—to refine the initial population generation, position update rules, and late-stage exploration. These enhancements strengthen its spatial search ability and computational performance. The experimental results show that the method accurately identifies the predefined defect intervals with a precision of 94.79%, covering 91.3% of the operating conditions. Comparisons with existing mainstream methods confirm the superior performance, effectiveness, and feasibility of the proposed method.
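Of the improvement strategies listed above, the Lévy flight perturbation is the easiest to isolate. The sketch below draws heavy-tailed steps via Mantegna's algorithm; the exponent β = 1.5 and the step scale are conventional illustrative choices, not values taken from the paper.

```python
import math
import random

def levy_step(beta=1.5, rng=random):
    """One Levy-distributed step drawn with Mantegna's algorithm."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma_u = (num / den) ** (1 / beta)
    u = rng.gauss(0.0, sigma_u)
    v = rng.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

def perturb(position, scale=0.01, rng=random):
    """Perturb a candidate solution with independent Levy steps: mostly
    small local moves, plus occasional long jumps out of local optima."""
    return [x + scale * levy_step(rng=rng) for x in position]
```

The occasional long jumps are what give Lévy perturbation its value in late-stage exploration, where purely Gaussian moves tend to stall.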
Nadia Samantha Zuñiga-Peña, Salatiel Garcia-Nava, Norberto Hernandez-Romero
et al.
Optimization methods such as population-based algorithms are valuable when applied to multidimensional and nonlinear problems. Many engineering problems, such as controller parameterization, can be addressed using population-based algorithms, since these parameters are usually found through trial-and-error experiments, resulting in high time and resource consumption. Population-based algorithms need to define the range within which the search for the best solution is performed, known as the search space. However, due to the nonlinear nature of the systems to which these controllers are applied, there is no certainty about the search space that should be defined. This study proposes a hybrid optimization strategy that couples the Hunger Games Search (HGS) metaheuristic with an unsupervised Self-Organizing Map (Kohonen neural network) to improve trajectory-tracking control of unmanned aerial vehicles (UAVs) transporting cable-suspended loads. In the proposed NNHGS, the HGS algorithm seeks the controller gains that minimize the root mean square tracking error (RMSE), while the neural network continuously reshapes the search intervals according to the evolving tracking performance. By expanding the exploration into parameter regions beyond the initial bounds, the NNHGS finds high-quality solutions that standard HGS excludes. Simulation results obtained with a Super Twisting Sliding Mode Controller (STSMC) show a reduction in the final tracking error from RMSE = 0.0480 with HGS to RMSE = 0.0204 with NNHGS, along with enhanced disturbance rejection and rapid adaptation to parameter changes. These gains highlight the suitability of the method for real-world missions such as logistics, disaster relief, or remote inspection, where UAVs must remain stable under uncertain or parameter-varying conditions.
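The abstract does not specify how the Kohonen network reshapes the search intervals; the sketch below is a deliberately simple stand-in that captures the same idea, recentring each gain's interval on the current best candidate and widening it whenever the best sits near a boundary, so the search can move past its initial bounds. The margin and growth factors are arbitrary.

```python
def reshape_bounds(bounds, best, margin=0.1, grow=1.5):
    """Recentre each (lo, hi) search interval on the best candidate found
    so far; if the best lies near an edge, widen the interval so the
    search can explore beyond the original bounds."""
    new_bounds = []
    for (lo, hi), b in zip(bounds, best):
        width = hi - lo
        if b - lo < margin * width or hi - b < margin * width:
            width *= grow              # best is near an edge: expand
        new_bounds.append((b - width / 2, b + width / 2))
    return new_bounds

# A best gain of 0.98 sits at the upper edge of [0, 1]:
# the interval grows to width 1.5 and recentres, reaching (0.23, 1.73).
nb = reshape_bounds([(0.0, 1.0)], [0.98])
```

An interior best (e.g. 0.5 in [0, 1]) leaves the interval unchanged, so the bounds only drift when performance feedback demands it.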
Catarina N Carvalho, Andreia Gaspar, Carlos Real
et al.
Myocardial perfusion cardiovascular magnetic resonance (pCMR) using first-pass contrast-enhanced imaging could play an important role in the detection of epicardial and microvascular coronary artery disease. Recently, the emergence of quantitative pCMR has provided a more reliable and observer-independent analysis compared to visual interpretation of dynamic images. This review aims to cover the basics of quantitative pCMR, from acquisition protocols, its use in preclinical and clinical studies, image reconstruction and motion handling, to automated quantitative pCMR pipelines. It also offers an overview of emerging tools in the field, including artificial intelligence-based methods.
Shashikant Waghule, Dinkar Patil, Amjad Shaikh
et al.
This research investigates the numerical solution of the time-fractional Burgers–Fisher equation, utilizing the Atangana–Baleanu differential operator in the Caputo sense. This study addresses the need to comprehend the dynamics of nonlinear phenomena encountered in various scientific and engineering contexts, specifically within the Burgers–Fisher equation, which intertwines diffusion and reaction processes. Our findings reveal that the application of the Atangana–Baleanu operator significantly alters the behavior of the system, exhibiting distinct characteristics compared to traditional methods. Notably, we identify unique patterns of propagation, such as enhanced wave speed and altered front dynamics, that emerge due to the fractional dynamics. The simulations demonstrate improved stability and convergence properties when utilizing the Atangana–Baleanu operator, allowing for more accurate representations of physical processes. Additionally, we observe the emergence of non-local effects and the potential for multiple equilibrium states, enriching our understanding of the complex interactions within the system. Through the finite difference method, we efficiently discretize the continuous problem, facilitating simulations that illustrate the intricate temporal behavior of the time-fractional system. This methodology not only enhances the understanding of the physical processes involved but also contributes a novel framework for studying time-fractional equations, emphasizing the rich dynamics introduced by the Atangana–Baleanu operator in conjunction with the Caputo fractional derivative.
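For reference, the Atangana–Baleanu fractional derivative in the Caputo sense (ABC), on which the scheme above is built, is standardly defined for $0 < \alpha < 1$ by

```latex
{}^{\mathrm{ABC}}_{\phantom{\mathrm{AB}}a}D_t^{\alpha} f(t)
  = \frac{B(\alpha)}{1-\alpha}
    \int_a^{t} f'(s)\,
    E_{\alpha}\!\left(-\frac{\alpha\,(t-s)^{\alpha}}{1-\alpha}\right)\mathrm{d}s,
\qquad
E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)},
```

where $E_{\alpha}$ is the one-parameter Mittag-Leffler function and $B(\alpha)$ is a normalization function satisfying $B(0) = B(1) = 1$. The non-singular Mittag-Leffler kernel, replacing the singular power-law kernel of the classical Caputo derivative, is responsible for the distinct memory behaviour and non-local effects noted above.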
Nazir L. Gandur, Stephen Ekwaro-Osire, Jahan Rasty
et al.
Car accidents, a major US public safety issue, demand precise analysis and predictive models for mitigation. This study asks the following question: Can the safest car routes across the US be determined? The paper analyzes historical data to forecast future accidents and calculates the safest route between two locations. The study builds a predictive model utilizing statistical analyses, data mining, and machine learning. A joint probability density function (PDF) is devised to calculate the safest route for risk modeling, factoring in latitude and longitude. The model quantifies accident probabilities in areas and travel routes. Additionally, the safest direction can be determined using the gradient of the joint PDF curve. The predictive model enables policymakers to allocate resources proactively. The safest route selection enables drivers to navigate safer areas and routes, which can reduce the number of accidents. Through its analysis and joint PDF model, this research enriches accident analysis and prevention engineering, potentially fostering safer US roads.
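The joint-PDF idea can be sketched with a kernel density estimate. The code below is only an illustration, not the paper's fitted model: it builds a 2-D Gaussian KDE over historical accident coordinates and returns the unit vector pointing down the density gradient, i.e. the locally safest direction. The bandwidth and the toy data are assumptions.

```python
import math

def accident_density(lat, lon, points, h=0.5):
    """2-D Gaussian kernel density estimate of accident locations."""
    norm = 1.0 / (len(points) * 2 * math.pi * h * h)
    return norm * sum(
        math.exp(-((lat - a) ** 2 + (lon - b) ** 2) / (2 * h * h))
        for a, b in points)

def safest_direction(lat, lon, points, eps=1e-4):
    """Unit vector pointing down the density gradient (central differences)."""
    dlat = (accident_density(lat + eps, lon, points)
            - accident_density(lat - eps, lon, points)) / (2 * eps)
    dlon = (accident_density(lat, lon + eps, points)
            - accident_density(lat, lon - eps, points)) / (2 * eps)
    mag = math.hypot(dlat, dlon) or 1.0
    return (-dlat / mag, -dlon / mag)

# All recorded accidents cluster east of the driver at (0, 0),
# so the safest direction has a negative (westward) longitude component.
pts = [(0.0, 1.0), (0.1, 1.2), (-0.1, 0.9)]
d = safest_direction(0.0, 0.0, pts)
```

Route scoring then reduces to integrating this density along candidate paths and preferring the lowest-density route.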
Florian Anderl, Gabriela Salvadori, Mladen Veletic
et al.
Bacterial sensor systems can be used for the detection and measurement of molecular signal concentrations. The dynamics of the sensor directly depend on the biological properties of the bacterial sensor cells; manipulation of these features in the wet lab enables the engineering and optimization of the bacterial sensor kinetics. This necessitates the development of biologically meaningful computational models for bacterial sensors comprising a variety of different molecular mechanisms, which further facilitates a systematic and quantitative evaluation of optimization strategies. In this work, we dissect the detection chain of bacterial sensors from a mathematical perspective from which we derive, supported by wet-lab data, a complete computational model for a Streptococcus mutans-based bacterial sensor as a case example. We address the engineering of bacterial sensors by investigating the impact of altered bacterial cell properties on the sensor response characteristics, specifically sensor sensitivity and response signal intensity. This is achieved through a sensitivity analysis targeting both the steady-state and transient sensor response characteristics. Alongside the demonstration of suitability of our methodological approach, our analysis shows that an increase of sensor sensitivity, through a targeted manipulation of bacterial physiology, often comes at the cost of generally diminished sensor response intensity.
Machines can model and improve the human mind’s capabilities through artificial intelligence. One of the most popular tools of artificial intelligence is fuzzy sets, which can capture and model the vagueness and impreciseness in human thoughts. This paper, first of all, introduces the recent extensions of ordinary fuzzy sets and then presents a literature review on the integration of fuzzy sets with other artificial intelligence techniques such as automated reasoning, autonomous agents, multi-agent systems, machine learning, case-based reasoning, deep learning, information reasoning, information representation, natural language processing, symbolic reasoning, and neural networks. Graphical illustrations of literature review results are presented for each of these integrated artificial intelligence techniques. The results of a patent search on fuzzy artificial intelligence are also given.
To address the challenges of incomplete knowledge representation, independent decision ranges, and insufficient causal decisions in bogie welding decision-making, this paper proposes a hybrid decision-making method and develops a corresponding intelligent system. A collaborative case, rule, and knowledge graph approach is used to support structured-document and domain-causality decisions. In addition, we created a knowledge model of bogie welding characteristics and proposed a case-matching method based on empirical weights. Several entity classification and relationship extraction models were trained under supervised conditions while building the knowledge graph. CRF and CR-CNN obtained high combined F1 scores (0.710 for CRF and 0.802 for CR-CNN) on the entity classification and relationship extraction tasks, respectively. We designed and developed an intelligent decision system based on the proposed method to implement engineering applications. The system was validated with real engineering data. The results show that the system achieved a high accuracy score (0.947 corrected accuracy) and can effectively complete structured-document and causality decision-making tasks, demonstrating substantial research significance and engineering value.
The eighteenth century saw a flourishing of scientific and philosophical thought throughout Scotland, known as the Scottish Enlightenment. The accomplishments of prominent male figures of this period have been well documented in all disciplines. However, studies of women's experiences are relatively sparse. This paper partially corrects this oversight by drawing together evidence for women's participation in mathematics in Scotland between 1730 and 1850. In considering women across all social classes, it argues for a broad definition of 'mathematics' that includes arithmetic and astronomy, and assesses women's opportunities for engagement under three headings: education, family, and sociability. It concludes that certain elements of Scottish Enlightenment culture promoted wider participation by women in mathematical activities than has previously been recognized, but that such participation continued to be circumscribed by societal views of the role of women within family formation.
Dolagobinda Das, Gauranga Charan Samanta, Abhijit Barman
et al.
This study develops an inventory model for items that deteriorate at a time-dependent rate. The objective is to minimize the total cost of a two-warehouse system during a lockdown period. We consider two scenarios, and the LIFO policy is used in this model. In the first scenario, the stock of the rented warehouse (RW) becomes empty after the lockdown is eased; in the second scenario, the stock of the RW becomes empty during the lockdown. A two-parameter Weibull distribution for the deterioration rate and a time-dependent demand are taken into consideration. Subsequently, a sensitivity analysis is performed for both scenarios using two different examples to make the research more realistic. In an emergency such as the COVID-19 pandemic, the models may be applied effectively by taking the actual circumstances into consideration.
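The modelling ingredients named above can be sketched numerically. Under a two-parameter Weibull deterioration rate θ(t) = αβt^(β−1) and a linear time-dependent demand D(t) = a + bt, the on-hand inventory obeys dI/dt = −θ(t)I − D(t). The Euler sketch below is illustrative only; none of the parameter values come from the paper.

```python
def inventory_level(I0, alpha, beta, a, b, t_end, dt=1e-3):
    """Euler integration of dI/dt = -theta(t)*I - D(t), where
    theta(t) = alpha*beta*t**(beta - 1) is the Weibull deterioration
    rate and D(t) = a + b*t is a time-dependent demand."""
    I, t = I0, 0.0
    while t < t_end and I > 0:
        theta = alpha * beta * t ** (beta - 1) if t > 0 else 0.0
        I += dt * (-theta * I - (a + b * t))
        t += dt
    return max(I, 0.0)

# Stock remaining after 5 time units under mild deterioration:
# demand alone removes 37.5 units, deterioration erodes the rest.
level = inventory_level(I0=100.0, alpha=0.02, beta=2.0, a=5.0, b=1.0, t_end=5.0)
```

With β > 1 the deterioration rate grows over time, which is why the timing of the RW's stock-out relative to the lockdown window matters for the total cost.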
Babak Haghighi, Warren B. Gefter, Lauren Pantalone
et al.
Purpose: To utilize high-resolution quantitative CT (QCT) imaging features for the prediction of diagnosis and prognosis in fibrosing interstitial lung diseases (ILD). Approach: 40 ILD patients (20 with a usual interstitial pneumonia (UIP) pattern, 20 with a non-UIP pattern) were classified by the expert consensus of 2 radiologists and followed for 7 years. Clinical variables were recorded. Following segmentation of the lung field, a total of 26 texture features were extracted using a lattice-based approach (TM model). The TM model was compared with a previously developed histogram-based model (HM) for its ability to classify UIP vs non-UIP. For prognostic assessment, survival analysis was performed comparing the expert diagnostic labels with the TM metrics. Results: In the classification analysis, the TM model outperformed the HM method with an AUC of 0.70. While survival curves of UIP vs non-UIP expert labels in Cox regression analysis were not statistically different, TM QCT features allowed a statistically significant partition of the cohort. Conclusions: The TM model outperformed the HM model in distinguishing UIP from non-UIP patterns. Most importantly, TM allows for partitioning of the cohort into distinct survival groups, whereas expert UIP vs non-UIP labeling does not. QCT TM models may improve the diagnosis of ILD and offer more accurate prognostication, better guiding patient management.
Bibiane de Fátima Santos, Maria Danielle Araújo Mota, Paulo Meireles Barguil
In recent decades, numerous studies have reported the contribution of science fairs and science museums to the popularization of science, listing possibilities, achievements, and challenges. Although several authors describe the importance of science fairs in disseminating scientific knowledge, in the school environment and in society, it must be acknowledged that they have limitations in how projects are carried out and/or exhibited. This study aimed to analyze how the projects presented at the Feira de Ciências do Estado de Alagoas (FECEAL) in 2019 were developed, since understanding the choices, challenges, and achievements that students experience while carrying out their projects can help broaden the learning of science fair participants, both students and visitors. A qualitative study was conducted, by means of a questionnaire, with 41 (forty-one) students from public schools in the state of Alagoas who presented projects at FECEAL 2019. The questionnaire contained 7 (seven) questions covering: the type of project, the reason for participating, the process of choosing the topic, the location, the methodology, the achievement of objectives, and the type of assistance received. The data collected confirm that science fairs can foster scientific production and dissemination, improving student learning as well as the bond between school and society. We identified limitations in the development of the projects presented at FECEAL 2019, especially regarding the type of project and the methodology. To meet these challenges, support from the competent authorities is needed to improve teachers' working conditions, continuing education, and infrastructure.
Gaisi Takeuti introduced Boolean valued analysis around 1974 to provide systematic applications of Boolean valued models of set theory to analysis. His methods were later developed further by his followers, leading to the solution of several open problems in analysis and algebra. Building on the methods of Boolean valued analysis, he went a step further and constructed a set theory based on quantum logic, as the first step toward "quantum mathematics", a mathematics based on quantum logic. Although the distributive law is known to fail in quantum logic, and the axiom of equality turns out not to hold in quantum set theory, he showed that the real numbers of quantum set theory are in one-to-one correspondence with the self-adjoint operators on a Hilbert space, or equivalently with the physical quantities of the corresponding quantum system. Since quantum logic is intrinsic and empirical, the results of quantum set theory can be experimentally verified by quantum mechanics. In this paper, we analyze Takeuti's mathematical world view underlying his program from two perspectives: the set-theoretical foundations of modern mathematics, and the extension of the notion of sets to multi-valued logic. We survey the present status of his program and envisage its further development, through which we may take a large step toward unraveling the mysteries of quantum mechanics that have persisted for many years.
Christophe Bastien, Clive Neal-Sturgess, Huw Davies
et al.
In the real world, the severity of traumatic injuries is measured using the Abbreviated Injury Scale (AIS). However, AIS levels cannot currently be computed from the output of finite element human body models, which rely on maximum principal strains (MPS) to capture serious and fatal injuries. To overcome these limitations, a unique Organ Trauma Model (OTM), able to calculate the threat to life of a brain model at all AIS levels, is introduced. The OTM uses a power method, named Peak Virtual Power (PVP), and defines brain white and grey matter trauma responses as a function of impact location and impact speed. This research considers ageing in the injury severity computation by including soft tissue material degradation as well as brain volume changes due to ageing. Further, to account for the limitations of the Lagrangian formulation of the brain model in representing hemorrhage, an approach to include the effects of subdural hematoma is proposed and included as part of the predictions. The OTM was tested against two real-life falls and correctly predicted the post-mortem outcomes. This paper is a proof of concept and, pending further testing, could support forensic studies.
Milena Capiglioni, Analia Zwick, Pablo Jimenez
et al.
Extracting reliable and quantitative microstructure information from living tissue by non-invasive imaging is an outstanding challenge for understanding disease mechanisms and enabling early-stage diagnosis of pathologies. Magnetic Resonance Imaging is the favored technique for pursuing this goal, but it still provides resolutions much coarser than the relevant microstructure details in in-vivo studies. Monitoring molecular diffusion within tissues is a promising mechanism for overcoming these resolution limits. However, obtaining detailed microstructure information requires the acquisition of tens of images, imposing measurement times too long to be practical for in-vivo studies. As a step towards solving this outstanding problem, we report here a method that requires only two measurements, together with proof-of-principle experiments, to produce images of selective microstructure sizes by suitable dynamical control of nuclear spins with magnetic field gradients. We design microstructure-size filters with spin-echo sequences that exploit magnetization "decay-shifts" rather than the commonly used decay-rates. The outcome of this approach is quantitative images that can be acquired with current technologies, advancing towards unravelling a wealth of diagnostic information based on the microstructure parameters that define the composition of biological tissues.
Sarcasm detection is an important task in affective computing, requiring large amounts of labeled data. We introduce reactive supervision, a novel data collection method that utilizes the dynamics of online conversations to overcome the limitations of existing data collection techniques. We use the new method to create and release a first-of-its-kind large dataset of tweets with sarcasm perspective labels and new contextual features. The dataset is expected to advance sarcasm detection research. Our method can be adapted to other affective computing domains, thus opening up new research opportunities.
Damiano Di Francesco Maesa, Andrea Marino, Laura Ricci
Abstract The availability of the entire Bitcoin transaction history, stored in its public blockchain, offers interesting opportunities for analysing the transaction graph to gain insight into users' behaviour. This paper presents an analysis of the Bitcoin users graph, obtained by clustering the transaction graph, to highlight its connectivity structure and the economic meaning of the different components obtained. In fact, the bow tie structure, already observed for the graph of the web, is augmented, in the Bitcoin users graph, with economic information about the entities involved. We study the connectivity components of the users graph individually, to infer their macroscopic contribution to the whole economy. We define and evaluate a set of measures on the nodes inside each component to characterize and quantify such a contribution. We also perform a temporal analysis of the evolution of the resulting bow tie structure. Our findings confirm our hypothesis on the components' semantics, defined in terms of their economic role in the flow of value inside the graph.
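The bow tie decomposition mentioned above can be sketched in a few lines. The toy code below finds the largest strongly connected component (the core) via reachability intersection, then labels IN (nodes that reach the core), OUT (nodes the core reaches), and groups everything else as "other" (the full decomposition further distinguishes tendrils and tubes). The O(n·m) SCC computation is fine for illustration; a blockchain-scale users graph would need a linear-time algorithm such as Tarjan's.

```python
from collections import defaultdict

def bow_tie(edges):
    """Toy bow tie decomposition of a directed graph given as (u, v) edges."""
    fwd, rev = defaultdict(set), defaultdict(set)
    nodes = set()
    for u, v in edges:
        fwd[u].add(v)
        rev[v].add(u)
        nodes |= {u, v}

    def reach(adj, start):
        seen, stack = set(), [start]
        while stack:
            u = stack.pop()
            if u not in seen:
                seen.add(u)
                stack.extend(adj[u])
        return seen

    # SCC(u) = nodes reachable from u that can also reach u back.
    core = max((reach(fwd, u) & reach(rev, u) for u in nodes), key=len)
    c = next(iter(core))
    in_set = reach(rev, c) - core    # IN: reaches the core
    out_set = reach(fwd, c) - core   # OUT: reached from the core
    other = nodes - core - in_set - out_set
    return core, in_set, out_set, other

# a -> {b <-> c} -> d : "a" is IN, {b, c} is the core, "d" is OUT.
core, in_set, out_set, other = bow_tie(
    [("a", "b"), ("b", "c"), ("c", "b"), ("b", "d")])
```

In the economic reading of the abstract, value flows from IN entities through the core to OUT entities, so per-component measures can be aggregated directly over these sets.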