Structural approaches to myth and narrative are compelling in close reading but hard to compare across traditions, media, and scale. We propose a formal framework that renders Lévi-Straussian transformation as mathematics while remaining readable as narrative analysis. Variants, superhero continuities, and franchise arcs are modeled as typed rewrite programs on a coupled two-register state $(X,Y)$, abstracting an everyday/social channel and a symbolic/legitimation channel. The canonical formula becomes coherence data: a natural transformation $\eta: U \Rightarrow V$ between update endofunctors, where $U$ updates each register in place and $V$ performs a swap+inversion. Context is internalized by operator choice, turning naturality into a corpus-facing type check: failures diagnose mis-specified oppositions or illegal transport; successes witness coherent structural models. Order effects are summarized by a five-value invariant (Key). We apply the method to 80 narratives (20 folktales, 20 religious myths, 20 superheroes, 20 franchises), each encoded as $(a,b,x,y)$ with a Key. 59/80 (74\%) explicitly name a normative constraint in $y$ (law, taboo, contract, prophecy), supporting the two-register abstraction. The result is a testable bridge between structural anthropology and cultural analytics: stories remain interpretable yet become transportable objects for computation, comparison, and falsifiable constraints on transformation.
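To make the naturality condition concrete, here is a minimal executable sketch. The registers, the inversion (negation), and the rewrite step are toy assumptions, not the paper's corpus encodings: a coherent $\eta$ passes the type check, while an $\eta$ that inverts only one register fails it, illustrating how failures diagnose illegal transport.

```python
# Toy naturality check. States are pairs (x, y) of integers; a rewrite f
# acts on states. U transports f unchanged ("update in place"); V transports
# f through swap+inversion (conjugation by sigma). All choices here are
# illustrative assumptions.

def sigma(s):
    """Swap the two registers and invert (here: negate) each value."""
    x, y = s
    return (-y, -x)

sigma_inv = sigma  # sigma is an involution in this toy setup

def U(f):
    """In-place update: the rewrite is applied as given."""
    return f

def V(f):
    """Swap+inversion update: conjugate the rewrite by sigma."""
    return lambda s: sigma(f(sigma_inv(s)))

def natural(eta, f, samples):
    """Check the naturality square V(f) . eta == eta . U(f) on sample states."""
    return all(V(f)(eta(s)) == eta(U(f)(s)) for s in samples)

f = lambda s: (s[0] + 1, 2 * s[1])   # an arbitrary typed rewrite step
eta_good = sigma                      # coherent: commutes by construction
eta_bad = lambda s: (-s[1], s[0])     # mis-specified: inverts only one register

samples = [(x, y) for x in range(-3, 4) for y in range(-3, 4)]
print(natural(eta_good, f, samples))  # True: coherent structural model
print(natural(eta_bad, f, samples))   # False: diagnoses an illegal transport
```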
In response to the global challenge of climate change, financial institutions are increasingly called upon to assess and disclose their carbon emissions. Various global carbon quantification and reporting standards have been developed, such as the Greenhouse Gas (GHG) Protocol, the Task Force on Climate-related Financial Disclosures (TCFD), and the Partnership for Carbon Accounting Financials (PCAF), among others. Unfortunately, this now diverse landscape of standards increases the complexity for institutions seeking to develop voluntary carbon quantification and reporting. This study addresses that complexity by developing a criteria-based tool that summarizes the various components and requirements of the carbon standards. We propose eight criteria that capture the standards' key elements, requirements, and relevance to the financial industry. We analyze seven major carbon quantification and reporting standards, systematically evaluating them against our tool. By doing so, we provide financial institutions with valuable insights for selecting appropriate standards to inform their emissions quantification and reporting decisions.
Akram Aqil Syahru, Nasrullah, Aven Ghina Salsabila, et al.
Environmental degradation driven by negative externalities and fiscal inequality demands a reconfiguration of taxation grounded in the Polluter Pays Principle (PPP). This study aims to develop a normative–comparative framework for a green tax system that internalizes pollution costs while promoting fiscal justice. Using a normative legal research method, the analysis explores the theoretical and institutional foundations of green taxation, drawing from Indonesia’s environmental legislation, the Rio Declaration, and European Union guidelines, while examining fiscal equity and progressive redistribution. A comparative perspective highlights the implementation of PPP across jurisdictions: South Africa’s carbon tax, Portugal’s corporate and VAT-based green tax, and Indonesia’s emerging carbon pricing scheme. The study focuses on legal mechanisms of redistribution, including targeted cash transfers, tax credits, and tax-shift models, as well as the role of fiscal transparency and administrative oversight in mitigating regressive impacts. The findings indicate that a green tax framework rooted in PPP and supported by progressive redistribution and legal transparency enhances ecological accountability, social equity, and policy legitimacy. This paper contributes to environmental fiscal reform discourse by proposing a legally grounded and equitable model for sustainable green tax implementation.
The Ishango Bone, a prehistoric artifact dated to approximately 20,000 years ago and discovered near the Semliki River in what is now the Democratic Republic of Congo, has intrigued researchers for the past 75 years. The artifact displays sixteen groups of notches arranged in three columns. While its function remains debated, this study suggests that the first two columns contain exclusively prime or odd numbers between 9 and 21, with the exception of 15, which appears only in the third column as two grouped pairs. Five groupings totaling 30 could be identified, and their arrangement may follow a consistent pattern. Additional numerical relationships between all three columns can be interpreted to support all four basic arithmetic operations. It is hypothesized that the notches may have served as a reference marker laying out these values for storytelling or teaching in the form of mathematical art. This study aims to broaden perspectives on the Ishango Bone and its traditional interpretation as a simple tallying device, and to encourage a re-evaluation of the mathematical capabilities of prehistoric humans.
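For readers who want the stated range spelled out, a quick check follows; it verifies only the arithmetic claim about odds and primes in [9, 21], and says nothing about how the notches should be read.

```python
# Enumerate the odd numbers and the primes between 9 and 21, and the odd
# composites in that range (of which, per the study, only 15 is absent
# from the first two columns).

def is_prime(n: int) -> bool:
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

odds = [n for n in range(9, 22) if n % 2]              # 9, 11, 13, 15, 17, 19, 21
primes = [n for n in range(9, 22) if is_prime(n)]      # 11, 13, 17, 19
odd_composites = [n for n in odds if not is_prime(n)]  # 9, 15, 21

print(odds, primes, odd_composites)
```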
Per- and poly-fluoroalkyl substances (PFAS) are interfacially active contaminants that adsorb at air-water interfaces (AWIs). Water-unsaturated soils have abundant AWIs, which generally consist of two types: one is associated with the pendular rings of water between soil grains (i.e., the bulk AWI), and the other arises from the thin water films covering the soil grains. To date, the two types of AWIs have been treated the same when modeling PFAS retention in vadose zones. However, the presence of electrical double layers at soil grain surfaces, and the consequently modified chemical potential of PFAS at the AWI, may significantly change PFAS adsorption at the thin-water-film AWI relative to that at the bulk AWI. Given that thin water films contribute over 90% of AWIs in the vadose zone under many field-relevant wetting conditions, it is critical to quantify this potentially anomalous adsorption of PFAS at the thin-water-film AWI. We develop a thermodynamics-based mathematical model to quantify this anomalous adsorption. The model couples the chemical equilibrium of PFAS with the Poisson-Boltzmann equation that governs the distribution of electrical potential in a thin water film. Our model analyses suggest that PFAS adsorption at the thin-water-film AWI can deviate significantly (by up to 82%) from that at bulk AWIs. The deviation increases for lower porewater ionic strength, thinner water films, and higher soil grain surface charge. These results highlight the importance of accounting for the anomalous adsorption of PFAS at the thin-water-film AWI when modeling PFAS fate and transport in the vadose zone.
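To give a feel for the electrostatics involved, here is a minimal numerical sketch, not the paper's model: it evaluates the linearized (Debye-Hückel) Poisson-Boltzmann potential across a 1-D film between a charged grain surface and an assumed charge-free AWI, then computes the Boltzmann factor that would rescale an anionic surfactant's interfacial concentration. The film thickness, surface potential, and ionic strength below are illustrative assumptions.

```python
# Illustrative sketch (not the paper's model): Debye-Huckel potential in a
# thin water film between a charged grain surface (x = 0) and an assumed
# charge-free air-water interface (x = h).
import numpy as np

e, kB, T = 1.602e-19, 1.381e-23, 298.0
eps = 78.5 * 8.854e-12   # permittivity of water, F/m
NA = 6.022e23

def debye_length(I_mol_per_L):
    """Debye length (m) for a 1:1 electrolyte of ionic strength I."""
    n0 = 1000 * NA * I_mol_per_L           # ion number density, m^-3
    return np.sqrt(eps * kB * T / (2 * n0 * e**2))

def psi_film(x, h, psi_s, lam):
    """Linearized PB solution with psi(0) = psi_s and dpsi/dx(h) = 0."""
    k = 1.0 / lam
    return psi_s * np.cosh(k * (h - x)) / np.cosh(k * h)

h = 30e-9                   # film thickness (assumed)
psi_s = -0.05               # grain surface potential, V (assumed)
lam = debye_length(0.001)   # 1 mM ionic strength (assumed)
psi_awi = psi_film(h, h, psi_s, lam)

# Boltzmann factor for a monovalent anion (z = -1) at the AWI:
# values < 1 mean suppressed interfacial concentration.
factor = np.exp(e * psi_awi / (kB * T))
print(f"Debye length {lam*1e9:.1f} nm, psi at AWI {psi_awi*1e3:.1f} mV, "
      f"Boltzmann factor {factor:.2f}")
```

With these numbers the negative surface potential partially carries across the film and suppresses the anion's interfacial concentration, which is the qualitative mechanism behind the deviation the model quantifies.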
Three-dimensional voxel models are widely applied in fields such as 3D imaging, industrial design, and medical imaging. Advances in 3D modeling techniques and measurement devices have made generating three-dimensional models more convenient, and the resulting exponential increase in the number of 3D models presents a significant challenge for model retrieval. These models are typically represented as point clouds or meshes, resulting in sparse data and high feature dimensions within the retrieval database, and traditional methods for 3D model retrieval suffer from high computational complexity and slow retrieval speeds. To address this issue, this paper combines space-filling curves with octree structures and proposes a novel approach for representing the sequence-data features of three-dimensional voxel models, along with a similarity measurement method based on symbolic operators. This approach enables rapid dimensionality reduction of the three-dimensional model database and efficient similarity calculations, thereby expediting retrieval.
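As one concrete way a space-filling curve can linearize an octree-indexed voxel grid, here is a generic Morton (Z-order) encoding sketch; the paper's specific curve and symbolic operators are not detailed here, so this is an illustration of the general technique rather than the proposed method.

```python
# Generic sketch: Morton (Z-order) keys linearize voxel coordinates so that
# each octree cell becomes a contiguous key range. This is one common choice
# of space-filling curve.

def part1by2(n: int) -> int:
    """Spread the bits of a 10-bit integer with two zero bits between them."""
    n &= 0x3FF
    n = (n | (n << 16)) & 0xFF0000FF
    n = (n | (n << 8)) & 0x0300F00F
    n = (n | (n << 4)) & 0x030C30C3
    n = (n | (n << 2)) & 0x09249249
    return n

def morton3(x: int, y: int, z: int) -> int:
    """Interleave x, y, z bits into a single Z-order key."""
    return part1by2(x) | (part1by2(y) << 1) | (part1by2(z) << 2)

# Sorting occupied voxels by Morton key yields the octree's depth-first leaf
# order, turning a sparse 3-D occupancy set into a 1-D sequence of features.
voxels = [(1, 2, 3), (0, 0, 0), (4, 4, 4), (1, 2, 2)]
sequence = sorted(voxels, key=lambda v: morton3(*v))
print([morton3(*v) for v in sequence])
```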
Thibault Clérice, Juliette Janes, Hugo Scheithauer, et al.
We present a novel, open-access dataset designed for semantic layout analysis, built to support document recreation workflows through mapping with the Text Encoding Initiative (TEI) standard. This dataset includes 7,254 annotated pages spanning a large temporal range (1600-2024) of digitised and born-digital materials across diverse document types (magazines, papers from sciences and humanities, PhD theses, monographs, plays, administrative reports, etc.) sorted into modular subsets. By incorporating content from different periods and genres, it addresses varying layout complexities and historical changes in document structure. The modular design allows domain-specific configurations. We evaluate object detection models on this dataset, examining the impact of input size and subset-based training. Results show that a 1280-pixel input size is optimal for YOLO, and that subsets are generally better served by being incorporated into a single generic model than by fine-tuning pre-trained weights separately.
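For illustration, the input-size finding translates into a single training parameter in, e.g., the Ultralytics YOLO API; the dataset YAML path and the checkpoint below are placeholders, not files shipped with the dataset.

```python
# Hypothetical training sketch using the Ultralytics YOLO API; the data
# YAML and checkpoint name are placeholders for the reader's own setup.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")           # any detection checkpoint works here
model.train(
    data="layout_dataset.yaml",      # placeholder: classes + train/val splits
    imgsz=1280,                      # the input size the evaluation found optimal
    epochs=100,
)
```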
Segregation is a widely recognised phenomenon with profound implications for societies worldwide. From political science and gender studies to anthropology and urban studies, it has garnered considerable attention across numerous scientific fields due to its multifaceted nature. However, what makes segregation such a far-reaching phenomenon? In fact, how many forms of segregation exist? Have the different disciplines engaged in segregation research uncovered all its facets? This article systematically explores the landscape of segregation research spanning more than a century. We analyzed 10,754 documents from the Scopus database to trace how forms of segregation have been discovered, and report several findings. We identify (1) the exponential growth and increasing diversification of segregation forms, driven by combinatorial and exploratory work and increasing transdisciplinarity and intersectionality in research; (2) the evolution and structure of the field in hierarchies and clusters of segregation forms, revealing trends, persistence, and shifts over time; (3) the timing and geographical distribution of first publications on segregation forms, along with contextual variations across world regions and countries; (4) path dependencies in the historical and geographical shaping of segregation research; and (5) the structure of knowledge production. To contribute semantic organization to an increasingly complex field, we build on these findings to introduce a bottom-up ontology of segregation, marking the first comprehensive effort of its kind.
Selecting urban regions for metro network expansion to meet maximal transportation demand is crucial for urban development, yet computationally challenging. The expansion process relies not only on complicated features like urban demographics and origin-destination (OD) flow but is also constrained by the existing metro network and urban geography. In this paper, we introduce a reinforcement learning framework that addresses the problem as a Markov decision process on an urban heterogeneous multi-graph. Our approach employs an attentive policy network that selects nodes based on information captured by a graph neural network. Experiments on real-world urban data demonstrate that our proposed methodology improves the satisfied transportation demand by over 30\% compared with state-of-the-art methods. Code is published at https://github.com/tsinghua-fib-lab/MetroGNN.
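The repository above holds the authors' implementation; as a generic sketch of the idea, the snippet below scores candidate nodes with an attention query computed from GNN-style embeddings and masks infeasible regions. The shapes, the single message-passing step, and all toy inputs are assumptions for illustration.

```python
# Generic sketch of an attentive node-selection policy over GNN embeddings;
# see the MetroGNN repository for the authors' actual implementation.
import torch
import torch.nn as nn

class AttentivePolicy(nn.Module):
    def __init__(self, in_dim: int, hid: int = 64):
        super().__init__()
        self.enc = nn.Linear(in_dim, hid)   # node feature encoder
        self.msg = nn.Linear(hid, hid)      # one message-passing step
        self.q = nn.Linear(hid, hid)        # query from the current state node
        self.k = nn.Linear(hid, hid)        # keys from candidate nodes

    def forward(self, x, adj, state_idx, feasible):
        h = torch.relu(self.enc(x))             # [N, hid]
        h = torch.relu(h + adj @ self.msg(h))   # aggregate neighbour messages
        query = self.q(h[state_idx])            # summary of the current network
        scores = self.k(h) @ query / h.size(-1) ** 0.5
        scores = scores.masked_fill(~feasible, float("-inf"))
        return torch.softmax(scores, dim=0)     # action distribution over nodes

# Toy usage: 5 regions, pick the next expansion site among feasible ones.
x = torch.randn(5, 8)                    # region features (e.g., OD flow)
adj = (torch.rand(5, 5) > 0.5).float()   # toy adjacency of the multi-graph
feasible = torch.tensor([True, True, False, True, False])
policy = AttentivePolicy(in_dim=8)
print(policy(x, adj, state_idx=0, feasible=feasible))
```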
Sung Youn Boo, Steffen Allan Shelley, Seung-Ho Shin, et al.
There has been growing interest recently in hybrid installations that co-locate offshore wind farms and aquaculture farms to optimize ocean space use. Marine farms beyond coastal or sheltered areas require mooring to ensure station-keeping during storms. In the present work, a sub-surface longline farm is installed within a fixed offshore wind farm at a distance from the wind foundations. The farm is designed to cultivate oysters in multi-compartment bags attached vertically to the longlines. The farm, with a cultivation area of 200 m × 200 m, is supported by polypropylene farm lines and buoys, and is moored with catenary mooring arrangements. Drag coefficients of a full-scale oyster bag in waves and current are determined using the results of wave basin tests. A lumped model is developed and validated against a complete model of a partial farm. The lumped model is then used to simulate the coupled responses of the whole farm under site extreme waves and currents with a 50-year return period. The strength and fatigue designs of the mooring and farm lines are evaluated against industry standards and confirmed to comply with the design requirements.
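To indicate how wave-basin drag coefficients enter a lumped-line simulation, here is a minimal sketch of the quadratic drag law typically applied per lumped element; the Cd, projected area, and flow velocity are placeholder values, not the calibrated coefficients from the tests.

```python
# Minimal sketch of the quadratic drag law used to load lumped line
# elements; all numbers below are illustrative placeholders.
RHO_SEA = 1025.0  # seawater density, kg/m^3

def drag_force(cd: float, area: float, u_rel: float) -> float:
    """Drag on one lumped element: F = 0.5 * rho * Cd * A * u|u|."""
    return 0.5 * RHO_SEA * cd * area * u_rel * abs(u_rel)

# One oyster-bag element in a 1.2 m/s relative flow (illustrative numbers).
print(f"{drag_force(cd=1.5, area=0.6, u_rel=1.2):.1f} N")
```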
Waktole Mosisa, Nigussie Dechassa, Kibebew Kibret, et al.
Maize (Zea mays L.) is an important food security crop in Ethiopia. However, low soil fertility and haphazard nitrogen (N) fertilization, with little attention to the rate and timing of N application, constrain productivity. Therefore, field experiments were conducted during the 2019 and 2020 cropping seasons to investigate the response of maize to different N application rates and timings. The treatments consisted of six N fertilizer rates (0, 23, 46, 69, 92, and 115 kg N ha⁻¹) and four application timings (all at vegetative stages; one-half at sowing + one-half at vegetative stages; one-third at sowing + one-third at vegetative stages + one-third at tasseling; one-fourth at sowing + two-fourths at vegetative stages + one-fourth at tasseling). The experiments were laid out in a randomized block design in a factorial arrangement with three replications. The results revealed that ears per plant, ear length, grains per row, grains per ear, stover yield, and grain yield were significantly (p ≤ .001) influenced by the interaction of N application rate and timing. The highest stover yield (9.99 t ha⁻¹) and grain yield (9.41 t ha⁻¹) were obtained from the application of 92 kg N ha⁻¹ in three splits of one-fourth at sowing, two-fourths at vegetative stages, and one-fourth at tasseling. Therefore, it is concluded that 92 kg N ha⁻¹ in three splits of one-fourth at sowing, two-fourths at vegetative stages, and one-fourth at tasseling was the most economical in the study area.
Gia Dvali, Juan Valbuena-Bermudez, Michael Zantedeschi
In this work, we study the annihilation of a pair of 't Hooft-Polyakov monopoles due to confinement by a string. We analyze the regime in which the scales of the monopoles and the string are comparable. We compute the spectrum of the emitted gravitational waves and find it to agree with the previously calculated point-like case for wavelengths longer than the system width and before the collision. However, we observe that in a head-on collision the monopoles are never re-created; correspondingly, the string does not oscillate even once. Instead, the system decays into waves of Higgs and gauge fields. We explain this phenomenon by the loss of coherence in the annihilation process: the resulting entropy suppression makes the re-creation of a monopole pair highly improbable. We argue that in a similar regime, analogous behaviour is expected for heavy quarks connected by a QCD string. There too, instead of re-stretching a long string after the first collapse, the system hadronizes and decays into a high multiplicity of mesons and glueballs. We discuss the implications of our results.
Firm clusters are seen as having a positive effect on innovation, which can be interpreted in terms of economies of scale or knowledge spillovers. The processes underlying the success of these clusters nonetheless remain difficult to isolate. We propose in this paper a stylised agent-based model to test the role of geographical proximity and informal knowledge exchange between firms in the emergence of innovations. The model is run on synthetic firm clusters. Sensitivity analysis and systematic model exploration unveil a strong impact of interaction distance on innovations, with a qualitative shift when spatial interactions are more intense. Bi-objective optimisation of the model shows a compromise between innovation and product diversity, suggesting trade-offs for clusters in practice. This model thus provides a first basis to systematically explore the interplay between firm cluster geography and innovation, from an evolutionary perspective.
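To make the core mechanism tangible, here is a minimal sketch in which firms exchange knowledge only within an interaction distance and an innovation fires when the accumulated stock crosses a threshold. The update rule and every parameter value are illustrative assumptions, not the paper's calibrated model.

```python
# Minimal sketch of distance-limited informal knowledge exchange; all
# parameters and the threshold rule are illustrative assumptions.
import math
import random

random.seed(0)
N, STEPS, D_MAX, THRESHOLD = 50, 200, 0.2, 5.0

firms = [{"pos": (random.random(), random.random()),
          "k": random.random(), "innovations": 0} for _ in range(N)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

for _ in range(STEPS):
    for f in firms:
        # Informal exchange: absorb a fraction of nearby firms' knowledge.
        neighbours = [g for g in firms
                      if g is not f and dist(f["pos"], g["pos"]) < D_MAX]
        f["k"] += 0.05 * sum(g["k"] for g in neighbours)
        if f["k"] > THRESHOLD:      # knowledge crosses the threshold:
            f["innovations"] += 1   # an innovation occurs and the
            f["k"] = 0.0            # firm's stock resets

print("total innovations:", sum(f["innovations"] for f in firms))
```

Varying D_MAX in such a sketch is the kind of experiment that exposes the qualitative shift the abstract reports when spatial interactions intensify.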
Apazhev Aslan, Shekikhachev Yuri, Batyrov Vladimir, et al.
One of the significant problems in ensuring the reliable operation of injectors is the intensive coking of the injector spray nozzle ports. Based on the assumption that all fuel left under the needle after injection burns, some researchers attribute coking to insufficient emptying of this volume. There is also a well-known opinion about the impact of atomization quality in the final phase of injection. The lack of consensus and conflicting recommendations on the issue make the research relevant. A set of investigations has been carried out at OJSC "TsNITA" to study the influence of various factors, including design factors of the fuel system, on the coking of injector spray nozzle ports. This article describes results obtained from test materials for 24 variants of fuel systems on the D-240 engine. The parameters of the final injection phase are analyzed as a function of the combination of design factors, and the relationship between coking of the injector spray nozzle ports and these final-phase parameters is shown.
The influence of the anti-religious campaign of the Soviet government on the confessional life of the Roman Catholic Church is examined in the article, based on the study of specialized literature and a comprehensive analysis of archival sources. Changes that took place in the Roman Catholic denomination under pressure from the Soviet authorities, in particular the reduction of its network in the western regions of Ukraine in 1958–1964, are analyzed.
Annotating images for semantic segmentation requires intense manual labor and is a time-consuming and expensive task, especially for domains with a scarcity of experts, such as forensic anthropology. We leverage the evolving nature of images depicting the decay process in human decomposition data to design a simple yet effective pseudo-pixel-level label generation technique that reduces the manual annotation effort for such images. We first identify sequences of images with minimal variation that are most suitable to share the same or similar annotation, using an unsupervised approach. Given one user-annotated image in each sequence, we propagate the annotation to the remaining images in the sequence by merging it with annotations produced by a state-of-the-art CAM-based pseudo-label generation technique. To evaluate the quality of our pseudo-pixel-level labels, we train two semantic segmentation models with VGG and ResNet backbones on images labeled using our pseudo-labeling method and those of a state-of-the-art method. The results indicate that using our pseudo-labels instead of those generated by the state-of-the-art method in the training process improves the mean-IoU and the frequency-weighted IoU of the VGG- and ResNet-based semantic segmentation models by 3.36%, 2.58%, 10.39%, and 12.91%, respectively.
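The propagation idea can be sketched generically: select frames whose variation from the annotated frame is small, then merge the user's mask with a CAM-derived mask for each of them. The union merge, the CAM threshold, and the variation tolerance below are illustrative assumptions, not the paper's exact procedure.

```python
# Generic sketch of low-variation sequence selection and mask propagation;
# the merge rule and thresholds are illustrative assumptions.
import numpy as np

def low_variation(frames: np.ndarray, ref: int, tol: float) -> list[int]:
    """Indices of frames close (in mean abs difference) to the ref frame."""
    return [i for i, f in enumerate(frames)
            if np.abs(f.astype(float) - frames[ref].astype(float)).mean() < tol]

def propagate(user_mask: np.ndarray, cam: np.ndarray) -> np.ndarray:
    """Merge the user's mask with a thresholded CAM heatmap (union)."""
    return user_mask | (cam > 0.5)

frames = np.random.randint(0, 255, size=(10, 64, 64), dtype=np.uint8)
user_mask = np.zeros((64, 64), dtype=bool)
user_mask[20:40, 20:40] = True           # the one user-annotated region
cam = np.random.rand(64, 64)             # stand-in for a CAM heatmap

for i in low_variation(frames, ref=0, tol=90.0):
    pseudo = propagate(user_mask, cam)   # pseudo-label reused for frame i
print(pseudo.sum(), "foreground pixels in the propagated pseudo-label")
```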
Card shuffling models have provided simple motivating examples for the mathematical theory of mixing times for Markov chains. As a complement, we introduce a more intricate, realistic model of a certain observable real-world scheme for mixing human players onto teams. We quantify numerically the effectiveness of this mixing scheme over the 7 or 8 steps performed in practice. We give a combinatorial proof of the non-trivial fact that the chain is indeed irreducible.
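The team-assignment scheme itself is not specified in the abstract, so as a generic illustration of quantifying mixing over a handful of steps, the sketch below runs a simple random-transposition shuffle on 5 cards and estimates the total variation distance to the uniform distribution after each step by Monte Carlo.

```python
# Generic illustration (not the paper's team-assignment chain): random
# transpositions on 5 cards, with TV distance to uniform estimated by
# Monte Carlo after each of 8 steps.
import itertools
import random
from collections import Counter

random.seed(1)
N, TRIALS, STEPS = 5, 200_000, 8
perms = list(itertools.permutations(range(N)))
uniform = 1 / len(perms)

counts = [Counter() for _ in range(STEPS + 1)]
for _ in range(TRIALS):
    deck = list(range(N))
    counts[0][tuple(deck)] += 1
    for step in range(1, STEPS + 1):
        i, j = random.randrange(N), random.randrange(N)
        deck[i], deck[j] = deck[j], deck[i]   # one random transposition
        counts[step][tuple(deck)] += 1

for step, c in enumerate(counts):
    tv = 0.5 * sum(abs(c[p] / TRIALS - uniform) for p in perms)
    print(f"step {step}: estimated TV distance {tv:.3f}")
```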