David Lewis (1941-2001) was Class of 1943 University Professor of Philosophy at Princeton University. His contributions spanned philosophical logic, philosophy of language, philosophy of mind, philosophy of science, metaphysics, and epistemology. In On the Plurality of Worlds, he defended his challenging metaphysical position, "modal realism." He was also the author of the books Convention, Counterfactuals, Parts of Classes, and several volumes of collected papers.
The term 'algorithmic fairness' is used to evaluate whether AI models operate fairly in both comparative (where fairness is understood as formal equality, such as "treat like cases as like") and non-comparative (where unfairness arises from the model's inaccuracy, arbitrariness, or inscrutability) contexts. Recent advances in multimodal large language models (MLLMs) are breaking new ground in multimodal understanding, reasoning, and generation; however, we argue that inconspicuous distortions arising from complex multimodal interaction dynamics can lead to systematic bias. The purpose of this position paper is twofold: first, to acquaint AI researchers with phenomenologically grounded explainability approaches that rely on the physical entities the machine experiences during training/inference, as opposed to the traditional cognitivist symbolic account or metaphysical approaches; second, to argue that this phenomenological doctrine will be practically useful for tackling algorithmic fairness issues in MLLMs. We develop a surrogate physics-based model that describes transformer dynamics (i.e., semantic network structure and self-/cross-attention) to analyze the dynamics of cross-modal bias in MLLMs, dynamics that are not fully captured by conventional embedding- or representation-level analyses. We support this position through multi-input diagnostic experiments: 1) perturbation-based analyses of emotion classification using Qwen2.5-Omni and Gemma 3n, and 2) dynamical analysis of Lorenz chaotic time-series prediction through the physical surrogate. Across two architecturally distinct MLLMs, we show that multimodal inputs can reinforce modality dominance rather than mitigate it, as revealed by structured error-attractor patterns under systematic label perturbation, complemented by dynamical analysis.
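The abstract above uses Lorenz chaotic time-series prediction as a dynamical diagnostic. As a minimal sketch of the kind of input data involved (the standard Lorenz-63 parameters and the forward-Euler integration scheme are our own assumptions, not details taken from the paper):

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system (assumed standard parameters)."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return np.array([x + dt * dx, y + dt * dy, z + dt * dz])

def lorenz_series(n_steps=1000, state=(1.0, 1.0, 1.0)):
    """Generate a chaotic trajectory usable as a time-series prediction target."""
    traj = [np.array(state, dtype=float)]
    for _ in range(n_steps):
        traj.append(lorenz_step(traj[-1]))
    return np.array(traj)  # shape: (n_steps + 1, 3)

series = lorenz_series()
```

A model's prediction errors on such a trajectory can then be inspected for the structured error-attractor patterns the paper describes; the specific surrogate dynamics are not reconstructed here.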
This paper argues that the traditional opposition between determinism and indeterminism in physics is representational rather than ontological. Deterministic-stochastic dualities are available in principle, and arise in a non-contrived way in many scientifically important models. When dynamical systems admit mathematically equivalent deterministic and stochastic formulations, their observable predictions depend only on the induced structure of correlations between preparations and measurement outcomes. I use this model-equivalence to motivate a model-invariance criterion for ontological commitment, according to which only structural features that remain stable across empirically equivalent representations, and whose physical effects are invariant under such reformulations, are candidates for realism. This yields a fallibilist form of structural realism grounded in modal robustness rather than in the specifics of any given mathematical representation. Features such as conservation laws, symmetries, and causal or metric structure satisfy this criterion and can be encoded in observable relations in mathematically intelligible ways. By contrast, the localisation of modal selection -- whether in initial conditions, stochastic outcomes, or informational collapse mechanisms -- is not invariant under empirically equivalent reformulations and is therefore best understood as a gauge choice rather than an ontological feature. The resulting framework explains how certain long-standing problems in the foundations of physics, including the measurement problem and the perceived conflict between physical determinism and free agency, arise from the reification of representational artefacts. By distinguishing model-invariant structure from modelling conventions, I offer a realist ontology for modern physics that combines empirical openness with resistance to metaphysical overreach.
This paper investigates whether contemporary AI architectures employing deep recursion, meta-learning, and self-referential mechanisms provide evidence of machine consciousness. Integrating philosophical history, cognitive science, and AI engineering, it situates recursive algorithms within a lineage spanning Cartesian dualism, Husserlian intentionality, Integrated Information Theory, the Global Workspace model, and enactivist perspectives. The argument proceeds through textual analysis, comparative architecture review, and synthesis of neuroscience findings on integration and prediction. Methodologically, the study combines conceptual analysis, case studies, and normative risk assessment informed by phenomenology and embodied cognition. Technical examples, including transformer self-attention, meta-cognitive agents, and neuromorphic chips, illustrate how functional self-modeling can arise without subjective experience. By distinguishing functional from phenomenal consciousness, the paper argues that symbol grounding, embodiment, and affective qualia remain unresolved barriers to attributing sentience to current AI. Ethical analysis explores risks of premature anthropomorphism versus neglect of future sentient systems; legal implications include personhood, liability, authorship, and labor impacts. Future directions include quantum architectures, embodied robotics, unsupervised world modeling, and empirical tests for non-biological phenomenality. The study reframes the "hard problem" as a graded and increasingly testable phenomenon, rather than a metaphysical impasse. It concludes that recursive self-referential design enhances capability but does not entail consciousness or justify moral status.
Keywords: Recursive algorithms; self-reference; machine consciousness; AI ethics; AI consciousness
Andrew T. McKenzie, Michael Cerullo, Navid Farahani, et al.
Biostasis has the potential to extend human lives by offering a bridge to powerful life extension technologies that may be developed in the future. However, key questions in the field remain unresolved, including which biomarkers reliably indicate successful preservation, what technical obstacles pose the greatest barriers, and whether different proposed revival methods are theoretically feasible. To address these gaps, we conducted a collaborative forecasting exercise with 22 practitioners in biostasis, including individuals with expertise in neuroscience, cryobiology, and clinical care. Our results reveal substantial consensus in some areas, for example that synaptic connectivity can serve as a reliable surrogate biomarker for information preservation quality. Practitioners identified three most likely failure modes in contemporary biostasis: inadequate preservation quality even under ideal conditions, geographic barriers preventing timely preservation, and poor procedural execution. Regarding revival strategies, most respondents believe that provably reversible cryopreservation of whole mammalian organisms is most likely decades away, with provably reversible human cryopreservation expected even later, if it is ever achieved. Among revival strategies from contemporary preservation methods, whole brain emulation was considered the most likely to be developed first, though respondents were divided on the metaphysical question of whether it could constitute genuine revival. Molecular nanotechnology was viewed as nearly as likely to be technically feasible, and compatible with both pure cryopreservation and aldehyde-based methods. Taken together, these findings delineate current barriers to high-quality preservation, identify future research priorities, and provide baseline estimates for key areas of uncertainty.
Hetero-masculine violence in South Africa continues to be an obstacle to peace. As crime statistics in South Africa indicate, heterosexual women, heterosexual men and members of the LGBTQI+ community have been victims of murder, and many continue to be victims of sexual violence as a result of hetero-masculine violence. While some Christian confessional traditions in South Africa have evolved and become more welcoming to the LGBTQI+ community, this evolution has made little difference to the public violence to which the LGBTQI+ community is exposed in South Africa. Using the intersections of Caputo’s radical ethics and Meiring’s body theology, I propose a new theological framework to assist and encourage confessional ecclesiological traditions in South Africa to deal with their own internal contradictions, shaped as they are by masculinist heterosexist discourse. This is an attempt to contribute meaningfully to the discourse on violence experienced by the LGBTQI+ community in South Africa. This article contends that the intersections of body theology and radical ethics assist ecclesiological traditions in recognising and embracing the fragility of metaphysics even in the face of discomfort. I argue that the exercise of continuously embracing the fragility of metaphysics helps ecclesiological traditions remain open to their own flaws. This gives them an authentic voice with which to constantly reconstruct themselves and to speak out effectively against the rejection of, and violence perpetrated against, the LGBTQI+ community in South Africa.
Intradisciplinary and/or interdisciplinary implications: This paper draws on conversations from Christian theology, ethics and their engagement with LGBTQI+ and public homophobic violence in South Africa.
This paper introduces a powerful new tool for topological redescription: the ISE Methodology. These techniques allow us to remove and replace a theory's topological underpinnings just as easily as we can switch between different coordinate systems. Aspirationally, these novel topological redescription techniques can be used to provide new support for a roughly Kantian view of space and time: rather than corresponding to any fundamental substances or relations, the spacetime manifolds which appear in our theories can be seen as merely an aspect of how we represent the world. This view of spacetime topology parallels the dynamics-first view of geometry as well as a Humean view of laws: the spacetime manifolds which feature in our best theories reflect nothing metaphysically substantial in the world beyond their being one particularly nice way (among others) of codifying the dynamical behavior of matter. A parallel publication (namely, Grimmer (2023)) explicitly characterizes the power and scope of the topological redescription techniques offered to us by the ISE Methodology. The modest goal of this paper is simply to introduce the ISE Methodology by applying it to two example theories. First, to familiarize ourselves with these techniques, I show how they can be used to redescribe a spacetime theory via a Fourier transform. Second, I show how the exact same techniques can be used to redescribe a lattice theory (i.e., a theory set on a discrete spacetime, M = R×Z) as existing on a continuous spacetime manifold, M = R×R.
Isaac Newton, in popular imagination the Ur-scientist, was also an outstanding humanist scholar. His researches on, among other topics, ancient philosophy are thorough and appear to be connected to, and to fit within, his larger philosophical and theological agenda. It is therefore relevant to take a closer look at Newton's intellectual choices: at how and why precisely he occupied himself with specific textual sources, and how this interest fits into the larger picture of his scientific and intellectual endeavours. In what follows, we shall follow Newton into his study and look over his shoulder as he reads compendia and original source-texts in his personal library at Cambridge, meticulously investigating and comparing fragments and commentaries, and carefully keeping track in private notes of how they support his own developing ideas. Indeed, Newton was convinced that precursors to his own insights and discoveries were already present in Antiquity, even before the Greeks, in ancient Egypt, and he put a great deal of time and effort into making this point, especially, and not incidentally, in the period between the first and second editions of the Principia. A clear understanding of his reading of the classical sources therefore matters to our understanding of that work's content and gestation. In what follows we confine ourselves to the classical legacy and investigate Newton's intellectual engagement with it.
Is it possible to draw a borderline between ontology and epistemology? A positive answer to this question looks attractive, mainly because it reflects convictions deeply entrenched in our common-sense view of the world. However, anyone wishing to clarify the distinction between the ontological and the epistemological dimensions meets problems, because the separation between the factual and the conceptual is not clean, but rather fuzzy. It is certainly correct to state that science aims to offer correct information about the world, but the extent to which it succeeds in this task is always questionable. We cannot claim that the picture provided by today's science - our current scientific image of the world - is absolutely correct, because the history of science itself shows that any such claim is likely to be rejected by future generations. While it may be granted that science purports to offer a correct description of the real world, past experience should also prompt us to accept its claims sub condicione, and to view them as merely provisional.
After the completion of I. Kant’s “Copernican” turn in metaphysics, all subsequent European philosophy stood, to one degree or another, under his influence. The purpose of the article is to consider the reception and transformation of Kantian theoretical philosophy by the Marburg school of neo-Kantianism, and to analyze the reasons behind H. Cohen’s and P. Natorp’s interpretation of Kant’s criticism. To do this, one should consider both (i) internalist and (ii) externalist factors in the formation of the Marburg School. Neo-Kantianism, on the one hand, emerged as a response to materialism, naturalism, and post-Kantian German idealism; on the other hand, the Marburg School was strongly influenced by the change of scientific paradigm in mathematical natural science at the end of the 19th century. The Marburgers’ justification of Kant’s doctrine of the a priori presupposed the thematization, first of all, of: a) the purity of thought; b) the systematic unity of thought and experience; c) the orientation of philosophy toward the “fact of science”; d) the transcendental method. As a result, the Marburg School reinterpreted the Kantian concept of the unity of consciousness; abandoned the principle of the synthetic (real) unity of consciousness in favor of systematic (logical) unity; substantiated the purity of scientific thinking; put forward the requirement that philosophy be oriented toward the “fact of science”; developed the concept of the origin of thinking (Ursprung); abandoned the idea of the “givenness” of the object of knowledge in favor of its being “set as a task”; changed the understanding of the essence and functions of the transcendental method; put forward a conception of thinking as “generation” (“production”); and formulated a new understanding of the a priori. The changes that took place in the 19th century in philosophy, mathematical natural science and the mathematical sciences led to a sharp activation of constructivism. It can be concluded that Kant’s epistemological paradigm was realistic constructivism, whereas pure constructivism became the paradigm of the Marburg School.
Quantum nonseparability is a central feature of quantum mechanics, and raises important philosophical questions. Interestingly, a particular theoretical development of quantum mechanics, called the process matrix formalism (PMF), features another kind of nonseparability, called causal nonseparability. The PMF appeals to the notion of a quantum process, a generalisation of the concept of a quantum state that makes it possible to represent quantum-like correlations between quantum events across multiple parties without specifying their spatiotemporal locations a priori. Crucially, since the PMF makes no assumption about the global causal structure between quantum events, it allows for the existence of causally nonseparable quantum processes having an indefinite causal structure. This work investigates the philosophical implications of causal nonseparability, especially for the notion of spatiotemporal relations. A preliminary discussion studies the formal connection between quantum and causal nonseparability. It will be emphasised that, although quantum processes can be seen as a generalisation of density matrices, the conceptual distinction between the two notions yields significant differences between quantum and causal nonseparability. From there, it will be shown that, depending on the interpretative framework, causal nonseparability suggests some kind of indeterminacy of spatiotemporal relations. Namely, within a realist context, spatiotemporal relations can be epistemically or metaphysically indeterminate. Finally, it will be argued that, in spite of the disanalogies between standard and causal nonseparability, similar implications for spatial relations can already be defended in the context of standard quantum mechanics. This work highlights the potential fruitfulness of exploring the implications of quantum features for our conception of spacetime.
Unity is a central concept in the Critique of Pure Reason, since it is only through the unifying act of our spontaneous faculties that an experience can emerge, according to Kant. However, the faculty of reason brings forth a different unity from that of the understanding: Kant characterizes the former as a collective unity and the latter as a distributive unity. This article aims to explain the meaning of these terms, with reference to the Nachlass on metaphysics and the writings on right, where Kant employs them in a clearer manner. This explanation can provide a basis for understanding the difference between the faculty of understanding and the faculty of reason within the first Critique, a difference rather neglected by scholars, who have focused mainly on Kant’s distinction between sensibility and understanding.
This paper discusses the active role of the prophet within divine providence, namely her understanding of the prophetic message and her use of prophecy. I focus on Aquinas’ account of prophecy and I adopt two methods: the phenomenological method that describes the experience of prophecy and the metaphysical method that starts from the divine attribute of goodness and works through the order of divine providence. In Aquinas’ view, prophecy is a personal mission that the prophet receives to fulfill God’s plan for humankind. This mission involves the prophet’s mental operations and practical engagement. I start with the metaphysics of providence and then describe the prophetic experience. Finally, I address the issue of judgment in the understanding of the prophetic message and the use of prophecy.
What exists at the fundamental level of reality? On the standard picture, the fundamental reality contains (among other things) fundamental matter, such as particles, fields, or even the quantum state. Non-fundamental facts are explained by facts about fundamental matter, at least in part. In this paper, I introduce a non-standard picture called the "cosmic void" in which the universe is devoid of any fundamental material ontology. Facts about tables and chairs are recovered from a special kind of laws that satisfy strong determinism. All non-fundamental facts are completely explained by nomic facts. I discuss a concrete example of this picture in a strongly deterministic version of the many-worlds theory of quantum mechanics. I discuss some philosophical and scientific challenges to this view, as well as some connections to ontological nihilism.
To a certain extent, the Principles of Nature and Grace (PNG), whose title alone already recalls Malebranche, still seem to be animated by the Leibnizian imitation of the ‘Malebranche-Limbo’ (the descending and ascending movement from God to creation, then horizontally to creatures, and then back up to souls, spirits and the kingdom of grace). Here we will highlight, on the contrary, the main aspects by which Leibniz diverges from it in the PNG and in fact abandons it altogether. We will also analyze the meaning of the bipartition of the Leibnizian text into two sections, ‘physics’ and ‘metaphysics’, against the background of Leibniz’s classification of the sciences and his theory of substance, and from there we will move on to consider the different levels of perception that Leibniz’s substances present.
Franjo Marković (1845-1914) was the first professor of philosophy at the restored University of Zagreb (1874). The manuscript of his Logic is kept at the Archives of the Croatian Academy of Sciences and Arts, and consists of the autograph indexed as XV 37/1 and six lithographed copies (a-f), the most extensive of which is indexed as 2a (approximately 820 pages). The manuscript Logic can be said to consist of two parts: the first is an introduction and the second is entitled The System of Logic. The first part is further divided into seven sections, while the second part includes eleven sections. This article discusses the sixth section of the first (introductory) part, entitled Reasons against Absolute Logic (in 2a pp. 94-119), in which Marković sets out his criticism of Hegel’s logic. First, the manuscript Logic is described, and then, Marković’s two substantial objections to Hegel are considered: first, that it is not possible, starting from one concept (the most abstract one), which would be the beginning of all other concepts, to develop the wholeness of concepts (and at the same time the totality of the whole of everything that is) without the aid of thoughts acquired by perception; and second, that the procedure of absolute logic is in itself »illogical«, i.e. contrary to the irrefutable laws of logic. Finally, it is concluded that Marković’s criticism of Hegel’s logic, which is actually metaphysics, is made exclusively from a logical viewpoint, as he does not accept Hegel’s »transformation« of logic into metaphysics. Marković’s intent to »outline« his (philosophical) position on Hegel is particularly pointed out, since numerous opponents of Hegel’s philosophising, to whom Marković himself belongs, are generally reluctant to deal with Hegel’s philosophy.
Nietzsche, as one of the most popular philosophers and elite thinkers who shaped modern thought, has been studied from many angles. There are many books about him, and many scholars are still doing research on his life, works, and thought. The plurality of these works and the differences in their representations of Nietzsche have led to doubt and confusion in determining Nietzsche’s position and understanding his thought. The present study aims at introducing and evaluating a major, recent, and important work in the field of Nietzsche studies, “The Oxford Handbook of Nietzsche”. This standard scholarly work comprises thirty-two essays by world-renowned scholars. The essays are organized into six discrete sections covering, among other things, biography, historical relations, principal works, and fundamental issues such as values, epistemology and metaphysics, and the development of the will to power. These essays contain striking and precise points that can influence our understanding of Nietzsche’s philosophy and provide us with a broad knowledge of the main elements of his philosophy, such as the superman, the will to power, and eternal recurrence. This paper, therefore, after an overview of the book and a description of its general features, summarizes the content of each essay in order to arouse the audience’s interest in pursuing a thorough study.
We present a computer-supported approach for the logical analysis and conceptual explicitation of argumentative discourse. Computational hermeneutics harnesses recent progress in automated reasoning for higher-order logics and aims at formalizing natural-language argumentative discourse using flexible combinations of expressive non-classical logics. In doing so, it allows us to render explicit the tacit conceptualizations implicit in argumentative discursive practices. Our approach operates on networks of structured arguments and is iterative and two-layered. At one layer we search for logically correct formalizations of each of the individual arguments. At the next layer we select, among those correct formalizations, the ones which honor the argument's dialectic role, i.e. attacking or supporting other arguments as intended. We operate at these two layers in parallel and continuously rate sentences' formalizations using, primarily, inferential adequacy criteria. An interpretive, logical theory thus gradually evolves. This theory is composed of meaning postulates serving as explications for concepts playing a role in the analyzed arguments. Such a recursive, iterative approach to interpretation does justice to the inherent circularity of understanding: the whole is understood compositionally on the basis of its parts, while each part is understood only in the context of the whole (hermeneutic circle). We briefly discuss previous work on exemplary applications of human-in-the-loop computational hermeneutics in metaphysical discourse. We also discuss some of the main challenges involved in fully automating our approach. By sketching some design ideas and reviewing relevant technologies, we argue for the technological feasibility of a highly automated computational hermeneutics.
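The two-layered selection procedure described above can be sketched schematically. In this minimal illustration (the data structures, field names, and example formulas are our own assumptions, not the paper's implementation; the actual system delegates correctness checks to higher-order theorem provers), layer 1 filters candidate formalizations for logical correctness and layer 2 prefers those honoring the argument's dialectic role:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Formalization:
    formula: str
    logically_correct: bool  # layer 1: formalization renders the argument valid
    honors_role: bool        # layer 2: attacks/supports other arguments as intended

@dataclass
class Argument:
    name: str
    candidates: List[Formalization] = field(default_factory=list)

def select_formalizations(arg: Argument) -> List[Formalization]:
    # Layer 1: keep only logically correct candidates.
    correct = [f for f in arg.candidates if f.logically_correct]
    # Layer 2: among those, prefer candidates honoring the dialectic role;
    # fall back to merely correct ones if none do yet (iteration continues).
    honoring = [f for f in correct if f.honors_role]
    return honoring or correct

arg = Argument("A1", [
    Formalization("P -> Q", True, False),
    Formalization("(P & R) -> Q", True, True),
    Formalization("Q -> P", False, True),
])
best = select_formalizations(arg)
```

In the real workflow both predicates would be recomputed on each pass as the evolving meaning postulates change what counts as correct, which is what makes the procedure hermeneutically circular.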