Value-based approaches such as Value Sensitive Design (VSD) enable technology designers to engage with and integrate human values in technology through a tripartite methodology of conceptual, empirical, and technical investigations. However, VSD has two pitfalls, difficulty in translating values into requirements and a lack of normative grounding, which have led to adaptations such as Jacobs' Capability Sensitive Design (CSD). Inspired by CSD and other extensions of the design approach, we propose the concept of -Sensitive Design (-SD): a meta-framework for embedding various political or ideological values as norms in a design research process. We exemplify this through \emph{Dependency}-Sensitive Design (DSD), which combines ideas from Kittay's critiques of classical liberal theory with a practical VSD framework. Finally, we call for further work combining philosophy and design in areas beyond CSD and DSD.
The profound transformation of every aspect of social existence that began in the last quarter of the twentieth century and continues to this day has changed almost the entire structure of social relations, in both form and substance. Social solidarity is presented as a dynamic, non-guaranteed state that arises in the course of human interaction and requires the constant activity of all participants in the interactive process. The main integrative role is played by joint action and its interpretations, whose convergence ensures the consolidation of society. The influence of solidarity on the reproductive function of the social system is examined at all of its hierarchical levels, from groups engaged in primary social practices to society as a whole. It is argued that the state, or level, of social solidarity determines how harmoniously a society functions and, more generally, that society's life chances and prospects, and that declining solidarity carries a threat of social disintegration. In a normally functioning society, solidarity is an object of unremitting attention both for the ruling group, whose actions are almost inseparable from the goal of reproducing society and its own power, and for other social institutions, whose activities depend directly on the consolidation and cooperation of their members. Accordingly, a state of social relations in which neither the authorities nor other institutions demonstrate adequate concern for the consolidation of the social whole cannot be considered normal. It is shown that, within the post-political consensus, class identity appears as the result of a specific political gesture: a political and discursive construction. Political-discursive analysis is defined as an independent and self-sufficient methodology that enables a new perspective on traditional objects of political inquiry, such as populist movements and ideologies, large-scale social conflicts, the absolutization of ideology, and the ideologization of various societal discourses.
In the 1960s and 1970s, a series of observations and theoretical developments highlighted several anomalies that could, in principle, be explained by one of two working hypotheses: (i) the existence of dark matter, or (ii) a modification of standard gravitational dynamics at low accelerations. In the years that followed, the dark matter hypothesis attracted far more attention than modified gravity as an explanation of this phenomenology, and the latter is today largely regarded as a non-viable alternative. The present article takes an integrated history and philosophy of science approach to identify why the scientific community mainly pursued the dark matter hypothesis rather than modified gravity. A plausible answer is given in terms of three epistemic criteria for the pursuitworthiness of a hypothesis: (a) its problem-solving potential, (b) its compatibility with established theories and the feasibility of incorporation, and (c) its independent testability. A further comparison between the problem of dark matter and the problem of dark energy is also presented, explaining why the situation is different in the latter case, where modified gravity is still considered a viable possibility.
Kateřina Hlaváčková-Schindler, Rainer Wöß, Vera Pecorino, et al.
Little has been written about the role of triggers in the literature on causal reasoning, causal modeling, or philosophy. In this paper, we focus on describing triggers and causes in the metaphysical sense and on characterizations that differentiate them from each other, and we carry out a philosophical analysis of these differences. From this, we formulate a definition that clearly differentiates triggers from causes and can be used for causal reasoning in the natural sciences. We propose a mathematical model and the Cause-Trigger algorithm, which, given data on observable processes, can determine whether a process is a cause or a trigger of an effect. The ability to distinguish triggers from causes directly from data makes the algorithm a useful tool in natural sciences that use observational data, as well as in real-world scenarios. For example, knowing the processes that trigger the causes of a tropical storm could give politicians time to develop actions such as evacuating the population. Similarly, knowing the triggers of processes that cause global warming could help politicians focus on effective actions. We demonstrate our algorithm on climatological data from two recent cyclones, Freddy and Zazu. The Cause-Trigger algorithm detects processes that trigger high wind speed in both storms during their cyclogenesis, and the findings agree with expert knowledge.
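As a loose, hypothetical illustration only (the authors' mathematical model and Cause-Trigger algorithm are not reproduced here), one way to operationalize the distinction from data is to ask whether a process's magnitude co-varies with the effect (cause-like behaviour) or whether only its crossing of an onset threshold predicts the effect (trigger-like behaviour):

\begin{verbatim}
import numpy as np

def classify_cause_or_trigger(x, y, onset_threshold, corr_cut=0.5):
    """Hypothetical heuristic, NOT the paper's Cause-Trigger algorithm.

    x, y: 1-D arrays of simultaneous observations of a candidate
    process and the effect. If the magnitude of x co-varies with y,
    label x a 'cause'; if only the event "x exceeds onset_threshold"
    raises y, label x a 'trigger'.
    """
    magnitude_corr = np.corrcoef(x, y)[0, 1]
    onset = x > onset_threshold
    if onset.all() or (~onset).all():
        onset_effect = 0.0  # threshold never/always crossed: no contrast
    else:
        onset_effect = y[onset].mean() - y[~onset].mean()
    if abs(magnitude_corr) >= corr_cut:
        return "cause"
    return "trigger" if onset_effect > 0 else "neither"
\end{verbatim}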
The main aim of this article is to classify and discuss the mechanisms by which post-truth is created in the media space. Using selected examples, I characterize and describe the operation of ten mechanisms that I identified in the course of my research. I divide them into mechanisms arising from actions taken by the sender and those resulting from the sender's omissions. This makes it easier to understand how these techniques are used in the media and shows that the creation of post-truth messages need not be intentional at all. I also compare the term "post-truth" with concepts such as manipulation, deception, and bullshit, seeking their commonalities and differences, which helps clarify the complexity of the problem at the semantic level as well.
In this commentary, Krech's article "Religious Contacts in Past and Present Times: Aspects of a Research Programme", originally published in 2012, is situated within the context of several pioneering works that provided important impulses for a Global History of Religion. Precisely because the article appeared as a provisional interim step, it is made fruitful for reflection on the ongoing challenges of a Global History of Religion. The commentary highlights how central questions formulated at the time persist in current discussions, which are now, however, predominantly characterized by approaches drawn from global and entanglement history, often genealogically oriented. This underscores how the methodological repertoire Krech brought into play can be valuable for thinking about a Global History of Religion.
General relativity and quantum field theory are the cornerstones of our understanding of physical processes, from subatomic to cosmic scales. While both theories work remarkably well in their tested domains, they show minimal overlap. Our research challenges this separation by revealing that non-perturbative effects bridge these distinct domains. We introduce a novel mechanism wherein, at linear order, spin-2 fields around an arbitrary background acquire an \emph{effective mass} due to the spontaneous symmetry breaking (SSB) of either the global or the local symmetry of a complex scalar field minimally coupled to gravity. The action of the spin-2 field is identical to the extended Fierz-Pauli (FP) action with mass deformation parameter $\alpha = 1/2$. We show that this arises from the effect of SSB on the variation of the energy-momentum tensor of the matter field, which dominates during SSB. The extended FP action has a salient feature compared to the standard FP action: it propagates 6 degrees of freedom with no ghosts. For local $U(1)$ SSB, we establish that the effective mass of the spin-2 field is related to the mass of the gauge boson and the electric charge of the complex scalar field. Interestingly, our results indicate that millicharged dark-matter scalar fields, generating dark photons, can produce a spin-2 mass of the same order as the Hubble constant $(H_0)$. Hence, we argue that the dark sector offers a natural explanation for the acceleration of the current Universe.
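For orientation, a commonly used convention (our normalization, which may differ from the paper's) writes the mass-deformed Fierz-Pauli term for a spin-2 perturbation $h_{\mu\nu}$ as \[ \mathcal{L}_{\mathrm{mass}} = -\frac{m^2}{4}\left( h_{\mu\nu} h^{\mu\nu} - \alpha\, h^2 \right), \qquad h \equiv \eta^{\mu\nu} h_{\mu\nu}, \] where the standard FP tuning $\alpha = 1$ propagates 5 ghost-free degrees of freedom, while a generic $\alpha$ adds a sixth scalar mode; the abstract's claim is that the extended action at $\alpha = 1/2$ realizes this sixth mode without a ghost.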
Guided by the Einstein equivalence principle, which identifies gravitation as a manifestation of the dynamics of spacetime rather than a localizable force, we review and explore its consequences for formulating a theory of gravity. The resulting space of metric theories of gravity may address open conceptual and observational puzzles through a wealth of effects beyond general relativity (GR), whose traces can be searched for within today's and tomorrow's gravitational testing grounds. Above all, we offer a generic metric-theory generalization of Isaacson's approach to the leading-order field equations of physical perturbations, with a well-defined notion of the energy-momentum carried by gravitational waves. Within this framework, we identify the backreaction of the Isaacson energy-momentum flux onto the background spacetime with the displacement memory effect, which induces a permanent distortion of space after the passage of a gravitational wave. This effect is a well-known prediction of GR whose dominant contribution captures the theory's inherent non-linearity, manifest in the ability of gravity to gravitate. The novel interpretation of memory as arising naturally within the Isaacson approach to gravitational waves comes with two main advantages. Firstly, it allows for a unified understanding of both the null and the ordinary memory effect, which are sourced by unbound energy fluxes that do and do not, respectively, reach asymptotic null infinity. Secondly, and most importantly, this approach allows for a consistent derivation of the memory formula for a large class of metric theories, with considerable lessons to be learned for upcoming measurements of the memory effect.
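For reference, in GR the Isaacson effective energy-momentum tensor of high-frequency gravitational waves, obtained by averaging over several wavelengths (here in the transverse-traceless gauge), reads \[ T^{\mathrm{GW}}_{\mu\nu} = \frac{c^4}{32\pi G} \left\langle \partial_\mu h_{\alpha\beta}\, \partial_\nu h^{\alpha\beta} \right\rangle, \] and it is the backreaction of this flux on the background spacetime that the abstract identifies with the displacement memory effect; the paper's contribution is the generalization of this construction to a large class of metric theories.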
In recent works we introduced the parameter space $\mathcal{Z}_N$ of $A$-variations of the Hardy $Z$-function, $Z(t)$, whose elements are functions of the form \begin{equation} \label{eq:Z-sections} Z_N(t; \overline{a}) = \cos(\theta(t)) + \sum_{k=1}^{N} \frac{a_k}{\sqrt{k+1}} \cos(\theta(t) - \ln(k+1)t), \end{equation} where $\overline{a} = (a_1, \ldots, a_N) \in \mathbb{R}^N$. The $A$-philosophy advocates that studying the discriminant hypersurface forming within such parameter spaces often reveals essential insights about the original mathematical object and its zeros. In this paper we apply the $A$-philosophy to our space $\mathcal{Z}_N$ by introducing $\Delta_n(\overline{a})$, the $n$-th Gram discriminant of $Z(t)$. We show that the Riemann Hypothesis (RH) is equivalent to the corrected Gram's law \[ (-1)^n \Delta_n(\overline{1}) > 0, \] for any $n \in \mathbb{Z}$. We further show that the classical Gram's law $(-1)^n Z(g_n) > 0$ can be considered a first-order approximation of our corrected law. The second-order approximation of $\Delta_n(\overline{a})$ is then shown to be related to shifts of Gram points along the $t$-axis. This leads to the discovery of a new, previously unobserved, repulsion phenomenon \[ \left| Z'(g_n) \right| > 4 \left| Z(g_n) \right|, \] for bad Gram points $g_n$ whose consecutive neighbours $g_{n \pm 1}$ are good. Our analysis of the $A$-variation space $\mathcal{Z}_N$ introduces a wealth of new results on the zeros of $Z(t)$, casting new light on classical questions such as Gram's law, the Montgomery pair-correlation conjecture, and the RH, and also unveils previously unknown fundamental properties.
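A small numerical illustration (ours, not the paper's): the mpmath library provides the Hardy Z-function (siegelz) and Gram points (grampoint), so both the classical Gram law $(-1)^n Z(g_n) > 0$ and the repulsion inequality quoted above can be evaluated directly from standard routines:

\begin{verbatim}
from mpmath import mp, siegelz, grampoint, diff

mp.dps = 25  # working precision (decimal digits)

for n in range(30):
    g = grampoint(n)        # the n-th Gram point g_n
    z = siegelz(g)          # Z(g_n)
    dz = diff(siegelz, g)   # numerical derivative Z'(g_n)
    gram_law = (-1) ** n * z > 0       # classical Gram's law
    repulsion = abs(dz) > 4 * abs(z)   # the abstract's inequality,
                                       # asserted for bad Gram points
                                       # with good neighbours
    print(f"n={n:2d}  g_n={float(g):10.4f}  "
          f"Gram law: {gram_law}  |Z'|>4|Z|: {repulsion}")
\end{verbatim}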
Ilham Habibi Sormin, Muhammad Dalimunthe, Syahrul Abidin
The film world has developed in very diverse ways, producing films in a wide variety of styles. Broadly speaking, films can be grouped by story, by production orientation, and by genre. This study aims to determine the representation of feminism in the science fiction film Level 16. The study uses a qualitative method with the semiotic analysis technique of Ferdinand de Saussure's model, which examines signs in everyday life. Through this method, several scenes from Level 16 are selected; these scenes are then analyzed for their denotative and connotative meanings and interpreted in terms of signifier and signified. The researcher found ten scenes that present feminism in the film Level 16.
Diffusion models have shown remarkable success in visual synthesis, but have also raised concerns about potential abuse for malicious purposes. In this paper, we seek to build a detector for telling apart real images from diffusion-generated images. We find that existing detectors struggle to detect images generated by diffusion models, even when generated images from a specific diffusion model are included in their training data. To address this issue, we propose a novel image representation called DIffusion Reconstruction Error (DIRE), which measures the error between an input image and its reconstruction by a pre-trained diffusion model. We observe that diffusion-generated images can be approximately reconstructed by a diffusion model while real images cannot, which suggests that DIRE can serve as a bridge for distinguishing generated from real images. DIRE provides an effective way to detect images generated by most diffusion models; it generalizes to unseen diffusion models and is robust to various perturbations. Furthermore, we establish a comprehensive benchmark of images generated by eight diffusion models to evaluate the performance of diffusion-generated image detectors. Extensive experiments on our collected benchmark demonstrate that DIRE is superior to previous generated-image detectors. The code and dataset are available at https://github.com/ZhendongWang6/DIRE.
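A minimal sketch of the DIRE computation as described in the abstract; the helpers ddim_invert and ddim_reconstruct are hypothetical stand-ins for a pre-trained diffusion model's deterministic inversion and sampling routines (the actual pipeline lives in the linked repository):

\begin{verbatim}
import torch

def dire(x, ddim_invert, ddim_reconstruct):
    """Compute the DIffusion Reconstruction Error (DIRE) of an image batch.

    x               : images in [-1, 1], shape (B, C, H, W)
    ddim_invert     : hypothetical helper mapping an image to diffusion
                      noise via deterministic DDIM inversion
    ddim_reconstruct: hypothetical helper mapping that noise back to an
                      image with the same pre-trained model
    """
    with torch.no_grad():
        noise = ddim_invert(x)            # x -> x_T (inversion)
        x_rec = ddim_reconstruct(noise)   # x_T -> reconstructed image
    return (x - x_rec).abs()              # per-pixel reconstruction error

# Per the abstract: diffusion-generated images reconstruct well (small
# DIRE) while real images do not, so a binary classifier trained on
# DIRE maps can separate the two.
\end{verbatim}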
We present the core support criterion, a voting criterion satisfied by Instant Runoff Voting (IRV) that is analogous to the Condorcet criterion but reflects a different majority-rule philosophy. Condorcet methods can be thought of as conducting elections between each pair of candidates, counting all ballots to determine the winner of each pair-election. IRV can also be thought of as conducting elections between all pairs of candidates, but for each pair-election it counts only ballots from voters who do not prefer another major candidate (as determined self-consistently from the IRV social ranking) to the two candidates in contention. Whether it is appropriate to include all ballots or only a subset in a pair-election depends on whether the society deems the entire ballot set, or a selected one, to comply with freedom of association (which implies freedom of non-association) for that pair-election. Arguments based on freedom of association rely on more information about an electorate than can be learned from ranked ballots alone. We present a freedom-of-association-based argument for why IRV may be preferable to Condorcet methods in some circumstances, including the 2022 Alaska special congressional election, based on the political context of that election.
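For concreteness, a minimal sketch of the standard IRV elimination loop that yields the social ranking referred to above (our illustration, not the paper's code):

\begin{verbatim}
from collections import Counter

def irv_ranking(ballots):
    """Return the IRV social ranking (winner first) for ranked ballots.

    ballots: list of candidate lists, most-preferred first, e.g.
             [["A", "B", "C"], ["B", "A"], ...]; truncation allowed.
    """
    candidates = {c for b in ballots for c in b}
    eliminated = []
    while len(candidates) > 1:
        # Count each ballot for its top remaining candidate.
        tallies = Counter()
        for b in ballots:
            for c in b:
                if c in candidates:
                    tallies[c] += 1
                    break
        for c in candidates:            # candidates with zero support
            tallies.setdefault(c, 0)    # still need an entry
        loser = min(tallies, key=tallies.get)  # ties broken arbitrarily
        eliminated.append(loser)
        candidates.remove(loser)
    return list(candidates) + eliminated[::-1]

# Example of a "center squeeze": B is eliminated first despite being
# many voters' second choice -- the kind of situation the core support
# criterion examines.  Prints ['C', 'A', 'B'].
print(irv_ranking([["A", "B"]] * 4 + [["B", "C"]] * 3 + [["C", "B"]] * 5))
\end{verbatim}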
Dostoevsky's philosophy and theology cannot be extracted from his work in the form of explicit statements; instead, they manifest themselves through a complexly structured figurative text, and the author's strategy consists in stepping back in order to implicitly involve the reader in a process of personal discovery and personal change. This article focuses on philosophical and theological thought in Dostoevsky's works associated with narratives about paintings that the artist paints against the client's demand to express their spiritual meaning explicitly. This kind of storyline recurs at least twice in Dostoevsky's works and proves highly effective from a philosophical and theological point of view. In the novel The Insulted and the Injured, it demonstrates what happens "on the other side" of the icon, while in The Adolescent it serves to reveal the images of the spiritual world in their everyday guise and to teach the reader to recognize these images not only within the fictional world of the text but also outside it, in the external world with which she interacts.
Minh N. H. Nguyen, Shashi Raj Pandey, Kyi Thar, et al.
Due to the availability of huge amounts of data and processing power, current artificial intelligence (AI) systems are effective at solving complex tasks. However, despite AI's success in different areas, the problem of designing AI systems that can truly mimic human cognitive capabilities, that is, achieve artificial general intelligence, remains largely open. Consequently, many emerging cross-device AI applications will require a transition from traditional centralized learning systems towards large-scale distributed AI systems that can collaboratively perform multiple complex learning tasks. In this paper, we propose a novel design philosophy called democratized learning (Dem-AI), whose goal is to build large-scale distributed learning systems from self-organizing distributed learning agents that are well connected but limited in learning capability. Inspired by human societal groups, the specialized groups of learning agents in the proposed Dem-AI system self-organize into a hierarchical structure to perform learning tasks collectively and more efficiently. The Dem-AI learning system can thus evolve and regulate itself based on the underlying duality of two processes, which we call the specialized and generalized processes. In this regard, we present a reference design, inspired by various interdisciplinary fields, as a guideline for realizing future Dem-AI systems, and we introduce four underlying mechanisms of the design: a plasticity-stability transition mechanism, self-organizing hierarchical structuring, specialized learning, and generalization. Finally, we discuss possible extensions and new challenges for existing learning approaches, towards more scalable, flexible, and powerful learning systems in the Dem-AI setting.
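As an illustrative toy sketch (our construction, not the paper's reference design), the specialized/generalized duality can be pictured as agents averaging within their specialized groups while a generalized model is formed by averaging across groups in the hierarchy:

\begin{verbatim}
import numpy as np

def average(models):
    """Element-wise average of a list of parameter vectors."""
    return np.mean(models, axis=0)

def dem_ai_round(groups):
    """One illustrative round of hierarchical averaging.

    groups: list of specialized groups, each a list of agent parameter
            vectors (np.ndarray). Returns (group_models, generalized).
    """
    # Specialized process: each group consolidates its own members.
    group_models = [average(agents) for agents in groups]
    # Generalized process: knowledge is shared across groups.
    generalized = average(group_models)
    return group_models, generalized

# Toy usage: two specialized groups of agents with 4-dim "models".
rng = np.random.default_rng(0)
groups = [[rng.normal(size=4) for _ in range(3)],
          [rng.normal(size=4) for _ in range(2)]]
group_models, generalized = dem_ai_round(groups)
\end{verbatim}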
Organ shortage is a persistent global problem, even though various alternatives have been proposed to overcome a problem that causes thousands of deaths every year. This article analyzes the ethical and legal problems of a regulated organ market, starting from the model of rewarded living kidney donation between unrelated persons implemented in Iran in 1988. To this end, some historical notes are offered, followed by a presentation of the procedures and characteristics of that model, concluding with its main results, successes, and failures. Finally, taking this empirical basis as a point of departure, the article aims to contribute to the theoretical debate surrounding this controversial alternative.
Jurisprudence. Philosophy and theory of law; Medical philosophy. Medical ethics
By considering nests on a given space, we explore order-theoretic and topological properties that are closely related to the structure of a nest. In particular, we see how subbases given by two dual nests can indicate how close the properties of the space are to those of a linearly ordered space. Bearing in mind that interlocking nests are a key tool in a general solution of the orderability problem, we give characterizations of interlocking nests via closed sets in the Alexandroff topology and via lower sets, respectively. We also characterize bounded subsets of a given set in terms of nests and, finally, explore the possibility of characterizing topological groups via properties of nests. Each section closes with a number of open questions, which may give new directions to the orderability problem.
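For reference, the basic definition commonly used in the orderability literature, on which the paper's notions of dual and interlocking nests build: \[ \mathcal{N} \subseteq \mathcal{P}(X) \text{ is a nest on } X \iff \forall A, B \in \mathcal{N}: \ A \subseteq B \ \text{ or } \ B \subseteq A, \] that is, a nest is a family of subsets of $X$ linearly ordered by inclusion.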
The philosophy of blockchain technology is concerned, among other things, with blockchain ontology: how it might be characterised, how it is created, implemented, and adopted, how it operates in the world, and how it evolves over time. This paper concentrates on whether Bitcoin/blockchain can be considered a complex system and, if so, whether it is a chaotic one. Beyond mere academic curiosity, a positive answer would raise concerns about the likelihood of Bitcoin/blockchain entering a 2010-Flash-Crash-type chaotic regime, with catastrophic consequences for financial systems based on it. The paper starts by highlighting the relevant details of the Bitcoin/blockchain ecosystem, formed by the blockchain itself, bitcoin end users (payers and payees), capital-gains seekers, miners, full-node maintainers, and developers, and their interactions. The information theory of complex systems is then briefly discussed for later use. Finally, the blockchain is investigated with the help of Crutchfield's statistical complexity measure. The low, non-null statistical complexity value obtained suggests that the blockchain may be considered algorithmically complicated but hardly a complex system, and unlikely to enter a chaotic regime.
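For reference, Crutchfield's statistical complexity is the Shannon entropy of the distribution over the causal states $\mathcal{S}$ of a process's $\epsilon$-machine: \[ C_\mu = H[\mathcal{S}] = -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma) \log_2 \Pr(\sigma), \] so a low, non-null value indicates that near-optimal prediction of the process requires retaining almost no information about its history.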
Einstein established the theory of general relativity and the corresponding field equation in 1915; its vacuum solutions for static and rotating black holes were obtained by Schwarzschild in 1916 and by Kerr in 1963, respectively. More than a century after their discovery, these solutions still play an indispensable role in explaining high-energy astrophysical phenomena. The application of solutions of Einstein's equation to astrophysical phenomena has formed an important branch of study, namely relativistic astrophysics. I devote this article to illuminating some current astrophysical problems from the standpoint of general relativity. There appear, however, to be issues with explaining certain astrophysical phenomena on the basis of Einstein's theory alone. I show that Einstein's theory and its modified form are both necessary to explain modern astrophysical processes, in particular those related to compact objects.