In the aftermath of World War II, Germany’s struggle to redefine its national identity, tainted by the Nazi legacy, led to the appropriation of foreign cultures. Irish culture, with its associations of resilience, mysticism and rural simplicity, became a favoured alternative. This article examines the representation of Ireland in popular German media, focusing on the contemporary TV crime series Der Irland-Krimi [The Ireland Thriller]. Set in the picturesque landscapes of Ireland, the series reflects a German longing to escape a troubled history and challenging present through immersion in a romanticised, culturally “pure” setting. The crime genre plays a significant role by combining the allure of crime with the appeal of Ireland’s enigmatic, timeless atmosphere. While Der Irland-Krimi attempts to avoid clichés and national stereotypes, such as the portrayal of Ireland as a land of rugged charm and mystical folklore, it risks perpetuating a “German version” of Irishness. By exploring the intersection of media consumption, genre and national identity, this analysis assesses whether the show constructs a nostalgic or idealised imaginary Ireland, thereby facilitating escapism, or whether it offers a reflective engagement with the narratives it creates.
This article surveys the historiography of nineteenth- and early twentieth-century British protest movements and thus provides a broad framework for the contributions to this volume. It begins by providing a portrait of some of the key developments in the field, with a focus on the impact of post-war social history and the emergence of ‘history from below’. It then details some of the ways in which this body of work was built upon, challenged and consolidated in subsequent decades. A discussion of the types of sources exploited by historians working from a ‘bottom-up’ perspective is followed by some thoughts on the current state of protest studies in a British context. The contested ‘Britishness’ of nineteenth-century social and political movements is considered, along with the question of how historians today continue to study ordinary men and women in the past, and the movements they belonged to, with, in the words of Malcolm Chase, both ‘empathy’ and ‘authenticity’.
In the 1960s and 1970s a series of observations and theoretical developments highlighted several anomalies which could, in principle, be explained by postulating one of two working hypotheses: (i) the existence of dark matter, or (ii) the modification of standard gravitational dynamics at low accelerations. In the years that followed, the dark matter hypothesis attracted far more attention than modified gravity, and the latter is today largely regarded as a non-viable alternative. The present article takes an integrated history and philosophy of science approach in order to identify the reasons why the scientific community mainly pursued the dark matter hypothesis rather than modified gravity. A plausible answer is given in terms of three epistemic criteria for the pursuitworthiness of a hypothesis: (a) its problem-solving potential, (b) its compatibility with established theories and the feasibility of incorporation, and (c) its independent testability. A further comparison between the problem of dark matter and the problem of dark energy is also presented, explaining why in the latter case the situation is different and modified gravity is still considered a viable possibility.
While Harriet Martineau (1802-1876) made a name for herself with works dedicated to the transmission and popularisation of knowledge, as with her Illustrations of Political Economy (1834), which brought her almost overnight fame, in her first novel Deerbrook (1839) she is concerned with yet another form of transmission: that of rumours and of epidemics. This article proposes a joint reading of these two phenomena, which rely on a common principle of contagiousness within the social body, whether in a literal, pathological sense in the case of epidemics, or in a metaphorical sense in the case of rumours. I thus propose to analyse the epidemic qualities of rumours and, conversely, the rumour-like aspects of the epidemic depicted in the novel. Furthermore, the idea of transmission, which is central to the novel, echoes Martineau’s position as a committed writer and raises questions about her use of fiction for didactic purposes.
This article analyses one of the United Kingdom’s most recent cultural diplomacy programmes, the 2015 “Re:Imagine India” project. It first explains the context, then shows the programme’s strengths and argues that the project is a prime example of the way Britain is trying to develop its soft power in order to grow its influence in India. It then delves into the project’s “informally imperialist” rhetoric, which seems to persist behind the narrative of soft power. It concludes by examining the claim put forward by scholars such as Oliver Turner, according to whom the concept of “Global Britain” and the interest in soft power, especially in the Commonwealth, are evidence of Britain’s identity crisis in the early 21st century.
One Nation, a conceptual construct which began to develop in the 19th century, has often been associated with the Conservative Party’s efforts to bridge the gap between “the two nations” (the underprivileged and the well-off members of society) and to elevate the condition of the people in the UK (a requisite for social harmony). Most recent Conservative Prime Ministers have been eager to present their own version of the One Nation tradition based on their views of the role the state should play (either limited or extended state intervention). David Seawright explains that these conflicting stances are not expressed in a strict polarisation within the party; rather, they move along the conservative ideological continuum (of limited-extended state policies). More than a successively reconstructed concept (tailored to the particular expectations of the PM and voters), One Nation is a symbol and as such is characterized by a high degree of ambiguity, fluidity and multivocality. These are precisely the qualities that make One Nation an efficient instrument for Prime Ministers to move along the continuum. Murray Edelman refers to such symbols as condensational symbols, laden with powerful emotional content as well as a wide range of meanings. The study will seek to assess the position of three Prime Ministers (J. Major, D. Cameron and T. May) on the limited-extended state continuum through their own representations of One Nation. This will be examined in the context of parliamentary discourse and through catchphrases used by each PM during Prime Minister’s Question Time (PMQs). The paper will also analyse the process of symbolisation and demonstrate that the use of One Nation as a symbol fulfils different functions and serves various purposes (strategic and social), especially when used in opposition to what can be considered the invariant core element of One Nation: anti-Socialism.
This essay looks at post-1798 tensions in Belfast, Antrim and Down through a survey of literary and cultural texts. It notes a range of responses to the first two decades of the nineteenth century among those associated with United Irish aspiration and those associated with maintaining establishment control. On the one hand, United Irish political and cultural thought and language continued to be published in the first two decades after the Union through collections of poetry and other works (Samuel Thomson, James Orr, William Drennan). The essay explores how this work took shape and how it responded to the trauma of 1798, particularly in how the Scots language was deployed as a cultural tool/weapon in the early 1800s. Alongside this, I discuss the creation and development of a range of “Enlightenment” ventures such as the Academical Institution, the Poor House and the Literary Society, and question the extent to which these were “Enlightenment” undertakings or were generated by factional, mercantile interests operating under the guise of philanthropic endeavour (e.g. the extension of the cotton industry in Belfast, economic expansion built on the transatlantic slave trade, the imperialist/colonialist animus of many ventures, the rewriting of recent history to fit the establishment view). These developments are contextualised and questioned alongside the creation of post-Union “Union-ist” agendas and groupings (Conservative Anglican and Presbyterian alliances in the cultural sphere in the work of Bishop Thomas Percy, Thomas Romney Robinson, Hugh Porter and Thomas Stott). The trauma and memory of the late eighteenth century left a legacy played out in Belfast’s post-Union development and in its articulation, or non-articulation, as an Enlightenment space.
The history of the reception of the Russian Symbolist movement in English begins in the 1890s. Readers in Great Britain and the United States could read about the Russian Symbolist Fedor Sologub long before any of his works were translated into English. During World War I, amid a parallel wave of interest in Russia, Sologub was one of the most popular Russian writers in the English-speaking world. Some of his poetry and prose works were translated into English and, during the years 1915-1950, included in no fewer than 28 English-language anthologies. During the first years of this period, almost all of his prose that was accessible to English readers was selected and translated by two translators, John Cournos and Stephen Graham. His poetry, on the other hand, was selected and translated by several translators over the course of this entire period. Anthologies with works by Sologub appeared in two main waves: from 1915 until the middle of the 1920s, and in the 1940s after the outbreak of WWII. These anthologies demonstrate how Sologub was presented to English-speaking audiences during these years. This article examines English-language anthologies from this period, comparing what, if anything, is said about Sologub in their introductions to the works by Sologub they include. Some presented him as the quintessential decadent, while others tried to show the various sides of Sologub’s works. It is often the case that the opinions of Sologub presented by editors are not supported by the works these same editors selected for inclusion. The article ends with three bibliographical appendices listing Sologub’s anthologized poetry and prose and the anthologies that included them.
Quantum entanglement is a key resource which grants quantum systems the ability to accomplish tasks that are classically impossible. Here, we apply Feynman's sum-over-histories formalism to interacting bipartite quantum systems and introduce entanglement measures for bipartite quantum histories. Based on the Schmidt decomposition of the matrix composed of the complex Feynman propagator coefficients, we prove that bipartite quantum histories are entangled if and only if the Schmidt rank of this matrix is larger than 1. The proposed approach highlights the utility of using a separable basis for constructing the bipartite quantum histories and allows for quantification of their entanglement from the complete set of experimentally measured sequential weak values. We then illustrate the non-classical nature of entangled histories using Hardy's overlapping interferometers and explain why local hidden variable theories are unable to correctly reproduce all observable quantum outcomes. Our theoretical results elucidate how the composite tensor product structure of multipartite quantum systems is naturally extended across time, and clarify the difference between quantum histories viewed as projection operators in the history Hilbert space and as chain operators and propagators in the standard Hilbert space.
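The Schmidt-rank criterion above has a direct numerical reading: collect the complex history coefficients in a separable basis into a matrix and count its non-zero singular values. The sketch below is a generic illustration of that criterion, not the authors' code; `product` and `bell` are hypothetical coefficient matrices chosen only to exhibit the two cases.

```python
import numpy as np

def schmidt_rank(coeffs, tol=1e-10):
    """Schmidt rank of a bipartite coefficient matrix:
    the number of singular values above a numerical tolerance."""
    s = np.linalg.svd(np.asarray(coeffs, dtype=complex), compute_uv=False)
    return int(np.sum(s > tol))

def is_entangled(coeffs, tol=1e-10):
    """Entangled if and only if the Schmidt rank exceeds 1."""
    return schmidt_rank(coeffs, tol) > 1

# A product history: coefficients factorize into an outer product -> rank 1.
product = np.outer([1, 0], [1 / np.sqrt(2), 1 / np.sqrt(2)])
# A Bell-like history (|00> + |11>)/sqrt(2) -> rank 2, hence entangled.
bell = np.eye(2) / np.sqrt(2)
```

The same singular values also give quantitative entanglement measures (e.g. an entropy of the squared, normalized Schmidt coefficients), which is the kind of quantification the abstract refers to.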
Knowledge graphs have been adopted in many diverse fields for a variety of purposes. Most of those applications rely on valid and complete data to deliver their results, pressing the need to improve the quality of knowledge graphs. A number of solutions have been proposed to that end, ranging from rule-based approaches to the use of probabilistic methods, but there is an element that has not been considered yet: the edit history of the graph. In the case of collaborative knowledge graphs (e.g., Wikidata), those edits represent the process by which the community reaches some kind of fuzzy and distributed consensus on the information that best represents each entity, and can hold potentially interesting information for knowledge graph refinement methods. In this paper, we explore the use of edit history information from Wikidata to improve the performance of type prediction methods. To do that, we first built a JSON dataset containing the edit history of every instance of the 100 most important classes in Wikidata. This edit history information is then explored and analyzed, with a focus on its potential applicability in knowledge graph refinement tasks. Finally, we propose and evaluate two new methods to leverage this edit history information in knowledge graph embedding models for type prediction tasks. Our results show that one of the proposed methods improves on current approaches, demonstrating the potential of using edit information in knowledge graph refinement tasks and opening promising new research lines within the field.
A family of spherical caps of the 2-dimensional unit sphere $\mathbb{S}^2$ is called a totally separable packing (in short, a TS-packing) if any two spherical caps can be separated by a great circle which is disjoint from the interior of each spherical cap in the packing. The separable Tammes problem asks for the largest density of a given number of congruent spherical caps forming a TS-packing in $\mathbb{S}^2$. We solve this problem for up to $8$ spherical caps and upper bound the density of any TS-packing of congruent spherical caps in terms of their angular radius. Based on this, we show that the centered separable kissing number of $3$-dimensional Euclidean balls is $8$. Furthermore, we prove bounds for the maximum of the smallest inradius of the cells of the tilings generated by $n>1$ great circles in $\mathbb{S}^2$. Next, we prove dual bounds for TS-coverings of $\mathbb{S}^2$ by congruent spherical caps. Here a covering of $\mathbb{S}^2$ by spherical caps is called a totally separable covering (in short, a TS-covering) if there exists a tiling generated by finitely many great circles of $\mathbb{S}^2$ such that the cells of the tiling are covered by pairwise distinct spherical caps of the covering. Finally, we extend some of our bounds on TS-coverings to spherical spaces of dimension $>2$.
This paper emphasizes the importance of a robot's ability to refer to its task history, especially when it executes a series of pick-and-place manipulations by following language instructions given one by one. The advantage of referring to the manipulation history is twofold: (1) language instructions that omit details but use expressions referring to the past can be interpreted, and (2) the visual information of objects occluded by previous manipulations can be inferred. To this end, we introduce a history-dependent manipulation task whose objective is to visually ground a series of language instructions for proper pick-and-place manipulations by referring to the past. We also provide a relevant dataset and a model that can serve as a baseline, and show that our model trained on the proposed dataset can also be applied to the real world using CycleGAN. Our dataset and code are publicly available on the project website: https://sites.google.com/view/history-dependent-manipulation.
Mrigank Rochan, Mahesh Kumar Krishna Reddy, Linwei Ye, et al.
Recently, there has been increasing interest in highlight detection research, where the goal is to create a short-duration video from a longer video by extracting its interesting moments. However, most existing methods ignore the fact that the definition of a video highlight is highly subjective: different users may have different highlight preferences for the same input video. In this paper, we propose a simple yet effective framework that learns to adapt highlight detection to a user by exploiting the user's history in the form of highlights that the user has previously created. Our framework consists of two sub-networks: a fully temporal convolutional highlight detection network $H$ that predicts highlights for an input video, and a history encoder network $M$ for the user history. We introduce a newly designed temporal-adaptive instance normalization (T-AIN) layer to $H$ through which the two sub-networks interact. T-AIN has affine parameters that are predicted from $M$ based on the user history and is responsible for the user-adaptive signal to $H$. Extensive experiments on a large-scale dataset show that our framework makes more accurate and user-specific highlight predictions.
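To illustrate how an adaptive instance-normalization layer can inject a user-specific signal, here is a minimal numpy sketch. In the paper's framework the T-AIN layer sits inside the convolutional network $H$ and receives its affine parameters from the history encoder $M$; the function below shows only the normalize-then-affine step, with `gamma` and `beta` standing in for the parameters $M$ would predict. Shapes and details are assumptions, not the paper's implementation.

```python
import numpy as np

def temporal_adaptive_instance_norm(x, gamma, beta, eps=1e-5):
    """Normalize each channel of a (C, T) feature map over the temporal
    axis, then apply an affine transform with per-channel parameters
    gamma, beta of shape (C,) (here: stand-ins for the outputs of M)."""
    mu = x.mean(axis=1, keepdims=True)
    var = x.var(axis=1, keepdims=True)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma[:, None] * x_hat + beta[:, None]

# Toy feature map: C=2 channels, T=4 time steps.
x = np.array([[0.0, 1.0, 2.0, 3.0],
              [1.0, 3.0, 5.0, 7.0]])
gamma = np.array([2.0, 0.5])   # hypothetical user-conditioned scale
beta = np.array([1.0, -1.0])   # hypothetical user-conditioned shift
out = temporal_adaptive_instance_norm(x, gamma, beta)
```

After normalization, each channel's temporal mean equals `beta` and its standard deviation (up to `eps`) equals `gamma`, which is how a user-conditioned affine pair can steer the downstream highlight scores.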
Enoch Powell was appointed Minister of Health by Harold Macmillan in July 1960, before being promoted to the Cabinet in 1962. This was seen as a political manoeuvre from a Prime Minister who was hardly well disposed towards Powell: Macmillan thus planned both to prevent Powell from attacking the government with his free-market beliefs in the years of the implementation of the Middle Way, and to put him in a difficult position at the head of a costly department. Powell attempted to leave his mark on the department and to break with his predecessors’ policies: he intended to streamline NHS spending while at the same time modernising and humanising the NHS through the introduction of an ambitious Hospital Plan. Powell was convinced that the NHS could be modernised. He believed that it should remain in the public domain but at the same time supported private health investment. This paper analyses Powell’s iconoclastic approach using primary sources from the National Archives at Kew and the Powell Papers held in Cambridge. In order to enhance historiographical debates, the analysis focuses on three particular points: Powell’s handling of the cost of drugs, cigarette advertising and the fluoridation of water. These issues reveal an additional dilemma that Powell had to face: to what extent could freedom of choice be introduced into a public health service in which the State was the main driver?
The recently proposed Trans-Planckian Censorship Conjecture (TCC) can be used to constrain the energy scale of inflation. The conclusions, however, depend on assumptions about the post-inflationary history of the Universe. For example, in the standard case of a thermal post-inflationary history, in which the Universe stays radiation dominated at all times from the end of inflation to the epoch of radiation-matter equality, the TCC has been used to argue that the Hubble parameter during inflation, $H_{\inf}$, is below ${\cal O}(0.1) ~{\rm GeV}$. Cosmological scenarios with a non-thermal post-inflationary history are well-motivated alternatives to the standard picture, and it is interesting to find out the possible constraints which the TCC imposes on such scenarios. In this work, we determine by how much the TCC-compatible bound on $H_{\inf}$ is enhanced if the post-inflationary history before nucleosynthesis was non-thermal. We then argue that if the TCC is correct, for a large class of scenarios it is not possible for the Universe to have undergone a phase of moduli domination.
Software Reliability has just passed the 50-year milestone as a technical discipline, along with Software Engineering. This paper traces the roots of Software Reliability Engineering (SRE) from its pre-software history to the beginnings of the field with the first software reliability model in 1967, through its maturation in the 1980s, to the current challenges of proving application reliability on smartphones and in other areas. This history began as a thesis proposal for a History of Science research program and includes multiple previously unpublished interviews with founders of the field. The project evolved to also provide a survey of the development of SRE from notable prior histories and from citations of new work in the field, including reliability applications to Agile Methods. This history concludes at the modern day, providing bookends in the theory, models, literature, and practice of Software Reliability Engineering from 1968 to 2018, and pointing towards new opportunities to deepen and broaden the field.
In this paper, we construct a tensor network representation of quantum causal histories, as a step towards directly representing states in quantum gravity via bulk tensor networks. Quantum causal histories are quantum extensions of causal sets in the sense that each event in a causal set is assigned a Hilbert space of quantum states, and the local causal evolutions between events are modeled by completely positive and trace-preserving maps. Here we utilize the channel-state duality of completely positive and trace-preserving maps to transform the causal evolutions into bipartite entangled states. We construct the matrix product state for a single quantum causal history by projecting the obtained bipartite states onto the physical states on the events. We also construct two-dimensional tensor network states for entangled quantum causal histories in a restricted case with compatible causal orders. The possible holographic tensor networks are explored by mapping the quantum causal histories in a way analogous to the exact holographic mapping. The constructed tensor networks for quantum causal histories are exemplified by the non-unitary local time evolution moves in a quantum system on temporally varying discretizations, and these non-unitary evolution moves are shown to be necessary for defining a bulk causal structure and a quantum black hole. Finally, we comment on the limitations of the constructed tensor networks and discuss some directions for further studies aiming at applications in quantum gravity.
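The channel-state duality step mentioned above, which turns a completely positive, trace-preserving evolution into a bipartite state, can be sketched concretely. Below is a minimal numpy illustration of the standard Choi–Jamiołkowski construction from a Kraus representation; it is a generic sketch, not the paper's construction, and the identity-channel example is chosen only because its Choi state is the maximally entangled state itself.

```python
import numpy as np

def choi_state(kraus_ops):
    """Choi-Jamiolkowski state of the CPTP map E(rho) = sum_k K rho K^dag:
    apply (id x E) to the normalized maximally entangled state."""
    d = kraus_ops[0].shape[1]
    omega = np.zeros((d * d, 1), dtype=complex)
    for i in range(d):
        omega[i * d + i, 0] = 1.0       # |Omega> = sum_i |i>|i> (unnormalized)
    rho = (omega @ omega.conj().T) / d  # |Omega><Omega| / d
    choi = np.zeros_like(rho)
    for K in kraus_ops:
        A = np.kron(np.eye(d), K)       # act with K on the second factor only
        choi += A @ rho @ A.conj().T
    return choi

# Identity channel: the resulting Choi state is maximally entangled.
C = choi_state([np.eye(2)])
```

For a trace-preserving map the Choi state has unit trace, and its entanglement structure encodes the channel, which is what lets causal evolutions between events be re-read as bipartite entangled states.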