Karoline B. Kuchenbaecker, J. Hopper, Daniel R. Barnes et al.
Results for "History of France"
Showing 20 of ~2,646,834 results · from CrossRef, arXiv, DOAJ, Semantic Scholar
B. B. D. Mesquita, Alastair Smith, Randolph M. Siverson et al.
A. Stoler
E. Bruckert, G. Hayem, S. Dejager et al.
Jonah F Messinger, Florian Metzler, Huw Price
One of the most public episodes of gatekeeping in modern science was the case of so-called 'cold fusion'. At a news conference in 1989 the electrochemists Martin Fleischmann and Stanley Pons announced that they had found evidence of nuclear fusion in palladium electrodes loaded with deuterium. There was worldwide interest. Many groups sought to reproduce the results, most unsuccessfully. Within months, the prevailing view became strongly negative. The claims of Fleischmann and Pons came to be regarded as disreputable, as well as false. As the Caltech physicist David Goodstein put it, cold fusion became 'a pariah field, cast out by the scientific establishment' (Goodstein 1994). The case would already be interesting for students of gatekeeping if the story had ended at that point. Even more interestingly, however, the field survived and persisted. It has been enjoying a modest renaissance, with recent government funding both in the US and the EU. This piece offers an opinionated introduction to cold fusion as a case study of scientific gatekeeping, discussing both its early and recent history.
Muriel Barbier, Noémie Wansart
Dori Blakely, Doug Johnstone, Gabriele Cugno et al.
We observed the planet-hosting system PDS 70 with the James Webb Interferometer, JWST's aperture masking interferometric mode within NIRISS. Observing with the F480M filter centered at 4.8 μm, we simultaneously fit geometrical models to the outer disk and the two known planetary companions. We redetect the protoplanets PDS 70 b and c at a signal-to-noise ratio (SNR) of 14.7 and 7.0, respectively. Our photometry of both PDS 70 b and c provides tentative evidence of mid-IR circumplanetary disk emission through fitting spectral energy distribution models to these new measurements and those found in the literature. We also newly detect emission within the disk gap at an SNR of ~4, a position angle of $220^{+10}_{-15}$°, and an unconstrained separation within ~200 mas. Follow-up observations will be needed to determine the nature of this emission. We place a 5σ upper limit of 208 ± 10 μJy on the flux of the candidate PDS 70 d at 4.8 μm, which indicates that if the previously observed emission at shorter wavelengths is due to a planet, this putative planet has a different atmospheric composition than PDS 70 b or c. Finally, we place upper limits on emission from any additional planets in the disk gap. We find an azimuthally averaged 5σ contrast upper limit >7 mag at separations greater than 110 mas. These are the deepest limits to date within ~250 mas at 4.8 μm and the first space-based interferometric observations of this system.
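As a side note on the units in this abstract: a magnitude contrast Δm converts to a flux ratio via 10^(−Δm/2.5), so the quoted >7 mag limit corresponds to undetected companions fainter than roughly 0.16% of the stellar flux. A minimal illustrative sketch of the conversion (not code from the paper):

```python
# Convert an astronomical magnitude contrast to a flux ratio.
# Definition: delta_m = -2.5 * log10(f_faint / f_bright),
# so the ratio is 10 ** (-delta_m / 2.5).
def contrast_to_flux_ratio(delta_mag: float) -> float:
    return 10 ** (-delta_mag / 2.5)

# The paper's azimuthally averaged 5-sigma limit of >7 mag means any
# undetected companion contributes less than ~0.16% of the stellar flux.
print(contrast_to_flux_ratio(7.0))  # ≈ 1.6e-3
```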
Lukáš Likavčan
SETI is not a usual point of departure for environmental humanities. However, this paper argues that theories originating in this field have direct implications for how we think about viable inhabitation of the Earth. To demonstrate SETI's impact on environmental humanities, this paper introduces the Fermi paradox as a speculative tool to probe possible trajectories of planetary history, and especially the "Sustainability Solution" proposed by Jacob Haqq-Misra and Seth Baum. This solution suggests that sustainable coupling between extraterrestrial intelligences and their planetary environments is the major factor in the possibility of their successful detection by remote observation. By positing that exponential growth is not a sustainable development pattern, this solution rules out space-faring civilizations colonizing solar systems or galaxies. This paper elaborates on Haqq-Misra and Baum's arguments, and discusses speculative implications of the Sustainability Solution, thus rethinking three concepts in environmental humanities: technosphere, planetary history, and sustainability. The paper advocates that (1) the technosphere is a transitory layer that shall fold back into the biosphere; (2) planetary history must be understood in a generic perspective that abstracts from terrestrial particularities; and (3) sustainability is not a sufficient vector of viable human inhabitation of the Earth, suggesting instead habitability and genesity as better candidates.
Nathalie Blanpain
In 2023, 639,300 people died in France, 35,900 fewer than in 2022, a year of high mortality. Over the last twenty years, from 2004 to 2023, January 3rd was the deadliest day, while August 15th was the least deadly one. Elderly people die significantly less often in the summer. Deaths are also less frequent on public holidays and Sundays. Finally, the risk of dying is higher on one's birthday, especially for young people.
Philippe Pirard, Valentina Decio, Baptiste Pignon et al.
Background Assessing the risk of subsequent self-harm after hospitalisation for COVID-19 is critical for mental health care planning during and after the pandemic. Aims This study aims to compare the risk of admission to hospital for self-harm within 12 months following a COVID-19 hospitalisation during the first half of 2020, with the risk following hospitalisations for other reasons. Method Using the French administrative healthcare database, logistic regression models were employed to analyse data from patients admitted to hospitals in metropolitan France between January and June 2020. The analysis included adjustments for sociodemographic factors, psychiatric history and the level of care received during the initial hospital stay. Results Of the 96 313 patients hospitalised for COVID-19, 336 (0.35%) were subsequently admitted for self-harm within 12 months, compared to 20 135 (0.72%) of 2 797 775 patients admitted for other reasons. This difference remained significant after adjusting for sociodemographic factors (adjusted odds ratio (aOR) = 0.66, 95% CI: 0.59–0.73), psychiatric disorder history (aOR = 0.65, 95% CI: 0.58–0.73) and the level of care received during the initial hospital stay (aOR = 0.70, 95% CI: 0.63–0.78). History of psychiatric disorders and intensive care were strongly correlated with increased risk, while older age was inversely associated with self-harm admissions. Conclusions Hospitalisation for COVID-19 during the early pandemic was linked to a lower risk of subsequent self-harm than hospitalisation for other reasons. Clinicians should consider psychiatric history and intensive care factors in evaluating the risk of future suicide.
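As an arithmetic aside, the crude (unadjusted) odds ratio implied by the raw counts in the abstract can be recomputed directly. This is an illustrative sketch only, not the study's adjusted model, which controls for sociodemographic factors, psychiatric history and level of care:

```python
# Crude odds ratio from the counts reported in the abstract.
# COVID-19 cohort: 336 self-harm admissions out of 96,313 patients;
# comparison cohort: 20,135 out of 2,797,775.
def odds_ratio(events_a: int, total_a: int, events_b: int, total_b: int) -> float:
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

crude_or = odds_ratio(336, 96_313, 20_135, 2_797_775)
print(round(crude_or, 2))  # 0.48
```

The crude value (~0.48) is farther from 1 than the reported adjusted ORs (0.65–0.70), consistent with part of the raw difference being accounted for by the covariates in the adjusted models.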
Helen Au-Yang, Jacques H. H. Perk
We present our personal histories with Michael Fisher. We describe how each one of us first came to Cornell University. We also discuss our many subsequent interactions and successful collaborations with him on various physics projects.
Peter Galison, Juliusz Doboszewski, Jamee Elder et al.
This white paper outlines the plans of the History, Philosophy, and Culture Working Group of the Next Generation Event Horizon Telescope Collaboration.
A. A. Watson
A brief history of the development of surface detectors for the study of high-energy cosmic rays is presented. The paper is based on an invited talk given at UHECR2022, held in L'Aquila in October 2022. In a complementary talk, P. Sokolsky discussed the development of the fluorescence technique for air-shower detection.
Andrea Carosso
In this work, I explore the concept of quantization as a mapping from classical phase space functions to quantum operators. I discuss the early history of this notion of quantization with emphasis on the works of Schrödinger and Dirac, and how quantization fit into their overall understanding of quantum theory in the 1920's. Dirac, in particular, proposed a quantization map which should satisfy certain properties, including the property that quantum commutators should be related to classical Poisson brackets in a particular way. However, in 1946, Groenewold proved that Dirac's mapping was inconsistent, making the problem of defining a rigorous quantization map more elusive than originally expected. This result, known as the Groenewold-Van Hove theorem, is not often discussed in physics texts, but here I will give an account of the theorem and what it means for potential "corrections" to Dirac's scheme. Other proposals for quantization have arisen over the years, the first major one being that of Weyl in 1927, which was later developed by many, including Groenewold, and which has since become known as Weyl Quantization in the mathematical literature. Another, known as Geometric Quantization, formulates quantization in differential-geometric terms by appealing to the character of classical phase spaces as symplectic manifolds; this approach began with the work of Souriau, Kostant, and Kirillov in the 1960's. I will describe these proposals for quantization and comment on their relation to Dirac's original program. Along the way, the problem of operator ordering and of quantizing in curvilinear coordinates will be described, since these are natural questions that immediately present themselves when thinking about quantization.
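The "particular way" referred to in this abstract is usually stated as Dirac's correspondence condition, under which the quantization map $\hat{Q}$ sends Poisson brackets to commutators (given here as the standard textbook formulation, not a quotation from the paper):

```latex
[\hat{Q}(f),\, \hat{Q}(g)] = i\hbar\, \hat{Q}(\{f, g\})
```

Groenewold's theorem shows that no linear map $\hat{Q}$ with $q \mapsto \hat{q}$, $p \mapsto \hat{p}$ can satisfy this condition exactly for all phase-space polynomials; obstructions appear already for polynomials of degree higher than two, which is why schemes such as Weyl quantization satisfy the condition only up to higher-order terms in $\hbar$.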
Betsy Barber-O'Malley, Géraldine Lassalle, Patrick Lambert et al.
EuroDiad version 4.0 is a set of data tables that store information about the presences/absences and population functionality of diadromous species (lampreys and fish) populations in selected catchments in Europe, the Middle East, and North Africa from 1750 to the present. This database contains distribution and life-history trait information for twenty-eight European diadromous species and geomorphological data for each of the selected catchments, though not every species has data for every catchment and time period. EuroDiad was originally created in 2005–2006 (EuroDiad 1.0 and 2.0), and contained data for 196 catchments and two time periods (1851–1950 and 1951–2010). It underwent a major update in 2009–2010 (EuroDiad 3.2) through a validation process by European fisheries experts. Version 3.2 included the addition of 63 small-sized catchments (< 10,000 km²) and an additional time period (1751–1850) for select species and catchments. This database underwent a second validation process in 2019–2020 and was updated to v 4.0, with the primary goal of providing information for a new generation of species distribution models, referred to as hybrid models, which incorporate both habitat suitability and population dynamics within their framework. Secondary objectives of this update were to: (a) incorporate new catchments for which information was provided by additional experts, (b) validate existing information about the presences or absences of diadromous species and categorize their population functionality within a catchment, and (c) perform data hygiene to prepare the database for broad dissemination. Information on the life history, morphology, and phenology of four emblematic species (i.e. eel, salmon, lamprey and shad) was added on this occasion. Data for this update were validated by DiadES project partners (www.diades.eu) and local experts. This update was focused on catchments located in the Atlantic Area for use in the DiadES project.
Data were divided by country, and validation was performed for catchments in Ireland, the U.K., Spain, Portugal, and France under the supervision of national organisations in fisheries and environmental management. DiadES project partners were asked to validate geomorphological information for the catchment (location of the outlet, surface area of the drainage basin, length of the main watercourse, elevation at the headwaters), as well as the presences/absences information and population functionality categories for all species already present in EuroDiad for their country. If possible, verification was done for each of the three time periods. Partners were also asked to provide data for any other catchments for which they had access to information on fish population status. EuroDiad 4.0 now stores data for 350 catchments (of which 292 have population functionality records) and three time periods, though the precision of information varies and not every species has information for each time period. This validation process strengthened the usefulness of EuroDiad, which is now updated and available for use by the research community.
Ivan Burel
At a time when Bonapartists struggled to regain power in France, the writer Louis Geoffroy imagined his hero, Napoleon, conquering first Europe, then the world, all nations (even Russia and England) bowing to his irresistible drive, which stretches to Africa and the Middle East, to China, Japan and all the Pacific islands, and to the American continent. With every nation and every aspect of government under Napoleon's thumb, History effectively comes to an end, and strife between nations gives way to spectacular scientific and technical invention. Beyond the gratifications of revanche-driven uchronia, Geoffroy's work strikes a sinister note as civil liberties are sacrificed on the altar of peace and progress.
Emma Cossez, Philip Baker, Mélissa Mialon
Abstract Most babies in France are fed with infant formula and then commercial complementary foods, many of which are ultra‐processed and harmful to health. Internationally, there is opposition by the baby food industry to the introduction of public health policies that would limit the marketing and consumption of such products. Our aim was to identify the key baby food industry actors and to describe their history and corporate political activity (CPA) in France. We sourced publicly available information, which we triangulated with data from 10 semi‐structured interviews. Qualitative thematic analysis was undertaken alongside data collection, guided by an existing classification of the CPA of the food industry. The baby food industry in France has shaped the science on infant and young child nutrition and nurtured long‐established relationships with health professionals. This corporate science and these relationships helped baby food companies to portray themselves as experts on child‐related topics. The baby food industry has also engaged with a broad range of civil society organisations, particularly through the concept of the first 1000 days of life, and during the COVID-19 pandemic. We found evidence, although limited, that the baby food industry directly lobbied the French government. Since its early development in France in the 19th century, the baby food industry has used its CPA to promote its products and to protect and sustain its market. Our findings can be used to recognise, anticipate and address the CPA of this industry, and to minimise any negative influence it may have on babies' and mothers' health.
I. Belyaev, G. Carboni, N. Harnew et al.
In this paper we describe the history of the LHCb experiment over the last three decades, and its remarkable successes and achievements. LHCb was conceived primarily as a b-physics experiment, dedicated to CP violation studies and measurements of very rare b decays; however, the tremendous potential for c-physics was also clear. At first data taking, the versatility of the experiment as a general-purpose detector in the forward region also became evident, with measurements achievable such as electroweak physics, jets and new particle searches in open states. These were facilitated by the excellent capability of the detector to identify muons and to reconstruct decay vertices close to the primary pp interaction region. By the end of the LHC Run 2 in 2018, before the accelerator paused for its second long shutdown, LHCb had measured the CKM quark mixing matrix elements and CP violation parameters to world-leading precision in the heavy-quark systems. The experiment had also measured many rare decays of b and c quark mesons and baryons to below their Standard Model expectations, some down to branching ratios of order 10⁻⁹. In addition, world knowledge of b and c spectroscopy had improved significantly through discoveries of many new resonances already anticipated in the quark model, as well as new exotic four- and five-quark states.
Barak Shoshany, Jared Wogan
In a previous paper, we showed that a class of time travel paradoxes which cannot be resolved using Novikov's self-consistency conjecture can be resolved by assuming the existence of multiple histories or parallel timelines. However, our proof was obtained using a simplistic toy model, which was formulated using contrived laws of physics. In the present paper we define and analyze a new model of time travel paradoxes, which is more compatible with known physics. This model consists of a traversable Morris-Thorne wormhole time machine in 3+1 spacetime dimensions. We define the spacetime topology and geometry of the model, calculate the geodesics of objects passing through the time machine, and prove that this model inevitably leads to paradoxes which cannot be resolved using Novikov's conjecture, but can be resolved using multiple histories. An open-source simulation of our new model using Mathematica is available for download on GitHub. We also provide additional arguments against the Novikov self-consistency conjecture by considering two new paradoxes, the switch paradox and the password paradox, for which assuming self-consistency inevitably leads to counter-intuitive consequences. Our new results provide more substantial support to our claim that if time travel is possible, then multiple histories or parallel timelines must also be possible.
Michele Guerra
Technique and creativity

Having been called upon to provide a contribution to a publication dedicated to “Techne”, I feel it is fitting to start from the theme of technique, given that for too many years now, we have fruitlessly attempted to understand the inner workings of cinema whilst disregarding the element of technique. And this has posed a significant problem in our field of study, as it would be impossible to gain a true understanding of what cinema is without immersing ourselves in the technical and industrial culture of the 19th century. It was within this culture that a desire was born: to mould the imaginary through the new techniques of reproduction and transfiguration of reality through images. Studying the development of so-called “pre-cinema” – i.e. the period up to the conventional birth of cinema on 28 December 1895 with the presentation of the Cinématographe Lumière – we discover that the technical history of cinema is not only arguably more enthralling than its artistic and cultural history, but that it contains all the great theoretical, philosophical and scientific insights that we need to help us understand the social, economic and cultural impact that cinema had on the culture of the 20th century. At the 1900 Paris Exposition, when cinema had already existed in some form for a few years, when the first few short films of narrative fiction also already existed, the cinematograph was placed in the Pavilion of Technical Discoveries, to emphasise the fact that the first wonder, this element of unparalleled novelty and modernity, was still there, in technique, in this marvel of innovation and creativity. I would like to express my idea through the words of Franco Moretti, who claims in one of his most recent works that it is only possible to understand form through the forces that pulsate through it and press on it from beneath, finally allowing the form itself to come to the surface and make itself visible and comprehensible to our senses.
As such, the cinematic form – that which appears on the screen, that which is now so familiar to us, that which each of us has now internalised, that has even somehow become capable of configuring our way of thinking, imagining, dreaming – that form is underpinned by forces that allow it to eventually make its way onto the screen and become artistic and narrative substance. And those forces are the forces of technique, the forces of industry, the economic, political and social forces without which we could never hope to understand cinema. One of the issues that I always make a point of addressing in the first few lessons with my students is that if they think that the history of cinema is made up of films, directors and narrative plots to be understood, perhaps even retold in some way, then they are entirely on the wrong track; if, on the other hand, they understand that it is the story of an institution with economic, political and social drivers within it – drivers that can, in some way, lead us to the great creators and the great titles, but without a firm grasp of which there is no point in even attempting to explore it – then they are on the right track. As I see it, cinema in the twentieth century was a great democratic, cross-class laboratory such as no other art has ever been, and this was possible because what underpinned it was an industrial logic: it had to repay the capital invested in it, it had to make money, and as such, it had to reach the largest possible number of people, immersing them in a wholly unprecedented relational situation. The aim was to be as inclusive as possible, ultimately giving rise to the idea that cinema could not be autonomous, as other forms of art could be, but that it must instead be able to negotiate all the various forces acting upon it, pushing it in every direction.
This concept of negotiation is one which has been explored in great detail by one of the greatest film theorists of our modern age, Francesco Casetti. In a 2005 book entitled “Eye of the Century”, which I consider to be a very important work, Casetti actually argues that cinema has proven itself to be the art form most capable of adhering to the complexity and fast pace of the short century, and that it is for this very reason that its golden age (in the broadest sense) can be contained within the span of just a hundred years. The fact that cinema was the true epistemological driving force of 20th-century modernity – a position now usurped by the Internet – is not, in my opinion, something that diminishes the strength of cinema, but rather an element of even greater interest. Casetti posits that cinema was the great negotiator of new cultural needs, of the need to look at art in a different way, of the willingness to adapt to technique and technology: indeed, the form of cinema has always changed according to the techniques and technologies that it has brought to the table or established a dialogue with on a number of occasions. Barry Salt, whose background is in physics, wrote an important book – publishing it at his own expense, as a mark of how difficult it is to work in certain fields – entitled “Film Style and Technology”, in which he calls upon us to stop writing the history of cinema starting from the creators, from the spirit of the time, from the great cultural and historical questions, and instead to start afresh by following the techniques available over the course of its development.
Throughout the history of cinema, the creation of certain films has been the result of a particular set of technical conditions: having a certain type of film, a certain type of camera, only being able to move in a certain way, needing a certain level of lighting, having an entire arsenal of equipment that was very difficult to move and handle; and as the equipment, medium and techniques changed and evolved over the years, so too did the type of cinema that we were able to make. This means framing the history of cinema and film theory in terms of the techniques that were available, and starting from there: of course, whilst Barry Salt’s somewhat provocative suggestion by no means cancels out the entire cultural, artistic and aesthetic discourse in cinema – which remains fundamental – it nonetheless raises an interesting point, as if we fail to consider the methods and techniques of production, we will probably never truly grasp what cinema is. These considerations also help us to understand just how vast the “construction site” of cinema is – the sort of “factory” that lies behind the production of any given film. Erwin Panofsky wrote a single essay on cinema in the 1930s entitled “Style and Medium in the Motion Pictures” – a very intelligent piece, as one would expect from Panofsky – in which at a certain point, he compares the construction site of the cinema to those of Gothic cathedrals, which were also under an immense amount of pressure from different forces, namely religious ones, but also socio-political and economic forces which ultimately shaped – in the case of the Gothic cathedral and its development – an idea of the relationship between the earth and the otherworldly. The same could be said for cinema, because it also involves starting with something very earthly, very grounded, which is then capable of unleashing an idea of imaginary metamorphosis. 
Some scholars, such as Edgar Morin, would say that cinema is increasingly becoming the new supernatural, the world of contemporary gods, as religion gradually gives way to other forms of deification. Panofsky’s image is a very focused one: by making film production into a construction site, which to all intents and purposes it is, he leads us to understand that there are different forces at work, represented by a producer, a scriptwriter, a director, but also a workforce, the simple labourers, as is always the case in large construction sites, calling into question the idea of who the “creator” truly is. So much so that cinema, now more than ever before, is reconsidering the question of authorship, moving towards a “history of cinema without names” in an attempt to combat the “policy of the author” which, in the 1950s, especially in France, identified the director as the de facto author of the film. Today, we are still in that position, with the director still considered the author of the film, but that was not always so: back in the 1910s, in the United States, the author of the film was the scriptwriter, the person who wrote it (as is now the case for TV series, where they have once again taken pride of place as the showrunner, the creator, the true author of the series, and nobody remembers the names of the directors of the individual episodes); or at times, it can be the producer, as was the case for a long time when the Oscar for Best Picture, for example, was accepted by the producer in their capacity as the commissioner, as the “owner” of the work. As such, the theme of authorship is a very controversial one indeed, but one which helps us to understand the great meeting of minds that goes into the production of a film, starting with the technicians, of course, but also including the actors.
Occasionally, a film is even attributed to the name of a star, almost as if to declare that that film is theirs, in that it is their body and their talent as an actor lending it a signature that provides far more of a draw to audiences than the name of the director does. In light of this, the theme of authorship, which Panofsky raised in the 1930s through the example of the Gothic cathedral, which ultimately does not have a single creator, is one which uses the image of the construction site to also help us to better understand what kind of development a film production can go through and to what extent this affects its critical and historical reception; as such, grouping films together based on their director means doing something that, whilst certainly not incorrect in itself, precludes other avenues of interpretation and analysis which could have favoured or could still favour a different reading of the “cinematographic construction site”.

Design and execution

The great classic Hollywood film industry was a model that, although it no longer exists in the same form today, unquestionably made an indelible mark at a global level on the history not only of cinema, but more broadly, of the culture of the 20th century. The industry involved a very strong vertical system resembling an assembly line, revolving around producers, who had a high level of decision-making autonomy and a great deal of expertise, often inclined towards a certain genre of film and therefore capable of bringing together the exact kinds of skills and visions required to make that particular film. The history of classic American cinema is one that can also be reconstructed around the units that these producers would form.
The “majors”, along with the so-called “minors”, were put together like football teams, with a chairman flanked by figures whom we would nowadays refer to as a sporting director and a managing director, who built the team based on specific ideas, “buying” directors, scriptwriters, scenographers, directors of photography, and even actors and actresses who generally worked almost exclusively for their major – although they could occasionally be “loaned out” to other studios. This system led to a very marked characterisation and allowed for the film to be designed in a highly consistent, recognisable way in an age when genres reigned supreme and there was the idea that in order to keep the audience coming back, it was important to provide certain reassurances about what they would see: anyone going to see a Western knew what sorts of characters and storylines to expect, with the same applying to a musical, a crime film, a comedy, a melodrama, and so on. The star system served to fuel this working method, with these major actors also representing both forces and materials in the hands of an approach to filmmaking which had the ultimate objective of constructing the perfect film, in which everything had to function according to a rule rooted in both the aesthetic and the economic. Gore Vidal wrote that from 1939 onwards, Hollywood did not produce a single “wrong” film: indeed, whilst certainly hyperbolic, this claim confirms that that system produced films that were never wrong, never off-key, but instead always perfectly in tune with what the studios wished to achieve. Whilst this long-entrenched system of yesteryear ultimately imploded due to certain historical phenomena that rendered it outdated, the way of thinking about production has not changed all that much, with film design remaining tied to a professional approach that is still rooted within it.
The overwhelming majority of productions still start from a system which analyses the market and the possible economic impact of the film, before even starting to tackle the various steps that lead up to the creation of the film itself. Following production systems and the ways in which they have changed, in terms of both the technology and the cultural contexts, also involves taking stock of the still considerable differences that exist between approaches to filmmaking in different countries, or indeed the similarities linking highly disparate economic systems (consider, for example, India’s “Bollywood” or Nigeria’s “Nollywood”: two incredibly strong film industries that we are not generally familiar with as they lack global distribution, although they are built very solidly). In other words, any attempt to study Italian cinema and American cinema – to stay within this double field – with the same yardstick is unthinkable, precisely because the context of their production and design is completely different.

Composition and innovation

Studying the publications on cinema in the United States in the early 1900s – a body of work which, from about 1911 to 1923, offers us a revealing insight into the attempts made to garner an in-depth understanding of how this new storytelling machine worked and the development of the first real cultural industry of the modern age – casts light on the centrality of the issues of design and composition. I remain convinced that without reading and understanding that debate, it is very difficult to understand why cinema is as we have come to be familiar with it today. Many educational works investigated the inner workings of cinema, and some, having understood them, suggested that they were capable of teaching others to do so. These publications have almost never been translated into Italian and remain seldom studied even in the US, and yet they are absolutely crucial for understanding how cinema established itself on an industrial and aesthetic level.
There are two key words that crop up time and time again in these books. The first is “action”, one of the first words uttered when a film starts rolling: “lights, camera, action” – in the Italian version of the call, “motore… azione”. This pair of terms is interesting in that “motore” (“motor”) highlights the presence of a machine that has to be started up, followed by “azione” (“action”), which signals that something must happen at that moment in front of that machine, otherwise the film will not exist. As such, “action” – a term to which I have devoted some of my studies – is fundamental here in that it represents a sort of moment of birth of the film, one that is very clear – tangible, even. The other key word is “composition”, an even more interesting word with a history that deserves a closer look. The first professor of cinema in history, Victor Oscar Freeburg (I edited the Italian translation of his 1918 textbook “The Art of Photoplay Making”), took up his position at Columbia University in 1915 and, in doing so, took on the task of teaching the first ever university course in cinema. Whilst Freeburg was, for his time, a very well-educated and highly qualified person, having studied at Yale and then obtained his doctorate in theatre at Columbia, cinema was not entirely his field of expertise. He was asked to teach a course entitled “Photoplay Writing”. At the time, a film was known as a “photoplay”, a photographed play of sorts, and the fact that the central topic of the course was photoplay writing makes it clear that back then, the scriptwriter was considered the main author of the work. From this point of view, it made sense to entrust the teaching of cinema to an expert in theatre, based on the idea that it was useful first and foremost to teach a sort of photographable dramaturgy.
However, upon arriving at Columbia, Freeburg soon realised whilst preparing his course that “photoplay writing” risked misleading the students: it is not enough simply to write a story in order to make a film. He therefore decided to change the title of his course to “Photoplay Composition”. This apparently minor alteration, from “writing” to “composition”, in fact marked a decisive conceptual shift, in that it highlighted that it was no longer enough merely to write: one had to “compose”. So it was that the author of a film became, according to Freeburg, not the scriptwriter or the director but the “cinema composer” (a term of his own coinage), thus directing and broadening the concept of composition towards music, on the one hand, and architecture, on the other. We are often inclined to think that cinema has inherited expressive modules partly from literature, partly from theatre and partly from painting, but what Freeburg helps us to understand is that there are strong elements of music and architecture in a film, emphasising the lofty theme of the project. In his book, he explores at great length the relationship between static and dynamic forms in cinema, a topic that few have ever addressed in that way and that, again, does not immediately spring to mind as applicable to a film. I believe that those initial intuitions were the result of a reflection unhindered by the prejudices and preconceived notions that subsequently began to condition film studies as a discipline, and I feel that they are of great use to us today because they guide us, on the one hand, towards a symphonic idea of filmmaking and, on the other, towards an idea that preserves the fairly clear imprint of architecture.
Space-Time

In cinema as in architecture, the relationship between space and time is a crucial theme: in every textbook, space and time are amongst the first chapters to be studied, precisely because in cinema they undergo a process of metamorphosis – as Edgar Morin would say – which is vital to constructing the intermediate world of film. Indeed, from both a temporal and a spatial point of view, cinema provides a kind of ubiquitous opportunity to overlap different temporalities and spatialities, to move freely from one space to another and, above all, to construct new systems of time. The rules of film editing – especially so-called “invisible editing”, i.e. the classical editing that conceals its own presence – are rules built upon specific and precise connections that hold together different, even distant, spaces whilst nonetheless giving the impression of unity and contiguity: of everything that cinema never is in reality, because cinema is constantly fragmented and interrupted, even though we very often perceive it as continuous. As such, from both a spatial and a temporal perspective, there are technical studies that explain the rules of editing for spatial continuity, as well as theoretical studies that explain how cinema has transformed our sense of space and time. To mark the beginning of Parma’s run as Italy’s Capital of Culture, an exhibition entitled “Time Machine. Seeing and Experiencing Time” was organised, curated by Antonio Somaini, with the challenge of demonstrating how cinema, from its earliest experiments to the digital age, has managed to manipulate and transform time, profoundly affecting our way of engaging with it.
The themes of time and space are vital to understanding cinema, including from a philosophical point of view: in Gilles Deleuze’s two seminal volumes, “The Movement-Image” and “The Time-Image”, the issues of space and time become the two great paradigms not only for explaining cinema but also – as Deleuze himself says – for explaining a certain 20th-century philosophy. Deleuze succeeds in a truly impressive endeavour, namely linking cinema to philosophical reflection – indeed, making cinema an instrument of philosophical thought. This heteronomy of filmmaking also extends to its ability to go beyond its own existence and become a reflection on the century that saw it as a protagonist of sorts. Don Ihde argues that every era has a technical discovery that somehow becomes what he calls an “epistemological engine”: a tool that opens up a system of thought that would never have been possible without that discovery. One example among many over the centuries is the camera obscura, but we could also name cinema as the defining discovery for 20th-century thought: indeed, cinema is indispensable for understanding the 20th century, just as the Internet is for understanding our way of thinking in the 21st.

Real-virtual

Nowadays, the film industry is facing a crisis of cinema closures, ultimately caused by ever-spreading media platforms and the economic competition they exert by aggressively entering the field of production and distribution, albeit with a different angle on the age-old desire to attract audiences. Just a few days ago, Martin Scorsese was lamenting the fact that on these platforms the artistic project is in danger of foundering, as excellent projects are placed in a catalogue alongside a series of products of varying quality, thus confusing the viewer.
A few years ago, during the opening ceremony of the academic year at the University of Southern California, Steven Spielberg and George Lucas expressed the same concern about the future of cinema in different ways. Lucas argued that cinemas would soon have to become incredibly high-tech places offering an experience impossible to reproduce elsewhere, with a ticket price reflecting the expanded and enriched experiential value made possible by the new technologies used. Spielberg, meanwhile, observed that cinemas will survive if they manage to transform the cinemagoer from a simple viewer into a player, an actor of sorts. The history of cinema has always been marked by continuous adaptation to technological evolution, and I do not believe that cinema will ever end. Jean-Luc Godard, one of the great masters of the Nouvelle Vague, once said in an interview: «I am very sorry not to have witnessed the birth of cinema, but I am sure that I will witness its death». Godard, who was born in 1930, is still alive; since its origins, cinema has always transformed itself rather than dying. Raymond Bellour says that cinema is an art that never finishes finishing, a phrase that encapsulates the beauty and the secret of cinema: an art that never quite finishes finishing is always on the very edge of the precipice but never falls off, even as it leans farther and farther over that edge. This is undoubtedly down to cinema’s ability to keep pace with technique and technology, and in doing so to move – even to a different medium – to “relocate”, as contemporary theorists say, finally moving out of cinemas themselves and onto platforms and tablets, yet without ever ceasing to be cinema. That said, we should give everything we’ve got to ensure that cinemas survive.