Gabriel CAMARĂ
Results for "Geography (General)"
Showing 20 of ~9,632,925 results · from DOAJ, arXiv, CrossRef, Semantic Scholar
Badr Al Faiya, Stephen McArthur, Ivana Kockar
Distribution networks will experience more installations of distributed generation (DG) that is unpredictable and stochastic in nature. Greater distributed control and intelligence will allow challenges such as voltage control to be handled effectively. The partitioning of power networks into smaller clusters provides a method to split the control problem into manageable sub-problems. This paper presents a community detection-based partitioning technique for distribution networks considering local DGs, allowing them to be grouped and controlled in a distributed manner by using local signals and measurements. This method also allows each community to control the voltage using only neighboring DGs, and for each community to self-organize to reflect varying DG conditions and to maintain stable control. Simulations demonstrate that the partitioning of the large distribution network is effective, and each community is able to self-organize and to regulate the voltage independently using only its local DGs.
Tom Benhamou, James Cummings, Gabriel Goldberg et al.
We introduce a new class of ultrafilters which generalizes the well-known class of simple $P$-point ultrafilters. We prove that for any well-founded $σ$-directed partial order $\mathbb{D}$ there is a mild forcing extension where there is an ultrafilter $U$ on $ω$ with a base $\mathcal{B}$ such that $(\mathcal{B},\supseteq^*)\cong \mathbb{D}$. On a measurable cardinal we prove a similar result: relative to a supercompact cardinal, it is consistent that $κ$ is supercompact, and for a $κ^+$-directed well-founded poset $\mathbb{D}$, there is a ${<}κ$-directed closed $κ^+$-cc forcing extension where there is a \emph{normal} ultrafilter $U$ on $κ$ with a base $\mathcal{B}$ such that $(\mathcal{B},\supseteq^*)\cong \mathbb{D}$. These are optimal results in the class of $P$-points and realize every potential structure of a $P$-point. We apply our constructions to obtain ultrafilters with controlled Tukey-type, in particular, an ultrafilter with non-convex Tukey and depth spectra is presented, answering questions from \cite{Benhamou_2024}. Our construction also provides new models where $\mathfrak{u}_κ<2^κ$, answering questions from \cite{Benhamou_Goldberg2025}.
Cecília Laís Santana da Silva, José Eloízio da Costa
Dairy farming is a tradition in the semi-arid region of Sergipe owing to its historical and economic relevance as a source of income and survival. Poço Redondo is today the municipality that produces the most milk in Sergipe, which indicates a process of reorganization of the Alto Sertão dairy basin. To understand the context in which this rise in production emerged, this article analyzes the insertion of small producers into the milk production chain of the village of Santa Rosa do Ermírio in the face of subordination and asymmetry. Accordingly, through qualitative and quantitative analysis, the nuances of dairy production in the "land of milk" can be understood as part of a macrostructure of the political-economic system and in its relation to Poço Redondo and Sergipe.
Dylan Galt, Langte Ma
We study generalized anti-self-dual instantons defined over Riemannian manifolds equipped with a parallel codimension-$4$ differential form. In particular, for product Riemannian manifolds possessing such a form, we study dimension reduction phenomena, finding a topological criterion for bundles which, when satisfied, allows for a complete characterization of dimension reduction for the corresponding moduli space of generalized ASD instantons. By establishing an integrability result for families of connections, we then deduce explicit descriptions for these moduli spaces, including those of Hermitian Yang--Mills connections, $G_2$-, and $\mathrm{Spin}(7)$-instantons. When one factor in the product is a $4$-manifold, we establish well-behaved compactifications for these moduli spaces.
Vadim Musaev
Paul Dominique Barrette, Karl-Erich Lindenschmidt
This article addresses the question: What is expected from frazil ice activity in rivers, taking into account the changing climate? It begins with an overview of what frazil ice is and what is required for the occurrence of frazil ice events, namely a supercooled water column. Methodologies to anticipate frazil ice events in the short term are based on air temperature and water discharge, underlining the significance of these two parameters for any predictive methods. Longer-term approaches, calibrated against past events (hindcasting), are used to anticipate frazil ice activity into the future, with indicators such as frazil ice risk, water temperature and frazil volume. Any of these approaches could conceivably be applied to frazil-prone river stretches. To assess climate impact, each location should be treated separately. River ice dynamics can lead to the formation of a hanging dam, a frequent outcome of frazil ice generation in the early winter, causing flow restriction. Flood modeling and forecasting capabilities have been developed and implemented for operational use. More frequent mid-winter breakups are expected to extend the occurrence of frazil ice events into the winter months; the prediction of these will require climate model output to adequately capture month-to-month variability. HIGHLIGHTS: Previous modeling endeavors aimed at foreseeing frazil ice generation in rivers are summarized. Frazil ice risk, water temperature and frazil ice volume are model outputs. Each frazil-prone location should be the subject of its own climate impact study. Mid-winter breakups (MWBs) will likely be more frequent in the future, which implies that clogging risks at water intakes will extend well into the winter months.
Mohamed Mahmoud Sebbab, Abdelhadi El Ouahidi, Mehdi Ousbih et al.
The purpose of this paper is to identify, quantify and delineate the areas with suitable aggregate resources in the Precambrian massif of Ifni and the limestone plateau of Lakhssas (southwest Morocco). To fulfill this objective, a study was undertaken on the geotechnical parameters of the various geological outcrops of the region based on the analysis of 42 rock samples (carbonate, magmatic, detrital and volcano-detrital). Initially, we subjected these samples to a series of laboratory tests (impact resistance (L.A), wear resistance (MDE), density, porosity, absorption) to classify them according to geotechnical standards. Then, a geospatial database was created to exploit these geotechnical data within a geographical information system (GIS) and produce various thematic maps. Based on the results of this study, all geotechnical classes defined by the standards (A to E for the European standard and 1A to 6D for the Moroccan standard) are present, with good to very good geomechanical properties (L.A between 12% and 35%, MDE between 5% and 30%). This classification allowed us to use GIS to identify and quantify potential areas for exploitation by assigning five categories of geotechnical suitability (high (4), medium (3), low (2), very low (1) and others (0)) and to show that approximately 72% of the study area belongs to the high, medium and low categories. The combination of laboratory results and GIS has allowed us to carry out geotechnical mapping that will be used by regional authorities and stakeholders for sound management of quarrying and rational use of the national natural heritage.
Laura Gustafson, Megan Richards, Melissa Hall et al.
Despite impressive advances in object recognition, deep learning systems' performance degrades significantly across geographies and lower income levels, raising pressing concerns of inequity. Addressing such performance gaps remains a challenge, as little is understood about why performance degrades across incomes or geographies. We take a step in this direction by annotating images from Dollar Street, a popular benchmark of geographically and economically diverse images, labeling each image with factors such as color, shape, and background. These annotations unlock a new granular view into how objects differ across incomes and regions. We then use these object differences to pinpoint model vulnerabilities across incomes and regions. We study a range of modern vision models, finding that performance disparities are most associated with differences in texture, occlusion, and images with darker lighting. We illustrate how insights from our factor labels can surface mitigations to improve models' performance disparities. As an example, we show that mitigating a model's vulnerability to texture can improve performance on the lower income level. We release all the factor annotations along with an interactive dashboard to facilitate research into more equitable vision systems.
Athanase Papadopoulos
These are notes on the impact of Lagrange's memoir on the construction of geographical maps. We mention the relations of some ideas and questions introduced in this memoir with other notions that appeared later in the works of several mathematicians, including in particular Chebyshev (19th c.) and Darboux (19th-20th c.), two mathematicians who were particularly interested in geography.
Hao Huang, Katherine R. Davis, H. Vincent Poor
The long-term resilient property of ecosystems has been quantified as ecological robustness (RECO) in terms of the energy transfer over food webs. The RECO of resilient ecosystems favors a balance of food webs' network efficiency and redundancy. By integrating RECO with power system constraints, the authors are able to optimize power systems' inherent resilience, as ecosystems do, through network design and system operation. A previous model used real power flows and aggregated redundant components to establish a rigorous mapping between ecosystems and power systems. However, reactive power flows also determine power system resilience, and the power components' redundancy is part of the global network redundancy. These characteristics should be considered in RECO-oriented evaluation and optimization of power systems. Thus, this paper extends the model for quantifying RECO in power systems using real, reactive, and apparent power flows, with consideration of the redundant placement of generators. Examining the performance of RECO-oriented optimal power flows under N-x contingencies, the analyses suggest reactive power flows and redundant components should be included for RECO to capture power systems' inherent resilience.
Tyler Cody, Niloofar Shadab, Alejandro Salado et al.
Engineering methods are centered around traditional notions of decomposition and recomposition that rely on partitioning the inputs and outputs of components to allow for component-level properties to hold after their composition. In artificial intelligence (AI), however, systems are often expected to influence their environments, and, by way of their environments, to influence themselves. Thus, it is unclear if an AI system's inputs will be independent of its outputs, and, therefore, if AI systems can be treated as traditional components. This paper posits that engineering general intelligence requires new general systems precepts, termed the core and periphery, and explores their theoretical uses. The new precepts are elaborated using abstract systems theory and the Law of Requisite Variety. By using the presented material, engineers can better understand the general character of regulating the outcomes of AI to achieve stakeholder needs and how the general systems nature of embodiment challenges traditional engineering practice.
Matheus E. Leusin, Bjoern Jindra, Daniel S. Hain
This paper draws upon the evolutionary concepts of technological relatedness and knowledge complexity to enhance our understanding of the long-term evolution of Artificial Intelligence (AI). We reveal corresponding patterns in the emergence of AI - globally and in the context of the specific geographies of the US, Japan, South Korea, and China. We argue that AI emergence is associated with increasing related variety due to knowledge commonalities as well as increasing complexity. We use patent-based indicators for the period 1974-2018 to analyse the evolution of AI's global technological space, to identify its technological core, and to track changes in its overall relatedness and knowledge complexity. At the national level, we also measure countries' overall specialisations against AI-specific ones. At the global level, we find increasing overall relatedness and complexity of AI. However, for the technological core of AI, which has been stable over time, we find decreasing related variety and increasing complexity. This evidence indicates that AI innovations related to core technologies are becoming increasingly distinct from each other. At the country level, we find that the US and Japan have been increasing the overall relatedness of their innovations. The opposite is the case for China and South Korea, which we associate with the fact that these countries are overall less technologically developed than the US and Japan. Finally, we observe a steadily increasing overall complexity for all countries apart from China, which we explain by this country's focus on technologies not strongly linked to AI.
Laila Loudiki, Mustapha Kchikech, El Hassan Essaky
Due to their broad application to different fields of theory and practice, generalized Petersen graphs $GPG(n,s)$ have been extensively investigated. Despite the regularity of generalized Petersen graphs, determining an exact formula for the diameter is still a difficult problem. Beenker and Van Lint proved that if the circulant graph $C_n(1,s)$ has diameter $d$, then $GPG(n,s)$ has diameter at least $d+1$ and at most $d+2$. In this paper, we provide necessary and sufficient conditions so that the diameter of $GPG(n,s)$ is equal to $d+1,$ and sufficient conditions so that the diameter of $GPG(n,s)$ is equal to $d+2.$ Afterwards, we give exact values for the diameter of $GPG(n,s)$ for almost all cases of $n$ and $s.$ Furthermore, we show that there exists an algorithm computing the diameter of generalized Petersen graphs with running time $O(\log n)$.
Marleen C. de Ruiter, Anaïs Couasnon, Marc J. C. van den Homberg et al.
In recent decades, a striking number of countries have suffered from consecutive disasters: events whose impacts overlap both spatially and temporally, while recovery is still under way. The risk of consecutive disasters will increase due to growing exposure, the interconnectedness of human society, and the increased frequency and intensity of nontectonic hazards. This paper provides an overview of the different types of consecutive disasters, their causes, and their impacts. The impacts can be distinctly different from those of disasters occurring in isolation (both spatially and temporally) from other disasters, noting that full isolation never occurs. We use existing empirical disaster databases to show the global probabilistic occurrence for selected hazard types. Current state-of-the-art risk assessment models and their outputs do not allow for a thorough representation and analysis of consecutive disasters. This is mainly due to the many challenges introduced by addressing and combining hazards of a different nature, and by accounting for their interactions and dynamics. Disaster risk management needs to be more holistic and codesigned between researchers, policy makers, first responders, and companies.
Kumar Ayush, Burak Uzkent, Chenlin Meng et al.
Contrastive learning methods have significantly narrowed the gap between supervised and unsupervised learning on computer vision tasks. In this paper, we explore their application to geo-located datasets, e.g. remote sensing, where unlabeled data is often abundant but labeled data is scarce. We first show that due to their different characteristics, a non-trivial gap persists between contrastive and supervised learning on standard benchmarks. To close the gap, we propose novel training methods that exploit the spatio-temporal structure of remote sensing data. We leverage spatially aligned images over time to construct temporal positive pairs in contrastive learning and geo-location to design pre-text tasks. Our experiments show that our proposed method closes the gap between contrastive and supervised learning on image classification, object detection and semantic segmentation for remote sensing. Moreover, we demonstrate that the proposed method can also be applied to geo-tagged ImageNet images, improving downstream performance on various tasks. Project Webpage can be found at this link geography-aware-ssl.github.io.
Alexander Gallego Cadavid, Yeinzon Rodriguez, L. Gabriel Gomez
As a modified gravity theory that introduces new gravitational degrees of freedom, the generalized SU(2) Proca theory (GSU2P for short) is the non-Abelian version of the well-known generalized Proca theory, where the action is invariant under global transformations of the SU(2) group. This theory was formulated for the first time in Phys. Rev. D 94 (2016) 084041, having implemented the required primary constraint-enforcing relation to make the Lagrangian degenerate and remove one degree of freedom from the vector field in accordance with the irreducible representations of the Poincaré group. It was later shown in Phys. Rev. D 101 (2020) 045008, ibid 045009, that a secondary constraint-enforcing relation, which trivializes for the generalized Proca theory but not for the SU(2) version, was needed to close the constraint algebra. It is the purpose of this paper to implement this secondary constraint-enforcing relation in GSU2P and to make the construction of the theory more transparent. Since several terms in the Lagrangian were dismissed in Phys. Rev. D 94 (2016) 084041 via their equivalence to other terms through total derivatives, not all of the latter satisfying the secondary constraint-enforcing relation, the task was not as simple as directly applying this relation to the resultant Lagrangian pieces of the old theory. Thus, we were motivated to reconstruct the theory from scratch. In the process, we found the beyond GSU2P.
Kalyan Sundar Som
Equitable provision of health care services and full coverage of health accessibility are major challenges for developing countries seeking to achieve the Sustainable Development Goals (SDG 3 and 10). A geographical information system (GIS) is an effective platform for determining how much area and population are covered by the existing MCH (maternal and child health) services network, supporting better health care planning. The aim of this study is to assess the geographical accessibility of MCH services and their impact on infant mortality and fertility in Sagar District. To this end, the study used buffer zone analysis, service area analysis, and multiple regression analysis. The findings highlight the low accessibility prevailing in the study area: of 2,075 villages, 41 percent were underserved according to the buffer zone analysis and 62 percent according to the service area analysis. Accessibility varies from higher in the northwestern Khurai plain region to lower in the central upland, excluding the Sagar community development block. We also find that health accessibility can explain 53 percent of the infant mortality of the district, and that IMR may control 33 percent of the children ever born in the district. The service area and buffer map outputs may have policy implications for the future siting of health centers and road networks, and such policy can help reduce infant mortality and fertility, thereby contributing to the SDG targets.
Ramón Labarca, Belmary Barreto, Jorge Bernal
This paper proposes using the geographical potential of the Peonies lagoon setting as a natural museum for the teaching of Physical Geography. The methodology is descriptive and projective, under a non-experimental field design. When diagnosing "didactic resources", more than 80% of the respondents affirmed that the teacher does not use "directed and natural resources" in teaching. Regarding the "level of knowledge", mastery is weak: more than 75% of students did not succeed on the indicators "presence of lagoons", "presence of beaches and coastal dunes" and "sedimentation processes" of the Peonies lagoon. A proposal for a natural museum was designed based on the setting of the lagoon, materialized in a guided tour that includes five (5) stations.
J. Brian Pitts
The renaissance of General Relativity witnessed considerable progress regarding both understanding and justifying Einstein's equations. Both general relativists and historians of the subject tend to share a view, General Relativity exceptionalism. But does some of the renaissance progress in understanding and justifying Einstein's equations owe something to particle physics egalitarianism? If so, how should the historiography of gravitation and Einstein's equations reflect that fact? The idea of a graviton mass has a 19th century Newtonian pre-history in Neumann's and Seeliger's long-distance modification of gravity, which (especially for Neumann) altered Poisson's equation to give a potential $e^{-mr}/r$ for a point mass, improving convergence for homogeneous matter. Einstein reinvented the idea before introducing his faulty analogy with $Λ$. This confusion was first critiqued by Heckmann in the 1940s (without effect) and by Trautman, DeWitt, Treder, Rindler, and Freund et al. in the 1960s, and especially more recently by Schücking, but it has misled North, Jammer, Pais, Kerszberg, the Einstein Papers, and Kragh. The error is difficult to catch if one has an aversion to perturbative thinking, but difficult to make if one thinks along the lines of particle physics. The $Λ$-graviton mass confusion not only distorted the interpretation of Einstein's theory, but also obscured a potentially serious particle physics-motivated rivalry (massless vs. massive spin 2). How could one entertain massive spin 2 gravity if $Λ$ is thought already analogous to the Neumann-Seeliger scalar theory? Historiography, like physics, is best served by overcoming the divide between the two views of gravitation.
Page 14 of 481647